Just your normal everyday casual software dev. Nothing to see here.

People can share differing opinions without immediately being placed on opposing sides. Avoid looking at things as black and white. You can like both waffles and pancakes, just like you can hate both waffles and pancakes.

  • 0 Posts
  • 33 Comments
Joined 2 years ago
Cake day: August 15th, 2023


  • I have Proxmox Backup Server backing up to an external drive nightly, and then about every 2 or 3 weeks I also back up to cold storage which I keep offsite. (This is bad practice, I know, but I have enough redundancies in place for personal data that I’m OK with it.)

    For critical info like my personal data I have Syncthing syncing to 3 devices, so for personal info I have roughly 4 copies (across different devices) + the PBS + a potentially dated offsite copy.



  • However, Google can still take a service fee on transactions completed with alternative payment methods. For in-app and linked purchases in games, Google can charge a fee of up to 20% for purchases that “impact game outcomes, gameplay progress rate, or player power,” as well as for purchases with “random outcomes” (i.e., loot boxes).

    What was even the point, then, of Epic fighting this? “Oh yea, you can use an alternate store and an alternate payment system. FYI, we are still charging you the Google service fee.”

    Like, sure, 20% is less than 30%, but if Google isn’t involved, the number should be 0%.

    Also, the fact that Google got a time frame (3 years) on how long it has to wait before it’s allowed to pay other stores again not to put their stores on the device… I feel like Epic’s entire fight was just a way to piss money away. The only thing it really accomplished was a fake image of software freedom and cover to add even more restrictions.


  • Personally, I don’t think either comparison is valid. These two items are nowhere near comparable to the original comment. With tennis, if you don’t move, you can’t play. With the examples I gave above, most of the game would remain available to the player, just in a single-player or PvE environment. Survival RPGs can easily be made either SP or PvE only. Dune actually came super close; they just decided to heavily limit the endgame PvE-compatible areas and locked the passage behind a PvP area, which is why I decided to just not get it.

    BUT, ignoring the false-equivalence fallacy, if the player is willing to spend money on the game in the first place, it shouldn’t matter. Even more so when the game is basically Ark Survival on Scorched Earth with a Dune skin on it and a few additional mechanics added on. There was no decent reason the game could not allow for a PvE-only mode, or at least the ability to self-host your servers. They said they couldn’t do either, with the excuse that they wanted the game to be an MMO (which, arguably, they failed to deliver on as well).

    As for New World? As a person who played it from beta (which I do regret, because it’s not my style of game, I just really wanted to like it), New World’s downfall wasn’t the devs trying to cater to everyone, it was the lack of a story/ambition to keep you playing. It was the same gameplay loop over and over with no drive to continue the story. That, combined with the failure to have a decent “end game” (storyline-wise) at launch, killed its userbase. They promoted a very heavily PvP-based cooperative system and then massively fell through on the promises. This, combined with the inconsistent servers and the boring gameplay elements, made player retention extremely difficult. That’s not appealing to the masses, that’s failing to deliver on promises and making a shit game.


  • And that’s completely fine. But by choosing to go that route, the developers have cast out people like me who will not buy that type of game.

    That being said, though, I find it difficult to understand why a studio would want to go that way. Like, I am the player. If I want to make the game easier on myself, then I should be able to. If I’m willing to spend money on your game, it doesn’t really matter how hard it is.

    I get that if a game has an endgame that is heavily PvP-based, allowing a PvE-only mode might affect PvP. But, to me, I don’t really care, because regardless of their decision, I wouldn’t be in that PvP area anyway. It’s just that in one outcome I spend money on their game, and in the other I don’t.

    Many games I can see going this route, such as Overwatch 2 or Dota, but for survival RPG games I don’t see the point of having that type of system, and I definitely think they’re losing money by going that route.



  • Honestly, Ark was so close with that aspect with the Obelisks. It would have been so cool to allow for a character-based PvP toggle (meaning when the character was made, it had a flag for whether it was PvP or PvE), then allow the Obelisks to teleport you to the designated PvE vs PvP zone. Have PvE invisible to PvP, and if a structure is owned by a player in the other zone, it doesn’t exist. Have a designated spot on the map, accessible like the boss arena system, that allows PvE and PvP players to mix and mingle/fight if they want to.

    This would allow using the same map for both modes, so lower system resources; the structures themselves only show for players in the same PvP mode. So a base could exist in the same location in both PvP and PvE, and the two modes would be none the wiser.
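    The visibility rule described above is easy to sketch. This is a hypothetical illustration (none of these names come from Ark): a structure only renders for players whose mode matches its owner’s, except in a designated mixed zone.

    ```python
    from dataclasses import dataclass
    from enum import Enum

    class Mode(Enum):
        PVP = "pvp"
        PVE = "pve"

    @dataclass
    class Player:
        name: str
        mode: Mode  # chosen once, at character creation, per the idea above

    @dataclass
    class Structure:
        owner: Player

    def visible_to(structure: Structure, viewer: Player, in_mixed_zone: bool = False) -> bool:
        """A structure only exists for players in the same mode as its owner,
        except in the shared arena where both modes can mingle/fight."""
        if in_mixed_zone:
            return True
        return structure.owner.mode == viewer.mode

    pvp_player = Player("A", Mode.PVP)
    pve_player = Player("B", Mode.PVE)
    base = Structure(owner=pve_player)
    print(visible_to(base, pvp_player))  # False: the PvE base doesn't exist for PvP
    print(visible_to(base, pve_player))  # True
    ```

    Both modes share one world; the only per-player cost is this filter at render/collision time, which is why the same base location can be “occupied” in both modes without conflict.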



  • Yea, but I’m still not interested in spending money on a game without PvE modes. If it requires me to enter a PvP area, I’m not interested, regardless of the amount of time I can spend in PvE-only areas.

    Honestly, I liked how RuneScape did PvP areas: you didn’t have to enter them, you could obtain the materials via other means like the Grand Exchange. This is a good way of doing PvPvE without hindering your PvE audience. How Dune did it was more of a slap in the face, since it’s a small area that’s shared by everyone who wants PvE or PvP, requires entering PvP areas to get to, and has a limited resource, so on congested servers it’s a big nuisance. It felt like more of a “now we can say we tried” than an actual implementation.


  • The implication of that is weird to me. I’m not saying that the horse is wrong, but that’s such a non-standard solution. That’s implementing a CGNAT restriction without the benefits of CGNAT. They would need to only allow internal-to-external connections unless the connection was already established. How does standard communication still function if it works that way? I know that would break protocols like basic UDP, since those use fire-and-forget without internal prompting.
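    For reference, that “outbound-initiated only” behavior is exactly what a stateful firewall’s connection tracking does. A minimal nftables sketch of the idea (interface names and placement are placeholders, not anyone’s actual config):

    ```
    table inet filter {
      chain input {
        type filter hook input priority 0; policy drop;
        ct state established,related accept  # replies to connections we initiated
        iif "lo" accept                      # always allow loopback
      }
      chain output {
        type filter hook output priority 0; policy accept;  # internal -> external allowed
      }
    }
    ```

    Worth noting: conntrack tracks UDP too, by matching a reply against the 5-tuple of an outbound packet, so fire-and-forget exchanges like DNS queries still get their answers through. What a ruleset like this blocks is *unsolicited* inbound traffic, which is also why such setups break hosting servers or receiving direct peer-to-peer connections.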





  • They don’t /have/ to, but I will say if they don’t it removes any chance of me ever buying it.

    I was up and ready to buy Dune launch week, but then I noticed there was no full PvE mode and I had no way of creating a PvE environment by self-hosting or by other means. That killed all the interest I had in the game.

    To me it makes logical sense that a studio that offers PvPvE should offer a PvE experience as well. The framework is basically already there, and in some cases it won’t even require more resources. In the case of Dune, they could easily have made PvE use the same servers, but have players marked as PvE be invisible to other players not in their party, or give them a ghost effect to people not in PvE mode so they know not to try and fight them.

    Any studio that refuses to acknowledge the casual non-PvP group is, in my eyes, just throwing money away. I have easily dumped $100 into both Ark SE and Minecraft with how many times I’ve purchased them for different platforms, and those are just the ones I can think of off the top of my head. I would never have bought either if they lacked the ability to go PvE only.


  • I haven’t used a guide aside from the official getting-started page for Syncthing.

    It should be similar to these steps, though. I’ll use your desktop as the origin device.

    1. Install Syncthing on all devices you want to sync.
    2. On your desktop’s Syncthing page, click “Add Remote Device” and add the device ID of your phone (found in your phone’s Syncthing app). You can also add any other device you want to communicate with. (You will need to approve this action on the phone as well, so be on the lookout for a notification.)
    3. Make a backup of your current KeePass file, just in case. These steps shouldn’t cause files to change, but since the end goal is syncing two devices that, as you mentioned, have different files with the same name, better safe than sorry.
    4. Create a KeePass share on one of the devices. (The folder path of this share should be wherever your KeePass file is stored on that device. If the file sits in a folder with a bunch of other files, you may want to move it to its own subfolder, or you will end up sharing all of the files in that path.)
    5. Under file versioning, choose what type of version control you want. I prefer staggered, since when a remote device changes the file, it moves the old version to a folder and then deletes old versions according to the settings.
    6. At this point you should double-check the name of your mobile device’s KeePass file. If it’s the same as the name of the database on the desktop, you should rename it before continuing. Syncthing should be able to detect a file conflict and rename it on its own, but better safe than sorry.
    7. Share the folder with the device you want to sync it with (your phone in this case).
    8. Your phone should get a notification that a device wants to share something with it. Approve it, and be careful not to clear it, because it’s a pain in the butt to get that notification back if you accidentally deny or swipe it away; the mobile app isn’t /amazing/ with its UI (but it has gotten better).
    9. Once approved, configure it to where you want the file to be on your mobile device.
    10. You should be done at this point. Syncthing should automatically sync the KeePass files between the two.
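    If you’d rather script the share-creation steps than click through the GUI, Syncthing also exposes a REST config API. A rough sketch of the idea, with staggered versioning set as in step 5 (the API key, folder ID, path, and device ID here are placeholders you’d replace with your own):

    ```python
    import json
    import urllib.request

    API_KEY = "YOUR-API-KEY"       # placeholder: Actions -> Settings -> API Key in the GUI
    BASE = "http://localhost:8384"  # default GUI/API address

    def folder_config(folder_id: str, path: str, device_ids: list[str]) -> dict:
        """Build a folder entry shared with the given devices, with
        staggered file versioning enabled."""
        return {
            "id": folder_id,
            "label": folder_id,
            "path": path,
            "devices": [{"deviceID": d} for d in device_ids],
            "versioning": {
                "type": "staggered",
                "params": {"maxAge": "2592000"},  # keep old versions ~30 days
            },
        }

    def add_folder(cfg: dict) -> None:
        # PUT /rest/config/folders/<id> creates or replaces a folder entry
        req = urllib.request.Request(
            f"{BASE}/rest/config/folders/{cfg['id']}",
            data=json.dumps(cfg).encode(),
            headers={"X-API-Key": API_KEY, "Content-Type": "application/json"},
            method="PUT",
        )
        urllib.request.urlopen(req)

    cfg = folder_config("keepass", "/home/you/keepass", ["PHONE-DEVICE-ID"])
    # add_folder(cfg)  # uncomment once you've filled in a real API key and device ID
    ```

    The GUI steps above are simpler for a one-off setup; scripting only starts paying off if you’re provisioning the same shares onto several machines, such as the server-middleman setup below.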

    Some things you may want to keep in mind: Syncthing only syncs when two or more devices are online at the same time. If you are getting into self-hosting a server, I would recommend having the server be the middleman. If you go that route, these steps stay more or less the same; it’s just that instead of sharing with the phone, you share with the server, then go to the server’s Syncthing page and share with the mobile. This makes both devices sync through the server instead of trying to connect to each other.

    Additionally, if you do go that route, I recommend setting your remote devices on the server’s Syncthing instance to “auto accept”. That way, when you share a folder to the server from one of your devices, it automatically approves it and creates a share, using the name of the shared folder, in Syncthing’s data directory. (E.g., if your folder was named “documents” and you shared it to the server, it would create a share named “documents” wherever you have the server configured to store data.) You would still need to log in to the server instance if you wanted to share those files to /another/ device, but if your intent was only to back a folder up to the server, it removes a step.

    Another benefit of the server-middleman approach is that if you ever have to change a device later on down the road, you only have to add one remote device to the server instance, instead of adding your new device to every Syncthing instance that needs access to it.

    Additionally, if you already have this set up but it doesn’t seem to be working, some standard troubleshooting steps I’ve found helpful:

    • If trying to share between devices, make sure there are at least two connected remote devices active, or nothing can sync.
    • If the above is true, make sure the folder IDs are the same on both devices; that is how Syncthing detects folders that should be synced.
    • If that’s also true, make sure the devices show as online under remote devices. If a device isn’t showing as online, the connection is being blocked somewhere; verify you don’t have a firewall or router blocking it.


  • I fall into this category. I went Nvidia back in ’16 when I built my gaming rig, expecting that I would be using Windows for a while since gaming on Linux still wasn’t the greatest at that point. A few years later I decided to try out a 5700 XT (yea, piss-poor decision, I know) because I wanted to future-proof in case I swapped to Linux. The 5700 XT had the worst driver reliability I’ve ever seen in a graphics card, and I eventually got so sick of it that I went back to Nvidia with a 4070. Since then my life has opened up more, so I had the time to swap my gaming rig to Linux, and here we are.

    Technically, I guess I could still put the 5700 XT back in, and it would probably work better there than in my media server, since Nvidia seems to have better isolation support in virtualized environments. But I haven’t bothered, mostly because getting the current card to work on my rig was a pain, and I don’t feel like taking apart two machines to play hardware musical chairs.


  • KeePass is a great way of managing passwords; I use KeePass as well. I also use Syncthing to sync my password database across all devices, with a server acting as the “always on” device, so I have access to all passwords at all times. It works amazingly well, because Syncthing can also be set up so that when a file is modified by another device, it makes a backup of the original file and moves it to a dedicated folder (with retention settings so old versions get cleaned up every so often). Life is so much easier.

    For photo access you can look into Immich. It’s a little more of an advanced setup, but I have Immich looking at my photos folder in Syncthing on the server and using that location as the source. This allows me to use one directory for both photo hosting and backup/sync.


  • I hard agree with this. I would NEVER have wanted to start with a containerized setup. I know how I am; I would have given up before I made it past the second LXC. Starting with one generalized server that does everything and then learning as you go is so much better for beginners. Worst-case scenario, they can run Docker as the containerized setup later on and migrate to it. Or they can do what I did: start with a single-server setup, move everything onto a few drives a few years later once comfortable with how it all works, nuke the main server and install Proxmox, and hate life for 2 or 3 weeks while learning how it works.

    Do I regret that change? No way in hell. But there’s also no way I would recommend a fully compartmentalized or containerized setup to someone just starting out. It adds so many layers of complexity.