

While I agree with the rest, does Lutris have backup options? I never actually checked, but don’t remember seeing any of that
They sure do, unless you missed a parenthesis and somebody wants to point that out ;)
Sure, but the point is to be realistic and not put undue weight on the developers, right? Releasing binaries is generally much less restricted than releasing source code when proprietary dependencies are involved, and it’s easier to release binaries “clean” than source code.
I haven’t seen that and haven’t tried it, but I know fish tartare is a thing, so that could be it
It’s not being made “as painful as possible”, it’s just manual. Arch isn’t a distro that’ll preconfigure things for you so everything’s plug’n’play; it’s a distro that’ll give you access to everything and the power to use it however you like, but with that comes the expectation and responsibility to manage those things.
Installing Arch manually is simply a good lesson in how your system is set up and what parts it’s made up of, not least because you’re free to remove and swap out those parts.
And sure, there’s no magic bullet to make sure a new user understands everything they did, but I think in the end, if you’re not willing to read, learn and troubleshoot, you might just want a different distro.
I don’t think either has ntsync support enabled by default, but it’s supposed to have better accuracy or performance, thanks to putting the needed APIs directly in the kernel, right?
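For what it’s worth, a rough way to check whether your kernel exposes ntsync at all (assuming the driver registers a /dev/ntsync character device, which is what I’d expect from the upstream module, so treat that path as an assumption):

```python
import os

# Assumption: the ntsync driver creates a /dev/ntsync character device when
# the module is built and loaded; many distros still ship it disabled.
if os.path.exists("/dev/ntsync"):
    print("ntsync device present - a Wine/Proton build with ntsync support could use it")
else:
    print("no /dev/ntsync - the kernel lacks the driver or the module isn't loaded")
```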
Isn’t installing extensions in it also a pain, since the Chrome Web Store doesn’t let you install from it?
To answer my own question, I looked it up: there’s an extension that lets you do that, alongside some flag changes, so I guess it’s not too bad… But it’s another step on the list of things you’d want to do as a user.
I believe Steam Deck got a completely new interface that also later replaced the old Big Picture mode. It also of course has a more complex setup, since it’s not running in a desktop environment, but that’s more about the overlay and running games.
Not the same person and cba to get a timestamp right now, but it’s the 80% rule - the electrical stuff isn’t designed to deliver the rated amperage continuously for hours on end, so for car charging, you’re apparently supposed to limit it to 80%. Now, 80% of 50 isn’t 42 but 40, so not sure if it’s a case of 80% not being a precise number or a mistake here, but it roughly checks out.
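Just to spell the arithmetic out (with the 50 A rating and the 80% continuous-load rule as the assumed inputs):

```python
# 80% rule: continuous loads (like hours of EV charging) should stay at or
# below 80% of the circuit's rated amperage.
breaker_rating_amps = 50                       # assumed example rating
continuous_limit_amps = 0.8 * breaker_rating_amps

print(continuous_limit_amps)                   # 40.0 -> 40 A, not 42 A
```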
I’m not sure which puzzles you’re referring to - do you mean stuff to reach an ending, or the obscure, very much optional, deep secrets?
It’s been a while since I played it, but I don’t remember grindy puzzles in the main content, bar the big one, but that one felt exhilarating to figure out and solve.
As for combat, it is difficult, but I remember beating the whole game without turning down the difficulty (which I remember being a thing), so it seemed fine to me… But yeah, people misrepresenting a game is always a risk.
I will point out that (IIRC) Tunic does have significantly more mechanical progression than some other examples, like Outer Wilds or Toki Tori 2, but they’re all lovely games
This feels like surreal memes before they turned into almost entirely misspellings and other repeat jokes.
The issue is that the privacy policy changed on an old game people bought long ago, and now they’re not allowed to play the game without agreeing to the changes.
I had the impression cloud was about the opposite: detaching your server software from physical machines you manage and instead paying a company to provide more abstracted services, with the ideal being high scalability through images that can be deployed en masse independently of where they’re hosted and on what hardware. Paying for “storage” instead of renting a machine with specific hardware and software, for example.
Yes, Apple should allow that, and Sony should allow that. Your “gotcha” seems pretty stupid, because “allow” doesn’t mean “facilitate” - it’s not Apple’s responsibility to make those things work on their devices, but Apple is going out of their way to prevent individuals from making those things happen on their own.
If you license your project under the GPL and somebody submits some code (say, through a pull request) that ends up in the library, you are now also bound by the GPL for that code, meaning you also have to publish the source of any derivatives.
The way to avoid that is to use something like a CLA, requiring every contributor to sign an agreement giving you special rights to their code, so you can ignore the GPL in relation to the code they wrote. This works, but it’s obviously exploitative, taking extra rights to contributions while giving less in return.
It also means if somebody forks the project, you can’t pull in their changes (if you can’t meet GPL terms, of course), unlike with MIT, where by default everybody can make their own versions, public or private, for any purpose.
Though it’s worth noting, if you license your code under MIT, a fork can still add the GPL license on top, which means if you wanted to pull in their changes you’d be bound to both licenses and thus GPL terms. I believe this is also by design in the GPL license, to give open-source an edge, though that can be a bit of a dick move when done to a good project, since it lets the GPL fork pull in changes from MIT versions without giving back to them.
I think the trick might be that nothing stops you from using more than one 32-bit integer to represent an address, and the kernel maps memory for processes in the first place, so as long as each process individually can work within a 32-bit address space, the kernel can still hand out that extra memory across processes.
I do suppose that on some level the architecture, as in the CPU and/or motherboard, needs to support addressing memory with more than 32 bits, which is also what somebody else replied, and that seems to have been available since 1999 on both AMD and Intel.
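A toy sketch of that idea (nothing like how a real kernel stores page tables; the process names and frame numbers are made up): each process still only sees a 32-bit virtual address space, but the page table can point a virtual page at a physical frame above 4 GiB, because the frame number is just a wider integer.

```python
PAGE_SIZE = 4096  # 4 KiB pages

# Per-process page tables: virtual page number -> physical frame number.
# Frame numbers are plain ints, so nothing stops them from pointing above
# the 4 GiB mark, even though each process only uses 32-bit addresses.
page_tables = {
    "process_a": {0x00000: 0x00100},   # frame well below 4 GiB
    "process_b": {0x00000: 0x123456},  # frame above 4 GiB (needs >32-bit physical addressing, e.g. PAE)
}

def translate(process: str, virtual_addr: int) -> int:
    """Translate a 32-bit virtual address into a (possibly wider) physical address."""
    assert 0 <= virtual_addr < 2**32, "each process still only sees a 32-bit address space"
    vpn, offset = divmod(virtual_addr, PAGE_SIZE)
    return page_tables[process][vpn] * PAGE_SIZE + offset

print(hex(translate("process_a", 0x123)))  # 0x100123 - fits in 32 bits
print(hex(translate("process_b", 0x123)))  # 0x123456123 - physical address past 4 GiB
```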
Thinking about it… Since I’m not planning on switching to windows, and (non-VAC?) bans on steam are game specific, would I even care if I got banned from a game for using it on Linux?
Though I probably also won’t be playing this game, since I have no prior interest, so I’m biased.
Is it? Or did they choose Arch because of the ease of setting it up with all the latest software the community was already packaging?
I would assume they sent the numbers from after the draw to a point before the draw, which means the future self doesn’t need their own message to learn the numbers.