

SteamDB lets you filter out games with fewer than x reviews, which I’ve made liberal use of over the years.


I’d just take a look at the Steam Deck Verified pages as that’ll give you a good idea about a game (at least through Proton).
Or S is a straitjacket for TempleOS users


Apple already has the Game Porting Toolkit, which is built on CodeWeavers’ Wine/CrossOver - its D3DMetal layer can run a lot of Windows games, much like Proton’s DXVK/VKD3D. MoltenVK is still a little behind being able to fully back VKD3D on macOS, though, so it’s not as smooth sailing as Proton.
The biggest issue is that Apple are still hoping developers will spend the time converting shaders to Metal, implementing Game Center, UI, and accessibility features, etc., so the game feels like a native app.
Which is dumb. As was Metal (they should have just built Metal as a Vulkan abstraction layer).
Valve took the smarter route: they’d love developers to use the Steamworks SDK, but with the Steam Overlay they can still offer a native-feeling experience either way.
Here’s hoping the Steam Machine etc. is incredibly disruptive, because if it’s a decent workstation too, there’s a dwindling list of reasons not to use Linux (Adobe / Affinity / Office / AutoCAD / Minecraft Bedrock / Fortnite).


CVEs don’t get issued “resolved” statuses… they are either reserved, published, or rejected (technically NVD has a few extra statuses for published ones). That’s just junk data in the tool you’re using. Use authoritative sources like cve.org or nvd.nist.gov.
You can see the CPEs on NVD and they’re old versions of Plex (and were old when the vulns were published).
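If you want to sanity-check what a scanner claims, you can pull the record straight from NVD. Here’s a rough Python sketch (the JSON field names are from NVD’s CVE API 2.0 schema as I understand it, and the CVE ID is just a placeholder - swap in whichever one you’re checking):

    # Fetch a CVE record from NVD and print its status plus affected CPEs.
    import json
    import urllib.request

    cve_id = "CVE-2020-5741"  # placeholder - substitute the CVE in question
    url = "https://services.nvd.nist.gov/rest/json/cves/2.0?cveId=" + cve_id

    with urllib.request.urlopen(url) as resp:
        data = json.load(resp)

    cve = data["vulnerabilities"][0]["cve"]
    print(cve["id"], "-", cve["vulnStatus"])  # e.g. "Analyzed" or "Modified", never "resolved"

    # Affected products and version ranges live in the CPE match criteria.
    for config in cve.get("configurations", []):
        for node in config["nodes"]:
            for match in node["cpeMatch"]:
                print(match["criteria"], match.get("versionEndExcluding", ""))

The vulnStatus and the CPE version ranges are what tell you whether a finding is actually relevant to the version you’re running.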


You’re aware those CVEs are only relevant for ancient versions of Plex and were fixed long ago?


You’re going to need to back up your claim, otherwise you might as well be lying: there’s no CVE like this that I can find, nor any public disclosure.
Plex have a bug bounty program and a responsive security team too.
Post your security report.


Oh, I know (I am a greying wizard), but why should the editor care? In theory, it should assume XCF, with the opened JPEG as the first layer.
Instead, it’s gotchas and RTFM. Which is sadly a very poor approach when developing a tool used by creatives who are vastly less likely to RTFM than the engineers making the tool.


Manually adding alpha channels to layers… I’ve seen so many people knock their heads against GIMP because, for whatever reason, they didn’t just add the channel by default. (Okay, sure, it’s probably the default if you’re starting with a blank file, but the background layer doesn’t have one, and if you start by opening a JPEG, then subsequent layers won’t have alpha because… reasons…)
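If you’re stuck with a stack of alpha-less layers, here’s a minimal Python-Fu sketch for GIMP 2.10’s scripting console (assuming the standard gimp-drawable-has-alpha / gimp-layer-add-alpha PDB procedures; gimp and pdb are already in scope in the console):

    # Run from Filters > Python-Fu > Console in GIMP 2.10.
    # Takes the first open image and gives every layer an alpha channel.
    image = gimp.image_list()[0]

    for layer in image.layers:
        if not pdb.gimp_drawable_has_alpha(layer):
            pdb.gimp_layer_add_alpha(layer)

    gimp.displays_flush()  # refresh the canvas so the change shows up

But of course, the whole point is that a new user should never have to know any of that.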
I don’t think it’s because they don’t have UX designers; it’s because they only solicit feedback from existing users rather than researching the new-user experience and watching how a first-time user gets on with the program.
I also think very few Adobe or Affinity power users get stuck into GIMP etc because they bounce off it so quickly. So they never get feedback from the very users they want to convince to move over.


Inkscape and GIMP etc are fine tools in their own right (I’ve had them installed for years), but where things have always broken down is working in larger teams towards a bigger goal.
Inkscape, GIMP, Krita, LibreOffice is an awful chain when you compare it to, say, Affinity, where you can shift between vector, pixel, and layout workflows within the same tool (or copy and paste seamlessly across Adobe tools).
Until the FOSS community sits down and works with creatives and end users who don’t use the tools (which Audacity did thanks to Tantacrul and the results speak for themselves), we’ll be stuck with proprietary tools.
The problem is that when new users turn up to give feedback to, say, Inkscape about some of its weirdness - opening a blank doc every time the app launches, separate tabs for fill and stroke colour, fonts changing when you backspace out to an empty text box, blah blah - the community goes “skill issue” or “this isn’t Adobe”.
Yet they fail to understand the design decisions behind why other products behave in more obvious ways - those products try to be relatively self-explanatory and to align with user expectations as much as possible.
Tantacrul did a great talk at the FOSS Backstage Design conference that’s really worth watching if you’re interested in the topic.
I’d say the second one was even harder too.


Isn’t the 64-bit Steam client due to drop for Windows imminently? They supposedly end 32-bit support at the start of next year.


Really well articulated.
Valve have enabled a critical mass of “target platforms” that enables both the community and developers to get things working on Linux, which all other distros are about to benefit from.
I’m likely going to buy all the new Valve hardware on principle. The Deck is incredible, but I still have my beefy gaming rig. But my living room wouldn’t mind a Steam Machine (and my girlfriend is definitely after both a Steam Frame and a Steam Controller 2).
I’m taking time off work in a couple of weeks and I’m moving over to Linux completely - I too have felt the inertia of dual booting and find myself in Windows far too often.


The failure of the original Steam Machines is why Valve hosted the Khronos Group at their offices to kick off Vulkan and funded LunarG etc. in the early days to get things moving quickly.
Valve took their time but this new hardware range is based on years of learning and solving the problems from their original foray into hardware and Linux for gaming.
And I’m so thankful for it!


And while the Link hardware was cancelled, they still put out updates for it. I think the last update was only a month or two ago.
The original Steam Controller also got cancelled, but they’re still shipping updates and supporting that device too.


GCompris or TuxPaint are great for younger kids. They’re free/open source and have versions available for Windows, macOS, and Linux.


Or spin up TES3MP with some friends and experience it together!


Have you had anyone with security experience look at this thing? There are a lot of really questionable practices in your schedule shell scripts. I especially find how you’re handling VPN secrets kinda worrying. And the backup_challenge_clients.sh script isn’t robust at all. Your nginx config has a few bad choices too, like the lack of try_files and the \.php$ regex. It’s definitely not hardened, so I hope people don’t put this Internet-facing.
I’ve spent like five minutes in the GitHub repo to get a feel for the project’s maturity. Personally, I don’t think this is suitable for actual use yet.
If you’ve not done any security assessments on your project yet, you might not want to (a) call it “Safe”box or (b) start charging money for it until you do.
I worry you’re setting yourself up for a hard-to-shake-off embarrassment should a nasty vuln be found. Maybe a name like “selfbox” etc that drops the connotation of security would be safer.
Edit: Kudos on the project website though! Looks fricking gorgeous.


I’d rather have a Mac than a Windows box. At least you get a proper shell (zsh or bash - zsh is the default now, I think), Python installed by default, the ability to install package managers (MacPorts, Homebrew), coreutils, and so on, and most FOSS software from the Linux world runs since Macs are UNIX at heart.
I’m pretty sure cd isn’t even in coreutils but is implemented by shells as a wrapper for chdir/fchdir, which is part of the kernel. Which has always bugged me, since you can’t reliably pipe or redirect to cd - shells handle it differently: it doesn’t read stdin, or the last component of the pipeline runs in a subshell so it doesn’t affect your current shell, blah blah.
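A quick Python illustration of why cd can’t be an external program (assumes a POSIX system with /tmp):

    # chdir(2) only changes the working directory of the process that calls it,
    # which is why cd has to be a shell builtin rather than a separate binary.
    import os
    import subprocess

    print("before:", os.getcwd())

    # A child process can cd wherever it likes; the parent's cwd is untouched.
    subprocess.run(["sh", "-c", "cd /tmp && pwd"])
    print("after child:", os.getcwd())   # unchanged

    # Only a chdir() made by *this* process moves this process.
    os.chdir("/tmp")
    print("after os.chdir:", os.getcwd())  # now /tmp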