• 0 Posts
  • 656 Comments
Joined 1 year ago
Cake day: March 8th, 2024

  • I found this post confusing because on the face of it, it sounds like you agree with me.

    I mean, yeah, HEAD and head should overwrite each other.

    As you say, only technical command-line users care about case sensitivity. So no, it shouldn’t matter to the nontechnical user. And because the nontechnical user doesn’t care about the distinction, something called “head” in any capitalization shares a name with anything else called “head”. The rule is that items within a directory have unique filenames, so “head” and “HEAD” aren’t unique (there’s a quick sketch of that collision at the end of this comment).

    The issue isn’t that the names are case insensitive, the issue is that two applications are using the same name in the same path.

    If we’re not careful that’ll lead to a question about whether consolidating things in the Unix-style directory structure is a bad idea. I normally tend to be neutral on that choice, but you make a case for how the DOS/Windows structure that keeps all binaries, libraries and dependencies under the same directory at the cost of redundancy doesn’t have this problem to begin with.

    But either way, if two pieces of software happen to choose the same name they will step over each other. The problem there is neither case sensitivity nor case insensitivity. The problem is going back and forth between the two in a directory structure that doesn’t fence optional packages under per-application directories. As you say, this is only possible in a very particular scenario (and not what the post in question is about anyway).
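
    Just to make the collision concrete, here is a minimal, purely illustrative sketch. The class and names are mine and it is not modeled on any real filesystem; it only shows the “unique within a directory” rule applied with case-insensitive matching:

    ```python
    # Purely illustrative: a directory that enforces unique names under
    # case-insensitive matching. Not modeled on any real filesystem.
    class CaseInsensitiveDir:
        def __init__(self):
            self._entries = {}  # casefolded name -> name as originally typed

        def create(self, name: str) -> None:
            key = name.casefold()
            if key in self._entries:
                raise FileExistsError(
                    f"{name!r} collides with existing {self._entries[key]!r}"
                )
            self._entries[key] = name  # display name preserved as typed

    usr_bin = CaseInsensitiveDir()
    usr_bin.create("head")       # the coreutils binary
    try:
        usr_bin.create("HEAD")   # the other package's binary
    except FileExistsError as e:
        print(e)                 # 'HEAD' collides with existing 'head'
    ```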


  • You are right, I keep doing that.

    Bugs and security problems aren’t bad UX, they’re a backlog.

    You may not be able to afford the implementation, but that’s not the same as arguing the feature has no value. You want to argue that case insensitivity would be better but it’s too hard/problematic to implement? I can have that conversation.

    Arguing that it’s the better option in general? Nah, lost me there.

    Sorry, I said last word and then came back, but I feel we’re closer to meeting in the middle now, so maybe worth it. All yours again. This time I’m gone for reals.


  • OK, but you see how you’re saying “there is no standard implementation, so the solution is not having the feature, as opposed to selecting a standard”.

    That’s wrong. It’s just bad implementation. Or, rather, it’s bad prioritization of UX, which is then bad implementation by default.

    Also, making case sensitivity a user toggle is not the same as having no case insensitivity at all. We know case sensitivity works technically; it’s case insensitivity that takes the additional work of making certain characters read as equivalent. I don’t mind if grandma wants to set her documents folder to be case sensitive to hack the world (there’s a rough sketch of that toggle at the end of this comment). I mind that there is no feature keeping her from being confused about which file she’s selecting, just because the engineers didn’t like having to deal with edge cases.

    Alright, I’m getting trauma flashbacks now. I think we’ve established our positions. Happy to give you the last word.
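
    For what it’s worth, the toggle I mean is roughly a per-directory flag that changes how names are compared. This is a hypothetical sketch with made-up names, not any real filesystem’s API:

    ```python
    # Hypothetical: a per-directory case-sensitivity toggle. Default is
    # insensitive; the terminal crowd can opt back in where they want it.
    class Directory:
        def __init__(self, case_sensitive: bool = False):
            self.case_sensitive = case_sensitive
            self._entries: dict[str, str] = {}  # lookup key -> stored name

        def _key(self, name: str) -> str:
            return name if self.case_sensitive else name.casefold()

        def create(self, name: str) -> None:
            key = self._key(name)
            if key in self._entries:
                raise FileExistsError(
                    f"{name!r} already exists as {self._entries[key]!r}"
                )
            self._entries[key] = name

    documents = Directory()                       # grandma's default
    hacker_lair = Directory(case_sensitive=True)  # opt-in for hacking the world

    hacker_lair.create("Makefile")
    hacker_lair.create("makefile")                # fine in a sensitive directory
    documents.create("Letter.txt")
    try:
        documents.create("letter.txt")            # rejected in an insensitive one
    except FileExistsError as e:
        print(e)
    ```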


  • OK, no, but yes, do that.

    Yes, prioritize making sure that grandmas are not confused by case sensitivity over bug-free secure software. That’s correct.

    Also do that robustly in the user layer. Why not? That’s cool as well.

    I am a bit confused about how you suggest implementing a file system where two files can have the same user-facing name in document names, file manager paths, shortcuts/symlinks, file selectors and everywhere else exposed to the user, without the file system preventing two files with the same case-insensitive name from existing next to each other. That seems literally worse in every way, and it’s not how filenames are implemented in any filesystem I’ve ever used or known about. I could be wrong, though.

    Point is, I don’t care. If you figure out a good implementation go nuts.

    But whatever it is, it NEEDS to make sure grandma will never see Office.exe and office.exe next to each other in the same directory. Deal?


  • No, hold on, this is not about the OS.

    This is about whether the filesystem in the OS supports case insensitive names.

    That determines whether the GUI supports case insensitive names down the line, so the choices made by the filesystem, and by the OS’s support of the filesystem, must be made with the usability of the GUI in mind.

    So absolutely yes, the OS should decide that some characters are the same as others, not arbitrarily but because the characters are hard to read distinctly by humans and that is the first consideration.

    Or hey, we can go back to making all filenames all caps. That works, too, and fully solves the problem.


  • Arbitrary is the word.

    Arbitrary means you can implement it however you want. The limits to it are by convention. There is no need to go any further than case insensitive filenames. At all. Rolling case insensitive filenames into the same issue is entirely an attempt to make a case against a pet peeve for unrelated reasons.

    You want it to handle the edge cases? Have it handle the edge cases. You want to restrict it to the minimum feature just for alphabet characters? Do that.

    But you do NOT give up on the functionality or user experience because of the edge cases. You don’t design a user interface (and that’s what an OS with a GUI is, ultimately) for consistency or code elegance; you design it for usability. Everything else works around that.

    I can feel this conversation slipping towards the black hole that is the argument about the mainstream readiness of Linux and I think we should make a suicide pact to not go there, but man, is it starting to form a narrative and am I finding it hard to avoid it.
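
    To show how arbitrary (in the workable sense) that convention is, here is a toy sketch of two possible equivalence policies: one restricted to plain ASCII letters, one using full Unicode case folding. The function names are mine; the point is just that a filesystem picks a rule and sticks to it:

    ```python
    # Two possible "these names are the same" policies; both are conventions.
    def ascii_fold(name: str) -> str:
        # Minimum feature: only A-Z/a-z are treated as equivalent.
        return "".join(c.lower() if c.isascii() and c.isalpha() else c for c in name)

    def unicode_fold(name: str) -> str:
        # Full Unicode case folding, edge cases included.
        return name.casefold()

    for policy in (ascii_fold, unicode_fold):
        print(
            policy.__name__,
            policy("README.TXT") == policy("readme.txt"),  # True under both policies
            policy("STRASSE") == policy("straße"),         # True only under unicode_fold
        )
    ```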


    The entire issue is that grandmas don’t type out filepaths.

    When you’re typing filenames, case is easy, because a) you have to press something different, and b) terminal monospace fonts typically look very different in caps and non-caps.

    But in a GUI where you aren’t typing the names out? For a human reading human text, caps and non-caps are interchangeable. So as the name of an icon, case sensitivity is confusing and prone to human error.

    I mean, it’s that in typing, too, because it’s a very easy typo to make and all sorts of mixed-case choices can be hard to remember, but it’s MORE confusing if you end up with just an icon with a name and the exact same icon with the exact same name except for one character in a different case.

    OSs don’t do anything by themselves, but they come bundled with all sorts of standardized applications built on top of them. If case sensitivity is baked into the filesystem, it’s baked into the filesystem. And absolutely no, you can’t put it in at the application level. I mean, congratulations on finding the absolute worst of both worlds, but how would that even work? If I tell an app to use a file and there are two of them with different cases, how would that play out? (There’s a sketch of exactly that at the end of this comment.) You can build it into indexing and search queries and so on when they display more than one result (and that, by the way, is typically extra EXTRA confusing), but you can’t possibly override the case sensitive filesystem.

    Now, character byte codes are a different thing, and it’s true that the gripe in this particular rant seems to be focused almost more on weird Unicode quirks, with the case sensitivity thing mostly a pet peeve he rolls into it, I suspect somewhat facetiously.

    But still, that’s for the OS, the filesystem and the applications to sort out. It’s an edge case to handle and it can be sorted out via arbitrary convention regardless of whether you do case sensitivity for filenames. “Case insensitive means insensitive to other things, too” is not a given at all.
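
    Since I asked how the application-level version would play out, here is a sketch of that ambiguity. It is hypothetical (the helper is mine and the directory listing is faked with a plain list; a real app would use something like os.listdir()), but it shows the awkward spot: the case-sensitive filesystem happily stores both names, so every application has to invent its own disambiguation policy:

    ```python
    # Hypothetical: case-insensitive lookup bolted on at the application level,
    # on top of a case-sensitive filesystem.
    def open_by_name(requested: str, directory_listing: list[str]) -> str:
        matches = [n for n in directory_listing
                   if n.casefold() == requested.casefold()]
        if not matches:
            raise FileNotFoundError(requested)
        if len(matches) > 1:
            # The filesystem allowed both, so the app is left holding the bag.
            raise RuntimeError(f"ambiguous name {requested!r}: {matches}")
        return matches[0]

    listing = ["Office.exe", "office.exe", "readme.txt"]
    print(open_by_name("README.TXT", listing))   # readme.txt
    try:
        open_by_name("office.exe", listing)
    except RuntimeError as e:
        print(e)  # ambiguous name 'office.exe': ['Office.exe', 'office.exe']
    ```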


    Hah. That’s the second absolutely deadpan Average Familiarity instance I’ve run into in a Linux forum this week.

    I mean, no offense to grandma. Plenty of grandmas are computer literate. But the idea of this hypothetical normie Windows user doing anything but double click on an icon (too slowly, with a bit too much pressure on the left mouse button, as if that made a difference, probably having single clicked to select first, just in case) is absurd.

    File names are icon names first and foremost. File paths are a UI element to breadcrumb the location of the currently open file manager/explorer window unless proven otherwise.

    And that is the right answer and how the whole thing should be designed.


  • Case insensitive is more intuitive and MUCH safer.

    You do not want every Windows user to live in a world where Office.exe, office.exe, Offlce.exe and 0fflce.exe are all different files.

    OSs and filesystems aren’t built for programmers, they’re built for grandmas. Programmers just happen to use them. It’s much more sensible to give programmers a harder time fixing bugs and incompatibilities than it is to make the user experience even marginally worse.

    I mean, all due respect for the guy, but that is an absolutely terrible opinion and I will die on this hill.
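
    For what it’s worth, here is a quick check (my own illustration, nothing from the post) of which of those four names plain Unicode normalization plus case folding would actually merge. It catches the case variant, while the look-alike characters (zero for O, l for i) are a separate confusables problem on top of case insensitivity:

    ```python
    import unicodedata

    names = ["Office.exe", "office.exe", "Offlce.exe", "0fflce.exe"]

    def fold(name: str) -> str:
        # A common way to define "the same name": NFKC normalization + case fold.
        return unicodedata.normalize("NFKC", name).casefold()

    groups: dict[str, list[str]] = {}
    for n in names:
        groups.setdefault(fold(n), []).append(n)

    for key, members in groups.items():
        print(key, "->", members)
    # office.exe -> ['Office.exe', 'office.exe']
    # offlce.exe -> ['Offlce.exe']
    # 0fflce.exe -> ['0fflce.exe']
    ```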


    To be clear, they ARE building an AI-forward browser and he is very plain about collecting a ton of user info. The way it’s presented in context is that they intend to plug it into their assistant/agent thing and surface relevant stuff to you on searches (which is the potential ad opportunity the article quotes as if it were the sole goal). But yeah, the implication is that they are collecting data regardless, even if the user profile ends up being used to tailor AI responses to you specifically, to train models or whatever.

    Hearing the guy talk about it I get the impression that he envisions an Apple-like ecosystem where they’re constantly ingesting data and you’re paying them to have their AI services act as a personal assistant and handle purchases and booking for you directly and so on, on top of answering queries.

    I would rather clip my toenails with a rusty chainsaw, myself, but that seems to be the idea.


  • I hate when people post hyperpartisan reporting because it makes me do homework. In this case, you made me listen to almost an hour of a three hour podcast with three techbros chatting about techbro crap in techbro ways. You owe me years of life.

    Anyway, so the conspicuously missing context here is he’s asked if they will let go of the subscription model and go after an ad business model instead and he responds “hopefully not” and clarifies that he thinks the AI differentiator from Google search is that it doesn’t feed people ads.

    He then transitions into saying that you’d need a super hyperspecialized profile for it to make sense and then maybe it could work but they haven’t figured out long term memory well enough for that, which is when he talks about why they’d want to have a browser to build that hyperspecialized profile.

    This is my least favorite type of misinfo, too, because he’s actually kinda saying what they say he’s saying, just out of context. But more importantly, because he says some other shit that is more outrageous, too. For example, when explaining why he thinks the subscription business will grow more than the ad business the way he puts it is that “people see it as hiring someone”, so they’re more willing to spend, and he ponders “how much do people pay for personal assistants and assistant managers and nannies?” and suggests that they’ll provide similar services for cheaper to people who can’t afford human help.

    Which may not be as clickbaity and I get he finds it positive-on-the-aggregate, but is certainly some cyberpunk dystopia stuff that didn’t need the out of context quoting to be a thing.



  • Well, the relationship between Chrome and Chromium in this situation is… interesting and a big question mark.

    Presumably whoever owns Chrome will by default have a remarkable amount of influence on the ongoing direction of Chromium, just by way of having a massive dominant position over the market overnight. Chrome is not just plain Chromium as it is.

    Given that the sale of Chrome would be fundamentally a regulatory constraint it’s also a given that Google would not immediately attempt to re-enter that market (or if they did that they would get a swift spanking all over again).

    So yeah, Chrome is probably valuable. How well you can monetize it decoupled from Google’s advertising business probably depends heavily on who you are. Meta or Microsoft could do that very well, but then they’d be in the same regulatory danger zone Google is. DDG, Brave, Opera or Mozilla would definitely benefit but probably wouldn’t be able to afford it.

    Because we’re on this timeline the more likely outcome is Elon Musk buys it and we go into another round of seeing the shambling, zombified corpse of a thing stumble forward for years while shedding fascist propaganda. I’m trying to decide if Bezos buying it puts us in the regulation danger zone scenario or the shambling fascist zombie scenario. Both?


  • Well, let me solve that for you right away.

    You need neither of these things. Games and entertainment are not a priority if you’re in a “this current economy” type of situation.

    If you already have one, that’s the right one for the money, probably.

    Was Nintendo Life “misrepresenting the value of a Switch 2 over a Deck”? Myeeeeh, not sure. I’ll say I agree with their premise that Steam Deck fans are “Seriously Underestimating the Switch 2”. In somewhat petty, immature ways, as demonstrated very well here. Does the Steam Deck “obliterate the Switch 2”? Probably not, no. I’ll tell you for sure in the summer, I suppose. That said, their listicle is brand shilling as much as this post is.

    Are these two things different, with different sets of pros and cons? Yeah, for sure. It’s even a very interesting exercise to look at the weird-ass current handheld landscape, because it’s never been wider, more diverse or more overpopulated. The Switch 2 and the Deck will probably remain the two leading platforms until whatever Sony is considering materializes, but they’re far from alone, from dirt cheap Linux handhelds to ridiculously niche high-end laptop-in-a-candybar Windows PCs.

    If you want to have a fun thread about that I’m game, but fanboyism from grown men is a pet peeve of mine, and even if I didn’t find it infuriating I’d find it really boring.

    For the record, between these two? Tied for price, Switch 2 will be a bit more powerful and take advantage of specifically catered software from both first and third parties, has better default inputs, a better screen and support for physical games. Current Deck is flexible, hugely backwards compatible, can be upgraded to a decent OLED screen and has fewer built-in upsells.

    And as a bonus round, Windows handhelds scale up to better performance than either, have better compatibility than the Deck and some superior screen and form factor alternatives… but are typically much more expensive and most (but not all) struggle with the Windows interface and lack hardware HDR support.

    We good? Because that’s the long and short of it.




  • That mostly tracks. I think the problem is less the availability of affordable options and more the willingness of the market to take those as a standard, though.

    The affordable options are there. You can get a PS5 starting at 400 bucks (tariffs allowing). That’s a lower sticker price than a launch PS3 and on par with the inflation-adjusted price of a 360. It’s also cheaper than some comparably performant GPUs, let alone an entire PC.

    Problem is, then you’re playing at some variation of upscaled 1080-1440p at 30 to 60 fps, and apparently the PC market thinks that’s for peasants and you should only ever play at hundreds of fps and many megapixels.

    And yeah, the tech hasn’t made those specs available to the human-tier and instead the marketers have gotten really good at giving you FOMO for all the high end features you could be getting instead.

    There is a low end. I think the fact that a Steam Deck is ostensibly a full handheld PC starting at 420 bucks is absurd. Not gonna raytrace much on it, though.

    Do I think games should all be made for Steam Decks and PS5s and not have any features that require beefier hardware? Well, seeing my point about loving visual features I’m going to be a no, but I also think we need to get better at managing the FOMO as a group.

    Or the hardware needs to find a new route to get us back on the Moore’s Law curve. Either/or.


  • Yeah, see, on the features argument I’m gonna disagree, just to disrupt the lovefest.

    I LOVE new graphics features. I’ve been messing with raytracers and pathtracers since back when they were command-line DOS applications. Real time path traced visuals? Gimme. I don’t have a problem with alternate ways to feed pixels or frames, even. All of that is just a byproduct of all the tensor acceleration they are using for AI anyway, I’m just glad there’s a gaming application for it, too.

    If I’m going to question an unreasonably high technical paradigm it’s going to be what we consider “gaming monitors” now. Who needs 4K at 240Hz? There are 500fps displays out there now, which is a frame every 2 milliseconds. Back in the 1080p60 days that was your antialiasing budget.

    But I’m also willing to acknowledge that other people have different priorities and would rather maximize that for specific applications instead of visual fidelity.

    And that’s the real problem these days, there is no standard spec. Different people are taking different elements of rendering to absurd extremes. Crazy framerates, crazy resolutions, crazy real time pathtracing, crazy scope, crazy fidelity, crazy assets, crazy low power draw on handhelds. You’re supposed to be servicing all of those needs at once PLUS this is now the hardware that runs the last two tech gold rushes driving insane speculative investment.

    That is ultimately neither here nor there, but if we’re all going to have to accept that there will never be a fire-and-forget, max-everything-out user-level hardware spec ever again the least manufacturers could do is prioritize the insane, wildly expensive new prosumer segment not catching on actual fire if you look at it sideways.


  • Yeah, that’s my point about the microwave thing. It’s not that the total power is too much, it’s that you need more reliable ways to get it where it needs to be.

    I don’t understand how massively ramping up the power led to thinner wires and smaller plugs, for one thing. Other than someone got fancy and wanted prettier looking cable management over… you know, the laws of physics. Because apparently hardware manufacturers haven’t gotten past the notion that PC enthusiasts want to have a fancy aquarium that also does some computing sometimes. They should have made this thing a chonker with proper mains power wires. It’s called hardware for a reason.

    But I agree that the other option is completely changing how a PC is built. If you’re gonna have a GPU pulling 600W while the entire rest of the system is barely doing half of that, maybe it’s time to rethink the idea of a modular board with sockets for cards, CPU and RAM and cables for power delivery. This entire structure was designed for 8086s and Motorola 68000s back when ISA slots were meant to hold a plug for your printer, a hard drive controller and a sound card. Laptops have moved on almost entirely from this format and there are plenty of manufacturers now building PCs on laptop hardware, Apple included.

    Maybe it’s time you start buying a graphics card with integrated shared memory and a slot to plug in a modular CPU instead. Maybe the GPU does its own power management and feeds power to the rest of the system instead of the other way around.

    I don’t know, I’m not a hardware engineer. I can tell the current way of doing things for desktop PCs is dumb now, though.


    That is a broader and well-litigated issue at this point. Going for more power draw wouldn’t be a problem in itself (hey, your microwave will pull 1000W and it doesn’t spontaneously combust). The problem is they designed the whole thing around what would safely (if poorly) manage 350W while staying tidy, and instead they are pushing 600W through it with meaningful cost-cutting shortcuts.

    That is what it is, and I think it’s a more than reasonable dealbreaker, which leaves this generation of GPUs down to a low-tier Intel card with its own compatibility issues and a decent but expensive mid-tier AMD offering. We are at a very weird impasse and I have no intuition about where it goes looking forward.