

Plot twist: they can, and will, do it even if you opt out. The only thing that changes is that you won’t get anything out of it. Not that it would have been a significant return to begin with.


I was being generous. For some reason, when I try voice recognition it triggers as if I’m speaking Japanese. And when I actually try to speak Japanese, I get English gibberish.


Hey, when I have an Iron Man Jarvis-like assistant chatting me up, running locally, and never ever messing anything up, I’ll be impressed.
So far I have semi-competent voice transcription, borked understanding, incorrect actions 4/5 of the time, underwhelming if not outright broken output, and most of the time this bad version depends on a datacenter that’s aiming to obliterate a star’s worth of power every two hours.
I WONDER why this is not seen as impressive.


Better than seeing weird letters and ’80s-style colored geometric shapes sliding around.


Not only is it actually happening, it’s actually well researched and mathematically proven.


There are better ways to live. But we’re used to a certain level of comfort, which includes not doing the many, many upkeep tasks needed to grow food, maintain a home, clothing, etc., so we trade some time for currency, which is then traded with other people, and the leftover currency lets us indulge in fun things that are also complex and high maintenance, so they’re done by others.
Well, that’s the theory. In practice, working a full-time job barely, if even, covers the minimum expenses required to live, which keep going up anyway, so you have to work more just to barely get by, which thankfully will let you forget that you won’t make anywhere near enough money for leisure time. Good thing you won’t have any, eh?
Sigh. Knowing we have the technology, right now, to cover all basic needs, including food and housing, for cheap, but still go along with the charade of inflation so that a few select individuals can extract all our time from us is really sad.
It mostly did, yes. But when a big issue pops up, X still gets the occasional patch.
And, since this is a bit of a hot topic it seems, that sounds fair to me. X is the past, wayland is the future. I’m just annoyed at people glossing over the reasons not everyone can move on.
While it has certainly wound down over time, XOrg is still maintained; the last fix was released in September 2025. Is it enough? It never is. But that’s not really an argument for moving from “working” to “not working as well” for now.
Yeah, I know of such “solutions”. But what is the point of forcing the change when it doesn’t bring me tangible benefits, brings significant downsides, and only some of those downsides have half-useful workarounds?
I have no problem with wayland existing or with it becoming the new standard, but forcing people to move under these circumstances seems a bit silly, especially when some issues stem from people having hardware from one manufacturer that represents around 75% of general consumer systems (according to the Steam survey, which may or may not be representative but sure covers a lot of people).
Thankfully, at least with the distributions I use, switching back and forth is trivial. But given the circumstances, I don’t really understand the extremely heavy push.
What are you talking about? You can copy-paste from terminal programs to GUI programs and vice versa like everywhere else (with the terminal of course needing CTRL + SHIFT + C / V, which as we know is historical to Unix terminals). I’ve been doing that for years, and so does my family. It works just fine.
I’m not talking about copy/pasting from the terminal emulator, thank you very much. Just run VIM and have it copy/paste from the global clipboard without setting up esoteric, sometimes DE-dependent stuff, and you’ll understand.
And bringing up Nvidia now really is bending over backwards to paint Wayland as bad when it’s painfully obvious it’s the driver’s fault.
Sure. I did not say it was wayland’s fault. Or anyone else’s, really. I explained why some people cannot “just move on to wayland already, you nincompoop”, with very tangible issues that still prevent them from doing so. Who is at fault is of no consequence here. If I switch to wayland, I lose features, I have a broken desktop, and throwing away thousands’ worth of equipment because “it’s the future” does not sound that great. It’s just a matter of fact. Whether it’s wayland’s fault, plasma’s implementation’s fault, nvidia’s fault, or anyone else’s is irrelevant to the user experience here.
People can’t go “stop using X and use wayland” and then ignore the issues raised by saying “no, that issue you’re having is not a big issue”, “that issue you’re having is not wayland’s fault”, “that issue you’re having does not concern most people”, etc. And reading replies in this thread, it seems people have a hard time imagining circumstances beyond their own.
That sounds more like escape sequences not being interpreted, but maybe? It’s a mess.
Basically, in some implementations (it’s true for at least KDE Plasma), the console app is never seen as “active” (the terminal emulator is), and as such can’t access the clipboard, something like that. There are third-party programs you can use, and plugins for things like VIM, but when you go a step further with remote clipboards it’s even worse. And even when solutions exist, there are weird caveats like “it will work all the time except if you’ve clicked somewhere in the past few seconds” or something.
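To give an idea of the kind of workaround I mean, here is a rough sketch (assuming the wl-clipboard utilities are installed and a Wayland session is running; the helper names are mine, not from any particular plugin): each console program ends up shelling out to wl-copy / wl-paste instead of just talking to the clipboard itself.

```python
# Rough sketch of the per-app workaround: shell out to wl-clipboard's
# wl-copy / wl-paste instead of the console app using the clipboard directly.
# Assumes wl-clipboard is installed and a Wayland session is running.
import subprocess


def copy_to_clipboard(text: str) -> None:
    # wl-copy reads the clipboard payload from stdin
    subprocess.run(["wl-copy"], input=text.encode(), check=True)


def paste_from_clipboard() -> str:
    # wl-paste prints the current clipboard contents to stdout
    result = subprocess.run(
        ["wl-paste", "--no-newline"], capture_output=True, check=True
    )
    return result.stdout.decode()


if __name__ == "__main__":
    copy_to_clipboard("hello from a console app")
    print(paste_from_clipboard())
```

And that’s the good case: under X, the same programs just use the normal selections without any of this per-app plumbing.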
I’m sure things will improve over time, but “we’re not there yet”.
I can’t copy/paste from a terminal program to a GUI program under wayland without jumping through hoops and configuring every individual program to use some variant of a DE-specific utility that bypasses wayland’s model to peek/poke into the clipboard.
That’s not a minor feature to me. And in my case (and probably some other people’s), trading basic copy/paste for not-yet-implemented differential DPI scaling does not sound too great.
Some people are adamant about not switching, but I swear some people are so adamant about forcing everyone else to switch, without even considering that their use case might not match other people’s, that it’s infuriating. It’s not like me staying on X will degrade everyone else’s experience of the new shiniest thing.
Distributions moving to wayland might be good in the very long term, but for now, when you have a 3080 Ti (a relatively recent card) and it breaks basic desktop composition when switching to wayland, telling people “just throw it out and buy another card instead of keeping your currently working system” is not going to help anyone.
There are still issues with wayland that do not exist on X11. I’m talking about last-gen consumer-grade hardware breaking basic applications like, who knows, a web browser. Meanwhile, the “upsides” are extremely marginal to a lot of people. Per-screen scaling isn’t implemented using proper DPI in most implementations, variable refresh rate is not something most people care about (I sure don’t care that my second monitor is capped at 120Hz instead of 144Hz because of my first monitor), etc.
So, yeah, for some people, it’s not a matter of preference, it’s a matter of having a stable, working system vs. a broken system where basic features are not a given.
If you took an Uber and the car was a horse-drawn carriage and your seat was a hole in a rotted plank, you’d complain.
Yeah, so, switching to wayland still breaks copy/paste from terminal apps, still requires me to disable all hardware acceleration lest firefox freeze and plasma’s effects end up visually broken, and it randomly swaps my screens on each boot.
Meanwhile, no issue at all on X. I’ll still wait a bit.


Who’s doing the asking there? Neither my laptop nor my phones asked anything.
According to the settings on my current phone, the automatic setting will decide by itself to limit the maximum charge overnight, then plan for a full charge around the time my alarm should fire.
But, again, that’s the kind of micromanagement that would yield a tiny fraction of “maybe improvement” over the lifetime of the damn thing. I’d rather have a device that works all the time for 6 years than a device that’s sometimes undercharged for 6.1 years.


Nah, I can’t be bothered with that. And the only device whose battery I really had issues with was a seven-year-old laptop, years ago. The BMS and software will almost always know better than the user these days.


The point is that I never had to care about battery management for years. I just leave the phone doing its thing. Not that it’s useful or not useful to do so.
The whole point is that I leave that in the hands of people who know.


Just live in a microwave. Problem solved!


I haven’t watched the video yet, but my phone goes the opposite way. It runs a slow charge overnight when it figures that will be enough for it to be fully charged the next morning.
We really should let the electronics and their software take care of these little things.
I’ll never set foot in that hellhole again, but it would be funny if this was something he posted on twitter and grok showed up to “correct” him.