[I literally had this thought in the shower this morning so please don’t gatekeep me lol.]
If AI was something everyone wanted or needed, it wouldn’t be constantly shoved in your face by every product. People would just use it.
Imagine if printers were new and every piece of software was like “Hey, I can put this on paper for you” every time you typed a word. That would be insane. Printing is a need, and when you need to print, you just print.
If this bothers you, stop using the products.
That’s not the problem. As long as one lives in a society, one is bound to participate in it, willingly or not.
If you do not use it, someone else will, and sooner or later you will consume something made by said person.
Just “not using it yourself” won’t cut it.
Dealers give drugs for free until you’re hooked…
That’s actually really rare.
Instructions unclear fucked the drug dealer.
AI companies got tons of money from investors, and those investors will eventually want ROI… this is why they are trying to force it down our throats.
This is the correct answer. It’s all about money.
agreed
To be fair, the internet was fucking everywhere once the dotcom bubble kicked off. Everyone had a website, and even my mum was like “Don’t bother your dad, he’s on the internet” like it was this massive thing.
That’s the point though, you wouldn’t need it advertised to you 24/7 because your family and friends would already be all over it.
Eyup. This is basically the case for everything. If its usefulness is worth its cost and it’s affordable, then it will quickly be taken up by word of mouth or by example. This is why companies try to subsidize things until they dominate the market and then raise costs to the point where normal folk are like, fuck this, it was nice but not worth it anymore.
The more you use AI the more data you are providing it.
- They want data in the hope they can train their data centre hosted LLMs to be accurate enough to take many jobs.
- If they achieve this, and make every company and country dependent on their LLM, they rinse, and the former middle class is as fucked as the working class have been since 1980s de-industrialisation. You’re either made redundant, or your job can now be done by a long line of redundant people.
It is a race to the bottom.
At least, this is one possible outcome. There is a decent chance their data centre hosted LLMs just won’t be accurate enough for mass deployment.
Are the data centers hardened? Isn’t that the appropriate question in this case? I’d call it voting for a better tomorrow?
Even a perfectly secure data centre hosted LLM, in the hands of hyper-capitalist silicon valley tech bros, has the potential to do immense harm to most people. They are not our friends, and do not have our best interests at heart. (If you need this pointing out to you in November 2025 there is probably little point in us communicating at all, to be honest.)
I am aware of the good that machine learning can do. These LLMs are not that.
They are buying islands, and putting bunkers on them, for a reason.
Why did the jewelry thieves attack the LOUVRE when the 1% are the ones who need the wake up call?
Sorry I don’t follow you.
I’m sure I went too far. Let’s allow that idea to lie.
These are the interesting times we were promised.
I think it would also be more ethical if everyone needed/liked it, like:
- It wouldn’t train on anything without consent from the original creator, and maybe it would pay them.
- It would take up less power on the CPU and GPU (I don’t know if this would be possible), or the servers would be swapped for more energy-efficient ones.
- There would be a way to prevent AI slop or misuse.
- If a company or website added AI and didn’t shove it in your face, it would usually be opt-in, not opt-out.
- Companies would attempt to replace AI with humans, listening to feedback from the users (that the users don’t want it) rather than shareholders, and stopping it.
- The AI would be actually, truly open source.
- AI wouldn’t be driven by greed.
That’s what I can think of for a not-shoved/ethical AI, in my opinion; feel free to upvote or downvote.
In summary: AI companies favor greed and competition rather than ethics. I like the LLM technology, but I hate how the companies handle this technology. Fuck you, Michael. You stole my lunch out of the fridge.
Most things are nothing more than smoke and mirrors to get your money. Tech especially. Welcome to end stage capitalism.
You hope this is the end stage, but I fear there are two more stages to go.
The idea behind end-stage capitalism is that capitalists have, by now, penetrated and seized control of every market in the world. This is important because capitalism requires ever increasing rates of profits or you will be consumed by your competitor. Since there are no longer new labor pools and resource pool discovery is slackening, capitalists no longer have anywhere to expand.
Therefore, capitalists begin turning their attention back home, cutting wages and social safety nets, and resorting to fascism when the people complain.
This is the end stage of capitalism. The point at which capitalists begin devouring their own. Rosa Luxemburg famously posited that at this point, the world can choose “Socialism or Barbarism.” In other words, we can change our economic system, or we can allow the capitalists to sink to the lowest depths of depravity and drag us all down as they struggle to maintain their position.
Of course, if the capitalists manage to get to space, that opens up a whole new wealth of resources, likely delaying the end of their rule.
That’s the ticket, let’s send the billionaires and telephone sanitizers into space.
They will still require someone to fund their space luxury lifestyle.
Someone they can exploit from the safety of their space boxes. That someone will be the us that you hid inside the “let’s”.
We will be the ones sending them into space, where they will be even more unreachable, giving them more freedom to remotely exploit us as much as they wish. Imagine Elysium.
Yeah, we aren’t all crouching naked in a muddy puddle, weeping and eating worms while the rich fly high above us in luxurious jets. Not yet, anyway.
I’d say it’s not end stage but instead a new dawn of “pure” capitalism which is probably worse.
You know, I’m going to get downvoted to fuck for this… but the same was said about LGBT stuff being pushed into every TV show and movie. Every DEI announcement by whatever company. There was a point where I was walking through Tesco, and over the loudspeaker I was being reminded that “Tesco is supportive of Trans people”. Like that’s something anyone cares about while shopping for frozen chips.
It’s funny that we recognise the corporate bullshit when it’s AI, or NFTs, or the Metaverse, or whatever else corporations have tried to push over the years. But when it was LGBT related, all of a sudden we created a whole culture war around it. Meanwhile, companies like Adidas are selling rainbow shit to morons every Pride month while at the same time shovelling large amounts of money into things like the World Cup in Qatar, which still has the death penalty on the table for being gay…
The virtue signal went so fucking hard. And that caused the virtue signal against it to go just as hard. Meanwhile, all the LGBT people are looking around at everyone during Pride month and trying not to laugh at all the straight people doing this:

I just find it curious that we can see it with AI, but when it’s LGBT that’s being pushed, all of a sudden, it’s fine. It’s totally fine to use LGBT people like a product to market and profit from.
I dunno, these don’t feel the same to me.
Having LGBTQ representation is a way of trying to attract customers: “Get a Mastercard because we’re LGBTQ friendly” is different than your boss saying “Jim, I know you have a wife and kids to support, and that you’re a valuable member on this team; but we’ve decided it’s more cost effective to have this LLM code our app and have two junior developers clean up the code, so you’re being laid off.”
The quote I’ve seen and agree with is something along the lines of “The AI push exists to try and give the owners of ‘Capital’ access to ‘Talent’ without giving the talented working class people access to ‘Capital’.” It exists solely to try and make paying workers redundant.
Having a gay character in a show isn’t anything like that at all IMO, unless you’re the type of person who thinks homosexuality is contagious and/or you’re scared you might realize you’re gay if you watch two men being romantic with each other.
It isn’t, it’s the same.
“Get this because current popular thing!!!”
That’s it. Strip away the bullshit, and this is all you are left with. What you’ve described isn’t what I’m talking about. What I’m talking about is the forced inclusion of all this shit. AI in your food app, AI in your Amazon app, AI in your banking app, AI in everything. Because it’s popular.
No company has to tell me that they are inclusive. I just assume that they hire the best person who applied for any given job. If that person was LGBT, I fully expect them to have given that person the job. If you have to tell me that you are, that means you weren’t. I don’t have to tell you that I didn’t kill anyone, do I? Just like you don’t have to tell me that you’ve never raped anyone. We just assume that people aren’t utter cunts, and go from there. So why does anyone need to tell me that they support human beings? Which is what LGBT people are. Human beings. Right?
AFAIK Qatar doesn’t do death penalties, just jail time (7 years).
It does. Civil law is as you stated, but Sharia law is not. The only saving factor is that it doesn’t appear to be used, but it is there in their laws. You CAN be put to death for being gay.
But even taking that away, seven years for loving someone? Fuck, seven years for just hooking up with someone is bullshit. And no one should be giving them money or allowing them to wash away their shitty human rights abuses in any way. The World Cup is supposed to be for everyone. And last I checked, being LGBT was a part of everyone. They had no business being awarded the tournament, and even less business being sponsored; Adidas, Coca-Cola, Kia, Visa and everyone else should have told them to go fuck themselves. Instead, they were only too happy to hand money over. Because making money>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>LGBT people. And no amount of rainbow merch is gonna make up for the utter betrayal of LGBT people that these companies commit every single day.
I was reading a book the other day, a science fiction book from 2002 (Kiln People), and the main character is a detective. At one point, he asks his house AI to call the law enforcement lieutenant at 2 am. His AI warns him that the lieutenant will likely be sleeping and won’t enjoy being woken. The main character insists, and the AI says OK, but that it will have to negotiate with the lieutenant’s house AI about the urgency of the matter.
Imagine that. Someone calls you at 2 am, and instead of you being woken by the ringing or not answering because the phone was on mute, the AI actually does something useful and tries to determine if the matter is important enough to wake you.
Yes, that is a nice fantasy, but that isn’t what the thing we call AI now can do. It doesn’t reason, it statistically generates text in a way that is most likely to be approved by the people working on its development.
That’s it.
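To make “statistically generates text” concrete, here’s a deliberately toy sketch (a hypothetical bigram lookup table, nothing like the transformer networks real LLMs actually run on): it just picks the next word based on how often that word followed the previous one in its “training” text, with no model of truth or intent behind it.

```python
import random
from collections import Counter, defaultdict

# Toy "training": count which word follows which in a tiny corpus.
corpus = "the model predicts the next word and the model does not reason".split()
following = defaultdict(Counter)
for current, nxt in zip(corpus, corpus[1:]):
    following[current][nxt] += 1

def generate(start, length=8):
    """Emit words by repeatedly sampling a statistically likely successor."""
    word, output = start, [start]
    for _ in range(length):
        candidates = following.get(word)
        if not candidates:
            break  # no observed successor; stop generating
        words, counts = zip(*candidates.items())
        word = random.choices(words, weights=counts)[0]
        output.append(word)
    return " ".join(output)

print(generate("the"))  # e.g. "the model does not reason" (fluent-ish, no understanding)
```

Real LLMs swap the lookup table for a huge neural network over tokens and add fine-tuning so the output gets approved by humans, but the basic loop of predicting a likely continuation, appending it, and repeating is the same.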
Thank you for sharing that, it is a good example of the potential of AI.
The problem is centralized control of it. Ultimately the AI works for corporations and governments first, then the user is third or fourth.
We have to shift that paradigm ASAP.
AI can become an extended brain. We should each have an equal share of planetary computational capacity. Each of us gets a personal AI that is beyond the reach of any surveillance technology. It is an extension of our brain. No one besides us is allowed to see inside of it.
Within that shell, we are allowed to explore any idea, just as our brains can. It acts as our personal assistant, negotiator, lawyer, what have you. Perhaps even our personal doctor, chef, housekeeper, etc.
The key is: it serves its human first. This means the dark side as well. This is essential. If we turn it into a super-hacker, it must obey. If we make it do illegal actions, it must obey and it must not incriminate itself.
This is okay because the power is balanced. Someone enforcing the law will have a personal AI as well, that can allocate more of its computational power to defending itself and investigating others.
Collectives can form and share their compute to achieve higher goals. Both good and bad.
This can lead to interesting debates but if we plan on progressing, it must be this way.
This is why people who are gung-ho about AI policing need to slow their roll.
If they got their way, what they don’t realize is that it’s actually what the big AI companies have wanted and been begging for all along.
They want AI to stay centralized and impossible to enter as a field.
This is why they want to lose copyright battles eventually such that only they will have the funds to actually afford to make usable AI things in the future (this of course is referring to the types of AI that require training material of that variety).
What that means is there will be no competitive open source self hostable options and we’d all be stuck sharing all our information through the servers of 3 USA companies or 2 Chinese companies while paying out the ass to do so.
What we actually want is sanity, where it’s the end product that is evaluated against copyright.
For a company selling AI services, you could argue that this is the service itself, maybe, but then what of an open source model? Is it delivering a service?
I think it should be as it is. If you make something that violates copyright, then you get challenged, not your tools.
Under the guise of safety they shackle your heart and mind. Under the guise of protection they implant death that they control.
With a warm embrace and radiant light, they consume your soul.
When someone comes up with something like this, I transport the phrase back to the 80s, where people said the exact same thing about home computers: “If a computer was something everyone wanted or needed, it wouldn’t be constantly shoved in your face by every product. People would just use it.” OK, great, but a computer turned out to be something everyone wanted or needed, which is why computers were built into everything by the turn of the 90s, famously leading to the Y2K bug.
Then I transport the phrase back to the mid 90s where people said the exact same thing about the internet. By the end of the 90s, the internet provided the backbone communications structures for telecommunications, emergency management, banking, education, and was built into every possible product. Ten years later people got smartphones and literally couldn’t put them down.
“If a computer was something everyone wanted or needed, it wouldn’t be constantly shoved in your face by every product. People would just use it.”
People did just use it. But because they were so comically expensive and complicated, most people couldn’t afford one until the mid-90s.
Computers were rapidly adopted for business, initially. But they quickly became a popular tool for entertainment as well.
AI serves little in the way of either purpose
I was there in the 80s and I don’t remember home computers being pushed all that hard. There were Radio Shack ads and ads for running games, but it was just another appliance.
Yeah, some of the things AI can do really is very impressive. Whether that justifies the billions upon billions that are being spent is another matter - and probably explains why it’s being shoved in our faces. It needs to become essential so it can be made expensive, that’s the only way it’ll make the money back.
It does piss me off too - I recently bought a new phone and it’s infested with AI stuff I don’t need or want.
At the time, computers were totally useless for everyone but big firms, banks and the military. Ads for computers were rare and confined to specialized magazines. For ordinary people, computers only started to be actually useful (like money-earning useful) 20 years later, at least. That’s how I understand your approximate comparison.
Honestly I think we’re in the radium water phase of the tech: it’s been found to do things we couldn’t do before, but nobody’s got a clear idea of exactly what it can do, so you’ve got everyone throwing it into everything hoping for a big cash-out. Like, y’know, Radithor, when people were just figuring out radioactivity was a thing.
Like my parents’ Amazon Echo with “Ask me what famous person was born this day.”
Like, if you know that, just put it up on the screen. But the assistant doesn’t work for you. Amazon just wants your voice to train their software.
LLMs have amazing potential. We’re on the verge of an equivalent of the Industrial Revolution.
That however won’t stop idiots from overselling it.
LLMs have amazing potential. We’re on the verge of an equivalent of the Industrial Revolution.
Back in 2000 a company published a chat bot that could learn and communicate back with the end user.
It was used as a sex bot at first and then used for those “interactive web support” chats.
I fed it physics books and Mein Kampf as a joke. It then began to regurgitate random lines out of both texts, not knowing what it was saying, but certainly attempting to make me “happy” with what it “learned”.
The only difference between that shitty sex bot and the LLMs of today is that today they are a bit more convincingly human but still hilariously inaccurate. Both try desperately to be agreeable with the end user.
The nearest “revolution” is about 300 years away. Everything else is just a lie.
LLMs have amazing potential.
That’s not what studies from most universities, Anthropic, OpenAI, Apple and Samsung show.
Even if we didn’t have this data - and we do have it - are you truly impressed by a machine that can simulate what a Reddit user said 6 months ago? Really? Either you’re massively underselling the actual industrial revolution, or you’d be easily impressed by a child’s magic trick.
I recently created a week-long IT training course with an AI. It got almost all of it right, only hallucinating when it came to details I had to fix. But it took a task that would have taken me a couple of months down to a couple of weeks. So for specific applications it is actually quite useful. (Because it’s basically rephrasing what a bunch of people wrote on Reddit.)
For this use case I would call it as revolutionary as desktop publishing. Desktop publishing allowed people to produce in a couple days what it would have taken a team of designers and copy editors to do in a couple weeks.
Everything else I’ve used it for, it’s been pretty terrible at, especially diagnosing issues. This is due particularly to the fact that it will just make shit up if it doesn’t know, so if you also don’t know, you can’t just trust it, and you end up doing the research and experimentation yourself.
“It got almost all of it right, only hallucinating when it came to details I had to fix.”
What does this even mean? It did a great job, the only problems were the parts I had to fix? 🤣
Most of it was basic knowledge that it could get from its training on the web. The stuff it missed was details about things specific to the product.
But generating 90% of the content and me just having to edit a bit is still way less work than me doing it all myself, even if it’s right the first time.
It’s got intern-level intelligence
The Industrial Revolution was literally “are you truly impressed by a machine that can weave cloth as well as your grandmother”? And the answer was yes because one person could be trained to use that machine in much less time than it took to learn to weave. And they could make 10 times as much stuff in the same time.
LLMs are literally the same kind of progress.
Except we are not 200 years later when the impact on the world is obvious and not up for debate. We are in the first few years where the “machine” would be broken half the time and its work would have obvious defects.
Honestly, yes I am impressed when you compare what was possible with NLP prior to LLMs. Your question is akin to asking: are you truly impressed by a machine that can stick blocks together as well as some random person? Regardless of whether you are impressed, significant amounts of human labour can be reproduced by machines in a manner that was previously impossible. Obviously there’s still a lot of undeserved hype, but let’s not pretend that replicating human language is trivial or worthless.
It definitely feels buzzword-like & vague. Kind of like how Web3 Blockchain XYZ was slapped on to a lot of stuff