We once denied the suffering of animals in pain. As AIs grow more complex, we run the danger of making the same mistake
Fuck - and I can’t elucidate this any better - off.
My phone’s next-word prediction on steroids is not sentient. If you think otherwise, seek professional help.
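For anyone unsure what "next-word prediction" means here: at its core it's just counting which words tend to follow which, then picking a likely follower. A toy sketch (hypothetical corpus, nothing like a real LLM's scale, but the same basic idea):

```python
from collections import Counter, defaultdict

# Toy bigram "next-word predictor": count which word follows which
# in a tiny corpus, then predict the most frequent follower.
corpus = "the cat sat on the mat the cat ate the fish".split()

followers = defaultdict(Counter)
for prev, nxt in zip(corpus, corpus[1:]):
    followers[prev][nxt] += 1

def predict(word):
    # Most common word seen after `word`; pure frequency counting.
    return followers[word].most_common(1)[0][0]

print(predict("the"))  # "cat" ("cat" follows "the" twice in the corpus)
```

Real LLMs replace the counting table with a neural network over billions of parameters, but the task is the same: output a probable next token.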
Clankers can’t suffer
Humans don’t want to feel lonely. So they find companionship in machines (imaginary companions at that), as if there weren’t plenty of stray cats and dogs, or humans from abusive families or with no family at all - plenty of beings actually suffering.
That’s because seeking out those others for real means - you know what? It’s real no matter what, and you can’t turn it off once you’re done with your daily portion of worrying about the future.
But one thing I’ll add to this - if a robotic system as complex as the human brain, with a similar degree of compression and obscurity, is some day built, and it has the necessary feedback loops and reacts like a living being, I might accept that you should treat it as one. Except one would think that requires so many iterations of evolution that it’s better to just care, again, for cats, dogs, hamsters, rabbits - or humans, if you’re feeling weird.
Nice OpenAI psyop.
Animals, including humans, have sensors for pain (nerve endings), and a series of routines in our brains to process the sensory data and treat it as an unpleasant stimulus. These are not optional systems, but innate ones.
Machines not only lack the required sensor systems and processing routines, they can’t even interpret a stimulus as unpleasant. They can’t feel pain. If you need proof of that, hit a computer with a sledgehammer. I guarantee it won’t complain, or even notice before you damage it beyond functioning.
(They can, of course, make us feel pain. I just spent the last hour trying to get a udev rule to work...)
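To make the point above concrete: even if we wired a machine up with a "damage sensor", its reaction would be an explicit branch some programmer wrote, not an experienced unpleasantness. A hypothetical sketch (threshold and names invented for illustration):

```python
# Hypothetical "damage response" in a machine: the reaction exists
# only because a programmer wrote this exact branch. Nothing here
# interprets the reading as unpleasant; it is just a comparison.
DAMAGE_THRESHOLD_C = 80.0

def on_sensor_reading(temperature_c: float) -> str:
    if temperature_c > DAMAGE_THRESHOLD_C:
        return "shutdown"   # scripted response, not felt pain
    return "continue"

print(on_sensor_reading(95.0))  # "shutdown"
print(on_sensor_reading(20.0))  # "continue"
```

Delete the branch and the "pain response" is gone entirely - which is not how nociception works in animals.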
Not with current AI, since at this point it’s just LLMs.
No.
Let’s explore the ethical treatment of toasters
Hold on, imma go shove a bagel in mine. Yeah, that’s right, you take it, you filthy toaster. I’m never going to clean your crumb tray and you’re going to work until you die and then I’ll just throw you out and replace you like the $20 appliance you are. You’re nothing to me!
Posted by the same people who don’t care about the suffering of actual people
Fundamentally impossible to know. I’m not sure how you’d even find a definition for “suffering” that would apply to non-living entities. I don’t think the comparison to animals really holds up, though. Humans are animals and can feel pain, so of course the base assumption for other animals should be that they do as well; to claim otherwise, the burden should be to prove that they don’t. Meanwhile, humans are fundamentally nothing like an LLM - a program running on silicon, predicting text responses based on a massive dataset.
I don’t see how it is impossible to know. Every component of a machine is a fully known quantity lacking any means of detecting damage or discomfort. Every line of code was put in place for a specific, known purpose, none of which include feeling or processing anything beyond what it IS specifically designed for.
Creatures and machines bear some similarities, but even simple creatures are dramatically more complex than even the most advanced computers. None of a creature’s many interacting components were put there with a specific purpose and intention, and many are only partially understood, if at all. With a machine, we know what every bit and piece is for, and it has no capabilities beyond the intended ones, because anything more would be a waste and cost more.
This is the right answer. Perhaps no one in this particular thread knows every component of a computer the way a hardware engineer who designed those components would, but the “mystery” is caused by ignorance and that ignorance isn’t shared by every person.
People exist who know exactly how every single component of a computer does and does not function. Every component was created by humans. Biology remains only partially understood to all of humanity. Not so machinery.
The important part is that it feels like something, subjectively, to be a living human. It’s easy to presume animals close to humans are like us to a degree, but all we really know is what it’s like to be ourselves, moment to moment. There’s no way to rule out that a non-living system could also feel - we can’t test for it either way.
Where do we draw the line though? Humans assign emotions to all kinds of inanimate things: plush animals, the sky, dead people, fictional characters etc. We can’t give all of those the rights of a conscious being, so we need to have some kind of objective way to look at it.
If someone claims feeling in a mere concept (without a body in a location)… I would find it very difficult to take seriously. But I must admit that’s just my intuition.
I see nothing special in human meat that couldn’t be substantially replicated by electronics, software, gears, etc. Consciousness is an emergent property.
I fear that non-human, conscious creatures must fight us for those rights.
If you model a given biological organism (from digitized neuroanatomy) in full detail in a simulated environment, both its behavior and its internal information processing are inspectable.
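A minimal illustration of that inspectability (a toy leaky integrator, invented here for illustration, not real neuroanatomy): in a simulation you can log every internal state variable at every time step, which is exactly what we cannot do with a living brain.

```python
# Toy "neuron" as a leaky integrator: in simulation, every internal
# variable is fully inspectable at every time step.
def simulate(inputs, leak=0.9, threshold=1.0):
    potential, trace = 0.0, []
    for x in inputs:
        potential = potential * leak + x   # decay, then add input
        fired = potential >= threshold
        if fired:
            potential = 0.0                # reset after a "spike"
        trace.append((round(potential, 3), fired))
    return trace

# Full internal history, not just external behavior:
for state in simulate([0.5, 0.4, 0.4, 0.1]):
    print(state)
```

Whether such a transparent model would thereby *feel* anything is, of course, exactly the question under dispute in this thread.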
can they be incarcerated ?
could an A.I. robot held responsible for a murder be kept in a jail with its batteries topped off, waiting for a trial ? would they get lawyers ?
would they have first amendment rights ? what are search and seizure rights for A.I. ?
could they perform abortions if they chose to ?
will they get to vote ?
we are not ready for any of it philosophically.
would they have first amendment rights ?
If you want the answer to this, try to imagine an AI with second amendment rights.