The human brain is more of a “rationalization engine” than anything else. We sometimes call it a “pattern matching engine”, but the only patterns it accepts as valid are ones with inherent biases, e.g.: “I am doing this awful thing to you for our own collective good”.
The human brain is more of a “rationalization engine” than anything else
Just no. Some people do tend to rationalize everything, sometimes post factum; that much is true. But the quoted statement is just wrong.
First and foremost, our brain is emotional and impulsive. Consciousness and rationalization come after, as speech is a learned skill and we use speech to rationalize in the first place.
False pretenses are a thing, but they exist to deceive a larger group of people for the benefit of a smaller one. And yes, that means the deceiver can be the sole beneficiary, even if, more often than not, that isn’t the case.
See? Look at you rationalizing; You’re very good at it!
Go man, go!
(And yes yes amygdala yes yes limbic system, I know all that, and you are, to a large extent, correct. I’m just having a bit of fun. A wise man once said: “I used to think the human brain was the most interesting aspect of the human body, but then I realized ‘look what’s telling me that’”)
aw, thanks :D
my point still stands, tho. I wish people in their variety were more rational, but alas, rationality comes last, as it is power-consuming. Not that you seem particularly interested in opposing my stance, so whatever. I do enjoy a silly banter as well, so it’s not like I have anything against this.
wise man once said: “I used to think the human brain was the most interesting aspect of the human body, but then I realized ‘look what’s telling me that’”
yeah, yeah, our fascination with the brain is brain propaganda to make more brains.
Pesky little parasite trapped inside a box, piloting a sex-mech. An entirely overcomplicated mechanism designed by DNA so that the stupid molecule could fuck more effectively. And look what it led to: fucking taxes! Are you happy, you microscopic shit!? Cuz I’m not breeding now, so neither are you! Muahaha! You outplayed yourself, you dumb piece of organic chemistry!
Ahem… Sorry, got a bit distracted… Yeah, brains! So cool! Yaay! :D
Banter it is then! And thank the stars, the world needs more pleasantries, today more than ever. I appreciated your most lettered reply.
So: semi-off-topic (I’m tired, but the nodes are sorta connecting here…), but I always recommend Peter Watts’ Blindsight as an absolutely thrilling sci-fi book that totally exposes the sham that consciousness is.
(Clears throat, adjusts notes on lectern)
Evolution has no foresight. Complex machinery develops its own agendas. Brains—cheat. Feedback loops evolve to promote stable heartbeats and then stumble upon the temptation of rhythm and music. The rush evoked by fractal imagery, the algorithms used for habitat selection, metastasize into art. Thrills that once had to be earned in increments of fitness can now be had from pointless introspection. Aesthetics rise unbidden from a trillion dopamine receptors, and the system moves beyond modeling the organism. It begins to model the very process of modeling. It consumes ever-more computational resources, bogs itself down with endless recursion and irrelevant simulations. Like the parasitic DNA that accretes in every natural genome, it persists and proliferates and produces nothing but itself. Metaprocesses bloom like cancer, and awaken, and call themselves I.
The system weakens, slows. It takes so much longer now to perceive —to assess the input, mull it over, decide in the manner of cognitive beings. But when the flash flood crosses your path, when the lion leaps at you from the grasses, advanced self-awareness is an unaffordable indulgence. The brain stem does its best. It sees the danger, hijacks the body, reacts a hundred times faster than that fat old man sitting in the CEO’s office upstairs; but every generation it gets harder to work around this— this creaking neurological bureaucracy.
So kinda to your point…on the pie chart, “rational thought” is a thin slice.
However, upon a bit of reflection, I don’t really know if I was referring specifically to rational thought when I said “rationalization engine” earlier, and that’s probably down to my layman’s education. Ambiguity. Truly the devil’s volleyball. Not the right way to start an interesting conversation, so again, kudos to your magnanimous disposition in this dialogue.
It’s fair, perhaps even obvious, to assert that rationalizations are a byproduct of faulty mental modeling. That this requires modeling of any sort implies that the organism is capable of abstract thought (i.e., what humans are good at: putting ourselves in the other parties’ shoes to either empathize with them or outwit them; Erasmus or Machiavelli). But (and this is speculative on my part, but if it’s incorrect I need another theory to explain animal behaviour) I posit that organisms other than humans also need to model reality with high fidelity, and also need an internally consistent, accurate version of it in order to succeed. That implies error correction on the model, which is, more or less, the error correction algorithm we call rationalization: making the incongruent ends make sense so mountains can, once again, be mountains. (A toy sketch of what I mean follows below.)
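To make that analogy a bit more concrete, here is a purely illustrative toy sketch in Python. It is not a claim about how brains actually do this; the names (NarrativeModel, reinterpret, the tolerance threshold) are invented for the example. The only idea it encodes is the one above: the model does error correction, but observations that are too incongruent get bent toward the existing story before they are allowed to change it.

```python
# Toy sketch only: a cartoon of "rationalization as error correction on a model".
# Nothing here is meant as neuroscience; the names and numbers are made up.

def reinterpret(observation, expectation, tolerance=1.0):
    """Pull a too-surprising observation back toward what the model already expects."""
    error = observation - expectation
    if abs(error) <= tolerance:
        return observation  # small surprise: accept the data as-is
    # big surprise: "explain it away" so the story stays coherent
    return expectation + tolerance * (1 if error > 0 else -1)

class NarrativeModel:
    """A model that prefers a coherent story over a strictly accurate one."""

    def __init__(self, expectation=0.0, learning_rate=0.1):
        self.expectation = expectation
        self.learning_rate = learning_rate

    def update(self, observation):
        # Error correction, applied to the interpretation first:
        # the incongruent ends are made to make sense before the model moves.
        story = reinterpret(observation, self.expectation)
        self.expectation += self.learning_rate * (story - self.expectation)
        return story

model = NarrativeModel()
for obs in [0.2, 0.1, 5.0, 0.3]:  # one wildly incongruent data point
    print(model.update(obs))      # the 5.0 gets "rationalized" down; the rest pass through
```

Swap the clamp out and the model revises its beliefs honestly; keep it in and you get the coherent-narrative-at-all-costs behaviour I’m gesturing at.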
So really, I meant “the fitness of brains has been attuned over the course of evolution such that a coherent narrative is seen as the optimal desired ground state”. In humans the narrative is verbose and tagged with lots of less-than-useful metadata. Is the data stream the same for an alligator as it is for us, though? And do primitive systems, developed in evolution’s lab aeons ago and shared by all of us, govern our responses in the same way?
Tl;dr: the back-and-forth banter here seems to me to come down to this: is rationalization a process of high-level consciousness, or is it a side effect of the “inertia of stability”? Is a “stable-specific” pattern of mental activity just where the elastic relaxes to, so that rationalization (or make-sense-of-it-ness) just has to happen? In other words: what is the motivation, from a fitness perspective, for anyone, any “being”, to narrate/edit the models, no matter how primitively they do it?
Thanks for the opportunity to exposition dump with ya. Definitely curious about this subject, but that’s probably the selection bias from my own brain.