Right?! At the end of the day, they’re still just people. Gotta eat, gotta sleep, gotta shit. I will never understand how any individual gets so much money/power/attention, because they’re all just goddamn people, and in the event of a catastrophe I imagine they would be about as helpful as any other random human. They aren’t gods, and they certainly don’t deserve the stratification. It’s not like they’re enlightened or something; most of the time they’re just sociopaths who are rich, clever, and/or connected. When you get a glimpse under the hood at moments like this, it really is kinda jarring. Helps to dispel those silly presumptions about them, at least.
It never fails to amaze me how disconnected C-level people are from reality.
What you’re describing is a general experience with LLMs, not one limited to the C-level.
If an LLM spouts rubbish, you detect it because you have external knowledge; in other words, you’re the subject matter expert.
What makes you think that those same errors are not happening at the same rate outside your direct personal sphere of knowledge?
Now consider what this means for the people around you, including the C-level.
Repeat after me: AI is Assumed Intelligence and should not be considered anything more than autocorrect on steroids.