

No one will ever pay me enough to be emotionally invested in ‘work’.
adding to the list: SHIA LABEOUF
i hope it gets a 15 minute standing boo
I’m running an RTX 4070 on Mint Cinnamon. Works great. I know that’s a different GPU, but just sharing that Nvidia drivers + Mint are working just fine for me.
hm. Maybe i’ll try a dual wield tank buillll…aaaaaaaaaaand imma stealth archer.
sudo apt-get remove google
Exactly this. Just left iOS, got a Pixel 8a, and flashed GrapheneOS on it. Apple is doing the same shit. GOS might be a pain in the ass sometimes, but I feel much better knowing that Tim Apple isn’t reading my texts and monitoring my bank apps so they can target me with ads.
valid.
In addition to this, I was just thinking about how many kids are asking LLMs questions that, just a few years ago, they would have asked friends, their parents, or a mentor. A whole generation will grow up used to taking advice from a black-box LLM.
After reading the actual published paper referenced in the article, I would downvote the article because the title is clickbaity and does not reflect the conclusions of the paper. The title suggests that AI could replace pathologists, or that pathologists are inept. This is not the case. A better title would be “Pathologists use AI to determine if biopsied tissue samples contain markers for cancerous tissue outside the biopsied region.”
From the peer-reviewed paper: “This study examined if artificial intelligence (AI) could detect these morphological clues in benign biopsies from men with elevated prostate-specific antigen (PSA) levels to predict subsequent diagnosis of clinically significant PCa within 30 months”… so yes, these were men who all had high cancer risk.
Your tax money
It’s a shame that I assume this article was written by an LLM, prompted and edited by a person, and thus I have little will to even read it.
I’ve been running GrapheneOS on a Pixel 8a for about a month now and it’s fine. Came from an iPhone, so it’s probably an easier transition for people who have used Android before.
this reads like an aged alchemist who is convinced he’s on the cusp of finding the philosopher’s stone
Now you’ve got me reading the wikipedia page on Zen 5 and 6 instead of working 👍
I was reading recently about how Nvidia GPU families (e.g. the RTX 40xx series) all use the same architecture but are separated by the number of defects on the die, since all nanofab-made chips come out with some amount of defects. Dies are placed into different ‘bins’ based on how many components are defective, but they still function fine as GPUs. I imagine other companies do this too, and with other electronics besides GPUs. Also, as mentioned in another comment, ‘2nm’ is a marketing term and does not indicate the actual size of the transistors.

My point being: this may not result in a larger supply of currently available electronics built on other processes (4nm, 5nm, etc.). It is more likely a move to prepare facilities for next-gen electronics. I don’t know specific companies’ intentions to move to ‘2nm’ processes for their products, or how easy that actually is, so it would be a case-by-case check.
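To make the binning idea concrete, here’s a toy sketch. The defect thresholds and tier names are made up by me for illustration, not Nvidia’s actual process, which also weighs clock speeds, power draw, and which specific units failed:

```python
# Toy illustration of defect-based binning. Thresholds and tier names are
# hypothetical; real binning also accounts for clocks, power, and exactly
# which functional units on the die are defective.

def bin_die(defective_units: int) -> str:
    """Assign a die to a product tier based on how many units failed testing."""
    if defective_units == 0:
        return "top-tier SKU (fully enabled die)"
    if defective_units <= 4:
        return "mid-tier SKU (a few units fused off)"
    if defective_units <= 12:
        return "cut-down SKU (more units fused off)"
    return "scrap / not sellable"

# Dies from the same wafer, same architecture, end up as different products.
for defects in (0, 3, 9, 20):
    print(f"{defects} defective units -> {bin_die(defects)}")
```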
“I see AI as a positive because it undermines the monopoly on intellectual property of capitalists”
I would contest that it does the exact opposite. Massive companies have been given free rein to legally steal (see Meta’s plagiarism/piracy court docs) from the entire internet. They then roll out the AI model, which they own, and which can recreate things (art, writing, technical docs, etc.) that are amalgamations of all the stolen work. They then sell subscriptions for people to use their AI, funneling people away from paying real artists.
did you put mirrors on both walls?