ikt@aussie.zone to LocalLLaMA@sh.itjust.works · English · 1 month ago
How Hungry is AI? Benchmarking Energy, Water, and Carbon Footprint of LLM Inference (arxiv.org)
ikt@aussie.zone (OP) · 1 month ago
Where is that chart from?
"If we were smart and responsible we would admit AI has hit a wall"
What wall has it hit?

SoftestSapphic@lemmy.world · 1 month ago
https://www.wired.com/story/openai-ceo-sam-altman-the-age-of-giant-ai-models-is-already-over/

ikt@aussie.zone (OP) · 1 month ago
You're saying a wall has been hit based on a Wired article 🤣
I just watched my first AI movie: https://m.youtube.com/watch?v=vtPcpWvAEt0
Three years ago this was a tiny, blurry five-second mess.
I don't know why you're here; you're clueless.

SoftestSapphic@lemmy.world · 1 month ago
I'm taking the CEO of OpenAI at his word as a computer scientist.
Cope harder, religious freak.

ikt@aussie.zone (OP) · 1 month ago
All good, bro.
Again, I don't know why you're here. You can literally follow this sub and run your own LLM locally on your PC, running on solar power.
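For anyone wondering what "run your own LLM locally" looks like in practice, here is a minimal sketch assuming llama-cpp-python is installed and a quantised GGUF model has already been downloaded; the model file name below is just a placeholder, not a specific recommendation.

```python
# Minimal local-inference sketch with llama-cpp-python.
# Assumes: pip install llama-cpp-python, and a GGUF model file on disk
# (the path below is a placeholder for whatever model you downloaded).
from llama_cpp import Llama

llm = Llama(
    model_path="./models/your-model.Q4_K_M.gguf",  # placeholder path
    n_ctx=2048,  # context window size
)

# Run a single completion entirely on local hardware (no API calls).
output = llm(
    "Q: Roughly how much energy does one LLM query use? A:",
    max_tokens=64,
)
print(output["choices"][0]["text"])
```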