

Touch buttons on the steering “wheel”
😄 My infotainment is only a “dumb” touchscreen radio with Android Auto / CarPlay support anyway
The free market doesn’t work if all car brands are owned by like 5 companies and all of them agree to add ads.
That is why we need to regulate corporations and actually enforce those regulations properly.
Yea, but ReGuLaTiOn bAd, of course
I am so happy that my car doesn’t update itself automatically
It was a rip-off from the start…
I must confess, I was into it as well, but the game was designed to be so omnipresent that it got in the way of my other tasks, so I had to stop. I can’t play this game casually 😆 either all in or nothing, lol
Like this Rezz?
https://music.apple.com/ch/artist/rezz/1046759940?l=en-GB
Yea, those are great vibrations!
So hyped to see her at Rampage!
Whoever thought touch buttons for the turn signals were a good idea 😆
So many Teslas blinking wrong on the streets now…
I’m thinking about getting one of those
Maybe if AI talks to AI for long enough, they get smart 🤔
America is on track I’d say, Musk n Zuck are so horny to do that…
I like how you corrected “opinion” to “experience” 😃👌🏻
And yes, I would call that evidence, not proof, but clearly evidence, especially if you did not change anything else (the hardware, or setting up the Linux distribution from scratch).
Maybe it is kind of a bias, since Nvidia is easy to blame and exists in most PCs 🤔
🤭 And sometimes, if you wake your Linux machine from sleep, things go to shit and all you see is a black screen with a white mouse cursor on it.
Sometimes Super+Ctrl+Alt+F8 saves me and I can restart the PC from a TTY, and sometimes there is only a flashing cursor. In the second case, I have to take hard measures and forcefully restart it manually.
(Yes, Nvidia card with the latest proprietary driver and KDE on Wayland) -> everything latest, meaning from the EndeavourOS/Arch/AUR repos.
This is because hardcoded human algorithms are still better at doing stuff on your phone than AI-generated actions.
It seems like they didn’t even test the chatbot in a real-life scenario, or train it specifically to be an assistant on a phone.
They should give it options to trigger stuff, like Siri with its workflows. And they should take their time and resources training it. They should give app developers a way to hand the AI worker some structured data, and the AI should be trained to search for the correct context using that API and plug the correct data into the correct workflow.
I bet they just skipped that individual training of Gemini to work as a phone assistant.
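
To make that structured-data idea a bit more concrete, here is a rough Python sketch of what such an app-facing API could look like. Everything in it (`AppAction`, `register_action`, the keyword-matching `find_action`) is made up for illustration and is not any real Android or Gemini API; a real assistant would replace the keyword matching with the trained model.

```python
# Hypothetical sketch: apps expose "actions" with structured parameters,
# and the assistant picks one and fills in the slots from the user request.
from dataclasses import dataclass
from typing import Callable

@dataclass
class AppAction:
    name: str                    # e.g. "send_message"
    description: str             # text the assistant can match against
    parameters: dict[str, str]   # slot name -> expected type, as plain strings
    handler: Callable[..., str]  # what actually runs on the phone

REGISTRY: list[AppAction] = []

def register_action(action: AppAction) -> None:
    """An app developer calls this to expose one of its workflows to the assistant."""
    REGISTRY.append(action)

def find_action(user_request: str) -> AppAction | None:
    """Toy 'context search': word overlap between request and description.
    In reality this is the part the model would have to be trained for."""
    words = set(user_request.lower().split())
    return max(
        REGISTRY,
        key=lambda a: len(words & set(a.description.lower().split())),
        default=None,
    )

# An example app registering a workflow with structured parameters.
register_action(AppAction(
    name="send_message",
    description="send a text message to a contact",
    parameters={"contact": "str", "body": "str"},
    handler=lambda contact, body: f"Message to {contact}: {body}",
))

# The assistant finds the right workflow and plugs the extracted data into it.
action = find_action("please send a message to Alex")
if action:
    print(action.handler(contact="Alex", body="Running late, sorry!"))
```

The hard part is of course the last step: training the model to reliably pick the right action and extract the parameters from the request, which is exactly what seems to have been skipped.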
Apple seems to plan exactly that, and that is why it will be released so late vs. the other LLM AI phone assistants. I’m looking forward to seeing if Apple manages to achieve their goal with AI (I will not use it, since I will not buy a new phone for that, and I don’t use macOS).
😂MaNDA
I bet it is this sticker, lol
I prefer ongoing maintenance over backwards compatibility; I can easily run such old software in an emulator on recent hardware.
Well, it is a nice young YouTuber starting a startup using Kickstarter. I think an investment is worth it 😇
Edit: 😯 way more expensive than I thought