

We need sites that are decentralized and make clear where their funding comes from. In the end, users need to be prepared to fund the services they use to keep them honest. This is especially true of video, which is very expensive to host.
YouTube on mobile is intolerable these days. NewPipe is the way for Android, though Google is in a constant battle to disable it.
I’m sure they kept copies of the data to train their AI on.
Good luck. I hope it goes well for you. You might want to find a different therapist.
Also, in Windows when you finally do run the program it just hangs with “Not responding”.
This may be due to manufacturers locking their machines down with Secure Boot and only installing the keys that allow it to boot Windows. It’s not something that could be fixed by the makers of the Linux install disk. They’d need to persuade the hardware manufacturer to preinstall their key.
I’d even go as far as saying that you should reject anyone applying to your startup if they claim to have vibe coding experience.
I find it hard to imagine why you would even put that on a resume. Isn’t it like saying “I DON’T KNOW WHAT I’M DOING”? But then I’m old.
I install Linux on many machines each year, and I can’t even remember the last time I had a problematic installation. Your experience sounds quite unusual. Are you using some obscure distro?
I think they meant you’d have to design a combination of hardware that’s all compatible with Linux - that is, that has Linux driver support.
“No real human would go four links deep into a maze of AI-generated nonsense,” Cloudflare explains. “Any visitor that does is very likely to be a bot, so this gives us a brand-new tool to identify and fingerprint bad bots.”
It sounds like there may be a plan to block known bots once they have used this tool to identify them. Over time this would reduce the amount of AI slop they need to generate for the AI trap, since bots already fingerprinted would not be served it. Since AI generators are expensive to run, it would be in Cloudflare’s interests to do this. So while your concern is well placed, in this particular case there may be a surge of energy and water usage at first that tails off once more bots are fingerprinted.
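Here is a rough sketch of how that identify-then-skip logic could look. Everything in it (the depth threshold, the fingerprint store, the function names) is an assumption made for illustration, not Cloudflare’s actual system:

```python
# Illustrative only: flag any client that follows generated "maze" links
# deeper than a threshold no human plausibly would, and stop spending
# compute on decoy content for fingerprints already flagged.

DEPTH_THRESHOLD = 4           # matches the "four links deep" quote above
flagged_fingerprints = set()  # a real system would persist this in a shared store


def handle_maze_request(fingerprint: str, depth: int) -> str:
    """Decide what to serve for a request at a given maze depth."""
    if fingerprint in flagged_fingerprints:
        # Already identified as a bot: no need to generate more decoy pages.
        return "block"

    if depth > DEPTH_THRESHOLD:
        # No real visitor goes this deep; remember the fingerprint.
        flagged_fingerprints.add(fingerprint)
        return "block"

    # Still within plausible human behaviour: keep serving decoy pages.
    return "serve_decoy_page"


# A crawler that keeps following links is eventually flagged and then blocked.
for depth in range(1, 8):
    print(depth, handle_maze_request("bot-123", depth))
```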
Some of these LLMs introduce very subtle statistical patterns into their output so it can be recognized as such. So it is possible in principle (not sure how computationally feasible when crawling) to avoid ingesting whatever has these patterns. But there will also be plenty of AI content that is not deliberately marked in this way, which would be harder to filter out.
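For a concrete picture, here is a minimal sketch of how detection of one published style of watermark (a pseudo-random “green list” over token pairs) might work. The hash-based partition, the 50% green fraction, and the z-score test are assumptions chosen for the example, not any particular vendor’s scheme:

```python
# Toy "green list" watermark detector: a generator that favours green token
# pairs leaves a statistical excess that a detector can measure.

import hashlib
import math


def is_green(prev_token: str, token: str, green_fraction: float = 0.5) -> bool:
    """Deterministically assign each (prev_token, token) pair to the green list."""
    digest = hashlib.sha256(f"{prev_token}|{token}".encode()).digest()
    return digest[0] / 255 < green_fraction


def watermark_z_score(tokens: list[str], green_fraction: float = 0.5) -> float:
    """How far the observed green count sits above chance, in standard deviations."""
    n = len(tokens) - 1
    if n <= 0:
        return 0.0
    greens = sum(is_green(a, b, green_fraction) for a, b in zip(tokens, tokens[1:]))
    expected = green_fraction * n
    std = math.sqrt(n * green_fraction * (1 - green_fraction))
    return (greens - expected) / std


# A crawler could, in principle, skip documents whose score is high, though
# unmarked AI output would sail straight through a check like this.
sample = "the cat sat on the mat and looked at the dog".split()
print(round(watermark_z_score(sample), 2))
```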
Are you talking about Teams for Home or Teams for Work and School, and is it Teams or New Teams you mean?
Obsidian is a fancy markdown editor with metadata, sync, indexing, data querying and views, and a lively ecosystem of plugins. It has everything except being open source.
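As a rough illustration of the metadata-plus-querying part of that workflow (not Obsidian’s implementation or plugin API; the file layout and field names are invented), a handful of lines can index frontmatter across a folder of markdown notes:

```python
# Toy indexer: parse simple "key: value" frontmatter from markdown notes
# and answer a query over it, e.g. "which notes carry a given tag?".

from pathlib import Path


def read_frontmatter(note: Path) -> dict[str, str]:
    """Parse 'key: value' pairs between the leading '---' fences, if any."""
    lines = note.read_text(encoding="utf-8").splitlines()
    if not lines or lines[0].strip() != "---":
        return {}
    fields: dict[str, str] = {}
    for line in lines[1:]:
        if line.strip() == "---":
            break
        if ":" in line:
            key, value = line.split(":", 1)
            fields[key.strip()] = value.strip()
    return fields


def notes_with_tag(vault: Path, tag: str) -> list[Path]:
    """Return every note whose frontmatter 'tags' field mentions the tag."""
    return [note for note in sorted(vault.glob("**/*.md"))
            if tag in read_frontmatter(note).get("tags", "")]


# Example (assuming a ./vault folder of .md files with frontmatter):
# print(notes_with_tag(Path("vault"), "linux"))
```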
Same for ZDNet.
You could use any trustworthy sync service with automatic camera uploads, but they will all wait until the video has finished recording before uploading it. Ideally there would be an app that streams live to a remote server that records as it receives the footage. There used to be one. A sync service might be second best, though.
Do any dash cams stream to the cloud or a self-hosted server? If the police spot the dashcam they may just delete the footage.
You need something that streams to a secure server, so the police can’t just delete the video.
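One way to approximate that without a purpose-built dashcam is to record in short segments and push each finished segment off the device immediately, so footage already uploaded survives anything done to the camera. A minimal sketch, assuming a hypothetical upload endpoint and token (a real app would also need retries, offline buffering, and proper authentication):

```python
# Push each closed video segment to a remote server as soon as it exists.

from pathlib import Path
import urllib.request

UPLOAD_URL = "https://example.org/dashcam/upload"  # placeholder endpoint
AUTH_TOKEN = "replace-me"                          # placeholder credential


def upload_segment(segment: Path) -> None:
    """POST one finished video segment to the remote server."""
    request = urllib.request.Request(
        UPLOAD_URL,
        data=segment.read_bytes(),
        headers={
            "Authorization": f"Bearer {AUTH_TOKEN}",
            "Content-Type": "video/mp4",
            "X-Segment-Name": segment.name,
        },
        method="POST",
    )
    with urllib.request.urlopen(request) as response:
        response.read()


def upload_finished_segments(recording_dir: Path) -> None:
    """Upload every closed segment, then drop the now-redundant local copy."""
    for segment in sorted(recording_dir.glob("*.mp4")):
        upload_segment(segment)
        segment.unlink()
```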
I never really liked that kind of use of variable shadowing. It seems like swapping one set of potential risks, ones that are easy to spot when debugging, for a subtler kind of risk that’s harder to notice.
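To make the trade-off concrete (a rough Python analogue, since the original language isn’t stated; rebinding a name here stands in for shadowing):

```python
# Rebinding one name through a chain of transformations: compact, but the
# name's meaning and type quietly change between lines.

def parse_port(raw: str) -> int:
    port = raw           # str
    port = port.strip()  # still str
    port = int(port)     # now int: same name, different thing
    return port


# Distinct names: noisier, but a reference to the wrong stage (e.g. using
# raw_port where port was meant) is visible at a glance.

def parse_port_explicit(raw_port: str) -> int:
    stripped_port = raw_port.strip()
    return int(stripped_port)


print(parse_port(" 8080 "), parse_port_explicit(" 8080 "))
```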
How is discoverability in PeerTube? That was the sticking point for me with PeerTube, as with Mastodon, last time I looked. It was not easy to discover what’s out there.