Basically a deer with a human face. Despite probably being some sort of magical nature spirit, his interests are primarily in technology and politics and science fiction.

Spent many years on Reddit before joining the Threadiverse as well.

  • 0 Posts
  • 680 Comments
Joined 2 years ago
Cake day: March 3rd, 2024

  • It works because the .png and .jpg extensions are associated on your system with programs that, by coincidence, can also handle webp images, and that check the file’s binary content to figure out its actual format when handling it.

    If there’s a program associated with .png on a system that doesn’t know how to handle webp, or that trusts the file extension when deciding how to decode the contents of the file, it will fail on these renamed files. This isn’t a reliable way to “fix” these sorts of things.
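    A minimal sketch of that content-sniffing check (a hypothetical helper, assuming the standard magic bytes for PNG, JPEG, and WEBP):

    ```python
    def sniff_image_format(data: bytes) -> str:
        """Identify an image's real format from its leading bytes,
        ignoring whatever the file extension claims."""
        if data[:8] == b"\x89PNG\r\n\x1a\n":
            return "png"
        if data[:3] == b"\xff\xd8\xff":
            return "jpeg"
        # WEBP is a RIFF container: "RIFF", 4-byte size, then "WEBP".
        if data[:4] == b"RIFF" and data[8:12] == b"WEBP":
            return "webp"
        return "unknown"

    # A webp file renamed to .png still starts with RIFF....WEBP:
    renamed = b"RIFF\x24\x00\x00\x00WEBPVP8 "
    print(sniff_image_format(renamed))  # webp
    ```

    A program that trusts this kind of check decodes the renamed file fine; one that dispatches purely on the extension picks the wrong decoder and fails.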





  • The Coordinated Vulnerability Disclosure (CVD) process:

    1. Discovery: The researcher finds the problem.

    2. Private Notification: The researcher contacts the vendor/owner directly and privately. No public information is released yet.

    3. The Embargo Period: The researcher and vendor agree on a timeframe for the fix (industry standard is often 90 days, popularized by Google Project Zero).

    4. Remediation: The vendor develops and deploys a patch.

    5. Public Disclosure: Once the patch is live (or the deadline expires), the researcher publishes their findings, often assigned a CVE (Common Vulnerabilities and Exposures) ID.

    6. Proof of Concept (PoC): Technical details or code showing exactly how to exploit the flaw may be released to help defenders understand the risk, usually after users have had time to patch.

    You say the flaw is “fundamental”, suggesting you don’t think it can be patched? I guess I’d inform my investment manager during the “private notification” phase as well, then. It’s possible you’re wrong about its patchability, of course, so I’d recommend carrying on with CVD regardless.



  • If you believe that Google’s just going to brazenly lie about what they’re doing, what’s the point of changing the settings at all then?

    In fact, Google is subject to various laws, and to the concerns of big corporate customers, both of which could mean big trouble if it flagrantly and wilfully misused data that’s supposed to be private. So yes, if the feature doesn’t say the data is being used for training, I tend to believe that. It at least behooves those who claim otherwise to come up with actual evidence for their claims.






  • Yes, but the point is that granting Google permission to manage your data with AI is a very different thing from training the AI on your data. You can do all the things you describe without also having the AI train on the data; indeed, training the AI on it as well would be a hard bit of extra work.

    If the setting isn’t specifically saying that it’s to let them train AI on your data, then I’m inclined to believe that’s not what it’s for. They’re very different processes, both technically and legally. I think there’s just some click-baiting going on here with the scary “they’re training on your data!” accusation; it seems to be baseless.


  • Understand that basically ANYTHING that “uses AI” is using you for training data.

    No, that’s not necessarily the case. A lot of people don’t understand how AI training and AI inference work; they are two completely separate processes, and doing one does not entail doing the other. In fact, a lot of research right now is aimed at making it possible to do both together, because that would be really handy, but it can’t really be done yet.
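    The distinction can be sketched with a toy one-parameter model (all names here are hypothetical, just for illustration): inference only reads the weights, while a training step writes them.

    ```python
    weight = 2.0  # a one-parameter "model": y = weight * x

    def infer(x):
        # Inference: a forward pass only. The weight is read, never written.
        return weight * x

    def train_step(x, target, lr=0.1):
        # Training: compute a loss gradient and update the weight in place.
        global weight
        error = weight * x - target   # gradient of squared loss (up to a factor)
        weight -= lr * error * x      # gradient-descent update

    before = weight
    infer(3.0)                 # serving a user request...
    assert weight == before    # ...leaves the model unchanged

    train_step(3.0, 9.0)       # a training step...
    assert weight != before    # ...is what actually changes the model
    ```

    Serving your prompt through the model is the first function; using your prompt to change the model is the second, a separate pipeline a vendor has to deliberately build and run.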

    And if you read any of the EULAs

    Go ahead and do so; they will have separate sections specifically about the use of data for training. Data privacy is regulated by a lot of laws, even in the United States, and corporate users are extremely picky about that sort of thing.

    If the checkbox you’re checking in the settings isn’t explicitly saying “this is to give permission to use your data for training”, then it probably isn’t doing that. There might be a separate one somewhere, or it might just be a blanket thing covered in the EULA, but “tricking” the user like that wouldn’t make any sense. It doesn’t save them any legal hassle to do it that way.