• warm@kbin.earth · 11 hours ago

    Fuck AI, but a respectable stance.

    And then you have the ShareX developer using a screenshot of an AI response to reply to a GitHub issue.

  • HiddenLayer555@lemmy.ml · 10 hours ago

    What about the IP issues? Setting aside the “ethics” of “IP theft via AI” entirely, you just know a company like Microsoft or Apple will eventually try suing an open source project over AI code that’s “too similar” to their proprietary code. It doesn’t matter that they’re doing the same thing on a much larger scale; all that matters is that they have the resources to sue open source projects, and not the other way around. If a tech company can get rid of competition by abusing the legal system, you just know they will, especially if they can also play the “they’re knowingly letting their users use pirated media that we own with their software” card on top of it.

    • Eager Eagle@lemmy.world · 9 hours ago

      you just know a company like Microsoft or Apple will eventually try suing an open source project over AI code that’s “too similar” to their proprietary code.

      Doubt it. The incentives don’t align: they benefit from open source far more than they are threatened by it. Even the old “embrace, extend, extinguish” playbook comes from a different era, and it’s likely less profitable than vendor lock-in and the other practices they actually rely on today. And the copyright argument could easily backfire if they threw it into a case, given all their own questionable AI training.