The GNOME Extensions site (extensions.gnome.org), which hosts GNOME Shell extensions, will no longer accept new submissions containing AI-generated code. A new rule forbidding AI-generated code has been added to the review guidelines.

Due to the growing number of GNOME Shell extensions submitted to extensions.gnome.org that were generated using AI, such submissions are now prohibited. The new rule in the review guidelines notes that AI-generated code will be explicitly rejected.

  • uncouple9831@lemmy.zip · 13 hours ago

    Yeah, something tells me operating heavy machinery is different from uploading an extension for a desktop environment. This isn’t building medical devices, this isn’t some MISRA compliance thing, this is a widget. Come on, man, you have to know the comparison is insane.

    • theneverfox@pawb.social · 4 hours ago

      People have already died because of AI. It’s cute when the AI tells you to put glue on your pizza or asks you to leave your wife; it’s not so cute when architects and doctors use it.

      Bad information can be deadly. And if you rely too heavily on AI, your cognitive abilities drop; it’s a simple mental shortcut that works on almost everything.

      It’s only been about 18 months, and it’s already become very apparent that a lot of people can’t be trusted with it. Blame and punish those people all you want; it’ll just keep happening. Humans love their mental shortcuts.

      Realistically, I think we should just make it illegal to offer customer-facing LLMs as a service. You want an AI? Set it up yourself. It’s not hard, and realizing it’s just a file on your computer would do a lot to demystify it.
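
      A minimal sketch of what “set it up yourself” can look like, assuming the llama-cpp-python bindings and a locally downloaded GGUF model (the path and model name below are hypothetical):

      ```python
      # Minimal local-LLM sketch: the "AI" is just a weights file on disk.
      from llama_cpp import Llama

      # Hypothetical path to a GGUF model file downloaded beforehand.
      llm = Llama(model_path="./models/mistral-7b-instruct.Q4_K_M.gguf")

      # Run a completion entirely locally: no account, no hosted service, no API key.
      output = llm(
          "Explain what a GNOME Shell extension is in one sentence.",
          max_tokens=64,
      )
      print(output["choices"][0]["text"])
      ```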

      • uncouple9831@lemmy.zip · 3 hours ago

        Have people died because of desktop extensions?

        Because that’s the topic here.

        You’re fighting a holy war against all AI, Dune-style.

        I’m saying this is a super low-risk environment, where the implications appear to be extra try/catch blocks the code reviewers don’t like – not even incorrect functionality.

        • theneverfox@pawb.social · 1 hour ago

          Well, I was just arguing that people in general are using AI irresponsibly, but if you want to get specific…

          You say ban the users, but realistically, how would they determine that? The only way to reliably check whether something is AI-generated is human intuition. There’s no tool for it; it’s a real problem.

          So effectively, they made it an offense to submit AI slop, because if you just use AI properly as a resource, no one would be able to tell.

          So what are you upset about?

          They did basically what you suggested; they just did it by making a rule, so that they have a reason to reject slop without spending too much time justifying the rejection.