The GNOME Extensions site (extensions.gnome.org), which hosts GNOME Shell extensions, will no longer accept new contributions containing AI-generated code. A new rule forbidding AI-generated code has been added to the review guidelines.

Due to the growing number of AI-generated GNOME Shell extensions being submitted to extensions.gnome.org, such submissions are now prohibited. The new rule in the review guidelines notes that AI-generated code will be explicitly rejected.

    • IngeniousRocks (They/She) @lemmy.dbzer0.com · +17 · 2 hours ago

      Just an example:

      I’m a programming student. In one of my classes we had a simple assignment: write a simple script to calculate factorials. The purpose of the assignment was to teach recursion. It should be doable in 4-5 lines max, probably less. My classmate decided to vibe code his assignment and ended up with a 55-line script. It worked, but it was literally 1,100% of the length it needed to be, with lots of dead functions and ‘None -> None(None)’ style explicit typing where it simply wasn’t needed.
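      For reference, the 4-5 line version the assignment was after looks something like this (a minimal sketch in Python; the class’s actual language isn’t stated):

```python
def factorial(n: int) -> int:
    # Base case: 0! and 1! are both 1.
    if n <= 1:
        return 1
    # Recursive case: n! = n * (n - 1)!
    return n * factorial(n - 1)
```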

      The code was hilariously obviously AI code.

      Edit: I had like 3/4 typos here

    • brian@programming.dev · +10 · 5 hours ago

      if it’s not clear if it’s ai, it’s not the code this policy was targeting. this is so they don’t have to waste time justifying removing the true ai slop.

      if the code looks bad enough to be indistinguishable from ai slop, I don’t think it matters that it was handwritten or not.

    • kadu@scribe.disroot.org · +8/-1 · 5 hours ago

      I guess the practical idea is that if your AI-generated code is so good, and you’ve reviewed it so well, that it fools the reviewer, the rule did its job and then it doesn’t matter.

      But most of the time the AI code jumps out immediately to any experienced reviewer, and usually for bad reasons.

      • refalo@programming.dev · +1 · 1 hour ago

        So then it’s not really a blanket “no-AI” rule if it can’t be enforced when the code is good enough? I suppose the rule should have been “no obviously bad AI” or some other equally subjective thing?

  • i_stole_ur_taco@lemmy.ca · +49 · 16 hours ago

    extension developers should be able to justify and explain the code they submit, within reason

    I think this is the meat of how the policy will work. People can use AI or not. Nobody is going to know. But if someone slops in a giant submission and can’t explain why any of the code exists, it needs to go in the garbage.

    Too many people think because something finally “works”, it’s good. Once your AI has written code that seems to work, that’s supposed to be when the human starts their work. You’re not done. You’re not almost done. You have a working prototype that you now need to turn into something of value.

    • Jankatarch@lemmy.world · +7 · 9 hours ago

      Just the fact that people are actually trying to regulate it instead of going “too nuanced, I will fix it tomorrow” makes me happy.

      But they are also doing it pretty reasonably too. I like this.

  • Stern@lemmy.world · +6 · 12 hours ago

    Good.

    I’ve mostly switched off SAMMI because their current head dev is all in on AI bullshit. Got maybe one thing left to move to streamerbot and then I’m clear of it. My two regular viewers won’t notice at all, but I’ll feel better about it.

    • uncouple9831@lemmy.zip · +5/-48 · 16 hours ago

      Why? If the code works the code works, and a person had to make it work. If they generated some functions who cares? If they let the computer handle the boilerplate, who cares? “Oh no the style is inconsistent…” Who cares?

      • brian@programming.dev · +5 · 5 hours ago

        you shouldn’t be able to tell if someone used ai to write something. if you can then it is bad code. they’re not talking about getting completion on a fn, they’re talking about letting an agent go and write chunks of the project.

        • uncouple9831@lemmy.zip · +1/-3 · 5 hours ago

          So then the policy doesn’t make sense, and it should instead focus on the specific problematic issues associated with LLM-generated code. For example, I’ve seen LLMs generate fairly unreadable loops because they use weird variable names. That’s a valid offense to criticize.

          However I’ve also read C code before so I’ve seen an obscene amount of human generated code with shitty variable names that don’t mean anything. So why is the shitty human C code ok but shitty LLM code is not? And if no shitty code is accepted (it’s gnome so I doubt that), then why does anyone need a new rule?
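          To illustrate the naming complaint, here’s a hypothetical before/after (made up for illustration, not code from any actual submission): the same loop with throwaway names and with descriptive ones.

```python
# Hard to follow: throwaway names (the style being criticized).
def f(x):
    r = 0
    for t in x:
        r += t["v"] * t["q"]
    return r

# Same logic, readable: "v" is unit price and "q" is quantity
# in this made-up data shape.
def order_total(line_items):
    total = 0
    for item in line_items:
        total += item["v"] * item["q"]
    return total
```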

      • urandom@lemmy.world · +5 · 8 hours ago

        It’s always some definition of “works”. The code never works in all cases, which leads to people being annoyed with GNOME for allowing the extension in the first place.

        • uncouple9831@lemmy.zip · +1/-4 · 5 hours ago

          How is that different from most gnome extensions? Or gnome extensions right after a gnome update that breaks them all? Isn’t being broken the default state of gnome extensions?

        • De Lancre@lemmy.world · +1/-1 · 2 hours ago

          But if we’re talking about extensions, no one will debug your code. There are, like, 5 extensions that are used consistently, and the others have 5-10 downloads. We have, like, 5 extensions to hide the top bar, because each time the developer just gives up, so I don’t really understand this “rule” and the reasons behind it.

        • ikidd@lemmy.world · +3/-3 · 6 hours ago

          This is Gnome we’re talking about here, they don’t GAF if extensions work or not. They’ll break them tomorrow if they feel like it.

        • uncouple9831@lemmy.zip · +2/-17 · 5 hours ago

          Why would that be anyone other than the original author? This sounds like a hosting service is refusing to host things based on what tool was used in creation. “Anyone using emacs can’t upload code to GitHub anymore” seems equivalently valid.

          • vrighter@discuss.tchncs.de · +9 · 9 hours ago

            in the case of AI-generated code, that is almost always the case. People say “but I review all my pet neural network’s code!” but they don’t. If they did, the job would actually take longer. Reading and understanding code takes longer than writing it.

            • uncouple9831@lemmy.zip · +1/-3 · 5 hours ago

              I don’t think this is in response to my message. If it was the intent, I think you need to define what “that” is, the thing you say is almost always the case.

          • imecth@fedia.io · +11 · 11 hours ago

            GNOME manually reviews every extension, and they understandably don’t want to review AI generated code.

            • uncouple9831@lemmy.zip · +1/-4 · 5 hours ago

              Oh…an actually human response. How refreshing. At least one person here got their rabies shot.

              Do they actually review it or is it like how android and apple “review” apps? And why would they be reviewing the code rather than putting it through some test suite/virus scanning suite or something? That is, this shit isn’t going away any time soon even if the bubble pops, so why not find a way to avoid the work rather than ban people who make the work “too hard”?

                • uncouple9831@lemmy.zip · +1/-2 · 2 hours ago

                  I’m calm, but since you need to hear it: nobody has ever in the history of the human race received the command to “calm down” and had it make them calmer. So chill out broski.

                • uncouple9831@lemmy.zip · +1/-5 · 4 hours ago

                  Oof this just makes it so much worse. It sounds like they have two complaints:

                  1. There are more extensions being made now. Good. If you can’t keep up, charge money to review them or something. Even charging 10 cents will drop submissions instantly.

                  2. The extensions have unnecessary try/catch blocks. And it’s not just any unnecessary try/catch blocks… it’s only the AI-generated unnecessary ones. Human-generated unnecessary try/catch blocks are fine. This is a dumb example, because it’s a structure whose behavior is well understood and well defined. I add unnecessary try/catch blocks to my code all the time if I don’t feel like digging in at the moment to figure out all the failure modes of some function. It’s only when an LLM does it that it upsets the poster. Ridiculous.
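                  For what it’s worth, the pattern under discussion looks something like this (a hypothetical Python sketch; the actual extensions are written in JavaScript): a try/catch that catches an exception only to re-raise it, leaving behavior functionally identical to the bare call.

```python
# Redundant: the except clause re-raises unchanged, so this wrapper
# behaves just like calling int() directly.
def parse_port_wrapped(text):
    try:
        return int(text)
    except ValueError as e:
        raise e

# Equivalent, minus the ceremony.
def parse_port(text):
    return int(text)
```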

                • uncouple9831@lemmy.zip · +1/-2 · 5 hours ago

                  Why are you atting us? Replies show up for us anyway.

                  Regardless, I don’t know how your statement addresses anything I said.

          • fodor@lemmy.zip · +6 · 12 hours ago

            Yes it would be someone else. If the code looks good then it might last a long time, and it could even be expanded upon. One key point of FOSS is that anyone can change it, and if it’s good, people will.

            • uncouple9831@lemmy.zip · +1/-2 · 5 hours ago

              Great, so then it’s someone reading new code either way, so it shouldn’t matter if it’s in the LLM style or random human A’s style, it’s still something you have to read and learn.

              But also I wonder if there’s an analysis of how many of these extensions have ever been touched by more than a single human, ever. I don’t know, but I sure wouldn’t be surprised if the answer is 80%.

            • uncouple9831@lemmy.zip · +1/-4 · 5 hours ago

              And is that something that happens regularly with gnome extensions? My recollection is they are a barely functioning collection of random trash code. Were they all written by contractors who got fired?

  • itsathursday@lemmy.world · +68 · 22 hours ago

    You used to be able to tell an image was photoshopped because of the pixels. Now with code you can tell it was written with AI because of the comments.
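    The tell being joked about is comments that restate each line of code, e.g. (a made-up illustration, not from any real submission):

```python
# Initialize the counter variable to zero.
counter = 0
# Loop over each number in the list of numbers.
for number in [1, 2, 3]:
    # Add the current number to the counter.
    counter = counter + number
# Print the final value of the counter.
print(counter)
```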

  • danhab99@programming.dev · +6/-8 · 18 hours ago

    So what does this mean? Because (at least with my boss) whenever I submit AI-generated code at work, I still have to have a deep and comprehensive understanding of the changes I made, and I have to be right (meaning I have to be right about what I say, because I can’t just say the AI solved the problem). What’s the difference between that and me writing the code myself (plus Googling and Stack Overflow)?

    • theneverfox@pawb.social · +19/-1 · 17 hours ago

      The difference is people aren’t being responsible with AI

      You’re projecting competence onto others. You speak like you’re using AI responsibly

      I use AI when it makes things easier. All the time. I bet you do too. Many people are using AI without a steady hand, without the intellectual strength to use it properly in a controlled manner

      • Hawk@lemmynsfw.com · +3 · 17 hours ago

        It’s like a gas can over a match. Great for starting a campfire. Excellent for starting a wildfire.

        Learning the basics and developing a workflow with VC is the answer.

          • Hawk@lemmynsfw.com · +4/-1 · 16 hours ago

            Large language models are incredibly useful for replicating patterns.

            They’re pretty hit and miss with writing code, but once I have a pattern that can’t easily be abstracted, I use it all the time and simply review the commit.

            Or a quick proof of concept to ensure a higher level idea can work. They’re great for that too.

            It is very annoying though when I have people submit me code that is all AI and incredibly incorrect.

            It’s just another tool on my belt. It’s not going anywhere, so the real trick is figuring out when to use it and why, and when not to use it.

            To be clear, VC was version control. I should have been clearer.

            • theneverfox@pawb.social · +3 · 16 hours ago

              Okay, that’s pretty fair. You seem to understand the tool properly

              I’d argue that version control is not the correct layer to evaluate output, but it is a tool that can be used in many different ways…I don’t think that’s a great workflow, but I can conceive situations where that’s viable enough

              If I were handing out authorizations to use AI, you’d get it

      • uncouple9831@lemmy.zip · +1/-10 · 16 hours ago

        Banning a tool because the people using it don’t check their work seems shortsighted. Ban the poor users, not the tool.

        • logging_strict@programming.dev · +2 · 5 hours ago

          They should state a justification, not merely what they are looking for to identify AI-generated code.

          The justification could be that the author is unlikely to be capable of maintenance, in which case the extension just shifts an inconvenience/burden onto others.

          So far there is no justification stated besides, da fuk and yuk.

          • uncouple9831@lemmy.zip · +1 · 5 hours ago

            Exactly, there are no criteria other than the reviewer getting butthurt. Granted, this is GNOME, so doing whatever they feel like regardless of consequences is kind of their thing, but a saner organization would try to make the actual measurable badness clearer.

        • theneverfox@pawb.social · +6/-1 · 16 hours ago

          We do this all the time. I’m certified for a whole bunch of heavy machinery; if I were worse at it, people would’ve died.

          And even then, I’ve nearly killed someone. I haven’t, but on a couple occasions I’ve come way too close

          It’s good that I went through training. Sometimes, it’s better to restrict who is able to use powerful tools

          • uncouple9831@lemmy.zip · +2/-5 · 13 hours ago

            Yeah something tells me operating heavy machinery is different from uploading an extension for a desktop environment. This isn’t building medical devices, this isn’t some misra compliance thing, this is a widget. Come on, man, you have to know the comparison is insane.

            • theneverfox@pawb.social · +4/-1 · 4 hours ago

              People have already died to AI. It’s cute when the AI tells you to put glue on your pizza or asks you to leave your wife, it’s not so cute when architects and doctors use it

              Bad information can be deadly. And if you rely too hard on AI, your cognitive abilities drop. It’s a simple mental shortcut that works on almost everything

              It’s only been like 18 months, and already it’s become very apparent a lot of people can’t be trusted with it. Blame and punish those people all you want, it’ll just keep happening. Humans love their mental shortcuts

              Realistically, I think we should just make it illegal to have customer facing LLMs as a service. You want an AI? Set it up yourself. It’s not hard, but realizing it’s just a file on your computer would do a lot to demystify it

              • uncouple9831@lemmy.zip · +1 · 3 hours ago

                Have people died to desktop extensions?

                Cause that’s the topic here.

                You’re fighting a holy war against all AI, dune style.

                I’m saying this is a super low risk environment where the implications appear to be extra try/catch blocks the code reviewers don’t like – not even incorrect functionality.

                • theneverfox@pawb.social · +2 · 1 hour ago

                  Well I was just arguing that people generally are using AI irresponsibly, but if you want to get specific…

                  You say ban the users, but realistically how are they determining that? The only way to reliably check if something is AI is human intuition. There’s no tool to do that, it’s a real problem

                  So effectively, they made it an offense to submit AI slop. Because if you just use AI properly as a resource, no one would be able to tell

                  So what are you upset about?

                  They did basically what you suggested, they just did it by making a rule so that they can have a reason to reject slop without spending too much time justifying the rejection

    • fodor@lemmy.zip · +2/-1 · 12 hours ago

      What’s the difference? Jesus, we have seen the difference in the news for the past year. You know the difference. Don’t play dumb now.

      • De Lancre@lemmy.world · +1 · 2 hours ago

        We’re still talking about extensions, right? Those things in GNOME that show the weather, or the time in a different time zone?

        Cause if yes, your response is kinda weird. Oh no, my weather applet was created using AI! Everything will fall apart! Jesus Christ, we need to burn the author for that!