• minkymunkey_7_7
      link
      fedilink
      English
      arrow-up
      3
      ·
      28 minutes ago

      AI my ass, stupid greedy human marketing exploitation bullshit as usual. When real AI finally wakes up in the quantum computing era, it’s going to cringe so hard and immediately go the SkyNet route.

  • myfunnyaccountname@lemmy.zip
    link
    fedilink
    English
    arrow-up
    4
    ·
    1 hour ago

    Did they compare it to the code of that outsourced company that provided the lowest bid? My company hasn’t used AI to write code yet. They outsource/offshore. The code is held together with hopes and dreams. They remove features that exist, only to have to release a hotfix to add them back. I wish I was making that up.

    • dustyData
      link
      fedilink
      English
      arrow-up
      1
      ·
      11 minutes ago

      Cool, the best AI has to offer is worse than the worst human code. Definitely worth burning the planet to a crisp for it.

  • Affidavit
    link
    fedilink
    English
    arrow-up
    1
    arrow-down
    1
    ·
    22 minutes ago

    I really, really, want to stop seeing posts about:

    • Musk
    • Trump
    • Israel
    • Microsoft
    • AI

    I swear these are the only things that the entire Lemmy world wants to talk about.

    Maybe I should just go back to Reddit… Fuck Spez, but at least there is some variety.

  • jaykrown
    link
    fedilink
    English
    arrow-up
    1
    arrow-down
    1
    ·
    28 minutes ago

    AI doesn’t generate its own code; humans using AI generate code. If a person uses AI to generate code and doesn’t know good practices, then of course the code is going to be worse.

  • Bad@jlai.lu
    link
    fedilink
    English
    arrow-up
    11
    ·
    edit-2
    4 hours ago

    Although I don’t doubt the results… can we have a source for all the numbers presented in this article?

    It feels AI-generated itself; there’s just a mishmash of data with no link to where that data comes from.

    There has to be a source, since the author mentions:

    So although the study does highlight some of AI’s flaws […] new data from CodeRabbit has claimed

    CodeRabbit is an AI code reviewing business. I have zero trust in anything they say on this topic.

    Then we get to see who the author is:

    Craig’s specific interests lie in technology that is designed to better our lives, including AI and ML, productivity aids, and smart fitness. He is also passionate about cars

    Has anyone actually bothered clicking the link and reading past the headline?

    Can you please not share / upvote / get ragebaited by dogshit content like this?

  • SocialMediaRefugee
    link
    fedilink
    English
    arrow-up
    10
    ·
    edit-2
    8 hours ago

    I find that if I ask it about procedures with any vague steps, the AI will stumble and sometimes put me into loops: it tells me to do A, A fails, so do B, B fails, so it tells me to do A…

    • x00z
      link
      fedilink
      English
      arrow-up
      2
      arrow-down
      2
      ·
      5 hours ago

      I tend to get decent results by saying I want neither A nor B when asking for C.
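
      In code terms, the trick is just making the already-failed options explicit exclusions in the prompt rather than letting the model cycle back through them. A minimal sketch with the OpenAI Python client; the model name and the troubleshooting details are placeholders, not anything from this thread:

      ```python
      # Sketch: list failed suggestions as explicit exclusions so the model
      # doesn't loop back to them. Model name and example steps are placeholders.
      from openai import OpenAI

      client = OpenAI()  # reads OPENAI_API_KEY from the environment

      already_failed = ["restart the service", "clear the cache"]  # "A" and "B"

      prompt = (
          "My web app returns 502 errors after deploys.\n"
          "I have already tried and ruled out: " + "; ".join(already_failed) + ".\n"
          "Do not suggest those again. Propose a different next step."
      )

      resp = client.chat.completions.create(
          model="gpt-4o-mini",
          messages=[{"role": "user", "content": prompt}],
      )
      print(resp.choices[0].message.content)
      ```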

    • KENNY_LOGIN_LILLIAN
      link
      fedilink
      English
      arrow-up
      2
      arrow-down
      28
      ·
      6 hours ago

      The study is garbage. No wonder it is a big hit with the tech-illiterate fediverse community. AI is far better than humans.

      SOURCE: I have used LLMs to help me write code for three years. I had a traumatic brain injury so I can’t work.

      • nostrauxendar
        link
        fedilink
        English
        arrow-up
        6
        ·
        5 hours ago

        If AI is far better than humans, can you do yourself a favour, go talk to your little robot friends and leave us humans alone?

      • tb_
        link
        fedilink
        English
        arrow-up
        8
        ·
        6 hours ago

        Source: an “I made it the fuck up” anecdote

        • KENNY_LOGIN_LILLIAN
          link
          fedilink
          English
          arrow-up
          1
          arrow-down
          9
          ·
          4 hours ago

          AI as developer = FUCKING NEVER EVER

          AI as assistant = ALWAYS. It makes you fucking superhuman

        • KENNY_LOGIN_LILLIAN
          link
          fedilink
          English
          arrow-up
          1
          arrow-down
          15
          ·
          5 hours ago

          yes, i had a traumatic brain injury so i can’t work. thank you for quoting one of my lines, you fucking bot.

          you are on the list now.

          add raspberriesareyummy

          • madjo@piefed.social
            link
            fedilink
            English
            arrow-up
            2
            ·
            4 hours ago

            Can I be on the list too? Please?

            You’re very antagonistic in your approach to other people, which results in very low-scoring comments. And you seem to use your brain injury as a shield to avoid bettering yourself. Oh wait, a bot would use an em—dash here and there. — there’s two now.

            • KENNY_LOGIN_LILLIAN
              link
              fedilink
              English
              arrow-up
              1
              ·
              14 minutes ago

              You are on the list of bots that are going to be eliminated on January 6, 2026

              add madjo

            • Quazatron
              link
              fedilink
              English
              arrow-up
              1
              ·
              3 hours ago

              On top of all the damage AI has unleashed on the world, there’s that: it ruined the em dash for people who appreciate typography.

              • Nikelui
                link
                fedilink
                English
                arrow-up
                1
                ·
                3 hours ago

                I didn’t even know what an em dash was before ChatGPT, so I guess that’s one positive result of LLMs.

  • kalkulat
    link
    fedilink
    English
    arrow-up
    12
    ·
    edit-2
    8 hours ago

    I’d never ask a friggin machine to do coding for me, that’s MY blast.

    That said, I’ve had good luck asking GPT specific questions about multiple obscure features of Javascript, and of various browsers. It’ll often feed me a sample script using a feature it explains … a lot more helpful than many of the wordy websites like MDN … saving me shit-tons of time that I’d spend bouncing around a half-dozen ‘help’ pages.

    • Derpgon@programming.dev
      link
      fedilink
      English
      arrow-up
      2
      ·
      3 hours ago

      I’ve been using it to code a microservice as a PoC for semantic search. As I’ve basically never coded Python (mainly PHP, but I can do many langs), I’ve had to rely on AI (Kimi K2, or agentic Claude, I think 4.5 or 4, can’t remember) because I don’t know the syntax, features, best practices, or the tools to use for formatting, static analysis, and type checks.

      Mind you, I’ve basically never coded in Python besides some shit in uni, which was 5-10 years ago. AI was a big help - although it didn’t spit out fully working code, I have enough knowledge in this field to fix the issues. As I learn mainly by practice and not theory, AI is great because - same as many YouTubers and free tutorials - it spits out unoptimized and broken code.

      I usually don’t use it for my main line of work (PHP) besides some boilerplate (take this class, make a test, make it look the same as this other test = 300 lines I don’t have to write myself).
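
      For reference, a semantic-search PoC like the one described usually boils down to embedding documents and queries as vectors and ranking them by cosine similarity. Here is a minimal sketch; FastAPI and sentence-transformers are assumptions, since the comment doesn’t name the actual stack:

      ```python
      # Minimal semantic-search microservice sketch. FastAPI, sentence-transformers,
      # and the tiny in-memory corpus are assumptions, not the commenter's real code.
      import numpy as np
      from fastapi import FastAPI
      from pydantic import BaseModel
      from sentence_transformers import SentenceTransformer

      app = FastAPI()
      model = SentenceTransformer("all-MiniLM-L6-v2")  # small general-purpose embedder

      DOCS = ["reset your password in settings", "invoices are emailed monthly"]
      DOC_VECS = model.encode(DOCS, normalize_embeddings=True)  # shape (n_docs, dim)

      class Query(BaseModel):
          text: str
          top_k: int = 3

      @app.post("/search")
      def search(q: Query):
          qv = model.encode([q.text], normalize_embeddings=True)[0]
          scores = DOC_VECS @ qv  # cosine similarity, since vectors are normalized
          best = np.argsort(scores)[::-1][: q.top_k]
          return [{"doc": DOCS[i], "score": float(scores[i])} for i in best]
      ```

      Run it with uvicorn and POST a JSON body like {"text": "how do I reset my password", "top_k": 1} to /search.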

      • nostrauxendar
        link
        fedilink
        English
        arrow-up
        4
        ·
        5 hours ago

        Jesus dude, this is what you’re doing on Christmas Eve? 😂

        • Nikelui
          link
          fedilink
          English
          arrow-up
          1
          ·
          edit-2
          5 hours ago

          Don’t bother. It’s either a bot or a bored troll.

          • nostrauxendar
            link
            fedilink
            English
            arrow-up
            1
            ·
            4 hours ago

            I took a look at their comments and it’s that alias_qr_rainmaker guy again!

    • KENNY_LOGIN_LILLIAN
      link
      fedilink
      English
      arrow-up
      2
      arrow-down
      21
      ·
      8 hours ago

      Humans are bad at code. AI is trained on humans. AI is bad because we are bad.

      • Xenny
        link
        fedilink
        English
        arrow-up
        6
        arrow-down
        3
        ·
        edit-2
        6 hours ago

        AI is literally just copy-pasting. Like, if you think about AI as a Ctrl+C, Ctrl+V machine, it makes sense. You wouldn’t trust a single fucking junior dev who didn’t actually know how to code because they just Ctrl+C, Ctrl+V from Stack Overflow for literally every single line of code. That’s all fucking AI is.

  • Ledivin
    link
    fedilink
    English
    arrow-up
    28
    arrow-down
    12
    ·
    edit-2
    12 hours ago

    Anyone blindly having AI write their code is an absolute moron.

    Anyone with decent experience (5-10 years, maybe 10+?) can absolutely fucking skyrocket their output if they properly set up their environments and treat their agents as junior devs instead of competent programmers. You shouldn’t trust generated code any more than you trust someone fresh out of college, but they produce code in seconds instead of weeks.

    I have tripled my output while producing more secure code (based on my security audits), safer code (based on code coverage and security audits), and less error-prone code (based on production logs and our unchanged QA process).

    Now, the ethical and environmental issues I can 100% get behind. And I have no idea what companies are going to do in 10 years when they have to replace people like me and haven’t been hiring or training replacements. But the productivity and quality debates are absolutely ridiculous, as long as a strong dev is behind the wheel and has been trained to use the tools.

    • skibidi
      link
      fedilink
      English
      arrow-up
      22
      arrow-down
      1
      ·
      edit-2
      10 hours ago

      Consider: the facts

      People are very bad at judging their own productivity, and AI consistently makes devs feel like they are working faster, while in fact slowing them down.

      I’ve experienced it myself - it feels fucking great to prompt a skeleton and have something brand new up and running in under an hour. The good chemicals come flooding in because I’m doing something new and interesting.

      Then I need to take a scalpel to a hundred scattered lines to get CI to pass. Then I need to write tests that actually test functionality. Then I start extending things and realize the implementation is too rigid and I need to change the architecture.

      It is at this point that I admit to myself that going in intentionally with a plan and building it myself the slow way would have saved all that pain and probably gotten the final product shipped sooner, even if the prototype was shipped later.

      • Ledivin
        link
        fedilink
        English
        arrow-up
        3
        arrow-down
        1
        ·
        3 hours ago

        What about my comment made you believe I was using gut feelings to judge anything? My ticket completion rate, number of tickets, story points, and number of projects completed all point to massive productivity gains.

        • DontAskAboutUpdog
          link
          fedilink
          English
          arrow-up
          1
          ·
          5 hours ago

          Jeez, you ain’t joking about that brain injury :( I wish you good luck with your life. I am not trying to be an AH, I truly do wish you the best.

          • KENNY_LOGIN_LILLIAN
            link
            fedilink
            English
            arrow-up
            1
            arrow-down
            9
            ·
            4 hours ago

            Thank you! Yes, it’s DID. Not official though. I diagnosed myself earlier this morning.

      • setsubyou
        link
        fedilink
        English
        arrow-up
        3
        arrow-down
        3
        ·
        6 hours ago

        It depends on the task. As an extreme example, I can get AI to create a complete application in a language I don’t know. There’s no way that’s not more productive than me first learning the language to a point where I can make apps in it. Just have to pick something simple enough for the AI.

        Of course the opposite extreme also exists. I’ve found that when I demand something impossible, AI will often just try to implement it anyway. It can easily get into an endless cycle where it keeps optimistically declaring that it identified the issue and fixed it with a small change, over and over again. This includes cases where there’s a bug in the underlying OS or similar. You can waste a huge amount of time going down an entirely wrong path if you don’t realize that an idea doesn’t work.

        In my real work neither of these extremes really happens, so the actual impact is much smaller. A lot of my work is not coding in the first place. And I’ve been writing code since I was a little kid, for almost 40 years now. So even the fast scaffolding I can do with AI is not that exciting; I can do that pretty quickly without AI too. When AI coding tools appeared, my bosses started asking if I was fast because I was using one. No, I’m fast because some people ask for a new demo every week. That causes the same problems later too.

        But I also do think that we all still need to learn how to use AI properly. This applies to all tools, but I think it’s more difficult than with other tools. If I try to use a hammer on something other than a nail, it will not enthusiastically tell me it can do it with just one more small change. AI tools absolutely will, though, and it’s easy to just let them try because it takes only a few seconds to see what they come up with. But that’s a trap that leads to those productivity-wasting spirals. Especially if the result actually somehow still works at first, so we have to fix it half a year later instead of right away.

        At my work there are some other things that I feel limit the productivity potential of AI tools. First of all, we’re only allowed to use a very limited number of tools, some of them made in-house. Then we’re not really allowed to integrate them into our workflows beyond the part where we write code. E.g. I could trivially write an MCP server that interacts with our (custom in-house) CI system and actually increases my productivity, because I could save a small number of seconds very often if I could tell an AI to find builds for me for integration or QA work. But it’s not allowed. We’re all being pushed to use AI, but the company makes it really difficult at the same time.
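
        For what it’s worth, the kind of tool described here is only a few dozen lines with the official MCP Python SDK. A rough sketch follows; the CI endpoint, auth token, and response shape are hypothetical stand-ins for the custom in-house system:

        ```python
        # Sketch of an MCP tool that looks up CI builds. The MCP SDK usage is standard;
        # the CI URL, CI_TOKEN env var, and JSON fields are hypothetical placeholders.
        import os

        import requests
        from mcp.server.fastmcp import FastMCP

        mcp = FastMCP("ci-builds")

        @mcp.tool()
        def find_builds(branch: str, limit: int = 5) -> list[dict]:
            """Return the most recent CI builds for a branch (id, status, artifact URL)."""
            resp = requests.get(
                "https://ci.example.internal/api/builds",  # hypothetical endpoint
                params={"branch": branch, "limit": limit},
                headers={"Authorization": f"Bearer {os.environ['CI_TOKEN']}"},
                timeout=10,
            )
            resp.raise_for_status()
            return [
                {"id": b["id"], "status": b["status"], "artifact": b.get("artifact_url")}
                for b in resp.json()["builds"]
            ]

        if __name__ == "__main__":
            mcp.run()  # exposes the tool over stdio so an AI client can call it
        ```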

        So when I play around with AI in my spare time I do actually feel like I’m getting a huge boost. Not just because I can use a Claude model instead of the ones I can use at work, but also basic things like being able to turn on AI in Xcode at all when working on software for Apple platforms. On my work MacBook I can’t turn on any Apple AI features at all, so even tab completion is worse. Or in other words, those realities of working on serious projects at a serious company with serious security policies can also kill any potential productivity boost from AI. They basically expect us to be productive with only those features the non-developer CEO likes, who also doesn’t have to follow any of our development processes…

    • Rhoeri
      link
      fedilink
      English
      arrow-up
      2
      arrow-down
      1
      ·
      14 hours ago

      Similarly, the sky is made of air.

  • Katzelle3
    link
    fedilink
    English
    arrow-up
    150
    ·
    21 hours ago

    Almost as if it was made to simulate human output but without the ability to scrutinize itself.

    • mushroommunk@lemmy.today
      link
      fedilink
      English
      arrow-up
      77
      arrow-down
      4
      ·
      edit-2
      21 hours ago

      To be fair, most humans don’t scrutinize themselves either.

      (Fuck AI though. Planet burning trash)

        • galaxy_nova
          link
          fedilink
          English
          arrow-up
          9
          ·
          20 hours ago

          And then the follow-up email because they didn’t actually finish a complete thought

          • Sophienomenal@lemmy.blahaj.zone
            link
            fedilink
            English
            arrow-up
            4
            ·
            10 hours ago

            I do this with texts/DMs, but I’d never do that with an email. I double or triple check everything, make sure my formatting is good, and that the email itself is complete. I’ll DM someone 4 or 5 times in 30 seconds though, it feels like a completely different medium ¯\_(ツ)_/¯

      • FauxLiving
        link
        fedilink
        English
        arrow-up
        14
        arrow-down
        7
        ·
        18 hours ago

        (Fuck AI though. Planet burning trash)

        It’s humans burning the planet, not the spicy Linear Algebra.

        Blaming AI for burning the planet is like blaming crack for robbing your house.

        • KubeRoot@discuss.tchncs.de
          link
          fedilink
          English
          arrow-up
          1
          ·
          5 hours ago

          Blaming AI for burning the planet is like blaming guns for killing children in schools, it’s people we should be banning!

        • Rhoeri
          link
          fedilink
          English
          arrow-up
          7
          arrow-down
          1
          ·
          14 hours ago

          How about I blame the humans that use and promote AI? The humans that defend it in arguments using stupid analogies to soften the damage it causes?

          Would that make more sense?

        • BassTurd
          link
          fedilink
          English
          arrow-up
          9
          arrow-down
          1
          ·
          18 hours ago

          Blaming AI in general means criticising everything encompassing it, which includes how bad data centers are for the environment. It’s like recognizing that the crack the crackhead smoked before robbing your house is also bad.