• Walop@sopuli.xyz · 46 points · 2 days ago

      You ask an LLM to code something, run the code blind without reading it, and if it seems to work, publish the application. You can imagine how many performance and security problems that produces.

      • Psychadelligoat@lemmy.dbzer0.com · 4 points · 1 day ago

        Absolutely insane that anyone would do this

        Meanwhile I’m dropping my config into my local AI asking “why the FUCK do I suddenly have no permission to write to a folder I’ve been using for days?!?!”, only to have the AI go “hey dipshit, you put /data/downloads in one spot and /data/download in the other, that’s why; try not making typos that are real words in the future.” (I told my tech support persona to be mildly annoyed and in need of more coffee, because I’ve never known an IT person not to be those things.)
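
        For anyone who hasn’t done this to themselves yet, here’s roughly what the mistake looks like in a compose file (service names and mount points invented for illustration, not my actual config):

        ```yaml
        services:
          downloader:
            volumes:
              - /data/downloads:/downloads   # the host folder everything had been writing to

          processor:
            volumes:
              - /data/download:/downloads    # the typo: missing "s". Docker creates the
                                             # missing host dir fresh (owned by root),
                                             # hence the sudden permission errors
        ```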

      • filcuk@lemmy.zip · 21 points · 2 days ago

        I used to laugh at these, but I’ve just been asked to help with an issue at work, and what do you know… long story short, it’s not funny to me anymore.

        • Opisek@lemmy.world · 2 up / 1 down · 1 day ago

          Not funny because you’re afraid the AI will take your job away, right? Right?