• Anna@lemmy.ml · 9 days ago

    When AI can sit through a dozen meetings discussing stupid things only to finalize whatever you had already decided earlier, then I’ll be worried.

    • CarrotsHaveEars@lemmy.ml · 8 days ago

      Personally, I would happily let my AI bot attend the stupid scrum meetings for me. Let it tell my scrum master and stakeholders whatever progress I’ve made that day and in the sprint. Don’t bother me during my coding time.

      • balsoft@lemmy.ml · 7 days ago

        We made a (so far internal) tool at work that takes your activity from GitHub, your calendar, and the issue tracker, feeds it to a local LLM, and spits out a report of what you’ve been doing for the week. It messes up sometimes, but it speeds up writing the report dramatically. This is one of those cases where an LLM actually fits.
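
        Not the tool described above, just a minimal sketch of the same idea: pull recent GitHub activity, fold it into a prompt, and ask a local LLM to write the weekly summary. The username, model name, and prompt wording are placeholders, and it assumes a GitHub personal-access token in GH_TOKEN plus an Ollama server on its default port; calendar and issue-tracker data would be merged into the activity list the same way.

```python
# Sketch only: gather recent GitHub events and summarize them with a
# local LLM. Calendar/issue-tracker entries would be appended to
# `activity` in the same one-line-per-event format.
import os
import requests

GH_USER = "your-username"          # placeholder
GH_TOKEN = os.environ["GH_TOKEN"]  # assumes a personal access token

# Recent public events for the user (GitHub REST API).
resp = requests.get(
    f"https://api.github.com/users/{GH_USER}/events",
    headers={"Authorization": f"Bearer {GH_TOKEN}"},
    timeout=30,
)
resp.raise_for_status()
activity = [
    f"{e['type']} on {e['repo']['name']} at {e['created_at']}"
    for e in resp.json()
]

prompt = (
    "Write a short weekly status report, grouped by project, "
    "based on this activity log:\n" + "\n".join(activity)
)

# Local model via Ollama's generate endpoint (assumes `ollama serve`
# is running and the model has already been pulled).
report = requests.post(
    "http://localhost:11434/api/generate",
    json={"model": "llama3", "prompt": prompt, "stream": False},
    timeout=300,
).json()["response"]

print(report)
```

        As the comment notes, the output still needs a human pass before it goes anywhere.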

  • HiddenLayer555@lemmy.ml · 9 days ago

    LLMs can’t even stay on topic when specifically being asked to solve one problem.

    This happens to me all the damn time:

    I paste a class that references some other classes, all of which I’ve already tested and confirmed to be working. My problem is in a specific method that doesn’t directly call any of the other classes. I tell the LLM specifically which method is not working, and I tell it that I have tested all the other methods and they work as intended (complete with comments documenting what they’re supposed to do). I then ask the LLM to focus only on the method I specified, and it still goes on about “have you implemented all the other classes this class references? Here’s my shitty implementation of those classes instead.”

    So then I paste all the classes that the one I’m asking about depends on, reiterate that all of them have been tested and are working, and tell the LLM again which method has the problem. It still decides that my problem must be in the other classes and starts “fixing” them, which 9 times out of 10 just means rearranging the code I already wrote and fucking up the organisation I had designed.
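
    For illustration only, here is one way to script that kind of scope constraint instead of pasting into a chat window: ship the dependencies as explicitly read-only context and name the single method that may be changed. The file names, method name, model, and local Ollama endpoint are all made up for the sketch, and nothing stops a model from ignoring the instruction anyway, which is rather the point of the complaint above.

```python
# Hypothetical sketch: constrain a local LLM to one method by making the
# scope explicit in the prompt. Paths, model, and endpoint are
# placeholders; models can and do ignore instructions like these.
from pathlib import Path

import requests

target_class = Path("inventory.py").read_text()  # class with the broken method (placeholder path)
dependencies = Path("storage.py").read_text()    # already tested and working (placeholder path)

prompt = f"""The following classes are TESTED AND WORKING. Do not modify,
reimplement, or comment on them; they are context only:

{dependencies}

This is the class under discussion:

{target_class}

Only the method `recalculate_totals` is broken. Propose a fix for that one
method and nothing else. Do not touch any other method or class."""

answer = requests.post(
    "http://localhost:11434/api/generate",
    json={"model": "llama3", "prompt": prompt, "stream": False},
    timeout=300,
).json()["response"]

print(answer)
```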

  • Jimmycrackcrack@lemmy.ml · 9 days ago

    I realise the dumbass here is the guy saying programmers are ‘cooked’, but there’s something kind of funny about how the programmer talks. They point out how people misunderstand the complexities of their job, and how easily LLMs make mistakes because they can’t grasp the nuances of what the programmer does every day and understands deeply. They rightly note that without their specialist oversight, AI agents would fail in ridiculous and spectacular ways, yet they happily and vaguely add a throwaway statement at the end, “replacing other industries, sure”, with the exact same blitheness and lack of personal understanding with which ‘Ace’ proclaims all programmers cooked.

    • ☆ Yσɠƚԋσʂ ☆@lemmy.ml (OP) · 8 days ago

      I find this is a really common trope where people appreciate the complexity of the domain they work in, but assume every other domain is trivial by comparison.

      • HiddenLayer555@lemmy.ml · 8 days ago

        There’s a saying in Mandarin that translates to something like: Being in different professions is like being on opposite sides of a mountain. It basically means you can never fully understand a given profession unless you’re actually doing it.

  • Ephera@lemmy.ml · 9 days ago

    It’s like a conspiracy theory for that guy. Everyone who tells them it’s not true that you can get rid of programmers must be a programmer, and therefore cannot be trusted.

    • moseschrute@lemmy.ml · 9 days ago

      To be fair, we should probably all start migrating to cybersecurity positions. They’ll need it when they discover how many vulnerabilities were created by all the non-programmers vibe coding.
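
      As a purely illustrative example (not from the thread), the textbook hole that hastily generated CRUD code keeps reintroducing is string-built SQL; the parameterized version is the fix.

```python
# Illustrative only: the classic injection bug that tends to slip into
# quickly generated code, next to the parameterized fix.
import sqlite3

def find_user_unsafe(conn: sqlite3.Connection, name: str):
    # Vulnerable: name = "' OR '1'='1" makes the query return every row.
    return conn.execute(
        f"SELECT id, name FROM users WHERE name = '{name}'"
    ).fetchall()

def find_user_safe(conn: sqlite3.Connection, name: str):
    # Parameterized query: the driver handles escaping.
    return conn.execute(
        "SELECT id, name FROM users WHERE name = ?", (name,)
    ).fetchall()
```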

      • bountygiver [any]@lemmy.ml · 8 days ago

        And it will be a good time to start exploiting them for nefarious purposes, in particular hitting companies where it hurts, the balance books, so that it actually starts driving demand for fixing them.

  • CapriciousDay@lemmy.ml · 8 days ago

    I don’t know if they quite appreciate that if programmers are cooked like this, everyone who isn’t a billionaire is too. Let me introduce you to robot transformer models.

    • ☆ Yσɠƚԋσʂ ☆@lemmy.ml (OP) · 8 days ago

      Exactly. To eliminate the need for programmers you would need AGI, and that would simply mean the end of capitalism, because at that point any job a human does could be automated.

      • WhatsTheHoldup@lemmy.ml · 8 days ago

        Ah but I’ll simply ask ChatGPT to generate me a job.

        Personally, I haven’t gotten any jobs this way, but that doesn’t mean employers aren’t cooked.

        They are 👍

  • verdigris@lemmy.ml · 8 days ago

    The problem is that too many execs are thinking like this guy. It’s not actually tenable to replace programmers with AI, but people who aren’t programmers are less likely to understand that.