• trolololol@lemmy.world · ↑97 ↓2 · 4 months ago

    I hope this helps people understand that you don’t get to be CEO by being smart or working hard. It’s all influence and gossip all the way up.

  • Hackworth@lemmy.world · ↑76 ↓1 · 4 months ago

    “Coding” was never the source of value, and people shouldn’t get overly attached to it. Problem solving is the core skill. The discipline and precision demanded by traditional programming will remain valuable transferable attributes, but they won’t be a barrier to entry. - John Carmack

    • shalafi@lemmy.world · ↑2 · 4 months ago

      Agreed! Problem solving is core to any sort of success. Whether you’re moving up or on for more pay, growing tomatoes or nurturing a relationship, you’re problem solving. But I can see AI putting the screws to those of us in tech.

      Haven’t used it much so far, last job didn’t afford much coding opportunity, but I wrote a Google Apps script to populate my calendar given changes to an Excel sheet. Pretty neat!

      With zero experience App scripting, I tried going the usual way, searching web pages. Got it half-ass working, got stuck. Asked ChatGPT to write it and boom, solved with an hour’s additional work.

      You could say, “Yeah, but you at least had a clue as to general scripting and still had to problem solve. Plus, you came up with the idea in the first place, not the AI!” Yes! But the point being, AI made the task shockingly easier. That was at a software outfit, so I had the opportunity to chat with my dev friends and see what they were up to. They were properly skeptical/realistic as to what AI can do, but they still used it to great effect.

      Another example: Struggled like hell to teach myself database scripting, so ignorant I didn’t know the words to search and the solutions I found were more advanced answers than my beginner work required (or understood!). First script was 8 short lines, took 8 hours. Had AI been available to jump start me, I could have done that in an hour, maybe two. That’s a wild productivity boost. So while AI will never make programmers obsolete, we’ll surely need fewer of them.
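The kind of eight-line beginner database script described above might look something like this in Python - purely illustrative; the table, values, and task are invented:

```python
import sqlite3

# Hypothetical beginner task: log daily backup sizes, then find the biggest one.
con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE backups (day TEXT, size_mb REAL)")
con.executemany(
    "INSERT INTO backups VALUES (?, ?)",
    [("mon", 120.5), ("tue", 98.2), ("wed", 143.0)],
)
biggest = con.execute(
    "SELECT day, size_mb FROM backups ORDER BY size_mb DESC LIMIT 1"
).fetchone()
print(biggest)  # ('wed', 143.0)
con.close()
```

A handful of lines like these is exactly the sort of thing that takes hours when you don't yet know the words to search for, and minutes when something can point you at `executemany` and `ORDER BY ... LIMIT 1`.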

  • L0rdMathias@sh.itjust.works · ↑57 ↓2 · 4 months ago

    “Guy who was fed a pay-to-win degree at a nepotism practicing school with a silver spoon shares fantasy, to his fan base that own large publications, about replacing hard working and intelligent employees with machines he is unable to comprehend the most basic features of”

  • Boozilla@lemmy.world · ↑51 · 4 months ago

    They’ve been saying this kind of bullshit since the early 90s. Employers hate programmers because they are expensive employees with ideas of their own. The half-dozen elite lizard people running the world really don’t like that kind of thing.

    Unfortunately, I don’t think any job is truly safe forever. For myriad reasons. Of course there will always be a need for programmers, engineers, designers, testers, and many other human-performed jobs. However, that will be a rapidly changing landscape and the number of positions will be reduced as much as the owning class can get away with. We currently have large teams of people creating digital content, websites, apps, etc. Those teams will get smaller and smaller as AI can do more and more of the tedious / repetitive / well-solved stuff.

    • shalafi@lemmy.world · ↑2 · 4 months ago

      the number of positions will be reduced as much as the owning class can get away with

      Well, after all, you don’t hire people to do nothing. It’s simply a late-stage capitalism thing. Hopefully one day we can take the benefits of that extra productivity and share the wealth. The younger generations seem like they might move us that way in the coming decades.

      • Boozilla@lemmy.world · ↑1 · 4 months ago

        I really hope so. Sometimes I think the kids are alright. Like the 12 year old owning the My Pillow idiot. Then I hear the horror stories from my school teacher friends.

    • SlopppyEngineer@lemmy.world · ↑4 ↓2 · 4 months ago

      And by that time, processors and open-source AI will be good enough that any noob can ask his phone to generate a new app from scratch. You’d only need big corpo for cloud storage, and then only when the distributed systems written by AI don’t work.

  • EnderMB@lemmy.world · ↑38 · 4 months ago

    It’s worth noting that the new CEO is one of the few people at Amazon to have worked their way up from PM and sales to CEO.

    With that in mind, while it’s a hilariously stupid comment to make, he’s in the business of selling AWS and its role in AI. Take it with the same level of credibility as that crypto scammer you know telling you that Bitcoin is the future of banking.

    • mycodesucks@lemmy.world · ↑13 · 4 months ago

      PM and sales, eh?

      So you’re saying his lack of respect for programmers isn’t new, but has spanned his whole career?

    • Squizzy@lemmy.world · ↑8 · 4 months ago

      I’m a wage slave with no bitcoin or crypto, but the technology has been hijacked by these types when it could otherwise have been useful.

      • EnderMB@lemmy.world · ↑2 ↓1 · 4 months ago

        I’m not entirely sold on the technology, especially since immutable ledgers have been around long before the blockchain, but also due to potential attack vectors and the natural push towards centralisation for many applications - but I’m just one man and if people find uses for it then good for them.

        • eleitl@lemm.ee · ↑3 ↓1 · 4 months ago

          What other solutions to double spending were there in financial cryptography before?

          • EnderMB@lemmy.world · ↑3 ↓1 · 4 months ago

            No idea, I don’t work in fintech, but was it a fundamental problem that required a solution?

            I’ve worked with blockchain in the past, and the uses where it excelled were in immutable bidding contracts for shared resources between specific owners (e.g. who uses this cable at x time).

            • eleitl@lemm.ee · ↑3 ↓1 · edited · 4 months ago

              Fully decentralized p2p cryptocurrency transactions without double spending by proof of work (improvement upon Hashcash) was done first with Bitcoin. The term fintech did not exist at the time. EDIT: looked it up, apparently first use as Fin-Tech was 1967 https://en.wikipedia.org/wiki/Fintech – it’s not the current use of the term though.

  • jubilationtcornpone@sh.itjust.works · ↑38 · 4 months ago

    Just the other day, the Mixtral chatbot insisted that PostgreSQL v16 doesn’t exist.

    A few weeks ago, ChatGPT gave me a DAX measure for an Excel pivot table that used several DAX functions in ways that they could not be used.

    The funny thing was, it knew and could explain why those functions couldn’t be used when I corrected it. But it wasn’t able to correlate and use that information to generate a proper function. In fact, I had to correct it for the same mistakes multiple times and it never did get it quite right.

    Generative AI is very good at confidently spitting out inaccurate information in ways that make it sound like it knows what it’s talking about to the average person.

    Basically, AI is currently functioning at the same level as the average tech CEO.

  • Lettuce eat lettuce@lemmy.ml · ↑36 ↓1 · 4 months ago

    Lol sure, and AI made human staff at grocery stores a thing of the…oops, oh yeah…y’all tried that for a while and it failed horribly…

    So tired of the bullshit “AI” hype train. I can’t wait for the market to crash hard once everybody realizes it’s a bubble and AI won’t magically make programmers obsolete.

    Remember when everything was using machine learning and blockchain technology? Pepperidge Farm remembers…

    • rottingleaf@lemmy.world · ↑12 · 4 months ago

      It’s the pinnacle of MBA evolution.

      In their worldview, engineers are a raw material, and all that matters in the world is knowing how to do business. So it just makes sense to them that one can guide, use, and direct engineers to replace themselves.

      They don’t think of fundamentals, they really believe it’s some magic that happens all by itself, you just have to direct energy and something will come out of it.

      Lysenko vibes.

      This wouldn’t happen were the C-suite not mostly composed of bean counters. They really think they are to engineers what officers are to soldiers. The issue is - an officer must perfectly know everything a soldier knows plus their own specialty, and also bears responsibility. Bean counters generally have less education, experience, and intelligence than the engineers they direct, and they avoid responsibility all the time.

      So, putting themselves as some superior caste, they really think they can “direct progress” to replace everyone else the way factories with machines replaced artisans.

      It’s literally a whole layer of people who know how to get power, but not how to create it, and imagine weird magical stuff about things they don’t know.

        • rottingleaf@lemmy.world · ↑2 · 4 months ago

          Yeah, that’s what I mean. Black boxes are a concept to accelerate development, but we can’t blackbox ourselves through civilization. They are also mostly useful for horizontal, not vertical relationships, which people misunderstand all the time (leaky abstractions).

          This actually should make us optimistic. If hierarchical blackboxing were efficient, it would be certain that human societies would become more and more fascist and hierarchical over time without slowing down in development. But it isn’t efficient.

  • mojo_raisin@lemmy.world · ↑28 · 4 months ago

    The job of CEO seems the far easier to replace with AI. A fairly basic algorithm with weighted goals and parameters (chosen by the board) + LLM + character avatar would probably perform better than most CEOs. Leave out the LLM if you want it to spout nonsense like this Amazon Cloud CEO.

    • vaultdweller013@sh.itjust.works · ↑1 · 4 months ago

      Worst case scenario the ai fucking loses it and decides to do some wacky but weirdly effective shit. Like spamming out 1 width units en masse in Hearts of Iron 4.

  • SuperiorOne@lemmy.ml · ↑28 · 4 months ago

    ‘Soon’ is a questionable claim from a CEO who sells AI services and GPU instances. A single faulty update caused worldwide downtime recently. Now, imagine all infrastructure written with today’s LLMs - which sometimes hallucinate so badly that they claim the ‘C’ in CRC-32C stands for ‘Cool’.

    I wish we could also add a “Do not hallucinate” prompt to some CEOs.
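For the record, the ‘C’ in CRC-32C stands for Castagnoli, after the designer of its polynomial. The checksum itself is a few lines; a minimal bitwise sketch, assuming the standard reflected polynomial 0x82F63B78:

```python
def crc32c(data: bytes) -> int:
    """CRC-32C (Castagnoli), bit-at-a-time, reflected polynomial 0x82F63B78."""
    crc = 0xFFFFFFFF
    for byte in data:
        crc ^= byte
        for _ in range(8):
            # Test the low bit before shifting; XOR in the polynomial if set.
            crc = (crc >> 1) ^ (0x82F63B78 if crc & 1 else 0)
    return crc ^ 0xFFFFFFFF

print(hex(crc32c(b"123456789")))  # 0xe3069283, the standard check value
```

Real deployments use table-driven or hardware (SSE4.2) implementations, but the check value is the same.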

  • painfulasterisk1@lemmy.ml · ↑24 · 4 months ago

    It’s really funny how AI “will perform X job in the near future,” but you barely, if ever, see articles saying that AI will replace CEOs in the near future.

    • Dearth@lemmy.world · ↑5 · 4 months ago

      Somewhere there is a dev team secretly programming an AI to take over bureaucratic and managerial jobs while disguising it as a code-writing AI to their CTO and CEO.

    • rottingleaf@lemmy.world · ↑2 · 4 months ago

      C-suites are like Russian elites.

      The latter are some thieves who’ve inherited a state from the Soviet leadership. They have a layman’s idea of what a state and a country are, and what history itself is, plus something a taxi driver would say. For the last 20 years they have been trying to apply that weird idea to reality, as if playing Hearts of Iron, because they want to be great and to be in charge of everything that happens.

      The former have heard in school that there were industrial revolutions and such, and they too want to be great and believe in every such stupid hype about someone being replaced with new great technology, and of course they want to be in charge of that process.

      While in actuality, with today’s P2P technologies, CEOs are the most likely to be replaced, if we use our common sense - but without “AI”, of course. Just by decentralized systems allowing much bigger, more powerful, and more competitive cooperatives than before, ones that form and disband very easily.

  • yesman@lemmy.world · ↑25 ↓1 · 4 months ago

    I just want to remind everyone that capital won’t wait until AI is “as good” as humans, just when it’s minimally viable.

    They didn’t wait for self-checkout to be as good as a cashier; they didn’t wait for chat-bots to be as good as human support; and they won’t wait for AI to be as good as programmers.

      • SlopppyEngineer@lemmy.world · ↑9 ↓1 · edited · 4 months ago

        They’ll try the opposite. It’s what the movie producers did to the writers: they gave them AI-generated junk and told them to fix it. It was basically rewriting the whole thing, but because it was now “just touching up an existing script,” it was half price.

        • xtr0n@sh.itjust.works · ↑2 · 4 months ago

          They can try. But cleaning up a mess takes a while, and there’s no magic wand to make it go faster.

        • Eager Eagle@lemmy.world · ↑1 · 4 months ago

          Yeah they’ll try. Surely that can’t cascade into a snowball of issues. Good luck for them 😎

          • SlopppyEngineer@lemmy.world · ↑2 · 4 months ago

            A strike by tech workers would be something else. Curious what would happen if the ones maintaining the servers for entertainment, the stock market, or factories just walked out. On the other hand, tech doesn’t have unions.

      • peopleproblems@lemmy.world · ↑5 · 4 months ago

        You better fucking believe it.

        AIs are going to be the new outsource, only cheaper than outsourcing and probably less confusing for us to fix

    • SlopppyEngineer@lemmy.world · ↑6 · 4 months ago

      And because of all the theft and malfunctions, the nearby supermarkets replaced the self-checkouts with normal cashiers again.

      If it’s AI doing all the work, the responsibility falls on the remaining humans. There’ll be interesting lawsuits when the inevitable bug appears that the AI itself can’t figure out.

      • atrielienz@lemmy.world · ↑4 · 4 months ago

        We saw this happen with Amazon’s cashier-less stores. They were actively trying to use a computer-based AI system, but it didn’t work without thousands of man-hours from real humans, which is why those stores are going away. Companies will try this repeatedly until they get something that does work or they run out of money. The problem is, some companies have cash to burn.

        I doubt the vast majority of tech workers will be replaced by AI any time soon. But they’ll probably keep trying because they really really don’t want to pay human beings a liveable wage.

    • shalafi@lemmy.world · ↑1 · edited · 4 months ago

      Already happening. Cisco just smoked another 4,000 employees. And anecdotally, my tech job hunt is, for the first time, not going so hot.

  • SparrowRanjitScaur@lemmy.world · ↑23 ↓2 · edited · 4 months ago

    Extremely misleading title. He didn’t say programmers would be a thing of the past, he said they’ll be doing higher level design and not writing code.

    • Tyfud@lemmy.world · ↑25 · 4 months ago

      Even so, he’s wrong. This is the kind of stupid thing someone without any first hand experience programming would say.

      • rottingleaf@lemmy.world · ↑6 · 4 months ago

        Yeah, there are people who can “in general” imagine how this will happen, but programming is exactly 99% not about “in general” but about specific “dumb” conflicts in the objective reality.

        People think that what they generally imagine as the task is the most important part, and since they don’t actually do programming or anything that requires dealing with those small details, they just plainly ignore them, because those conversations and opinions exist in a subjective, bendable reality.

        But objective reality doesn’t bend. Their general ideas without every little bloody detail simply won’t work.

      • SparrowRanjitScaur@lemmy.world · ↑1 ↓10 · 4 months ago

        Not really, it’s doable with chatgpt right now for programs that have a relatively small scope. If you set very clear requirements and decompose the problem well it can generate fairly high quality solutions.

        • Tyfud@lemmy.world · ↑14 ↓2 · 4 months ago

          This is incorrect. And I’m in the industry - in this specific field. Nobody in my industry, in my field, at my level seriously considers this effective enough to replace their day-to-day coding beyond generating some boilerplate ELT/ETL-type scripts, which it is semi-effective at. It still contains multiple errors 9 times out of 10.

          I cannot be more clear: the people who are claiming that this is possible are not tenured or effective coders, much less 10x devs, in any capacity.

          People who think it generates quality enough code to be effective are hobbyists, people who dabble with coding, who understand some rudimentary coding patterns/practices, but are not career devs, or not serious career devs.

          If you don’t know what you’re doing, LLMs can get you close, some of the time. But there’s no way it generates anything close to quality enough code for me to use without the effort of rewriting, simplifying, and verifying.

          Why would I want to voluntarily spend my day trying to decipher someone else’s code? I don’t need ChatGPT to solve a coding problem. I can do it, and I will. My code will always be more readable to me than someone else’s. This is true by orders of magnitude for AI code-gen today.

          So I don’t consider anyone that considers LLM code gen to be a viable path forward, as being a serious person in the engineering field.

          • SparrowRanjitScaur@lemmy.world · ↑2 ↓4 · edited · 4 months ago

            It’s just a tool like any other. An experienced developer knows that you can’t apply every tool to every situation. Just like you should know the difference between threads and coroutines and know when to apply them. Or know which design pattern is relevant to a given situation. It’s a tool, and a useful one if you know how to use it.
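The threads-vs-coroutines distinction mentioned above can be sketched in Python - a minimal illustration, with invented stand-in workloads:

```python
import asyncio
import threading
import time

# Threads: preemptive, OS-scheduled; the right tool for blocking calls
# you don't control.
results: list[int] = []

def blocking_work(n: int) -> None:
    time.sleep(0.01)           # a blocking call ties up its whole thread
    results.append(n)

threads = [threading.Thread(target=blocking_work, args=(i,)) for i in range(3)]
for t in threads:
    t.start()
for t in threads:
    t.join()

# Coroutines: cooperative, single-threaded; suited to many concurrent I/O waits.
async def io_work(n: int) -> int:
    await asyncio.sleep(0.01)  # yields to the event loop instead of blocking
    return n

async def main() -> list[int]:
    return list(await asyncio.gather(*(io_work(i) for i in range(3))))

coro_results = asyncio.run(main())  # [0, 1, 2], in submission order
```

Knowing which of these fits a given situation is exactly the kind of judgment the tool analogy is about.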

            • rottingleaf@lemmy.world · ↑5 ↓1 · 4 months ago

              This is like applying a tambourine made of optical discs as a storage solution. A bit better cause punctured discs are no good.

              A full description of what a program does is the program itself, have you heard that? (except for UB, libraries, … , but an LLM is no better than a human in that too)

        • OmnislashIsACloudApp@lemmy.world · ↑7 ↓1 · 4 months ago

          Right now, not a chance. It’s okay-ish at simple scripts, and it’s alright as an assistant for getting a buggy draft of anything even vaguely complex.

          AI doing any actual programming is a long way off.

    • realharo@lemm.ee · ↑3 · edited · 4 months ago

      Sounds like he’s just repeating a common meme. I don’t see anything about higher-level design that would make it more difficult for an AI (a hypothetical future AI, not the stuff that’s available now) compared to lower-level tasks.