Companies are going all-in on artificial intelligence right now, investing millions or even billions into the field while slapping the AI initialism on their products, even when doing so seems strange and pointless.

Heavy investment and increasingly powerful hardware tend to mean more expensive products. To find out whether people would pay extra for hardware with AI capabilities, TechPowerUp asked its forum users exactly that.

The results show that over 22,000 people, a massive 84% of the overall vote, said no, they would not pay more. More than 2,200 participants said they didn’t know, while just under 2,000 voters said yes.
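
Taking the article's rounded counts at face value, the percentages check out. A quick sanity check (exact vote totals are not published, so these figures are approximations):

```python
# Approximate vote counts from the article (exact totals are not published).
votes = {"no": 22_000, "don't know": 2_200, "yes": 2_000}
total = sum(votes.values())

for option, count in votes.items():
    print(f"{option}: {count / total:.0%} of {total} votes")
# "no" comes out to roughly 84%, matching the reported figure.
```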

  • Godort@lemm.ee · 37 points · 5 months ago (edited)

    This is one of those weird things that venture capital does sometimes.

VC is injecting cash into tech at obscene levels right now because investors think AI is going to be hugely profitable in the near future.

    The tech industry is happily taking that money and using it to develop what they can, but it turns out the majority of the public don’t really want the tool if it means they have to pay extra for it. Especially in its current state, where the information it spits out is far from reliable.

    • cheese_greater@lemmy.world · 26 points · 5 months ago

I don’t want it outside of heavily sandboxed and limited-scope applications. I don’t get why people want an agent of chaos fucking with all the files and systems they’ve cobbled together.

    • Tenthrow@lemmy.world · 16 points · 5 months ago

      I have to endure a meeting at my company next week to come up with ideas on how we can wedge AI into our products because the dumbass venture capitalist firm that owns our company wants it. I have been opting not to turn on video because I don’t think I can control the cringe responses on my face.

    • TipRing@lemmy.world · 5 points (1 down) · 5 months ago

Back in the ’90s in college I took a technology course, which discussed how technology has historically developed, why some things are adopted, and why other seemingly good ideas don’t make it.

      One of the things that is required for a technology to succeed is public acceptance. That is why AI is doomed.

  • exanime@lemmy.world · 27 points · 5 months ago

AI for IT companies is looking more and more like what 3D was for the movie industry.

All fanfare and overhype: a small handful of examples that do seem a solid step forward, and millions of others that are just a polished turd. Massive investment in something the market has not demanded.

  • TheEntity@lemmy.world · 27 points · 5 months ago

    And what do the companies take away from this? “Cool, we just won’t leave you any other options.”

  • daniskarma@lemmy.dbzer0.com · 25 points (1 down) · 5 months ago

I would pay for a power-efficient AI expansion card, so I could self-host AI services easily without needing a 3000€ GPU that consumes 10 times more power than the rest of my PC.
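
To put the commenter's efficiency complaint in rough numbers (the wattages and electricity price below are illustrative assumptions, not measurements):

```python
# Rough annual energy cost of always-on self-hosted inference, comparing a
# high-end discrete GPU with a hypothetical low-power expansion card.
# All figures are illustrative assumptions, not measurements.
PRICE_PER_KWH_EUR = 0.30
HOURS_PER_YEAR = 24 * 365

def annual_cost(avg_watts: float) -> float:
    """Annual electricity cost in EUR for a device drawing avg_watts 24/7."""
    kwh = avg_watts * HOURS_PER_YEAR / 1000
    return kwh * PRICE_PER_KWH_EUR

gpu_cost = annual_cost(350)   # big discrete GPU under sustained load
card_cost = annual_cost(35)   # a card drawing a tenth of the power
print(f"GPU: {gpu_cost:.0f} EUR/year, card: {card_cost:.0f} EUR/year")
# GPU: 920 EUR/year, card: 92 EUR/year
```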

    • eleitl@lemm.ee · 1 point · 5 months ago

I will be looking into AMD Strix Halo’s performance as a poor man’s GPU to run LLMs and some scientific codes locally.

    • AA5B@lemmy.world · 1 point · 5 months ago

I would consider it a reason to upgrade my phone a year earlier than otherwise. I don’t know which AI features will stick as useful, but most likely I’ll use them from my phone, and I want there to be at least a chance of on-device AI rather than “all your data are belong to us” AI.

  • t00l@lemmy.world · 23 points (1 down) · 5 months ago

They want you to buy the hardware and pay the additional energy costs so they can deliver Clippy 2.0, the watching-you-wank edition.

    • Petter1@lemm.ee · 2 points · 5 months ago

Well, NPUs are not on par with modern GPUs. A general-purpose GPU has more raw power than most NPUs, but when you look at electricity costs, NPUs are far more efficient at AI tasks (which are not only chatbots).
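
The NPU point is really about energy per inference rather than raw throughput. A toy comparison, with made-up but order-of-magnitude-plausible figures:

```python
# Why perf-per-watt, not raw perf, decides the GPU-vs-NPU comparison.
# Throughput and power figures below are invented for illustration.
def joules_per_inference(inferences_per_sec: float, watts: float) -> float:
    """Energy spent per inference: power draw divided by throughput."""
    return watts / inferences_per_sec

gpu = joules_per_inference(inferences_per_sec=200, watts=300)  # fast but hungry
npu = joules_per_inference(inferences_per_sec=50, watts=5)     # slower, frugal

print(f"GPU: {gpu:.2f} J/inference, NPU: {npu:.2f} J/inference")
# The GPU finishes each request sooner, yet spends 15x the energy on it.
```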

  • bluewing@lemm.ee · 18 points · 5 months ago

Remember when the IoT was very new? There were similar grumblings of “Why would I want to talk to my refrigerator?” And now more and more things are just IoT-connected for no reason.

I suspect AI will follow a similar path into the consumer mainstream.

      • ChapulinColorado@lemmy.world · 4 points · 5 months ago

        I feel like the local only devices do have a place in the home automation sector (e.g. home assistant compatible with no cloud integrations).

Most vendors, however, want to lock you into their crappy cloud system that will someday go offline and render things useless.

  • EliteDragonX@lemmy.world · 20 points (2 down) · 5 months ago

This is yet another dent in the “exponential growth AGI by 2028” argument I see popping up a lot. Despite what the likes of Kurzweil, Musk, etc. would have you believe, AI is severely overhyped and will take decades to fully materialise.

    You have to understand that most of what you read about is mainly if not all hype. AI, self driving cars, LLM’s, job automation, robots, etc are buzzwords that the media loves to talk about to generate clicks. But the reality is that all of this stuff is extremely hyped up, with not much substance behind it.

    It’s no wonder that the vast majority of people hate AI. You only have to look at self driving cars being unable to handle fog and rain after decades of research, or dumb LLM’s (still dumb after all this time) to see why. The only real things that have progressed quickly since the 80s are cell phones, computers, etc. Electric cars, self driving cars, stem cells, AI, etc etc have all not progressed nearly as rapidly. And even the electronics stuff is slowing down soon due to the end of Moore’s Law.

    • cestvrai@lemm.ee · 15 points (1 down) · 5 months ago (edited)

      There is more to AI than self driving cars and LLMs.

      For example, I work at a company that trained a deep learning model to count potatoes in a field. The computer can count so much faster than we can, it’s incredible. There are many useful, but not so glamorous, applications for this sort of technology.

      I think it’s more that we will slowly piece together bits of useful AI while the hyped areas that can’t deliver will die out.
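
The counting step the commenter describes can be reduced to labeling connected blobs in a detector's binary output mask. A real system would use a trained model to produce that mask; the counting itself is simple. A stdlib-only flood-fill sketch:

```python
# Count distinct objects in a binary mask by flood-filling connected blobs.
# In a production pipeline the mask would come from a trained detector;
# here it is a hand-made toy "field" for illustration.
from collections import deque

def count_blobs(grid: list[list[int]]) -> int:
    rows, cols = len(grid), len(grid[0])
    seen = [[False] * cols for _ in range(rows)]
    count = 0
    for r in range(rows):
        for c in range(cols):
            if grid[r][c] and not seen[r][c]:
                count += 1                      # new blob found
                queue = deque([(r, c)])
                seen[r][c] = True
                while queue:                    # flood-fill its pixels
                    y, x = queue.popleft()
                    for dy, dx in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                        ny, nx = y + dy, x + dx
                        if 0 <= ny < rows and 0 <= nx < cols \
                                and grid[ny][nx] and not seen[ny][nx]:
                            seen[ny][nx] = True
                            queue.append((ny, nx))
    return count

field = [
    [1, 1, 0, 0, 1],
    [1, 0, 0, 0, 1],
    [0, 0, 1, 0, 0],
]
print(count_blobs(field))  # three separate blobs -> 3
```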

      • jj4211@lemmy.world · 11 points · 5 months ago

Machine vision is absolutely the most slam-dunk case of “AI” that works and has practical applications. However, it was doing so a few years before the current craze. Basically, the current craze was driven by ChatGPT, with people overestimating how far it will go in the short term because it almost acts like a human conversation, and that seemed so powerful.

        • AA5B@lemmy.world · 2 points (2 down) · 5 months ago

That’s why I love AI: I know it’s been a huge part of phone camera improvements in the last few years.

          I seem to get more use out of voice assistants because I know how to speak their language, but if language processing noticeably improves, that will be huge

          Motion detection and person detection have been a revolution in cheap home cameras by very reliably flagging video of interest, but there’s always room for improvement. More importantly I want to be able to do that processing real time, on a device that doesn’t consume much power
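
The motion flagging in cheap cameras long predates deep learning; classic frame differencing is enough for it (person detection is what needs a trained model). A minimal sketch on synthetic grayscale frames:

```python
# Motion detection via simple frame differencing: flag motion when enough
# pixels change between two grayscale frames. This is the cheap classical
# technique; person *detection* layered on top requires a trained model.
def motion_detected(prev, curr, pixel_thresh=25, area_thresh=4):
    changed = sum(
        1
        for row_a, row_b in zip(prev, curr)
        for a, b in zip(row_a, row_b)
        if abs(a - b) > pixel_thresh
    )
    return changed >= area_thresh

frame1 = [[10] * 8 for _ in range(8)]       # static dark scene
frame2 = [row[:] for row in frame1]
for y in range(2, 5):                       # a small bright object enters
    for x in range(2, 5):
        frame2[y][x] = 200
print(motion_detected(frame1, frame2))      # 9 changed pixels -> True
```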

          • technocrit@lemmy.dbzer0.com · 5 points · 5 months ago

            None of what you’re describing is anything close to “intelligence”. And it’s all existed before this nonsense hype cycle.

      • EliteDragonX@lemmy.world · 4 points · 5 months ago

        That’s nice and all, but that’s nowhere close to a real intelligence. That’s just an algorithm that has “learned” what a potato is.

      • technocrit@lemmy.dbzer0.com · 3 points · 5 months ago

        So… A machine is “intelligent” because it can count potatoes? This sort of nonsense is a huge part of the problem.

    • captainlezbian@lemmy.world · 1 point · 5 months ago

      Idk robots are absolutely here and used. They’re just more Honda than Jetsons. I work in manufacturing and even in a shithole plant there are dozens of robots at minimum unless everything is skilled labor.

      • nadram@lemmy.world · 4 points · 5 months ago

I might be wrong, but those don’t make use of AI, do they? It’s just programming for some repetitive tasks.

        • captainlezbian@lemmy.world · 3 points · 5 months ago

The nicer ones do use machine learning these days, but I misinterpreted you: I read you as saying robots were an example of hype like AI is, not that putting AI in robots is hype. The ML in robots is stuff like computer vision to sort defects, detect expected variations, and other similar tasks. It’s definitely far more advanced than back in the day, but it’s still not what people think.

  • cmrn@lemmy.world · 16 points · 5 months ago

I still don’t understand how the AI buzzword 10x’d all these valuations, when it’s always either a) exactly what they were doing before, now with a fancy new name, or b) AI deliberately shoehorned in, in ways with no practical benefit.

    • dinckel@lemmy.world · 1 point · 5 months ago

Isn’t that the entire point of what most business people do? The whole goal is to upsell some schmuck by speaking too fast and mentioning a lot of words that don’t really mean anything. The difference now is that the business person in this case is the leadership of most of the tech industry.

  • OCATMBBL@lemmy.world · 14 points · 5 months ago

Why would I pay more for some company to have a robot half-ass the work of all the employees they’re gonna cut?

    • Wogi@lemmy.world · 6 points · 5 months ago

      So the trades have been unknowingly fucking with AI for decades, because of the time honored tradition of fucking with apprentices.

      A lot of forums are filled with absolutely unhinged advice, and sprinkled in there is some good advice. If you know what you’re doing, you can spot the bullshit.

      But if you don’t know anything about it, the advice seems perfectly reasonable. There’s a skill in giving unhinged advice. Literally you can’t get your master cert without convincing at least one apprentice to ask where the board stretcher is.

      Do I actually have a dedicated vise for Vaseline when I run a tap cycle or is that old timer bullshit? HOW WOULD YOU POSSIBLY KNOW??

    • catloaf@lemm.ee · 1 point · 5 months ago

      I would pay less, and then either use it for dumb stuff or just not use it at all.

  • MrAlternateTape@lemm.ee · 11 points · 5 months ago

I have no clue why anybody thought I would pay more for hardware tied to some stupid trend that will blow up in our faces sooner or later.

I don’t get the AI hype. I see a lot of companies very excited, but I don’t believe it can deliver even 30% of what people seem to think.

    So no, definitely not paying extra. If I can, I will buy stuff without AI bullshit. And if I cannot, I will simply not upgrade for a couple of years since my current hardware is fine.

In a couple of years either the bubble will have burst, or they really will have put in the work to make AI do the things they claim it will.

  • Sam_Bass@lemmy.world · 10 points · 5 months ago

It’s bad enough they shove it at you on some websites. I’m really not interested in being their lab rat.

  • snek_boi@lemmy.ml · 10 points · 5 months ago (edited)

I agree that we shouldn’t jump immediately to AI-enhancing everything. However, this survey is riddled with problems, from selection bias to threats to external validity. Heck, even internal validity is a problem here! How does the survey account for social desirability bias, the sunk cost fallacy, or anchoring bias? I’m sorry if this sounds brutal or unfair, but I just hope to see fewer validity threats. I think I’d be less frustrated if the title were something like “TechPowerUp survey shows 84% of 22,000 respondents don’t want AI-enhanced hardware”.
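
The commenter's point can be made concrete: the naive margin of error for this poll looks impressively tight, which is precisely why it misleads. The formula below assumes a simple random sample, which a self-selected forum poll is not:

```python
# Naive 95% margin of error for the poll's headline figure, using the
# simple-random-sample formula. The ~26,200 total is an approximation
# built from the article's rounded counts.
import math

p, n = 0.84, 26_200
moe = 1.96 * math.sqrt(p * (1 - p) / n)
print(f"+/- {moe:.2%}")  # +/- 0.44%
# The tiny interval is exactly the trap: the formula assumes random
# sampling, so with a self-selected sample, precision is not validity.
```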