• .:\dGh/:.@lemmy.ml · 1 year ago

    Well, that’s a bummer, but it will be interesting to see how it stacks up in day-to-day usage.

    It’s not that folks on the base M3 are going to stress the machine with heavy computation, but the Pro and Max will surely have enough people comparing synthetic benchmarks against real-world ones to see which of Apple’s optimizations are and are not paying off.

  • TBi@lemmy.world · 1 year ago

    I really am interested to see it go up against the new Snapdragon X Elite chip. Hopefully some competition at last!

    • AwesomeSteve@lemmy.world · 1 year ago (edited)

      The only way to shame Apple is for laptop OEMs to ship their entry-level laptops and mini desktops with the Snapdragon X Elite, a 16GB RAM baseline, a 512GB SSD, $50 to move to 32GB, and upgrade pricing that doesn’t rob the customer.

      If they really want to retain and recapture market share from Dracula Tim Cook before bleeding out like Nokia did, this is the step to take before even more people get hooked on the Apple ecosystem; once people are in, the data and services they’ve signed up for make it hard to leave.

      I am curious whether a class-action suit could be brought against Apple over its memory and SSD pricing.

  • M500@lemmy.ml · 1 year ago

    I’m pretty close to getting a used M1 Air for $500.

    I can probably search a bit and get a slightly better deal.

    The price might be a bit high, but I’m not in the US and we have higher prices here.

    • w3dd1e@lemm.ee · 1 year ago

      I just got one for around $600 in the US on Swappa. I tried to get one cheaper but couldn’t find it where I lived. Anyway, I’m super happy with it. I made sure it was a low number of battery cycles and it’s in near mint condition.

      The other day, I was coding in VSStudio, debugging JavaScript in Chrome with multiple tabs open, and logging issues I found on a template in Excel. Excel alone makes my work computer freeze, and I didn’t notice a single slowdown on this thing. It was fantastic.

      I don’t love the way Mac handles open-window management but aside from that I’m very happy.

      • M500@lemmy.ml · 1 year ago

        Do you have 8GB of RAM in your machine?

        There is an electronics market where I live. I have a recent-ish Lenovo; it actually might be a year newer than the M1, so I’m going to try to swap it. Maybe I can go next week.

        • w3dd1e@lemm.ee · 1 year ago

          Yeah, just 8GB. I was actually worried about only having 8, but I couldn’t bring myself to spend the extra money on the 16GB model (I have a desktop to fall back on if I need it).

          So far so good. I haven’t noticed hitting a wall with the lower amount of RAM. I forgot to mention, I’m just coding websites; even with the JavaScript, I’m not building AAA games or doing a ton, really.

  • bbbbb@lemmy.world · 1 year ago

    This was a real bummer for anyone interested in running local LLMs. Memory bandwidth is the limiting factor for performance in inference, and the Mac unified memory architecture is one of the relatively cheaper ways to get a lot of memory rather than buying a specialist AI GPU for $5-10k. I was planning to upgrade the memory a bit further than normal on my next MBP upgrade in order to experiment with AI, but now I’m questioning whether the pro chip will be fast enough to be useful.
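    A rough sanity check on why bandwidth matters so much: during decoding, each generated token has to stream essentially all of the model’s weights from memory, so bandwidth divided by model size gives a hard ceiling on tokens per second. A back-of-envelope sketch (the model size and quantization here are my own illustrative assumptions, not benchmark data):

```python
# Back-of-envelope bound: each decoded token must read roughly all model
# weights from memory once, so bandwidth / model size caps tokens/sec.

def max_tokens_per_sec(bandwidth_gb_s: float, params_billions: float,
                       bytes_per_param: float) -> float:
    """Upper bound on decode speed for a memory-bandwidth-bound LLM."""
    model_bytes = params_billions * 1e9 * bytes_per_param
    return bandwidth_gb_s * 1e9 / model_bytes

# Hypothetical example: a 7B-parameter model quantized to ~4 bits/param.
for bw in (150, 300, 400):  # GB/s: M3 Pro, M3 Max 14-core, M3 Max 16-core
    print(f"{bw:3d} GB/s -> ceiling of ~{max_tokens_per_sec(bw, 7, 0.5):.0f} tokens/s")
```

    By this bound, real decode speed can only be slower, which is why a bandwidth cut stings for local inference even when compute goes up.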

  • mingistech@lemmy.world · 1 year ago

    The M3 Pro has 150GB/s of memory bandwidth vs 200GB/s for the M2 Pro. I think that can be explained by it using three 6GB/12GB RAM modules vs four on the M2.

    The M3 Max is listed as “up to” 400GB/s, where the M2 Max doesn’t have that qualifier. I think the 14-core is always using three wider 24GB/32GB modules for 300GB/s, and the 16-core is always using four for 400GB/s.
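    Those figures are consistent with peak bandwidth being bus width times transfer rate. A quick sanity check, assuming LPDDR5-6400 and with bus widths inferred back from the advertised numbers (the widths are my inference, not an official Apple spec):

```python
# Peak memory bandwidth = (bus width in bits / 8) bytes * transfer rate.
# LPDDR5-6400 runs at 6400 MT/s; the bus widths below are inferred from
# Apple's advertised bandwidth figures, not from an official spec sheet.

def peak_bw_gb_s(bus_bits: int, mega_transfers_per_s: int) -> float:
    return bus_bits / 8 * mega_transfers_per_s / 1000  # decimal GB/s

print(peak_bw_gb_s(256, 6400))  # M2 Pro:          204.8 -> sold as "200GB/s"
print(peak_bw_gb_s(192, 6400))  # M3 Pro:          153.6 -> sold as "150GB/s"
print(peak_bw_gb_s(384, 6400))  # M3 Max 14-core:  307.2 -> sold as "300GB/s"
print(peak_bw_gb_s(512, 6400))  # M3 Max 16-core:  409.6 -> sold as "400GB/s"
```

    So dropping from four modules to three is exactly a 25% cut, matching the 200 to 150GB/s change.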

  • psycho_driver@lemmy.world · 1 year ago

    No more Jim Keller architecture design. Same thing will probably happen to AMD when they need to move on from Zen. Bulldozer 2.0.

  • Nogami@lemmy.world · 1 year ago

    Doubt it will make a difference that anyone except benchmarkers will notice.

      • Viking_Hippie@lemmy.world · 1 year ago

        That’s my point: it costs more but has less memory bandwidth, which people here seem to consider a GOOD thing, or at least that’s what they seem to be trying to convince themselves and others of.

        • 4am@lemm.ee · 1 year ago

          It can be more complicated than “bigger number better”. I don’t think anyone’s trying to justify it; people are probably just speculating on why it is the way it is.

          Maybe Apple discovered that most software’s bottleneck isn’t RAM access for userland operations but cache misses, and they sacrificed some of the circuitry supporting memory-access speed for additional on-die cache? So while you get less RAM bus bandwidth, it doesn’t actually matter because you could never have used it anyway?

          I don’t know any real-world numbers for this; I’m spitballin’ here. But that’s an example of an optimization that could plausibly happen when you’re working on hardware design.
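          The locality idea can be illustrated with a toy experiment: summing the same array sequentially vs. in a shuffled order does identical arithmetic, but the shuffled order defeats prefetching and cache reuse. (Python timings are noisy and this is only suggestive, not a real cache benchmark:)

```python
import random
import time

# Toy illustration: identical work, different memory-access locality.
N = 1_000_000
data = list(range(N))

seq_order = list(range(N))        # sequential: prefetch/cache friendly
rand_order = seq_order[:]
random.shuffle(rand_order)        # same indices, hostile access pattern

def timed_sum(order):
    start = time.perf_counter()
    total = sum(data[i] for i in order)
    return total, time.perf_counter() - start

seq_total, seq_t = timed_sum(seq_order)
rand_total, rand_t = timed_sum(rand_order)

assert seq_total == rand_total    # exactly the same additions either way
print(f"sequential: {seq_t:.3f}s   shuffled: {rand_t:.3f}s")
```

          On typical hardware the shuffled pass tends to come out slower even though it performs the same additions, which is the kind of effect a bigger cache can hide.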

          People have been talking shit about Apple since the early 90s, but their stuff still works and they’re still selling it so, miss me with that “no no THIS time they’re playing us all for fools! No, seriously, guys! Guys? STOP HAVING FUN!” nonsense.

          I’ll believe it when the benchmarks come out.