• Korkki@lemmy.world · +89/−2 · 1 year ago (edited)

    6 GHz

    Knowing Intel and their recent performance gains, it just needs a small nuclear reactor as a PSU and liquid-oxygen cooling for light desktop use.

    • Rai@lemmy.dbzer0.com · +7 · 1 year ago

      Hmmm, I have a 1 kW power supply and a D15, so maybe it’s time to upgrade from my still-amazing 9900K.

    • A_Random_Idiot@lemmy.world · +6 · 1 year ago

      Or they’re going back to the old tried-and-true method of trading IPC for higher clocks, like they did in the move from the Pentium 3 to the Pentium 4.

  • MeanEYE@lemmy.world · +53/−2 · 1 year ago

    Buy our new stuff. Promise we didn’t fake the numbers this time and/or introduce new security vulnerabilities.

  • Kumabear@lemmy.world · +14/−1 · 1 year ago

    Remember when CPUs/GPUs had clock speeds that didn’t include “up to”?

    I understand why this gives better net performance and makes better use of the chip’s full potential.

    That still doesn’t make me like it, not for technical reasons but because of how they twist the marketing: the chip could hit 6 GHz for one microsecond and they could still claim #nowupto6ghz!

    • Psythik@lemm.ee · +24/−2 · 1 year ago

      I remember when chips first hit 1 GHz around 1999. Tech magazines were claiming we’d hit 7 GHz within five years.

      What they failed to predict is that you run into major heat issues past roughly 3 GHz, which is why CPU manufacturers started focusing on other ways to improve performance, such as multiple cores and better memory management.
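The heat wall described above falls out of the standard dynamic-power relation for CMOS logic, P ≈ C·V²·f: pushing the clock higher usually requires raising the supply voltage too, so power (and heat) grows much faster than frequency. A minimal sketch of that arithmetic, where the 20% voltage bump is an illustrative assumption, not a measured figure:

```python
# Dynamic power in CMOS scales roughly as P ≈ C · V² · f.
# Illustrative only; real chips also have leakage power and thermal limits.

def power_scaling(freq_ratio: float, voltage_ratio: float) -> float:
    """Relative dynamic power when frequency and voltage scale by the given ratios."""
    return freq_ratio * voltage_ratio ** 2

# Doubling the clock while bumping voltage ~20% roughly triples power draw:
print(power_scaling(2.0, 1.2))  # ≈ 2.88× the original power
```

That superlinear growth is why the same transistor budget spent on a second core beats spending it on more gigahertz.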

    • QuarterSwede@lemmy.world · +15 · 1 year ago

      And it has. The phone in your pocket is faster than that 3 GHz chip from back then, by something like 20 times, and it runs on a battery.

      • Stabbitha@lemmy.world · +10 · 1 year ago

        My dad had one of the first consumer 3GHz chips available. By the time I inherited it in 2009 it was completely outclassed by a <2GHz dual-core laptop.

    • ashok36@lemmy.world · +13/−1 · 1 year ago

      That would’ve been a single 3 GHz CPU core. Now we have dozens of cores in one chip. The instruction sets and microcode have also gotten way better since then.

    • PixxlMan@lemmy.world · +12 · 1 year ago

      Clock speed isn’t improving that quickly anymore. Other aspects have been improving faster instead: power efficiency, memory speeds, cache sizes, instructions that take fewer cycles, and more cores.

    • CheeseNoodle@lemmy.world · +5/−1 · 1 year ago

      We’re running into hard physical limits now: the transistors in each chip are already so small that shrinking them further would run into quantum effects that make them unreliable.

  • SuperSpruce@lemmy.ml · +2 · 1 year ago

    I thought Intel was dropping the i9 branding and skipping desktop chips for 14th gen…