I have seen all kinds of claims, but they all seem to be in the future, like 1,000,000 qubits or perfect error correction.
Do they have a real breakthrough, or is it just incremental advancement on a slow road?
Short version: they made a new kind of digitally controlled qubit, where the previous state of the art required a lot of analog sensors and controls. This means the equipment needed to hook up the qubits is simpler, and they’re hoping to build on this design to make scalable quantum computers that can support many more qubits than current techniques.
Not a huge breakthrough on its own outside of a very niche academic field, but a necessary step towards really expanding the capabilities of the tech.
I’m not deep enough into the topic, but for some mildly entertaining criticism of the whole thing: https://m.youtube.com/watch?v=NKYxdzSNqzE&t=343s
Sabine always rocks!
Again, wait and see. And even if it’s true, it seems it’s just one qubit.
An expert in the field said, in a YouTube comment on the announcement, that they are nowhere close. Looks like all BS hype to me.
Chip hype is always garbage IMO. Real hardware takes 10 years from napkin idea to first delivery of any product. There is no fast track here. The cutting-edge nodes are extremely expensive to design for, and you’re largely designing against a future node that doesn’t fully exist yet in order to stay relevant.
So what do they have? Where was the tech 10 years ago, and why is it relevant now? The only relevance I see is that the market is all over Nvidia and there are a lot of fools playing that stock, so a hyped chip is an easy scam to bait the fools at the moment.
Nothing in quantum is relevant at all anyway. The only thing it can do of any value is break encryption. It has no other real application outside of potential military communications, and IMO that is the only reason it gets funded. The funding for quantum compute is a tiny fraction of AI’s because AI solves most of the same potential problems.
Same feeling here, and quantum-resistant algorithms for asymmetric encryption are already designed and coming.
Without those, quantum computers wouldn’t only break military stuff but would also wreak havoc on the internet by breaking SSH (and Bitcoin, lol).
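For context on why encryption is the headline threat: RSA rests on factoring being hard, and Shor’s algorithm breaks it by quickly finding the multiplicative order of a number modulo n. Here’s a toy Python sketch of that reduction, with the order found by brute force (which is exponential classically; the quantum speedup of exactly this step is the entire danger):

```python
from math import gcd

def factor_via_order(n: int, a: int):
    """Classical toy version of the order-finding reduction behind Shor's
    algorithm: find r with a^r = 1 (mod n), then split n via gcds."""
    g = gcd(a, n)
    if g != 1:
        return g, n // g  # lucky guess: a already shares a factor with n
    # Brute-force the multiplicative order r (exponential time classically;
    # this is the step a quantum computer does in polynomial time).
    r, x = 1, a % n
    while x != 1:
        x = (x * a) % n
        r += 1
    if r % 2 == 1:
        return None  # odd order: retry with another a
    y = pow(a, r // 2, n)
    if y == n - 1:
        return None  # trivial square root: retry with another a
    return gcd(y - 1, n), gcd(y + 1, n)

print(factor_via_order(15, 7))  # (3, 5)
```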
Dr. Ben Miles is a physicist turned head of a venture capital firm that “invests in this kind of thing”, according to him. He still seems skeptical, but he does the best job I’ve seen of breaking down this whole thing:
Yeah but

- he’s not specialised in quantum computing
- he’s a venture capitalist; it’s his job to sell his stuff
- even if it does work, we just have to use the new quantum-resistant asymmetric algorithms (selected last summer, so they already exist) to turn this into e-waste (a minimal usage sketch follows this comment)
- they lied in 2018 about topological qubits (retracted in 2021), so this might just be a funding scheme…
- they have, supposedly, made one qubit. And the big thing is they have a 25-year roadmap to the big breakthrough… Give me money for 25 years and I’ll be retired before that happens…

Let’s see what happens, but I’m not lying awake thinking about topological qubits exactly.
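On the quantum-resistant point, here is a minimal key-encapsulation sketch. It assumes the open-source liboqs-python bindings (import name `oqs`); the method names and the algorithm identifier (“Kyber512” vs. the newer “ML-KEM-512”) vary between versions, so treat it as illustrative rather than authoritative:

```python
import oqs  # liboqs-python bindings; API details are version-dependent

kem_alg = "Kyber512"  # NIST-selected KEM, renamed ML-KEM-512 in newer releases
with oqs.KeyEncapsulation(kem_alg) as client, oqs.KeyEncapsulation(kem_alg) as server:
    public_key = client.generate_keypair()                       # client publishes a public key
    ciphertext, server_secret = server.encap_secret(public_key)  # server encapsulates a shared secret
    client_secret = client.decap_secret(ciphertext)              # client recovers the same secret
    assert client_secret == server_secret
```

No practical quantum speedup against lattice schemes like this is known, which is why they were picked to replace RSA/ECC key exchange.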
There isn’t enough information to know yet. There’s some science and a lot of self-promoting hype in the press release.
That was my feeling too :-)
I don’t know anything about quantum computing, but I recently heard a long talk by a quantum-computing expert who was trying to convince us to work on quantum error correction. His (probably optimistic) estimate is that, with a good amount of help on error correction, we might achieve 100 logical qubits in 5 to 10 years.
Completely unpredicted breakthroughs are rare in computer science; if Microsoft’s tech could actually solve quantum computing (as you described), it would have made a much, much bigger wave than this.
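For a sense of scale on “100 logical qubits”: the commonly quoted surface-code estimate is roughly 2*d^2 physical qubits per logical qubit at code distance d. The real overhead depends on layout and physical error rates, so take this back-of-envelope sketch as illustrative only:

```python
def surface_code_overhead(logical_qubits: int, distance: int) -> int:
    """Back-of-envelope estimate: one distance-d surface-code logical qubit
    uses about d*d data qubits plus roughly as many measurement qubits,
    i.e. ~2*d**2 physical qubits in total."""
    return logical_qubits * 2 * distance ** 2

# 100 logical qubits at an illustrative code distance of 17:
print(surface_code_overhead(100, 17))  # 57800 physical qubits
```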
Yeah, that was my gut feeling too. Turns out they haven’t made any breakthroughs, and it’s still far in the future, if it happens at all.
Thanks!
So nothing special, at the moment.
It looks like worthwhile research but no, there won’t be a QPU to go with your GPU anytime soon.
Part of the work in designing a chip isn’t just designing and attaching wires, but designing the fundamental instruction set and system architecture that go along with the chip.
Because once you have an architecture (think x86, ARM, PowerPC, RISC-V), developers can start to actually formulate instructions for the hardware. Whether it was designed for 8 qubits or 1 million down the line, the point is that there’s now a baseline against which one can start writing commands and trying to execute them.
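As a concrete sketch of that baseline idea (using Qiskit purely as one example of such an abstraction layer): the circuit below is written against an abstract gate set and says nothing about the physical hardware, so the same program could in principle target an 8-qubit or a million-qubit backend.

```python
# Requires qiskit. Gates here are abstract instructions; a transpiler
# later maps them onto whatever native gates and physical qubits the
# chosen backend actually provides.
from qiskit import QuantumCircuit

qc = QuantumCircuit(2, 2)
qc.h(0)                     # Hadamard on qubit 0
qc.cx(0, 1)                 # CNOT: entangle qubits 0 and 1 (a Bell state)
qc.measure([0, 1], [0, 1])  # read out both qubits into classical bits
print(qc.draw())
```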
We’ve already got emulators for that, though?