I mean, we do the same thing, for the same reasons, with our government and defense procurement orders these days. This isn’t that weird. It’s only weird in that they’re clearly cutting themselves off from the best high-volume x86 CPU manufacturers that currently exist, but aside from that, the geopolitical and strategic calculus adds up.
Hey China, I made you this sweet horse statue in the form of an x86 processor. You should put it in the town square to show it off and then all go to sleep…
x86 is dying, legacy processing. It’s all GPUs and ARM processing now. Apple is leaning hard into it to set itself up as a leader in AI in the future.
You very obviously don’t understand the truly enormous power of technical inertia.
Except a lot of infrastructure runs on legacy software. There’s stuff built on Windows 2000 that is still used by hospitals and governments.
There’s a lot of critical infrastructure running on Windows 3.1. A lot of very expensive machinery runs on proprietary software only released as x86 binaries, from autoclaves to MRI machines.
Oh, and here’s the fun part: basically the only appeal Windows has is its legacy software support. ‘My games just work.’ ‘My software just runs.’ That wasn’t the case with the ARM editions of Windows: you couldn’t just run a .exe. So they either have to do emulation (which WINE under Linux often does better) or lock you into their app store, which is Apple’s model but 1,000 times shittier.
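To make the ‘you couldn’t just run a .exe’ point concrete: every Windows executable records its target CPU in its PE header, and the loader checks that field before it will even try to execute anything. Here’s a minimal C sketch of my own (just an illustration, not any official Microsoft tooling) that reads that field:

```c
/* pe_machine.c - minimal sketch: print which CPU a Windows .exe targets.
 * The PE/COFF "Machine" field is why an ARM chip can't just execute an
 * x86 .exe: the binary itself is tied to one instruction set.
 * Build: cc -o pe_machine pe_machine.c   Usage: ./pe_machine app.exe
 */
#include <stdio.h>

/* Read a little-endian integer of n bytes at the current file position
 * (PE fields are always little-endian, regardless of host). */
static unsigned long read_le(FILE *f, int n)
{
    unsigned long v = 0;
    for (int i = 0; i < n; i++)
        v |= (unsigned long)fgetc(f) << (8 * i);
    return v;
}

int main(int argc, char **argv)
{
    if (argc != 2) {
        fprintf(stderr, "usage: %s file.exe\n", argv[0]);
        return 1;
    }
    FILE *f = fopen(argv[1], "rb");
    if (!f) { perror(argv[1]); return 1; }

    /* Offset 0x3C of the DOS stub holds the offset of the "PE\0\0"
     * signature; the 2-byte Machine field follows that 4-byte signature. */
    fseek(f, 0x3C, SEEK_SET);
    long pe_off = (long)read_le(f, 4);
    fseek(f, pe_off + 4, SEEK_SET);
    unsigned machine = (unsigned)read_le(f, 2);
    fclose(f);

    switch (machine) {
    case 0x014c: puts("x86 (32-bit)");   break;
    case 0x8664: puts("x86-64");         break;
    case 0x01c4: puts("ARM (Thumb-2)");  break;
    case 0xAA64: puts("ARM64");          break;
    default:     printf("other machine type: 0x%04x\n", machine);
    }
    return 0;
}
```

Point it at any .exe and you can see the binary itself is pinned to one instruction set, which is exactly why ARM Windows needs an emulation/translation layer before x86 software does anything at all.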
You’re not wrong, but most of this legacy software runs on legacy hardware as well. Win 2k isn’t supported by most modern hardware.
Gaming though. The gaming situation on non-x86 CPUs is passable at best. AFAIK you can’t put a 4070 Ti in any non-x86 system right now and have it work. Are there even any commercially available non-x86 systems with PCIe x16 slots?
The death of x86 is inevitable. I just hope we can still play computer games on cheaper homebuilt systems afterwards, because having to replace your entire system just to upgrade an integrated, non-upgradable GPU is no longer better or cheaper than a console. I absolutely fucking doubt indie developers, let alone anyone else, are going to downgrade graphics to let their games run on cheaper systems when this happens and everything becomes 10x more expensive.
Try an AMD card; your chances are much better because of the open drivers. People have definitely gotten dedicated GPUs running on ARM boards via the few PCIe lanes (not even a handful) meant for M.2 storage.
I wouldn’t be too sure about ARM, because Qualcomm is definitely eyeing alternatives, and other licensees might not exactly mind no longer being reliant on litigious bastards. That alternative is RISC-V. Most ARM licensees are making chips for products where apps don’t really care about the architecture, that is, Android.
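For anyone wondering why the apps don’t care: Android apps mostly ship as architecture-neutral bytecode for ART, and the only place the CPU architecture leaks into ordinary application code is native compilation (the NDK). A tiny illustrative C sketch, using the standard predefined compiler macros (GCC/Clang/MSVC naming):

```c
/* arch_probe.c - illustration only, not from the thread: the only place
 * most application code "sees" the CPU architecture is at native compile
 * time, via predefined compiler macros like these. Bytecode platforms
 * (Android's ART, the JVM, .NET) never get this far, which is why moving
 * a phone SoC from ARM to RISC-V barely touches the apps themselves.
 */
#include <stdio.h>

int main(void)
{
#if defined(__x86_64__) || defined(_M_X64)
    puts("built for x86-64");
#elif defined(__aarch64__) || defined(_M_ARM64)
    puts("built for 64-bit ARM");
#elif defined(__riscv)
    puts("built for RISC-V");
#else
    puts("built for some other architecture");
#endif
    return 0;
}
```

Everything above the native layer never sees these macros at all, so switching architectures is mostly a recompile of the runtime and the NDK libraries, not of the apps.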
To actually make a dent in the completely entrenched x86 market we’d probably need chips with dual instruction decoders. I certainly wouldn’t put that past AMD; they don’t like being joined at the hip to Intel.
You’re getting downvoted, but in all honesty, you’re not wrong. All it takes is one x86-64 alternative to show the world that Intel and AMD aren’t the only players in the game. Apple did it with ARM and the M1 chip, and now we’re hearing reports of Microsoft actually putting real effort into ARM and making their own chips for AI, instead of that half-assed Windows on ARM initiative. I for one love this competition, because it only benefits consumers.
If x86 is going to die, Apple has to be defeated at all costs, or else computers are going to become 10x more expensive once they establish a monopoly. I hope someone starts making real progress on ARM systems. If they do away with expansion slots and put the GPU, RAM, and CPU all on one chip even on the competing non-x86, non-M1 systems, then everything’s fucked though.
ARM computers are positively repugnant. This abomination of an architecture MUST be exterminated.
They’re not great yet, but they’re pretty cheap and really small. They’ll probably get a lot better in the future though; remember, the speed of x86 CPUs was once measured in MHz. I remember my first P4 with one whole GHz of speed, before dual-core CPUs even existed.