What’s the best affordable pre-built mini server available in Europe? I’m looking for a reliable and compact option that won’t break the bank
Edit: something that is not ARM-based
Edit 2: I’m looking to set up a system that can run Jellyfin, Ollama, and a few other small services. By ‘pre-built’, I mean I want to buy a device that already has the necessary hardware assembled, so all I need to do is install the operating system and I’m good to go
Check out used tiny/mini/micro desktops on eBay. Loads of info here: https://www.servethehome.com/introducing-project-tinyminimicro-home-lab-revolution/
Only downside is going to be no GPU for the AI workload. Maybe some of the later AMD APUs could cut it. If not, all three major manufacturers have SFF variants that are pretty much the same hardware in a slightly bigger case. Those will accept smaller off-the-shelf GPUs.
A 2-bay NAS with a Ryzen 7 and up to 32 GB of RAM.
Thanks, I will take a look at that.
There is also a cheaper option with an N100
I’m not sure if they’re still affordable, but I ended up getting both a Morefine and a Beelink, one with the Intel N100 and the other with the N305. They handle everything I’ve thrown at them, and come with out-of-the-box QuickSync transcoding for Jellyfin/Plex. They handle 4K transcodes like a champ. Couple that with 2.5GbE and they sip power. Though they might have gone up in price since I bought mine.
Edit: something that is not ARM-based
You want pre-built to run ollama, that’s at least gonna cost you an arm, maybe even a leg.
I love my NUCs but haven’t really paid attention to what has happened since Intel sold that line to ASUS.
Thanks, I will take a look at the NUC.
What are you gonna use it for?
Something that is not ARM-based. I’m looking to set up a system that can run Jellyfin, Ollama, and a few other small services. By ‘pre-built’, I mean I want to buy a device that already has the necessary hardware assembled, so all I need to do is install the operating system and I’m good to go.
Go on eBay or your local second-hand market and search “mini PC” or “office computer”…
You can try the Minisforum MS-01. Relatively compact, inexpensive, with a lot of options for expandability as well as relatively powerful Intel CPUs with QuickSync for LLM and transcode. Here is a nice overview of the device.
I see people mentioning small office desktops, and they are good, but I will warn you that they use proprietary parts, so upgrading and repairing them can be difficult. Also, jellyfin.org has some good info under the hardware acceleration section on what to use.
I’ve had a good experience so far with two mini PCs: a Mele Quieter 4C for Kodi, and a Morefine M9 (I think this one is branded as Mipowcat in the EU). They’re both N100; the M9 can go up to 32 GB of RAM, although it is picky about which modules it will accept. I use the M9 for Jellyfin and about 10 other services. QuickSync works great as far as I’ve tested it. For Jellyfin I’m relying mostly on direct streaming, but I tried a few episodes with forced transcoding by using Firefox for playback and it worked fine.
How small? How many drives? I bought several used Lenovo P330 E2276G for my servers.
The Intel CPU has great low power GPU for video encoding/decoding for video streaming.
The Xeon’s ECC RAM gives long-term reliability, which is important if you leave your PC on 24/7 for years at a time.
I am not a big fan of buying used.
Used servers/workstations are likely more reliable than new consumer hardware.
They were very likely kept temperature-controlled, have ECC, and are actually known to be working, unlike something new from a consumer brand like ASUS. If I remember correctly, PC mortality is very high in the first 6 months, drops to near zero for about 5 years, then starts climbing again.
Replace the SSD/hard drive and you are good. You might not even have to do that. I checked the stats on the SSD that came with my used Lenovo workstation and it had something like 20 hours on it.
If you want to run Ollama and other ML stuff, you’re looking at buying an RTX 4090, my friend. “Affordable” and “ML” are two things you can’t put in one sentence.
While you certainly can run AI models that require such a beefy GPU, there are plenty of models that run fine even on a CPU-only system. So it really depends on what exactly Ollama is going to be used for.
I’d be satisfied if it can run a 7/8B model relatively fast.
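For scale, a rough back-of-envelope sketch of what a 7B model actually needs in RAM (the bit widths and overhead figure below are illustrative round numbers, not benchmarks of any specific model):

```python
def model_ram_gb(params_billions: float, bits_per_weight: float,
                 overhead_gb: float = 1.0) -> float:
    """Approximate resident memory in GB for CPU inference:
    weight storage plus a rough allowance for KV cache and buffers."""
    weight_bytes = params_billions * 1e9 * bits_per_weight / 8
    return weight_bytes / 1e9 + overhead_gb

# A 7B model quantized to 4 bits per weight: ~3.5 GB of weights
# plus overhead, so it fits in a 16 GB (or even 8 GB) mini PC.
print(f"{model_ram_gb(7, 4):.1f} GB")
```

That’s why CPU-only boxes can host 7/8B quantized models at all; speed is a separate question and depends heavily on memory bandwidth.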
Raspberry Pi
What are the reviews on the hardware offered by the umbrelOS guys?