Not sure why this doesn’t exist. I don’t need 12TB of storage; when I had a Google account I never even crossed 15GB. 1TB should be plenty for myself and my family. I want to use NVMe since it’s quieter and smaller, and 2230 drives would be ideal. But I want one boot drive and two storage drives in RAID. I guess I could potentially just have two NVMe drives and put the boot partition in RAID as well? Bonus points if I can use it as a wireless router too.
Maybe, but they also presumably consume much more power?
If you pick one of those machines with a “T” CPU you won’t even notice them. They’ll downscale on idle to probably around the same power the N100 would. The real difference is that they’ll use more power if you demand more resources, but even at that point do you really care about a few watts?
Before anyone loses their minds, imagine you get the i3-8300T model that will peak at 25W; that’s about $0.375 a month to run the thing, assuming a constant 100% load that you’ll never have.
Even the most cheap-ass cloud service out there will be more expensive than running that unit at 100% load. People like to freak out about power consumption, yet it’s certainly not their small mini PC that ruins their power bill.
> Before anyone loses their minds, imagine you get the i3-8300T model that will peak at 25W; that’s about $0.375 a month to run the thing, assuming a constant 100% load that you’ll never have.
Not sure how you came to that conclusion, but even in places with very cheap electricity it does not come close to your claimed $0.375 per month. At 25 W running 24/7 you would consume about 18 kWh per month (25 W × 720 h). Assuming $0.10/kWh you’d pay $1.80/month. In Europe you can easily pay $0.30/kWh, so you would already pay more than $5 per month, or over $60 per year.
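If anyone wants to sanity-check these numbers, here’s a minimal sketch; the two rates are illustrative assumptions, not anyone’s actual tariff:

```python
# Monthly electricity cost at a constant draw; rates are assumptions.
def monthly_cost(watts: float, price_per_kwh: float) -> float:
    kwh_per_month = watts / 1000 * 24 * 30   # 25 W -> 18 kWh/month
    return kwh_per_month * price_per_kwh

print(monthly_cost(25, 0.10))  # 1.80  (cheap electricity, $0.10/kWh)
print(monthly_cost(25, 0.30))  # 5.40  (European-ish rate, $0.30/kWh)
```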
Just used cpubenchmark.net.
Well, what they are stating is obviously wrong then. No need to use some website for that anyway, since it is so easy to calculate yourself.
Okay, you’re onto something: they’re assuming 8 hours/day at $0.25 per kWh and 25% CPU load by default, which is how they land on $0.375 (25 W × 25% × 8 h × 30 days ≈ 1.5 kWh). Still, if we tweak that into reasonable numbers, or even take your $60/year, it will still be cheaper than a cloud service… either way those machines won’t run at 25W on idle, more like 7W.
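For what it’s worth, here’s a quick sketch reproducing that figure under those stated defaults (8 h/day, 25% average load, $0.25/kWh; I’m taking the site’s assumptions at face value):

```python
# Reproducing the quoted $0.375/month for a 25 W part under the
# assumed defaults: 8 h/day, 25% average load, $0.25/kWh.
peak_watts = 25
avg_load = 0.25
hours_per_day = 8
price_per_kwh = 0.25

kwh_per_month = peak_watts * avg_load * hours_per_day * 30 / 1000  # 1.5 kWh
print(kwh_per_month * price_per_kwh)  # 0.375
```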
Sure, cloud services can get quite expensive, and I agree that used hardware - if it is at least somewhat modern - is a viable option for self-hosting.
I just wanted to make sure the actual cost is understood. I find it rather helpful to calculate this for my systems in use. Sometimes it can actually make sense to replace some old hardware with newer stuff, simply because of the electricity cost savings of the newer hardware.
> Sometimes it can actually make sense to replace some old hardware with newer stuff, simply because of the electricity cost savings of the newer hardware.
Yes, and we usually see that with very old server-grade hardware vs new-ish consumer hardware. Once the price difference is around $100 or so, we’re talking about years before break-even, and it may not make much sense.
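A rough break-even sketch; the price gap, idle draws, and tariff here are all assumptions for illustration:

```python
# Years to break even on a newer, more efficient box.
# All inputs are assumed values, not measurements.
price_gap = 100.0      # extra cost of the newer machine, $
old_idle_w = 60.0      # assumed idle draw of old server hardware
new_idle_w = 7.0       # assumed idle draw of a modern mini PC
price_per_kwh = 0.10   # assumed cheap tariff

kwh_saved_per_year = (old_idle_w - new_idle_w) / 1000 * 24 * 365
savings_per_year = kwh_saved_per_year * price_per_kwh   # ~$46/year
print(price_gap / savings_per_year)                     # ~2.2 years
```

At a European rate the payback is obviously much faster, which is the point: the answer depends entirely on your tariff and the idle-draw gap.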
Power doesn’t cost the same everywhere
Yes, I’m in Europe with this Ukrainian/Russian mess; whatever you’re paying, I can assure you I’m paying more than most people reading this, and you don’t see me freaking out about a mini PC. Even if you multiply everything above by 4 (and that will certainly exceed whatever someone is paying right now), you’ll still be talking about very little money compared to everything else running in your house.
While I agree 25W is not much, I pay around €1 per watt per year (Croatia): a constant 1 W draw is about 8.76 kWh over a year, so that’s roughly €0.11/kWh, and I know there are countries that pay way more than that. Still, we are talking about power consumption close to an SBC’s; you can’t go much lower. I think the number of devices (drives, etc.) matters more than actual CPU idle power.
> I think the number of devices (drives, etc.) matters more than actual CPU idle power.
Yes, or even the BIOS setup: some of those machines let you disable CPU cores and unused hardware.
No one is “freaking out” except you.