I wonder if my system is good or bad. My server needs 0.1kWh.
Mine runs at about 120 watts.
Between 50W (idle) and 140W (max load). Most of the time it is about 60W.
So about 1.5kWh per day, or 45kWh per month. I pay 0.22€ per kWh (France, 100% renewable energy), so about 9-10€ per month.
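If anyone wants to sanity-check their own setup, the arithmetic is trivial; a quick Python sketch with my numbers (swap in your own draw and tariff):

```python
# Back-of-the-envelope: energy and cost from an average power draw.
avg_watts = 60          # average draw in W (my server; use your own)
price_per_kwh = 0.22    # €/kWh (my French tariff)

kwh_per_day = avg_watts * 24 / 1000    # 60W x 24h = 1.44 kWh
kwh_per_month = kwh_per_day * 30       # ~43 kWh
cost_per_month = kwh_per_month * price_per_kwh

print(f"{kwh_per_day:.2f} kWh/day, {kwh_per_month:.0f} kWh/month, "
      f"~{cost_per_month:.2f} €/month")
# -> 1.44 kWh/day, 43 kWh/month, ~9.50 €/month
```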
Are you including nuclear power in renewable or is that a particular provider who claims net 100% renewable?
Net 100% renewable, no nuclear. I can even choose where it comes from (in my case, a wind farm in northwest France). Of course, not all of my electricity comes from there at all times, but I have the guarantee that renewable energy certificates equivalent to my consumption will be bought from there, so it's basically the same.
Thanks. I buy from Vattenfall but generate a net two-thirds of my own power via rooftop solar.
My server uses about 6-7 kWh a day, but it's a dual-CPU Xeon running quite a few Docker containers. Probably the thing that keeps it busiest is being a file server for our family and a Plex server for my extended family (so a lot of the CPU usage is likely transcodes).
Is there a (Linux) command I can run to check my power consumption?
Get a Kill-a-Watt meter.
Or smart sockets. I've got multiple (ZigBee ones); they're precise enough for most uses.
If you have a laptop or something else that runs off a battery:
upower
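If you'd rather read it directly, most Linux laptops also expose the discharge rate through sysfs. A rough sketch (assumes a battery at BAT0 that exposes power_now, which not every model does, and the reading is only meaningful while on battery):

```python
# Read the battery discharge rate from sysfs (reported in microwatts).
# Not all batteries expose power_now; some only have current_now/voltage_now.
from pathlib import Path

microwatts = int(Path("/sys/class/power_supply/BAT0/power_now").read_text())
print(f"Current draw: {microwatts / 1_000_000:.1f} W")
```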
I came here to say that my tiny Raspberry Pi 4 consumes ~10 watts, but then, after noticing some people's home server setups and the associated power consumption, I feel like a child in a crowd of adults 😀
I have an old desktop downclocked that pulls ~100W that I’m using as a file server, but I’m working on moving most of my services over to an Intel NUC that pulls ~15W. Nothing wrong with being power efficient.
We're in the same boat, but it does the job and stays under 45°C even under load, so I'm not complaining.
With everything on, 100W. But I don't have my NAS on all the time, and in that case I pull only 13W, since my server is a laptop.
80-110W
Mate, kWh is a measure of energy volume, like gallons are for liquid. Also, 100 watt hours would be a much more sensible way to say the same thing. What you've said in the title is like saying your server uses 1 gallon of water: it's meaningless without a unit of time. Watts measure the rate of energy flow (pun intended), similar to a measurement like gallons per minute.
For example, if your server uses 100 watts for an hour, it has used 100 watt hours of electricity. If your server uses 100 watts for 100 hours, it has used 10,000 watt hours, aka 10kWh.
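Same thing in code, since the units trip people up:

```python
# Energy (Wh) = power (W) x time (h). Watts are a rate; watt hours are an amount.
def energy_wh(power_watts: float, hours: float) -> float:
    return power_watts * hours

print(energy_wh(100, 1))    # 100.0 Wh
print(energy_wh(100, 100))  # 10000.0 Wh, i.e. 10 kWh
```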
My NAS uses about 60 watts at idle and near 100W when it's working on something. I use an old laptop for a Plex server; it probably uses like 50 watts at idle and 150 or 200 when streaming a 4K movie, I haven't checked tbh. I did just acquire a BEEFY network switch that's going to use 120 watts 24/7 though, so that'll hurt the pocketbook for sure. Soon all of my servers should be in the same place, with that network switch, so I'll know exactly how much power it's all using.
The PC I'm using as a little NAS usually draws around 75 watts. My Jellyfin and general home server draws about 50 watts while idle but can jump up to 150 watts. Most of the components are very old. I know I could get the power usage down significantly with newer components, but I'm not sure the electricity savings would outweigh the cost of sending these to the landfill and creating demand for more new components to be manufactured.
Pulling around 200W on average.
- 100W for the server. Xeon E3-1231v3 with 8 spinning disks + HBA, couple of SATA SSDs
- ~80W for the UniFi PoE 48 Pro switch. Most of this is PoE power for half a dozen cameras, downstream switches and APs, and a couple of Raspberry Pis
- ~20W for a Protectli Vault running OPNsense
- Total usage measured via Eaton UPS
- Subsidised during the day with solar power (Enphase)
- Tracked in home assistant
kWh is a unit of energy, not power
I was really confused by that, and by the fact that the chosen unit wasn't just plain W (even 0.1 kW would be pretty weird).
Wh shouldn't even exist tbh; we should use joules, it's less confusing.
Watt hours make sense to me. A watt hour is just a one-watt draw that runs for an hour; it's right in the name.
Maybe you’ve just whooooshed me or something, I’ve never looked into Joules or why they’re better/worse.
Joules (J) are the SI unit of energy. 1W = 1J/s, which means 1Wh = 3600J, or that 1J is kind of like "1 watt second". You're right that Wh is easier, since everything is rated in watts and it would be insane to measure energy consumption second by second. Imagine getting your electric bill and it says you've used 3,157,200,000J.
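Spelled out with that made-up bill number:

```python
# 1 W = 1 J/s, so 1 Wh = 3600 J and 1 kWh = 3,600,000 J.
joules = 3_157_200_000
kwh = joules / 3_600_000
print(f"{joules:,} J = {kwh:.0f} kWh")  # 3,157,200,000 J = 877 kWh
```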
3,157,200,000J
Or just 3.1572GJ.
Which apparently is how this Canadian natural gas company bills its customers: https://www.fortisbc.com/about-us/facilities-operations-and-energy-information/how-gas-is-measured
I guess it wouldn’t make sense to measure energy used by gas-powered appliances in Wh since they’re not rated in Watts. Still, measuring volume and then converting to energy seems unnecessarily complicated.
Thanks for the explainer, that makes a lot of sense.
At least in the US, the electric company charges in kWh, computer parts are advertised in terms of watts, and batteries tend to be in amp hours, which is easy to convert to watt hours.
Joules just overcomplicates things.
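The amp hour conversion is just Ah times the pack's nominal voltage; for example (the 9Ah/12V figures are an assumption, a typical small lead-acid UPS battery):

```python
# Watt hours = amp hours x nominal voltage.
amp_hours = 9       # assumed rating; use your pack's
nominal_volts = 12  # assumed lead-acid nominal voltage
print(f"{amp_hours} Ah @ {nominal_volts} V = {amp_hours * nominal_volts} Wh")
# -> 9 Ah @ 12 V = 108 Wh
```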
Wow, the US education system must be improved. 1Wh is 3600J. It's literally the same thing, but the name is less confusing, because people tend to confuse W and Wh.
Do you regularly divide/multiply by 3600? That’s not something I typically do in my head, and there’s no reason to do it when everything is denominated in watts. What exactly is the benefit?
Wasn't it stated as the usage during November? 60kWh for November. Seems logical to me.
Edit: forget it, he's saying his server needs 0.1kWh, which is bonkers ofc
Only one person here has posted their usage for November. The OP hasn't mentioned November or any timeframe.
Yeah, mixed up posts; I thought one depended on the other because it was under it. Again, forget my post :-)
For the whole month of November: 60kWh. That's for all my servers and network equipment. On average, it draws around 90 watts.
How are you measuring this? Looks very neat.
Shelly plug, integrated into Home Assistant.
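If you'd rather pull the reading yourself, Home Assistant's REST API works for this; a minimal sketch (the host, token, and sensor.server_power entity are placeholders for your own setup):

```python
# Fetch a power sensor's current state from Home Assistant's REST API.
import requests

resp = requests.get(
    "http://homeassistant.local:8123/api/states/sensor.server_power",
    headers={"Authorization": "Bearer YOUR_LONG_LIVED_ACCESS_TOKEN"},
)
resp.raise_for_status()
data = resp.json()
print(data["state"], data["attributes"].get("unit_of_measurement", ""))
```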
Looks like Home Assistant.
Idles at around 24W. It's amazing that your server only needs 0.1kWh once and then keeps on working. You should get some physicists to take a look at it; you might just have found perpetual motion.
0.1kWh is 100Wh
This is a factual but irrelevant statement
I ate sushi today.
Good point. Now it does make sense. I know the secret to the perpetual motion machine now.
My home rack draws around 3.5kW steady-state, but it also has more than 200 spinning disks
I think I would go over 10kW if I fired up everything. Only about 80 spinny plates of rust though.
What are you hosting?
My server rack has
- 3x Dell R730
- 1x Dell R720
- 2x Cisco Catalyst 3750x (IP Routing license)
- 2x Netgear M4300-12x12f
- 1x Unifi USW-48-Pro
- 1x USW-Agg
- 3x Framework 11th Gen (future cluster)
- 1x Protectli FE4B
All together that draws… 0.1 kWh… in about 327 seconds.
In real terms, measured at the UPS, I have a running steady-state load of 900-1100W depending on what's under load. I call it my computationally efficient space heater, because it generates more heat than my apartment needs in winter, except on the coldest days. It has a dedicated 120V 15A circuit.
Good lord, how much does electricity cost where you are? Combined with the air conditioning to keep the space livable, that would be prohibitively expensive for me
Yeah it’s a bit of a chonk. I don’t remember the exact itemization on the power bill and I don’t have one in front of me.
It's always wild reading the power draws people post here.
I know it's because this is a US- and Europe-centric site and many homelabbers actually run enterprise-size rigs, but my 4-member household runs on 2kW for the entire house lol, and 75% of that is just the A/C we use at night.
My household of 7 averages 900 watts year-round.