Hey
I've been trying to work this out but can't, and I figured STH would have the answer lol
My electricity price is:
Peak price - 54.164 cents per kWh (6am-10pm); outside those hours it's 25% of the peak price.
Power draw of my system on load is 2000-2400 W. Idle is 350-420 W.
How would I work out what it costs to run per hour?
Thanks in advance..
First step would be to recheck your power draw of 2400 W, that seems like a lot (what is your system?).
Then you need to make assumptions about how your server is working during the day:
what % of time or hrs does it idle, and during which hrs of the day?
what % of time or hrs is it under full load?
what % of time or hrs is it under half load? etc..
You need to break the day up into discrete segments if it is not working flat out for all 24 hrs.
Once you have 3 line items of how it is used, and at what times of the day as far as load is concerned, we can get to the next step of calculating the power used. A rough sketch of those line items is below.
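If it helps, here is one way to write those line items down (Python; the hrs and watts are made-up placeholders you would replace with your own):

```python
# Hypothetical usage profile -- edit to match your real day.
# Each segment is (hours per day, average draw in watts).
segments = {
    "idle":     (8, 380),   # e.g. overnight
    "med_load": (8, 1100),
    "max_load": (8, 2200),
}
```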
54 US cents/kWh is a lot??
Next step is to calculate, for each period:
Energy in kilowatt-hours (kWh) is the power P in watts (W) times the time t in hours (hr), divided by 1000. The 1000 gets you from W to kW.
so
kilowatt-hour = watt × hour / 1000
or
kWh = W × hr / 1000
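In code that's a one-liner. A minimal sketch (Python; the 2200 W / 8 hr figures are just an example):

```python
def kwh(watts: float, hours: float) -> float:
    # energy (kWh) = power (W) x time (hr) / 1000
    return watts * hours / 1000

print(kwh(2200, 8))  # -> 17.6
```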
An example could be (assuming the 8 idle hrs fall in the 10pm-6am off-peak window):
Idle 8 hrs a day, which is 8 hrs x 380 W / 1000 = 3.04 kWh at 54.2 x 25% = 13.5 c/kWh, so 3.04 x 13.5 = $0.41
Max load 8 hrs a day, which is 8 hrs x 2200 W / 1000 = 17.6 kWh at 54.2 c/kWh = 17.6 x 54.2 = $9.54
Med load 8 hrs a day, which is 8 hrs x 1100 W / 1000 = 8.8 kWh at 54.2 c/kWh = 8.8 x 54.2 = $4.77
24 hrs is 29.4 kWh, and the cost is about $14.72/day, which is roughly $5,400 a year?? You may need to reconsider what you are running if it is a home server.. newer, lower-power gear would cut that a lot..
You will need to substitute in the real hrs at each load; I assumed 8 hrs at each of 3 different loads. The sketch below makes that easy to redo.
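A rough sketch of the whole calc (Python, under my assumptions: the 8/8/8 hr split, off-peak = 25% of the peak price, and idle hrs landing in the 10pm-6am off-peak window; swap in your real numbers):

```python
PEAK_RATE = 54.164                # cents per kWh, 6am-10pm
OFF_PEAK_RATE = PEAK_RATE * 0.25  # "25% of the price" the rest of the day

# (label, hours per day, average watts, tariff in c/kWh) -- assumed split, edit to suit
segments = [
    ("idle (off-peak)", 8, 380,  OFF_PEAK_RATE),
    ("med load",        8, 1100, PEAK_RATE),
    ("max load",        8, 2200, PEAK_RATE),
]

total_kwh = total_cents = 0.0
for label, hours, watts, rate in segments:
    energy = watts * hours / 1000   # kWh = W x hr / 1000
    cost = energy * rate            # cents for this segment
    total_kwh += energy
    total_cents += cost
    print(f"{label}: {energy:.2f} kWh -> ${cost / 100:.2f}")

print(f"total: {total_kwh:.1f} kWh/day, ${total_cents / 100:.2f}/day, "
      f"~${total_cents * 365 / 100:,.0f}/year")
```

That lands at roughly $14.7/day, a bit over $5,300 a year at these rates.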
Someone please check my calcs and logic.. that 2400 W power draw still seems like a lot...