Yes, I read that, and was rather surprised. Did you really have 300-400W worth of fans?
Hard to say, I've never really looked at the power consumption of a single fan - but I assume it's not a lot. But when a single 1U server could have as many as a dozen little fans in it, then the numbers soon add up.
Hmm, just had a quick look to see; here's a small fan that takes over 6W at full power, which is "a bit more" than I was expecting. Put half a dozen of those in a small box, and you've got a not insignificant additional load that will vary with intake temperature.
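To put some rough numbers on it - everything here is an illustrative assumption except the 6W datasheet figure above - a quick back-of-the-envelope, using the usual fan affinity law that power scales roughly with the cube of speed:

[code]
# Rough per-box and per-rack fan power, and how it swings with fan speed.
# Fan count, server count and speed steps are assumptions for illustration.

fans_per_server = 6        # small 40 mm fans in a typical 1U chassis (assumed)
watts_per_fan_full = 6.0   # full-speed draw, per the datasheet figure quoted above

full_speed_load = fans_per_server * watts_per_fan_full
print(f"One 1U box, fans flat out: {full_speed_load:.0f} W")

# Fan affinity laws: power scales roughly with the cube of speed, so a fan
# controller ramping from 60% to 100% duty as intake temperature rises makes
# a noticeable difference per box.
for duty in (0.6, 0.8, 1.0):
    print(f"  at {duty:.0%} speed: ~{full_speed_load * duty ** 3:.1f} W")

# Across, say, 15 such servers (assumed), the delta between "cool intake,
# fans loafing" and "warm intake, fans flat out" adds up:
servers = 15
delta = servers * full_speed_load * (1.0 ** 3 - 0.6 ** 3)
print(f"Rack of {servers}: ~{delta:.0f} W difference between 60% and 100% fan speed")
[/code]

So a swing of a few hundred watts across a rack between "fans loafing" and "fans flat out" is quite believable.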
In any event, I think my other point remains. In what you describe above, you got the cooler air (on cooler days) for nothing, since it was merely the consequence of the weather! If you had had to explicitly cool down air to the same extent, I imagine that the energy required to do that would have taken a very large chunk out of that 300-400W saving, if not turning the 'saving' into the converse!
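As a rough sanity check on that - the airflow and temperature drop are pure assumptions, and the COP of 3 is just a typical figure for air conditioning plant - the sensible-heat sum looks something like this:

[code]
# What would it cost to mechanically chill the intake air by the few degrees
# the weather provided for free? Sensible heat = density x specific heat x
# volumetric flow x temperature drop; electrical input = heat / COP.

rho_air = 1.2      # kg/m^3, density of air at roomish conditions
cp_air = 1005.0    # J/(kg.K), specific heat capacity of air

airflow = 0.5      # m^3/s total through the kit (assumed)
delta_t = 5.0      # K of extra cooling the cool weather provided (assumed)
cop_cooling = 3.0  # typical air-con cooling COP (assumed)

heat_to_remove = rho_air * cp_air * airflow * delta_t   # watts of heat
electrical_in = heat_to_remove / cop_cooling             # watts at the meter

print(f"Heat to pull out of the airstream: ~{heat_to_remove:.0f} W")  # ~3000 W
print(f"Electrical input at COP {cop_cooling:.0f}: ~{electrical_in:.0f} W")  # ~1000 W
[/code]

Roughly 3kW of heat to pull out of the airstream, or about 1kW at the meter - comfortably more than the 300-400W the slower fans saved.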
Indeed. Typical COP for cooling plant is around 3:1 (ish) - ie one unit of electricity in for 3 units of heat removed. When heating, you also get back the unit of energy you put in as heat, so for heating uses you can see up to around 4:1. It depends a bit on the engineering in the plant, but mostly it's determined by the properties of the refrigerant and the operating conditions. This is something I've actually done - got the manual for an aircon system and looked at the tables of performance vs indoor/outdoor conditions (IIRC at the time I was arguing with an idiot technician who insisted it was "the wrong sort of room" rather than fixing an obvious fault).
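The 3:1 vs 4:1 relationship drops straight out of the energy balance - the heat rejected on the hot side is the heat absorbed on the cold side plus the compressor work - so, as a minimal illustration:

[code]
# Energy balance for a heat pump: heating COP = cooling COP + 1, because the
# compressor's own input ends up as heat on the hot side.

compressor_in = 1.0        # one unit of electrical energy into the compressor
cop_cooling = 3.0          # units of heat removed per unit in (cooling duty)

heat_absorbed = cop_cooling * compressor_in    # pulled from the cold side
heat_rejected = heat_absorbed + compressor_in  # dumped on the hot side

print(f"Cooling COP: {heat_absorbed / compressor_in:.1f}")  # 3.0
print(f"Heating COP: {heat_rejected / compressor_in:.1f}")  # 4.0
[/code]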
Assuming more or less constant "indoor" conditions (which you mostly can for a server room cooling application), as you raise the outside air temperature, you increase the head pressure the compressor has to work against (the boiling point vs pressure curve for the refrigerant), and so increase the work the compressor needs to do. The refrigerant flow rate and compressor suction pressure will remain the same, being fixed by the heat input power (cf latent heat of evaporation x flow rate) and the indoor temperature (again, the boiling point vs pressure curve) respectively. At some point the system will hit a limit - either the mechanical design of the compressor will limit the pressure, or it will have to shut down to avoid overloading the drive motor and mechanical components.
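One way to see the trend (only the trend - real plant gets nowhere near the ideal figures, and the temperatures below are assumptions for illustration) is the ideal-cycle limit, COP = T_evap / (T_cond - T_evap) with temperatures in kelvin:

[code]
# Ideal (Carnot) cooling COP vs condensing temperature. Real plant achieves
# some fraction of this, but the downward trend as the outdoor (condensing)
# side warms up is the same.

def ideal_cop_cooling(t_evap_c: float, t_cond_c: float) -> float:
    t_evap = t_evap_c + 273.15
    t_cond = t_cond_c + 273.15
    return t_evap / (t_cond - t_evap)

t_evap_c = 10.0  # evaporating temperature, pinned by the indoor setpoint (assumed)

for t_cond_c in (30.0, 40.0, 50.0):  # condensing temperature tracks outdoor ambient
    print(f"Condensing at {t_cond_c:.0f} C: ideal COP ~{ideal_cop_cooling(t_evap_c, t_cond_c):.1f}")
# ~14.2, ~9.4, ~7.1 - the same heat load needs progressively more compressor
# work as the outside air warms up.
[/code]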
As a secondary thing, there will most likely be an increase in power to the condenser coil cooling fans as ambient temperature rises.
Conversely, if you lower the indoor temperature, you will reduce the COP (lower temperature -> lower suction pressure -> higher pressure differential across the compressor) and increase the running costs for a given heat load. So running the server room at "put on a thick coat before entering" temperatures adds to operating costs.
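Same ideal-COP relation, this time holding the outdoor (condensing) side fixed and dropping the indoor (evaporating) temperature - again purely illustrative numbers:

[code]
# Ideal cooling COP vs evaporating temperature: a colder room setpoint drags
# the evaporating temperature (and suction pressure) down, widening the lift
# the compressor has to work across.

def ideal_cop_cooling(t_evap_c: float, t_cond_c: float) -> float:
    t_evap = t_evap_c + 273.15
    return t_evap / ((t_cond_c + 273.15) - t_evap)

t_cond_c = 40.0  # condensing temperature held fixed (assumed)
for t_evap_c in (15.0, 10.0, 5.0):  # colder room -> colder evaporating temperature
    print(f"Evaporating at {t_evap_c:.0f} C: ideal COP ~{ideal_cop_cooling(t_evap_c, t_cond_c):.1f}")
# ~11.5, ~9.4, ~7.9 - every degree shaved off the room setpoint costs extra
# compressor work for the same heat load.
[/code]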
Did you have servers where the clock speed and/or the VM loads on them were variable, and controlled by temperature? Could the variation not have been because with a lower inlet temperature the CPUs could be driven harder?
Well, the actual computing load wasn't weather dependent - it showed a regular daily variation that roughly followed business hours. These weren't "churn away at a dataset and finish it as soon as performance allows" loads; they were all "process something when a user does something" loads (web based business applications mostly).
It wasn't something that was easy to spot, as you have to compare graphs from different days and estimate the difference. I'm sure someone with both the skills and the inclination could have done a proper analysis to separate the temperature correlation from the normal daily cycle - I had neither. But it did look to be in the order of 300-400W difference - the UPS only reported load to a resolution of 100W. Realistically, a 5% increase in power consumption doesn't seem unrealistic when you consider how much air needs to be shifted in some of these compact boxes when fed with high temperature inlet air.
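As a cross-check on those figures (nothing here is new data, just the numbers already quoted turned around): if 300-400W really is about a 5% swing, the implied total load is in the 6-8kW range, which is a believable size for a small server room and also sits comfortably with the earlier per-rack fan estimate:

[code]
# If a 300-400 W swing is roughly 5% of the total, what total load does that imply?

for delta_w in (300, 400):
    implied_total_kw = delta_w / 0.05 / 1000  # 5% swing -> implied total load
    print(f"{delta_w} W swing at ~5% implies a total load of ~{implied_total_kw:.0f} kW")
# 6 kW and 8 kW respectively - consistent with the rack-level fan estimate of
# a few hundred watts difference between cool-day and warm-day fan speeds.
[/code]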