I, Technology: Is using computers to heat houses such a daft idea?
2012-01-17
OK, it's cold. Winter has finally come - or come back, given the cold snap of early December. And there's nothing like sitting in a freezing room, worrying about the cost of oil, to focus the mind on alternative methods to heat a house.
Meanwhile, we are told, computers are excessively power-hungry. Particularly desktops with their 200-watt-plus power supplies. All that energy has to go somewhere, according to the law of conservation of energy: indeed, it is largely dissipated through the various processing, memory and other chips as heat, which is then sucked away and blown out of vents.
To all intents and purposes, then, a computer is running as a heater - an unintentional one, perhaps, but since the only other places that energy can go are noise and light, quite an efficient one in its own way. The daft idea is this: why don't we make more of that capability and actually use computers as a source of heat?
A precedent has already been set by data centre designers. Server rooms are notoriously un-green, in that they gobble whatever power is made available to them and spew it out as heat which then requires even more power to carry away. News stories about data centres warming nearby buildings or even greenhouses appear with reasonable frequency - the cynic in me says that it's an attempt to deflect attention, but equally, it's good that the heat isn't being completely wasted.
So, why not homes? Mainframes house chips on ceramic substrates, and ceramics are also used in heating elements - I'm showing my ignorance to an alarming extent here, but could a material used to insulate also be used to distribute heat? Perhaps not enough to warm a whole house, one might argue, but let's think about this a bit. Rather than a radiator, would it be possible to create a wall-mounted device, architected to achieve high temperatures purely by performing calculations? These could of course be random floating point operations, but perhaps, more usefully, they could be non-random, programmed to achieve a goal: helping to find a cure for AIDS, for example, or supporting the search for little green men.
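To make the "random floating point operations" idea concrete, here is a deliberately toy sketch - nothing like it appears in the original column, and the ten-minute duration, the per-core layout and the throwaway arithmetic are all arbitrary assumptions. It simply keeps every core busy, so that the machine's entire electrical draw comes out as room heat; the more useful variant would hand those same cycles to a volunteer-computing client instead.

    import multiprocessing
    import time

    def burn(duration_s: float) -> None:
        """Keep one CPU core busy with floating point work for duration_s seconds.

        The arithmetic itself is throwaway; the point is that every joule the
        CPU draws while doing it ends up as heat in the room.
        """
        x = 1.0001
        end = time.monotonic() + duration_s
        while time.monotonic() < end:
            # A tight block of multiplications and divisions keeps the FPU busy
            # without letting x grow out of range.
            for _ in range(100_000):
                x = (x * 1.0000001) / 1.0000001

    if __name__ == "__main__":
        # One worker per core for ten minutes -- a crude "radiator on" setting.
        workers = [
            multiprocessing.Process(target=burn, args=(600.0,))
            for _ in range(multiprocessing.cpu_count())
        ]
        for w in workers:
            w.start()
        for w in workers:
            w.join()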
To take this one stage further (and I know I'm stretching things to the limit here), perhaps such processor time could even be rented out, for money, to an organisation that could make use of it. Given an appropriate, pre-tested network connection, maybe Amazon could use my radiators as burst capacity for its elastic cloud service - for a fee, which could offset, or even cover, my heating bills. I might even turn a profit.
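As a rough illustration of those economics - every figure below (the tariff, the machine's draw, the rental rate and the core count) is invented for the example, not taken from the column or any real price list - a back-of-the-envelope comparison might look like this:

    # Back-of-the-envelope: does renting out the "radiator" cover its electricity?
    # Every number here is an illustrative assumption, not a real tariff or price.
    power_kw = 0.2             # a 200 W machine, like the desktops mentioned above
    tariff = 0.15              # electricity cost per kWh (assumed)
    rate_per_core_hour = 0.04  # what a renter might pay per core-hour (assumed)
    cores = 4                  # assumed core count

    cost_per_hour = power_kw * tariff               # running cost of the heat
    income_per_hour = rate_per_core_hour * cores    # rental income for the same hour

    # Since the heat was wanted anyway, that electricity would have been spent on
    # heating regardless; any rental income simply offsets the bill.
    print(f"cost/hour: {cost_per_hour:.3f}  income/hour: {income_per_hour:.3f}")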
But is this really stretching things? After all, micro-generation from solar panels might once have been seen as mere fantasy, yet a flurry of neighbours have bought into such schemes recently - to the extent that passing conversations have moved from the weather to the return on investment a sunny day can bring.
Home heat through processing might seem similarly out there, but the major vendors are at least thinking about it. Microsoft Research calls the concept the 'Data Furnace', suggesting that data center (sic) owners could save hundreds of dollars per server per year in cooling costs, if servers were distributed around willing (and appropriately equipped) households.
While the concept has associated challenges, such as data security (which could likely be handled with encrypted virtual machines), it also has a lot going for it - not least that centralised data centres would generate less heat, and households would gain a ready source of it, saving on personal fuel bills.
Indeed, it's hard to think of a single element of the technology needed to make this happen that doesn't already exist today. Perhaps, in fact, the idea of using computers to heat houses is not that daft after all.
[First published on ZDNet]