Using electricity directly for heating is a colossal waste of resources. In a way, it's like burning money to stay warm instead of buying fuel. Or perhaps more like buying general stuff from a shop and burning it, instead of waiting until it becomes garbage and burning it then. The reason is that it's ridiculously easy to turn electricity into heat. In fact, it's so easy that almost no matter what you do with electricity, it ends up as heat in the end.
While I understand it's not very common in the world in general to use electricity for heating, it's quite common in private houses in Norway. In order to reduce this silly conversion of high quality energy like electricity into crappy quality energy like heat, I suggest at least making it do something useful on the way. For example, if you install a one kilowatt electrical oven, or a one kilowatt Beowulf cluster, the result in terms of heating is the same. The full one kilowatt gets converted into heat in the end, the difference being that a Beowulf cluster can also give you a few hundred gigaflops of useful calculations along the way.
«But ah,» you say, «I don't have anything I would like to have calculated at the moment, at least nothing at that level, and besides isn't a cluster somewhat more expensive than an oven?». However, even if you don't need the calculations yourself, there are plenty of people who do. In fact, several schemes for doing useful things with your electricity on its way to heat already exist. For example
SETI@home, which is a program you install that uses your computer to search astronomical observation data for signs of extraterrestrial life. It runs in the background, only kicking in when you're not using your computer yourself, thus ensuring that your computer works at a fairly constant load, generating heat and doing useful stuff.
There are several other @home-initiatives, and most of these use
BOINC, which is a software package for setting up distributed grid computing. So what I suggest is that someone should start a company that will sell you a Beowulf cluster for the same price as an oven with the same power rating, in return for all those gigaflops, which can in turn be sold to other people who actually have something they need to calculate. Actually, computing time is so much more expensive than electricity that such a company might give you the computer and the electricity for free, in return for not having to manage a huge datacenter, with cooling and all that jazz.
Currently, you can expect to get around half a gigaflops per watt, meaning a one kilowatt computer would give you about 500 gigaflops. Running at one kilowatt, it consumes one kilowatt-hour of electricity every hour, which at roughly 1 NOK per kilowatt-hour works out to around 1 NOK for 500 gigaflops-hours. Incidentally, the gigaflops-hour is an existing unit of computation, also known as an Allocation Unit (AU). And according to
the HECToR user site, one AU on HECToR costs either £0.01 or £0.10, which (at roughly 10 NOK to the pound) is either 50 times or 500 times more expensive than what you get from a Beowulf cluster in your basement.
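For the curious, here is a minimal back-of-the-envelope sketch of that comparison in Python. The numbers are the ones quoted above; the 10 NOK per pound exchange rate is my own rough assumption.

```python
# Compare the cost of a gigaflops-hour (AU) from a basement Beowulf
# cluster against the quoted HECToR prices. Figures from the text:
# 0.5 gigaflops per watt, 1 NOK per kWh, £0.01 or £0.10 per AU.
# The exchange rate is an assumption for illustration.

flops_per_watt = 0.5           # gigaflops per watt
power_kw = 1.0                 # a one kilowatt cluster
electricity_nok_per_kwh = 1.0  # electricity price, NOK per kWh
nok_per_gbp = 10.0             # assumed exchange rate, NOK per £

# In one hour the cluster delivers this many gigaflops-hours (AUs)
# while consuming one kilowatt-hour of electricity.
au_per_hour = flops_per_watt * power_kw * 1000        # 500 AU
cost_nok_per_au = electricity_nok_per_kwh * power_kw / au_per_hour

for hector_price_gbp in (0.01, 0.10):
    ratio = hector_price_gbp * nok_per_gbp / cost_nok_per_au
    print(f"HECToR at £{hector_price_gbp:.2f}/AU is {ratio:.0f}x the "
          f"basement price of {cost_nok_per_au:.3f} NOK/AU")
```

Running it prints a ratio of 50 for the £0.01 price and 500 for the £0.10 price, matching the figures above.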
-Tor Nordam