What would you do with a spare 153GHz of CPU processing power?

tankman1989

This is all hypothetical so I'm just looking for other techs' thoughts and ideas here. Here is the situation:

You have 8 computers running AMD 6-core 3.2GHz CPUs. Running at 100%, they put out the exact amount of heat needed for a specific application in a business you own, so these machines are run at 100% (or maybe 80%, or whatever turns out to be the optimal processing rate for longevity) CPU capacity and are basically little heaters. Since heat is the main product needed, all the spare computing capacity is available for any application you desire. What are some uses for the spare CPU utilization? What would be some ideal applications for this?
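Just to put rough numbers on that (the 125W per CPU is an assumed figure for a 6-core chip of that class, not a measured spec):

```python
# Back-of-envelope numbers for the scenario above.
# The 125 W TDP per 6-core CPU is an assumption, not a measured figure.

num_machines = 8
cores_per_cpu = 6
clock_ghz = 3.2
cpu_tdp_watts = 125                     # assumed heat output per CPU at full load

total_ghz = num_machines * cores_per_cpu * clock_ghz
total_heat_watts = num_machines * cpu_tdp_watts
total_heat_btu_per_hr = total_heat_watts * 3.412    # 1 W ~= 3.412 BTU/hr

print(f"Aggregate clock: {total_ghz:.1f} GHz")              # 153.6 GHz
print(f"CPU heat: {total_heat_watts} W (~{total_heat_btu_per_hr:.0f} BTU/hr)")
```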

(I figured something like password cracking, rainbow table generation, or the "folding"/distributed computing projects would be definite candidates here)

Note: I know that GPUs may be more suitable for some applications, but the main need is heat generation so the electronic source is irrelevant - you can interchange GPUs with CPUs if you like. It might be better to utilize graphics cards as they can be upgraded more easily than CPUs, for the most part...
 
Why the hell are the PCs being used as heaters??

On a humanitarian level, and given it's spare CPU time, why not get involved in downloading one of the, for example, cancer research programs where they want you to analyse some data and upload the results back?

Try to put it to some good use if it's spare. Just a thought!
 
I would sell the 8 computers, purchase a $200 heater that puts out twice the BTUs of 8 fully loaded processors, pocket the cash left over, and save twice as much on the monthly electrical bill.

CPU's put out a lot of heat but are extremely inefficient heaters if that is their purpose.
 
I was trying to figure out a way to utilize the byproduct of computers (heat) in a productive manner. Computers are basically little heaters that can do magical things. I didn't have any one specific application for the heat, but one thought is preheating water before it is used for domestic hot water. Imagine a restaurant that uses 500 gallons of hot water a day. The water comes in at 55 degrees F and needs to be heated to 180 for use in the kitchen and bathrooms. Before the water enters the water heater it passes through the computer system, raising the temp from 55 to say 95-110; that means the water heater needs to heat it that much less, saving energy.

Heating domestic hot water is ideal because the water is consumed and there is always a fresh supply of 55 degree water (which can be pushed through as fast as the hose/pumps can handle, so over-clocking is a good possibility), which means you don't have to run radiators with fans. This is the perfect setup for massive cooling of computers.
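Here's a rough sketch of those pre-heating numbers (the 100 degree F target is an assumed midpoint of the 95-110 range; the constants are the usual ones for water):

```python
# Energy needed to pre-heat 500 gal/day of 55 F water up to ~100 F.
# 8.34 lb/gal and 1 BTU per lb-F are standard figures for water;
# the 100 F target is an assumed midpoint of the 95-110 range above.

gallons_per_day = 500
temp_in_f = 55
temp_preheat_f = 100
lb_per_gallon = 8.34
btu_per_kwh = 3412

btu_per_day = gallons_per_day * lb_per_gallon * (temp_preheat_f - temp_in_f)
kwh_per_day = btu_per_day / btu_per_kwh

print(f"Pre-heat load: {btu_per_day:,.0f} BTU/day (~{kwh_per_day:.0f} kWh/day)")
# ~187,650 BTU/day, roughly 55 kWh/day -- so eight desktop machines at
# ~1 kW combined (about 24 kWh/day) would only cover part of it.
```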

Just as a note, in the winter I hardly run the heater in the room that is my office because the computer heats it, but in the summer it is the room that needs the most cooling. Space heating is the easiest use of the heat by-product, but in most areas it is only needed 3-6 months of the year.

So, do you understand the question now, and why I said it was hypothetical and that the choice of computer parts is very versatile (as heat is one of the main needs)?

The cancer research is a good idea but I'm more interested in making the machines pay for themselves first. If that is completed or there is no other work then all spare clock cycles would be dedicated to that research.

I'd really appreciate it if anyone who has the opinion that it is "stupid" to use computers as heaters would just not post, as you obviously don't understand the big picture (this is not directed at anyone specific; I just know that many people on this board seem close-minded and like to discourage out-of-the-box thinking). Thank you.
 
Note: I know that GPUs may be more suitable for some applications, but the main need is heat generation so the electronic source is irrelevant - you can interchange GPUs with CPUs if you like. It might be better to utilize graphics cards as they can be upgraded more easily than CPUs, for the most part...


Well, since the electronic source of the heat is irrelevant, I'd tell you to get a space heater with an adjustable dial and to quit trying to justify a reason to have your computers on. :D

So, do you understand the question now, and why I said it was hypothetical and that the choice of computer parts is very versatile (as heat is one of the main needs)?

Sure, computer parts are versatile, but their function isn't to be a heater. What you're talking about is putting the byproduct, their waste heat, to use. You shouldn't be thinking in terms of heat as the main function.

So, with that in mind, I understand what you're asking...but it's entirely impractical. The only thing that it's even remotely practical for is being able to turn down the thermostat a degree or two in the winter, because the computer will be on and doing something useful anyway (like being a POS, or a server, or something of that nature). Running Folding@home to heat your house is wasting your money, not saving it.

Do you understand what people are telling you in this regard? It's not that what you're asking is "stupid", it's that your thinking is backwards on it. The efficiency isn't there for the purpose you're asking about, and since the heat is the byproduct, you shouldn't want to be generating more of it simply to do another (very small) task... I don't know how to make what I'm saying clearer... but understand that I DO understand exactly what you're asking... I also don't know of anything it'd be useful for, unless you could somehow rent out your clock cycles for rendering or something. I don't know of any distributed computing project that pays for clock cycles... and that seems like the only kind of thing that could keep CPU usage at 100% constantly.
 
I would sell the 8 computers, purchase a $200 heater that puts out twice the BTUs of 8 fully loaded processors, pocket the cash left over, and save twice as much on the monthly electrical bill.

CPU's put out a lot of heat but are extremely inefficient heaters if that is their purpose.

Really? The laws of thermodynamics aren't perfect, but what kind of $200 heater are you talking about? Anything electrical will convert ALL electricity into heat, one way or the other.

You can play with capacitance to try and fool the meter, but that don't change the laws of physics, Captain, she cannot go any faster! :D

For useful number-crunching, I recommend BOINC


Saying that, I would prefer selling a few computers and buying solar panels (both electric and the simple water-tube ones) :)
 
Really? The laws of thermodynamics aren't perfect, but what kind of $200 heater are you talking about? Anything electrical will convert ALL electricity into heat, one way or the other.

You can play with capacitance to try and fool the meter, but that don't change the laws of physics, Captain, she cannot go any faster! :D

For useful number-crunching, I recommend BOINC


Saying that, I would prefer selling a few computers and buying solar panels (both electric and the simple water-tube ones) :)


Not that I'm saying this is what was being said, but a space heater is more efficient, exactly because of the laws of thermodynamics. If the desired goal is to heat a room, for example, a 700-watt space heater is going to be better suited to do that than would be 700 watts of combined computer components. The reason for that is that the space heater is operating at a higher temperature and the heat is being transferred to the air at a better efficiency. The CPUs are eventually generating the same amount of heat, but the rate at which it heats the air is slower... and the heat loss of the room to the outside causes it to take longer (if ever) to heat the room to the same temp.

So...anyway...yes...that's all I'm going to say in this thread. I have better things to do. :D
 
Well, since the electronic source of the heat is irrelevant, I'd tell you to get a space heater with an adjustable dial and to quit trying to justify a reason to have your computers on. :D



Sure, computer parts are versatile, but their function isn't to be a heater. What you're talking about is putting the byproduct, their waste heat, to use. You shouldn't be thinking in terms of heat as the main function.

So, with that in mind, I understand what you're asking...but it's entirely impractical. The only thing that it's even remotely practical for is being able to turn down the thermostat a degree or two in the winter, because the computer will be on and doing something useful anyway (like being a POS, or a server, or something of that nature). Running Folding@home to heat your house is wasting your money, not saving it.

Do you understand what people are telling you in this regard? It's not that what you're asking is "stupid", it's that your thinking is backwards on it. The efficiency isn't there for the purpose you're asking about, and since the heat is the byproduct, you shouldn't want to be generating more of it simply to do another (very small) task... I don't know how to make what I'm saying clearer... but understand that I DO understand exactly what you're asking... I also don't know of anything it'd be useful for, unless you could somehow rent out your clock cycles for rendering or something. I don't know of any distributed computing project that pays for clock cycles... and that seems like the only kind of thing that could keep CPU usage at 100% constantly.

I think you are not understanding my intentions on either side of the equation, as what I am proposing not only makes sense but is analogous to standard practice in production and manufacturing processes. It is often difficult to present the entire situation, fully detailed, in a thread and keep it short enough to be attractive to read - so we run into misunderstandings like this.

This situation would not be useful for the average person or even business.

Let's just say that there is one computer (maybe a Cray or something :D) that draws 20kW. You lease or rent the processing time out to companies for various applications; maybe to investment firms looking for trends so they can make more accurate predictions. So, you have your machine being used for its appropriate application, making income to pay off the machine cost. You also have a TON of heat coming off this machine, which can be water cooled if you choose to do so. Next door, in the same building (think duplex building), you have a laundromat (or other business that uses lots of hot water - you own both businesses). You run the water through the Cray before it enters the water heater, raising it 40 degrees (or whatever). At $.17/kWh this comes out to about $30,000 per year in energy usage at 100% duty year round. Obviously you would see a decrease in energy/fuel cost by whatever you saved by pre-heating the water. If you hadn't used a Cray as the heater you would have had to spend X amount more to heat the water the first 40 degrees.
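Checking my own arithmetic on that (straight kWh times rate, nothing fancy):

```python
# A 20 kW machine at $0.17/kWh, 100% duty cycle, year round.

power_kw = 20
rate_per_kwh = 0.17
hours_per_year = 24 * 365               # 8,760 hours

kwh_per_year = power_kw * hours_per_year
cost_per_year = kwh_per_year * rate_per_kwh

print(f"{kwh_per_year:,.0f} kWh/year -> ${cost_per_year:,.0f}/year")
# 175,200 kWh/year -> ~$29,800/year, i.e. the ~$30,000 figure above,
# and essentially all of that energy ends up as heat available to the water loop.
```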

Now I know most people won't use Crays, but this can be done on smaller or larger scales. It can be broken down into smaller machines or scaled up to larger ones (think data centers or server farms). If you can't think of your own applications, or if you don't have the vision to see this expanded/contracted to the appropriate size for your own application, then I can't really say anything else. This application is neither impractical, un-doable, nor anything else like that. It seems that some people on this board think their way is the only way and they never think outside the box.

So, now that an example has been given (and again, if you don't have the vision to expand/contract or change the application then I can't help - no one can) - if you know of any business applications which utilize supercomputers and/or number crunching (like the investment forecasting I mentioned earlier), I would really appreciate you mentioning it. I can figure out applications for which the heat can be used - that is not a problem.

Now if anyone thinks this is still impractical, think of a data center running 2,000 servers that use maybe 500W/server. That would require 1MW to operate at 100% utilization. This heat has to go somewhere. Piping water to each machine is no different than running electricity in many aspects, and using that 1MW of byproduct heat would only make sense (maybe you co-locate a data center and a professional laundry service next to or in the same building - it's all about scale). Are you with me on this? Is it making sense?
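The same math for the data center case:

```python
# 2,000 servers at an assumed 500 W each, running flat out.

servers = 2000
watts_per_server = 500

total_mw = servers * watts_per_server / 1_000_000
heat_btu_per_hr = servers * watts_per_server * 3.412   # 1 W ~= 3.412 BTU/hr

print(f"Load: {total_mw:.1f} MW (~{heat_btu_per_hr:,.0f} BTU/hr of waste heat)")
# 1.0 MW, roughly 3.4 million BTU/hr that has to be rejected somewhere.
```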
 
Really? The laws of thermodynamics aren't perfect, but what kind of $200 heater are you talking about? Anything electrical will convert ALL electricity into heat, one way or the other.

You can play with capacitance to try and fool the meter, but that don't change the laws of physics, Captain, she cannot go any faster! :D

For useful number-crunching, I recommend BOINC


Saying that, I would prefer selling a few computers and buying solar panels (both electric and the simple water-tube ones) :)

Thank you for your reply! I totally agree with you about the thermodynamics. If I had to spend $30,000 per year to heat water I would much rather buy a $200,000 heater that also could be rented out at $100/hr (that's $876,000 at 100% utilization in a year) while it still performed the function of heating the water - all for the same inputs of energy/money (excluding purchase price of equipment - but that was NEVER the point of the thread as it is irrelevant). This is how I see this scenario - Please tell me if I have not taken some absolute factor into account which needs to be added to this equation. Thanks!
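For what it's worth, the $876,000 figure is just the hourly rate times a full year:

```python
# Gross income from leasing compute time at $100/hr, 100% utilization.

rate_per_hour = 100
hours_per_year = 24 * 365               # 8,760 hours

gross_income = rate_per_hour * hours_per_year
print(f"${gross_income:,}/year")        # $876,000/year, as quoted above
```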
 
I think you are not understanding my intentions on either side of the equation, as what I am proposing not only makes sense but is analogous to standard practice in production and manufacturing processes;



I understand you perfectly. However, you started out talking about doing this with something like eight consumer level CPUs, not something like a super computer that WILL be used for more intensive tasks, as that's its purpose.

Basically, your question was answered that what you asked wasn't at all efficient, so you changed the specifications.

When you have a chance to actually design a project like this, to use the generated heat off of a data center or a Cray, or whatever, to heat water before it enters your laundromat...sure...bring the topic up again. But for your eight desktop computers, it's entirely and totally impractical.


By the way, check this out:
http://www.google.com/search?sourceid=chrome&ie=UTF-8&q=using+heat+generated+from+data+center
 
I like the idea of capturing waste energy and recycling it into other purposes, and I think your idea is a fun, if impractical, one. Kinda like wiring in 14 200-watt light bulbs to heat your room.
 
I like the idea of capturing waste energy and recycling it into other purposes, and I think your idea is a fun, if impractical, one. Kinda like wiring in 14 200-watt light bulbs to heat your room.

But that would probably be the cheapest heater you can find!
 
I would calculate the value of Pi to a trillion places recursively, WHILE running Folding@Home, WHILE simultaneously encoding hundreds of video files, WHILE using the heat to make me some Kraft Dinner :)
 