#### Comment Preferences

• ##### I like your numbers(2+ / 0-)
Recommended by:
lone1c, bryfry
The only complaint I have is that very few people take a shower with just hot water.  Also, 140 degree water is a tad uncomfortable.

A good plan today is better than a perfect plan tomorrow.

• ##### Doesn't actually matter(0+ / 0-)

At least not with respect to the thermodynamics.

In order to get 5 gallons of water at 120 degrees F (or 125, or whatever), you have to spend at least the energy needed to heat the water from "room" temperature to whatever the final temperature is. Mixing cold water with 140 degree water, versus just heating it directly to the final temperature, doesn't reduce the energy needed for the heating process. It just changes the "path" you use to get from the starting point to the end point.

The argument I give above assumes ideal processes, and tells you the minimum amount of energy needed under the circumstances; if you get away with using less energy, you've violated the laws of thermodynamics.
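The path-independence claim above can be checked with a quick sketch. This is my own illustration, not from the comment: the starting temperature (60 F), target (110 F), and tank temperature (140 F) are assumed round numbers, but the result holds for any values.

```python
# Two paths to 5 gallons of 110 F water from 60 F cold water:
#   Path 1: heat all of it directly to 110 F.
#   Path 2: heat a fraction to 140 F, then blend with 60 F cold water.
# Both require the same total heat input (~2075 Btu), as the comment argues.

LB_PER_GAL = 8.3                         # pounds of water per gallon
GALLONS = 5.0
COLD, TARGET, TANK = 60.0, 110.0, 140.0  # degrees F (assumed values)

# Path 1: heat everything straight to the target temperature.
direct_btu = GALLONS * LB_PER_GAL * (TARGET - COLD)

# Path 2: energy balance on the mix gives the hot fraction f needed
# so the blend comes out at the target temperature.
f = (TARGET - COLD) / (TANK - COLD)
mixed_btu = f * GALLONS * LB_PER_GAL * (TANK - COLD)

print(round(direct_btu, 6), round(mixed_btu, 6))
```

The algebra makes the point directly: the `(TANK - COLD)` factors cancel, leaving the same `mass x (TARGET - COLD)` in both paths.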

• ##### but temperature is wrong(1+ / 0-)
Recommended by:
lone1c

Shower water temperature is typically about 104 degrees F, and very few people take a shower hotter than 110 F. But you also underestimate the flow rate by a lot -- most showers use about 2 gpm -- and since the two errors roughly offset, I didn't bother to correct your calculations.

Also, this calculation is actually easier in old-fashioned units than in metric. A Btu is defined as the energy needed to raise 1 pound of water by one degree F. So, if you assume 5 minutes, 2 gallons per minute, 8.3 lbs per gallon, and a 40 degree temperature rise:

5 x 2 x 8.3 x 40 = 3320 Btu per shower

One kWh is 3412 Btu, so that's about 1 kWh per shower if you have electric hot water. It takes a 25 watt bulb 40 hours to use 1 kWh.
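The arithmetic above, written out as code (same numbers as the comment: 5 minutes, 2 gpm, 8.3 lbs/gallon, 40 degree rise):

```python
# Back-of-the-envelope energy cost of one shower, per the comment's figures.
MINUTES = 5
GPM = 2                  # gallons per minute
LB_PER_GAL = 8.3         # pounds of water per gallon
DELTA_T_F = 40           # temperature rise in degrees F
BTU_PER_KWH = 3412       # Btu in one kilowatt-hour

btu = MINUTES * GPM * LB_PER_GAL * DELTA_T_F   # Btu per shower
kwh = btu / BTU_PER_KWH                        # ~1 kWh with electric heating
bulb_hours = kwh * 1000 / 25                   # hours a 25 W bulb runs on that

print(btu, round(kwh, 2), round(bulb_hours, 1))   # → 3320.0 0.97 38.9
```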

But I'm not so sure your comparison to a 25 W bulb helps people put this in perspective -- the CFL is very efficient, after all. The average home uses more than 10,000 kWh per year, so a single 5 minute shower uses less than one ten-thousandth of the average US home's electric usage.

BTW, if you want to get picky, the calculation is actually much more complex than you indicate.  In heating climates in the winter, a significant fraction of the heat in the hot water is dumped into the home before it leaves via the drain.  This heat can offset some of the house heating load.

• ##### You are correct, sir.(0+ / 0-)

I know that the actual calculation is a lot more complicated than what I had there. But I was trying to do a "back-of-the-envelope" calculation--enough to get reasonable numbers that people could put into perspective--and I don't think an exact analysis would make a huge difference. (Even using a heat pump would basically cancel out the effect of the higher flow rate, so the corrections more or less cancel each other out.)

And you're right that one shower is not a large total--my own numbers back that up. It's enough energy to power one light bulb for a day. Against a year's energy usage, it's not much. But it can add up: for a family of four, that's about 1440 showers each year, or about 1440 kWh per year. And that's a pretty big fraction of 10,000 kWh per year.
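The family-of-four scaling works out like this (using the thread's rough 1 kWh per shower, and 360 days to match the comment's round "about 1440"):

```python
# Annual shower energy for a family of four, using the thread's estimates.
KWH_PER_SHOWER = 1.0      # rough per-shower figure from the thread
SHOWERS_PER_DAY = 4       # one shower per person per day (assumed)
DAYS = 360                # round number matching the comment's ~1440

annual_kwh = KWH_PER_SHOWER * SHOWERS_PER_DAY * DAYS
share = annual_kwh / 10_000   # versus average US home electric usage

print(annual_kwh, f"{share:.0%}")   # → 1440.0 14%
```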

• ##### yes...(0+ / 0-)

showering can amount to a noticeable fraction of your energy use, but a single shower is not nearly as big a deal as many other actions...

For example, one shower is about equal to half a mile of driving your car.  A very small reduction in miles driven per week would save more energy than eliminating all showering...  I'd rather consolidate two trips into one once a week than stop showering altogether, and so would my family ;)

• ##### I think so would most people(1+ / 0-)
Recommended by:
NRG Guy

I'm certainly not encouraging people to stop taking showers.

My goal was just to have people start thinking about energy--and water--usage in their daily lives. I thought this was an interesting result, and thought I'd share.

• ##### well. not really(1+ / 0-)
Recommended by:
lone1c

> The argument I give above assumes ideal processes, and tells you the minimum amount of energy needed under the circumstances; if you get away with using less energy, you've violated the laws of thermodynamics.

You haven't really used any thermodynamics in this analysis -- it's more like heat transfer.  But you could certainly heat water with less external energy input than your calculation indicates -- ever heard of a heat pump?  A heat pump water heater can reach efficiencies of over 200%, just as a heat pump can heat a home at greater than 100% efficiency (if you define efficiency in the simplistic first-law sense rather than the second-law sense...)

• ##### Huh?(0+ / 0-)

The definition of heat capacity is definitely thermodynamics. I haven't used any heat transfer arguments at all. It's all based on enthalpy content--I haven't said anything about friction losses in the pipe, heat transfer from the water to the pipes carrying the water, or any other fluid mechanics type arguments.

• ##### well...(0+ / 0-)

I covered these concepts in heat transfer class, but obviously the content is covered in both.  It's just that the focus of thermodynamics is on concepts like work, entropy, and heat engines -- not simplistic little heat capacity calculations that you cover in high school physics.

BTW, you are still incorrect in your statement that you have violated the laws of thermodynamics if you use less energy.

• ##### Less work, not less energy(0+ / 0-)

Yes, you can get more energy out than the work put into a heat pump. But you can't get more energy out than the total energy transferred to the hot "sink." (But that's not really relevant in this problem, since most home water heaters aren't heat pumps.)
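The work-versus-heat distinction can be made concrete with a coefficient of performance (COP). The COP of 2 here is an assumed illustrative value, and the 3320 Btu figure comes from the thread's shower calculation:

```python
# With a heat pump at COP = 2, the electrical work purchased is half the
# heat delivered to the water ("200% efficient" in first-law terms), but
# the heat the water receives is unchanged -- that's the thermodynamic floor.
HEAT_NEEDED_BTU = 3320    # heat the shower water must receive (from the thread)
COP = 2.0                 # assumed heat pump coefficient of performance

work_input_btu = HEAT_NEEDED_BTU / COP   # electricity actually purchased
print(work_input_btu)   # → 1660.0 Btu of work delivers 3320 Btu of heat
```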

(I'll agree I probably could have been more careful in my analysis--but like you said, I was just doing a "simplistic little heat capacity calculation," not a detailed study. Sometimes there's value in the quick-and-dirty calculation, too.)
