Three years of changing television technology, including the introduction of 3D TV, can make three-year-old statistics on U.S. household TV practices look a bit dated.
EIA RECS survey data, for example, the standard data set for household consumer research, is published on a four-year cycle.
A recent agency-requested National Academy of Sciences methodological review points the way toward more granular and timely residential electricity consumption data sets. Until then, the most recent RECS survey data, from 2009 and shown in the pie chart at the top of the page, reminds us of the importance of television in American life.
Only 1.3% of America's 113 million households lack a television. At the other end of the scale, 8.5% of American households own five or more televisions.
Finding a television's operating cost generally reduces to a straightforward two-step process: measuring how much electricity the television consumes, then pricing that electricity at local rates.
In 2009, approximately half of U.S. households used standard-definition televisions. The growing popularity of HDTVs will be reflected in the next RECS data set, due for publication in 2013.
Given current data, it's reasonable to suggest that most households have at least one HDTV plugged in and turned on during the day.
Two complementary resources serve as starting points for determining TV operating costs: the Environmental Protection Agency's ENERGY STAR Television Product List and the local energy bill, which gives the retail price of electricity.
Step 1: Find the model of your television(s) among the approximately four hundred and ninety television brands and models on the Television Product List. Make sure the model number and screen size match, then read the last column in that row, which gives the Estimated Annual Energy Consumption (kWh/year). The most recent models consume anywhere from 35 to 400 kWh/year.
A look at the notes indicates that the estimate is based on average use of five hours per day. If the TV being researched is turned on ten hours a day, double the estimate; if it is on for two and one-half hours a day, cut the estimate in half.
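That hours adjustment can be sketched in a few lines of Python (the function name and the five-hour baseline from the product-list notes are assumptions for illustration):

```python
def adjust_annual_kwh(listed_kwh_per_year, hours_per_day):
    """Scale an ENERGY STAR annual-consumption estimate, which
    assumes 5 hours/day of use, to a different daily viewing time."""
    BASELINE_HOURS = 5.0  # stated basis of the product-list estimate
    return listed_kwh_per_year * (hours_per_day / BASELINE_HOURS)

# A set listed at 208.27 kWh/year, watched 10 hours/day,
# consumes double the listed estimate: 416.54 kWh/year.
print(adjust_annual_kwh(208.27, 10))
```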
Step 2: With the yearly kWh in hand, find the retail price of electricity per kilowatt-hour (kWh) from the local energy bill.
Multiply the kWh/year estimate by the per-kWh cost of residential electricity to estimate the yearly cost of operating the TV model in question.
For example, a 42-inch Insignia LCD TV with 1920x1080 resolution uses 208.27 kWh/year when operated five hours a day. The average cost of residential electricity in area X might be listed at 11.52 cents/kWh.
Math: 208.27 kWh x $0.1152/kWh = $23.99, the cost of running that TV in area X for the year.
For a three-TV family with those same TVs running ten hours a day, multiply the original result by six, and the cost for the household runs about $144/year. That does not count the electricity used to make popcorn, or to run the lights, while household members watch TV.
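The whole calculation can be pulled together in one short sketch; the 208.27 kWh/year figure and $0.1152/kWh rate are the example values above, and the function name is ours:

```python
def annual_tv_cost(kwh_per_year, price_per_kwh, hours_per_day=5.0):
    """Yearly operating cost in dollars, scaling the 5-hour/day
    ENERGY STAR estimate to the actual daily viewing time."""
    return kwh_per_year * (hours_per_day / 5.0) * price_per_kwh

one_tv = annual_tv_cost(208.27, 0.1152)  # single set, 5 hours/day
three_tvs = 3 * annual_tv_cost(208.27, 0.1152, hours_per_day=10)

print(round(one_tv, 2))    # 23.99
print(round(three_tvs, 2)) # 143.96 -- about $144/year
```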
Consumers planning net-zero-electricity homes, along with outdoor enthusiasts, run similar arithmetic to determine how long a television will run on battery power.
In these instances, energy-efficient 12V televisions provide the measuring standard. Usually the current, in amps, that the set draws during one hour of operation serves as the measuring tool.
A TV drawing 2 amps, for example, would need a 10 Ah battery (a rather inexpensive, small deep-cycle battery) to operate outdoors for five hours.
Connected to a solar panel battery charger, the television could operate for its entire lifetime with no energy cost beyond the initial price of the solar panel battery charger.
© 2010-2012 Patricia A. Michaels