This diary is related to my 'smartmeter lied' diary. It's about learning the basics of electricity and electronics. It's not hard stuff, really. I was a technician for ten years and I have an associate's degree in electronics. I'm not an expert by far, but I do know a few things, and I'd like to help take the mystery out of electrical/electronic devices.
I want you to be more aware of your electric devices. It can help you save money, and it can help protect you when they install a new smartmeter on your home. If they haven't yet, I'm sure they will.
The first step is to take an inventory of all of your appliances. You should list everything in your home and how many watts each device is. If you don't have a lot of things, this can be quite easy. If you do have a lot of things, I would suggest taking a notebook, going room to room, looking at everything that is plugged in, and making a note of the device and the wattage. The wattage is usually on a tag on the device somewhere. If not, there are other ways of getting a rough idea. Most items that you have bought at a store really don't use that much energy. It's the home appliances that use the most: things like an electric heater, an electric water heater, or a central air conditioning system. Those use more energy than other devices. I live in a one-bedroom apartment and everything is electric here. I added up the total wattage for everything I own and it's less than 2,000 watts. The electric heater that's built in to the apartment, though, is at least 4,000 watts. My hot water heater says it's 5,500 watts. Those are the major users in my home. This isn't about getting an exact measure of everything in your home. It's just to get an idea.
A few basics:
Everything you plug in requires power to operate. Most electronics have standby power supplies. These power supplies don't use much energy on their own, but many of them combined can add up to as much as one to two hundred watts. It all depends on your devices. A safe assumption is 5 watts per device. It may be more or less, but it's a ballpark. The manual that came with the device will sometimes tell you both standby power and normal operating power.
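To see how that ballpark adds up, here's a tiny sketch. The 5-watt figure is the rough assumption from above, and the device count is hypothetical; substitute your own inventory.

```python
# Rough estimate of combined standby draw, assuming ~5 W per device.
# Both numbers are ballpark assumptions, not measurements.
STANDBY_WATTS_PER_DEVICE = 5

def standby_total_watts(device_count, watts_each=STANDBY_WATTS_PER_DEVICE):
    """Combined standby wattage for a number of always-plugged-in devices."""
    return device_count * watts_each

# e.g. 25 plugged-in gadgets at ~5 W each:
print(standby_total_watts(25))  # 125 watts, idling around the clock
```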
When you turn the device on, it requires more power to function. It will use up to its maximum rated wattage. It will either use the same amount of energy for as long as it's on, or it will use a varying amount up to its listed wattage.
A lightbulb uses its listed wattage. A hairdryer uses its listed wattage. A toaster uses its listed wattage. Simple devices are either on or off.
Your TV, on the other hand, will vary its usage according to the images and sound it's asked to recreate. An audio amplifier will vary its usage. A computer will vary its usage.
The more you ask from a device, the more energy it uses, but it doesn't use more than its maximum listed wattage. Just because an audio amplifier calls itself a "1,000 watt amplifier" doesn't mean it uses that much on a regular basis. It takes a lot of current to drive some speakers under some conditions (heavy bass and drum lines at high volumes). Usually an amplifier uses much, much less.
A device like an electric heater should (I think) use its maximum wattage while it's on. I'm not an electrician (never was), so I don't know as much about the high voltage AC devices, but the basics of electricity are the same for any electrical device. The main controller of energy use with a heater or AC unit is the time the device is on, and that's controlled by your thermostat setting. Neither the heater nor the AC unit cares what the temperature outside is. It's the thermostat setting (how long the device is on) that matters. A hot water heater will run on its own schedule, or at least mine does. I am surprised how little energy it actually uses. It helps that it's insulated. Note that using hot water will make the hot water heater run more, and dropping the thermostat for the air conditioning by more than a few degrees at a time will make it work more due to the way AC works.
A device like a coffee maker or electric stove will start by using more energy and then use less in an effort to maintain a certain temperature. An electric hot water heater does the same.
High wattage devices that are only on for very short periods (like a microwave oven or a toaster) generally aren't big concerns.
If you have a smartmeter, you can determine how much energy each of your major users will use by unplugging everything else and then only activating that one device for a set period of time.
All devices require power while they are on. It's both wattage and time that determine your energy bills. If you have listed all of your devices, you should be able to more easily see which devices to keep a closer eye on. If a device only uses 10 watts while operating at full power, like my DTV converter box does, then I can leave it on all day if I want and it won't cost me much money.
The TV, on the other hand, says it requires 100 watts, and so it will cost me ten times as much to run it for the same period. But my electric heater is thousands of watts, so it completely dwarfs everything I own combined. One hour of heat is worth several hours of everything I own turned on. Please don't worry too much about precisely how much energy your major users will use. I only have control over the things I own. I simply try to use large items less. Things like the stove and dishwasher are only moderate energy users. The heater dwarfs them as well.
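To put rough numbers on that comparison, here's a small sketch using the wattages mentioned above (10 watt converter box, 100 watt TV, and a heater taken at 4,000 watts) and an assumed rate of 12 cents per kWh; your rate will differ.

```python
# Hourly running cost for a few example devices.
# Wattages come from the text; the 12-cent rate is an assumption.
RATE_PER_KWH = 0.12  # dollars per kilowatt-hour

def cost_per_hour(watts, rate=RATE_PER_KWH):
    """Cost in dollars to run a device of the given wattage for one hour."""
    return watts / 1000 * rate

for name, watts in [("DTV converter", 10), ("TV", 100), ("heater", 4000)]:
    print(f"{name}: ${cost_per_hour(watts):.4f} per hour")
```

The TV line comes out ten times the converter box, and the heater dwarfs both, which is the whole point.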
A brief note about your energy bill:
You get charged per kilowatt-hour. It's often 12 cents per kilowatt-hour (kWh) or something like that. Your bill might be 1,000 kWh in a given month. What is a kilowatt-hour?
A kilowatt-hour is one thousand watt-hours. It's the product of wattage and time (specifically, one hour). If you ran a 1,000 watt device for one hour, you would use 1 kWh. If you ran a 100 watt device for one hour, you would use 0.1 kWh. Note that the hour remained the same but the decimal point shifted. If you start with the wattage of a device, put that wattage in kilowatts by multiplying by .001, and then multiply by the hours the device is on, you will get the kWh of energy it takes to run that device for that time period.
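That arithmetic can be written as a tiny sketch, just to make the formula concrete:

```python
def kwh(watts, hours):
    """Energy in kilowatt-hours: wattage converted to kilowatts, times hours."""
    return watts / 1000 * hours

print(kwh(1000, 1))  # 1.0 kWh: a 1,000 watt device running for one hour
print(kwh(100, 1))   # 0.1 kWh: same hour, decimal point shifted
```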
If you add up the wattages of everything you own, put that number in kilowatts, and then multiply by 24, you will get the maximum possible kWh that your devices can use in a 24-hour period. The trick is to account for all devices, which can be difficult.
The main point of all of this is to get you thinking. Don't be alarmed by those who want to tell you that 'phantom power' is causing your energy bill to 'skyrocket'. It's not true. One major energy user is worth a great many small devices. It's how you use them. One hundred watts of devices will add 2.4 kWh to a day's bill if you run them all day long. At 12 cents per kWh, that's about 29 cents per day. It adds up, but 'skyrocketing' is not what I'd call it.
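Here's that 'phantom power' arithmetic spelled out as a quick sketch, again with the 12-cent rate as an assumption:

```python
# 100 watts of small devices left on for a full 24-hour day,
# at an assumed rate of 12 cents per kWh.
watts = 100
hours = 24
rate_per_kwh = 0.12  # dollars; your rate may differ

daily_kwh = watts * hours / 1000       # watt-hours to kilowatt-hours
daily_cost = daily_kwh * rate_per_kwh  # dollars per day
print(f"{daily_kwh} kWh a day, about {daily_cost * 100:.0f} cents")
```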
It's good to use CFLs. One 100 watt bulb uses as much energy, and costs you as much on your bill, as five 20 watt CFLs. A low power desktop computer can do the same job for less energy. LCD monitors use less than old CRTs.
There are many ways to use less, and becoming aware is a good first step. I hope this helps.
Other methods for determining wattage (in a pinch):
The formula P = E x I
P is power (in watts)
E is voltage (in volts)
I is current (in amps)
Example: a modem uses a transformer, and it says "Output 12VDC 800mA". That means its output is 12 volts (E), and the max supplied current is 800 milliamps, or 0.8 amps (I), and that would be 12 x 0.8 = 9.6 watts (P).
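The modem example, written out as a sketch of the P = E x I formula:

```python
def power_watts(volts, amps):
    """P = E x I: power in watts from voltage (volts) and current (amps)."""
    return volts * amps

# The modem transformer: 12 V output at 800 mA (0.8 A) max
print(f"{power_watts(12, 0.8):.1f} watts")  # 9.6 watts
```

Remember this is the *maximum* the transformer can supply; the modem may draw less in practice.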
This is not meant to be a technical discussion, but feel free. There are many here who, I'm sure, know more than I do about electronics. Add and correct where needed. I just want people thinking. I don't want to burden people with apparent power versus true power, for example. It's interesting, but not needed, I don't think.