
How Many Amps Does a TV Use? – Amps Calculator

Written by Edwin Jones / Fact checked by Andrew Wright


The television is one of the most common household appliances. Most of us have one in our homes, and if you love watching TV shows, you’re probably curious how much that activity contributes to your energy bill.

So how many amps does a TV use? Model, screen size, screen type, and other features are all significant factors to consider when calculating the average TV amperage.

If you want to know how much energy your TV uses, you can read all about it below.

How Many Amps Does a TV Pull?


The average 113W, 120V TV draws about 0.94 amps. However, wattage varies from model to model with differences in features and screen size, which is why different TVs pull different amounts of current.

For example, say you want to know how many amps your 55-inch TV uses. All you need to do is find its wattage rating and divide it by its voltage to get the total amps.

If your 55-inch flat-screen TV uses 150 watts at 120V, it draws only 1.25 amps. In that case, you can safely run the TV alongside other low-amp appliances on the same circuit.

What about a 12V TV? How many amps does it use?

A 12V, 60-watt television with a 19-inch screen draws an average of 5 amps. How long you can run it depends on your battery's capacity.
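The math above can be sketched in a few lines of Python. The 100 Ah battery capacity below is an assumption chosen purely for illustration, and the runtime estimate ignores real-world losses:

```python
# Current drawn by a 12 V TV, plus a rough battery-runtime estimate.
watts = 60            # TV power rating from the example above
volts = 12            # TV / battery voltage
amps = watts / volts  # 60 / 12 = 5.0 A

battery_ah = 100      # assumed 100 Ah battery, purely for illustration
hours = battery_ah / amps  # ideal runtime, ignoring inverter and other losses

print(amps)   # 5.0
print(hours)  # 20.0
```

A larger battery or a lower-wattage TV stretches the runtime proportionally.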

As mentioned above, different types of TVs pull different amperages. The following table compiles average power consumption data for various 32-inch TVs, comparing 7 common types.

TV Energy Consumption Comparison (32-inch screen)
Tube (CRT) TV: 150W
Plasma TV: 160W
LCD TV: 65 – 70W
LED TV: 41W
OLED TV: 57W
4K UHD TV: 80W (average)
Smart TV: 157W

1. Tube Television

Tube TVs, also known as CRTs, have the highest energy consumption: an average of 150 watts for a 32-inch screen and 80W for a 19-inch one.

2. Plasma TV

As the name implies, this TV uses plasma technology to generate an image. For a 32-inch screen, its average energy consumption is about 160 watts.


3. LCD TV

An LCD (Liquid Crystal Display) TV uses a liquid crystal panel to form its image and has better picture quality than the two types above. Its energy consumption for a 32-inch screen averages 65 to 70 watts.


4. LED TV

LED (Light Emitting Diode) TVs use LED lamps to produce an image. This type of TV is known for its energy efficiency, demanding only an average of 35 watts for a 24-inch screen and 41 watts for a 32-inch one.


5. OLED TV

OLED (Organic Light Emitting Diode) TVs produce better images than LED TVs. As the technology continues to improve, it has been upgraded to AMOLED (Active Matrix Organic Light-Emitting Diode).

OLEDs require an average of 75 watts for a 43-inch screen and 57 watts for a 32-inch one.


6. UHD TV

UHD (Ultra High Definition) TVs offer the best resolution and picture quality. Resolution usually starts at 3840 pixels wide by 2160 pixels high, commonly known as 2160p or 4K, and the latest version is 8K UHD.

A 4K UHD TV draws 80 watts on average, higher than LED and OLED TVs.

7. Smart TV

A smart television can connect to the internet (through Wi-Fi or LAN cable) and allow us to watch videos online. This model of TV consumes 157 watts on average.

How to Calculate the Amps Your TV Uses

Calculating the amps used by your TV is very easy. To find how many amps a TV draws, use the formula:

Amps = Watts ÷ Volts

For example, suppose your television has a power rating of 116 watts and a voltage rating of 120V. Simply divide the wattage by the voltage; in this case, you will get around 0.97 amps.
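As a quick sketch, the formula translates directly into Python. The helper name `amps_drawn` is just for illustration; the wattage figures come from the examples in this article:

```python
def amps_drawn(watts: float, volts: float) -> float:
    """Current in amps a device draws: Amps = Watts / Volts."""
    return watts / volts

print(round(amps_drawn(116, 120), 2))  # 0.97 -- the example above
print(round(amps_drawn(150, 120), 2))  # 1.25 -- a 55-inch, 150 W TV
```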

For a better understanding of these electrical ratings, here are short definitions of amperage, wattage, and voltage.

1. Amperage

The current flowing through a circuit is called the amperage and is measured in amperes (amps).

2. Wattage

Wattage measures how much power a device consumes. Generally, higher-wattage devices consume more energy than lower-wattage ones.

3. Voltage Supply

The voltage supply refers to the mains supply in a household. You may wonder how many volts a typical household uses, since you've probably seen ratings of 110V, 120V, as well as 220V and 230V.

Don't be confused. These ratings fall into two main groups: 110-120V, the standard outlet supply in the US, and 220-240V, the standard in countries such as the UK.

The terms and computation above can help you calculate amps for TV. Knowing the cost, on the other hand, requires a different computation.

What Other Factors Affect Your TV's Energy Consumption?

Generally, two main factors affect the energy consumption of a TV: the model, and the size and type of the screen.

Older CRT televisions are much less energy-efficient than modern LCD TVs. Likewise, larger screens and more advanced display types require more energy.

According to the research firm NPD (National Purchase Diary), the average American television screen size in 2020 was around 50 inches.

If you want to know how much your TV energy costs per annum, read on.

The table below shows the cost of using TVs of different sizes and average wattages, based on an electricity rate of $0.11 per kilowatt-hour (kWh). The calculations consider 5 hours of TV viewing per day.

Size of TV          50″      55″      60″      65″      70″
Avg. Watts          108 W    118 W    131 W    150 W    163 W
Daily Cost (5 h)    $0.06    $0.06    $0.07    $0.08    $0.09
Weekly Cost         $0.41    $0.45    $0.50    $0.57    $0.63
Monthly Cost        $1.80    $1.90    $2.20    $2.50    $2.70
Yearly Cost         $21.70   $23.60   $26.20   $30.10   $32.60

1. kWh Per Year

Your total energy consumption is calculated in kWh per year, combining the device's wattage (or kW) and the hours you use it. Here's how to figure out how much electricity your TV consumes annually.

You must first understand that 1 kW equals 1,000 watts. Therefore, if a 65-inch TV uses 150 watts, divide that number by 1,000 to convert it to kW. You will obtain 0.15 kW, which you then multiply by the hours of usage to get kilowatt-hours (kWh).

Suppose you watch TV for five hours a day on average: 0.15 kW multiplied by five hours equals 0.75 kWh per day. Multiply this figure by the 365 days in a year to determine the TV's annual energy use.

As a result, your 65″, 150-watt TV consumes 273.75 kWh each year.
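The steps above can be condensed into a small sketch. The function name is illustrative, and the figures match the 65-inch example:

```python
def annual_kwh(watts: float, hours_per_day: float, days: int = 365) -> float:
    """Yearly energy use: convert watts to kW, then multiply by hours used."""
    return (watts / 1000) * hours_per_day * days

print(round(annual_kwh(150, 5), 2))  # 273.75 kWh for a 150 W TV at 5 h/day
```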

2. Cost per Year

Cost per year here refers to the cost of energy you pay annually. However, there is no set cost for energy, since it will vary depending on the utility provider where you live. Thus, it could be high or low.

Assuming the energy rate in your area is 11 cents per kWh, multiply the total kWh per year by this rate. In this case, a TV that uses 273.75 kWh annually will cost about $30 in electricity per year.
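Continuing the sketch, the cost step is one multiplication. The 11-cent rate is the article's assumed figure; substitute your own utility's rate:

```python
def annual_cost(kwh_per_year: float, rate_per_kwh: float) -> float:
    """Yearly electricity cost at a given rate per kWh."""
    return kwh_per_year * rate_per_kwh

print(round(annual_cost(273.75, 0.11), 2))  # about 30.11 dollars per year
```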

Reducing Your TV’s Energy Consumption

If you want to reduce the energy consumption of your television, here are some tips you should consider.

  • Lower the brightness or turn on ECO mode
  • When listening to music using your TV, turn off the screen
  • Use the sleep timer feature if you tend to fall asleep while watching
  • Unplug your TV when it’s not in use
  • Purchase an energy-efficient model


Knowing how many amps a TV uses, as well as its power consumption rating, is essential for circuit safety and may also change how you use your appliance.

Modern TVs tend to use more energy, even in standby mode, than earlier generations of LED TVs. This change results from the new features and functions that come with technological advances.

If you want to save energy while watching television, our tips above will surely help you.

Furthermore, check our other guides on the best ways to calculate the amperage of home appliances.
