When you’re dealing with electrical equipment, you always see ratings about things like watts, amps, and volts. You know that these have to do with the current, but what exactly do they mean, and how do you calculate the right amount of each one?
Fortunately, it doesn’t require an advanced degree in math or physics. We can go through all three concepts, show you how to work with them, and explain why they even matter in the first place.
What Is an Amp?
Amp is actually short for ampere, and this is the unit of measure for an electric current. Specifically, the amp measures how much electric charge moves past a point in a single second. One amp corresponds to roughly 6.24 quintillion (6.24 × 10^18) electrons passing a point each second.
There are a couple of things to understand about amps. First, the number of electrons moving per second is another way of saying how much charge is moving per second. Many people use the water hose analogy: if you think of current like water running through a hose, the amps tell you the flow rate, which is how much water is passing through the hose each second.
The other important takeaway is related to safety. While all of these metrics matter for safety, amps matter the most when it comes to electrocution. A person can withstand surprisingly high voltages as long as the resulting current stays low. A current of just 0.1 amps across the heart can be fatal, even though that sounds like a small number.
Amps also matter for calculations when you’re designing circuits or working with electrical devices, but we’ll get to that a little later.
What Is a Volt?
A volt is another measure related to current, but this one is telling you about the electric potential. That’s a bit of a weird physical concept, so to keep this simple, the voltage in a circuit is telling you how much force is pushing or pulling the electrons when they move. Using the water analogy, the voltage is like the pressure in a hose.
Another way to understand voltage is to relate it to resistance. Mathematically, the voltage is equal to the amperage times the resistance (V = A × R, better known as Ohm’s law), where the resistance is measured in ohms. This relationship tells you two interesting things.
First, a circuit with no resistance is a short circuit: even a tiny voltage will drive an enormous current. In other words, you need some resistance for a circuit to work safely (ignoring the very weird case that comes up with superconductors).
Second, for the same circuit, if the resistance goes up, so does the voltage needed to supply the same current. Several things can impact resistance, including the material used in a circuit, the width of the wires, and even the age of the circuit. For the same current, higher resistance also means more heat produced, which is an important safety concern.
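As a quick sketch of the V = A × R relationship in code (the current and resistance values here are arbitrary examples, not ratings from any real device):

```python
# Ohm's law: voltage (V) = current (A) * resistance (ohms)
amps = 2.0       # assumed current
ohms = 60.0      # assumed resistance

volts = amps * ohms
print(f"{amps} A through {ohms} ohms needs {volts} V")  # 120.0 V

# For the same current, doubling the resistance doubles the voltage required.
print(amps * (ohms * 2))  # 240.0
```

Run it with different values of `ohms` to see how the required voltage scales with resistance.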
What Is a Watt?
Our third unit of the day is the watt, which measures the power in a circuit. Remember that power is a specific term in physics and electrical systems: one watt is equal to one joule of energy per second. In terms of IT equipment, the wattage tells you the amount of power that a device draws or that a power supply can handle.
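Because a watt is a joule per second, you can turn a wattage into an energy figure over time. Here is a minimal sketch using an assumed 40 W device running for one day (utilities bill energy in kilowatt-hours):

```python
# Energy used by a device, given its power draw and runtime.
# One watt = one joule per second; 1 kWh = 1,000 W for one hour.
watts = 40    # assumed draw of a small device
hours = 24    # one day of runtime

joules = watts * hours * 3600    # 1 hour = 3,600 seconds
kwh = watts * hours / 1000       # kilowatt-hours

print(f"{joules:,} J per day, or {kwh} kWh")
```

That works out to about 0.96 kWh per day for this hypothetical device.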
We’ll get into math in a bit, but the most important thing to understand about watts is that the power of a circuit is determined by the draw of the endpoint device.
Say you have a power supply that can provide 1,000 watts and a router that needs 40 watts to run. Is it safe to plug the router into that power supply?
Yes. In this circuit, the router will draw only the 40 watts it needs from the supply, and the remaining capacity simply goes unused.
The takeaway here is that your power supply needs to have enough wattage to support everything you plug into it, and you don’t have to worry about a high-watt supply damaging your low-watt devices (the same is not necessarily true when comparing amperage).
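To see what that takeaway looks like in practice, here is a minimal headroom check. The device names and wattages are made-up examples, not real ratings:

```python
# Check whether a power supply can cover the total draw of a set of devices.
supply_watts = 1000          # assumed supply capacity

device_draws = {             # hypothetical devices and their draw in watts
    "router": 40,
    "switch": 150,
    "server": 450,
}

total = sum(device_draws.values())
print(f"Total draw: {total} W of {supply_watts} W available")
print("OK" if total <= supply_watts else "Overloaded: use a bigger supply")
```

In real planning you would also leave some margin below the supply's rated maximum rather than loading it to exactly 100%.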
So, how many watts are in a system? That’s what the next section covers.
How Do You Calculate Them?
Let’s get into the math of the watts calculation, and it’s pretty easy. The wattage of a circuit is equal to the voltage times the amperage (W = V × A). In other words, the power is the rate at which charge passes through a circuit multiplied by the force that is applied to it.
Let’s do a quick sample. Say you have a 120V outlet on a 10A breaker. The maximum wattage that an appliance can draw from that outlet is 1,200W (120V × 10A). It really is that easy.
To help with a little algebra, if you know any two of these metrics, you can figure out the third:
- V = W / A
- A = W / V
With these three formulas, you can always determine the metrics of your circuit.
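Those three formulas can be wrapped up in one small helper. This is just an illustrative sketch: pass any two of the values and it fills in the third.

```python
def solve_circuit(watts=None, volts=None, amps=None):
    """Given any two of watts, volts, amps, return (watts, volts, amps).

    Uses W = V * A and its rearrangements V = W / A and A = W / V.
    """
    if watts is None:
        watts = volts * amps
    elif volts is None:
        volts = watts / amps
    elif amps is None:
        amps = watts / volts
    return watts, volts, amps

# The outlet example: a 120V outlet on a 10A breaker.
print(solve_circuit(volts=120, amps=10))     # (1200, 120, 10)
print(solve_circuit(watts=1200, volts=120))  # (1200, 120, 10.0)
```

Either call recovers the same circuit: 1,200W at 120V and 10A.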
Why Do You Need to Know?
Why are we even talking about this? There are two major concerns: planning and safety.
If you’re planning to use a bunch of electric devices, you need to make sure you have appropriate power supplies for them. That means you need to get the wattage right.
At the same time, there are safety concerns. While a wide range of devices can run on the same voltage, many devices can’t tolerate differing amperage. If too much current flows through a device, it can degrade or destroy that device pretty quickly. That’s why complicated circuits use a bunch of different breakers and fuses to limit the current.
And, there’s the issue of fire. As you pump more and more current through a single system, you raise the fire risk associated with it. This is why you want to segregate electrical currents to the extent that you can and try to prevent your three numbers from simply growing. Maybe you could power your entire network on a single, massive power supply, but you shouldn’t because you’re increasing fire risks when you do that.
With networking equipment, most power supplies are designed to provide less than 1,500W. There are exceptions, and if you need those exceptions, you should probably be working with a network engineer who has already mastered these concepts and more.
Additional Learning Center Resources