Is PoE more efficient than plugging in adapters for each network device?
And at what scale does it start to matter?
—
For context: I'm going for a 3-node mesh router setup plus 2 switches, and was wondering whether, over 5 years, the difference in electricity cost would be less than the extra upfront cost. The absolute maximum cable length would be around 30 m.
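For a rough feel of the numbers, here's a back-of-the-envelope sketch. Everything in it is an assumption for illustration (device wattages, adapter and PoE path efficiencies, electricity price), so swap in your own figures:

```python
# Rough 5-year cost comparison: one PoE switch vs. individual AC adapters.
# All numbers are assumptions, not measurements.

HOURS_PER_YEAR = 8766
PRICE_PER_KWH = 0.15               # assumed electricity price, $/kWh
DEVICE_LOADS_W = [8, 8, 8, 5, 5]   # 3 mesh nodes + 2 switches, assumed draw

def yearly_cost(loads_w, efficiency):
    """Energy cost per year for the given device loads at a given supply efficiency."""
    input_w = sum(w / efficiency for w in loads_w)
    return input_w * HOURS_PER_YEAR / 1000 * PRICE_PER_KWH

adapters = yearly_cost(DEVICE_LOADS_W, 0.80)   # assumed wall-adapter efficiency
poe      = yearly_cost(DEVICE_LOADS_W, 0.85)   # assumed PoE + 30 m cable efficiency

print(f"Adapters: ${adapters:.2f}/yr, PoE: ${poe:.2f}/yr, "
      f"5-year difference: ${(adapters - poe) * 5:.2f}")
```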
Devices pull power; it doesn't matter how much the PoE switch can supply, as each device will only pull what it needs.
A switch will recognize a PoE device and push the full 15.4 W by default. The device then has the ability to communicate back to the switch to specify how much power it actually needs.
The switch can put out 15.4 W, but it doesn't control how much power flows. The device can draw 15.4 W if it wants to, but it won't necessarily do so. The switch can cap its power output by lowering the voltage it supplies, but it can't push a set amount of power into a device; that would violate the basic physics of electrical circuits.
Put a 2.4 kΩ resistor in as the "device", and at 48 V the absolute maximum that will flow is about 1 W (P = V²/R). The switch would have to supply roughly 192 V to force that resistor to dissipate 15.4 W, which would put it way out of spec. And there's nothing preventing the device from being smart enough to adjust its effective resistance to keep holding 1 W. That's basic Ohm's law.
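To make that arithmetic explicit, here's the same Ohm's law calculation in a few lines of Python, using the numbers from the example above (purely illustrative):

```python
import math

R = 2400.0   # the 2.4 kΩ "device"
V = 48.0     # nominal PoE supply voltage

power_drawn = V**2 / R                    # P = V²/R ≈ 0.96 W at 48 V
voltage_for_15_4w = math.sqrt(15.4 * R)   # V = sqrt(P·R) ≈ 192 V

print(f"At {V:.0f} V the resistor dissipates {power_drawn:.2f} W")
print(f"Forcing 15.4 W into it would take about {voltage_for_15_4w:.0f} V")
```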
The device must negotiate if it's going to use more than the default 15.4 W, or it can advertise that it's low power so the switch can allocate the rest of its power budget to other ports as needed. But the switch only acts as a limiter: it can be capable of supplying more than the device takes, yet it can't force the device to take more.
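Here's a toy model of that allocate-versus-draw distinction. It is a simplification, not the actual 802.3af/at classification or LLDP handshake, and the 4 W "low-power" figure is just an illustrative value:

```python
# Toy model of a PoE switch's per-port power budget: the switch *allocates*
# a budget per port, but the device decides what it actually draws, and the
# draw can never exceed the allocation.

DEFAULT_ALLOCATION_W = 15.4   # 802.3af default per-port maximum

class PoePort:
    def __init__(self, requested_w=None):
        # A device may advertise a lower requirement; otherwise the switch
        # reserves the default allocation for the port.
        self.allocated_w = requested_w if requested_w is not None else DEFAULT_ALLOCATION_W
        self.drawn_w = 0.0

    def draw(self, watts):
        # The device pulls what it needs; the allocation only acts as a cap.
        self.drawn_w = min(watts, self.allocated_w)
        return self.drawn_w

camera = PoePort()                 # no advertisement: 15.4 W reserved
phone  = PoePort(requested_w=4.0)  # advertises a low-power need: 4 W reserved

print(camera.draw(2.0))   # 2.0 -> only 2 W actually flows
print(phone.draw(6.0))    # 4.0 -> capped at its allocation
```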
It will allow up to 15.4 W to be drawn from the port by default; if the device only uses 2 W, then it only supplies 2 W.
You can't push 15.4 W into a device and have it go nowhere; that's not how electricity works.