Is PoE more efficient than plugging in adapters for each network device?
And at what scale does it start to matter?
—
From my perspective: I’m planning a 3-node mesh router setup plus 2 switches, and was wondering whether, over 5 years, the electricity difference would outweigh the extra upfront cost. The absolute maximum cable length would be around 30 m.
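For what it’s worth, the 5-year comparison is easy to sketch as a back-of-envelope calculation once you have a measured wattage difference. All the numbers below (3 W of extra draw, $0.30/kWh) are hypothetical placeholders, not measurements:

```python
# Back-of-envelope break-even sketch. The inputs are hypothetical
# placeholders -- substitute real numbers from a watt-meter reading
# and your own electricity tariff.

HOURS_PER_YEAR = 24 * 365

def five_year_cost_delta(extra_watts, price_per_kwh, years=5):
    """Extra electricity cost of the less efficient option over
    `years`, assuming a constant extra draw in watts."""
    kwh = extra_watts / 1000 * HOURS_PER_YEAR * years
    return kwh * price_per_kwh

# Example: if one option draws 3 W more than the other at $0.30/kWh:
delta = five_year_cost_delta(extra_watts=3, price_per_kwh=0.30)
print(f"5-year cost difference: ${delta:.2f}")  # -> $39.42
```

If the PoE gear costs more than that delta up front, electricity alone won’t pay it back in 5 years; the real unknown is the wattage difference, which is why measurement matters.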
I don’t think I would trust anyone here’s answer. The only way to know is to test it. Theoretical talk about ‘more conversions’ rather discounts the entire field of power supply design. We need someone to put a Kill A Watt on a system using PoE, and then measure that same system again using external adapters.
I tried Googling to see if anyone had done that and didn’t find any real testing (on the first page of Google, at least).
I do have these findings to report: 1) PoE is marketed as cost saving largely on install and maintenance costs: fewer cable runs for awkward AP locations, less electrical work, etc. So we cannot assume that PoE’s wide adoption is due to electricity cost savings. And 2) improving the efficiency of newer PoE power supplies is an active area of development, meaning a particularly old set of PoE hardware might be less efficient than expected.