It's Time for a New Power Standard Fri, Dec 23, 2005
This is absurd. This 'system' has no purposeful design, is physically chaotic, wastes a lot of power, and is unreliable. It's also unfriendly to alternative and mobile energy solutions.
It's time for a new power standard: a DC-based system, with a bit of intelligence.
Here's an offhand sketch of what this might look like:
 Devices would plug into the power hub with a multipin connector. Basic power would be provided at +5 and +12 V (an "A" connector - possible to provide from a car cigarette-lighter adapter). Advanced power, which could power PCs and charge laptops, would include +5, +12, +24, and -12 V (do we still need -12?) - this is the "B" connector. Devices would have to report their maximum current draw in one of two ways. Dumb devices could represent the current by having selected connector pins shorted to ground, expressing the power draw as a binary number; an 8-bit value with a 500 mA unit would permit any current from 0 to 127.5 amps to be expressed in half-amp steps. Intelligent devices would report their identity and power draw over I2C. (Why not I2C for everything? Well, from what we see with cheap "USB" devices that aren't USB devices at all but do suck power - lights, cellphone chargers, fans, you name it - we can be sure that companies will introduce devices that don't have intelligence. If we provide a way for those devices to easily report their current consumption, then at least we'll avoid blowing fuses.) Dumb devices never get UPS power backup; intelligent devices may request it.
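To make the shorted-pin scheme concrete, here's a quick sketch in Python - purely illustrative, and the pin count and least-significant-bit ordering are my own assumptions - of how a hub might decode a dumb device's current report:

```python
# Hypothetical "dumb device" current report: eight connector pins, each either
# shorted to ground (1) or open (0), encode the maximum draw as an 8-bit
# binary number in 500 mA units.

UNIT_MA = 500  # one step = 500 mA, per the scheme described above

def max_draw_ma(pins):
    """pins: sequence of 8 ints/bools, pins[0] = least significant bit."""
    value = sum(bit << i for i, bit in enumerate(pins))
    return value * UNIT_MA

# A device that shorts pins 0 and 2 reports 0b101 = 5, i.e. 2.5 A max:
print(max_draw_ma([1, 0, 1, 0, 0, 0, 0, 0]))  # 2500 (mA)

# All eight pins shorted gives the full-scale value: 255 * 0.5 A = 127.5 A.
print(max_draw_ma([1, 1, 1, 1, 1, 1, 1, 1]))  # 127500 (mA)
```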
It wouldn't take much to get this standard off the ground -- if the uninterruptible power supply companies got together with the PC power supply manufacturers, we'd have a strong start; you wouldn't have to change PC design at all, just replace the PSU with a (mostly empty) box with a power hub connector on the back. Once consumers start buying power hubs to power their PCs, they would undoubtedly start putting pressure on the router/switch/LCD monitor/USB hub/inkjet printer manufacturers to adopt the standard, and I predict that the manufacturers would jump at the chance to shave a few bucks off their production costs by excluding a power supply adapter.
Where could this go? Well, if it's a success in the PC world, I can see it moving to the home entertainment market. Perhaps one day power hubs will become part of our residential wiring systems, and we'll see AC and DC sockets both appearing at our wall-plates. With LED-based lighting, perhaps AC sockets will eventually exist only in the kitchen and the workshop. And perhaps in the future, intelligent house systems will communicate presence-status and other information to systems through this same interface -- when there's no one in the room, there's no point keeping displays, speakers, and lights turned on.
Another short-term application would be chargers for mobile devices (phones, music players, headsets, etc.). Several people have already floated the idea in that space.
For such low-draw devices, however, I wouldn't be surprised if some existing, non-specialized standard were adopted instead. For example, if all phones had mini-USB ports from which they could charge (and normal charging cables were just power-only USB cables), then that might be "good enough" - which I suppose is a shame, since a custom-designed standard could be far better.
The problem with using USB for power is that the USB standard relies on devices being intelligent and reporting their power consumption in order to avoid overloads, and almost all "power-only" USB devices aren't intelligent. USB also provides only 500 mA at 5 V DC, not enough for many applications.
For charging purposes, the main problem with USB's draw limitations is that what you really want is to chain devices, and that's probably not safe. My mobile phone charger is rated 5 V at 500 mA (so you'd expect the actual draw to be far less), and my headset charger is 5 V at 450 mA, so it's absurd that neither device is USB-chargeable. My music player charger is 2 A at 5 V, but it completely charges the player from dead in an hour or two; if the player could limit its draw, I'm sure it could charge from what USB provides.
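A quick back-of-envelope on that last point - the 1.5-hour figure and the assumption that charging is roughly current-limited are mine, just for illustration:

```python
# If a 2 A charger fills the music player's battery in ~1.5 hours, roughly
# how long would USB's 500 mA limit take? (Assumes current-limited charging
# and ignores taper; purely a ballpark estimate.)
CHARGER_A = 2.0      # wall charger rating (from the post)
CHARGE_HOURS = 1.5   # assumed "hour or two" charge time

capacity_ah = CHARGER_A * CHARGE_HOURS  # implied capacity: 3.0 Ah
usb_hours = capacity_ah / 0.5           # at USB's 500 mA limit

print(usb_hours)  # 6.0 -- slow, but perfectly workable overnight
```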
New way to distribute power.
I for one, having lived through wiring control panels for electric generating stations, gas pumping stations, and the like, found that bundling low and higher voltages together is asking for problems. Higher voltages (110/220) can pick up a lot of spikes from other appliances on the same electrical feed. Secondly, an insulation breakdown due to cable flexing at the distribution bar is a real possibility. Why, a dog or mouse could partially chew through the insulation and allow the high voltage to fry the lower-voltage supplies.
What I would like to see is a return to the days when the power supply was built into the device - that is, the router, speakers, etc., would have only a simple national-standard plug.
Here is what I have, and it is a real dust collector and an eyesore.
On the floor I have a 500-watt UPS with two power bars plugged in. These power the transformers for the cable modem, router, VoIP telephone adapter, and two cordless telephones. On another connector I have the transformer for the speakers. There are both low-voltage and mixed-voltage (9 V, 12 V, etc.) cords around.
Why can we not return to the era of the power supply built into the product, with a plug-in adapter cord? Let me have the plug-in cord so that I could use a 6- or 9-footer. Then I could tie-wrap the power cords together, get rid of a dust trap, and so on. And there would be no 5, 9, or 12 volt wires to get mixed up and dangle from the rear of this varied equipment.
I think adding another single point of failure to a desktop is a bad thing.
The problem with DC is that it doesn't transport well, meaning rectifying to DC is best done right before the device that needs it.
As any laptop user knows, the power supply is a weak link, and when it dies, you're out of luck. It would be nice to tap the DC side of your UPS for any 48 V appliances that you have. However, changing voltage is generally very efficient in AC, not so much in DC.
(Linked here via the 6 head linux box article)
Actually, DC transports very well.
(see http://en.wikipedia.org/wiki/Electric_power_transmission under Losses and HVDC)
The decision to use an AC grid was made before the invention of the transistor, back when DC-to-DC conversion was extremely expensive (think f-ing huge tube oscillators).
Transformers are cheap, but most computers use switching power supplies, and DC-to-DC switching supplies aren't any less efficient than AC-to-DC ones. It's silly to switch 48 volts DC up to a 120 volt sine wave just to switch it back down to DC again.
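To put rough numbers on that point - the efficiency figures below are illustrative assumptions, not measurements:

```python
# Inverting 48 V DC to 120 V AC and then rectifying back down to DC
# multiplies losses at each stage, versus a single DC-DC conversion.
eff_inverter = 0.90   # 48 V DC -> 120 V AC (assumed)
eff_psu      = 0.80   # 120 V AC -> low-voltage DC (assumed typical PSU)
eff_dcdc     = 0.90   # 48 V DC -> low-voltage DC buck converter (assumed)

chain  = eff_inverter * eff_psu  # round trip through AC: 0.72 (28% lost)
direct = eff_dcdc                # direct DC-DC: 0.90 (10% lost)

print(f"inverter chain: {chain:.0%}, direct DC-DC: {direct:.0%}")
```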
Hey, I agree, anonymous. Low-voltage DC supplies have often proven to be poor quality, and some of them can damage the appliances they power.
I was thinking along those lines already around 1990, and in general I agree very much; it's just a major hassle to have zillions of power adaptors and cords lying around. Quite often the power adaptors could be interchanged, but the power outputs and plugs are specialized.
My original plan was scrapped: wiring up an apartment with low-voltage DC is probably not a good idea, since power-hungry applications will quickly cause voltage drops along the wires. However, standardized "power hubs" that can be connected to the wall and feed a desktop computer with all auxiliary equipment - or feed a home entertainment system - or just charge a cellphone - would be a very good idea.
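To see why whole-apartment low-voltage wiring loses, here's a quick resistive-drop estimate - the wire gauge, run length, and load are my illustrative assumptions:

```python
# Voltage drop along a copper run: V = I * R, with R from resistivity,
# round-trip length, and cross-sectional area.
RHO_COPPER = 1.68e-8   # ohm*m, resistivity of copper
AWG14_AREA = 2.08e-6   # m^2, cross-section of 14 AWG household wire

def drop_volts(current_a, length_m, area_m2=AWG14_AREA):
    # Current flows out and back, so the resistive path is 2 * length.
    r = RHO_COPPER * (2 * length_m) / area_m2
    return current_a * r

# A 10 A load (e.g. 120 W at 12 V) at the end of a 15 m run:
print(round(drop_volts(10, 15), 2))  # 2.42 -- a fifth of a 12 V supply gone
```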
In my humble opinion, both fax machines and UPSes are stupid devices; there shouldn't really be any need for them. I believe that any device, computer, or eventually power supply should have a small accumulator (preferably replaceable and standardized) to ride through power glitches. I have one in my laptop; it's primarily there to improve my mobility, but it also serves very well as a built-in UPS. The unfortunate thing is that during power glitches, the Internet connection almost always goes down, because none of the modems/routers/access points have built-in accumulators.
I don't buy the "adding another single point of failure" argument above. With standardized power supplies (hubs) and standardized accumulators - and eventually accumulators both in the power supply and in the device - there would be even fewer single points of failure than in the common setups of today. I've actually never seen a laptop power supply fail, but I have several times seen PC power supplies fail. I believe well-designed power hubs could be interconnected and made to fail over. Also, with better standards and with accumulators on the application side, one could easily replace a broken power hub without causing downtime.
I believe standardized power hubs are a very good idea. Somebody should start a campaign, or start producing them.
Power over Ethernet will use 48 volts. I think it's probably easiest to go with that standard, and then have circuits inside the devices bring the voltage down. You keep the mess down to two wires, and keep people from getting too confused. Just use a different type of plug for the DC.
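Stepping 48 V down inside a device is straightforward: an ideal buck (step-down) converter's output is just its duty cycle times the input. A toy illustration - idealized, with no losses or feedback loop:

```python
# Ideal buck converter: V_out = D * V_in, where D is the switching duty cycle.
# Real converters need feedback regulation and lose a few percent to switching
# and conduction losses; this is just the first-order relationship.
def buck_duty_cycle(v_in, v_out):
    if not 0 < v_out <= v_in:
        raise ValueError("a buck converter can only step down")
    return v_out / v_in

for v_out in (12.0, 5.0, 3.3):
    print(f"48 V -> {v_out} V: duty cycle {buck_duty_cycle(48.0, v_out):.1%}")
```

So one 48 V pair per device, and a cheap switching stage per rail it needs.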
48 V DC is also common in telco installations, and 48 V DC-input PC power supplies (e.g. ATX) are already available. 48 V DC is also one of the common standards for windmill generators and storage systems.
Imagine how much energy could be saved if greenies didn't need to convert from their 48 volt batteries (think multi-kilowatt UPS) up to 120 volt sine waves to power their computers, home electronics, LED lighting, etc.