The Planted Tank Forum banner

1 - 5 of 5 Posts

Registered · 308 Posts · Discussion Starter · #1
I have a laptop power supply that puts out 16 volts DC at 2.2 amps. I understand that I can run five 3-volt LEDs in series, but my question is whether the oversupply of amps will create a problem. The LEDs (XT-E Royal Blue) are rated at 350 mA, while my power supply can deliver 2.2 amps (2,200 mA). Will this create a heat problem, or will the LEDs only draw as much current as they require? In other words, do I need to step down the amps, or look for a power supply that puts out 350 mA or thereabouts?

Sorry for the basic question but this is all new to me. From the research I have done I have only seen discussions about how to step down voltages to avoid blowing LEDs.
 

Registered · 354 Posts
You shouldn't be powering LEDs from a computer power supply. A power supply like that is a constant-voltage source: no matter how much current you draw (up to the power supply's limit), the voltage stays constant.

What you need for LEDs is an LED driver, which is a constant-current source: no matter how much voltage the LEDs need, the current stays constant.

If you insist on using a computer power supply, you'll need some sort of current limiter, like a large series resistor. If you go this route, the resistor usually ends up dissipating a significant amount of power, so you have to worry about getting rid of a lot of heat.

It's best to buy an LED driver. LED drivers can also be dimmable, something you can't do with a computer power supply.
 

Registered · 25 Posts
LEDs are current-driven devices: each one drops the voltage it needs (its forward voltage). XT-E LEDs can safely run at up to 1,500 mA, so that supply's 2.2 A is more current than a single string needs unless you wire your LEDs in parallel. Ideally you want a constant-current driver to power LEDs. The current determines how much power the LED draws and how much heat and light it puts out. At a ~3 V forward voltage, every 350 mA of current works out to about 1 watt of power.
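The 350 mA-per-watt rule of thumb above is just P = Vf × I; a minimal sketch, assuming the ~3 V forward voltage typical of an XT-E at 350 mA (the real value varies per part and with temperature):

```python
# Power drawn by one LED: P = Vf * I.
# Assumes a ~3 V forward voltage, the datasheet's typical figure,
# not a guaranteed value.
forward_voltage_v = 3.0
drive_current_a = 0.350

power_w = forward_voltage_v * drive_current_a
print(f"{power_w:.2f} W per LED at 350 mA")  # prints "1.05 W per LED at 350 mA"
```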
 

Registered · 3,350 Posts
Assuming you insist on using this supply, and your LEDs' forward voltage at 350 mA really is 3 V, then:

Three LEDs × 3 V forward voltage in series = 9 V total forward voltage. (All five in series would total 15 V, leaving only 1 V of headroom; with that small a resistor drop, normal forward-voltage variation would swing the current wildly, so a string of three is the practical choice here.)

Your power supply provides 16V - 9V = 7V excess.

To provide current limiting, you need a 7 V / 0.350 A = 20 Ω resistor in series with the LEDs.

The resistor will waste 7 V × 0.350 A = 2.45 watts of excess power as heat, or about 44% of the total power. I'd round that up to at least 3 W so you're not running it at the limit. Higher wattage is fine, and is actually desirable (I'll explain later). It will be hot to the touch; that's perfectly normal.
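The arithmetic above can be checked with a short script (values from this thread: a 16 V supply, three LEDs at a nominal 3 V forward voltage, 350 mA target current):

```python
# Series-resistor sizing for a string of LEDs on a constant-voltage supply.
supply_v = 16.0
led_count = 3
forward_voltage_v = 3.0   # nominal per-LED drop; real parts vary
target_current_a = 0.350

string_v = led_count * forward_voltage_v        # 9 V across the LEDs
excess_v = supply_v - string_v                  # 7 V to drop in the resistor
resistor_ohms = excess_v / target_current_a     # ~20 ohms
resistor_power_w = excess_v * target_current_a  # ~2.45 W wasted as heat
wasted_fraction = resistor_power_w / (supply_v * target_current_a)  # ~0.44

print(f"{resistor_ohms:.0f} ohm, {resistor_power_w:.2f} W, "
      f"{wasted_fraction:.0%} of total power wasted")
```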

The actual current through the LEDs will NOT be exactly 350 mA, but somewhere in that region, since the forward voltage is never exactly what the datasheet states and also varies with temperature.

Should a single LED become shorted by a failure or wiring error, the total forward voltage changes and the resistor will no longer be the proper value. The new current will be (16 − 6) / 20 = 500 mA. That's still an acceptable safety margin in your case, at least for the LEDs; the resistor, however, will now be dissipating 5 W, and if not rated for at least that, it can get dangerously hot.
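The shorted-LED case above is the same Ohm's-law calculation with one forward drop removed; a quick sketch under the same assumed values:

```python
# Fault case: one of three nominally 3 V LEDs shorts,
# leaving only 6 V of forward drop across the string.
supply_v = 16.0
remaining_string_v = 2 * 3.0  # two surviving LEDs, ~3 V each
resistor_ohms = 20.0          # sized for the healthy 9 V string

fault_current_a = (supply_v - remaining_string_v) / resistor_ohms
fault_resistor_power_w = (supply_v - remaining_string_v) * fault_current_a

print(fault_current_a, fault_resistor_power_w)  # prints "0.5 5.0"
```

Note how the resistor's dissipation doubles from 2.45 W to 5 W, which is why a generously over-rated resistor is desirable.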

So it's doable, although inelegant and hackish, for a relatively low-power LED system like yours. With higher-power lighting, the wasted power and heat quickly become excessive; plus, one shorted LED can cause the rest to fail if they're run closer to their limits.
 