LED Lighting - Electrical switch wiring on retrofit
I have undertaken a project to retrofit my AquaOne 120 tank (4 foot, 90 G), which has a built-in hood light. Previously the hood held 3x 36" (3 foot) T8 bulbs. I am upgrading those to 3x 12-LED strips mounted on aluminium channel.
While doing the wiring of the hood I realised that the switches for the T8 globes (and fluoro drivers) are Single Pole - Single Throw (SPST) switches.
What this means is that in the AC circuit the Neutral wire (blue in Australia) goes directly to the fluoro driver, while the Live wire (brown) goes through a single switch and then to the driver. When the switch is thrown, it only disconnects the one Live wire.
In this retrofit I am removing the fluoro tubes and drivers altogether and replacing them with 3x Meanwell 60N-48 LED drivers. I was going to reuse the same circuit to wire the drivers in; however, on closer inspection I realised I may have hit a problem.
Since the switches are SPST, I am wondering whether that may cause an issue for the Meanwell LED drivers. The switches were fine for the fluoro setup (as supplied by the manufacturer), but I'm not sure whether they'll be fine for an LED driver setup.
Basically: can I just disconnect power to one lead of an LED driver (specifically a Meanwell) and it will break the circuit, or will this cause issues, meaning I need to upgrade to Double Pole - Single Throw (DPST) switches that disconnect both wires (Neutral and Live)?
Attached is a crummy MSPaint picture of my problem. I hope that someone can provide some information as Google hasn't really returned any help.
(in case my picture uploading doesn't work I have put the image online at http://i.imgur.com/IOUD6.jpg)
FYI: I'm not an electrician, nor am I from Australia. However, I did some reading to try and help you out and came across a discussion on another forum. The OP there was advised to NEVER wire LEDs in parallel: as an LED gets hotter it draws more current, and when it's cooler, less, so the hotter bulb "steals" current from the others, leaving them with less. The hot bulb draws more and more as it heats up, running even harder and burning out more quickly, which in turn lets the other bulbs draw more, and, well, you get the picture. He was also told to run one LED driver capable of supporting all the bulbs, i.e. one driver sized for the number of bulbs multiplied by their wattage. That OP happened to be wiring a set of three LEDs as well, so it seemed relevant enough.
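To make that current-hogging argument concrete, here's a toy numerical sketch. Every coefficient in it is a made-up illustrative assumption, not a datasheet value: two parallel strings share a fixed driver current, the hotter string's forward voltage drops, so it takes a bigger share, heats further, and the imbalance runs away.

```python
# Toy model of current hogging between two parallel LED strings sharing one
# driver. ALL coefficients are illustrative assumptions, not datasheet values.
TOTAL_CURRENT = 1.0    # A, fixed total from the shared driver (assumed)
VF_NOMINAL = 38.4      # V, string forward voltage at 25 degC (assumed)
TEMPCO = -0.024        # V/degC for the whole string (~ -2 mV/degC per LED)
R_SERIES = 0.3         # ohm, effective series resistance per string (assumed)
HEATING = 40.0         # degC of temperature rise per amp (assumed)

def split_current(temp_a, temp_b):
    """Split TOTAL_CURRENT so both strings sit at the same terminal voltage."""
    vf_a = VF_NOMINAL + TEMPCO * (temp_a - 25.0)
    vf_b = VF_NOMINAL + TEMPCO * (temp_b - 25.0)
    # Equal terminal voltage: vf_a + ia*R == vf_b + ib*R, with ia + ib fixed.
    ia = (TOTAL_CURRENT * R_SERIES + vf_b - vf_a) / (2.0 * R_SERIES)
    ia = max(0.0, min(TOTAL_CURRENT, ia))  # clamp to physical limits
    return ia, TOTAL_CURRENT - ia

temp_a, temp_b = 26.0, 25.0  # string A starts just 1 degC warmer
for step in range(5):
    ia, ib = split_current(temp_a, temp_b)
    temp_a = 25.0 + HEATING * ia  # each string settles at a temperature
    temp_b = 25.0 + HEATING * ib  # proportional to its own current
    print(f"step {step}: Ia = {ia:.2f} A, Ib = {ib:.2f} A")
```

In this sketch the string that starts 1 degC warmer ends up taking essentially all of the shared current within a few iterations. Note this feedback loop only exists when strings share one driver; with one driver per string (as in your setup) each string's current is set independently.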
I believe the DPST switch is unnecessary; current needs a complete loop either way, so opening the Live conductor alone stops it.
Thanks for the reply. My diagram is incomplete (and, as I mentioned, pretty crappy). What I left off was the LED driver wiring to the LED circuit; I left the red and black DC + and - wires there to show what I'd be doing. I am going to run them all in series: three channels (with 12 LEDs each), each wired in series.
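As a quick sanity check on that series arrangement (assuming a typical ~3.2 V forward voltage for white power LEDs; that figure isn't stated in this thread, so substitute the value from your LED datasheet):

```python
# Forward-voltage budget for one 12-LED series string on a 48 V driver.
# The 3.2 V per-LED figure is an ASSUMED typical value for white power
# LEDs -- check the datasheet for the actual parts used.
VF_PER_LED = 3.2       # V, assumed forward voltage per LED
LEDS_PER_STRING = 12   # LEDs per channel, all in series
DRIVER_VOLTAGE = 48.0  # V, driver output rating

string_vf = VF_PER_LED * LEDS_PER_STRING
print(f"String forward voltage: {string_vf:.1f} V")   # 38.4 V
print(f"Within the 48 V driver's range: {string_vf <= DRIVER_VOLTAGE}")
```

Under that assumption the string's total forward voltage sits comfortably below the driver's 48 V output, which is consistent with your test-run of each channel working fine.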
The only thing wired in parallel is the LED drivers themselves, but they have mechanisms to deal with that (fuses etc.), so that really isn't an issue with this diagram.
The problem you mention I have encountered in discussions before. There is an issue when you wire LEDs in parallel from one power source: if one LED blows, the remaining parallel branches see increased voltage/current, which can overheat them and cause serious problems for the LEDs and their surrounds. However, my LEDs are just strings of lights wired in series, each to its own individual driver.
I've already wired up my LEDs and tested them a while ago, and they work fine. I only powered one LED driver direct from mains power and then swapped through the other channels. It's really just the isolation circuit (switch) that I was interested in. Since AC alternates, I was wondering whether breaking one side of it (via the SPST switch on the Live conductor) would "shut down" the circuit, or whether you need to break both sides (a DPST switch on Neutral and Live) to fully open it.