This is kinda what I meant... you said it in a way that's much easier to understand.
And yes, I was implying current along with voltage.
And I'm fairly sure you do drive LEDs with voltage as well. Watts / Current = Volts. You can't say LEDs are not driven by voltage. :P
DC circuits only draw the current they need... if you have a PSU rated for 100 A and a DC circuit that draws only 1 mA, the PSU will put out 1 mA into the circuit.
It's like a PC with a 1000 W PSU running an Atom CPU... it's not going to tax the 1000 W PSU to any degree.
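To put numbers on the 100 A / 1 mA point, here's a quick Python sketch (the 12 V supply and 12 kΩ load are just made-up example values):

```python
# The load, not the supply rating, sets the current (Ohm's law: I = V / R).
supply_voltage = 12.0       # volts (example value)
supply_max_current = 100.0  # amps the PSU *could* deliver (rating only)

load_resistance = 12_000.0  # ohms -> a load that draws 1 mA at 12 V

current_drawn = supply_voltage / load_resistance  # amps actually flowing
power_drawn = supply_voltage * current_drawn      # watts actually used

print(f"Current drawn: {current_drawn * 1000:.1f} mA "
      f"(of {supply_max_current:.0f} A available)")
print(f"Power drawn: {power_drawn:.3f} W")
```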
You can put a potentiometer or a rheobus in the circuit to lower the voltage and hence the LED's brightness, as you say.
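For a single LED, that usually means a series resistor (or pot) sized with R = (Vsupply - Vf) / I. A rough sketch, with illustrative values rather than anything from a real datasheet:

```python
# Sizing a series resistor for one LED: R = (V_supply - V_f) / I_target.
# Values below are illustrative, not from any specific LED datasheet.
v_supply = 5.0    # volts
v_forward = 2.0   # roughly a typical forward drop for a red LED
i_target = 0.015  # 15 mA target current

r_series = (v_supply - v_forward) / i_target
p_resistor = (v_supply - v_forward) * i_target  # power the resistor burns

print(f"Series resistance: {r_series:.0f} ohms")          # ~200 ohms
print(f"Resistor dissipation: {p_resistor * 1000:.0f} mW")
```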
Like Hoppy said, it's not that simple. What you are not factoring in is voltage drift due to die heating, voltage differences due to binning, etc. It's not nearly as simple as saying that an LED runs at X voltage at Y current, so if I apply X voltage, I should see Y current. It's never that simple. LEDs are not simple resistive devices like a light bulb. In fact, their dynamic resistance drops as the current rises, which complicates things considerably and makes generalizations like yours inaccurate.
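To put a rough number on how steep that curve is, here's a sketch using the ideal diode (Shockley) equation. The saturation current and ideality factor are invented for illustration, not taken from any datasheet, but the shape of the curve is the point:

```python
import math

# Ideal diode (Shockley) equation: I = I_s * (exp(V / (n * V_t)) - 1).
# I_S and N below are invented for illustration only.
I_S = 1e-12    # saturation current, amps (illustrative)
N = 2.0        # ideality factor (illustrative)
V_T = 0.02585  # thermal voltage at ~25 C, volts

def led_current(v_forward):
    return I_S * (math.exp(v_forward / (N * V_T)) - 1)

# A ~5% change in forward voltage swings the current enormously:
for vf in (1.10, 1.15, 1.20):
    print(f"Vf = {vf:.2f} V -> I = {led_current(vf) * 1000:.2f} mA")
```

And that's before temperature: as the die heats up, the whole curve shifts, so the current at a fixed voltage drifts on its own.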
Another issue comes up when you start to run many LEDs in series. Each LED will differ slightly in the voltage drop it has at a given current. If you drive by current, the forward voltage becomes irrelevant, provided the driver used is capable of supplying the necessary voltage, and each LED produces the same light output. In a voltage-driven setup (say, LEDs or strings in parallel across a fixed voltage), the voltage across each LED is identical, but the current through each LED will not be, and as a result you get differences in light output and in heat generated. You may think this can't happen due to Ohm's law, but remember that the resistance internal to the LED is dynamic, and that complicates things.
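Here's the same toy diode model showing that mismatch. The per-LED parameter spread below is invented to mimic binning differences, so treat the exact numbers as illustrative:

```python
import math

V_T = 0.02585  # thermal voltage, volts
N = 2.0        # ideality factor (illustrative)

# Three "binned" LEDs: same part, slightly different saturation currents.
# These I_s values are invented to create a plausible-looking Vf spread.
saturation_currents = [1.0e-12, 1.5e-12, 2.2e-12]

def current_at(vf, i_s):
    return i_s * (math.exp(vf / (N * V_T)) - 1)

def vf_at(i, i_s):
    # Invert the diode equation: forward voltage at a given current.
    return N * V_T * math.log(i / i_s + 1)

# Voltage drive: the same Vf forced across all three in parallel.
vf = 1.15
print(f"Voltage driven at {vf} V:")
for i_s in saturation_currents:
    print(f"  I = {current_at(vf, i_s) * 1000:.2f} mA")  # big spread

# Current drive: the same 20 mA forced through all three in series.
print("Current driven at 20 mA:")
for i_s in saturation_currents:
    print(f"  Vf = {vf_at(0.020, i_s):.3f} V (same light output)")
```

With the same voltage, the "worst" LED in this toy spread carries more than double the current of the "best" one; with the same current, the Vf values differ by a few tens of millivolts but the light output matches.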
I'm not trying to bash on you. Just trying to educate.
Keeping LEDs alive has to do with how cool you can keep them, because heat is the ultimate killer in electronics.
Also, how bright you run them will affect the life of the LED (voltage migration... meaning as you run current, you slowly etch away at the circuit's life), on top of how well your PSU is built.
Hence you're absolutely correct in what you mean by tightly regulated current.
I do remember that using bad PSUs to test LEDs was a quick way to burn them out...
I think what you are referring to is current creep, which is a phenomenon associated with over-volting CPUs. LEDs do not suffer from this; current creep is due to the fact that the traces and gates in the CPU die are so close together that you can induce a voltage on one trace/gate by turning on an adjacent one. This can damage the isolation in the CPU.
Damage done to an LED by heat is just that: heat damage, not current alone. In theory, as long as the die can be kept cool enough and the bond wires in the LED can handle the current (and not act as a fuse), you can pump as much current into the LED as you like. Flashlight guys have been doing this for years.
Heat damage to an LED can happen at any current. Take a Cree XR-E, for example. What do you think is worse: a 160 C die temperature at 100 mA, or an 80 C die temperature at 1200 mA? 1200 mA is out of spec for an XR-E, but with a much lower die temp (really good heatsinking) it will probably make 30K-50K hours. 100 mA is well within the LED's spec, but that die temperature is way out of spec and will dramatically shorten the life of the LED as a result.
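The arithmetic behind that is just thermal resistance: Tj = Tsolder + P x Rth. The numbers below are rough illustrations (the Rth is only in the ballpark quoted for an XR-E, and the forward voltages are guesses):

```python
# Die (junction) temperature: T_j = T_solder + P * R_th(junction -> solder).
# Numbers are illustrative approximations, not exact datasheet values.
R_TH_J_SP = 8.0  # C/W, roughly the range quoted for a Cree XR-E

def junction_temp(t_solder_point, i_forward, v_forward):
    power = i_forward * v_forward  # watts dissipated (ignores light output)
    return t_solder_point + power * R_TH_J_SP

# Poor heatsinking at a tame current vs. good heatsinking at a hard drive:
print(f"100 mA, 150 C at the solder point: "
      f"Tj = {junction_temp(150, 0.100, 3.3):.0f} C")  # die way out of spec
print(f"1200 mA, 45 C at the solder point: "
      f"Tj = {junction_temp(45, 1.200, 3.9):.0f} C")   # hot drive, cool die
```

The second case burns over ten times the power, yet the die runs roughly half the temperature, because almost all of the die temperature comes from how well the heat gets out, not from the current by itself.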
So, after all of this: yes, you can drive an LED by voltage, but you shouldn't. Beyond the fact that LEDs are far more stable when driven by current, trying to adjust the forward voltage over a usable range with basic devices like potentiometers is difficult. It's not so hard when you are driving large series strings of LEDs, where the total voltage change from minimum to maximum current is larger, but when you are trying to adjust less than a volt for even 3 LEDs, it can be a major pain.
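To see how narrow that window is, here's the toy model one more time, computing the total Vf change for a 3-LED series string from a low to a high drive current. The parameters are invented as before, so the absolute voltages come out lower than a real white LED's; the sub-volt adjustment range is the point:

```python
import math

V_T = 0.02585  # thermal voltage, volts
N = 2.0        # ideality factor (illustrative)
I_S = 1e-12    # saturation current, amps (illustrative, as before)

def vf(i):
    return N * V_T * math.log(i / I_S + 1)

# Forward voltage for a 3-LED series string at minimum vs. maximum drive:
v_string_min = 3 * vf(0.005)  # 5 mA
v_string_max = 3 * vf(0.700)  # 700 mA

print(f"String Vf at 5 mA:   {v_string_min:.2f} V")
print(f"String Vf at 700 mA: {v_string_max:.2f} V")
print(f"Whole usable range is only {v_string_max - v_string_min:.2f} V wide")
```

The entire dimming range from barely-on to flat-out lives inside well under a volt, and that narrow window also drifts with temperature, which is exactly why a pot on the voltage is such a fiddly way to do it.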