3 of 3 people found this helpful
The length of the leads themselves probably won't have a direct impact, since leads are usually short (a few cm). On the other hand, the distance/length of the wire run (which is what most likely affects the outcome) definitely has an effect on current and voltage. Resistance is directly proportional to length: as the length increases, resistance increases, and as a result current decreases. The resulting loss of voltage across the wire is known as voltage drop.
There is a good reason why wire length is limited for certain applications. For example, Cat5e network cable has a distance limit of 100 meters, because at greater distances voltage drop, noise, and other factors can significantly degrade signal quality. Since you are trying to light some LEDs, the best you can do is measure the voltage at the far end and compensate, or carry a higher voltage and put a voltage regulator at the end (as part of the LED circuit) to produce a clean, stable voltage for your LEDs.
5 of 5 people found this helpful
You can use Ohm's Law to calculate how well the Cat5 cable would work along the lines of what Luis has described.
- Determine the resistance in the cable - Wikipedia gives DC loop resistance of Cat5 as less than 0.188 ohms/meter.
- Determine the LED current - A typical 5mm through hole LED indicator might have a maximum of 20 mA.
- Determine the forward voltage of the LED - a typical yellow LED might be ~2.1 V, a red maybe ~1.5 V. The voltage at the LED must be at least this high.
- Determine how long the cable will be - let's say 100 m.
Assuming only one LED is lit at a time, then V = IR and the voltage drop will be (0.188 ohms/m) * 100 m * 0.02 A = 0.376 V - call it 0.4 V.
Then you would need a voltage source at least equal to the forward voltage plus the drop in the cable: 2.1 V + 0.4 V = 2.5 V for the yellow LED. In other words, you should be able to get it to work with a 3 V or 5 V source. You would then put current-limiting resistors in series with the LEDs to keep the current in check. The current-limiting calculation is R = (supply voltage - forward voltage) / current; remember that the cable resistance is part of this loop, and a bit of a safety factor is always a good idea. Or just put, say, a 330 ohm resistor on each LED for 5 V or less. Or use a constant-current source if you want to be fancy.
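The steps above can be sketched as a short calculation. The numbers are the ones assumed in this thread (Cat5 worst-case loop resistance, a 20 mA yellow LED, a 100 m run); the 5 V supply is an assumption for illustration:

```python
# Worked example of the cable voltage-drop and resistor sizing described above.
# All values are assumptions from this thread, not measured data.
loop_resistance_per_m = 0.188   # ohms/m, Cat5 DC loop resistance (worst case)
cable_length_m = 100            # length of the cable run
led_current_a = 0.020           # 20 mA, typical indicator LED
vf_yellow = 2.1                 # approximate forward voltage, yellow LED
supply_v = 5.0                  # assumed supply voltage

# "Loop" resistance already counts both conductors, so no factor of 2.
cable_r = loop_resistance_per_m * cable_length_m        # 18.8 ohms
cable_drop = cable_r * led_current_a                    # V = I * R

# Minimum supply needed just to reach the LED's forward voltage:
min_supply = vf_yellow + cable_drop

# Series resistor to set the current at the chosen supply,
# subtracting the cable resistance since it is part of the same loop:
r_limit = (supply_v - vf_yellow) / led_current_a - cable_r

print(f"cable drop:    {cable_drop:.3f} V")
print(f"minimum supply: {min_supply:.2f} V")
print(f"series resistor: {r_limit:.0f} ohms")
```

In practice you would round the resistor up to the next common value, which also gives you the safety margin mentioned above.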
The assumptions I used are there to show how the calculation is made. If you were to use high-power LEDs, for example, things would change. All of the assumptions should be checked against the actual parts when ordering if you are not sure.
4 of 4 people found this helpful
Frank is right. Cat5 usually uses 24-gauge wire, which is about 26 ohms per 1000 feet per conductor.
If you are using anything over 200 ohms as a current-limiting resistor for your LEDs, the apparent brightness difference due to 1000 feet of cable will be negligible.
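A quick sanity check of that claim, using a 200 ohm resistor, a 5 V supply, and an assumed 2 V LED forward voltage (illustrative numbers, not datasheet values):

```python
# Rough check that 1000 ft of Cat5 barely changes LED current.
# Supply, forward voltage, and resistor values are assumptions.
supply_v = 5.0
vf = 2.0                     # assumed LED forward voltage
r_limit = 200.0              # current-limiting resistor, ohms
r_cable = 2 * 26.0           # out and back: ~26 ohms per 1000 ft per conductor

i_short = (supply_v - vf) / r_limit              # current with negligible cable
i_long = (supply_v - vf) / (r_limit + r_cable)   # current through 1000 ft of cable

print(f"{i_short * 1000:.1f} mA vs {i_long * 1000:.1f} mA")
```

The current drops by roughly a fifth, but perceived LED brightness varies much less than linearly with current, so the difference is hard to see by eye.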
1 of 1 people found this helpful
The main consideration should be the forward current specification of your selected LEDs.
You didn't mention your available bias supply, but if you monitor the current from each DC supply and adjust it to a brightness level that stays in spec, the length of everything becomes moot.
What impact does the length of the leads have on an LED? I'm working with a live multi-camera system and want to send two LEDs to a remote camera operator (Tally Lights): yellow for "On Preview" and red for "On Air". I was thinking of using Cat5 cable at varying lengths, with three leads in total (1 ground and the two +). What do I need to consider?
Sorry if this is in the wrong location; I'm not sure where it would go.