Originally Posted by michigan400
+1, good question.
Edit - Just saw your response, wadester. So if more light requires more amps, then more amps at the same voltage (12 V, I'm assuming) means more watts consumed. How can one change and not the other two? If two lights use the same LEDs, have the same output in lumens, and run at the same voltage, how can they not consume the same number of watts?
I'm not trying to be a smartass or anything; I'm just curious as well, because from my understanding something sounds off. I'm no expert by any definition, so the problem could very well be in my understanding of it.
Part of that is how you read light-emission specs. Does either light manufacturer actually have test reports for the light emission of its own device, or are they just quoting the LED manufacturer's output specs? I haven't pulled the spec sheets for those LEDs, but I bet they say "900 lumen output" on them. You want to claim something else? Pay a lab to rate your device, with enough samples to have real statistical validity.
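For what it's worth, the electrical side of the question is just arithmetic: watts = volts × amps, and efficacy is lumens per watt. A minimal sketch with made-up example numbers (neither light's real figures) shows how two lights can claim the same lumens at the same voltage yet draw different watts, if their efficacies actually differ:

```python
# Example numbers only, not measured values for either light.
def watts(volts, amps):
    # Electrical power consumed: P = V * I
    return volts * amps

def efficacy(lumens, power_w):
    # Luminous efficacy in lumens per watt
    return lumens / power_w

light_a = watts(12.0, 1.5)   # 18 W at 12 V, 1.5 A
light_b = watts(12.0, 2.0)   # 24 W at 12 V, 2.0 A

# If both truly emit 900 lm, their efficacies must differ:
print(efficacy(900, light_a))  # 50.0 lm/W
print(efficacy(900, light_b))  # 37.5 lm/W
```

In other words: same LEDs, same lumens, same voltage, different amps is only possible if the two lights convert power to light at different efficiencies (drive current, drivers, and thermal losses all matter) or if one of the lumen claims is just the LED datasheet number rather than a measured output.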