Last week, the lighting sector formally proposed that the U.S. Environmental Protection Agency's (EPA) Energy Star program support higher color rendering criteria in its new lamp specifications.
At Leapfrog, we agree that a higher standard for LED light quality is important. We've witnessed the market become saturated with far too many subpar lamps, harming the reputation of the technology as a whole—a reputation it doesn't deserve. A lack of standards, or more specifically, a lack of high standards, is at the crux of the problem.
That said, we hope any new color rendering specifications do not slow other performance-based advances.
But before we look at that, let’s examine the justification for the higher color rendering criteria request:
These are all valid reasons for placing increased focus on the quality of light emitted from LED lamps and for ensuring all lamps meet a certain color rendering standard. However, as physics tells us, any increase in Color Rendering Index (CRI) decreases luminous efficacy (one point of CRI costs approximately 2% in lm/W). As such, high CRI can only be achieved at the expense of luminous efficacy.
Source: Limits on the maximum attainable efficiency for solid-state lighting.
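To make the trade-off concrete, here is a minimal sketch of the rule of thumb above. It assumes the ~2% efficacy loss per CRI point compounds per point; the baseline lamp figures (100 lm/W, CRI 80 to 90) are hypothetical examples, not values from the cited source.

```python
def efficacy_after_cri_increase(base_efficacy_lm_per_w, cri_points_added,
                                loss_per_point=0.02):
    """Estimate luminous efficacy after raising CRI by `cri_points_added`
    points, applying a compounding ~2% loss per point (an assumed model
    of the rule of thumb, for illustration only)."""
    return base_efficacy_lm_per_w * (1 - loss_per_point) ** cri_points_added

# Hypothetical example: a 100 lm/W lamp moving from CRI 80 to CRI 90
print(round(efficacy_after_cri_increase(100, 10), 1))  # -> 81.7
```

Under this rough model, a ten-point CRI improvement costs nearly a fifth of the lamp's efficacy—which is why the specification debate matters.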
So to increase the adoption of LED lighting, light quality—rated highly important in the purchase decision—must be improved by introducing higher color rendering criteria. But the present path of development and the related science suggest that luminous efficacy will be negatively affected.
Does this boil down to an argument of energy efficiency and cost savings vs. attractive emitted light? Which path do you feel is more important to explore? Or does it simply place greater pressure on manufacturers to research and develop better solid-state lighting technologies?