Mel Brooks’ 2000 Year Old Man used torches in his cave, but today’s lighting is a tad more sophisticated, having gone through half a dozen stages, each producing more light from less energy, i.e., greater efficacy, than its predecessor.
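For reference, efficacy is simply light output divided by power drawn. As a rough, illustrative calculation (the numbers are approximate), a typical 60-watt incandescent bulb emitting about 800 lumens works out to

\[ \text{efficacy} = \frac{\text{luminous flux}}{\text{power}} \approx \frac{800\ \text{lumens}}{60\ \text{watts}} \approx 13\ \text{lumens/watt}. \]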
Torches made of moss and animal fat, along with crude oil lamps, were the mainstays of indoor lighting until the processing of animal fat into wax gave us the candle in roughly 3000 BCE (efficacy: about 0.1 lumens/watt). That option worked for much of early recorded European history, but during the Islamic Golden Age (roughly 900-1000 CE) a Persian physician refined kerosene from crude oil and used it in the first manufactured oil lamps and the first street lights, in Córdoba in Andalusia, now part of Spain.
It wasn’t until the late 1700s that European oil lamps (at ~0.3 lumens/watt) became widely available and accepted, thanks to improvements in design and the whaling industry’s ability to produce sperm oil. That refined product burned cleanly, didn’t smell too bad, and was relatively cheap compared to commercially made candles.
The Industrial Revolution in 19th-century Europe gave us dynamo-based electricity and the first carbon arc lamps (at ~2-4 lumens/watt). They were used mostly in open areas such as parks, streets, large industrial spaces, and rail yards. Arc lights didn’t get much traction in the US until much later, due to the lack of an electric distribution system; for most of that century, gas and kerosene lamps dominated.
In the early 1800s, gaslight (initially at less than 1 lumen/watt), fueled by coal gas or natural gas from mines and wells, was relatively common in urban England. Its use expanded rapidly after the development of the incandescent gas mantle around 1890. That device, a fabric mantle impregnated with thorium and cerium compounds, converted more of the gas flame’s heat into white light and more than doubled the efficacy of gaslighting (to 2 lumens/watt). Gaslight then cost about a quarter as much as candles and less than kerosene, and the resulting demand eventually led to the creation of the first utilities. Gaslight dominated for several decades (through World War I, for those with access to it) until electricity became available and gradually supplanted both gas and kerosene lighting.
As the 20th century approached, Edison created his electric incandescent lamp (1.4 lumens/watt), and the power industry adopted it as a standard. During this period, Nikola Tesla and others demonstrated early forms of fluorescent lighting, in which gases in a tube glowed when electrically charged, and efficacy began to exceed 20 lumens/watt. It wasn’t until 1926, however, that the first fluorescent lamp received a patent, and not until after World War II that commercial fluorescents (~60-70 lumens/watt) began to supplant incandescents.
Rapid postwar technological development brought still other forms of lighting, including the first diode to emit visible red light, the Light Emitting Diode (LED), in 1962. Mercury vapor and metal halide lamps came to market in the late ’60s (at 50-100 lumens/watt), and sodium vapor lamps in the ’70s, with efficacies up to 180. Although more efficient, several of these sources, especially sodium, produced light that distorted colors and was far from white.
Compact fluorescents with built-in ballasts came to market in 1981 (50 lumens/watt), the first white high-pressure sodium lamp (the Philips white SON) in 1985, and the first commercially available electronic ballast in 1987. The early ’90s saw an explosion of new and ever more efficient light sources: electrodeless fluorescent (induction) lighting in 1991; the first ceramic metal halide (CMH) lamp in 1992; and, in 1994, the first T5 fluorescents and sulfur lamps. Efficacies for white sources were now routinely exceeding 70 lumens/watt and approaching 100.
But the lighting revolution was just getting started. After almost 30 years of development, the first high-brightness blue LEDs, and soon after them phosphor-converted white LEDs, appeared in the mid-1990s. As costs dropped and efficacy rapidly improved, they began to supplant sources that had ruled some types of fixtures for decades. In only a few years, tritium lighting, which used a slightly radioactive glowing gas and was common in exit signs, essentially bit the dust, even though it consumed no power for illumination.
In the 21st century, the competition among sources continues. Fluorescent lamps, powered by electronic ballasts and operating in computer-designed fixtures, are delivering greater efficiency and longevity, giving LEDs a run for their money. High-output fluorescents are pushing metal halide aside, just as induction lighting competes with LEDs for street lighting. Efficacies near 100 lumens/watt are now considered the norm, with LEDs having a theoretical potential exceeding 200.
The biggest loser in all this competition is the incandescent lamp, which ruled downlights and table lamps for almost a century. Screw-in LED units, while still relatively expensive, are quickly supplanting Edison’s venerable invention. Through international competition, government regulation, and various financial incentives, incandescence is fast becoming the “whale oil” of our time. During its heyday, however, many billions of Edison’s bulbs were produced and used by billions of people, making it the king of indoor lighting for decades. To paraphrase King Louis XVI in History of the World, Part I: “It was good to be the King.”