The summer solstice is now behind us. The sunlight fades, the flowers droop, the darkness creeps a little closer with each passing day. Do you feel the chill in the air? Winter is coming.

No need to be glum about it, though. Our early ancestors conquered the darkness roughly half a million years ago, when they learned how to control fire. The light was flickering and dim, yes, but the taming of fire meant that the night was finally less dark, less full of terrors.

It wasn't easy. The eternally optimistic data nerds at the libertarian Cato Institute's HumanProgress project recently highlighted a fun solstice factoid: Back in the prehistoric era, a person would have had to gather, chop and burn wood for roughly 10 hours a day for six days straight to produce the equivalent of a modern lightbulb shining for about an hour.

Today, the same amount of labor could light a room for over 50 years.
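The arithmetic behind that comparison is simple enough to sketch. A rough back-of-the-envelope version, using only the figures quoted above (the one-hour and 50-year numbers are the article's; the improvement factor is just their ratio):

```python
# Prehistoric firewood: roughly 10 hours of labor a day, six days straight...
labor_hours = 10 * 6                 # 60 hours of gathering, chopping and burning
prehistoric_light_hours = 1          # ...bought about one modern bulb-hour of light

# Today, the same 60 hours of labor buys over 50 years of light.
modern_light_hours = 50 * 365 * 24   # 438,000 bulb-hours, as a conservative floor

improvement = modern_light_hours / prehistoric_light_hours
print(f"{labor_hours} labor-hours then: ~{prehistoric_light_hours} hour of light")
print(f"{labor_hours} labor-hours now: {modern_light_hours:,}+ hours of light")
print(f"Improvement: very roughly {improvement:,.0f}x")
```

Treating "50 years" as continuous burning is an assumption for illustration; however you slice it, the gain is in the hundreds of thousands.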

Those figures are courtesy of a fascinating 1994 paper by Yale economist William Nordhaus. He was trying to construct a measure that could compare standards of living across radically different time periods — say, the Neolithic era and today.

He settled on lighting as a way to do that. The archaeological and historical records paint a fairly complete picture of lighting technologies over the millennia. Pick a standard quantity of light output, calculate how much labor it would take to produce that much light with the technology of each era, and voilà: you've got a fairly robust, comparable metric of quality of life stretching back millennia.

The first major improvements over open fires were, in Nordhaus's telling, oil-burning lanterns. Around the time of the Babylonian empire, circa 1750 B.C., 60 hours of labor could buy the equivalent of 88 minutes of today's light. Put another way, "a rough calculation indicates that an hour's work today will buy 300,000 times as much illumination as could be bought in early Babylonia," Nordhaus wrote.

Then along came candles, which dominated the interior lighting landscape from the Greco-Roman era to the 19th century. Around the year 1800, you could get about 10 hours of modern-equivalent lighting from animal fat candles for 60 hours of labor. Not too shabby, if you didn't mind the smell of burning animal byproducts.

Around this time, none other than George Washington estimated that the cost of burning a single candle for five hours each night worked out to about 8 British pounds a year, or well over $1,000 in current dollars.

Then the Industrial Revolution brought with it a revolution in lighting. The first advance was gas-powered street lighting, which appeared in London around 1807. Sixty hours of labor would net you 16 hours of lighting. Not bad, if you didn't mind the risk of explosion — as D.C. residents learned in November 1898 after an incident at the U.S. Capitol.

That explosion happened at a turning point in lighting history, not long after the introduction of Thomas Edison's incandescent electric bulbs around 1880. These were far more efficient than earlier lighting methods — 60 hours of work would translate to 72 hours of lighting, nearly a five-fold efficiency increase over early gas lights.

Lighting efficiency improved exponentially in a short period of time, particularly with the introduction of fluorescent lamps. By 1950, the technology had progressed to the point that 60 hours of labor would light a bulb for a whopping 28,723 hours, or nearly 1,200 days.

By 1994, at the time of Nordhaus's paper, the new hotness on the market was the CFL, or compact fluorescent. For 60 hours of labor, your typical short shorts-wearing cool ’90s dude could light a bulb for over 51 years, fanny pack not included.
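Pulling the era-by-era figures above into one place makes the trend easy to check. A quick sketch, using only the numbers quoted in this article (88 minutes and 51 years converted to hours; the era labels are shorthand, not Nordhaus's):

```python
# Hours of modern-equivalent light bought by 60 hours of labor, per era.
eras = {
    "Babylonia, ~1750 B.C. (oil lamps)": 88 / 60,        # 88 minutes
    "1800 (animal fat candles)":         10,
    "1807 (gas street lighting)":        16,
    "1880 (Edison incandescent)":        72,
    "1950 (fluorescent)":                28_723,
    "1994 (compact fluorescent)":        51 * 365 * 24,  # just over 51 years
}

baseline = eras["Babylonia, ~1750 B.C. (oil lamps)"]
for era, hours in eras.items():
    print(f"{era:36s} {hours:>12,.1f} h  ({hours / baseline:,.0f}x Babylonia)")
```

Reassuringly, the figures hang together: the CFL-to-Babylonia ratio comes out near Nordhaus's "300,000 times," and Edison over gas (72/16) is the "nearly five-fold" gain noted earlier.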

The intervening decades have witnessed the introduction of LED lighting, pushing efficiency even further. Light is now something most of us take for granted, rather than a luxury.

"So the darkness shall be the light," as T.S. Eliot wrote, "and the stillness the dancing."