It's a popular belief that daylight saving time exists to give farmers a few more hours of daylight for work, but the practice is actually rooted in energy conservation.
Daylight saving time first drew global attention when Germany started using it in 1916. At the time, the country was two years into World War I and needed to conserve fuel used for artificial lighting for war efforts. Other countries involved in the conflict soon followed suit.
The United States' early policies were optional, but daylight saving time became mandatory during World War II. It turned optional again when the war ended, yet many states continued to follow the time change in the belief that it saved money.
But it's not clear that it does. When Indiana adopted daylight saving time statewide in 2006, researchers found that although household demand for lighting decreased, heating and cooling costs rose by an estimated $9 million per year.
It also carries another costly trade-off: an increase in carbon pollution. When evenings are lighter, people spend more time away from home, often driving from place to place. In fact, the Chamber of Commerce, which represents stores that sell more than 80 percent of all gasoline in the country, is one of the biggest lobbyists for daylight saving time.
It's unclear whether the U.S. will ever move away from the practice, but until it does, the clock changes themselves will remain expensive: one study estimated that the simple act of resetting clocks costs the U.S. about $860 million per change.