Wednesday, April 20, 2016

Global warming has made the weather better for most in the U.S., but that will change, study says

Since Americans first heard the term "global warming" in the 1970s, the weather has actually improved for most people living in the U.S. But it won't always be that way, according to a new study.

Research shows Americans typically — and perhaps unsurprisingly — like warmer winters and dislike hot,...



via L.A. Times - Science http://ift.tt/1NBvjb1
