Every time I watch TV, everyone seems to have this attitude of acceptance that global warming is real. It seems to be everywhere, even in advertisements. Instead of "it's coming," everyone acts like it's already here. Yet I don't recall anything official being announced.
I believe the Sun has a far greater influence on our climate than anything we do, so I won't believe in man-made global warming until it's official.
So have I missed something, or is it just companies trying to cash in on going green?