If you are looking for something more recent, Google eBooks and Google Scholar provide a significant number of chapters from more current books and journal articles on GIS and geospatial technologies, including titles from 2013. Google Scholar also includes patent searches and legal documents from Federal and State courts. This is interesting reading even if it isn’t specifically in your industry, because you might just come across a new area of interest.
We’ve been following this news item for some time, and I have to say I, for one, never dreamed these scientists would be convicted. An Italian judge has decided that six scientists and one government official were criminally negligent for failing to predict the L’Aquila earthquake. They face up to six years in jail. The judge was quick to point out that the verdict isn’t based so much on the lack of a prediction as on their failure to phrase their warnings in a sufficiently alarming way. It isn’t too much of a stretch to say this is going to have a drastically chilling effect on scientific reporting, particularly in Italy. I’d like to take something hopeful away from this, but frankly it is all quite depressing.
Now that we have that over with, The Guinness Book of World Records has certified the oldest message in a bottle ever found. The note is over 98 years old, a National Geographic note from a 1914 scientific study of ocean currents. It asks people to return the bottle to Captain C. Hunter Brown of the Glasgow School of Navigation. Apparently they released nearly 1,900 bottles but only got back a bit over 300. That seems about right for survey return rates, I think 🙂
Ars Technica is reporting that some researchers are taking issue with the US’s pricing of carbon emissions. The price of carbon emissions is notoriously difficult to pin down, but these researchers suggest the US might have missed the mark by as much as a factor of 12. The problem centers on the discount rate, which is the cost of not spending the money on other uses, such as interest or capital investments. Apparently the researchers claim the US is setting this rate too high, and that the official figures do not account for work that’s been done not just within climate change research, but also in economics on discount rates more broadly. It seems to me this shows an interesting interplay among different social and physical disciplines. Often what’s going on in one area isn’t translated or accounted for in another. Then policy makers have to come up with some sort of semi-educated guesstimate of how to integrate all of this into a cohesive policy. It’s a thorny issue that reaches beyond climate change. However, I unsurprisingly believe we geographers might be a good nexus point among disciplines for just these sorts of complex issues. Perhaps we should get involved more deeply with these estimates to attempt to redress such widely varying figures. That’s not to discount the important work geographers are already doing, but just to suggest maybe we can get a little more vocal about our great work and how we can contribute.
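To see how the discount rate alone can swing an estimate by an order of magnitude, here is a minimal sketch of standard present-value discounting. The rates, damage figure, and time horizon below are purely illustrative assumptions for demonstration, not the numbers from the study Ars Technica covers:

```python
def present_value(damage, rate, years):
    """Discount a future damage back to today's dollars at a given annual rate."""
    return damage / (1.0 + rate) ** years

# Hypothetical example: $1M of climate damage occurring 100 years from now.
damage = 1_000_000.0
years = 100

pv_low_rate = present_value(damage, 0.025, years)   # lower discount rate
pv_high_rate = present_value(damage, 0.05, years)   # higher discount rate

# A 2.5-point difference in the rate changes today's valuation by
# roughly a factor of 11 over a century.
print(round(pv_low_rate / pv_high_rate, 1))
```

The point of the sketch is that compounding makes the choice of rate dominate everything else over century-long horizons, which is why seemingly small disagreements about discounting produce wildly different carbon prices.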
This is a great education and outreach opportunity to help inform people about water quality issues. It is an extension of the UN’s World Water Day (March 22) which focuses on educating the public by getting them to conduct water quality tests of local water bodies and share the data. The challenge is coordinated by the Water Environment Federation and the International Water Association, and sponsored by organizations such as the USGS and EPA.
As a focus for the challenge, Tuesday, Sept 18 has been deemed World Water Monitoring Day. Thousands of participants, individuals and classes, will be heading out to test water quality near them. You can check the event web site to find out if there is a local event going on in your neck of the woods. If you can’t make it to one of the organized events this week you can also order test kits from the website.
While this gives folks a chance to get a little bit of field experience, there is also a wealth of data, including webmaps, from previous years available to play around with.
Ars Technica is covering an interesting study for science, I think. Researchers at George Mason University looked at climate reporting in the New York Times, The Washington Post, The Wall Street Journal, and USA Today between 1998 and 2010 to see how often climate change models were referenced. The answer? Depressingly few times: 100 out of 4,000 articles. Why does it matter? Well, how can anyone really understand the conclusions without at least understanding some of the methodology behind them? Without understanding the utility of models in general and climate change models in particular, it is all too easy to cast climate change aside as junk science.
If you ask me, irrespective of the climate change debate, I fear we don’t do enough to explain the science behind the conclusions, particularly with highly politically charged issues like climate change. It seems a bit disingenuous to me to present an argument without explaining the logic behind it. If we explain that logic, it is then up to the reader to decide which argument makes more sense to them. At the very least, we potentially raise scientific knowledge among the general population, and that can’t be a bad thing.
Thanks to Real Genius for the title. Climate scientists are engaged in a little damage control after Britain’s Times Comprehensive Atlas of the World mistakenly claimed Greenland’s glaciers are melting at a breakneck rate. Comparing the ice cover from 1999 and 2011, the Atlas reports a 15% loss in ice coverage. Climate scientists report the real number is closer to one-tenth of 1%. That’s a healthy difference! Scientists have been quick to point out the error and the publishers are attempting to address the issue (although they go to great pains to avoid acknowledging the Atlas is wrong). Nobody’s really sure why the error was made, though one scientist attempted a little ‘cartographic forensics’ and claims someone confused ice thickness with ice extent. The publishers deny this happened, but have offered no alternative theory.
NOAA just released a fascinating video showing the birth and death of Hurricane Irene as seen from space. The video was created from imagery captured by the GOES-13 weather satellite. This lovely new satellite captures a view every 30 minutes and has been running for a little over a year (more about this satellite can be found at the link).
Climate models have predicted this for years, but it’s never been observed… until now. Ars Technica discusses the issue in brief. For the non-physical geographers out there (among whom I count myself), storm tracks are the mid-latitude storm patterns that bring most of the precipitation to the major population centers of the world. As the climate changes, these storm tracks should gravitate toward the poles. Scientists have been using data from the International Satellite Cloud Climatology Project to track the movement of storm tracks. They note lots of issues with the data, but repeated sampling and analysis have shown a clear trend: the tracks are moving as predicted. On top of that, apparently we’ve lost 2–3% of our total cloud cover worldwide!
So what’s the takeaway from all of this? It seems to me that the issues with the data, combined with the need to track this stuff in a more comprehensive and accessible way, point to one major conclusion: we need more satellites to get more accurate and timely data. It really doesn’t matter where you fall on the climate change issue. Better information can only lead to a more informed scientific community and public, which is always a good thing.