A recent post on the Mapperz blog links to several of the presentations on SlideShare from this year's State of the Map, held in Limerick, Ireland. It is a pretty interesting group of talks and, as usual, we wish we could have seen them in person, but it is great to have access to the content for those of us who couldn't make it. You may want to keep an eye on the State of the Map or OSM sites for more information on the presentations and other content that will surely be rolling out of the conference.
OK, so this is kinda obscure AND it's like the fourth time we've linked to XKCD… but it's entirely appropriate! Not to mention darn funny!
The IDC has just released a report (PDF) that says the amount of digital data we've collected exceeds the amount of space we have to hold it all. Right now, we collect nearly 45GB per person. That's an amazing amount of data. The expectation is that by 2011 we'll only be able to store half of what we produce! Video and imagery account for the lion's share of this growth.
Sitting over here this afternoon attempting to back up nearly a terabyte of aerial photography, I can't help but notice that this is a problem we in the geospatial community might want to keep in the backs of our minds.
Via Ars Technica
The proposed Broadband Census Bill (H.R. 3919) was approved on Tuesday by the US House Energy and Commerce Committee. The main goal of the legislation is to collect data about broadband availability throughout the US and use that data to generate a searchable map that will provide consumers with information on what is available in their areas, and also provide the base for grant programs to expand and improve broadband service. The plan in the most recent draft of the bill was to map to the 9-digit ZIP code level or census tract level, with data on demographics, broadband providers, types of technology used, and bandwidth tiers. The funding for the census is currently requested as $12 million per fiscal year for 2008, 2009, and 2010.
In addition, the proposed grant program includes $300 million in funding for a number of goals related to expanding and improving broadband access. The bill was only referred to the committee on Oct 22nd, so that is pretty quick action. I believe the bill will move to the full House now, but there’s nothing on the schedule yet as to when it might come up for debate and vote.
This would be an important step in helping many areas catch up in terms of Internet access, so write to your Congressmen to express support for this bill. If you would like to see what the bill includes, here is a nice summary.
University College London's Centre for Advanced Spatial Analysis has done a lot of great work in understanding spatial behavior, and one of their current projects, CAPABLE (Children's Activities, Perception and Behaviour in the Local Environment), focuses on children's activity patterns across space and time, looking at things like patterns of travel between home and school and other daily movement. One of the issues the researchers are hoping to understand is the possible relationship between patterns of travel and obesity in children.
There are three example animations of children's GPS tracks (walking a dog, walking home from school, and playing football), which are mapped onto a Google Maps interface and also show the changing levels of activity throughout each track. Geospatial technologies and data have reached a scale where we can look at issues at the true local level, and I think we are only at the beginning of the curve in terms of fine-scale analysis.
Episode 16 of InDigital includes a ride-along and discussion with TeleAtlas (at 6:48). It runs for about 3 minutes, and the short discussion among the hosts afterward is a great example of non-geospatial professionals' perspectives on data collection and use. I would definitely recommend that everyone take a look.
Oh, and apparently TeleAtlas is looking for interns.
The good folks over at Ars Technica are reporting on an AP article that says the US intelligence community wants the license to censor satellite imagery. The idea is for the government to be able to control what the public can and cannot see via satellite in times of war or emergency, so that people can't take advantage of the situation by using the imagery. Although I can see the NGA's concern, I have to say I'm highly skeptical of the government's ability to even begin to do this. Buying up all the data like they did before Afghanistan is impractical, and the US isn't the only game in town when it comes to space anymore. How can you stop data from around the world from making it onto the Internet for all to see? Also, it's always important to remember that that which can harm often can do good as well. However one feels about the issue, it will certainly be an interesting development to follow.
Slashdot is reporting on an interesting article that claims AOL and Skyhook Wireless have mapped "the majority of residences in the U.S. and Canada" along with their wireless service (or lack thereof). It is supposedly part of their "Near Me" service, which allows AOL IM users to see people physically located near themselves.
I'm of multiple minds on this issue. On the one hand, it's pretty cool to see the virtual linked to the physical. On the other hand, I don't want people knowing about my personal WiFi. On the third hand, considering our state is going through some major address-mapping issues, how the heck do we get that residential mapping data?
Wired Magazine is reporting on a story that the amount of digital data moving around the world today is something on the order of 161 exabytes. Although the exact number (and methodology) might be disputable, it appears the research is in the right ballpark. Just to put that into perspective (using base 10 instead of base 2), one exabyte = 1,000 petabytes, or 1,000,000 terabytes, or 1 billion gigabytes (again, slightly different in base 2). They even estimate we're likely to hit slightly under 1,000 exabytes (or one zettabyte) by 2010.
They also estimate that 161 exabytes is in the same ballpark as the total amount of storage available worldwide! So go clean out your inbox, will ya? We need the space!
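If you want to check the base-10 vs. base-2 arithmetic yourself, here's a quick sketch in Python (the 161-exabyte figure is from the Wired story; the unit definitions are the standard SI decimal and IEC binary prefixes):

```python
# Decimal (SI) units: each step up is a factor of 1,000.
GB = 10**9    # gigabyte
TB = 10**12   # terabyte
PB = 10**15   # petabyte
EB = 10**18   # exabyte

total = 161 * EB  # the article's estimate for data moved in a year

print(total // PB)  # 161,000 petabytes
print(total // TB)  # 161,000,000 terabytes
print(total // GB)  # 161,000,000,000 (161 billion) gigabytes

# The base-2 (binary) version is slightly different: 1 exbibyte = 2**60 bytes,
# so 161 decimal exabytes works out to a bit under 140 exbibytes.
EiB = 2**60
print(round(total / EiB, 1))
```

Nothing deep here, just a reminder that the decimal and binary prefixes drift apart by about 15% at the exabyte scale.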
The EPA is making a bold and rather inspiring move in its effort to post data online for use in mashups and online mapping applications. As of Wednesday, it has posted a few hundred of the Superfund sites it has maintained for the last 25 years or so, and the article indicates additional data will be published fairly soon. This is a great move by the EPA that I hope is closely mimicked by other federal and international agencies.