ESRIUC Final Notes! New in the Better Late Than Never File!

Ok, I promised these like three weeks ago. In my defense, the move took a lot out of me. Of course, in my condemnation, I could have posted these before the move! Anyway, leaning on the "better late than never" principle, here are my notes from the sessions I attended. Unfortunately I didn't get to attend as many sessions as I would have liked, as we had a pretty full interview schedule. The first set of notes covers the new analytical tools coming, most notably the spatial statistics stuff. The second covers a lot of the really cool stuff coming on the server side of things. In the first session, the presentation was fairly methodical and I was able to take stuff in and then write it down. In the server session, however, they covered SO much material in such a short period of time that I was more or less typing a sentence or two behind what they were saying. Be sure to let me know if I missed or misrepresented any of the details!

===

Tuesday Morning – ArcGIS Analysis

Dave (Maguire?) is giving a presentation on data analysis. I haven't had enough opportunity for this level of GIS work. First, he's talking about modeling the world. He's talking about the coastline of Britain. Different "official" agencies report different lengths. What was discovered is that the measured length of something is related to the size of the instrument doing the measuring, which makes sense if you think about it. This leads to the important point of being extremely exacting in defining your terms and defining how you're going to measure them.

Spatial autocorrelation – the idea that things near each other in space tend to be correlated. For example, people tend to live in cities and air pollution tends to correlate with cities, which makes sense. Statistical models tend to assume there is no spatial autocorrelation.
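
Not something from the session, but if you want to see what "measuring" spatial autocorrelation looks like in practice, here's a toy sketch of Moran's I in plain Python. The points, values, and the simple inverse-distance weighting are all my own assumptions for illustration, not anything ESRI showed:

```python
import math

# Toy points: (x, y, value) -- think monitoring stations with a pollution reading.
points = [(0, 0, 10.0), (1, 0, 12.0), (0, 1, 11.0), (5, 5, 3.0), (6, 5, 2.5)]

def inverse_distance_weight(p, q):
    """Simple inverse-distance weight between two points."""
    d = math.hypot(p[0] - q[0], p[1] - q[1])
    return 0.0 if d == 0 else 1.0 / d

values = [p[2] for p in points]
n = len(points)
mean = sum(values) / n

# Moran's I = (n / W) * sum_ij w_ij (x_i - xbar)(x_j - xbar) / sum_i (x_i - xbar)^2
num = 0.0
W = 0.0
for i in range(n):
    for j in range(n):
        if i == j:
            continue
        w = inverse_distance_weight(points[i], points[j])
        W += w
        num += w * (values[i] - mean) * (values[j] - mean)

den = sum((v - mean) ** 2 for v in values)
morans_i = (n / W) * (num / den)
print("Moran's I:", morans_i)  # values near +1 suggest clustering; near 0 or below, not so much
```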

The modeling platform is that we model the world as vectors, rasters, networks, agents, and geostatistics. Geoprocessing frameworks are the key way to analyze these models. Now we're looking at ArcToolbox. I've never liked the ArcToolbox frame/dialog, but I've never been able to come up with a different user interface to suggest. You can interact with the geoprocessing framework in a few ways – tool dialogs, the visual modeler (looks like flowcharts), ArcObjects programming, scripting through Python, and the ever loved/hated/despised-with-the-passion-of-1,000-fiery-suns command line.
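
To make the scripting entry point concrete, here's roughly what driving the geoprocessing framework from Python looks like in the 9.x world. I'm writing this from memory, so treat the module usage as an approximation and the workspace, layer, and output names as made-up placeholders:

```python
# Rough sketch of 9.x-era Python geoprocessing (assumes the arcgisscripting
# module that ships with ArcGIS Desktop; paths and layer names are placeholders).
import arcgisscripting

gp = arcgisscripting.create()          # get a geoprocessor object
gp.workspace = r"C:\data\demo.gdb"     # hypothetical workspace

# Run a standard toolbox tool: buffer the roads layer by 100 meters.
gp.Buffer_analysis("roads", "roads_buffer_100m", "100 Meters")

print(gp.GetMessages())                # same messages you'd see in the tool dialog
```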

You can also link your GIS models and data to other processing tools. There are a couple of ways to do that. First is loose coupling, where only information is passed back and forth. Alternatively, you can tightly couple, which is more or less remote-procedure-call-heavy programming, the type of stuff that is helping me thin my hair in frustration. We're seeing some models linked to other analysis systems. Most of them are linking to other geospatial tools. They have a new extension coming out next month that links to (???). It's currently in beta.

Now on to spatial statistics, which is why I'm here right now. They've created Python scripts and put them into a toolbox for statistical analysis. They're also interested in partnering with anyone who wishes to create new Python statistical scripts and include them in the toolbox. They're showing a lot of statistical tools that have been adapted to GIS, like mean, median, and mode. There's also a bunch of other stuff in the toolbox, and it's all in 9.2.
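
For a feel of the kind of descriptive measure that lives in that toolbox (my own illustration, not from the slides), something like a mean center and standard distance is trivial to write in plain Python:

```python
import math

# Mean center and standard distance of a set of point features -- a toy version
# of the descriptive spatial statistics in the toolbox. Coordinates are made up.
points = [(482310.5, 3768223.1), (482400.2, 3768310.7), (482150.9, 3768190.3)]

mean_x = sum(x for x, _ in points) / len(points)
mean_y = sum(y for _, y in points) / len(points)
print("Mean center:", (mean_x, mean_y))

std_dist = math.sqrt(
    sum((x - mean_x) ** 2 + (y - mean_y) ** 2 for x, y in points) / len(points)
)
print("Standard distance:", std_dist)
```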

On to regression analysis, which is new in 9.3. They've created a geographically weighted model, which looks pretty impressive. It's sort of like OLS regression, but geography related. We're seeing a geographically weighted map that has a sort of heat scale showing hot spots, cool spots, and neutral spots. The hot spots indicate data points that are definitely geographically correlated and the cool spots show where it is no different from random. For the brown spots it's impossible to tell whether things are random or geographically correlated. I'm not exactly sure what the difference between blue and brown is, as in stats you tend to either reject or fail to reject the null hypothesis. Wouldn't brown and blue both effectively be failing to reject the null hypothesis? From the regression, they've created a new graph type that's a form of scatter plot. Querying the map on attributes will dynamically update the scatter plot. As Dave mentions, this will help with exploratory spatial analysis – getting your hands on what your data is trying to tell you. Another new tool is the spatial weights matrix. This figures all the distances between the features in your data. In and of itself it isn't that useful, but it's a stepping stone to lots of other statistical tools. You can calculate these values once and save them to speed up the other processes. Something very interesting is that he just did a statistical run on over 2,000 data points on a typical desktop machine. Having done something like this in STATA, I have to say that's some impressive performance. The result is output nearly identical to any statistical package's, showing alpha and beta levels, correlations, coefficients, t-values, etc. Another output is a map of data values that are 2 standard deviations from the mean, which is useful for analysis.
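
To picture what that spatial weights matrix is, here's my own toy sketch (not ESRI's actual file format or weighting scheme): every pairwise distance turned into a weight, precomputed once and saved so the other tools can reuse it.

```python
import math

# Toy spatial weights matrix: inverse-distance weights between features,
# row-standardized so each row sums to 1. Feature IDs and coordinates are made up.
features = {"A": (0.0, 0.0), "B": (3.0, 4.0), "C": (6.0, 0.0)}

ids = sorted(features)
weights = {}
for i in ids:
    row = {}
    for j in ids:
        if i == j:
            continue
        xi, yi = features[i]
        xj, yj = features[j]
        row[j] = 1.0 / math.hypot(xi - xj, yi - yj)
    total = sum(row.values())
    weights[i] = {j: w / total for j, w in row.items()}

# "Calculate once and save to speed up the other processes" -- here just a text dump.
with open("weights.txt", "w") as f:
    for i in ids:
        for j, w in weights[i].items():
            f.write("%s %s %.6f\n" % (i, j, w))
```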

From OLS we move on to the geographic regression. This is based upon some work done at the National University of Ireland. The dialog looks very similar to any stats package's, asking for the variables on which you wish to work. In the example, they run the geographic regression and show that the vast majority of the map area they're analyzing is much better explained by geographic regression than by OLS. It's pretty impressive: over 80% of the spatial area is unexplained by OLS but explained by geographic regression.
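
If you want the intuition behind geographically weighted regression, here's a bare-bones numpy sketch of the idea: fit a separate weighted least squares at each location, with weights that fall off with distance. The Gaussian kernel, fixed bandwidth, and toy data are my simplifying assumptions; the real tool does a lot more (like choosing the bandwidth for you).

```python
import numpy as np

def gwr_coefficients(coords, X, y, bandwidth):
    """Fit a local weighted least-squares regression at every observation.

    coords: (n, 2) locations; X: (n, k) predictors (with intercept column);
    y: (n,) response; bandwidth: Gaussian kernel bandwidth in coordinate units.
    Returns an (n, k) array of local coefficients.
    """
    n, k = X.shape
    betas = np.empty((n, k))
    for i in range(n):
        d = np.linalg.norm(coords - coords[i], axis=1)        # distances to point i
        w = np.exp(-(d ** 2) / (2.0 * bandwidth ** 2))        # Gaussian kernel weights
        W = np.diag(w)
        # Weighted least squares: beta_i = (X'WX)^-1 X'Wy
        betas[i] = np.linalg.solve(X.T @ W @ X, X.T @ W @ y)
    return betas

# Toy data: 50 points where the slope drifts with longitude.
rng = np.random.default_rng(0)
coords = rng.uniform(0, 10, size=(50, 2))
x1 = rng.normal(size=50)
slope = 1.0 + 0.3 * coords[:, 0]                              # spatially varying effect
y = 2.0 + slope * x1 + rng.normal(scale=0.1, size=50)
X = np.column_stack([np.ones(50), x1])

local_betas = gwr_coefficients(coords, X, y, bandwidth=2.0)
print(local_betas[:5])   # intercept and slope vary from place to place
```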

More on geoprocessing tools. In 9.3, they've added more programming-type functions. There's iteration, the ability for a tool to report back to an earlier tool, random functions, some data structures they're calling lists (but they sound like arrays to me), and a geoprocessing layer that allows you to define a static symbology for all the outputs from your iterations. They've also added branching (IF-THEN type stuff). All of this is pretty standard programming stuff. The Python scripting and the ArcObjects model have had this type of functionality for a while now. However, it is pretty nifty they've added this stuff to the visual modeler.
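
For comparison, this is the sort of iteration-plus-branching that scripting has always made easy. A sketch from memory, so take the workspace path as a placeholder and the list-returning behavior at 9.3 as an assumption:

```python
# Sketch of the iteration + branching the visual modeler is gaining, done the
# script way (assumes 9.3's arcgisscripting, where list functions return
# plain Python lists; the workspace path is hypothetical).
import arcgisscripting

gp = arcgisscripting.create(9.3)
gp.workspace = r"C:\data\demo.gdb"

for fc in gp.ListFeatureClasses():
    # Branching: only buffer the point layers, skip everything else.
    desc = gp.Describe(fc)
    if desc.ShapeType == "Point":
        gp.Buffer_analysis(fc, fc + "_buf", "50 Meters")
    else:
        gp.AddMessage("Skipping %s (%s)" % (fc, desc.ShapeType))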

ArcServer – The Road Ahead

Got stuck in an interesting interview with the Redlands Police Department and got here late. We're seeing some sort of OGC map authoring tool. Actually, I'm not entirely certain it's only for OGC layers; it's just that they've only added WMS layers so far. Next up is the WFS service. As of 9.3 they support WFS 1.1 and WFS transactions, which will allow clients to push updates back to the data. WFS can be connected to Arc clients via the Data (???) Extension. Now we're seeing WCS services. These are published coverages. You can publish geodatabase raster layers using WCS from either a map, a geodatabase, or an Image Service. In 9.3, ArcGIS Desktop can work with WCS services out of the box. It can be used to add data or to do geoprocessing. On to a demo of these things.
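
If the OGC alphabet soup is new to you, the nice thing about WMS/WFS/WCS is that they're all just HTTP requests with well-known parameters. A GetCapabilities call amounts to something like the following (the server URL and service name are made up for illustration; the endpoint path is my guess at the pattern):

```python
# What a client actually sends to an OGC service: plain HTTP with standard
# parameters. The host and service path below are placeholders.
from urllib.parse import urlencode
from urllib.request import urlopen

base = "http://myserver.example.com/arcgis/services/demo/MapServer/WMSServer"

params = {
    "service": "WMS",
    "version": "1.1.1",
    "request": "GetCapabilities",
}
xml = urlopen(base + "?" + urlencode(params)).read()
print(xml[:200])   # the capabilities document lists layers, formats, projections

# Swap in service=WFS or service=WCS (against the matching endpoint) for the other two.
```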

While the demo is going on (more or less each demo is a variation of "add to ArcCatalog, pick your layer(s), add to ArcMap" – pretty cool that you can do that, but nothing super special), I started looking around. The room is really full, quite a testament to the interest in ArcServer. I'm actually sitting on the floor in the back trying not to block the people trying to get in. I wish I hadn't missed the first few minutes, as they're using some sort of client with menus that look like current MS products (not the ribbon bar). It's not ArcMap, but it's what they're using for all the OGC additions. The client itself looks pretty snazzy.

The next slide features only the words, "High Performance Image Services". Yesterday we spoke with Peter from the Image Server group. I wonder if we'll get to hear more about this product? Image services can be used to publish images directly to the server. It allows for resampling and compression on the fly to the client. It can publish in any format the client requests. It supports SOAP and WCS services (which they just got done talking about). Image Server was released in 9.2, but they've been kinda quiet about it. Now we get a full-blown system in 9.3. As stated, the client can request via WCS or SOAP interfaces whatever compression, file format, or projection they wish. That's pretty cool. You can load your data in native format and Image Server will push it out on the fly. No more tiling large image datasets. Woohoo! The demo is showing how you publish an Image Service. Wow, the wizard makes it stupid easy. Peter said yesterday that setting up the server itself is pretty easy for people to pull off. If setup is as easy as publishing, then this thing will be a joy to install and configure (unlike the old ArcIMS days… yuck!)
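
The "any format, projection, or compression the client asks for" bit boils down to request parameters too. A WCS GetCoverage call looks roughly like this; the host, endpoint path, coverage name, and bounding box are all placeholders I made up:

```python
# Rough shape of a WCS 1.0 GetCoverage request against an image service.
# Everything identifying (host, path, coverage, bbox) is a placeholder.
from urllib.parse import urlencode
from urllib.request import urlopen

base = "http://myserver.example.com/arcgis/services/elevation/ImageServer/WCSServer"

params = {
    "service": "WCS",
    "version": "1.0.0",
    "request": "GetCoverage",
    "coverage": "1",
    "crs": "EPSG:4326",                  # ask for whatever projection you want...
    "bbox": "-117.3,34.0,-117.1,34.2",
    "width": "512",
    "height": "512",
    "format": "GeoTIFF",                 # ...and whatever format you want
}
with open("clip.tif", "wb") as f:
    f.write(urlopen(base + "?" + urlencode(params)).read())
```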

Caching map services is up on the table now. A very unsexy concept that makes most people yawn, but this stuff is critical. Being able to cache map services will just make everything go that much faster. Especially if you're publishing publicly accessible data over web technologies, you're going to want to thank these guys for these updates. It's kinda like fuel injection vs. carburetors on cars… most people couldn't care less which they're running, but it really does matter for performance. Hold on, big bomb that they slipped in there… the caching demo shows a dialog that features an option for setting your tiling scheme. While that in and of itself isn't huge, what's noteworthy is that the caching can use both the Google Earth and MS Virtual Earth tiling schemes. Apparently you'll be able to publish your image services on these two platforms fairly easily. Remember, the data stays in the original format and projection; the caching is what allows you to change things on the fly. Uh-oh… the fun of live demos. Some data should have popped up on the screen and it didn't. Now she's checking the data and the dialogs to see what might have gone wrong. This looks pretty familiar – I do it all the time on my desktop back at work 🙂 The problem seems to have gotten sorted out and now she's showing the cache files in the requested format and directories. So make sure you have the hard drive space to handle however many levels of caching you want for your application. This should be relatively self-evident, but I've seen (and made) this mistake enough to know that the biggest danger of the obvious is that it can get overlooked. They haven't given any idea how much extra space one will need for caching, since it will vary wildly with the original data, the cached data format, and how many levels you cache. Clearly individual testing will have to be done for more concrete results. I can tell you from personal experience that it takes more than you think… no matter what you think. Now there's a demo of using these cached services in ArcMap. Super fast… a heck of a lot faster than asking on demand from, say, SDE. If you're using the images as a background only, this will be a big boon to speeding up your apps. Wow, we spent a lot of time on caching.
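
On the "it takes more space than you think" point, here's a back-of-the-envelope way to see why. All the numbers here (starting tile count, average tile size, level count) are assumptions I picked for illustration, not anything ESRI quoted:

```python
# Back-of-the-envelope cache size: tile counts roughly quadruple per level,
# so the deepest level dominates. Starting tile count and average tile size
# are made-up assumptions, purely for illustration.
tiles_at_top_level = 4
avg_tile_bytes = 15 * 1024          # assume ~15 KB per 256x256 JPEG tile

total_tiles = 0
for level in range(10):             # 10 cache levels
    tiles = tiles_at_top_level * (4 ** level)
    total_tiles += tiles
    print("level %2d: %10d tiles (~%.1f GB cumulative)"
          % (level, tiles, total_tiles * avg_tile_bytes / 1024.0 ** 3))
```

With those made-up numbers, the last level alone is over a million tiles, which is exactly how a cache quietly eats a disk.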

Everyone and their brother who has tried to make Arc Server 9.x products secure knows the first step is to go ahead and throw your monitor out the window in frustration. Luckily, ESRI has heard the cries of its users and added better security management in 9.3. Right now you edit some config files to make users and define what access those users have. To do it right, you have to get your hands messy with programming. Chuck that headache out the window. Now all that stuff is in ArcGIS Server Manager. 'Bout damn time. You can add tokens for authenticating users. You can manage user permissions and roles. Roles have users. Unfortunately, this stuff only works at the service level, NOT finer levels like individual functions or layers. Damn. For that stuff, you still have to get your hands dirty with coding. If you have a need to do the more in-depth coding stuff, you have to turn to the Java or .Net environments. In fairness, you don't HAVE to, but the system is set up to use those directly. This shouldn't be a major surprise to anyone, as pretty much all the other Arc products are very Java/.Net heavy. Users and roles in the demo are stored in SQL Server. You can use other platforms as long as they're supported by the ASP.NET rules. That may leave out a lot of open source products like, say, MySQL. It's hard to say since I'm not super familiar with the range of ASP.NET.
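
The token bit, as I understand it from the session, just means the client tacks a server-issued token onto its requests. Something like this, where the URL, the fake token string, and even the parameter name are my assumptions based on the description rather than verified docs:

```python
# Sketch of token-based access: get a token once after authenticating, then
# append it to every service request. URLs, the token value, and the 'token'
# parameter name are assumptions based on the session description.
from urllib.parse import urlencode
from urllib.request import urlopen

token = "abc123faketoken"   # in reality, issued by the server after you authenticate

service_url = "http://myserver.example.com/arcgis/rest/services/parcels/MapServer"
params = {"f": "json", "token": token}
info = urlopen(service_url + "?" + urlencode(params)).read()
print(info[:200])
```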

Ok, on to web mapping applications. They're doing a lot of tightening of the development tools. Map navigation is nicely streamlined and there are lots of new tools and buttons to make things more intuitive for users. There's a zoom slider. They've implemented Google-esque push pins. The finder is easier to pull up, execute, and get results from. There are fewer multifunction tools that take like 4 steps to get anything useful. Map tips have been added, again similar to Google. There's a new print task that allows users to print more complete output. You get the map, scale bar, title, etc. Not only have they focused on the user but on the developer too. There's a mashup API for JavaScript. You can make AJAX-enabled web apps now with fairly quick tools. All of this sits on SOAP and OGC services. All of this stuff existed before, but now it's even easier for developers to create. There's now a services explorer that allows anyone to see what public services you have available. ADFs (application development frameworks) now exist for all major IDEs. Manager can be used to develop apps quickly. The .Net ADF integrates more closely with AJAX. There's a new client-side JavaScript object model that will be published, and it can connect to and interact with the server-side tools. Using these ADFs, you can interface materials in ArcServer with Google Maps or Virtual Earth. So Virtual Earth may be the front end, but your backend could be ArcServer running your own data. This is all done through the JavaScript API. Great for mashups. You can mix data from any ArcGIS Server, either on your machine or over the Internet, with the materials in, say, Virtual Earth. This is huge, as most of the Google/Virtual Earth solutions that let you put in your own data tend toward the hacking side. Being able to publish this stuff will be great. Using REST services, you can roll out lots of cool functionality like geocoding and the like. My biggest problem with all of this is that I have to wait until 9.3 to play with this stuff! I have like 4 projects that could use it right now!
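
Since I can't wait for 9.3 anyway, here's roughly what hitting one of those REST services looks like. The host and service name are made up, and the export operation with a bbox is my reading of the general pattern they described, so treat it as a sketch:

```python
# Rough sketch of the REST side of the new APIs: ask a map service for an
# image of a bounding box. Host and service name are placeholders.
from urllib.parse import urlencode
from urllib.request import urlopen

map_service = "http://myserver.example.com/arcgis/rest/services/parcels/MapServer"

params = {
    "bbox": "-117.25,34.03,-117.15,34.10",   # xmin, ymin, xmax, ymax
    "size": "600,400",
    "format": "png",
    "f": "image",                            # f=json would return metadata instead
}
with open("map.png", "wb") as f:
    f.write(urlopen(map_service + "/export?" + urlencode(params)).read())
```

That simplicity is exactly why it's so mashup-friendly: any page that can build a URL can pull your ArcServer data into a Google Maps or Virtual Earth front end.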

They're demoing the services explorer. It's a pretty cool way to look at the capabilities of a service. Now we're looking at code for a web application. If you're remotely familiar with Virtual Earth or Google Local development, then you'll feel right at home with this API. Three or four lines of code and you have a map. I have to say that the data loads fast, at least on the default map as he pans around. I wonder if these are those cached services they mentioned before? Now they're showing a geocode service that's pretty nice. I assume the geocode service comes from their data, not local data. I wonder if you can use your own geocode data that might be more up to date or better resolution? Ahh, confirmation – all of this is cached services, which is why it's so fast. Now he's moving on to a Google Maps app. He's pre-cached the service to match the Google Maps tiling scheme. He's added some parcel data and used Google Maps to do the geocoding and push pin display. He's pulling data from the parcel attribute table to pass to the Google push pins. Freakin' cool! Next up is Virtual Earth. He's got a couple of layers, with what I believe is VE's photo layer along with their vector layer. Again the push pins are created by VE but filled with information from the ArcServer data. This data transfer is done with the REST API. Again, the thing that sucks most about it is that we won't get it until early next year. Bummer.
