The How and Who of Where

by Daniel H. Steinberg
07/01/2005

The two days of O'Reilly Media's Where 2.0 conference seemed to fly by. To support this latest generation of location-based applications, you still need to begin by collecting and organizing the data. A9.com, NAVTEQ, and Eyebeam representatives explained how they gather and organize data. Then Ron Ondrejka brought down the house with his description of how his team used to gather images of the Earth from spy satellites in the 1960s.

Visual Cues from A9.com

Some people are better at looking at maps than others. Think about how you give directions. You might say "take Ellis to Market and take a right onto 4th Street." But you are likely to also sprinkle in visual cues such as "at Market, you will see the Apple store on your left." Udi Manber from A9.com showed the company's new Block View images at Where 2.0.

Follow the link and select a city. San Francisco has good coverage, so it is a good choice. You will see maps at two different resolutions: a MapQuest view that you can zoom in and out of, and a wider view in which a red rectangle marks the area covered by the first. But that's not the cool feature. Click the check box labeled "Mark Streets with Block View." This highlights the streets for which pictures of one or both sides of the street have been taken.

Click on an area for which there are images. In the lower right corner, you will see a sequence of images (where they are available) from each side of the street. Click on "Go" to move in one direction or the other, and the pictures scroll in opposite directions to reveal that part of the street.

This visual tour of the city was created using a digital video camera mounted to the top of a car and connected to a GPS receiver. The drivers just drive around town; everything else is synchronized automatically. Manber answered a question about privacy--some of the pictures include people who happened to be captured by the camera. He said A9 hasn't had problems but will remove or replace pictures if asked to do so.
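
Manber didn't go into the pipeline details, but the core trick of matching frames to positions is easy to sketch. Here is a minimal Python sketch, assuming each video frame carries a capture timestamp and the GPS unit logs fixes as (time, lat, lon) tuples; the data layout and fix rate are assumptions for illustration, not A9's actual system.

    from bisect import bisect_left

    def interpolate_fix(gps_fixes, t):
        """gps_fixes: list of (timestamp, lat, lon) tuples sorted by timestamp.
        Returns an interpolated (lat, lon) for time t."""
        times = [fix[0] for fix in gps_fixes]
        i = bisect_left(times, t)
        if i == 0:
            return gps_fixes[0][1:]
        if i == len(gps_fixes):
            return gps_fixes[-1][1:]
        (t0, lat0, lon0), (t1, lat1, lon1) = gps_fixes[i - 1], gps_fixes[i]
        f = (t - t0) / (t1 - t0)  # linear interpolation between neighboring fixes
        return (lat0 + f * (lat1 - lat0), lon0 + f * (lon1 - lon0))

    def tag_frames(frame_times, gps_fixes):
        """Attach an estimated position to every captured video frame."""
        return [(t,) + interpolate_fix(gps_fixes, t) for t in frame_times]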

Visualizing Democracy

If the privacy implications of having your picture taken while you walk around town concern you, then having your political donations available online may alarm you. Michael Frumin from Eyebeam has been working to help create maps that display political information.

His work visualizing democracy is a collaboration with the Fundrace project. The first iteration of the mapping project involved gathering all of the contribution information and generating indices for the candidates. You can view the average contribution made to each candidate. Fundrace has also produced national maps that compare contributions to the candidates, showing exactly where the contributions came from and how large they were.

You most likely saw red-blue maps like these after the last U.S. presidential election. The granularity is fairly fine: you can view the red versus blue breakdown by state, county, or zip code. And, as it turns out, geocoding any address in America is fairly easy, so Fundrace even produced city maps that break contributions down by street address. In the San Francisco map, you can see clear concentrations of red and blue.
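
Frumin didn't walk through any code, but the aggregation behind a map like this is straightforward. The Python sketch below assumes FEC-style records with an address, candidate, and dollar amount, and a geocode() helper that turns an address into coordinates and a zip code; all of those names are hypothetical stand-ins, not Fundrace's actual schema.

    from collections import defaultdict

    def geocode(address):
        """Hypothetical geocoder: address string -> (lat, lon, zip_code)."""
        raise NotImplementedError("plug in a real geocoding service here")

    def totals_by_zip(contributions):
        """contributions: iterable of dicts with 'address', 'candidate', 'amount'.
        Returns {zip_code: {candidate: total_dollars}}."""
        totals = defaultdict(lambda: defaultdict(float))
        for rec in contributions:
            _, _, zip_code = geocode(rec["address"])
            totals[zip_code][rec["candidate"]] += rec["amount"]
        return totals

    def average_contribution(contributions, candidate):
        """Average donation to one candidate across all records."""
        amounts = [r["amount"] for r in contributions if r["candidate"] == candidate]
        return sum(amounts) / len(amounts) if amounts else 0.0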

In this view, there don't appear to be any privacy issues, since the data underlying the map is public information. You can search your neighborhood and get a listing of how much each of your neighbors contributed and to whom. Although this information is public, it has never been this easily accessible. The Fundrace site provides many different views into the same data set. There is even a phone application that uses the data to indicate whether you are currently in a predominantly red or blue area. Frumin's final point was that maps and geosearching can draw people into data they didn't even know they were interested in.
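
The phone application presumably boils down to a nearest-region lookup against the same aggregated data. A plausible sketch, assuming you already have zip-code centroids and per-party totals like those computed above (both invented inputs, not the actual Fundrace feed):

    import math

    def nearest_zip(lat, lon, zip_centroids):
        """zip_centroids: {zip_code: (lat, lon)}. Returns the closest zip code."""
        return min(zip_centroids,
                   key=lambda z: math.hypot(lat - zip_centroids[z][0],
                                            lon - zip_centroids[z][1]))

    def local_leaning(lat, lon, zip_centroids, party_totals):
        """party_totals: {zip_code: {'rep': dollars, 'dem': dollars}}."""
        totals = party_totals[nearest_zip(lat, lon, zip_centroids)]
        return "red" if totals.get("rep", 0) > totals.get("dem", 0) else "blue"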

Finding and Organizing Geodata

Gathering data is a huge task. NAVTEQ supplies data to many of the competing mapping platforms, as well as to non-public customers. Robert Denaro described the process of "Driving the Mean Streets." He explained that it's not enough to know where the streets are: you also need to know which streets are one-way, where left turns are permitted, and so on.
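
Denaro's point about attributes is easiest to see in a data model. The toy Python representation below only illustrates why a router needs more than geometry; the field names are made up and bear no relation to NAVTEQ's actual format.

    class RoadSegment(object):
        def __init__(self, segment_id, from_node, to_node, one_way=False):
            self.segment_id = segment_id
            self.from_node = from_node
            self.to_node = to_node
            self.one_way = one_way  # traversable only from_node -> to_node

    class RoadNetwork(object):
        def __init__(self):
            self.segments = {}         # segment_id -> RoadSegment
            self.banned_turns = set()  # (from_segment_id, to_segment_id) pairs

        def can_turn(self, from_seg, to_seg):
            """A router must honor turn restrictions as well as one-way flags;
            neither is visible in an overhead image."""
            return (from_seg, to_seg) not in self.banned_turns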

He said the process is "horrendously expensive, but it's the only way to get the right data." NAVTEQ's goals are to be early with new data and to field-verify everything. He showed an example of a housing development in which the streets appeared to be roughed out in the satellite pictures. When his team arrived at the site, they found a development that had been planned since the '70s, with no current plans to ever build the streets that were visible from above. Denaro showed examples of streets with gates that aren't visible from the sky and concluded that you just can't replace having a team actually visit the site.

The 50,000-Foot View

Ron Ondrejka ended the conference with a look at what was being done in the early days of Where 1.0. In "The Role of the First Spy Satellites," he provided a look back at the work he was doing forty-five years ago as coordinator of the mapping element of the CORONA system. After Gary Powers and his U2 were shot down in May of 1960, the United States stopped its overflights of the Soviet Union. Before the U2, the U.S. had used balloons that flew at between 50,000 and 100,000 feet, but the recovery rate for those balloons was poor, so they were replaced with U2 flights.

A few months after Powers was shot down, the U.S. launched a photographic spy satellite. This was the first satellite that needed to face down at all times rather than spin. The pictures were taken at one-meter resolution using a panchromatic stereo system in space, so that x, y, and z coordinates could be recovered for the items photographed. The satellite was launched into an orbit that kept it in sunlight over the Soviet Union. To retrieve data, someone in Alaska pushed a button, and the satellite turned and released a capsule of film attached to a parachute. An airplane would intercept these data buckets before they hit the ground. The capsules had salt plugs, so that if the planes did not snag a capsule in the air, it could not be picked up by the waiting Soviet submarines.
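
Ondrejka didn't give the math, but the reason a stereo pair yields a z coordinate is the standard parallax relation from photogrammetry. A back-of-the-envelope Python sketch, with entirely made-up numbers:

    def height_from_parallax(flying_height_m, base_parallax_mm, differential_parallax_mm):
        """Standard parallax equation: dh = H * dp / (b + dp), where H is the
        flying height, b the parallax of a reference point, and dp the extra
        parallax measured for the feature of interest."""
        return (flying_height_m * differential_parallax_mm /
                (base_parallax_mm + differential_parallax_mm))

    # Illustrative numbers only: a camera 160 km up, a 60 mm photo base, and
    # 0.02 mm of differential parallax put the feature roughly 53 m above
    # the reference surface.
    print(height_from_parallax(160000, 60.0, 0.02))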

Ondrejka talked about the evolution of the project between 1960 and 1972 and held up actual film taken from the satellite at different stages of the project. He proudly told the audience that the spacecraft, cameras, and everything else used in the project were designed and built without computers; the difficult calculations were made with slide rules. The project successfully identified IRBM and ICBM launchpads in the Soviet Union, and the satellites photographed more territory in a day than four years of U2 flights had covered. He credited the pictures with enabling the talks to limit the arms race. He closed his talk with a quote from a 1919 publication that predicted that "the guidebook of the future will be incomplete without various aerial views."

Daniel H. Steinberg is the editor for the new series of Mac Developer titles for the Pragmatic Programmers. He writes feature articles for Apple's ADC web site and is a regular contributor to Mac Devcenter. He has presented at Apple's Worldwide Developer Conference, MacWorld, MacHack and other Mac developer conferences.

