Remote Sensing in the Mainstream: Warm and Sunny with Improving Conditions By Rick Pendergrass The functional analogy is quite apt: a stage setting complete with props, drapes, lighting, actors, a director, a script with plot and, of course, an audience. But perhaps Shakespeare's metaphor that "all the world's a stage" is even more apt, since in this case the world is literally the stage, though the actors are clouds, rain, lightning, floods, earthquakes, fires and other natural and manmade phenomena. Incorporating remote sensing imagery, digital elevation model (DEM) data, three-dimensional perspective rendering software and frame-sequential animation, the world now is presented as a stage daily to millions of television viewers in the U.S., Latin America, Japan and several European countries. The actors they watch and the plots they follow in Tokyo, San Francisco, Frankfurt, Boston and Spokane are today's weather - and tomorrow's weather. And the stage is set by remote sensing; after more than 20 years, the technology is making its way into the homes of business executives, cab drivers, farmers, teachers, grocery clerks and sushi chefs - right where the remote sensing community has been trying to place it all these years. Weathernews Inc., Media Division, based in Sunnyvale, Calif., has been setting the stage and broadcasting this drama daily since the spring of 1994 to a growing list of television stations and cable companies worldwide. The company has developed its suite of software capabilities into two systems called WeatherVision and WorldView. The result is a daily production schedule of broadcast weather graphics that is transmitted by satellite to client stations. The newer WorldView system includes broader capabilities for animating events beyond weather, such as natural disasters, wars and sporting events. Each station specifies the animation "flight path" it wants to show on its evening (or morning) weather program.
Weathernews meteorologists fill in the forecast graphics using the "actors," or render in the actual or "analysis" clouds of the previous 12 hours to show what weather has occurred during that time. In either case, the weather, which includes not only clouds but rain, snow, sleet, lightning or fog, can be rendered into the imagery in the three-dimensional space above the Earth's surface. The key to the presentation is the stage setting, and that is where remote sensing takes center stage, so to speak. To date WNI has incorporated a variety of remote sensing imagery for different clients, including Landsat thematic mapper (TM); the Japanese Marine Observation Satellite (MOS); the U.S. Advanced Very High Resolution Radiometer (AVHRR); and even aerial photography to create the stage settings. The system can accept imagery from any sensor, though the clients' requirements for the most realistic views possible tend to limit the imagery to visible-band (RGB) multispectral data. For that reason, TM data is most appropriate for local "flight paths" because of its visible bands (3-2-1), reasonably high resolution at 30 meters, and optimal scene coverage at 170 x 185 km. For regional "flight paths," the lower resolution AVHRR (at either 1 or 4 km) is used to present larger surface areas. Realistic colors are achieved through manipulation in the IHS color space (as they are in the case of MOS data). The base technology for producing the 3-D animation is not unique to Weathernews. Three-dimensional perspective views of satellite and aerial imagery have been around for some time, and the now legendary "L.A. the Movie," an animation featuring 3-D rendered remote sensing imagery, was produced in the 1980s. What is unique to the WeatherVision service is the implementation of the technology and data.
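The IHS manipulation mentioned above can be sketched in miniature. The snippet below is illustrative only: it uses Python's standard colorsys module and the closely related HSV space as a stand-in for IHS, and the gain values are invented assumptions, not Weathernews parameters.

```python
import colorsys

def enhance_natural_color(rgb, sat_gain=1.3, val_gain=1.1):
    """Boost the saturation and brightness of one RGB pixel (0-1 floats)
    in HSV space, a close cousin of the IHS space described above.
    Hue is left untouched so colors stay natural; gains are illustrative."""
    h, s, v = colorsys.rgb_to_hsv(*rgb)
    s = min(1.0, s * sat_gain)   # richer, less washed-out color
    v = min(1.0, v * val_gain)   # slightly brighter
    return colorsys.hsv_to_rgb(h, s, v)

# Example: a hazy, washed-out vegetation pixel becomes a richer green.
pixel = (0.35, 0.45, 0.30)
print(enhance_natural_color(pixel))
```

Operating on hue, saturation and intensity separately is what lets an operator strengthen color without shifting it, which is why IHS-style spaces are preferred over raw RGB gains for this kind of cosmetic work.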
Each day, WNI renders custom animated flights for dozens of clients based on that day's weather, then broadcasts the finished weather graphics to the clients in time for the evening weather show. What's more, the rendered animations are designed to look as realistic as possible, from the Earth's surface to the clouds and weather "actors" themselves, giving the appearance of a film shot from the cockpit of an aircraft actually flying through the weather and looking down at San Francisco Bay, the plains around Amarillo, Texas, the slopes of Mt. Fuji or the Rhein River Valley. To achieve that level of realism on its own, a television station would have to invest hundreds of thousands of dollars in hardware and software, as well as hire personnel experienced in remote sensing and computer graphics. Subscribing to the service costs a fraction of that. Again, the key to both the technological implementation and the successful business model of the service is the realism of the imagery. As the viewing public has become more sophisticated over the years, it has become more demanding of realism. In the age of "I Love Lucy" and "The Honeymooners," television as a live medium did use stage settings in front of a live audience. In the 1970s and 1980s, however, viewing audiences began to demand more glitter, gore and special effects. That trend has evolved into what has become today's typical fare of CNN, "COPS," the O.J. Simpson trial and live coverage of the shelling of Baghdad. Paralleling this evolution, the evening weather has gone from hand-scribbled warm fronts and cold fronts to colorful, semi-animated weather maps (with little suns in dark glasses and smiles dotted across the map) to "film loops" from the NOAA weather satellite showing clouds moving across the continent. The next logical step, therefore, is to show the real clouds as they appeared over the past few hours, tracking in 3-D space over a natural color landscape in 3-D relief.
Voila, now we have the evening weather's version of '90s realism. Michael Espinoza, executive news director of television station KXLY in Spokane, Wash., is one of WNI's earliest clients. The station uses both the regional (AVHRR) and local (TM) image models for weather forecasting and for showing the actual weather from the previous day. "The advantage to our viewers, and therefore for us, is the feeling of personalization one receives from the local flights," said Espinoza, speaking of the TM coverage. "The viewer sees a bird's eye view of our city and can place himself right in that spot in the view. "Also, the 3-D flights give the weather itself the realistic visualization that flat weather maps and graphics simply can't show. Even when there isn't anything particularly dynamic in the day's weather, we sometimes use the daily flights to help illustrate news stories as well, so the service has that added benefit," he said. Though some stations have begun the service with in-depth stories about the technology used to create the daily flights, KXLY elected simply to bring it on the air, introducing WeatherVision as the latest and greatest in TV animation technology without going into the "hows" behind it. "What we've learned over the years is that viewers are more interested in the end result and its quality than in how you get there," Espinoza said. "And we have gotten nothing but positive response from the public." If you're splitting hairs, what WNI broadcasts isn't actually real. It isn't virtual reality in the vernacular sense, because virtual reality most often is used to illustrate what isn't otherwise truly viewable: a structure still in the planning stage, the inside of a molecule, "cyberspace." No, the weather animation created with WeatherVision is more like real virtuality: a computer compilation of real imagery that would be viewable if we could be in exactly the right spot at exactly the right time, but aren't.
Here's how an animation sequence is created:

1. The location determines the imagery to be used. Let's use the San Francisco Bay area as an example. First, the area coverage required is relatively broad, covering three major cities and outlying areas across an area about 200 miles (320 km) square. The weather in the Bay Area, as everywhere on Earth, is affected by conditions much farther away than that, meaning we need to show multiple levels of detail (resolution), starting at a high altitude and flying in close to the cities of San Francisco, San Jose or Oakland.

2. For the lower resolution, high altitude view, the AVHRR 1 km data is used to view the entire West Coast, plus a considerable portion of the Eastern Pacific Ocean. If the flight path is to start by viewing the entire globe, the 4 km resolution is used.

3. For the high resolution view of the Bay Area, Landsat TM scenes are selected. To cover the area of interest defined above, three scenes are required. The TM scenes must be mosaicked, color matched and enhanced for realistic colors and detail sharpness. (Viewers want to see clearly the local highways, airports, city centers, sports facilities and other prominent features.)

4. The resulting mosaic must be accurately georeferenced, since other image data, in this example AVHRR, will coexist in the WeatherVision rendering suite as the flight path drops altitude from one level of detail (LOD) to the next. Also, the true cloud data received from the GOES satellite is georeferenced. All flight paths rendered in WeatherVision use a lat/long reference grid for point of view, center of view and field of view.

5. The DEM data, no matter the source or the resolution, must also be accurately georeferenced and be in the same equirectangular projection. Otherwise, of course, elevations will be mismatched, lakes and oceans will drape the sides of cliffs, rivers will run uphill and canyons will become hills.

6. The DEM and RS data sets then become the input to WeatherVision's tiling process, creating a gridded array of polygon tiles containing both the color and elevation information. The tiles become the input to the rendering process, which uses only those tiles in the field of view to render each frame of the animation. In this way, the 3-D imagery can be created at very high resolutions and large area coverage without overpowering the computer system or storage capacity available. The tiling protocol also allows for seamless addition of adjacent coverage in two- or three-dimensional space at multiple levels of resolution.

7. Rendering an animated perspective sequence is a frame-by-frame procedure, the speed varying by the complexity of the scene, the area of the image in view (field of view), the extent of weather (cloud coverage) and the number and complexity of actors (rain, lightning, annotation or other objects in the scene). Using the example above, a low altitude flight through the middle of the San Francisco Bay area with no weather or actors takes about 2-4 seconds per frame to render on a Silicon Graphics Onyx, or about 4-6 seconds on an SGI Indigo. Add cloud cover, rain actors and annotation, and the rendering time increases somewhat, depending on the complexity of the clouds and actors.

The flight path itself and the view the audience sees along the way are interactively created using WeatherVision's GUI tools. A basemap of the scene - a subsampled version of the remote sensing image used in creating the perspective - is displayed, and the operator draws the flight path with a cursor, then draws the path of the center of view. The operator enters values for flight altitude and field of view along the path. Both paths, as well as FOV, altitude and aircraft roll, are constrained by a spline function to prevent the animation from becoming too jerky or dizzying.
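The spline constraint on the flight path can be illustrated with a small sketch. The article does not name the spline the system uses, so the Catmull-Rom formulation below is an assumption; it simply shows how a few hand-drawn (lon, lat, altitude) waypoints become a smooth, densely sampled camera path that passes through every control point.

```python
def catmull_rom(p0, p1, p2, p3, t):
    """One Catmull-Rom segment: interpolates between p1 and p2 at t in [0,1],
    using p0 and p3 to set the tangents so adjacent segments join smoothly."""
    t2, t3 = t * t, t * t * t
    return tuple(
        0.5 * ((2 * b) + (-a + c) * t
               + (2 * a - 5 * b + 4 * c - d) * t2
               + (-a + 3 * b - 3 * c + d) * t3)
        for a, b, c, d in zip(p0, p1, p2, p3))

def smooth_path(waypoints, frames_per_segment=10):
    """Resample hand-drawn (lon, lat, altitude) waypoints into a dense,
    smooth path so the rendered flight has no sudden jerks between frames."""
    pts = [waypoints[0]] + list(waypoints) + [waypoints[-1]]  # pad the ends
    path = []
    for i in range(1, len(pts) - 2):
        for f in range(frames_per_segment):
            path.append(catmull_rom(pts[i - 1], pts[i], pts[i + 1], pts[i + 2],
                                    f / frames_per_segment))
    path.append(waypoints[-1])
    return path
```

The same smoothing would apply equally to the center-of-view path, FOV and roll values mentioned above, each treated as one more interpolated coordinate.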
At this point, the operator imports or creates the desired weather effects, also using the WeatherVision GUI. "Analysis" cloud cover consists of real imagery from one of the meteorological satellites (GOES, METEOSAT or GMS), and is imported and converted into WeatherVision as 3-D data suspended in the atmosphere above the rendered topography. The operator can select cloud altitude and height. The operator also can create appropriate lighting by placing the sun angle for the scene at the desired time of day, or have the sun traverse the sky for an animation sequence that simulates the passage of a full day from morning to dusk. As the sun moves, shadows cast by topography as well as actors track the sun's position accurately. Forecast weather can include cloud coverage as well as the rain, snow and other actors, all of which are drawn using the cursor on the basemap display. WeatherVision allows the operator to create multiple cloud types and altitudes and to place multiple weather actors in different parts of the image - showing snow falling in higher elevations, for example, with rain at sea level, pockets of fog in valleys, scattered lightning, etc. The sequential frame animation gives these actors the appearance of motion, and even the severity of rain, snow and other events can be controlled. Since the actors used in the development of weather graphics are arbitrary graphical objects, Weathernews has begun to demonstrate the capability in other application areas, particularly the news media. Just in the past three months, the following realistic actors have been developed and demonstrated: oil spills, which link a spill on the ocean to wind and water currents as the animation progresses; and forest fires, complete with grey, black or other colored smoke plumes. In the demonstration animation, WNI even had tanker planes fly through the scene, drop chemicals and douse the flames.
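The sun-traversal lighting described above can be approximated with a toy model. The half-sine elevation curve and fixed sunrise/sunset hours below are invented simplifications, not the ephemeris calculation a production renderer would use; the point is only that shadow geometry follows from sun elevation by simple trigonometry.

```python
import math

def sun_elevation(hour, sunrise=6.0, sunset=18.0, max_elev=60.0):
    """Toy solar model: elevation in degrees rises from 0 at sunrise to
    max_elev at midday and back to 0 at sunset, following a half-sine.
    All parameters are illustrative assumptions."""
    if not sunrise <= hour <= sunset:
        return 0.0
    frac = (hour - sunrise) / (sunset - sunrise)
    return max_elev * math.sin(math.pi * frac)

def shadow_length(obj_height, hour):
    """Length of the shadow cast by an object of the given height;
    low sun angles stretch shadows, exactly as in the rendered scenes."""
    elev = sun_elevation(hour)
    if elev <= 0:
        return float('inf')  # no direct sun, no defined shadow
    return obj_height / math.tan(math.radians(elev))
```

Sweeping `hour` across a frame sequence is what produces the morning-to-dusk effect: each frame re-evaluates the sun position and stretches or shortens every shadow accordingly.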
Floods, which exploit the accuracy of the DEM elevations to show where rising waters will accumulate and where new, temporary shorelines will occur. This can be used to show unusually high tides, lake overflows or widespread lowland flooding. Earthquakes can be illustrated using the special effects capabilities, though a true "image" of an earthquake has yet to be envisioned; WeatherVision creates a local dynamic ripple effect in the image to indicate the quake activity. Because the final imagery is being used as a "stage" setting and not as traditional remote sensing analysis data - Weathernews Media Division to date has not been required to do any thematic mapping, calculate crop estimates or look for oil - WNI is in a position to take a certain amount of "artistic license" with the remote sensing data. One objective in creating the stage is zero cloud cover, a gremlin with which all remote sensing professionals are well acquainted. The daily weather service must show the cloud cover for that date, so it wouldn't do to have the same clouds in the basemap scene day after day. To achieve cloud-free (and haze-free) images, it is sometimes necessary to "paint" out the clouds or other artifacts in an image. In traditional remote sensing procedures, the near infrared bands can be used to reduce the effects of haze, but the visible bands required for the natural color look of the Weathernews service are highly susceptible to atmospheric distortion. Other cosmetic touches include painting all water bodies a single, consistent color, lightening dark forested areas, darkening overilluminated desert regions, selectively sharpening some urban areas and major highways, and even "building" bridges, sports facilities and other major features which may have been constructed in the real world since the date of the Landsat image with which one is working.
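The flood actor described above amounts to a fill over the DEM: water rising to a given level spreads from a seed cell into every connected cell below that level, and the boundary of the filled region is the new, temporary shoreline. A minimal breadth-first sketch (the grid, seed and water level are invented for illustration):

```python
from collections import deque

def flood_extent(dem, seed, water_level):
    """Cells flooded when water rises to water_level from the seed cell:
    a breadth-first fill over the DEM grid, spreading only into
    4-connected neighbors whose elevation lies below the water level."""
    rows, cols = len(dem), len(dem[0])
    flooded, queue = set(), deque([seed])
    while queue:
        r, c = queue.popleft()
        if (r, c) in flooded or not (0 <= r < rows and 0 <= c < cols):
            continue
        if dem[r][c] >= water_level:
            continue  # dry land: this edge becomes the temporary shoreline
        flooded.add((r, c))
        queue.extend([(r + 1, c), (r - 1, c), (r, c + 1), (r, c - 1)])
    return flooded

# A small valley: the basin floods, but the ridge at elevation 5 holds,
# and the low cell at (2, 3) stays dry because it is not connected.
dem = [[5, 5, 5, 5],
       [5, 1, 2, 5],
       [5, 1, 5, 1],
       [5, 5, 5, 5]]
print(sorted(flood_extent(dem, (1, 1), 3)))
```

This is also why the DEM's accuracy matters so much for the flood actor: any elevation error moves the computed shoreline.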
In addition, it is possible to alter the global colors in an image to adjust for seasonal changes, to "create" blackened areas following forest fires, or to change the appearance of croplands following significant flooding. These color enhancements and image touch-up tasks are performed using Adobe Photoshop, a software package designed primarily for the pre-press industry but quite effective in retouching blemished remote sensing imagery. Since realism is the ultimate objective, great care must be taken to "paint" obliterated areas with patterns of adjacent land cover, using accurate maps and other resources, including the DEM data and even aerial photography, when available, as references. The use of remote sensing in such a broad-based, mainstream industry will go far to demonstrate to the public the pragmatic usefulness of remote sensing technology, even if it doesn't extend into the more esoteric areas of traditional remote sensing - classification, thematic mapping and scientific analysis. Use of satellite imagery as an information vehicle will expand rapidly as this first daily production application proves itself valuable. Remote sensing imagery, rendered into three-dimensional dynamic perspectives, will be used to enhance and illustrate news other than weather, and will be used more and more in the entertainment industry, providing backgrounds and special effects for television programs and motion pictures. This experience also will go far to help introduce remote sensing to the educational community, providing a more mainstream entry for geographical education in the K-12 public schools, either through the television networks and cable stations - including public TV - or through multimedia packages offered on CD-ROM or over the Internet. Again, the quality of realism will be essential to the success of this technique in generating and keeping an interest level among grade school children.
Weathernews Inc., with operations in 12 countries, is the largest private provider of weather information in the world, employing more than 550 professionals in the fields of meteorology, navigation, remote sensing, graphics and mapping. Weathernews currently provides weather broadcasting services to a large portion of the media market in Asia, Europe and North America. WNI also operates and broadcasts a 24-hour weather channel in Japan. About the Author: Rick Pendergrass is a remote sensing specialist at Weathernews Inc. in Sunnyvale, Calif. He can be reached at 408-522-8352.