The Great Flood of '93: Research Project Becomes Model for Federal Information Revolution
By J.D. Wilson

In the upper Mississippi Valley basin, recovery continues nearly two years after the Great Flood of 1993. While most people's lives are slowly returning to normal, those responsible for managing and setting policy to deal with future floods are finding that the 100-year flood changed more than the terrain of the river basin - it is changing the way they do their jobs.
      With a response nearly as vast as the flood itself, some 32 members from 12 federal agencies, who together comprised the interagency task force, aggressively collected volumes of information on everything from floodplain hydrology to weather information, and from ecological pressures to socio-economic impacts.
      While this interagency, interdisciplinary task force was formed as a limited-time response to a major natural disaster, it foreshadows how government may ultimately be reorganized: information, not departmental divisions, will be the organizing dynamic, built on shared data and collaboration among diverse professional disciplines with different interests and objectives.
      The speed and detail of the research in the upper Mississippi River basin over the last two years can be directly attributed to the multidisciplinary efforts of the Scientific Assessment and Strategy Team (SAST), a sub-group of the task force, which spearheaded the collection and compilation of information from hundreds of sources and dozens of government agencies and private organizations.
      This information has been compiled into a 240-gigabyte GIS database built to make sense of the disaster and to establish a mechanism for ongoing analysis, monitoring and damage-mitigation activities.
      Consequently, the Great Flood of 1993 - which holds the record as the largest and most destructive hydrometeorological event in modern times - also stands out as the best documented, most carefully studied natural disaster in the history of the world.
      Research and management agencies at the federal, state and local levels are sharing the data in a way and to an extent they have not shared information in the past.
      The SAST project was mandated by the White House, but has a time limit. At the end of this fiscal year - which concludes in September - participants from the various agencies will be reassigned to other projects within the agencies that employ them. After that it will be up to each agency to maintain the ongoing efforts necessary to assure that policy and management affecting the floodplains are applied with the same communication and collaboration that characterized the crisis response.

Carrying the Torch
"It's unlikely that new money will be allocated to continue this effort under this structure. Each participating organization will have to provide the necessary funds for its own part of the activity," explained John A. Kelmelis, Ph.D., chief of the science and applications branch of the USGS and SAST director.
      Kelmelis hopes the existence of the database itself will be encouragement enough to keep the lines of communication open among all the agencies in the basin.
      "It is important that the river basin is managed as a system rather than a series of independent projects," he said. "We need to encourage future management of the basin using an interdisciplinary approach based on sound scientific information - including social and economic sciences as well as the natural sciences and engineering."
      With the success of SAST at collecting, disseminating and evaluating myriad complex and diverse spatial and non-spatial data about the flood and flooded areas, its approach has been held up as a model for orchestrating interagency and interdisciplinary research and operational efforts, according to Kelmelis.
      "Pieces of its approach are being used in hazard assessment, systems analysis and ecosystem management, as well as for data management," he said. "It was hard to see while we were in the thick of it how much we were accomplishing, but when I meet people who have seen our results and are modeling their efforts after what we did, I realize we have made an important impact both within the river basin and in other areas as well."

A Data Management Model
Now this database is on-line, freely accessible via the Internet through the National Geospatial Data Clearinghouse (NGDC) - a distributed, electronic network that connects geospatial data producers, managers and users.
      Kelmelis expects the clearinghouse concept - in which data is managed by the primary user and distribution is coordinated via the World Wide Web (WWW) - will provide the impetus for maintaining and using the data more fully. "The data exists, and groups will continue to develop new data for their needs," Kelmelis explained. "The problem is that once the primary analysis is completed, the data is set on a shelf somewhere and forgotten. When another group needs the same data, they have no way of knowing that it exists, much less where to find it."
      Because the clearinghouse catalogs existing data and coordinates its availability, groups can save significant time and cost: before initiating new data-gathering, they can check the clearinghouse to see whether the data has already been compiled.
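      In practice, the workflow Kelmelis describes amounts to a metadata search before any new collection effort begins. The short Python sketch below illustrates the idea; the catalog URL, query parameters and JSON response format are hypothetical stand-ins, since the article does not describe the clearinghouse's actual interface.

    import json
    from urllib.parse import urlencode
    from urllib.request import urlopen

    # Hypothetical clearinghouse search endpoint (illustrative only).
    CATALOG_URL = "https://clearinghouse.example.gov/search"

    def find_existing_data(theme, bbox):
        """Ask the catalog whether data for a theme and area has already been compiled."""
        query = urlencode({
            "theme": theme,                          # e.g. "land use"
            "bbox": ",".join(str(v) for v in bbox),  # west, south, east, north
        })
        with urlopen(f"{CATALOG_URL}?{query}") as response:
            return json.load(response)               # assume the catalog replies with JSON metadata records

    # Before funding a new survey, see what is already on the shelf.
    records = find_existing_data("land use", (-91.5, 38.5, -90.0, 39.5))
    if records:
        print(f"{len(records)} existing data sets cover this area; review them before collecting anew.")
    else:
        print("No catalog entries found; new data collection may be justified.")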
      Over time, this approach will become standard procedure for all agencies performing geospatial data collection and analysis. As the database network evolves and grows, its value will be compounded. The data-sharing model will serve to: eliminate the cost and time lost to redundant data-gathering efforts; allow groups to share their findings more readily with others for whom the data may be useful; and form a more complete historical picture of the region and the interaction of geomorphology, hydrology, meteorology, engineering solutions and socio-economic activities.

National Data Management Program
As part of the National Spatial Data Infrastructure (NSDI), the clearinghouse allows data users to determine what geospatial data exists, the condition and utility of these data for their purposes and what means are available to access the data.
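      A catalog entry, in other words, has to answer three questions: what data set exists, whether it is fit for the user's purpose, and how to obtain it. A hypothetical record might look like the sketch below; the field names and values are illustrative only and are not drawn from the FGDC metadata standard.

    # Illustrative catalog record; field names are hypothetical, not the FGDC standard.
    record = {
        "title": "Upper Mississippi basin land-cover grid, post-flood 1993",
        "theme": "land cover",
        "extent": {"west": -96.0, "south": 37.0, "east": -88.0, "north": 47.0},   # what exists, and where
        "condition": {"source_scale": "1:100,000", "last_updated": "1994-08"},    # fitness for use
        "access": {"protocol": "ftp", "contact": "data steward at the producing agency"},  # how to get it
    }

    def fit_for_purpose(rec, theme, min_update):
        """Screen a record: right theme, and updated recently enough to be useful."""
        return rec["theme"] == theme and rec["condition"]["last_updated"] >= min_update

    print(fit_for_purpose(record, "land cover", "1993-01"))   # True for this record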
      The NGDC has been in the works for a number of years, but the SAST database is the first comprehensive, distributed database to emerge. As such, it serves as a model of how geospatial data may be organized and shared from now on.
      "The SAST database is an important example of how to structure data management, and what can be achieved," explained Michael Domaratz, a member of the Secretariat to the Federal Geographic Data Committee (FGDC), a steering committee which is charged with developing the processes, procedures and standards for the NSDI.
      "It demonstrates the importance of interagency cooperation," he said. "Unless agencies work together they won't get the value they need out of their data collection and maintenance efforts."
      He explained that more than 90 different agencies use some amount of geospatial data. "The federal government spends nearly $4 billion every year collecting, maintaining and disseminating geospatial data," he said. "This doesn't include state and local expenditures, nor does it account for actually using the data."
      Domaratz concedes there is waste in the system, mainly because of duplication of efforts. "Despite the diversity of uses, there are a lot of common data that can be collected once and shared by many groups," he said. "Data themes like roads, surface and cadastral information, ownership and political boundaries are the same regardless of the purpose of the research."
      He emphasized the need for secondary use of data, and a viable mechanism for cataloging and delivering data. "Lots of agencies collect data as best they can. We hope the NSDI will help agencies better organize and coordinate their data collection efforts," he explained.
      "The SAST is prototyping many of the concepts and building the interfaces that will be used in NSDI," Kelmelis added. "SAST is also meeting with representatives of states, federal agencies and others to help them link to the SAST node of the clearinghouse and to become part of the NSDI."

Cornucopia of Opportunities
Many users in government, academia and private enterprise are enthusiastic about the geospatial data clearinghouse.
      "There is more opportunity for public participation in geospatial issues," declared Roger Mitchell, vice president of business development for Earth Stellite Corp. (EarthSat), Rockville, Md. "One group cannot solve the problem alone. If we can share the information, we can collaborate to find better solutions."
      Mitchell predicts that greater accessibility to data could lead to perceptual changes in how solutions to geospatial issues are addressed and determined. "It makes it possible for anyone to look at an issue and have their input heard," Mitchell said. "I think you'll see a lot of innovative solutions expressed from otherwise unlikely sources."
      Mitchell sees many untapped opportunities for private enterprise to take advantage of this widespread availability of federal data. For example, with a grant from NASA's Earth Observations Commercial Applications Program (EOCAP), EarthSat repackaged weather, floodplain and land-use information into a value-added product for predicting the precise locations and amounts of flooding. Its "Floodwatch" and "Floodcast" products help pinpoint high-risk areas and perform what-if scenarios in preparation for impending flooding.
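      Conceptually, such a product overlays several of the clearinghouse's data themes on a common grid and scores each cell. The toy sketch below shows that kind of overlay; the fields, weights and scores are invented for illustration and do not represent EarthSat's actual Floodwatch or Floodcast methods.

    # Toy overlay of floodplain, land-use and forecast-rainfall layers per grid cell.
    # All values and weights are invented for illustration.
    cells = [
        {"id": "A1", "in_floodplain": True,  "land_use": "residential", "forecast_rain_in": 4.0},
        {"id": "A2", "in_floodplain": True,  "land_use": "cropland",    "forecast_rain_in": 4.0},
        {"id": "B1", "in_floodplain": False, "land_use": "residential", "forecast_rain_in": 2.5},
    ]

    LAND_USE_WEIGHT = {"residential": 3.0, "commercial": 3.0, "cropland": 2.0, "forest": 1.0}

    def risk_score(cell):
        """Weighted sum: floodplain membership, land-use sensitivity and forecast rain."""
        score = 5.0 if cell["in_floodplain"] else 0.0
        score += LAND_USE_WEIGHT.get(cell["land_use"], 1.0)
        score += cell["forecast_rain_in"]          # heavier forecast rain raises the score
        return score

    # Rank cells so the highest-risk locations surface first in a what-if run.
    for cell in sorted(cells, key=risk_score, reverse=True):
        print(cell["id"], risk_score(cell))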
      For educators, having access to a complete GIS database, like SAST's, has added a new dimension to their GIS curricula. "Teaching students about GIS has been difficult because there are so few real-world examples on which students can experiment," said Richard B. Newton, assistant director of the Office of Geographic Information and Analysis at the University of Massachusetts in Amherst. "This is a complete case study, with a clear purpose and a historical, real-world reference that students can relate to and understand."
      "Universities are using the SAST report as a textbook," he added. "Students can study the findings and recommendations of the task force and then access the database and see for themselves how the conclusions were drawn. They can even run original analyses of their own."
     Like so many, Newton is excited about the project and what it means to researchers. "GIS really is coming into its own," he said. "This is a complete leap forward in data availability, especially with the idea of getting it through the Web."

Danger of Losing Momentum
Newton also shares a few common concerns. The success of this data-gathering process was crisis-driven. The task force was formed by executive order, directly from the White House, and motivated by the magnitude of the floods, the severity of the human tragedy and the intense need to solve serious problems quickly.
      Now that the crisis is past and the driving energy of the task force will soon be diverted to other issues, will the database continue to thrive, or will it become static and outdated?
      Limited resources and the need to change work-flow processes within the government will be the biggest obstacles. With or without project-specific funding, it is still difficult for government to get the right resources to the right place at the right time and focus on the right objectives.
      "These are the problems that are still unresolved," Domaratz agreed. "Data continues to be compiled every day. Each new research project adds value that needs to be added to the data network. The database needs to be maintained and constantly updated."
      "Ultimately, we're reinventing workflows and reorganizing government," Domaratz concluded. "The activities and successes of the SAST project give us glimpses into how it is possible for individuals to work together across agencies and across disciplines.
      "This is the awakening stage for government," he added. "Technology is changing so rapidly, we can't predict where it will go."

About the Author:
J.D. Wilson is a freelance writer in Denver, Colo., specializing in the GeoTechnologies. He can be reached at 303-751-7636.
