Floodplain Management Program Spawns Massive Data Collection Network
By J.D. Wilson
Nearly two years after the
waters subsided, the clean-up is still in progress
throughout the upper Mississippi River Basin. The nation's
eyes have turned to other events, and in the heartland life
is slowly - but never completely - getting back to normal.
The Great Midwest Flood of
1993 entered the record books as the largest and most
destructive hydrometeorological event in modern times. It
set records for amounts of precipitation, upland runoff,
river levels, flood duration, area of flooding and
economic loss.
To make sense of this
mammoth disaster, the multi-department, cross-discipline
Interagency Floodplain Management Task Force (IFMTF)
began collecting information from as many sources as
possible. The task force gathered meteorological, biological,
environmental, geological, critical infrastructure, social
and economic information that will serve as the baseline
data for future floodplain management plans and disaster
responses.
Herculean Task
Measuring and understanding the multiplicity of variables
that conspired to create a disaster of the magnitude of a
100-year flood is something akin to trying to count the
hairs on your head. After all, the upper Mississippi River
Basin covers nearly 714,000 square miles in 13 states -
and nearly 12 million acres were under water at the
flood's peak.
The IFMTF's first step was
to gather the volumes of data on the event and compile
them into a usable form - everything from floodplain
boundaries and wetlands, to soil types and flow rates, to
climate patterns, precipitation and temperatures, to
buildings, businesses and farms, even the location and
nature of toxic wastes.
The team gathered data from
approximately 20 federal agencies, 13 state governments
and hundreds of regional, county, and local government
entities, as well as insurance companies, banks and other
private organizations (see "Creating a Super-GIS," page 33).
Earth Satellite Corp. (EarthSat)
was among the private GeoTechnology companies enlisted to
help compile the data. "This was a major effort for
our organization," said Roger Mitchell, vice
president of business development for EarthSat. "We
ran three shifts, nights and weekends, to meet critical
deadlines."
Mitchell said EarthSat
produced nearly 130 gigabytes of raster and vector data in
just 55 days, including a 200-square-mile digital mosaic
of the basin using Landsat images from before, during and
after the flood. The firm also digitized information on
floodplain and wetland boundaries, flooding extent and
high water marks.
Tight deadlines
compounded the problems common to any large data
collection and synthesis project.
"One of our biggest challenges was locating and
gaining access to special data sets," according to
John A. Kelmelis, Ph.D., chief of the USGS Science and
Applications Branch and head of IFMTF's Scientific
Assessment and Strategy Team (SAST). "We didn't want
to go out and create new data where usable information
already existed."
SAST found that some data
vitally important to making informed decisions on the
floodplain were not available or were not in a usable
form or level of quality. When data were available, the
data sets were often incompatible and, in the absence of
widely accepted standard transfer formats, migration was
complex and tedious.
Kelmelis said that
organizations were generally cooperative and many went far
beyond what was asked of them, because they understood the
future value of the database.
Accurate Data, Sound Analysis
As data became available, scientists began conducting
their analyses. The task force performed more than 100
different studies and analyses, Kelmelis said.
Some of the studies were
relatively simple, like analyzing flood insurance activity
and responsiveness patterns, according to Kelmelis. Other
studies were far more complex, like developing a model to
determine the pre-flood land use and land cover in the
flooded areas and identifying sedimentation and scour
patterns. Other notable studies included:
• Hydraulic models of the uplands and floodplains.
• Literature reviews for relevant information on
hydrology, wetlands, geology, ecology, economics and other
topics.
• Field studies on energy flows in selected sites on the
floodplain.
• Analysis of where and why levees broke.
• Locations of, and impacts on, toxic sites that were
inundated by the flood.
As a result of the SAST
project, scientists and government decision-makers know
more about the '93 Midwest flood than about any other
natural disaster in history. The combined data and analyses
reveal its causes, its energy and destructive force, and
its impact on human lives and commerce.
Moreover, the studies make
major strides in identifying how human actions within the
water system and our attempts to prevent flood damage
affect the nature and results of a flood, both positively
and negatively.
"The most important
thing we learned is that we must treat the whole river as
a system, instead of a series of independent projects and
activities," Kelmelis explained. "We must
recognize that every action we make changes the nature of
the risk," Kelmelis admonished. "It's not enough
to know that a particular land treatment works for a
certain area. We must ask if this is the best treatment
for this area and how it will affect other areas."
A New Approach
Consequently, the task force is attempting to change an
approach that has persisted for as long as people have
lived in the river basin.
Its report, presented to
the White House this spring, suggests a far-reaching
approach to floodplain management which "seeks to
balance competing land uses in a way that maximizes the
net benefits to society."
The report recommends
redefining federal-state-local relationships and
responsibilities and identifying ways to improve
coordination and efficiency of planning and management
activities at all levels. At a time when federal purse
strings are tightening and budget cuts are a virtual
certainty, Kelmelis emphasizes that multi-level government
collaboration is more important than ever.
"The resources we have
to work with are limited and shrinking," he
cautioned, "but the problems we face will only get
worse, if we don't act decisively."
The report proposes a
combination of strategic and operational goals:
• Reduce the vulnerability of the nation to the dangers
and damages that result from floods.
• Preserve and enhance the natural resources and
functions of floodplains.
• Streamline the floodplain management process.
• Capitalize on technology to provide information
required to manage the floodplain.
This fourth goal, to make
full use of information technology, may prove to have the
most far-reaching impact of all of SAST's actions.
Information as a Hedge Against Uncertainty
The SAST's first full year of existence was devoted
primarily to gathering and processing the available
information upon which its analyses and subsequent
conclusions and recommendations were based. These data now
form the basis for a computerized regional geographic
information system (GIS) of the entire upper Mississippi
River Basin.
This 240-gigabyte
"super-GIS" contains both spatial and aspatial
(textual, or descriptive) information, organized in more
than 100 separate data sets, all integrated through an
ARC/Info server that can be accessed over the Internet. The
GIS enables scientists from remote locations throughout the
U.S. to access, download and use data for their own
research.
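The pairing the article describes, spatial features integrated with textual, descriptive records, comes down to joining two kinds of data on a shared identifier. The short Python sketch below illustrates the idea with hypothetical feature IDs and attribute values; it is not drawn from the actual SAST data model or its ARC/Info server.

# A minimal sketch of the spatial/aspatial pairing described above:
# each spatial feature (here, a levee reach with a coordinate pair) is
# joined to a descriptive record by a shared identifier. All IDs and
# values are hypothetical, invented for illustration only.

# Spatial records: feature ID plus simplified geometry
spatial_features = [
    {"id": "LEVEE-014", "geometry": (-91.23, 40.01)},
    {"id": "LEVEE-027", "geometry": (-90.87, 39.64)},
]

# Aspatial records: the same IDs carrying textual, descriptive attributes
attributes = {
    "LEVEE-014": {"owner": "local district", "status": "breached 1993"},
    "LEVEE-027": {"owner": "federal", "status": "overtopped 1993"},
}

def join_layers(features, attrs):
    """Attach descriptive attributes to each spatial feature by ID."""
    joined = []
    for feature in features:
        record = dict(feature)                       # copy the spatial part
        record.update(attrs.get(feature["id"], {}))  # merge the aspatial part
        joined.append(record)
    return joined

for row in join_layers(spatial_features, attributes):
    print(row)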
Historically, data of this
kind has been collected for a single purpose, used once
and then discarded. "Most agencies develop data for a
specific need and then put it on a shelf until someone
remembers it and knows where to find it," said
Charles Trautwein, project manager for the SAST database
at the EROS Data Center in Sioux Falls, S.D. "There
is a real need for integrating and making available data
that exists for anyone who might need it."
But if compiling the data
and creating the original database was a Herculean task,
ongoing management must seem a job fit for the gods.
Housed at the EROS Data Center in Sioux Falls, S.D., the SAST
database is being established as a clearinghouse on the
Internet. It will serve as a prototype for the Federal
Geographic Data Committee (FGDC) and will help promote the
National Spatial Data Infrastructure (NSDI).
The distributed
clearinghouse model returns responsibility for updates and
maintenance of data to the various departments that
originally collected or most commonly use a particular
data set. Consequently, federal, state, local and private
groups all manage various parts of the overall database.
Despite its apparent
complexity and logistical obstacles, the clearinghouse
approach actually provides the best potential for
long-term success. It does not require any new
departments, agencies or levels of government and, once
the initial development is complete, it will require
virtually no additional expenditure to maintain.
"Each agency needs to
retain responsibility for its own data," Trautwein
said, noting that if the data were kept centralized, it
would not be updated regularly and would lose its value
over time. If agencies retain ownership of their data, and
continue to maintain it based on their use, the database
will remain current, dynamic and much more useful.
To that end, Trautwein and
a team of 20 computer professionals at EROS are working to
develop standards, protocols, query and retrieval tools,
and documentation for the distributed database approach.
He explained that the
clearinghouse model relies on two key elements. The first
part is the metadata - a database of directories, query
tools, descriptions and instructions which enables users
to find out what data sets are available, what they
contain and where they can be found. The second element
incorporates dynamic links that provide direct access to
the individual databases.
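In code, those two elements reduce to a searchable catalog of metadata records and, inside each record, a link back to the server that actually holds the data. The Python sketch below illustrates that division with hypothetical catalog entries and URLs; it is not the actual SAST metadata schema or its query tools.

# A minimal sketch of the two-part clearinghouse idea: a metadata catalog
# users can search, plus a link in each record pointing back to the
# custodian's own server. Entries and URLs are hypothetical illustrations.

metadata_catalog = [
    {
        "title": "1993 flood extent, upper Mississippi",
        "keywords": {"flood", "extent", "hydrology"},
        "custodian": "hypothetical federal agency",
        "access_url": "ftp://data.example.gov/flood/extent_1993.e00",
    },
    {
        "title": "Floodplain soil types",
        "keywords": {"soils", "floodplain"},
        "custodian": "hypothetical state agency",
        "access_url": "ftp://gis.example-state.gov/soils/floodplain.e00",
    },
]

def find_data_sets(catalog, term):
    """First element: query the metadata to learn what exists and where."""
    term = term.lower()
    return [rec for rec in catalog
            if term in rec["title"].lower() or term in rec["keywords"]]

def resolve_link(record):
    """Second element: the dynamic link back to the custodian's server."""
    return record["access_url"]

for rec in find_data_sets(metadata_catalog, "flood"):
    print(rec["title"], "->", resolve_link(rec))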
Using the capabilities of
the Internet, Trautwein explained, providing access is
relatively simple. "It's just a matter of adapting
existing tools - putting them together in the right way.
The most important thing now is the documentation,"
he said, explaining that each agency is preparing on-line
documentation for its data sets.
A Database for All Reasons
By the end of the year, Trautwein expects the distributed
clearinghouse model to be operational. It will be a
seamless database from which users can extract data by
quadrangle, county boundary, watershed or whatever
definition their research dictates.
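Extraction by a user-defined area is easy to picture in miniature. The Python sketch below filters hypothetical high-water-mark records against a simple longitude/latitude bounding box; the real SAST tools, by contrast, work against full quadrangle, county and watershed boundaries.

# A minimal sketch of extraction by a user-defined area, reduced here to
# a rectangular longitude/latitude box. The sample records and the box
# are hypothetical, invented for illustration only.

records = [
    {"site": "HWM-101", "lon": -90.20, "lat": 38.63, "stage_ft": 49.6},
    {"site": "HWM-102", "lon": -91.40, "lat": 40.38, "stage_ft": 27.1},
    {"site": "HWM-103", "lon": -93.10, "lat": 41.60, "stage_ft": 22.4},
]

def extract(recs, min_lon, min_lat, max_lon, max_lat):
    """Return only the records whose coordinates fall inside the box."""
    return [r for r in recs
            if min_lon <= r["lon"] <= max_lon
            and min_lat <= r["lat"] <= max_lat]

# Example: a box roughly covering a single river reach
for r in extract(records, -91.0, 38.0, -89.5, 39.5):
    print(r["site"], r["stage_ft"])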
Already users are lining up
to access the data, even though only a sampling of the
data is on-line, pending documentation. Furthermore,
potential uses go beyond floodplain management, although
scientists involved in the flood analysis will continue to
be the major users.
EarthSat's Mitchell, for
example, said his firm is already using SAST data for
other applications, in addition to its floodplain work.
"We're using some of the data to analyze the
pesticides, farmland run-off and the effect on endangered
species," he said.
"Producing a database
of this magnitude and quality has never been done before.
The fact that it's there means people will use it,"
he said.
Between July and December
of last year - the first six months the database was on
the Internet, and while only a limited portion of the data
was accessible - some 2,500 individuals in 266
organizations accessed the SAST database.
"And the database has
not yet been formally announced or advertised,"
Trautwein said.
Of course, with full
implementation just months away, there is still much work
to be done, and some obstacles will persist for years.
Different groups that share data have different hardware
and software systems, different requirements for data
quality and detail, and different applications for their
own analyses and uses of the information.
Many of the problems
that arose in compiling the first database will recur with
each successive update. Implementation of final data
standards is years away. Spatial incompatibilities will
continue to thwart efforts for consistent analyses.
Different computer systems still cannot share data as
easily and seamlessly as users expect.
Nevertheless, the SAST
database is pioneering new territory in distributed
spatial data that will ultimately overcome one of the
biggest problems facing the GeoTechnologies industry -
access to consistent, detailed geotechnical information.
EarthSat's Mitchell
attributes a dynamic, almost living character to the SAST
experiment. "Its own progressive evolution and growth
will increase its utility over time," he said.
"The fact that it's available and easy to use will
keep it alive."
For SAST team leader
Kelmelis, such statements from outside parties are an
affirmation of the team's concept and recommendations. The
very dynamic nature of the data will help drive the kind
of communication, cooperation and sound decision-making
that will help reduce the floodplain risks the upper
Mississippi River Basin still faces. As for the rest of
the applications, they are a bonus.
About the author:
J.D. Wilson is a freelance writer in Denver,
Colo., specializing in the GeoTechnologies. He may be
reached at 303-751-7636.