INDUSTRY INSIDER
Viewing Geospatial Database Acquisition as an Investment
By Kas Ebrahim

The proliferation of Geographic Information Systems (GIS) into a wide variety of applications has resulted in the generation of numerous geospatial databases, some of them very large. Given that more than 70 percent of an estimated $2 billion North American GIS market is spent on data acquisition, millions of dollars are pouring into geospatial database generation. However, due to a lack of standards, it is highly unlikely that existing geospatial databases are interoperable. Moreover, due to insufficient funds, GIS buyers often look for ways to cut corners. One of the most common ways to reduce the cost of a GIS implementation is to relax map accuracy requirements and to omit budgeting for proper database maintenance. It is not uncommon for organizations to realize that, by being overly cost conscious, they now have systems that are only marginally functional.

This lack of standards, and the prevalence of cost consciousness in the development of geospatial databases, results in poor quality with little or no interoperability. The impact of poor quality on a geospatial database is self-evident, but the impact of a lack of interoperability deserves emphasis. Lack of interoperability between geospatial databases, which are an integral component of any GIS, is the greatest stumbling block to reaching the GIS network effect. Here, the network effect can be defined as the ability to join geospatial databases so that, like the pieces of a jigsaw puzzle, they cover a larger area. For instance, databases of cities can be joined to cover a county, and counties joined to cover a region or even an entire state. The broader coverage enriches the functionality of the geographically smaller GIS while making the geographically larger systems economically feasible.
In other words, a network-effect GIS would gain economies of scope as well as economies of scale.

A closer look reveals that the lack of standardization and the motivation to minimize the data acquisition component of a GIS implementation are closely linked. This lack of standardization is not due to an absence of standards. Organizations such as the American Society for Photogrammetry and Remote Sensing (ASPRS) on the mapping side, and the Open GIS Consortium (OGC) on the GIS side, have laid the foundation from which geospatial database standards can be developed. However, they have done little to create an incentive for GIS buyers to adhere to any such standards. The question, then, is how GIS buyers, from government to the private sector, can be motivated to adhere to geospatial standards, even at the cost of paying more for database development and maintenance.

Viewing the acquisition of a geospatial database as an investment, and treating the resulting database as an intangible asset, may well provide the incentive needed for organizations considering GIS implementation to adhere to an established geospatial standard. In most cases, organizations categorize geospatial database acquisition costs as an expense, one that is usually incurred over a relatively short time period. Consequently, to improve the probability of funding the GIS project, the GIS coordinator's effort becomes focused on cost minimization rather than system optimization. In other words, the tendency is to take a something-is-better-than-nothing approach. The drive to lower data acquisition costs forces GIS implementers to sacrifice quality, especially in the base map, and often to proceed without a long-term, cohesive geospatial database maintenance plan. Lowering geospatial database quality degrades overall GIS performance, which in turn raises questions about the financial viability of the GIS solution in the first place.
In these instances, GIS and its proponents lose credibility and find it very difficult to finance future projects. And although some organizations properly fund their projects and successfully implement a GIS, those in neighboring cities or counties that do not ultimately prevent any sort of network effect. Moreover, even if neighboring organizations do everything right, there is no guarantee that their respective systems will be interoperable.

One way to solve these problems is to employ an alternative financial approach that views the geospatial database as an asset, using Net Present Value (NPV) analysis to determine the financial viability of a GIS implementation. First, consider how an NPV analysis justifies a GIS project: the project is viable if its incremental cost is less than the present value of the discounted future cash flows generated directly by the GIS. For example, assume an organization intends to invest $1 million in the geospatial database required for a GIS. Due to cost concerns, the organization has no specific plans to routinely maintain the database. The managers understand that, without maintenance, the useful life of the database cannot be more than three to four years; after four years, it is estimated that only about 30 percent of the geospatial database remains useful. Given these parameters, and assuming a 16 percent discount rate for the project, the organization's realized incremental cash flow, as a direct result of the GIS project, must be at least $19,838 per month to justify proceeding. This additional, or incremental, cash flow can be realized by increasing the profit margin or, in the case of governmental agencies and nonprofit entities, by reducing operating costs.
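The break-even logic of this first scenario can be sketched as a short discounted-cash-flow calculation. The function names, the monthly-compounding convention, and the zero-residual treatment below are illustrative assumptions of this sketch; the article does not spell out the exact model behind its $19,838 figure, so the sketch will not necessarily reproduce that number.

```python
def breakeven_monthly_cashflow(investment, annual_rate, months, residual=0.0):
    """Smallest level monthly cash flow C such that NPV >= 0, where
    NPV = -investment + C * annuity_factor + residual / (1 + r)^months
    and r = annual_rate / 12 (monthly compounding is assumed here)."""
    r = annual_rate / 12.0
    annuity_factor = (1 - (1 + r) ** -months) / r
    pv_residual = residual / (1 + r) ** months
    return (investment - pv_residual) / annuity_factor


def npv(monthly_cashflow, investment, annual_rate, months, residual=0.0):
    """Net present value of the project at a level monthly cash flow."""
    r = annual_rate / 12.0
    pv_flows = sum(monthly_cashflow / (1 + r) ** t
                   for t in range(1, months + 1))
    return -investment + pv_flows + residual / (1 + r) ** months


# Scenario sketched in the article: a $1 million database, a 16 percent
# discount rate, and a four-year useful life with no maintenance.
required = breakeven_monthly_cashflow(1_000_000, 0.16, 48)
```

By construction, `npv(required, ...)` is zero, and any realized monthly cash flow above `required` makes the project NPV-positive, which is the viability test described above.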
One of the factors that an NPV analysis can capture is the useful life of the capital employed, along with the additional capital needed to extend it. Although maintaining the geospatial database costs money, funding maintenance improperly shortens the system's useful life, which in turn requires a much higher incremental cash flow to produce a positive NPV. Using the parameters of the example above, assume that instead of zero maintenance the organization maintains the database with an annual budget equal to 15 percent of the initial investment, so that no major database modification is ever required. Once again assuming a 16 percent discount rate, along with a four percent inflation rate and an eight-year project lifespan, the organization needs to realize an incremental cash flow of only $4,850 per month to justify proceeding, even with the additional cost of maintenance. This dramatic reduction in required cash flow is due to the longer time horizon and the residual value of the geospatial database, and it will significantly increase the likelihood of a GIS project being funded.

To achieve this second scenario, it must be substantiated that the geospatial database, under certain conditions, can retain its value over time. For the database to retain its value, it must be possible to liquidate it when necessary; and for a geospatial database to have an open market, it must be useful to others. This is where the significance of interoperability comes into play. Geospatial databases must also be objectively appraised, which is predicated on the ability to compare a database with like or eminently similar databases that were recently sold or are currently on the market.
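The maintained-database scenario can be sketched the same way. Again, the payment timing (maintenance paid at each year end), the compounding convention, and the residual-value fraction are assumptions made for illustration; the article does not give the exact model behind its $4,850 figure.

```python
def breakeven_with_maintenance(investment, annual_rate, years,
                               maint_fraction, inflation, residual_fraction):
    """Level monthly cash flow needed for NPV >= 0 when the database is
    maintained (annual budget = maint_fraction of the initial investment,
    grown at the inflation rate, paid at year end) and retains
    residual_fraction of its value at the end of the project."""
    r = annual_rate / 12.0
    months = years * 12
    annuity_factor = (1 - (1 + r) ** -months) / r
    # Present value of the inflation-adjusted annual maintenance payments.
    pv_maintenance = sum(
        investment * maint_fraction * (1 + inflation) ** (y - 1)
        / (1 + r) ** (12 * y)
        for y in range(1, years + 1)
    )
    pv_residual = investment * residual_fraction / (1 + r) ** months
    return (investment + pv_maintenance - pv_residual) / annuity_factor


# Article's second scenario: 15 percent annual maintenance, 4 percent
# inflation, 16 percent discount rate, eight-year lifespan. The residual
# fraction (here, full value retention) is a hypothetical input.
required = breakeven_with_maintenance(1_000_000, 0.16, 8, 0.15, 0.04, 1.0)
```

The sketch makes the article's causal claim concrete: the longer horizon and the residual value both enter the numerator and denominator in the direction that lowers the required monthly cash flow, so the break-even falls as the retained fraction of database value rises.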
Such a comparison of geospatial databases is possible only if development and maintenance standards exist, and only if independent, impartial organizations can review database development and maintenance to certify compliance with those standards. An organization such as ASPRS is well suited to carry out this task. It is conceivable that consulting firms would be needed to help organizations obtain certification, while other firms routinely audit the organizations' geospatial database development and maintenance procedures. These firms would be similar to ISO 9000 consulting firms and certified accounting and auditing firms. Such a process provides the infrastructure necessary for the proper and objective valuation of geospatial databases, and through it organizations would be motivated to seek a higher rating for their geospatial database, not unlike any other asset they may hold.

The current tendency to treat the acquisition of a geospatial database as a cost of GIS implementation puts an unnecessary strain on GIS buyers to justify such a significant investment. What most decision-makers fear is project failure, in which the benefits of GIS implementation are never fully realized. During risk assessment, often no attempt is made to consider liquidating the assets acquired to implement the GIS. Yet if the geospatial database, which accounts for the greater portion of overall implementation costs (the rest being hardware, software, consulting, and training), is properly designed, it can serve other needs in the organization or eventually be resold on the open market. Today, due to the inability to objectively evaluate and certify geospatial databases, and the lack of motivation to adhere to mapping standards, organizations engaged in GIS implementation have become highly cost driven in order to mitigate risk.
Providing geospatial databases with a secondary market mitigates a significant amount of this risk, enabling organizations to offset future incremental cash flow requirements against their geospatial database's residual value. And with processes in place to enable the objective valuation of geospatial databases, those databases can be traded efficiently in an open market. Consequently, organizations interested in GIS implementation will have the motivation to become value driven, rather than cost driven, in order to mitigate risk. Geospatial database standardization will increase overall data quality by providing reliable accuracy and significant interoperability among various GIS, and higher levels of interoperability can significantly increase the GIS network effect, freeing its immense capability.

The VisiCalc spreadsheet has been given the distinction of being the "killer app" that brought personal computers to their current prominence. Recently, there has been discussion within the GIS industry that such a "killer app" is needed to truly put GIS into mainstream computing. Perhaps viewing the cost of geospatial database acquisition as an investment will have this effect, freeing up significant financing, forcing data standardization, and ultimately putting GIS into the mainstream computing market.

About the Author: Kas Ebrahim is the president of Landata Airborne Systems, a mapping and GIS services firm located in Irvine, Calif. He has been involved with digital mapping and GIS for nearly 20 years, in roles extending from technical production and management to marketing and finance. Mr. Ebrahim graduated from the University of California, Irvine, with a degree in economics, and he has worked in corporate finance for more than five years. He can be reached via e-mail at [email protected].