The initial impetus for developing a specialty in ocean geography came from the need to resolve applied problems in coastal resources rather than from the development of oceanographic research methods and concepts. However, the development in the last 10-20 years of sophisticated technologies for ocean data collection and management holds tremendous potential for mapping and interpreting the ocean environment in unprecedented detail.
With the understanding that ocean research is often very costly, yet deemed extremely important by large funding agencies, geographers now have the opportunity to perform coastal and marine studies that are more quantitative in nature, to formulate and test basic hypotheses about the marine environment, and to collaborate both with geographers working in allied subdisciplines (e.g., remote sensing, GIS, geomorphology, political geography as it pertains to the Law of the Sea) and with classically trained oceanographers.
This piece reviews, for the non-specialist, the newest advances in mapping and management technologies for undersea geographic research (particularly on the ocean floor) and discusses the contributions that geographers stand to make to a greater understanding of the oceans.
The Evolution of Ocean Geography
Of the many forms of representation that geographers use to characterize the Earth - numerical, statistical, graphical - maps often come to mind as the foundation of geographic inquiry, particularly in the minds of the public (Woodward 1992). And yet geographers have been largely absent from the push to map the final frontier of the planet: the oceans. Indeed, if one were to define marine or ocean geography as that branch of geography concerned with "location, place, region, and space" (Couclelis 1992) on the surface and bottom of the oceans, as well as with the distribution of life within the water column (e.g., Kracker this issue), then a survey of the literature published in recent decades would reveal that this sort of work is not being done by geographers. Even as geography has traditionally embraced both the physical and human sciences and sought to provide a bridge between the two, the study of the oceans beyond the realm of the nearshore has escaped its attention. Steinberg (this issue) ably discusses various reasons for this omission.
Within the past 10 years, as the greater scientific community has seen a dramatic increase in support for research into global Earth systems and the effects of human-induced change, geographers have begun to broaden their focus beyond traditional boundaries. Ocean geography as it was known in the past is no longer limited to the coast but is slowly embracing studies of the deep ocean realm, as well as of lakes and small seas. Such an embrace may be very costly, however, both in research dollars and in time spent at previously inaccessible study sites.
It will therefore be important for geographers to continue fostering interdisciplinary relationships with classically trained oceanographers and ocean engineers, both to secure the necessary funding and to build a broader sense of community with these scholars, a theme taken up in the paper's conclusion. Steinberg (this issue) states that terrestrial geographers may learn much from the study of ocean space.
Similarly, classical oceanography (the broad science of physics, biology, chemistry, geology, and geophysics applied to the ocean realm) may glean much from a geographical approach. For example, as will be discussed below, several important scientific questions have arisen from a decidedly geographical approach to interpreting ocean floor data, focusing on geographic information systems (GISs).
GIS has ushered in new ways of analyzing these data, which in turn have facilitated new interpretations; as Mark Monmonier put it, "what a friend we have in GIS" (Monmonier 1993). Indeed, research issues endemic to oceanographic applications of GIS, such as the handling of spatial data structures whose relative positions and values vary over time, the geostatistical interpolation of data that are sparse in one dimension relative to the others, volumetric analysis, and the input and management of very large spatial databases, will advance the body of knowledge in GIS design and architecture, as well as in the broader field of geographic information science.
If one accepts the case made by Steinberg (this issue) for ocean space as viable research fodder for geographers, then surely ocean mapping is a viable research area in geographic information science. New opportunities continue to arise for geographers to venture offshore. One of the most pressing is in the realm of mapping the deep ocean floor, where we have made more progress in the past 20 years mapping the surfaces of neighboring planets than we have in the past 500 years mapping the abyss of our own planet (Macdonald et al. 1993a).
As noted at the outset, the development in the last 10-20 years of sophisticated technologies for ocean data collection and management holds tremendous potential for mapping and interpreting the ocean environment in unprecedented detail. What follows is a review of these emerging technologies and of how they are being applied to the ocean floor, along with case studies illustrating the important role that the coupling of geographic techniques with these technologies plays in linking observed patterns to physical processes.
Ocean Mapping Technology: From the Entire Globe to a Single Structure
"Seeing" Through Inner Space
When viewed from space, the Earth appears strikingly as a planet of water, with 70% of its surface covered by a fluid envelope averaging 5000 m in depth. This envelope is a vast inner space which, along with the underlying ocean floor, is still only dimly perceived by humans. What can be perceived of the water column and ocean floor must be perceived mostly with the aid of sound, as sound waves travel both farther and faster through seawater than electromagnetic energy.
In order to "see" the ocean floor, for instance, sound is essential not only for determining depth to the bottom but also for detecting varying properties of the bottom. Sound reflects back after striking an object on the seafloor, and the intensity of this reflection, or backscatter, can be used to sense the shape of objects or the character of the bottom (e.g., heavily sedimented and thus weakly reflective, or glassy with fresh lava flows and thus extremely reflective).
Depths are commonly measured by timing the two-way travel of a sound pulse released from a research vessel, towed instrument, or vehicle. Because the speed of sound in seawater varies with temperature, pressure, and salinity, the conversion of travel time to depth must take these variables into account (Kennett 1982).
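To make this conversion concrete, the brief sketch below estimates depth from a two-way travel time using Medwin's (1975) well-known approximate formula for the speed of sound in seawater. The temperature and salinity values, the fixed-point iteration, and the crude mid-column averaging are illustrative assumptions rather than the procedure of any particular sounding system.

```python
def sound_speed(T, S, z):
    """Approximate sound speed in seawater (m/s) after Medwin (1975).
    T: temperature (deg C), S: salinity (ppt), z: depth (m)."""
    return (1449.2 + 4.6 * T - 0.055 * T**2 + 0.00029 * T**3
            + (1.34 - 0.010 * T) * (S - 35.0) + 0.016 * z)

def depth_from_travel_time(t_two_way, T=4.0, S=35.0):
    """Estimate depth (m) from a two-way travel time (s). Iterates a few
    times because sound speed itself depends on depth."""
    z = 1500.0 * t_two_way / 2.0        # first guess with a nominal speed
    for _ in range(5):                  # fixed-point iterations
        c = sound_speed(T, S, z / 2.0)  # crude mean over the water column
        z = c * t_two_way / 2.0
    return z

print(round(depth_from_travel_time(4.0)))  # a 4 s echo -> roughly 2980 m
```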
Global Scale Via Satellites
In terrestrial terms, research vessels and their associated instruments and vehicles must travel quite slowly during mapping efforts (anywhere from 1 to 12 knots, or the speed of an average bicycle commuter). To map the global ocean floor at this rate would take approximately 125 years and $5 billion (Sandwell 1995; Yulsman 1996), even with the latest acoustic mapping tools described below. Fortunately, satellite determinations of sea surface height can be used to estimate ocean floor topography at small scales (covering the globe in a little over a year at a cost of ~$80 million) (Yulsman 1996), leaving shipboard techniques to tackle larger-scale mapping efforts.
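The arithmetic behind such estimates is straightforward. In the sketch below, the survey speed and swath width are illustrative assumptions (no particular system is implied), yet they reproduce the order of magnitude of the ~125-year figure.

```python
# Back-of-envelope sketch of why ship-based mapping of the global ocean
# floor takes so long. The speed and swath width below are illustrative
# assumptions, not the specifications of any particular system.

OCEAN_AREA_KM2 = 3.6e8          # ~70% of Earth's surface
speed_knots = 10.0              # a typical survey speed
swath_km = 15.0                 # a deep-water multibeam swath (~2-4x depth)

track_km_per_day = speed_knots * 1.852 * 24        # knots -> km per day
coverage_km2_per_day = track_km_per_day * swath_km

years = OCEAN_AREA_KM2 / coverage_km2_per_day / 365.0
print(f"{coverage_km2_per_day:,.0f} km^2/day -> ~{years:,.0f} ship-years")
# ~6,667 km^2/day -> ~148 ship-years, the same order of magnitude as the
# ~125-year estimate cited above
```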
Radar altimeters aboard satellites take advantage of the fact that the surface of the ocean bulges outward and inward, mimicking the topography of the ocean floor, because seafloor features produce minute variations in the Earth's gravitational field. Altimeters can measure these bumps and dips in the geoid height to sub-meter vertical resolution.
For example, the Geosat satellite launched by the U.S. Navy in 1985 mapped geoid heights to a vertical resolution of 0.03 m with a horizontal resolution of 10-15 km (Sandwell 1995). Recently declassified data from Geosat, combined with data from the ERS-1 satellite of the European Space Agency, have provided the most detailed view yet of the ocean floor on a global scale, revealing structures never before seen, particularly in the southern oceans, where ships have difficulty operating in rough weather (Sandwell 1995; Figure 1).
Regional Scales Via Ships and Towed Vehicles
While satellite gravity data reveal all discrete ocean floor structures larger than 10 km horizontally and 1 km vertically, operations at sea are still required to detect and interpret structures at higher resolution. Surveys at this regional scale are carried out with multibeam swath mapping systems that are either mounted to the hull of a research vessel or towed behind it as a vehicle. These systems are termed "multibeam" because their transmitting arrays send out several simultaneous, downward-looking beams of acoustic energy across the track of the vessel, creating a 1-10 km-wide swath of coverage (depending on the water depth). Although limited to vessel speeds of 10-12 knots, multibeam swath mapping systems rapidly generate high-resolution bathymetric maps (i.e., of depths to the ocean floor) over much larger areas of seafloor than their single-beam predecessors of the 1950s and '60s.
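The depth dependence of swath width is a matter of simple geometry: a fan of beams with total angular aperture 2θ ensonifies a strip roughly 2·z·tan(θ) wide over a flat bottom at depth z. The half-aperture in the sketch below is an assumed illustrative value, not the specification of any particular system.

```python
import math

# Swath width of a downward-looking multibeam fan over a flat bottom:
# width = 2 * depth * tan(half_aperture). The 45-degree half-aperture
# is an illustrative assumption, not a published specification.
def swath_width_m(depth_m: float, half_aperture_deg: float = 45.0) -> float:
    return 2.0 * depth_m * math.tan(math.radians(half_aperture_deg))

for depth in (500.0, 2500.0, 5000.0):
    print(f"depth {depth:5.0f} m -> swath ~{swath_width_m(depth) / 1000:.1f} km")
# -> ~1.0, ~5.0, and ~10.0 km, matching the 1-10 km range quoted above
```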
One hull-mounted system that is now accepted as a de facto standard within the marine mapping community is Sea Beam, whose 16-beam array can map ~1200 km² of seafloor per day at a contour interval of 10-20 m (the coverage of a single-beam system is ~400 km² per day at a 100-200 m contour interval). The Sea Beam 2000 series was introduced in the early '90s with a 121-beam array (Figure 2) that often triples the bathymetric mapping output of the original Sea Beam system (Asada 1992).
Several towed vehicles have been developed to work in combination with hull-mounted systems such as Sea Beam. These vehicles are equipped with long-range, side-looking sonars that measure not only the travel time but also the strength of, and direction to, an ocean floor reflector. So in addition to high-resolution bathymetric maps, these vehicles provide the acoustic equivalent of an aerial photograph of the ocean floor (called sidescan sonar imagery).
The vehicles are usually towed in order to shield their sonars from ship noise and propeller turbulence and to enable their use with more than one research vessel during a field season. Towing depth is at least below the thermocline (~150 m) to minimize the bending of acoustic waves due to temperature changes and to isolate the vehicle from the pitch, roll, and yaw it would be subjected to at the surface.
Currently the most popular towed vehicles among North American marine scientists are the HMR-1 vehicle of the Hawaii Mapping Research Group, University of Hawaii (towed 150 m below sea surface, 20-30 km swath width), GLORIA II of the U.S. Geological Survey (towed 30-60 m below sea surface, 14-60 km swath width), and DSL 120 of the Deep Submergence Laboratory (DSL), Woods Hole Oceanographic Institution (towed ~75 m above the ocean floor, 300 m swath width; Figure 2). Argo II, also of the DSL, is a towed vehicle primarily for visual imaging (Figure 2). It is towed ~10 m above the ocean floor and gathers real-time video, 35 mm photographic, and electronic still camera digital imagery, as well as side-looking sonar, bathymetric, and often magnetics, water conductivity, and temperature data (Fornari et al. 1997).
Hull-mounted swath mapping systems and towed vehicles have led to some remarkable discoveries, particularly along the mid-ocean ridge, the largest and most striking feature on the planet, extending 70,000 km through all the world's ocean basins. The mid-ocean ridge comprises 95% of all the plate tectonic boundaries found on Earth and dominates the volcanic flux of the planet, creating an average of 20 km³ of new oceanic crust annually, which in turn contributes up to 66% of the heat lost from the interior of the Earth (Macdonald et al. 1991).
Bathymetric maps produced by swath mapping systems and towed vehicles, in concert with various geophysical studies, have shown that instead of a continuous rift valley as originally conceived, the ridge consists of segments, each a few kilometers in length and linked end-to-end along the axis of seafloor spreading. This strongly suggests that processes in the lower crust and upper mantle respond to plate separation in a discontinuous but systematic way (Macdonald et al. 1991). Many more maps and bottom observations are needed to document the amount of extension along the ridge and to characterize degrees of segmentation.
Local Scales Via Submersibles
Mapping is of course a fundamental component of ocean floor investigation but must often go hand-in-hand with sample collection of rocks, biota, and sediments from smaller areas on the ocean floor (typically 0.1-1 km²). It is here that the submersible becomes a valuable and necessary element of deepsea investigative capability.
The submersible couples acoustic and visual imaging with a system that permits complex and interactive sampling and equipment deployment, usually via powerful and dexterous robotic arms (DSSC 1994). The fundamental limitation of the vehicle is its human component: because life support must be provided for the occupants, bottom time is limited to an average of 8 hours and rarely exceeds 12. The time, however, is well spent, as the scientist often returns from a submersible dive with a completely different, hypothesis-altering perspective on the ocean floor (e.g., Martin 1992; Haymon et al. 1993; Wright 1996).
Underwater vehicles with human occupants date back to the 1600s. However, the first true submersible, so classified because of its maneuverability, was the Trieste, which took Jacques Piccard and Donald Walsh to a record-breaking depth of 10,912 m in 1960 (Piccard 1961). In the wake of Trieste's success, the number of scientific submersibles grew dramatically, continuing to the present with the Americans, the Japanese, and the French leading the way.
The most distinguished and heavily used of U.S. submersibles is Alvin (Figure 2), famous for its 1986 visit to the wreck of the R.M.S. Titanic (Ballard 1987). Over the last 30 years Alvin has taken over 8300 observers to the ocean floor during more than 2800 dives, totaling some 18,000 hours underwater (DSSC 1994). In 1974 Alvin and the now-retired French submersible Archimede dove on the Mid-Atlantic Ridge as part of Project FAMOUS (French-American Mid-Ocean Undersea Study), which helped to confirm the theory of plate tectonics and continental drift. Sea Cliff and Turtle, both of the U.S. Navy, are also in use by the North American scientific community, and the French submersible Nautile is the distinguished workhorse of the European scientific community.
Nested Strategies With Remotely-Operated Vehicles (ROVs)
During the last 25 years of high-resolution exploration of the ocean floor, not only have fundamental, exciting scientific discoveries been made (see Macdonald et al. 1993a; Macdonald et al. 1993b; Detrick and Humphris 1994; DSSC 1994; Fornari et al. 1997 for fuller descriptions), but much has also been learned about what combination of tools is needed to fully investigate the interdisciplinary scientific questions at hand. For example, a towed vehicle collecting swath bathymetry and side-looking sonar, towed ~100 m off the bottom at a speed of 1.5 knots, may image ~130 km² of ocean floor per day.
A similar platform with camera/video capabilities, towed ~10 m off the bottom at 1 knot, would image less than 1 km² of ocean floor per day. A manned submersible with a bottom time of 6 hours may traverse 1-3 km of ocean floor during a dive or carry out sampling or experimental tasks at a single location. Given these spatial limitations, a nested investigative strategy (analogous to the "multistage" approach in terrestrial remote sensing) is in order (Figure 2): (1) use the high-resolution mapping capability of the towed vehicle to resolve properties of the ocean floor at a scale large enough to place the results of a near-bottom investigation into a regional context; (2) deploy a submersible, like a dart aimed at a specific target, to investigate, sample, and characterize a limited number of diagnostic locations within the regional framework defined by the towed vehicle (DSSC 1994). As effective as this strategy is, it is still limited by the short time scale of the manned submersible investigation.
The last 10-15 years have seen the development of remotely operated vehicles (ROVs) for scientific and industrial applications in shallow coastal waters (less than 1000 m). ROVs have all the characteristics of a towed vehicle with an additional capability to maneuver on a tether for high-resolution investigations and interactive tasks on the bottom. In the last decade ROVs have been aggressively developed for scientific research applications in deep water. A host of engineering problems unique to working at great depths have largely been solved, and the technological innovations pioneered for shallow-water ROVs (e.g., lights, cameras, thrusters, robotics, etc.) have been adapted by their deep-water counterparts.
Like the towed vehicles for deep water, these ROVs are powered through a conducting cable from the surface ship and can carry out scientific missions for several days to weeks on the bottom; like submersibles, they can remain stationary and perform complex sampling and imaging tasks (DSSC 1994). An operator may control the ROV from a distance using full robotic control, manual control, or any blend of the two. On the whole these ROVs, when well navigated, can carry out mapping tasks at a range of scales that is unprecedented both on the ocean floor and up in the water column.
The m- to cm-scale sonar and image-based maps that can be produced are beyond the resolution of the hull-mounted and towed systems and are achieved much more efficiently than by submersibles. The marine science community is just beginning to ascend the ROV "learning curve" in terms of system configuration and implementation, but clearly the ROV represents a significant step forward in humankind's ability to characterize the global abyss (Travis 1991; Newman and Robison 1992; DSSC 1994; Robigou and Ballard 1994).
Medea-Jason, developed by the DSL, has seen a marked increase in scientific service over the last 5 years. Medea-Jason is a dual vehicle ROV system, with Medea serving as support linkage to both the ship and Jason, and Jason functioning as a multi-sensory imaging and sampling platform (Figure 2; Robigou et al. 1993; DSSC 1994). Also available now are Ventana and Tiburon of the Monterey Bay Aquarium Research Institute (Newman and Stakes 1993; Stakes et al. 1993), and the Advanced Tethered Vehicle of the U.S. Navy (DSSC 1994).
In addition to ROVs, autonomous underwater vehicles (AUVs), which are still largely developmental, hold great potential for future mapping and data collection because they are completely untethered. Eliminating the tether frees the vehicle from the surface vessel, removes the need for large and costly handling gear, and allows for continuous operation of up to a year. Communicating with the vehicle through the water over such a long period is still extremely difficult, as is equipping the vehicle with the on-board intelligence needed to complete all of its tasks without human supervision. Two free-swimming vehicles that have recently been tested for deep-water science missions are Odyssey II and the Autonomous Benthic Explorer (ABE; Figure 3), both developed at the DSL of WHOI. In 1995 ABE completed its first official science mission by successfully gathering magnetics data and visual imagery from the CoAxial segment of the Juan de Fuca Ridge in the northeast Pacific (Yoerger et al. 1995; Tivey et al. 1997).
Issues in Ocean Spatial Data Management
The introduction of these sophisticated tools has necessitated the development of reliable data management systems for the various data streams. The cost of acquiring the data alone (seagoing operations run upwards of $25,000 per day) justifies the development of dedicated systems for managing and integrating these data. And there is always the goal of using these systems as analysis tools, to optimize scientific interpretations and to facilitate the rethinking or reformulation of hypotheses.
As mentioned above in connection with nested survey strategies, bathymetric data from a swath mapping system mounted beneath a ship may need to be georeferenced to underwater video images or sidescan sonar data collected from a vehicle towed behind the ship several meters above the ocean floor; to sample sites, observations, temperature measurements, and the like collected from a submersible or ROV launched away from the ship and operating directly on the ocean floor; and to earthquake data obtained from an ocean bottom seismometer anchored on the seafloor.
The data produced by these different sensors will invariably have different dimensionalities, resolutions, and accuracies. And as transmission rates of up to several Gb per day at sea become more and more commonplace, the ability to assess ocean floor data collected at these different scales, in varying formats, and in relation to data from other disciplines has become crucial. Here geography has made a contribution through the introduction of GISs, which fulfill not only the requirement of data integration but also that of combining or overlaying data of the same dimensionality.
This also serves as an efficient means of assessing the quality of data produced by one instrument as compared to another. Geographers have also contributed greatly to assessing and solving many problems that bear directly on the ocean environment, including the management of very large spatial databases (Star 1991; Stonebraker et al. 1993; Frew 1994), uncertainty and error propagation (Goodchild and Gopal 1989; Buchmann et al. 1989; Burrough and Frank 1996; Heuvelink 1998), the designation of "core" or "framework" data sets for sharing and archiving (Frank et al. 1995), and the development of standards for spatial data and metadata, i.e., information about data (National Research Council 1993; Federal Geographic Data Committee 1995).
The development of appropriate metadata and metadata standards for ocean floor and other types of oceanographic data is also an important issue, regardless of whether or not the data have been used in a GIS context. The growth in information technology has led to an explosion in the amount of information that is available to researchers in many fields.
This is particularly the case in the marine environment, where a state-of-the-art "visual presence" (through real-time video or 35-mm photography) may result in data being acquired faster than they can be interpreted. The paradox is that as the amount of potentially useful and important data grows, it becomes increasingly difficult to know what data exist, where the data are located, and how the data can be accessed. In striving to manage this ever-increasing amount of data and to facilitate their effective and efficient use, metadata becomes an urgent issue.
As this article goes to press, the Federal Geographic Data Committee will have reviewed the first proposals ever to address standards development for coastal and marine metadata, under the National Spatial Data Infrastructure (NSDI) Cooperative Agreements Program (Mapping Science Committee 1994). One of the most ambitious and effective efforts to date in the area of ocean data management and metadata creation is the Statewide Ocean Resource Inventory (SORI) project, funded by the National Oceanic and Atmospheric Administration (NOAA) through the Florida Coastal Management Program.
The SORI project aggressively fills gaps in ocean floor and other types of data along the Florida coast, maintains the data in a GIS, and distributes them via the World Wide Web, with the goal of providing information in a format that can be used by those developing Florida's ocean policy (Westlake et al. 1997; Friel et al. 1998).
As a result of various issues raised by Florida's Ocean Policy Roundtable, NOAA's office of Ocean and Coastal Resource Management has formed a partnership with the NOAA Coastal Services Center to advance ocean GIS throughout the South Atlantic region (Fowler and Gore 1997; Friel et al. 1998). An ambitious effort on the Gulf coast was the Water Quality Information System (WaterQUIS) of the University of Alabama, with a goal of characterizing the spatio-temporal dynamics and linkages between various physical, chemical, and biological processes of relevance to water quality conditions and forecasts throughout the Mobile Bay watershed.
On the west coast, and directly related to the ocean mapping technology described previously, is the Vents GIS project, funded by the Vents program of the NOAA Pacific Marine Environmental Laboratory. The Vents GIS focuses on geological, geophysical, chemical, and biological data from the Juan de Fuca seafloor spreading center and is the first major GIS project designed exclusively for deep ocean data and metadata (Fox et al. 1996; Wright et al. 1997).
It integrates data from multiple conventional sensor and sample analysis systems, as well as interpretive information from video and sidescan sonar data analysis. One important component in the effective implementation of the entire data management infrastructure has been the definition of a semantic model that allows users and designers to capture the meaning of the various databases. A semantic data model essentially defines objects, relationships among objects, and properties of objects, thus providing a number of mechanisms for viewing and accessing a database schema at different levels of abstraction (Ram 1995).
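As a rough illustration of these ideas, the sketch below defines objects, relationships, and properties in Python. The entity names (Vehicle, Observation, Dive) are hypothetical and chosen for brevity; the actual Juan de Fuca model of Wright et al. (1997) differs in detail.

```python
from dataclasses import dataclass, field
from typing import List

@dataclass
class Vehicle:                  # object: a mapping platform
    name: str                   # properties of the object
    max_depth_m: float

@dataclass
class Observation:              # object: one georeferenced measurement
    lat: float
    lon: float
    depth_m: float
    kind: str                   # e.g., "vent", "fissure", "lava flow"

@dataclass
class Dive:                     # relationship: a Vehicle made Observations
    vehicle: Vehicle
    observations: List[Observation] = field(default_factory=list)

    def features(self, kind: str) -> List[Observation]:
        """View the same schema at a higher level of abstraction."""
        return [o for o in self.observations if o.kind == kind]

jason = Vehicle("Jason", 6000.0)
dive = Dive(jason, [Observation(-17.43, -113.2, 2580.0, "vent")])
print(len(dive.features("vent")))   # -> 1
```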
A number of semantic data models have been developed and described in the computer science and information management literature (e.g., Hull and King 1987; Peckham and Maryanski 1988; Miura and Moriya 1992/93; Mann and Haux 1993; Yazici et al. 1995). The successful application of a marine geography database is closely linked to various unique requirements for data acquisition and near real-time analysis, previously considered only by Gritton et al. (1989) for undersea data from a single vehicle.
These requirements are considered by Wright et al. (1997) for multiple data types from a variety of sensors in the development of a semantic data model for the southern and central portions of the Juan de Fuca Ridge. Currently in progress, and based on that semantic data model, is the development of a metadata archiving and distribution mechanism in support of the NSDI for the northernmost segment of the Juan de Fuca Ridge.
Related to the concept of metadata is that of spatial lineage, or the history of how the spatial data were derived. A report of lineage is therefore intended to serve as a communication mechanism between the data producer and the user, a kind of "truth in labeling" statement regarding the nature and quality of GIS-derived products (Lanter 1990).
Because oceanographic data of all kinds often come from a variety of sensors, differing in resolution and covering different geographical areas, lineage documentation is especially important for assessing data quality, data history, and error propagation. The integration of remotely-sensed images (such as the aforementioned Geosat gravity data) with in situ data is an important consideration as well: the values in remotely-sensed images represent parameters over an area, whereas the in situ data are point observations.
The fact that data sets are routinely collected at different times is a further consideration. The most recent data set is usually assumed to be the most "correct," provided that no special error conditions are known to have affected the sensor. In practice, small variations in time within or between data sets gathered at sea are often ignored to simplify analyses and modeling.
Mechanisms will need to be put in place to provide information on the sources of data input to the GIS, on database and cartographic transformations performed on the data within the GIS, and on input/output relationships between source, derived, and product GIS data layers. Lineage documentation is still rarely used in GIS because its structure has not been well understood or well implemented (Lanter 1990). The studies of Lucas et al. (1994), among the first to investigate metadata and lineage requirements specifically for oceanographic applications, and of Lanter (1991) are important exceptions.
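A lineage record need not be elaborate. The sketch below, whose field names are hypothetical rather than drawn from any published standard, captures the source data and the chain of transformations that such "truth in labeling" documentation is meant to convey.

```python
from dataclasses import dataclass, field
from typing import List

@dataclass
class LineageRecord:
    layer: str                           # the derived GIS data layer
    sources: List[str]                   # input data sets and sensors
    transformations: List[str] = field(default_factory=list)

    def log(self, step: str) -> None:
        """Record each database or cartographic transformation as applied."""
        self.transformations.append(step)

record = LineageRecord(
    layer="lava_age_map",
    sources=["Argo II video observations", "Sea Beam bathymetry"],
)
record.log("merged with vehicle navigation")
record.log("gridded at 100 m; Mercator projection")
print(record)
```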
New Scientific Discoveries: From Pattern to Process
The list of scientific discoveries in the field of ocean floor mapping continues to grow. To provide an example, the following is a description of oceanographic fieldwork recently completed in the East Pacific with the aid of geographers, many of whom are beginning to make important contributions not only to the management of data gathered from the deep ocean but also to the science derived from it (see also the works of Mason et al. 1994; Gold and Condal 1995; Li et al. 1995; Bobbitt et al. 1997; Goldfinger et al. 1997).
From October to December 1996, a nested survey strategy was used to map and interpret the narrow crest of the superfast-spreading southern East Pacific Rise (EPR) at 17-18°S (Figure 4; Haymon et al. 1997). The major goal of the survey was to test the hypothesis (based on Argo I data from the EPR at 9-10°N) that along-strike thermal gradients, set up by the segmented pattern of magma supply from the upper mantle to fast-spreading mid-ocean ridges, exert primary control on the distribution and types of hydrothermal vents and vent biota, as well as on variations in fissuring and other fine-scale volcanic and tectonic structures along the axial zone.
Along a segment of ridge only 45 km long, the sub-surface geophysical data of Detrick et al. (1993) show that the axial magma chamber (the crustal reservoir that supplies magma to the seafloor) in this region changes along-strike from a flat-topped body at a relatively constant depth of 1200 m to a peaked cupola, or spike, that intrudes to within 800 m of the seafloor at ~17°25'-27'S. This is in marked contrast with the only other magma chamber to have been imaged along the EPR, a flat-topped chamber beneath the EPR at 9-10°N, and represents the most extreme along-strike variation in thermal gradients known to exist on the global mid-ocean ridge system.
In order to locate the axial zone precisely in this superfast-spreading environment (full spreading rate ~14 cm/yr; Hey et al. 1995), the DSL 120 was towed at an average height of ~75 m above the ocean floor to collect high-resolution bathymetry and 120-kHz sidescan sonar imagery.
The DSL 120 data made it possible to determine how closely to space the Argo II tracklines. Argo II was then towed at an average height of 9 m above the ocean floor to map hydrothermal vents, fissures, fault scarps, lava flow ages and morphologies, and biological communities with video, 35-mm photographic, and electronic still camera imagery, as well as with 200-kHz sidescan sonar, a 675-kHz down-looking scanning sonar, and a conductivity-temperature-depth sensor. These observations, essential for studying the variable, fine-scale character of the ocean floor, were digitally logged in real time at sea and then merged with the vehicle's navigation.
Fifteen 45-km-long, axis-parallel lines through the axial zone with line spacings of 10-30 m provided 100% saturation coverage where the axial zone is less than 100 m wide, down to a minimum coverage of 45% where the axial zone widens to 400 m (Figure 5). Video and photographic coverage during the survey spanned more than 6 million m², yielding a tremendously large and diverse data set that would have been extremely difficult to interpret without input to a GIS.
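The coverage percentages quoted above follow from a simple ratio. In the sketch below, the ~12 m imaged swath per Argo II pass is an assumed value chosen so that the quoted figures fall out; it is not a published specification.

```python
def coverage(n_lines: int, swath_m: float, zone_width_m: float) -> float:
    """Fraction of the axial zone imaged, capped at full saturation."""
    return min(1.0, n_lines * swath_m / zone_width_m)

# 15 lines at an assumed ~12 m imaged swath per pass
for width in (100.0, 400.0):
    print(f"axial zone {width:.0f} m wide: {coverage(15, 12.0, width):.0%}")
# -> 100% where the zone is 100 m wide, 45% where it widens to 400 m
```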
Input to a GIS allowed the seagoing scientific party to immediately archive four different types of seafloor observations (hydrothermal, volcanic, tectonic, and biological) in an organized and consistent manner, and to interpret plots of the locations of vehicle tracklines and observational features together with Sea Beam bathymetric data gathered before the expedition. Most notable among these features were locations where the Argo II video camera crossed a fissure; a high- or low-temperature hydrothermal vent (smoker, or smoky or cloudy water issuing from a vent); a hydrothermal deposit (mound, chimney, edifice, or hydrothermal sediment/staining); scattered biota or a dense animal community; a lava flow front; or an isolated fault scarp.
Additional overlay of vehicle tracklines on any combination of observations was useful for noting gaps or scarcity in data coverage and for testing the validity of distributions, particularly in portions of the survey area where the density of trackline coverage is minimal. Buffering of tracklines along which particular ages of lavas were observed was useful for estimating the areal extent of these flows, after the methodology of Wright et al. (1995).
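In a modern scripting environment, that buffering operation might look like the sketch below, which uses the shapely geometry library. The coordinates and the buffer distance are hypothetical and assumed to be in meters in an already-projected coordinate system; this is not the actual procedure of Wright et al. (1995).

```python
from shapely.geometry import LineString
from shapely.ops import unary_union

# Two hypothetical axis-parallel tracklines along which a given lava age
# was observed (coordinates in projected meters)
tracklines = [
    LineString([(0.0, 0.0), (0.0, 2000.0)]),
    LineString([(20.0, 100.0), (20.0, 1900.0)]),
]

# Buffer each line by half an assumed visual swath, then dissolve the
# overlapping buffers into a single flow footprint
footprint = unary_union([line.buffer(15.0) for line in tracklines])

print(f"estimated flow extent: {footprint.area / 1e6:.3f} km^2")
```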
Several important scientific questions have arisen from a GIS approach to analyzing these data, chief among them the changing patterns of fissuring with respect to the ages of lava flows. For instance, it was previously thought that the widest (presumably deepest) fissures on the mid-ocean ridge are not primarily eruptive and should be most abundant in areas of older lava flows, having widened with time.
However, this was not observed during the Argo II survey. Some of the widest cracks have been mapped in the "Spike" region (17°25'S), where an eruption of the ridge crest is estimated to have occurred some time in 1993-1994 (Auzende et al. 1996). These fissures may be deep enough to reach a layer of magma intrusion several hundred meters below the seafloor, which in turn is linked to the axial magma chamber imaged by Detrick et al. (1993). Although Argo II video observations and sonar data, as well as DSL 120 sidescan and bathymetry, are still being processed and GIS maps are still being prepared, it is hypothesized that the narrowest, presumably shallowest, fissures correspond to the narrowest, oldest, and most hydrothermally inactive portions of the ridge crest in the survey area. The wide fissures are hypothesized to be the deepest and thus primarily eruptive in origin, whereas the narrow, shallow cracks may be primarily tectonic, associated more with far-field plate stresses or with thermal contraction due to cooling of the crust.
Figure 6 is a simple map created in a GIS while at sea during the survey showing the distribution of axial lava ages upon which these hypotheses are based. It shows a first-order assignment of relative lava ages for the Argo II survey areas based on the apparent thickness of small sediment ponds in between lobes of lava (the more sedimented a lava flow is, the older it is, having been created at the axis of seafloor spreading and then carried away from the axis over a period of time long enough for sediment to accumulate on it). Haymon et al. (1991) estimated Age 1 lava flows to be <50 years old, Age 2 flows to be ~100-1000 years old, and Age 3 lavas to be 1000-5000 years old. However, based on Nautile submersible observations made in the survey areas during a 1994 dive program (Auzende et al. 1996) and Alvin observations during 1991 and 1992 dives to the EPR at 9-10 ° N (Haymon et al. 1993), these estimates are now believed to be too old. Absolute age dating of lava flows in this area and calibration of an absolute age scale are planned for the future.
Figure 6: Map resulting from a GIS analysis at sea showing the approximate, along-strike distribution of relative axial lava ages in the area surveyed with Argo II. Map projection is Mercator.
In the survey area there is a demarcation between older lavas north of the "Spike" region at 17°25'S and younger lavas to the south along the length of the ridge (Figure 6). The pattern of lava ages appears to be driven by laterally-propagating intrusions of magma (dikes) moving southward along the strike of the EPR, away from the shallow, localized site of magma injection at 17°25'S.
The high rates of spreading, crustal formation, and even seismicity on the southern EPR, in comparison to slower-spreading ridges such as the Mid-Atlantic Ridge, raise the question of whether volcanic and tectonic events become larger and more intense, or simply much more frequent (Mottl et al. 1996).
Still to be determined are the implications of these findings for harnessing geothermal energy, garnering oil and mineral wealth (especially the copper, iron, manganese, and zinc found at hydrothermal vents), assessing seismic hazard (particularly tsunami detection and emergency preparation), and finding natural substances that may yield new medicines or new classes of industrial chemicals. Pharmaceutical and biotechnology companies are already analyzing the sulfur-eating bacteria clustered around hydrothermal vents, along with various species of deepsea fish and plants, in the search for a substance that might someday turn into a miracle drug (Lemonick 1995).
Says Bruce Robison of the Monterey Bay Aquarium Research Institute, as quoted by Time magazine: "I can guarantee you that the discoveries beneficial to mankind will far outweigh those of the space program over the next couple of decades. If we can get to the abyss regularly, there will be immediate payoffs" (Lemonick 1995). As we cross the physical/human divide to assess the role of humans in the deep ocean processes that they address, we must consider what geographical assessment is needed in order to protect this realm, and what practical uses of the deep ocean (mining, harvesting of sulfur-eating bacteria, etc.)
should be implemented, and then expanded, over the short, mid, and long term. It is useful to think of these issues of sustainable use within the context of various conservation strategies, which in turn may be linked to issues of deep ocean recreation for island states, as addressed by Trist (this issue). Two United Nations documents that are extremely important to consider are Chapter 17 of Agenda 21, as discussed by Vallega (this issue), and the Law of the Sea Convention.
Agenda 21, drafted during the United Nations Conference on Environment and Development in 1992, provides guidelines for the sustainable development of the oceans, with specific references to the detailed gathering of the geographic characteristics of regions, the development of information management systems (read GIS), and attention to various issues of information management (collection, analysis, and assessment of spatial data for resource management). The Law of the Sea Convention, which entered into force in 1994, enshrines the notion that all problems of ocean space are closely interrelated and should be addressed as a unified whole. Marine scientific research is addressed in Part XIII of the document.
Conclusion: Strengthening Geography's Foundations Through an Ocean Perspective
Once again, the importance of community cannot be overstated, as it is only through collaboration that the expense and technical expertise required to achieve these scientific goals can be met. Such work requires high levels of government support, which are difficult but not impossible to obtain. For example, a great amount of oceanographic research is accomplished by sharing time at sea. Several principal investigators may be awarded a large grant (several hundred thousand dollars to meet the costs of doing fieldwork at sea) and then invite several collaborators to join them to carry out ancillary research. This makes for the most cost-effective use of ship time while requiring the collaborators to seek funds only for the transport of themselves and their equipment to and from port. Geographers have enjoyed high levels of participation in the remote sensing community, another realm of extremely expensive science, where connections to NASA and other agencies have paid off (e.g., EOSDIS, the Earth Observing System Data and Information System supporting NASA's Mission to Planet Earth).
Geographers must continue to build on and expand the tradition of exploratory fieldwork and to foster the appreciation and use of geography by non-geographers, such as ocean scientists, so that "the capacity to make use of [geography's] perspectives, knowledge, and techniques grows along with the capacity of the discipline to supply them."