Our interaction with EOS Distributed Active Archive Centers (DAACs) and other NASA data archive facilities may be grouped into the following four categories:
The principal example of DAAC products used in support of validation and verification is the TOPEX/Poseidon Interim Geophysical Data Records (IGDRs), Geophysical Data Records (GDRs), and Merged Geophysical Data Records (MGDRs), which are used for operational analysis of the altimeter data. These data sets are obtained as they become available, on a variety of media, from the JPL Physical Oceanography Distributed Active Archive Center (PO.DAAC). Our WWW homepage is linked with the PO.DAAC homepage.
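For context, the basic quantity derived from these records is sea surface height: the satellite's altitude above the reference ellipsoid minus the altimeter range after geophysical range corrections are applied. The sketch below is illustrative only; the function name, field names, and numbers are hypothetical and do not reflect the actual GDR record layout or processing code.

```python
def sea_surface_height(altitude_m, range_m, corrections_m):
    """Sea surface height above the reference ellipsoid (metres).

    altitude_m     -- satellite altitude above the ellipsoid
    range_m        -- raw altimeter range
    corrections_m  -- dict of geophysical range corrections (metres),
                      e.g. dry/wet troposphere, ionosphere; these are
                      ADDED to the range before differencing.
    """
    corrected_range = range_m + sum(corrections_m.values())
    return altitude_m - corrected_range


# Hypothetical numbers, chosen only to show the bookkeeping:
ssh = sea_surface_height(
    1336000.00,                       # altitude (m)
    1335970.00,                       # raw range (m)
    {"dry_tropo": 2.30, "wet_tropo": 0.15, "ionosphere": 0.05},
)
```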
Current participation in the development of additional satellite data archive products includes creating a new ERS-1 CD-ROM (based on ESA's OPR data) with corrections and modules consistent with the above-mentioned TOPEX/Poseidon MGDR, for distribution to U.S. WOCE ERS-1 investigators. The primary goals are to validate the production of this CD by the PO.DAAC and to provide wet and dry tropospheric corrections based on European Centre for Medium-Range Weather Forecasts (ECMWF) meteorological fields.
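As a rough illustration of how a dry tropospheric range correction can be derived from meteorological fields, the sketch below applies a widely used Saastamoinen-type formula: about 2.277 mm of path delay per hPa of surface pressure, with a small latitude-dependent gravity term. This is a hedged sketch of the general technique, not the correction algorithm actually used for the CD-ROM product.

```python
import math

def dry_tropo_correction(surface_pressure_hpa, latitude_deg):
    """Dry tropospheric range correction (metres), Saastamoinen-style.

    surface_pressure_hpa -- surface pressure from a meteorological
                            field (e.g. an ECMWF analysis), in hPa
    latitude_deg         -- geodetic latitude of the measurement

    Roughly 2.277 mm of delay per hPa; the cos(2*phi) term accounts
    for the latitude dependence of gravity.
    """
    phi = math.radians(latitude_deg)
    return 0.0022768 * surface_pressure_hpa / (1.0 - 0.00266 * math.cos(2.0 * phi))
```

At standard sea-level pressure (1013.25 hPa) this gives a correction of roughly 2.3 m, which is why the dry term dominates the tropospheric correction budget for altimetry.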
Also ongoing at CSR are reviews of DAAC, archive, and educational outreach products. The principal investigator and one co-investigator serve on the PO.DAAC advisory committee. One co-investigator serves on the EOSDIS Advisory Committee. We have conducted verification of the EOS Core System (ECS) Version 0 Information Management System (IMS). Center personnel are active contributors to various EOS Data and Information System (EOSDIS) Ad Hoc Working Groups (AHWGs), which aid in defining and refining the content of, and user interface to, the DAACs. These interactions include forums such as the EOSDIS working groups for consumers (AHWGC) and producers (AHWGP) and the User Recommendations Database (URDB). The center has also participated in multiple peer-review cycles of the TOPEX/Poseidon Informational (TPI) CD-ROM.
Certainly the most important use of the DAAC and other NASA archive facilities has been in support of ongoing research and analysis. This usage extends over both operational and developmental archive facilities and products. Significant examples include use of the World Ocean Circulation Experiment (WOCE) database to access tide gauge data; the NOAA TOGA-TAO databases to provide sea surface temperature, wind speed, and thermocline depth data; and smoothed sea level anomaly data from the JPL DAAC. Also accessed are the TOPEX/Poseidon Sensor Data Records (SDRs) used in altimeter waveform analysis, which supports ongoing altimeter calibration studies. The above-mentioned ECS Version 0 IMS is the current source for the multi-year gridded global GCM atmospheric data sets from GSFC's Data Assimilation Office (DAO), which provide model comparisons for the center's atmospheric climate research incorporating the multiple observations of the current altimetric satellites.

4.5. Science Computational Facility Status
Several new capabilities have been added to the Science Computing Facility at the Center for Space Research. The College of Engineering has provided CSR with an IBM RISC System/6000 Model 950, which has been brought into the existing college network to form a cluster of workstations. We have already successfully executed general ocean circulation models on the CSR system, and work is underway to establish a distributed processing environment in the College over seven of these RISC systems. Four 3.6 GB hard disks have also been added to the CSR UNIX cluster to enhance our data storage capability. Other recent hardware additions include a Hewlett Packard 735 with a 125 MHz CPU, a Hewlett Packard optical disk jukebox, and a Hewlett Packard Entria X-terminal station. Future plans include upgrading our VMS file server from a VAX 8600 to a DEC Alpha 4000, acquiring additional high- and low-end workstations, and upgrading our Local Area Network to enhance data flow within our computational facility and with the external network.
Using EOS SCF funds, a Cray J916/5-1024 computer has been purchased and co-located at the University of Texas at Austin High Performance Computing Facility (formerly the University of Texas System Center for High Performance Computing). This computer provides a balanced design, adequate bandwidth, and compatibility with our existing software and that of other institutions, supporting our investigation goals. The five-processor NASA/CSR Cray is housed with the University of Texas at Austin Computation Center's 16-processor Cray J916/16-4096. Each system has approximately 108 GB of disk storage available. By locating the two systems together, CSR benefits from the existing manpower as well as the file backup and archiving facilities already maintained by the UT Computation Center.
We have also acquired access to NSF supercomputer center computers and are now using GSFC's Cray C90 for some of our large-CPU jobs. For example, general ocean circulation models (the Princeton MOM and coastal ocean models) and several other software packages resident at the UT Austin High Performance Computing Facility have been tested on the GSFC Cray Y-MP/C98 in anticipation of possible production runs relevant to our research. Inherent problems with this approach include network speed and the lack of authorization for large data storage.
At present, there are federal and state plans to interlink the University of Texas main campus (which includes UT/CSR), the Pickle Research Center (formerly the Balcones Research Center) where the Cray computer is housed, other state universities, and selected high schools with high-speed optical fiber networks. We are awaiting the installation of the proposed network, which will provide a significant data-transfer speed-up for our computational and data exchange tasks in support of the EOS research. The anticipated connection will also improve communication speed for our current effort on the NSF supercomputers, discussed previously.
The computational link between UT/CSR and the AER Science Computing Facility is primarily through the Internet. The AER computer facility is an in-house facility centered around the UNIX environment, which includes a SPARCserver 1000 (a symmetric multiprocessing computer currently containing six processors) as well as a variety of Sun SPARCstations, workstations, and PCs. Remote communication is facilitated via a direct Internet connection (256 kbit/s, via NEARnet) and several dial-in/dial-out modems.

4.6. Effect of EOS Restructuring
To date, the EOS restructuring has not affected any of our research plans. The single largest deficiency in the EOS data set is the lack of wind measurements throughout the depth of the atmosphere. The need remains for missions or programs to perform a one-time, sufficiently accurate measurement of the global gravity field and land topography. The gravity field mission is particularly important for complete analysis of satellite altimeter data. The threat to the long-range stability of the NASA contribution to the global satellite laser ranging (SLR) network is a major concern. SLR measurements are a key requirement for global measures of sea level, reference frame definition, and mass balance determinations. Some elements of this investigation would be lost if the SLR systems were eliminated.