Software for Manipulating or Displaying NetCDF Data

This document provides references to software packages that may be used for manipulating or displaying netCDF data. We include information about both freely available and licensed (commercial) software that can be used with netCDF data. We rely on developers to help keep this list up-to-date. If you know of corrections or additions, please send them to us. Where practical, we would like to include WWW links to information about these packages in the HTML version of this document at http://www.unidata.ucar.edu/packages/netcdf/software.html.

----------------------------------------------------------------------------

Freely Available Software

* CRDtools
* DDI
* DODS
* Envision
* EPIC
* FAN
* FERRET
* FREUD
* GMT
* Gri
* HDF interface
* Ingrid
* LinkWinds
* MEXCDF
* MexEPS
* NetCDF File Calculator and Data Editor
* NextStep NetCDF Utilities
* PolyPaint+
* SciAn
* Zebra
* User-contributed software

----------------------------------------------------------------------------

Commercial or Licensed Packages

* AVS
* Environmental WorkBench
* IBM Visualization Data Explorer
* IDL interface
* IRIS Explorer Module
* MATLAB
* NCAR Graphics
* PPLUS
* PV-Wave
* Spyglass Slicer and Dicer
* VISAGE and Decimate
* WXP

----------------------------------------------------------------------------

CRDtools

Climate Research Data Tools (CRDtools) provides data extraction, analysis, and visualization functions for the climatology researcher. This toolkit of X Windows applications was developed at the NOAA Climate Diagnostics Center. CRDtools is written in C and uses the NCAR Graphics plotting package to display some of its output. In addition to netCDF, you will need the NCAR Graphics package to use CRDtools.

* Current release information
* CRDtools installation instructions

For additional CRDtools help, contact crdtools@cdc.noaa.gov.

DDI

The Data and Dimensions Interface (DDI) addresses a significant problem in the visualization of large data sets: extracting only the relevant data and providing it to a chosen graphics engine in the required form without undue effort. DDI transfers data between files, formats, and visualization systems. It works within the specifications of each supported format to promote compatibility. DDI provides the following capabilities:

* Browsing through the contents of DRS, netCDF, and HDF files
* Randomly selecting data variables
* Rearranging variables into the desired form
* Modifying or creating new variable attributes
* Saving variables into new files
* Feeding variables directly to a visualization system

DDI operates in two modes:

* As a module in a data-flow environment such as AVS or IRIS Explorer
* As a stand-alone application capable of sending data over the network to AVS, IRIS Explorer, IDL, NCSA Collage, and PV-WAVE

DDI is available for a number of computer systems via anonymous FTP from sas.nersc.gov in the directory /pub/DDI. The Introduction to DDI and the DDI Reference Manual are accessible on the Web at http://www.nersc.gov/doc/Services/Applications/Graphics/DDI/DDI.html.

DDI was developed as a collaboration between the Program for Climate Model Diagnosis and Intercomparison (PCMDI) and the National Energy Research Supercomputer Center (NERSC), both of Lawrence Livermore National Laboratory (LLNL). DDI was developed by Chris L. Anderson, Robert S. Drach, and Dean N. Williams.

DODS

The Distributed Oceanographic Data System (DODS) is an effort to facilitate widespread exchange of scientific data sets between scientists over the Internet. DODS is based on existing data access tools; rather than developing a self-contained system, it makes extensive use of existing data access APIs and the World Wide Web. DODS is being developed by the University of Rhode Island and the Massachusetts Institute of Technology.

DODS can be used to make netCDF data files available over the Internet, and it can also be used to adapt existing software that uses the netCDF API (by re-linking) to read data served by a DODS data server. In principle, any program written using netCDF can be adapted to read data from a DODS data server; in other words, any program that uses netCDF can become a client in the DODS client-server system. Included in the source and binary distributions are two freely available programs that have already been modified (re-linked).

With a client program accessing data from a netCDF server, it is possible to access a small subset of a large dataset over the Internet without copying the entire dataset (as you would have to do with FTP or AFS). The client can see changes to the netCDF dataset, e.g. when new records are added (which would not be possible with FTP). Finally, the client can also access cross-sections of variable data without paging large amounts of data across the network (as you would have to do with NFS, for example).

While DODS currently supports only netCDF, the developers plan support for other data access APIs, including JGOFS. As new APIs are added to DODS, existing clients will be able to read any of those data sets, even those stored in a "foreign" API or format. The first beta release of DODS software is freely available via anonymous FTP in both source form (you will need the GNU C++ compiler and libg++) and binary form for selected platforms.
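
As an illustration, here is a minimal sketch of an ordinary netCDF client program, written against the netCDF version 2 C interface; the file name, variable name, and array shape are hypothetical. According to the description above, a program like this becomes a DODS client not by changing its source but by re-linking it against the DODS-supplied replacement for the netCDF library.

    /*
     * Minimal netCDF read sketch (netCDF version 2 C interface).  The file
     * name, variable name, and shape are hypothetical.  Per the DODS
     * description above, re-linking this unchanged source against the DODS
     * client library would let it read the same hyperslab from a remote
     * DODS server instead of a local file.
     */
    #include <stdio.h>
    #include <netcdf.h>

    int main(void)
    {
        long start[2] = {0, 0};       /* read a 10 x 10 corner ...         */
        long count[2] = {10, 10};     /* ... of a 2D variable              */
        double vals[100];             /* variable assumed stored as double */
        int ncid, varid, i;

        ncid = ncopen("example.nc", NC_NOWRITE);   /* hypothetical file     */
        varid = ncvarid(ncid, "temperature");      /* hypothetical variable */
        ncvarget(ncid, varid, start, count, (void *) vals);

        for (i = 0; i < 10; i++)                   /* first row of subset   */
            printf("temperature[0][%d] = %g\n", i, vals[i]);

        ncclose(ncid);
        return 0;
    }

Only the requested 10 x 10 corner would cross the network, which is the subsetting behavior the DODS description above emphasizes.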

Envision

Envision is an interactive system for the management and visualization of large scientific data sets. It runs on UNIX workstations under X/Motif, manages data stored in netCDF or HDF files, and does visualization using NCSA Collage, NCSA XDataSlice, and IDL. Envision is public domain software and is available by anonymous FTP. The primary FTP site for Envision is vista.atmos.uiuc.edu in pub/envision. A secondary FTP site is csrp.tamu.edu. Included in the release are binaries for IBM RS6000, HP, Sun, and SGI, as well as source and make files. Complete documentation, sample data, and binaries to generate Envision project files for the sample data are provided.

Envision was jointly developed at the Department of Atmospheric Sciences at the University of Illinois at Urbana-Champaign and the Department of Meteorology at Texas A&M University, with funding from NASA's Applied Information Systems Research Program. For further information, contact Keith Searight (krs@cdc.noaa.gov).

EPIC

NOAA's Pacific Marine Environmental Laboratory (PMEL) has developed the EPIC software package for oceanographic data. EPIC provides graphical display and data field manipulation for multi-dimensional netCDF files (up to 4 dimensions). PMEL has been using this software on Unix and VMS for several years. At present, they have:

* a data file I/O library (epslib, which is layered on top of the netCDF library); epslib allows transparent access to multiple data file formats
* a MATLAB MexEPS interface for using any supported EPIC file with MATLAB
* a suite of EPIC programs for graphics and analysis of hydrographic profile data and time series data

This software was developed on Sun/Unix and is also supported for DEC/Ultrix and VAX/VMS as a data management, display, and analysis system for observational oceanographic time series and hydrographic data. The EPIC software includes over 50 programs for oceanographic display and analysis, as well as utilities for putting in-situ or observational data on-line (with on-the-fly graphics and data download) on the WWW.

The developers are interested in coordinating with others who may be developing oceanographic software for use with netCDF files. The EPIC software is available via anonymous FTP from ftp.noaapmel.gov in the epic/ and /eps directories. To obtain the EPIC software, please see the Web pages at http://www.pmel.noaa.gov/epic/epic-toolkits.html. For information about EPIC, please see the Web pages at http://www.pmel.noaa.gov/epic/home.html. Contact epic@pmel.noaa.gov, or Nancy Soreide, nns@noaapmel.gov, for more information.

FAN

FAN (File Array Notation) is Harvey Davies' package for extracting and manipulating array data from netCDF files. The package includes the three utilities nc2text, text2nc, and ncrob for printing selected data from netCDF arrays, copying ASCII data into netCDF arrays, and performing various operations (sum, mean, max, min, product, ...) on netCDF arrays. A library (fanlib) is also included that supports the use of FAN from C programs. The package is available via anonymous FTP from ftp://ftp.unidata.ucar.edu/pub/netcdf/contrib/fan.tar.Z. Questions and comments may be sent to Harvey Davies, hld@dar.csiro.au.

FERRET

FERRET is an interactive computer visualization and analysis environment designed to meet the needs of oceanographers and meteorologists analyzing large and complex gridded data sets. It is available by anonymous FTP from abyss.pmel.noaa.gov for a number of computer systems: SUN (Solaris and SUNOS), DECstation (Ultrix and OSF/1), SGI, VAX/VMS and Macintosh (limited support), and IBM RS-6000 (soon to be released).

FERRET offers a Mathematica-like approach to analysis; new variables may be defined interactively as mathematical expressions involving data set variables. Calculations may be applied over arbitrarily shaped regions. Fully documented graphics are produced with a single command. Graphics styles include line plots, scatter plots, contour plots, color-filled contour plots, vector plots, wire frame plots, etc. Detailed controls over plot characteristics, page layout, and overlays are provided. NetCDF is supported both as an input and an output format.

Many excellent software packages have been developed recently for scientific visualization. The features that make FERRET distinctive among these packages are Mathematica-like flexibility, geophysical formatting (latitude/longitude/date), "intelligent" connection to its data base, special memory management for very large calculations, and symmetrical processing in 4 dimensions. Contact Steve Hankin, hankin@noaapmel.gov, for more information.

FREUD

Freud is a software environment that facilitates the production of publication-quality contour and vector plots.

While there are many public domain and commercial software packages that allow users to create contour and vector plots, Freud is unique in that users can quickly and easily customize the appearance of contour lines, contour fills, map projections, or vector arrows with a couple of mouse clicks. Freud allows researchers to:

* create an unlimited number of contour and vector plot overlays that can be superimposed on up to six different map backgrounds
* interactively position an unlimited number of plots of any size at any position on a plotting page
* set the line style, line color, fill pattern, and fill color of contours, vectors, or map background
* read in data in netCDF, binary, or ASCII data formats
* browse through netCDF variable attributes
* create scripts to automate plots by using extensions to the procedural scripting language Tcl
* save a set of overlays to restore a given plot at a later date

For more information, contact Joe Sirott (sirott@atmos.washington.edu).

GMT

GMT (Generic Mapping Tools) is a free, public-domain collection of about 50 UNIX tools that allow users to manipulate two- and three-dimensional data sets (including filtering, trend fitting, gridding, projecting, etc.) and produce Encapsulated PostScript File (EPS) illustrations ranging from simple x-y plots through contour maps to artificially illuminated surfaces and 3-D perspective views in black and white, gray tone, hachure patterns, and 24-bit color. GMT supports 20 common map projections plus linear, log, and power scaling, and comes with support data such as coastlines, rivers, and political boundaries. Version 3.0 was recently announced in "New Version of the Generic Mapping Tools Released," EOS Trans. AGU 72, 329. The package can access netCDF data as well as ASCII, native binary, or user-defined formats.

The GMT package is available via anonymous FTP from several servers. Because of file sizes, you are strongly encouraged to use the closest server:

* kiawe.soest.hawaii.edu in /pub/gmt [Hawaii, US], serving Asia/Australia/Pacific
* ibis.grdl.noaa.gov in /pub/gmt [Maryland, US], serving North and South America
* ftp.geologi.uio.no in /pub/gmt [Oslo, Norway], serving Europe

GMT was developed and is maintained by Paul Wessel (wessel@soest.hawaii.edu) and Walter H. F. Smith (walter@amos.grdl.noaa.gov).

Gri

Gri is an extensible plotting language for producing scientific graphs, such as x-y plots, contour plots, and image plots. Dan Kelley of Dalhousie University is the author of Gri, which can read data from netCDF files as well as ASCII and native binary data. For more information on Gri, see the URL http://www.phys.ocean.dal.ca/~kelley/gri/gri1.html.

HDF interface

The National Center for Supercomputing Applications (NCSA) has added the netCDF interface to their Hierarchical Data Format (HDF) software. HDF is an extensible data format for self-describing files. A substantial set of applications and utilities based on HDF is available; these support raster-image manipulation and display and browsing through multidimensional scientific data.

An implementation is now available that provides the netCDF interface to HDF. With this software, it is possible to use the netCDF calling interface to place data into an HDF file. The netCDF calling interface has not changed, and netCDF files stored in XDR format are readable, so existing programs and data will still be usable (although programs will need to be relinked to the new library).

There is currently no support for the mixing of HDF and netCDF structures. For example, a raster image can exist in the same file as a netCDF object, but you have to use the Raster Image interface to read the image and the netCDF interface to read the netCDF object. The other HDF interfaces are currently being modified to allow multi-file access; closer integration with the netCDF interface will probably be delayed until the end of that project. Eventually, it will be possible to integrate netCDF objects with the rest of the HDF tool suite. Such an integration will then allow tools written for netCDF and tools written for HDF to both interact intelligently with the new data files.
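
To make the re-linking point concrete, here is a minimal sketch of a netCDF program that creates a file, again using the netCDF version 2 C interface; the file, dimension, and variable names are hypothetical. By the description above, the identical source, re-linked against NCSA's netCDF-on-HDF library instead of the Unidata library, would store its data in an HDF file.

    /*
     * Minimal netCDF creation sketch (netCDF version 2 C interface); all
     * names are hypothetical.  Per the description above, the same source,
     * re-linked against NCSA's library, would place the data in an HDF file
     * rather than an XDR-format netCDF file.
     */
    #include <netcdf.h>

    int main(void)
    {
        static double pressure[5] = {1000.0, 850.0, 700.0, 500.0, 250.0};
        long start[1] = {0};
        long count[1] = {5};
        int ncid, dimid, varid;

        ncid = nccreate("example.nc", NC_CLOBBER);   /* hypothetical file */
        dimid = ncdimdef(ncid, "level", 5L);
        varid = ncvardef(ncid, "pressure", NC_DOUBLE, 1, &dimid);
        ncendef(ncid);                               /* leave define mode */

        ncvarput(ncid, varid, start, count, (void *) pressure);
        ncclose(ncid);
        return 0;
    }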

Ingrid

Ingrid, by M. Benno Blumenthal, is designed to manipulate large datasets and model input/output. Given the proper commands in its command file, it can read data from its data catalog, a netCDF file, or a directly attached model, and output the data, either by feeding it to a model, creating a netCDF file, or creating plots and other representations of the data. Ingrid has a number of filters which allow simple data manipulations, such as adding two datasets together, smoothing, averaging, and regridding to a new coordinate.

Ingrid is still under development, and the source code is not yet available for public release. It currently runs only on SGI (it uses the POINTER and STRUCTURE extensions to FORTRAN). In addition to netCDF, it also reads HDF, CDF, VOGL, and SGI GL.

Ingrid is currently running as a WWW daemon that can be accessed through http://rainbow.ldgo.columbia.edu/datacatalog.html to see some of its capabilities on a climate data catalog maintained by the Climate Group of the Lamont-Doherty Earth Observatory of Columbia University. To quote the introduction:

  The Data Catalog is both a catalog and a library of datasets, i.e. it both helps you figure out which data you want, and helps you work with the data. The interface allows you to make plots, tables, and files from any dataset, its subsets, or processed versions thereof.

  This data server is designed to make data accessible to people using WWW clients (viewers) and to serve as a data resource for WWW documents. Since most documents cannot use raw data, the server is able to deliver the data in a variety of ways: as data files (netCDF and HDF), as tables (html), and in a variety of plots (line, contour, color, vector) and plot formats (PostScript and gif). Processing of the data, particularly averaging, can be requested as well. The Data Viewer in particular demonstrates the power of the Ingrid daemon. ...

LinkWinds

The Linked Windows Interactive Data System (LinkWinds) is a visual data exploration system which has served as a testbed and prototyping environment for a NASA/JPL program of research into graphical methods for rapidly and interactively accessing, displaying, and analyzing large multivariate multidisciplinary datasets. It provides a variety of functions and services, including interactive 2-dimensional and 3-dimensional graphical displays of data, hard copy of graphical displays and text, interactive color manipulation, animation creation and display, data subsetting either at the input or output, a journal and macro capability, context-sensitive help, and network support for data search and retrieval as well as collaborative data analysis. It is an integrated multi-application execution environment, running under UNIX, with a full graphical user interface (GUI). The system has several facilities for database access and provides for the ingestion of very large data files in a variety of database formats.

Data formats currently accepted are:

1. Raw binary integer (1 - 4 byte) and floating point (4, 8 byte) data.
2. The Hierarchical Data Format (HDF).
3. The Common Data Format (CDF).
4. NetCDF.
5. The Silicon Graphics, Inc. native RGB image format.
6. Data with Planetary Data System (PDS) headers.
7. The astrophysics Flexible Image Transport System (FITS).

The current version, 2.1, executes only on SGI platforms under IRIX 4 or IRIX 5. An alpha-test build for the Sun workstation running under Solaris 2.4 is also available. It contains the 2-dimensional subset of LinkWinds applications only. Various LinkWinds packages are available via anonymous FTP from the ftp://twinky.jpl.nasa.gov/pub/LinkWinds directory.

MEXCDF

An interface between MATLAB and netCDF called MEXCDF has been developed by Chuck Denham at the US Geological Survey in Woods Hole. MEXCDF is a complete interface between netCDF and MATLAB that uses the MATLAB mexfile utility to allow MATLAB users to read, write, and manipulate netCDF data files in an efficient manner, following the C language interface syntax found in the netCDF 2.3 users manual. In addition, the interface has been enhanced in several ways:

* Dimensions and variables accessible by number or name.
* Attributes accessible by number or name.
* Parameters accessible by number or name.
* Prepended "nc" not necessary for operation names.
* Prepended "NC_" not necessary for specifying parameters.
* Parameter names not case-sensitive.
* Required lengths default to actual lengths via -1.
* AutoScaling via "scale_factor" and "add_offset" attributes.

As an example, to read the 2D array 'elev' in the file 'foo.cdf', declared as

    short elev(lat, lon);
        elev:scale_factor = 100.;
        elev:add_offset = 0.;

the required MATLAB commands using MEXCDF are simply:

    cdfid = mexcdf('open', 'foo.cdf', 'nowrite');
    elev = mexcdf('varget', cdfid, 'elev', [0 0], [-1 -1]);
    mexcdf('close', cdfid);

The edge value "-1" means get all the values along that dimension, and scale_factor and add_offset are handled automagically by varget (the returned values are the stored values times scale_factor, plus add_offset). For more information regarding this software, get the file /pub/mexcdf/README via anonymous FTP from crusty.er.usgs.gov (128.128.19.19) or contact Rich Signell at rsignell@crusty.er.usgs.gov.

MexEPS

PMEL has developed a MATLAB interface, MexEPS, which supports several netCDF file conventions, including those adopted by PMEL. Many styles of time axes are supported, and time manipulation routines ease the use of the time axis in MATLAB.

The MexEPS package supports the following data formats:

* reading, writing, and editing netCDF files
* reading and writing Classic EPIC files
* reading formatted ASCII files

It includes:

* VARIABLE, AXIS, and ATTRIBUTE manipulation routines
* TIME manipulation
   o TIME enters MATLAB as YYMMDDhhmmss.fff
   o Can be converted to the netCDF udunits time convention (e.g. days since 1990-01-01 00:00:00)
* MATLAB help and example scripts using MexEPS
* an ASCII2MAT mexFunction, which reads a formatted ASCII file into MATLAB as a matrix

The MexEPS package is freely available in PMEL's anonymous FTP directory ftp://ftp.pmel.noaa.gov/eps/mexeps/. If you have any questions or comments, please contact the author, Willa Zhu (willa@pmel.noaa.gov), or Nancy Soreide (nns@pmel.noaa.gov).

NetCDF File Calculator and Data Editor

The netCDF calculator and data editor is part of the EPIC package, described above.

Algebraic manipulations available under the netCDF calculator include arithmetic operations (addition, subtraction, multiplication, and division), exponentiation, log functions, square root, and some additional functions such as differentiation, integration, regridding, and removing the mean value. In addition to the built-in functions, the calculator also allows user-written routines for manipulation of data. The netCDF calculator also provides the ability to extract a subset of the EPIC System data sets by specifying limits in any or all of the four dimensions (x,y,z,time) or (i,j,k,l).

The netCDF calculator includes an interactive data-editing function for one-dimensional data sets. This displays the data and provides for replacement of individual points or a range of data points by linear interpolation or by typing in a replacement value. Data sets generated by the calculator data manipulation or data editing functions can be plotted with the standard PPLUS plotting commands and can also be written out as data files in any of the EPIC System formats (including netCDF).

The netCDF calculator uses an expression language, much like C: although there are several control-flow statements, most statements, such as assignments, are expressions whose value is disregarded. For example, the assignment operator "=" assigns the value of its right operand to its left operand and yields the value, so multiple assignments work. The calculator knows about four different data types: scalar (double precision floating point), string, slab (a specification for a four-dimensional hyperslab), and field (a sub-sampled region of a four-dimensional data set). The syntax for each of the four types is similar; however, not all operations are valid for all data types.

The netCDF calculator discussed here is a simple programmable interpreter for floating point, string, and field expressions. It has been extensively modified from hoc (The UNIX Programming Environment, Kernighan and Pike, Chapter 8). It has C-style control flow, function definition, and the usual numerical built-in functions. The netCDF calculator has been developed using lex, a lexical analyzer generator, and yacc, a parser generator. This allows a systematic and consistent syntax to be implemented easily.

NextStep NetCDF Utilities

Michael DeMan of Western Washington University has developed a set of netCDF utilities for NextStep. The utilities include a generator, an extractor, an importer, and an exporter. The generator can be used to create netCDF files that follow the COARDS conventions. The extractor provides a GUI wrapper for the ncextr utility, to extract cross sections and subsets of netCDF data. The importer accesses data from ASCII spreadsheet-style files. The exporter does the reverse, creating a spreadsheet-style matrix in one or more files from specified cross-sections of netCDF data.

The utilities are available via anonymous FTP from ftp://iceberg.cs.wwu.edu/michael/NetCDF/. A README file provides more information about the utilities, which are available for NeXT and i386 platforms. For more information, contact Michael DeMan.

PolyPaint+

PolyPaint+ is an interactive scientific visualization tool that displays complex structures within three-dimensional data fields. It provides both color shaded-surface display and simple volumetric rendering in either index or true color. For shaded-surface rendering, the PolyPaint+ routines first compute the polygon set that describes a desired surface within the 3D data volume.

These polygons are then rendered as continuously shaded surfaces. PolyPaint+ contains a wide variety of options that control lighting, viewing, and shading. Objects rendered volumetrically may be viewed along with shaded surfaces. Additional data sets can be overlaid on shaded surfaces by color coding the data according to a specified color ramp. 3D visualizations can be viewed in stereo for added depth perspective. Currently supported 3D visualizations are the following:

* Shaded isosurface
* Transparent contour shells or isosurfaces at varying levels
* Volumetric or density plot
* Planes
* Contour ribbons
* Topographic surface from 2D geographic data sets

3D data volumes may be sliced in the X, Y, or Z plane using an interactive cutting plane. A cross section of the data volume can be viewed in a 2D window as a 2D contour plot, a vector plot, a raster image, or a combination of these options superimposed. Map outlines can be used as a background for 2D cross-section plots of geographic data. All data is projected according to the coordinates specified by the user for the cross-section window.

The user interface provides direct manipulation tools for specifying the eye position, center of view, light sources, and color ramps. Subsetting of data can be done easily by selecting the data by index or geographic coordinate. On-line contextual help provides easy access to more detail about the software. Tutorials which range from very simple visualizations to complex combinations of data sets provide the user with a quick learning tool.

Currently PolyPaint+ accepts only data which is in the netCDF file format. A file conversion utility which converts from raw binary data to netCDF is a part of the application.

PolyPaint+ is a joint effort of the University of Colorado and NCAR (National Center for Atmospheric Research) funded by the NASA AISRP program. A beta version of PolyPaint+ is currently available free of charge using FTP, or for a nominal fee which would cover tape distribution. A license agreement must be signed in order to use it. You may order by:

* TELEPHONE: 303-492-7289 (Margi Klemp) or 303-497-8159 (Bill Boyd)
* U.S. MAIL: Margi Klemp, University of Colorado / LASP, 1234 Innovation Dr., Boulder, CO 80303, USA
* E-MAIL: margi@aries.colorado.edu

SciAn

SciAn is a scientific visualization package developed at the Supercomputer Computations Research Institute at Florida State University. More information about SciAn is available from the URL http://www.scri.fsu.edu/~mimi/scian.html.

SciAn brings together the power of 3-dimensional scientific visualization and movie making with the ease of use and familiarity of object-oriented drawing packages. SciAn makes it very easy to apply visualization to new data by minimizing the number of decisions a researcher needs to make before seeing the first image on the screen. Once there is an image to work with, the researcher can modify the visualization to bring out details in the data using a wide variety of controls. SciAn can show any number of visualizations on the screen at one time. The viewpoint of different visualizations can be linked, enabling easy comparison of data from observation and simulation. Complete movie scenes can be made by adding a few lines to a SciAn-generated script. SciAn can operate a laser videodisc recorder automatically or save images to files for recording later. SciAn produces isosurface, color mesh, line contour, trace, arrow, ball & stick, point cloud, and numeric display visualizations.

It can work with fields defined over structured or nonstructured grids and geometric data in several formats, including NCSA's Hierarchical Data Format, netCDF, and Protein Data Bank formats. SciAn is available, with full documentation, in electronic form over the network free of charge. For more information on SciAn, send electronic mail to scian-info@scri.fsu.edu or write to the Supercomputer Computations Research Institute, Florida State University, Tallahassee, Florida, 32306-4052.

SciAn requires:

* a Silicon Graphics IRIS 4D series or IBM RISC System/6000 workstation with the GL graphics library
* a Z-buffer
* an ANSI-compliant C compiler

Desirable but not required are:

* an Alpha buffer
* double buffering
* FORTRAN and external libraries for more features

Zebra

Zebra (formerly named Zeb) is a system for data ingest, storage, integration, and display, designed to operate in both real-time and postprocessing modes. Zebra was developed by Jonathan Corbet and others in NCAR's Research Data Program.

Zebra's primary use is for the superpositioning of observational data sets (such as those collected by satellite, radar, mesonet, and aircraft) and analysis products (such as model results, dual-Doppler synthesis, or algorithm output). Data may be overlaid on a variety of display types, including constant-altitude planes, vertical cross-sections, X-Y graphs, Skew-T plots, and time-height profiles. The fields for display, color tables, contour intervals, and various other display options are defined using an icon-based user interface. This highly flexible system allows scientific investigators to interactively superimpose and highlight diverse data sets, thus aiding data interpretation.

Data handling capabilities permit external analysis programs to be easily linked with display and data storage processes. The data store accepts incoming data, stores it on disk, and makes it available to processes which need it. An application library is available for data handling. The library functions allow data storage, retrieval, and queries using a single applications interface, regardless of the data's source and organization. NetCDF data that conforms to Zebra conventions is supported by this interface.

The NCAR Research Data Program is currently making the Zebra system available to the university research community, either in the form of the STORM-FEST CD-ROM set or as a source-code distribution available over the network. In addition, configuration files for all projects in which Zebra has been used are available with the source code distribution, and some quick-look datasets are available as well. The software is distributed without warranty and may not be used for commercial purposes. Persons interested in obtaining Zebra should contact Michele Case (case@stout.atd.ucar.edu).

----------------------------------------------------------------------------

User-Contributed Software

Unidata makes available a separate catalog (http://www.unidata.ucar.edu/packages/netcdf/contrib.html) to a directory (ftp://ftp.unidata.ucar.edu/pub/netcdf/contrib/) of freely available, user-contributed software and documentation related to the netCDF library. This software may be retrieved by anonymous FTP. We haven't necessarily used or tested this software; we make it available "as is".

The criteria for inclusion in the netcdf/contrib/ directory of user-contributed software are:

* General usefulness to a significant part of the netCDF community
* Small size
* Infrequent need for updates
* Free availability

----------------------------------------------------------------------------

AVS

AVS (Application Visualization System) is a visualization application software and development environment. An AVS module has been written that allows multi-dimensional netCDF data sets to be read into AVS as uniform or rectilinear field files. The AVS user can point and click to specify the name of the variable in the selected netCDF file, as well as to select the hyperslab. If a 1D coordinate variable exists (a variable that has the same name as a dimension), then the coordinate variable will be used to specify the coordinates of the resulting rectilinear field file. If no coordinate variable exists, then the resulting field file will be uniform.

Once in AVS, there are hundreds of analysis and display modules available for image processing, isosurface rendering, arbitrary slicing, alpha blending, streamline and vorticity calculation, particle advection, etc. AVS runs on many different platforms (Stardent, DEC, Cray, Convex, E and S, SET, Sun, IBM, SGI, HP, FPS, and WaveTracer), and it has a flexible data model capable of handling multidimensional data on non-Cartesian grids.

The module source code and documentation are available from the International AVS Center, in the ftp://testavs.ncsc.org/avs/AVS5/Module_Src/data_input/read_netcdf/ directory. See also the information on DDI for another way to use netCDF data with AVS.
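
For reference, the following sketch shows what the coordinate-variable convention mentioned above looks like when a file is written with the netCDF version 2 C interface; all names are hypothetical. The 1D variables "lat" and "lon" share their names with the dimensions they are defined on, so a reader such as this AVS module can use their values as the rectilinear coordinates of the data variable "temp".

    /*
     * Sketch of the netCDF coordinate-variable convention (netCDF version 2
     * C interface); all names are hypothetical.  "lat" and "lon" are 1D
     * variables with the same names as the dimensions they are defined on,
     * so their values give the coordinates of "temp" along those axes.
     */
    #include <netcdf.h>

    int main(void)
    {
        static float lats[3] = {-10.0, 0.0, 10.0};
        static float lons[4] = {0.0, 90.0, 180.0, 270.0};
        long zero = 0, nlat = 3, nlon = 4;
        int ncid, latdim, londim, latvar, lonvar, dims[2];

        ncid = nccreate("grid.nc", NC_CLOBBER);               /* hypothetical */
        latdim = ncdimdef(ncid, "lat", 3L);
        londim = ncdimdef(ncid, "lon", 4L);
        latvar = ncvardef(ncid, "lat", NC_FLOAT, 1, &latdim); /* coordinate   */
        lonvar = ncvardef(ncid, "lon", NC_FLOAT, 1, &londim); /* variables    */
        dims[0] = latdim;
        dims[1] = londim;
        ncvardef(ncid, "temp", NC_FLOAT, 2, dims);            /* data on grid */
        ncendef(ncid);

        ncvarput(ncid, latvar, &zero, &nlat, (void *) lats);
        ncvarput(ncid, lonvar, &zero, &nlon, (void *) lons);
        /* values for "temp" itself are omitted from this sketch */
        ncclose(ncid);
        return 0;
    }

Without the "lat" and "lon" variables, the module described above would produce a uniform field, since no coordinate values would be available in the file.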

Environmental WorkBench

SuperComputer Systems Engineering and Services Company (SSESCO) has developed the Environmental WorkBench (EWB), an easy-to-use visualization and analysis application targeted at environmental data. The EWB currently has numerous users in the fields of meteorological research, air quality work, and groundwater remediation. EWB system features include:

* Random-access file structure using the netCDF-based public domain MeRAF file system, with support for gridded, discrete (non-grid-based observation), and particle types
* Support for geo-referenced or Cartesian coordinate systems
* Object-oriented graphical user interface (GUI) that is very easy to use
* Tools for converting model and observational data sets and data writers to netCDF
* Interactive rotation/translation of scenes in 3D space
* Time sequencing controls to step forward/backward, animate sequentially, or go to a chosen time step, including multiple asynchronous or non-uniform time steps
* Interactive slicers to select cross sections through 3D data sets
* Display operators available on the slices, including
   o Contour lines with selectable contour levels
   o Color shading by data value with variable transparency level
   o Arrow and streamline representation for vector quantities
   o Positional reference lines at user-selected intervals
   o Color-coded shapes at each grid node
* Multiple 3D isosurfaces at selected parameters and values with variable transparency
* Display of particle positions with coloring by type, height, and source
* Display of discrete data using colored spheres and labels for scalar data and arrows for vectors (with arrowheads or meteorological style)
* Multiple user-definable color maps to which isosurface and colored field shading may be separately assigned
* On-screen annotation for generation of report-ready figures
* Image export in any of the common image formats (gif, tiff, encapsulated postscript, etc.)
* Graceful handling of missing or bad data values by all the graphics rendering routines
* Automatic data synchronization to allow automatic screen updating as new data arrives in real-time from a model or set of sensors
* Two- and three-dimensional interpolation from scattered observations to a grid, using the Natural Neighbor Method; this robust volume-based method yields results far superior to distance-weighting schemes

Systems currently supported include IBM RS/6000, Silicon Graphics, HP, and SUN workstations, with a PC version scheduled for release.

SSESCO has implemented a meta-file layer on top of the netCDF library, called MeRAF. It handles multiple netCDF files as well as automatic max-min calculations; time-varying gridded, particle, and discrete data; logical groupings for discrete data; and an overall simplified and flexible interface for storing scientific data. MeRAF is being used by the DOE at the Hanford Meteorological Site for observational data and will be used for their weather modeling.

IBM Visualization Data Explorer

The IBM Visualization Data Explorer (Data Explorer or simply DX) is a general-purpose software package for data visualization and analysis. It employs a data-flow-driven client-server execution model and provides a graphical program editor that allows the user to create a visualization using a point-and-click interface. DX runs on 7 major UNIX platforms and is designed to take full advantage of multi-processor systems from IBM, SGI, and Sun.

DX is built upon an internal data model, which describes and provides uniform access services for any data brought into, generated by, or exported from the software. This data model supports a number of different classes of scientific data, which can be described by their shape (size and number of dimensions), rank (e.g., scalar, vector, tensor), type (float, integer, byte, etc. or real, complex, quaternion), where the data are located in space (positions), how the locations are related to each other (connections), and aggregates or groups (e.g., hierarchies, series, composites, multizone grids, etc.). It also supports those entities required for graphics and imaging operations within the context of Data Explorer. Regular and irregular, deformed or curvilinear, structured and unstructured data, as well as "missing" or invalid data, are supported. The details of the data model are hidden at the user level. As a result, DX operations or modules are polymorphic and appear typeless.

The DX Import module, which reads data for use within Data Explorer, directly utilizes data in netCDF as well as other formats (e.g., HDF, CDF). One or more variables may be selected, as well as step(s) of a time series. Data in conventional netCDFs are directly imported. Since the DX data model is more comprehensive than the netCDF data model, a methodology to extend netCDF via attribute conventions (e.g., for unstructured meshes, non-scalar data, and hierarchies) for use with Data Explorer is available.

DX supports a number of realization techniques for generating renderable geometry from data. These include color and opacity mapping (e.g., for surface and volume rendering), contours and isosurfaces, histograms, two-dimensional and three-dimensional plotting, surface deformation, etc. for scalar data. For vector data, arrow plots, streamlines, streaklines, etc. are provided. Realizations may be annotated with ribbons, tubes, axes, glyphs, text, and display of data locations, meshes, and boundaries. Data probing, picking, arbitrary surface and volume sampling, and arbitrary cutting/mapping planes are supported.

DX supports a number of non-graphical functions such as point-wise mathematical expressions (e.g., arithmetic, transcendental, boolean, type conversion, etc.), univariate statistics, and image processing (e.g., transformation, filter, warp, edge detection, convolution, equalization, blending, morphological operations, etc.). Field/vector operations such as divergence, gradient, and curl, dot and cross products, etc. are provided. Non-gridded or scattered data may be interpolated to an arbitrary grid or triangulated, depending on the analysis requirements. The length, area, or volume of various geometries may also be computed. Tools for data manipulation, such as removal of data points, subsetting by position, sub/supersampling, grid construction, mapping, interpolation, regridding, transposition, etc., are available. Tools for doing cartographic projections and registration, as well as earth, space, and environmental sciences examples, are available at Cornell University via info.tc.cornell.edu.

Contact Lloyd Treinish (lloydt@watson.ibm.com) for more information about using data in netCDF with the IBM Visualization Data Explorer.

IDL interface

IDL (Interactive Data Language) is a scientific computing environment that combines mathematics, advanced data visualization, scientific graphics, and a graphical user interface toolkit to analyze and visualize scientific data. Designed for use by scientists and scientific application developers, IDL's array-oriented, fourth-generation programming language allows you to prototype and develop complete applications. IDL now supports data in netCDF format. IDL is a commercial product available from Research Systems, Inc.

The company's address is 777 29th Street, Suite 302, Boulder, CO; telephone (303) 786-9900.

As an example, here is how to read data from a netCDF variable named GP in a file named "data/april.nc" into an IDL variable named gp using the IDL language:

    id = ncdf_open('data/april.nc')
    ncdf_varget, id, ncdf_varid(id, 'GP'), gp

Now you can visualize the data in the gp variable in a large variety of ways and use it in other computations in IDL. You can FTP a demo version of IDL, including the netCDF interface, by following the instructions in pub/idl/README, available via anonymous FTP from gateway.rsinc.com or boulder.colorado.edu.

IRIS Explorer Module

The Atmospheric and Oceanic Sciences Group at the National Center for Supercomputing Applications (NCSA) and the Mesoscale Dynamics and Precipitation Branch at NASA-Goddard Space Flight Center have developed the NCSA PATHFINDER module set for IRIS Explorer. Two of the modules, ReadDFG (to output Grids) and ReadDF (to output Lattices), are capable of reading from NCSA HDF files, MFHDF/3.3 files, and Unidata netCDF files. A user-friendly interface provides control and information about the contents of the files.

For ReadDF, the format translation is handled transparently. Up to five unique lattices may be generated from the file (as these files can contain multiple data fields) using a single module. A variety of dimensionalities and data types are supported also. Multiple variables may be combined in a single lattice to generate vector data. All three Explorer coordinate systems are supported.

With ReadDFG, user-selected variables from the file are output in up to five PATHFINDER grids. Each grid can consist of scalar data from one variable or vector data from multiple variables. Coordinate information from the file is also included in the grids. Any number of dimensions in any of the Explorer coordinate types are supported.

For more information on the NCSA PATHFINDER project and other available modules, visit the WWW/Mosaic PATHFINDER Home Page at http://redrock.ncsa.uiuc.edu/PATHFINDER/pathrel2/top/top.html. The ReadDF module may be downloaded either via the WWW server or by anonymous FTP from redrock.ncsa.uiuc.edu in the /pub/PATHFINDER directory. For more information, please send email to pathfinder@redrock.ncsa.uiuc.edu.

See also the information on DDI for another way to use netCDF data with IRIS Explorer.

MATLAB

MATLAB is a software package that integrates numerical analysis, matrix computation, signal processing, and graphical display. Two freely available software packages that implement a MATLAB/netCDF interface are available: MEXCDF and MexEPS.

NCAR Graphics

The Scientific Visualization Group of the NCAR Scientific Computing Division has developed the NCAR Command Language (NCL) for interactive data manipulation and plot specification. This language supports a project to create, from existing utilities, a scientific visualization environment that provides a means of reading and writing data, a means of manipulating that data, and a means of visually analyzing the data interactively. NCL is intended to provide easy and intuitive access to datasets and allow users to explore and process their data prior to visualization. Since datasets often come in a variety of data formats, grid sizes, grid resolutions, and units, very different datasets often need to be combined, compared, and used at the same time.

Currently, specialized applications must be developed to read individual datasets and transform them into a form that is compatible with other datasets being used, as well as with the graphics package being used. NCL allows different datasets, in different storage formats, to be imported into one uniform and consistent data manipulation environment. The primary data format used internally by NCL is the netCDF data format. NCL doesn't place any restrictions or conventions on the organization of input netCDF files.

NCL is a complete programming language that provides flexibility and configurability. In NCL the primary data type is the data file record. A data file record stores one or more variables, dimension information, coordinate variable information, and attribute information as one NCL object. A binary file can be read in; dimension names, variable names, attributes, and coordinate variables can be assigned to it using NCL language constructs; and the resulting file record can be written to any of the currently supported formats, including netCDF, without writing a single line of source code.

The NCL language also provides the ability to specify configuration parameters for visualizations of data. These descriptions can be stored and used as user defaults and reused as plot specifications. The function set of NCL contains built-in data processing and mathematical functions and can be extended by the user to provide custom data processing techniques, as well as custom data ingestion. Contact Ethan Alpert, the NCAR Interactive project coordinator, at ethan@ncar.ucar.edu for more information.

PV-Wave

PV-Wave is a software environment from Visual Numerics for solving problems requiring the application of graphics, mathematics, numerics, and statistics to data and equations. PV-WAVE uses a fourth-generation language (4GL) that analyzes and displays data as you enter commands. PV-WAVE includes integrated graphics, numerics, data I/O, and data management. The latest version of PV-Wave supports data access in numerous formats, including netCDF. See also the information on DDI for another way to use netCDF data with PV-Wave.

PPLUS

Plot-Plus (PPLUS) is a general-purpose scientific graphics package which is used in several PMEL applications. It will read most standard ASCII or binary files, as well as the netCDF file format, which is used by the TOGA-TAO Project and the EPIC system for data management, display, and analysis. PPLUS is an interactive, command-driven, scientific graphics package which includes features such as Mercator projection, polar stereographic projection, color or gray-scale area-fill contour plotting, and support for many devices: X Windows, PostScript, HP, Tektronix, and others. This powerful and flexible package recognizes the netCDF data format, and it can extract axis labels and graph titles from the data files. The user can customize plots or combine several plots into a composite. Plots are of publication quality.

The PPLUS graphics package is used for all the TAO workstation displays, including the animations. The animations are created by generating a PPLUS plot for each frame, transforming the PPLUS metacode files into HDF format with the PPLUS m2hdf filter, and then displaying the resulting bit maps as an animation with the XDataSlice utility, which is freely available on the Internet from the National Center for Supercomputing Applications, at anonymous@ftp.ncsa.uiuc.edu (141.142.20.50). There is also a new m2gif utility which produces GIF files from PPLUS metacode files.

PPLUS is supported for most Unix systems and for VAX/VMS, and is in use at many oceanographic institutes in the US (e.g., PMEL, Harvard, WHOI, Scripps, NCAR, NASA, University of Rhode Island, University of Oregon, Texas A&M, ...) and also internationally (Japan, Germany, Australia, Korea, ...). Examples of PPLUS graphics are any of the graphs in the Web pages at:

* http://www.pmel.noaa.gov/epic/home.html
* http://www.pmel.noaa.gov/toga-tao/home.html
* http://www.pmel.noaa.gov/toga-tao/realtime.html

A site license may be purchased for the PPLUS graphics package. There is no limit to the number of workstations at the site using PPLUS, and you get the source code (but you can't distribute it). To purchase PPLUS, contact Dr. Donald W. Denbo:

* Internet: pplus@olympus.net
* Fax and voice: 360-452-2156
* Postal mail: Plot Plus Graphics, P.O. Box 4, Sequim

Spyglass Slicer and Dicer

Spyglass Slicer for Windows/Windows NT is a commercial volumetric data analysis and visualization tool that reads a variety of data and image formats, including netCDF. After importing the data, Slicer uses ray tracing to render data as volumes, slices, or isosurfaces. It's possible to make any data value transparent, translucent, or opaque, and users can render slices in orthogonal planes or at arbitrary angles. Animations can be created from user-specified key frames.

Spyglass Dicer is a similar commercial visualization tool for Apple Macintoshes that displays three-dimensional volumes of data in color, and that allows you to interactively create orthogonal slices, oblique slices, isosurfaces, blocks, and cutouts for viewing those data values. You can make all the data within a certain range of values invisible, create time or space animation sequences, print images in halftones or on color printers, and select from a variety of save and output features. Spyglass Dicer will read HDF and netCDF files as well as generic data. Although Dicer is designed primarily for the visual inspection and analysis of volumetric data, it also provides data file translation, resampling, and reformatting functions for netCDF and HDF files.

Slicer and Dicer are published and marketed by Spyglass, URL: http://www.spyglass.com/, e-mail: info@spyglass.com, phone: (217) 355-6000, fax: (217) 355-8925.

VISAGE and Decimate

VISAGE (VISualization, Animation, and Graphics Environment) is a turnkey 3D visualization system developed at General Electric Corporate Research and Development (Schroeder, WJ et al, "VISAGE: An Object-Oriented Scientific Visualization System", Proceedings of Visualization '92 Conference). VISAGE is designed to interface with a wide variety of data and uses netCDF as the preferred format. VISAGE is used at GE Corporate R & D, GE Aircraft Engine, GE Canada, and GE Power Generation, as well as ETH Zurich, Switzerland, MQS in Chieti, Italy, and Rensselaer Polytechnic Institute in Troy, New York.

GE has another application called "Decimate" that does polygon reduction/decimation (Schroeder, WJ et al, "Decimation of Triangle Meshes", Proceedings of SIGGRAPH '92). This application uses netCDF as a preferred format. Decimate is currently licensed to Cyberware, Inc., makers of 3D laser digitizing hardware. Decimate is currently bundled with the scanners and will soon be available as a commercial product.

WXP

WXP is a software package developed at Purdue University in the Department of Earth and Atmospheric Sciences. It is intended to be a general-purpose weather visualization tool for current and archived meteorological data.

The products available with WXP are derived from data obtained from the National Weather Service and the University of Wisconsin. WXP decodes broadcast weather data and model outputs into WXP-specific netCDF files, and provides analysis and display application programs for manipulating and displaying the data in those files.

Universities must be licensed by Unidata and Purdue to obtain the WXP software. For information on obtaining a WXP license, contact support@unidata.ucar.edu.

----------------------------------------------------------------------------