The HDF4 CF (H4CF) Conversion Toolkit can access various NASA HDF4 and HDF-EOS2 files by following the CF conventions. The toolkit includes a conversion library for application developers and a conversion utility for NetCDF users. In the conversion library, we have encoded the information obtained from various NASA HDF-EOS2 and HDF4 files and the corresponding product documents, translated into the form required by CF. We have also implemented an HDF4-to-NetCDF (either NetCDF-3 or NetCDF-4 classic) conversion tool using this conversion library. On this page, we will first explain how to build the conversion library and the tool from source. Then, we will cover basic usage of the tool and the conversion library APIs. We will also present the supported NASA HDF-EOS2 and HDF4 products, along with visualization screenshots of some converted NetCDF files.
If you need the HDF4-to-NetCDF conversion tool only, you can download static binaries from Table 1.
Platform | Binary for NetCDF-3 converter | Binary for NetCDF-4 converter |
---|---|---|
Mac OS 10.11-10.15 for version 1.3 | h4tonccf | h4tonccf_nc4 |
Mac OS 10.12 (Sierra) for version 1.2 | h4tonccf | h4tonccf_nc4 |
Mac OS 10.11 (El Capitan) for version 1.2 | h4tonccf | h4tonccf_nc4 |
Mac OS 10.10 (Yosemite) for version 1.1 | h4tonccf | h4tonccf_nc4 |
Mac OS 10.9 (Mavericks) for version 1.1 | h4tonccf | h4tonccf_nc4 |
Mac OS 10.8 (Mountain Lion) for version 1.1 | h4tonccf | h4tonccf_nc4 |
Linux CentOS 7 (x86_64) for version 1.3 | h4tonccf | h4tonccf_nc4 |
Linux CentOS 6 (x86_64) for version 1.3 | h4tonccf | h4tonccf_nc4 |
Linux CentOS 7 (x86_64) for version 1.2 | h4tonccf | h4tonccf_nc4 |
Linux CentOS 7 (x86_64) for version 1.1 | h4tonccf | h4tonccf_nc4 |
Linux CentOS 6 (x86_64) for version 1.1 | h4tonccf | h4tonccf_nc4 |
Linux CentOS 5 (x86_64) for version 1.1 | h4tonccf | h4tonccf_nc4 |
Windows (Visual Studio on Windows 7) for version 1.1 | h4tonccf | h4tonccf_nc4 |
If you want to develop your own application like the HDF4-to-NetCDF conversion tool with the H4CF library APIs, you need to download the source code package from here. Please unpack the gzipped tar archive to a location (e.g., /usr/local/src) where source packages are usually built on your machine. The package includes the conversion tool source code and documentation.
You may confirm the SHA-256 checksums of the binaries and the source tarball against h4cf_checksum.
Building the H4CF conversion library requires the installation of the HDF4 and HDF-EOS2 libraries. In addition, building the HDF4-to-NetCDF conversion tool requires the installation of the latest (version 4.3.0 or above) NetCDF C library. The HDF4 library must be configured with the "--disable-netcdf" option; otherwise, the HDF4-to-NetCDF conversion tool cannot be built.
To support NetCDF-4 classic output from the converter, you must install the latest HDF5 library before you install the NetCDF library. If you prefer to generate NetCDF-3 output, the NetCDF library must be configured with the "--disable-netcdf-4" option, and the HDF5 library installation is no longer required. However, we encourage you to choose NetCDF-4 classic conversion because the NetCDF-3 file format does not support the unsigned 32-bit integer type, which may cause overflow errors during the conversion.
The H4CF conversion toolkit 1.3 was tested with HDF 4.2.15, HDF-EOS 2.19, NetCDF 4.7.3, and HDF5 1.10.5 (for NetCDF-4).
The H4CF conversion toolkit 1.2 was tested with HDF 4.2.13, HDF-EOS 2.19, NetCDF 4.4.1.1, and HDF5 1.8.19 (for NetCDF-4).
The H4CF conversion toolkit 1.1 was tested with HDF 4.2.11, HDF-EOS 2.19, NetCDF 4.3.3.1, and HDF5 1.8.15-patch1 (for NetCDF-4).
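The prerequisite constraints above can be sketched as the following build sequence. Only the --disable-netcdf and --disable-netcdf-4 flags come from this page; the source directory names and install prefix are placeholders you must adjust.

```shell
# HDF4 must be built without its bundled netCDF API so the converter can link:
cd hdf-4.2.15 && ./configure --disable-netcdf --prefix=/usr/local && make install

# For NetCDF-3-only output, build NetCDF without HDF5-based NetCDF-4 support:
cd netcdf-c-4.7.3 && ./configure --disable-netcdf-4 --prefix=/usr/local && make install

# For NetCDF-4 classic output instead, install HDF5 first, then build NetCDF
# with its default (NetCDF-4-enabled) configuration.
```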
We provide a configure script, maintained by the autoconf system, to set up the build environment. You should specify the paths to the installed libraries as arguments when you run the configure script. Figure 1 shows how to specify the configuration options. You should change the path values (e.g., /usr/local) to match the actual paths on your system.
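As a concrete illustration of the Figure 1 invocation, a configure command might look like the sketch below. The --with-szlib, --with-zlib, --with-jpeg, and --with-netcdf options are named in the text that follows; the --with-hdf4 and --with-hdfeos2 option names are assumptions based on common autoconf conventions, and every path must be adjusted to your system.

```shell
# Sketch only: --with-hdf4 and --with-hdfeos2 are assumed option names;
# replace each /usr/local with the actual install prefix on your machine.
./configure --with-hdf4=/usr/local \
            --with-hdfeos2=/usr/local \
            --with-netcdf=/usr/local \
            --with-szlib=/usr/local \
            --with-zlib=/usr/local \
            --with-jpeg=/usr/local \
            --prefix=/usr/local
```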
If HDF4 and HDF-EOS2 are built without the szip library, the --with-szlib option can be omitted. For systems that already have the zlib and jpeg libraries, --with-jpeg and --with-zlib can also be omitted. If you want to install only the conversion library and your system does not have the NetCDF library, --with-netcdf can also be omitted. If your system already has the NetCDF library in the standard system path, you can skip building the conversion tool by specifying --with-netcdf=no. In this case, one may use the command shown in Figure 2. However, we strongly encourage users to build the converter utility as well. Therefore, by default, the configure script will build the conversion utility if a NetCDF library is found in your standard system path. That means you don't have to specify --with-netcdf as long as your default system build environment has the latest version of NetCDF.
The recommended command in Figure 1 will generate a Makefile. With the Makefile, you can build the H4CF library and the HDF4-to-NetCDF converter utility by issuing the make command as shown in Figure 3.
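The build-and-install step is presumably the conventional autoconf sequence, sketched below.

```shell
# Build the library and converter, then install under the configured prefix:
make
make install
```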
The above command will install the H4CF conversion library and the conversion tool. With the default value of the prefix variable in the above configuration, the make install command will produce the standalone static library files $prefix/lib/libh4cf.a and $prefix/lib/libh4cf.la. The C++ header files for application programs that use the library will be installed under $prefix/include. The HDF4-to-NetCDF conversion utility h4tonccf will be installed under $prefix/bin.
For the conversion tool h4tonccf, the above "configure" and "make install" commands will generate a version of the tool that produces either NetCDF-3 or NetCDF-4 classic files, but not both. The output depends on how you built your NetCDF library, as explained in the REQUIREMENTS section above.
To build the static conversion tool, one can also follow the Makefile.template under the /utility directory to build manually. We provide the dependent libraries, which can be found under the deps directories.
We will convert an HDF-EOS2 file from the NASA Goddard Space Flight Center Level 1 and Atmosphere Archive and Distribution System (LAADS) to NetCDF and examine it using Unidata IDV. Users can download the HDF file used in this example from here. To convert the example MODIS HDF-EOS2 file, run the converter as shown in Figure 5.
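The converter invocation takes the input HDF file and the output NetCDF file name; a sketch, with placeholder file names standing in for the actual MODIS granule:

```shell
# NetCDF-4 classic output (use h4tonccf instead for NetCDF-3 output):
h4tonccf_nc4 MYD021KM.hdf MYD021KM.nc
```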
The following figure is the view of the data fields of the converted NetCDF file MYD021KM when it is opened with IDV.
The following figure is the color-shaded plan view of the Earth View 1KM Emissive Bands data field of the MYD021KM file in IDV.
Next, we will convert a MODIS Terra file from the NASA Goddard Space Flight Center Ocean Biology Processing Group (OBPG) to NetCDF and examine it using Panoply. Users can download the HDF file used in this example from here. To convert the example HDF4 file, run the converter as shown in Figure 8.
The following figure is the view of the data fields of the converted NetCDF file T20000322000060.L3m_MO_NSST_4.nc when loaded into Panoply.
The following figure is the plot of l3m_data in T20000322000060.L3m_MO_NSST_4.nc generated by Panoply.
Finally, we will convert an HDF-EOS2 file from the NASA Goddard Earth Sciences Data and Information Center (GES DISC) to NetCDF and examine it using IDV. Users can download the HDF file used in this example from here. To convert the example AIRS HDF-EOS2 file, run the converter as shown in Figure 11.
The following figure is the view of the data fields of the converted NetCDF file AIRS.2002.08.01.L3.RetStd_H031.v4.0.21.0.G06104133732.nc when loaded into IDV.
The following figure is the color-shaded plan view of the Temperature_WM_A data field of the converted NetCDF file AIRS.2002.08.01.L3.RetStd_H031.v4.0.21.0.G06104133732.nc generated by IDV.
Table 2 summarizes the NASA products that are supported by the conversion tool. See also the demo section for comprehensive examples.
NASA Data Centers | Product |
---|---|
GES DISC | AIRS / TRMM / MERRA / TOMS |
LAADS | MODIS / Suomi NPP VIIRS |
LaRC ASDC | CERES / MOPITT |
LP DAAC | MODIS |
NSIDC | AMSR_E / MODIS / NISE |
OBPG | OCTS / SeaWiFS / CZCS / MODIS |
PO.DAAC | AVHRR |
Although the H4CF Conversion Toolkit satisfies the key requirements of the CF conventions that make NetCDF visualization tools work, there are some non-CF-compliant attributes that the toolkit does not process. To make such attributes follow the CF conventions, we recommend that users utilize either NcML or NCO.
For example, if you carefully examine the plot of the converted file in Figure 10, you will notice that there is no unit in the colorbar scale and the land area is covered with fill values, because the units attribute is not provided for l3m_data as a dataset attribute in the HDF4 file and the _FillValue attribute is also missing. Instead, the units information is provided as a file attribute called Units with the value deg-C. You can add (or overwrite) the units and _FillValue attributes and their values by writing a simple NcML file as shown in Figure 14.
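Such an NcML file might look like the sketch below. The namespace and element structure follow standard NcML 2.2; the units value deg-C comes from the file-level Units attribute described above, while the fill value shown is an assumption, not taken from the product.

```xml
<netcdf xmlns="http://www.unidata.ucar.edu/namespaces/netcdf/ncml-2.2"
        location="T20000322000060.L3m_MO_NSST_4.nc">
  <variable name="l3m_data">
    <!-- The value "deg-C" comes from the file-level Units attribute. -->
    <attribute name="units" value="deg-C"/>
    <!-- Hypothetical fill value; use the value documented for the product. -->
    <attribute name="_FillValue" type="float" value="-999.0"/>
  </variable>
</netcdf>
```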
Then, you can open the NcML file directly with Panoply after putting the NcML file and the converted NetCDF file into the same directory. Finally, Panoply will display the correct unit information excluding fill values as shown in Figure 15.
You can also create a new NetCDF file from NcML using the ToolsUI application available from Unidata. By creating the attribute-corrected NetCDF file from the converted file using the ToolsUI, NetCDF-C visualization tool like GrADS may visualize the attribute-corrected NetCDF file. For more information about this technique, please read the section Writing out files modified with NcML from the ToolsUI tutorial.
If you want to edit a batch of converted NetCDF files with a command-line tool instead of the ToolsUI GUI, you can download netcdfAll-4.3.jar and issue a command as shown in Figure 16. Make sure the netcdfAll-4.3.jar file is in the same directory where the a.ncml file exists. The a.ncml file should contain a line that refers to the converted file a.nc in the same directory.
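The command is presumably the standard netCDF-Java NcML writer invocation documented by Unidata; a hedged sketch, with b.nc as a placeholder output name:

```shell
# Write a new NetCDF file from a.ncml (which wraps the converted a.nc):
java -classpath netcdfAll-4.3.jar ucar.nc2.dataset.NetcdfDataset -in a.ncml -out b.nc
```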
NCO (version 4.3.0 or above) also allows you to edit the attributes of a NetCDF file via a command-line interface. You can get the same screenshot as in Figure 15 using the ncatted command in NCO as shown in Figure 17.
In the above example, the first line creates a new file (i.e., T20000322000060.L3m_MO_NSST_4.nco.nc) with the units attribute, and the second line modifies the new file directly.
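The two lines described above can be sketched with ncatted's documented -a name,variable,mode,type,value syntax; this is a reconstruction, and the _FillValue shown is an assumption rather than the product's documented value.

```shell
# Create a new file with a units attribute on l3m_data
# ('c' mode = create the attribute, type 'c' = character):
ncatted -a units,l3m_data,c,c,"deg-C" \
        T20000322000060.L3m_MO_NSST_4.nc T20000322000060.L3m_MO_NSST_4.nco.nc

# Modify the new file in place, adding a (hypothetical) float _FillValue:
ncatted -a _FillValue,l3m_data,c,f,-999.0 T20000322000060.L3m_MO_NSST_4.nco.nc
```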
Table 3 summarizes some of the NASA products that need editing after conversion. Please note that we could not test all NASA products, so the table is far from complete. Please let us know if you find other products that need editing.
NASA Data Centers | Product | Note |
---|---|---|
GES DISC | AIRS | Add units attribute. |
LAADS | MOD04/MYD04 | Change the scale factor. |
NSIDC | AMSR_E_L2A | Rename scale factor. (e.g., SCALE_FACTOR to scale_factor) |
OBPG | CZCS / MODIS | Add scale / fill value / offset attributes. |
We provide a directory called examples in the distribution package to demonstrate how the H4CF Library APIs work with different types of HDF4 files. The directory has several subdirectories with sample application code and data files. Each subdirectory has read, read_dims, read_subset, read_var_attrs, and read_file_attrs applications. To build these examples, modify the Makefile.template in each subdirectory to conform to the configuration of your system and compile them.
The H4CF library presents HDF-EOS2/HDF4 groups, variables, and attributes as abstract C++ objects in STL containers. When a file is opened, lists of pointers to file attributes and variables are returned as shown in Figure 5.
You can traverse all attributes and variables using the C++ STL list iterator as shown in Figure 6. All returned variable names and attribute names will follow the CF conventions.
The example programs employ these API methods to access the objects in an HDF file. The sample HDF-EOS2 file examples/hdf-eos2/geo.hdf used in the following examples is included in the source code distribution.
The first example code, examples/hdf-eos2/read.cpp, prints each value of the variable named temp in the sample file as shown in Figure 21. You can verify the dimensions and values of the temp variable using either HDF-EOS2 Dumper or HDFView.
The second example code, examples/hdf-eos2/read_subset.cpp, subsets data from the temp variable using the start = {0,0}, stride = {2,2}, and edge = {4,4} function parameters and prints the resulting 4-by-4 subsetted values, which is a collection of every second value from the variable, as shown in Figure 22.
The third example code, examples/hdf-eos2/read_var_attrs.cpp, reads the attributes of the temp variable and prints the name and value of each of its attributes as shown in Figure 23. In the sample file, the variable has only one attribute, _FillValue, with the value -999.
Please note that Vdata attributes are mapped to file attributes, while Vdata field attributes are mapped to variable attributes. In addition, HDF4 file annotations are not converted in the 1.0 release.
This conversion toolkit generally follows the HDF4-to-CF mapping document to convert HDF4 and HDF-EOS2 objects to CF. Please also read the release document, paying particular attention to the Known Problems section. Please also check the user's guide on how to use the APIs and the conversion tool. The complete reference manual for the APIs is also available.
For a comprehensive list of NASA data products that the H4CF Conversion Toolkit can support, please visit the sample conversion demo page. The demo page has more than 40 different NASA data products from 8 different data centers. It also has both NetCDF-3 and NetCDF-4 classic files that are converted from the NASA data products.