These are step-by-step instructions for downloading and processing TRMM PR (Precipitation Radar) rainfall data. They were assembled for Drew's Taiwan research, but they can easily be modified for any geographic area or for other TRMM instruments.

Note: Production of the 2A25 datasets will be discontinued as of April 30, 2003, though existing files will remain available.

Drew, you can see here the greater density of radar data compared with TMI data.

First, register as a user of the Distributed Active Archive Center. Go to NASA to order your data. Set your coordinates to 120 26 122 21. Select a date range; for Taiwan, you might get about 6 months at a time. Pick 2A25, set the maximum display to 500 items, and submit.

Confirm that you have less than 2 gigabytes and click on "Add All Found To Order". Enter your registration and password.

Click on FTP and "Create Order".

Click on "Review & Submit".

Click on "Submit".

Click on "Submit". You will see a confirmation code, which will also be mailed to you, e.g. Yzgz615ScN3. You will receive email when your order is ready, in about 4 hours. You can download a tar file or FTP individual files. If you use the tar file, note that it unpacks to a .ops directory.

Download the software. Each HDF file represents one orbital track. Uncompress a file and run vshow if you want to see how complicated the HDF files are. The following shell script will create ASCII records for your geographic area:

#!/bin/csh
pushd $1
foreach z (*Z)
  set fil=$z:r
  set dayo=`echo $z | cut -d '.' -f3-4`
  if (-e rain$dayo) then
    echo rain$dayo already exists.
  else
    echo $dayo
    uncompress $z
    ../bin/hdp dumpsds -n nearSurfRain -d $fil | tr -s ' \012' '\012\012' >! temprain
    ../bin/hdp dumpsds -n geolocation -d $fil | grep -v '^$' | paste - temprain | grep -v FloatInf | grep -v '\-9999' | ../bin/asciiclip >! rain$dayo
    compress $fil
  endif
end
popd
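To see what the dump-and-merge step produces, here is a toy demonstration with fabricated values (the real input comes from hdp; the numbers, and the scratch file names temprain and geodump, are made up for the demo). The tr -s turns the space-separated rain dump into one value per line, and paste then glues each lat/long line to its rain value:

```shell
# Fabricated stand-ins for the hdp dumps (not real TRMM values).
printf '10.0 11.0 12.0\n' | tr -s ' \012' '\012\012' > temprain   # one rain value per line
printf '21.5 120.5\n21.6 120.6\n21.7 120.7\n' > geodump           # pretend geolocation dump
paste geodump temprain    # each output line: lat  lon  rain
```

The paste output is the triplet stream that the rest of the pipeline (the bad-value greps and asciiclip) then filters.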
It first pushes to the directory where the HDF.Z files are stored. (I like to have a separate directory for each batch I download.) It sets the variable dayo, indicating the day and orbit. If the output file does not already exist, it uncompresses the HDF file and dumps the desired data. First it dumps the data stored under the keyword nearSurfRain, one value per line. (If you are working with 2A12 data, the keyword is surfaceRain.) Then it dumps the lat/long coordinates (keyword geolocation), filtering out the blank records between scan lines, and pastes them together with the rain values. Bad values are filtered out as the data proceeds down the pipe, and the program asciiclip discards triplets outside your geographic area. Edit that program to alter its hardwired coordinates.
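asciiclip itself is a small compiled program, so here is only a guess at its logic, written as an awk filter: it assumes each input line is a "lat lon rain" triplet and keeps the box implied by the coordinates ordered above (latitude 21-26, longitude 120-122). The column order, bounds, and the clip function name are all assumptions, not the real program:

```shell
# Hypothetical stand-in for asciiclip (the real one is compiled, with
# hardwired coordinates). Assumes "lat lon rain" per line; keeps only
# points inside latitude 21-26 N, longitude 120-122 E.
clip() {
  awk '$1 >= 21 && $1 <= 26 && $2 >= 120 && $2 <= 122'
}

# Three fabricated triplets; the middle one lies outside the box.
printf '21.5 120.5 10.0\n30.0 130.0 5.0\n25.9 121.9 2.2\n' | clip
```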

When you have built and assembled every ASCII file for a month (or any other time period of interest), create a command file consisting of a) the root name of the output file and b) a list of the input files. Then invoke the program sortsummer. It builds an array with an element for each geographic cell and averages every rainfall record within that cell. The array is then written out as a binary (.bil) file. A second array, indicating the number of records per cell, is written out for quality-assurance purposes. Alter the coordinates in sortsummer.f before compiling. The following block of csh code can automate this process.
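sortsummer.f does this gridding in Fortran and writes binary output; purely to illustrate the averaging idea, here is a rough awk sketch that bins "lat lon rain" triplets into cells and prints each cell's mean rain and record count as ASCII. The 0.1-degree cell size and the sample values are assumptions, and this is not a substitute for the real program:

```shell
# Illustration of the sortsummer idea only: bin "lat lon rain" records
# into 0.1-degree cells, then report average rain and record count per
# cell (the real sortsummer writes these two arrays as binary .bil files).
# Assumes positive coordinates; 0.1 degrees is an assumed cell size.
printf '21.51 120.52 10.0\n21.58 120.55 20.0\n25.07 121.33 4.0\n' |
awk '{
  cell = sprintf("%.1f %.1f", int($1 * 10) / 10, int($2 * 10) / 10)
  sum[cell] += $3   # accumulate rain per cell
  n[cell]++         # count records per cell (the QA array)
}
END {
  for (c in sum) print c, sum[c] / n[c], n[c]
}' | sort
```

The first two records fall in the same cell, so they are averaged together; the count column is the quality-assurance array mentioned above.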

set year=1998
foreach ordmonth (01xjan 02xfeb 03xmar 04xapr 05xmay 06xjun 07xjul 08xaug 09xsep 10xoct 11xnov 12xdec)
  set ord = `echo $ordmonth | sed 's/x.*//'`
  set month = `echo $ordmonth | sed 's/.*x//'`
  echo $month $ord
  cd ${year}$month
  echo $month${year} >! README
  ls rain${year}${ord}* >> README
  sortsummer < README
  cd ..
end