TGW WRF Workflow

This workflow describes the steps undertaken to prepare input data and run the WRF model to generate the historical scenario of the Thermodynamic Global Warming (TGW) simulations dataset described in Jones et al. (2023).

  1. Download and compile WRFv4.0.1 (WPS will be built against this version for metgrid):
    1. cd $SCRATCH/WRF_CLIMATE
    2. wget https://github.com/wrf-model/WRF/archive/refs/tags/v4.0.1.zip
    3. unzip v4.0.1.zip
    4. cd WRF-4.0.1
    5. Load important modules:
      1. You may need to unload some other modules first
      2. module load craype-haswell
      3. module load cray-netcdf
      4. module load impi
    6. ./configure
      1. On NERSC, you first need to set the NETCDF environment variable: export NETCDF=${NETCDF_DIR}
      2. On NERSC, choose architecture 66 for Intel HSW (dmpar)
      3. Choose 1=basic
    7. ./compile wrf
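    For reference, the whole WRFv4.0.1 build can be run as one block. This is a convenience sketch of the steps above; module names follow the NERSC Cori steps and should be adjusted for your system:
      cd $SCRATCH/WRF_CLIMATE
      wget https://github.com/wrf-model/WRF/archive/refs/tags/v4.0.1.zip
      unzip v4.0.1.zip && cd WRF-4.0.1
      module load craype-haswell cray-netcdf impi
      export NETCDF=${NETCDF_DIR}
      ./configure        # choose 66 = Intel HSW (dmpar), then 1=basic
      ./compile wrf 2>&1 | tee compile.log   # keep a build log for troubleshooting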
  2. Download and compile WRFv4.2.1:
    1. cd $SCRATCH/WRF_CLIMATE
    2. wget https://github.com/wrf-model/WRF/archive/refs/tags/v4.2.1.zip
    3. unzip v4.2.1.zip
    4. cd WRF-4.2.1
    5. Load important modules:
      1. You may need to unload some other modules first
      2. module load cray-netcdf
      3. export NETCDF=${NETCDF_DIR}
      4. module load cray-parallel-netcdf
      5. export PNETCDF=${PARALLEL_NETCDF_DIR}
      6. module load cray-hdf5
      7. export HDF5=${HDF5_DIR}
      8. module load craype-mic-knl
      9. module load impi
      10. module load png
      11. module load jasper
      12. export JASPERLIB=/global/common/cori/software/jasper/1.900.1/hsw/intel/lib
      13. export JASPERINC=/global/common/cori/software/jasper/1.900.1/hsw/intel/include
    6. Update the max history fields value:
      1. Open the file frame/module_domain.F with an editor and search for max_hst_mods = 200
      2. Update the value from 200 to 2000
      3. Save the file
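      If you prefer to script this edit, the same change can be made with sed (the pattern matches the text the step above says to search for; verify the result with grep):
        sed -i 's/max_hst_mods = 200/max_hst_mods = 2000/' frame/module_domain.F
        grep -n 'max_hst_mods' frame/module_domain.F   # confirm the value is now 2000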
    7. Fix adaptive timestep bug:
      1. Update the file dyn_em/adapt_timestep_em.F following the diff here.
    8. ./configure
      1. On NERSC, choose architecture 70 for INTEL KNL MIC (dmpar)
      2. Choose 1=basic
    9. Within the file configure.wrf, update the setting for BUILD_RRTMG_FAST:
      1. -DBUILD_RRTMG_FAST=1
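      This edit can also be scripted. The sed below assumes the stock configure.wrf carries the flag set to 0; if the flag is absent instead, add -DBUILD_RRTMG_FAST=1 to the ARCH_LOCAL line by hand:
        sed -i 's/-DBUILD_RRTMG_FAST=0/-DBUILD_RRTMG_FAST=1/' configure.wrf
        grep 'BUILD_RRTMG_FAST' configure.wrf   # verify the flag is present and set to 1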
    10. Compile WRF with:
      1. ./compile em_real
    11. Copy the file namelist.input to the WRF directory:
      1. cp namelist.input $SCRATCH/WRF_CLIMATE/WRF-4.2.1/test/em_real/namelist.input
  3. Download and compile the WRF Pre-processing System (WPS) Version 4.0.1:
    1. cd $SCRATCH/WRF_CLIMATE
    2. wget https://github.com/wrf-model/WPS/archive/refs/tags/v4.0.1.zip
    3. unzip v4.0.1.zip
    4. cd WPS-4.0.1
    5. Load important modules:
      1. You may need to unload some other modules first
      2. module load craype-haswell
      3. module unload impi
    6. ./configure
      1. On NERSC, you first need to set the NETCDF environment variable: export NETCDF=${NETCDF_DIR}
      2. On NERSC, choose architecture 39: "Cray XC CLE/Linux x86_64, Intel compiler"
    7. ./compile
  4. Download the static WRF geographical data from https://www2.mmm.ucar.edu/wrf/users/download/get_sources_wps_geog.html. All the relevant links are:
    1. https://www2.mmm.ucar.edu/wrf/src/wps_files/geog_high_res_mandatory.tar.gz
    2. https://www2.mmm.ucar.edu/wrf/src/wps_files/geog_thompson28_chem.tar.gz
    3. https://www2.mmm.ucar.edu/wrf/src/wps_files/geog_noahmp.tar.gz
    4. https://www2.mmm.ucar.edu/wrf/src/wps_files/geog_px.tar.gz
    5. https://www2.mmm.ucar.edu/wrf/src/wps_files/geog_urban.tar.gz
    6. https://www2.mmm.ucar.edu/wrf/src/wps_files/geog_ssib.tar.gz
    7. https://www2.mmm.ucar.edu/wrf/src/wps_files/lake_depth.tar.bz2
    8. https://www2.mmm.ucar.edu/wrf/src/wps_files/geog_older_than_2000.tar.gz
    9. https://www2.mmm.ucar.edu/wrf/src/wps_files/geog_alt_lsm.tar.gz
    10. https://www2.mmm.ucar.edu/wrf/src/wps_files/modis_landuse_20class_15s_with_lakes.tar.gz
    11. https://www2.mmm.ucar.edu/wrf/src/wps_files/nlcd2006_ll_9s.tar.bz2
    12. Unpack all these files into the same directory and note the location; for this walkthrough let's say we put them at $SCRATCH/WRF_CLIMATE/GEOGRAPHY
  5. Set the following options in the $SCRATCH/WRF_CLIMATE/WPS-4.0.1/namelist.wps file (a consolidated sketch follows this list):
    1. max_dom = 1,
    2. interval_seconds = 10800,
    3. e_we = 425,
    4. e_sn = 300,
    5. geog_data_res = 'nlcd2011_9s+modis_fpar+modis_lai',
    6. dx = 12000,
    7. dy = 12000,
    8. map_proj = 'lambert',
    9. ref_lat = 40.0,
    10. ref_lon = -97.0,
    11. truelat1 = 30.0,
    12. truelat2 = 45.0,
    13. stand_lon = -97.0,
    14. geog_data_path = '$SCRATCH/WRF_CLIMATE/GEOGRAPHY', (replace with the full path to the directory from step 4; the namelist will not expand $SCRATCH)
    15. fg_name = 'FILE',
    16. constants_name = 'FIX:1979-01-01_00',
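    Taken together, these options live in the &share, &geogrid, and &metgrid sections of namelist.wps. A sketch of the relevant pieces is below; other entries keep their stock defaults, and prefix, start_date, and end_date are set in later steps. Note that the Fortran namelist reader does not expand $SCRATCH, so geog_data_path must be a literal full path:
      &share
       max_dom = 1,
       interval_seconds = 10800,
      /
      &geogrid
       e_we = 425,
       e_sn = 300,
       geog_data_res = 'nlcd2011_9s+modis_fpar+modis_lai',
       dx = 12000,
       dy = 12000,
       map_proj = 'lambert',
       ref_lat = 40.0,
       ref_lon = -97.0,
       truelat1 = 30.0,
       truelat2 = 45.0,
       stand_lon = -97.0,
       geog_data_path = '/full/path/to/WRF_CLIMATE/GEOGRAPHY',
      /
      &metgrid
       fg_name = 'FILE',
       constants_name = 'FIX:1979-01-01_00',
      /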
  6. Generate the grid file:
    1. Submit the job with:
      1. sbatch geogrid_cori.sl
    2. When the job completes, confirm that the file geo_em.d01.nc was created in the WPS-4.0.1 directory and is about 25 MB
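    The script geogrid_cori.sl is provided with this workflow. If you need to reconstruct it, a minimal Cori-style Slurm script would look roughly like the sketch below; the queue, node count, task count, and wall time are illustrative assumptions, not values from the original script, and srun assumes a dmpar (MPI) WPS build (for a serial build, run ./geogrid.exe directly):
      #!/bin/bash
      #SBATCH --qos=regular
      #SBATCH --constraint=haswell
      #SBATCH --nodes=1
      #SBATCH --time=00:30:00
      module load craype-haswell cray-netcdf
      cd $SCRATCH/WRF_CLIMATE/WPS-4.0.1
      srun -n 32 ./geogrid.exe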
  7. Download and process the time invariant ERA5 grib file:
    1. Visit https://rda.ucar.edu/datasets/ds633.0/index.html#sfol-wl-/data/ds633.0?g=20
    2. You may need to create an account with NCAR RDA
    3. Download the file: e5.oper.invariant.128_172_lsm.ll025sc.1979010100_1979010100.grb
    4. Place this file in the directory: $SCRATCH/WRF_CLIMATE/invariant/
    5. cd $SCRATCH/WRF_CLIMATE/WPS-4.0.1
    6. Link the invariant files:
      1. ./link_grib.csh $SCRATCH/WRF_CLIMATE/invariant/*grb
    7. Link the ERA VTable:
      1. ln -s ./ungrib/Variable_Tables/Vtable.ERA-interim.pl Vtable
    8. In namelist.wps:
      1. update prefix = 'FIX',
      2. update start_date = '1979-01-01_00:00:00',
      3. update end_date = '1979-01-01_00:00:00',
    9. Submit the batch job to ungrib with sbatch ungrib_cori.invariant.sl
    10. When the job completes, you should see a file FIX:1979-01-01_00
    11. Delete the supporting GRIBFILE links:
      1. rm GRIBFILE.*
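    For convenience, the invariant ungrib pass condensed into one block (a sketch of the steps above; the namelist edit is left as a comment since the prefix and date lines are easiest to change by hand):
      cd $SCRATCH/WRF_CLIMATE/WPS-4.0.1
      ./link_grib.csh $SCRATCH/WRF_CLIMATE/invariant/*grb
      ln -sf ./ungrib/Variable_Tables/Vtable.ERA-interim.pl Vtable
      # edit namelist.wps: prefix = 'FIX', start_date = end_date = '1979-01-01_00:00:00'
      sbatch ungrib_cori.invariant.sl
      # after the job completes: expect FIX:1979-01-01_00, then remove the link files
      ls FIX:1979-01-01_00 && rm GRIBFILE.*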
  8. Download the relevant ERA5 grib files for the year you want to simulate:
    1. On NERSC, run module load globus-tools to access the Globus utilities
    2. Run the script createFileListForYear.sh <year> to create a list of ERA5 files needed for that year, which will create $SCRATCH/WRF_CLIMATE/<year>/transfer_jan-jun.txt and $SCRATCH/WRF_CLIMATE/<year>/transfer_jul-dec.txt
    3. Download data for Jan-Jun of that year:
      1. transfer_files.py -s 1e128d3c-852d-11e8-9546-0a6d4e044368 -t dtn -i $SCRATCH/WRF_CLIMATE/<year>/transfer_jan-jun.txt -d $SCRATCH/WRF_CLIMATE/<year>/jan-jun
      2. This will create a Globus task, which will notify you by email when it completes
    4. Download data for Jul-Dec of that year:
      1. transfer_files.py -s 1e128d3c-852d-11e8-9546-0a6d4e044368 -t dtn -i $SCRATCH/WRF_CLIMATE/<year>/transfer_jul-dec.txt -d $SCRATCH/WRF_CLIMATE/<year>/jul-dec
      2. This will create a Globus task, which will notify you by email when it completes
  9. Copy the entire WPS-4.0.1 directory to 6-month-specific directories:
    1. cp -r $SCRATCH/WRF_CLIMATE/WPS-4.0.1 $SCRATCH/WRF_CLIMATE/WPS_<year>_jan-jun
    2. cp -r $SCRATCH/WRF_CLIMATE/WPS-4.0.1 $SCRATCH/WRF_CLIMATE/WPS_<year>_jul-dec
  10. Perform the WPS preprocessing:
    1. cd $SCRATCH/WRF_CLIMATE/WPS_<year>_jan-jun
    2. Link the ERA5 forcing data for this 6 month period:
      1. ./link_grib.csh $SCRATCH/WRF_CLIMATE/<year>/jan-jun/*grb
    3. Link the ERA VTable:
      1. ln -sf ./ungrib/Variable_Tables/Vtable.ERA-interim.pl Vtable
    4. Update the namelist.wps:
      1. set prefix = 'FILE',
      2. set start_date = '<year>-01-01_00:00:00',
      3. set end_date = '<year>-06-30_21:00:00',
    5. Repeat the above steps for jul-dec, making sure to update the start and end dates (see the scripted sketch after this step)
    6. Submit the job with:
      1. sbatch ungrib_cori.sl_year <year>
    7. When the job completes, make sure the WPS_<year>_<months> directories have the three-hourly files that look like FILE:<date>_<time>
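    A scripted sketch of the linking and namelist updates for both halves. YEAR is a placeholder you set; the jul-dec start and end dates are an assumption inferred from the jan-jun pattern and the 3-hourly data interval:
      YEAR=1979   # hypothetical example year
      for HALF in jan-jun jul-dec; do
        cd $SCRATCH/WRF_CLIMATE/WPS_${YEAR}_${HALF}
        ./link_grib.csh $SCRATCH/WRF_CLIMATE/${YEAR}/${HALF}/*grb
        ln -sf ./ungrib/Variable_Tables/Vtable.ERA-interim.pl Vtable
        # edit namelist.wps: prefix = 'FILE', plus the start/end dates for this half
        # (jan-jun: 01-01_00:00:00 to 06-30_21:00:00;
        #  jul-dec, assumed: 07-01_00:00:00 to 12-31_21:00:00)
      done
      sbatch ungrib_cori.sl_year ${YEAR}   # per step 6; the script takes the year as an argument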
  11. Generate the metgrid files:
    1. Submit the job with:
      1. sbatch metgrid_cori.sl_year <year>
    2. When the job completes, make sure the WPS_<year>_<months> directories have the three-hourly files that look like met_em.d01.<date>_<time>.nc
  12. Create the WRF inputs:
    1. Copy the WRF folder to a year and month specific folder:
      1. cp -L -r $SCRATCH/WRF_CLIMATE/WRF-4.2.1/test/em_real $SCRATCH/WRF_CLIMATE/WRF_input_<year>_<months>
    2. Copy the input and batch script to the new folder:
      1. cp namelist.create_wrf_input $SCRATCH/WRF_CLIMATE/WRF_input_<year>_<months>/namelist.input
      2. cp run_real.sl $SCRATCH/WRF_CLIMATE/WRF_input_<year>_<months>/
    3. cd $SCRATCH/WRF_CLIMATE/WRF_input_<year>_<months>
    4. Update the namelist.input start and end times:
      1. Note that the end time extends one extra week beyond the 6-month period (see the namelist sketch after this step)
      2. start_year = <year>
      3. start_month = 01 or start_month = 07
      4. end_year = <year> or end_year = <year + 1>
      5. end_month = 07 or end_month = 01
      6. end_day = 06 or end_day = 07
      7. end_hour = 21 or end_hour = 00
    5. Link the .nc files generated in the previous step:
      1. ln -s $SCRATCH/WRF_CLIMATE/WPS_<year>_<months>/met_em*.nc ./
      2. Also link the one week of overlapping files from the subsequent time period
    6. Load important modules:
      1. You may need to unload some other modules first
      2. module load craype-mic-knl
      3. module load cray-netcdf
      4. module load impi
    7. Submit the job with:
      1. sbatch run_real.sl
    8. When the job completes, check the log files for errors and make sure the files wrfbdy_d01, wrffdda_d01, wrfinput_d01, and wrflowinp_d01 exist in the folder
    9. Repeat the above steps for the other set of <months>.
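    For the jan-jun half, the &time_control entries from step 4 come together as in the sketch below. The start day and hour are assumptions based on the data beginning at <year>-01-01_00; for jul-dec, use the alternate values listed above:
      &time_control
       start_year  = 1979,   ! example year; substitute your <year>
       start_month = 01,
       start_day   = 01,     ! assumed
       start_hour  = 00,     ! assumed
       end_year    = 1979,
       end_month   = 07,
       end_day     = 06,
       end_hour    = 21,
       ...                   ! remaining entries as provided in namelist.create_wrf_input
      /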
  13. Download the GHG concentration data from https://esgf-node.llnl.gov/search/input4mips/:
    1. On NERSC, run module load globus-tools to access the Globus utilities
    2. Run the script downloadGHGFiles.sh to create a Globus task for downloading the GHG concentration files to $SCRATCH/WRF_CLIMATE/GHG
    3. Convert these files to a format usable by WRF:
      1. module load matlab/R2020b
      2. matlab -batch GHG_for_WRF_historical
    4. This should generate a data table at $SCRATCH/WRF_CLIMATE/GHG/GHG.txt
    5. Copy this file to the WRFv4.2.1 code:
      1. cp $SCRATCH/WRF_CLIMATE/GHG/GHG.txt $SCRATCH/WRF_CLIMATE/WRF-4.2.1/test/em_real/CAMtr_volume_mixing_ratio
  14. Link the initial year and month data to the WRFv4.2.1 code:
    1. ln -sf $SCRATCH/WRF_CLIMATE/WRF_input_<year>_<months>/wrfbdy_d01 $SCRATCH/WRF_CLIMATE/WRF-4.2.1/test/em_real/
    2. ln -sf $SCRATCH/WRF_CLIMATE/WRF_input_<year>_<months>/wrffdda_d01 $SCRATCH/WRF_CLIMATE/WRF-4.2.1/test/em_real/
    3. ln -sf $SCRATCH/WRF_CLIMATE/WRF_input_<year>_<months>/wrflowinp_d01 $SCRATCH/WRF_CLIMATE/WRF-4.2.1/test/em_real/
    4. ln -sf $SCRATCH/WRF_CLIMATE/WRF_input_<year>_<months>/wrfinput_d01 $SCRATCH/WRF_CLIMATE/WRF-4.2.1/test/em_real/
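    Equivalently, the four links above can be made in one loop:
      for f in wrfbdy_d01 wrffdda_d01 wrfinput_d01 wrflowinp_d01; do
        ln -sf $SCRATCH/WRF_CLIMATE/WRF_input_<year>_<months>/$f \
               $SCRATCH/WRF_CLIMATE/WRF-4.2.1/test/em_real/
      done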
  15. Run WRF!
    1. Update $SCRATCH/WRF_CLIMATE/WRF-4.2.1/test/em_real/namelist.input with the start and end year, month, and day matching your files from the previous steps.
    2. Copy the output fields file to the WRFv4.2.1 code:
      1. cp myoutfields.txt $SCRATCH/WRF_CLIMATE/WRF-4.2.1/test/em_real/
    3. Copy the launch script to the WRFv4.2.1 code:
      1. cp run_wrf.sl $SCRATCH/WRF_CLIMATE/WRF-4.2.1/test/em_real/
    4. Submit the job with:
      1. sbatch run_wrf.sl
    5. The output data will populate in this directory, with a few variables at 1-hour resolution and many variables at 3-hour resolution.
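    WRF writes per-MPI-rank logs named rsl.out.* and rsl.error.*. A quick way to monitor progress and confirm a clean finish (the SUCCESS message is standard WRF behavior):
      cd $SCRATCH/WRF_CLIMATE/WRF-4.2.1/test/em_real
      tail -f rsl.out.0000                       # live progress from rank 0
      grep 'SUCCESS COMPLETE WRF' rsl.out.0000   # printed when the run finishes cleanly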
  16. For the full experiment, WRF must first run through a year of warm-up (1979). After each 6 months of input data has been run through WRF, the next period's data must be linked and WRF restarted from the latest restart file. See the file s_restartV6 for techniques for automating the restart process and adapting the timestep when necessary.