BEAM: A Monte Carlo Radiative Transfer Code for Studies of Saturn's Rings

BEAM README updated May 8, 2018

BEAM is a Monte Carlo (MC) based Radiative Transfer/Ray Tracing code 
written in Fortran 90. In addition, this package includes a 
few utility scripts written in Python and R that we have found useful
for diagnostic purposes. The MC code has been run and tested on MacOS and Linux 
systems (Pleiades at the NAS supercomputing facility).

The software consists of two parts: a setup code that generates a 
particle field, and the "main" code that does the Monte Carlo based 
ray tracing. The two parts are decoupled to allow the use of a particle 
field obtained some other way.

Prerequisites:
------------------------------------------------------------------------
GNU Fortran,

If running BEAM on a Mac:

If you don't have gcc installed, we recommend getting the binary from 

http://hpc.sourceforge.net/

You will also need to install Xcode from the Mac App Store.

Open MPI,

If you want to run the parallel version locally (on your own machine) 
you will most likely need to install Open MPI, which can be 
downloaded from,

https://www.open-mpi.org/software

Get the most recent version; the download includes a file 
called INSTALL that will guide you through getting it set up.

Python,

Most systems (at least MacOS and Linux) have Python pre-installed. 
NAS has a variety of Python versions available. The two scripts 
included here have been tested with Python 2.7 but will probably work 
with newer versions, such as Python 3. Some basic information about Python
is given in the document python/pythoninfo.txt included with this package. 

R language,

In this package we include an R script as a utility for visualizing 
particle fields generated by BEAM. We recommend installing RStudio. 
It has a nice MATLAB-like interface and can be downloaded for 
free from,

https://www.rstudio.com/products/rstudio/

For the program plotGranola.R you will need to add the rgl library; 
this can be done easily from the Tools menu in RStudio, or by running 
install.packages("rgl") at the R console.

Modules on NAS,

If running BEAM on Pleiades you will have to load the modules 
you need to compile and run the code. As of this writing the modules 
to load are,
     
     >module load gcc/4.9.3
     >module load mpi-sgi/mpt.2.15r20 
     >module load python

These are known to work, but newer versions "should" work fine too. 

Contents of package:
------------------------------------------------------------------------

	README.txt 
	   -this file
	in_progress.txt   
	   - Document that details features of the software still being tested
	   
	run_mpi4_S0tau05D001Phase60sha45Ntarg300.pbs 
		-this is a sample PBS script for running BEAM on NAS. It
		executes the MPI version of the Fortran and Python 
		code mentioned below to average over multiple runs.

	runTau10D01S0_Phase20_50targs.sh
	    -Sample BASH shell script for averaging over multiple runs
	    that can be run on any Unix system. It is meant to duplicate 
	    what the PBS script does, but averages over far fewer runs 
	    and requires a much longer runtime.
	    
	source
	   Fortran source code
		   make_particles_wakes.F90 
			  -original particle setup code. Contains only 
			  a moderate capability to ensure particles do not overlap.
			  Works well with low D and tau (density less 
			  than ~ 0.1, tau less than about 1.0) and narrow size 
			  distributions (r = 5 to 500 cm, for example). 
		   mpw_iquad_xy.f90
		      -particle setup code that does a better job 
		      of ensuring that no particles overlap by separating
		      the coordinate grid into quadrants. Use this 
		      code for high-density cases with a wide size distribution.
			  
		   mc.F90
			  -main ray tracing code
		   mc_mpi.F90 
			  -main code modified to use the Message Passing Interface (MPI) on NAS
		   planarflux_simpsons.f90
			  -program used to generate the Apf tables found in the supportfiles
			  directory.
	misc
	   plotGranola.R
	       -R program that plots a particle field in 3D. Used for visualizing 
	       output from make_particles_wakes.F90 or mpw_iquad_xy.f90. For 
	       different particle files you just need to change line 4.
	supportfiles
	   -Tables of planar flux for k = 1, 1.05, and 1.15 and S = 0 to 2.0 
	   used by the Monte Carlo code. The files are,
	      apf_table_k115.dat
	      apf_table_k105.dat
	      apf_table_k1.dat
	   these were generated by planarflux_simpsons.f90 assuming a 
	   Hapke-like shadowing function (f(a) = exp(-shad*sqrt(tan(a/2)))). 
	   If you want to use a simple exponential (f(a) = exp(-shad*a)) 
	   in the main code, you will need to generate new tables. A sketch 
	   of both shadowing functions appears after this listing.
       
	python
	   python scripts, 
		   measureTau.py
			  -Can be used to check that the particle file accurately 
			  represents the desired tau and D values. Because of the 
			  option in the model to have particles Gaussian distributed 
			  vertically, the D values output from this script may 
			  be less accurate than the tau values. 
		   SumFlux.py
			  -Used by the PBS and shell scripts to average flux and I/F over 
			  multiple simulations.
		   sphericalAlbedo.py
		      -script that plots the spherical albedo, based on/inspired by 
		      the paper, 
		      
		      "Rough surfaces: Is the dark stuff just shadow?" (2016)
		      by Jeffrey N. Cuzzi, Lindsey B. Chambers, Amanda R. Hendrix
		      
		      Used to understand how the shadowing and Minnaert parameters 
		      affect reflectivity; not required for running BEAM.	  
	   pythoninfo.txt
     	   -a document that describes where to obtain a Python interpreter 
	       and how to install the modules used in our utility programs. 
	       Also includes pointers to more information on Python online.       
       
	    
	   
	bin
	   -compiled binaries of source code go in this directory
		   mcx - Monte Carlo program
		   wakex - particle field setup program
	doc
	   -supporting documentation
	input
	   -sample input files for BEAM 
	    -input files for the Monte Carlo code
	       Lambert surface element scattering
	        mc-beamS0Phase20sha100tau10D01LitGeom.in
			mc-beamS0Phase100sha100tau10D01LitGeom.in
			mc-beamS0Phase150sha100tau10D01LitGeom.in
		   Callisto phase function scattering	
			mc-beamS0Phase20sha100tau10D01Callisto.in			
	    -input files for setup code    
			part-no_wakesD01tau10_nbins40.in
			part-wakesDw01taug01tauw10_phi0_nbins40.in
			part-wakesDw01taug01tauw10_phi30_nbins40.in
	    -template files used by the Python script SumFlux.py    
			  if_sum.out
			  flux_sum.out    
	output
	   -this directory contains sample output. Details follow.
	      Output from the Monte Carlo code,
	        All of these use obslat = 41.7 degrees, sunlat = 26.7 degrees,
	        and distance from Saturn = 88000 km
            
			  Lambert_D01tau10Phase20_mcx - output files from run with 
				 Lambert surface element scattering with phase angle of
				 20 degrees
			 
			  Lambert_D01tau10Phase100_mcx - output files from run with 
				 Lambert surface element scattering with phase angle of
				 100 degrees
		  
			  Callisto_D01tau10Phase20_mcx - output files from run with 
				 Callisto phase function scattering with phase angle of
				 20 degrees

			  Lambert_D01tau10Phase20_mc_mpi - output files from run with 
				 Lambert surface element scattering with phase angle of
				 20 degrees using the MPI version of the code.
	   
			   Ntarg300D01tau10_S0_phase20sha100 - results generated by 
				 the PBS script run_mpi4_S0tau10D01Phase20sha45Ntarg300.pbs
				 on Pleiades.
		   
			   BEAM_tau10D01S0Phase20SHA100Rev3_50targs - results generated 
				 by the shell script runTau10D01S0_Phase20_50targs.sh 
     	     
           Output from mpw_iquad_xy.f90 or make_particles_wakes.F90
           
              nowakes_D01_tau10_nbins40.partdat - particle field 
                 with D=0.1, tau=1.0 (nowakes case), rmin = 5 cm, and 
                 rmax = 500 cm.
                 
              wakes_Dw01taug01tauw10_phi0_nbins40.partdat - particle field
                 with wake/gap regions. In the gap tau is 0.1, in the wake 
                 D=0.1, tau = 1.0, rmin = 5 cm, and 
                 rmax = 500 cm. The wake orientation is at 0.0 degrees.

              wakes_Dw01taug01tauw10_phi30_nbins40.partdat - particle field
                 with wake/gap regions. In the gap tau is 0.1, in the wake 
                 D=0.1, tau = 1.0, rmin = 5 cm, and rmax = 500 cm. 
                 The wake orientation is at 30.0 degrees.
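
	A quick Python sketch (illustrative only, not part of the package) of 
	the two shadowing functions mentioned under supportfiles above:

	   import numpy as np

	   def f_hapke_like(a, shad):
	       # Hapke-like shadowing used to build the apf tables:
	       # f(a) = exp(-shad*sqrt(tan(a/2))), a in radians
	       return np.exp(-shad * np.sqrt(np.tan(a / 2.0)))

	   def f_exponential(a, shad):
	       # simple exponential alternative: f(a) = exp(-shad*a)
	       return np.exp(-shad * a)

	   # compare the two over 0..pi/2 for a sample shad value
	   a = np.linspace(0.0, np.pi / 2.0, 10)
	   print(f_hapke_like(a, 1.0))
	   print(f_exponential(a, 1.0))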
        				
Compiling code:
------------------------------------------------------------------------
    cd into the source directory
    
    the particle code should be compiled as,
    
      >gfortran -O3 make_particles_wakes.F90 -o ../bin/wakex
    
    or the "quad" set up code as,
      >gfortran -O3 -mcmodel=medium mpw_iquad_xy.f90 -o ../bin/wakex
      
      *theoretically, -mcmodel=large can be used, but this is not fully 
      implemented in gfortran and will not work in a MacOS environment.
    
    the Monte Carlo code (serial version) should be compiled as,
      > gfortran -O3 mc.F90 -o ../bin/mcx
      
    the MPI (parallel) version can be compiled on MacOS as,
    
      > mpifort -O3 mc_mpi.F90 -o ../bin/mc_mpix
    
    On NAS you will need to load a couple of modules first. As of 
    this writing they are,
    
      >module load gcc/4.9.3
      >module load mpi-sgi/mpt.2.15r20

    then compile the MPI code as,
      >gfortran -O3 -I/nasa/sgi/mpt/2.15r20/include -o ../bin/mc_mpix mc_mpi.F90 -lmpi
      
      if you use a different version of the sgi/mpt library you will 
      have to change the compile statement accordingly.
      
Running the model
------------------------------------------------------------------------
    First run the setup code. The steps below assume you have cd'd 
    into the output directory,
             
      type,
      
      >../bin/wakex < ../input/part-no_wakesD01tau10_nbins40.in > output_particles
      
      the file output_particles will contain messages normally printed to the screen. 
      I use the included file ../input/part-no_wakesD01tau10_nbins40.in as an example. 
      This generates a non-wake particle field with the smallest particle at r = 5 cm 
      and the largest at r = 500 cm. The particle sizes start with the largest and are 
      put in geometrically spaced bins (this example has 40 bins). The physical 
      parameters are tau = 1.0 and D = 0.1. After running you should see a file called 
      nowakes_D01_tau10_nbins40.partdat. This will be used by mcx.
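
      As a rough illustration (not code from the package; constant-ratio 
      spacing is an assumption about the setup code), geometrically spaced 
      bins from rmax down to rmin look like,

         import numpy as np

         rmin, rmax, nbins = 5.0, 500.0, 40   # cm, as in the sample input

         # bin edges spaced by a constant ratio, largest first
         edges = rmax * (rmin / rmax) ** (np.arange(nbins + 1) / nbins)
         print(edges[:3], edges[-3:])   # starts near 500 cm, ends near 5 cm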
      
      **** Actual tau and D ****************************************************
      A bit of caution here: the actual particle field generated will generally 
      have a tau that differs from the input tau. How much depends on the number 
      of size bins used. The same is true of D. I find with 40 bins tau is off 
      by 2% and D is off by 7-8%. If you want to validate the results of BEAM 
      against classical results or compare with other models, you will want to 
      measure the tau and D in the particle file. You may find the script 
      measureTau.py useful for this. However, be aware that because, in rare 
      cases, there are particles whose z coordinates are outliers, the 
      calculated D values can be misleading. 
      **************************************************************************
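
      If you want a rough independent check in the spirit of measureTau.py 
      (the actual script may work differently), tau is the total particle 
      cross section over the patch area, and D is the volume filling factor. 
      A minimal sketch, assuming a hypothetical particle file with columns 
      x y z r and estimating the patch size from the data:

         import numpy as np

         # hypothetical layout: whitespace-separated columns x y z r;
         # check your actual .partdat format before using this
         x, y, z, r = np.loadtxt("nowakes_D01_tau10_nbins40.partdat", unpack=True)

         Lx = x.max() - x.min()   # patch dimensions estimated from the data
         Ly = y.max() - y.min()
         tau = np.sum(np.pi * r**2) / (Lx * Ly)

         # D needs a layer thickness; a few outlier z values inflate the
         # naive z-extent, which is why the computed D can be misleading
         H = z.max() - z.min()
         D = np.sum(4.0 / 3.0 * np.pi * r**3) / (Lx * Ly * H)
         print("tau ~", tau, " D ~", D)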
      
      Running mcx (the serial version),
      
      type > ../bin/mcx < ../input/mc-beamS0Phase20sha100tau10D01LitGeom.in > output
      
      The file output contains messages printed while the program runs.
      
      In this example the sun latitude is 26.7 degrees, the observer latitude is 
      41.7 degrees, the phase angle is 20 degrees, and the solar hour angle 
      is 100 degrees. The distance to the ring patch is 88000 km, and 
      10000 solar and satshine photons are used. We use the Minnaert 
      scattering law as well. There is also a particle roughness parameter, 
      but it is set to zero here.
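
      For context, the Minnaert law scales the reflected intensity roughly as 
      mu0**k * mu**(k-1), reducing to Lambert scattering at k = 1 (the k values 
      of the Apf tables). A schematic Python sketch; BEAM's internal 
      normalization may include extra geometric factors:

         import numpy as np

         def minnaert_if(mu0, mu, albedo, k):
             # Minnaert surface-element law: I/F ~ albedo * mu0**k * mu**(k-1);
             # k = 1 recovers the Lambert case
             return albedo * mu0**k * mu**(k - 1.0)

         mu0 = np.cos(np.radians(90.0 - 26.7))   # sun latitude 26.7 degrees
         mu  = np.cos(np.radians(90.0 - 41.7))   # observer latitude 41.7 degrees
         print(minnaert_if(mu0, mu, 0.5, 1.05))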
      
      One side note: for the viewing geometry to make sense, the phase angle 
      cannot be larger than 180 - (obslat + sunlat) and cannot be smaller 
      than abs(obslat - sunlat). If you try using the input file 
      mc-beamS0Phase150sha100tau10D01LitGeom.in you will get an error message 
      and the program will halt execution for this reason. 
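
      A quick way to sanity-check a geometry before a run (a standalone 
      sketch of the constraint just described, not code from the package):

         def phase_ok(obslat, sunlat, phase):
             # valid geometries per the rule above (all angles in degrees):
             # abs(obslat - sunlat) <= phase <= 180 - (obslat + sunlat)
             return abs(obslat - sunlat) <= phase <= 180.0 - (obslat + sunlat)

         print(phase_ok(41.7, 26.7, 20.0))    # True
         print(phase_ok(41.7, 26.7, 150.0))   # False: mcx would halt with an error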
      
      Four files are generated,
      
      flux.out - a table of flux due to orders of scattering from solar photons 
                 and satshine 
                 
      if.out - tabulation of I/F as a function of single scattering albedo
      nphotpass.out - a file that is useful for understanding output in 
                      some cases. It reports seven numbers;
          * Total number of photons that pass through without hitting any particles 
          * Satshine photons that did not interact
          * Solar photons that did not interact
          * Satshine photons used to compute flux
          * Solar photons used to compute flux 
          * Number of scattering events that result in flux not seen by the observer
            due to the scattering particle being in the way
          * Number of scattering events that result in flux not seen by the observer
            because of another particle being in the way
          
      output - contains messages related to the progress of the program run    
      
      Running mc_mpix (the parallel version),
      
      Most higher-end Macs have 2 to 8 cores, so the parallel version can 
      be run locally by first linking the input file to a hardcoded filename,
         
         >ln -s ../input/mc-beamS0Phase20sha100tau10D01LitGeom.in input_file.in  
         
      
      and then (if you have four cores) execute the code as,
      
         >mpiexec -np 4 ../bin/mc_mpix
         
      This generates the same files as the serial version.   

Running the model on Pleiades
------------------------------------------------------------------------
      As alluded to above, to speed up execution time we converted the serial 
      version to an MPI version that splits up a large number of 
      photons among multiple processors. We also found that averaging runs 
      over unique particle fields (or targets) with the same physical 
      parameters eliminates bias. So we developed a PBS script that 
      loops through 100 (for high density cases) or 300 (for low density) 
      targets. The PBS script assumes it is run in a CSHELL environment.
      The beginning of the script has some shell variable declarations used 
      to form input filenames and output directories. 
      
      So after compiling both codes on NAS, one can execute the sample PBS 
      script in the root directory as,
      
      >qsub -q normal run_mpi4_S0tau10D01Phase20sha45Ntarg300.pbs
      
      this puts the "job" into the normal queue, which is for jobs that take 
      less than 8 hours to run. 
      
      As it stands, output files are written to /nobackupp8/dmolson/ 
      on lines 37 and 38; you will need to modify those lines to work in your 
      account.
      
      When executed, the script will put results in a folder such as the one 
      you find in /output/Ntarg300D01tau10_S0_phase20sha100/, 
      with separate results for each target as well as averaged results in flux_sum.out 
      and if_sum.out. 
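      Conceptually, the averaging step is just an element-wise mean over the 
      per-target tables. A minimal sketch (the directory names and plain-numeric 
      file layout here are assumptions, not SumFlux.py's actual interface):

         import glob
         import numpy as np

         # assumed layout: one flux.out per target directory; adjust the
         # glob pattern and any header handling to match your output
         tables = [np.loadtxt(f) for f in sorted(glob.glob("target_*/flux.out"))]
         avg = np.mean(tables, axis=0)   # element-wise mean over targets
         np.savetxt("flux_sum.out", avg)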
      
Shell script      
------------------------------------------------------------------------
      We also provide a shell script that does the same thing as the PBS script
      but uses the serial version of the code rather than the MPI version. 
      While in the BASH shell (the script is written using BASH commands) 
      and in the root directory, type,
      
      > ./runTau10D01S0_Phase20_50targs.sh
      
      While running, the script echoes its progress as it goes through 
      each run. As with the PBS script, this script first creates a 
      parent folder for the results with a subfolder for the results 
      from individual runs. An example of output can be found in 
      /output/BEAM_tau10D01S0Phase20SHA100Rev3_50targs.