Polygon inflow on parallel domain
schoeller opened this issue · 2 comments
Dear all,
Thanks for this software. I am running into trouble while trying to parallelize a script: it seems that the inflow point lies outside of the domain boundary, which causes an error.
First, the domain is created and distributed as follows:
if myid == 0:
    print "# Create domain:"
    print "# mesh_filename = " + working_dir + 'mesh_01.msh'
    domain = anuga.create_domain_from_regions(bounding_polygon=bounding_polygon_01,
                                              boundary_tags=boundary_tags_01,
                                              mesh_filename=working_dir + 'mesh_01.msh',
                                              maximum_triangle_area=100000,
                                              verbose=True)
    domain.set_name(sim_id)
    domain.set_datadir(outputs_dir)

    poly_fun_pairs = [
        [
            'Extent',
            inputs_dir + src
        ]
    ]

    print "# create topography_function"
    print "input raster = " + inputs_dir + src
    topography_function = qs.composite_quantity_setting_function(
        poly_fun_pairs,
        domain,
        nan_treatment='exception',
    )
    print topography_function

    print "# set_quantity elevation"
    domain.set_quantity('elevation', topography_function)  # Use function for elevation
    domain.set_quantity('friction', 0.03)                  # Constant friction
    domain.set_quantity('stage', 1)                        # Constant initial stage
    print "# all quantities set"
else:
    domain = None

#--------------------------------------------------------------------------
# Distribute sequential domain on processor 0 to other processors
#--------------------------------------------------------------------------
if myid == 0 and verbose: print 'DISTRIBUTING DOMAIN'
domain = distribute(domain, verbose=verbose)

domain.set_store_vertices_uniquely(False)
After setting up the boundary conditions, the inflow is assigned as follows:
# Read the inflow polygon coordinates from the GeoJSON file
ib = fiona.open(inputs_dir + 'internal_polygon_01.geojson', 'r')
region01 = []
for item in ib[0]['geometry']['coordinates'][0]:
    region01.append(item)
ib.close()
print region01

fixed_inflow = Inflow(
    domain=domain,
    rate=20000,
    polygon=region01
)
domain.forcing_terms.append(fixed_inflow)
When calling
mpirun -np 1 python runDamBreak.py
it works flawlessly. Running on more cores throws the following error:
Traceback (most recent call last):
  File "runDamBreak.py", line 252, in <module>
    run_dambreak('2')
  File "runDamBreak.py", line 169, in run_dambreak
    polygon=region01
  File "/usr/local/lib/python2.7/dist-packages/anuga/shallow_water/forcing.py", line 667, in __init__
    verbose=verbose)
  File "/usr/local/lib/python2.7/dist-packages/anuga/shallow_water/forcing.py", line 337, in __init__
    assert is_inside_polygon(point, bounding_polygon), msg
AssertionError: Point (433218.661520856, 1268631.0420187942) in polygon for forcing term did not fall within the domain boundary.
--------------------------------------------------------------------------
Primary job terminated normally, but 1 process returned
a non-zero exit code. Per user-direction, the job has been aborted.
--------------------------------------------------------------------------
--------------------------------------------------------------------------
mpirun detected that one or more processes exited with non-zero status, thus causing
the job to be terminated. The first process to do so was:
Process name: [[47221,1],1]
Exit code: 1
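From the traceback, the assertion is raised in forcing.py when the points of the inflow polygon are tested against domain.get_boundary_polygon(). After distribute() each processor only holds a sub-domain, so on some processors the polygon can legitimately fall outside that boundary. A minimal sketch to confirm this per processor might look like the following (not part of the original script; it assumes is_inside_polygon can be imported from anuga.geometry.polygon, the same check used in forcing.py):

# Sketch: report how many inflow polygon points lie outside this processor's sub-domain
from anuga.geometry.polygon import is_inside_polygon

sub_boundary = domain.get_boundary_polygon()   # after distribute() this is only the sub-domain
outside = [p for p in region01 if not is_inside_polygon(p, sub_boundary)]
print 'P%d: %d of %d inflow polygon points fall outside the sub-domain' \
    % (myid, len(outside), len(region01))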
Any hints are very welcome.
Best regards
Sebastian
@stoiver 👍
Preliminary results are now:
mpirun -np 1 python runDamBreak.py
Processor 0
That took 8.58 seconds
Communication time 0.00 seconds
Reduction Communication time 0.00 seconds
Broadcast time 0.00 seconds
mpirun -np 4 python runDamBreak.py
Processor 0
That took 3.90 seconds
Communication time 0.20 seconds
Reduction Communication time 0.47 seconds
Broadcast time 0.00 seconds
Processor 1
That took 3.90 seconds
Communication time 0.21 seconds
Reduction Communication time 0.32 seconds
Broadcast time 0.00 seconds
Processor 2
That took 3.90 seconds
Communication time 0.26 seconds
Reduction Communication time 0.31 seconds
Broadcast time 0.00 seconds
Processor 3
That took 3.90 seconds
Communication time 0.20 seconds
Reduction Communication time 0.42 seconds
Broadcast time 0.00 seconds
At first sight it seems that some commands are called as many times as the -np X argument specifies, and that I have to enclose them in an if myid == 0: block (a minimal sketch of that pattern follows below). Certainly the examples will help, as will further experimenting. For rainfall and parallelization I am hoping validation_tests/case_studies/towradgi/run_towradgi.py will be the reference.
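For reference, the pattern I mean is something like the following hypothetical sketch (assuming barrier is available from the parallel interface, e.g. anuga or anuga_parallel depending on the version):

# Sketch: do serial-only work on processor 0, then synchronise all processors
if myid == 0:
    # serial-only work: preparing inputs, printing, writing files
    print "# running serial setup on processor 0 only"
barrier()   # the other processors wait here until processor 0 has finished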
Cheers to you
Sebastian