Appending data to a DataArray in Dfs2 file
Closed this issue · 2 comments
Hi there,
I'm in a bit of a struggle, and I hope my problem has a solution.
It's all about avoiding memory errors like: `MemoryError: Unable to allocate 1.52 GiB for an array with shape (41760, 75, 65) and data type float64`.
I am seeking a way to append data held in memory to an existing Dfs2 file, without loading the existing data into memory: a "load and dump" solution.
In the example above with the MemoryError, the first dimension is time, as you might know: 41760 timesteps with a timestep of 60 seconds. This means it tried to load 29 days before it crashed, and I would like to have a full year.
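For context, the reported allocation matches the array size exactly; a quick back-of-the-envelope check (pure Python, nothing mikeio-specific):

```python
# Memory needed for a (41760, 75, 65) float64 array, as in the error message
n_timesteps, ny, nx = 41760, 75, 65
bytes_per_float64 = 8

total_bytes = n_timesteps * ny * nx * bytes_per_float64
print(f"{total_bytes / 2**30:.2f} GiB")      # -> 1.52 GiB, matching the MemoryError

# Switching to float32 would halve the footprint
print(f"{total_bytes / 2 / 2**30:.2f} GiB")  # -> 0.76 GiB
```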
When opening a Dfs2 file with mikeio, I can see some details about the file without reading the data into memory. Is it possible to write data into the existing DataArray (not Dataset) in this mode?
```python
from mikeio import Dfs2
import numpy as np

dfs2 = Dfs2('data.dfs2')  # assuming this is the existing Dfs2 file
test_data = np.zeros(dfs2.shape)
# ds = dfs2.read()  # I would like to avoid this...
dfs2.write(to_existing=DataArray, data=test_data)  # 'to_existing' is a made-up argument, but something like this
```
Is there any solution to my memory problem?
Thanks in advance!
I think there are several solutions, or parts of one:
- Use `mikeio.generic.concat` to concatenate files
- Use `mikecore` directly and skip MIKE IO; then you can append to an existing Dfs2 file
- Use float32 instead of float64
- Rent a cloud computer with more memory
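The concat route boils down to writing the year in chunks that each fit in memory and then joining the resulting files. A minimal sketch of the chunking arithmetic, with the actual mikeio write/concat calls left as placeholder comments and a tiny demo grid standing in for the real 75 x 65 one (chunk size and file names are hypothetical):

```python
import numpy as np

n_total = 525600   # one year of 60 s timesteps (assumption: 365 days)
chunk_len = 43200  # 30 days per chunk; small enough to fit in memory
ny, nx = 3, 4      # tiny demo grid; the real file is 75 x 65

chunk_files = []
written = 0
for i, start in enumerate(range(0, n_total, chunk_len)):
    n = min(chunk_len, n_total - start)  # last chunk may be shorter
    chunk = np.zeros((n, ny, nx), dtype=np.float32)  # float32 halves memory vs float64
    # ... fill `chunk`, then write it to its own file, e.g.:
    # dfs2.write(f"chunk_{i:02d}.dfs2", data=[chunk])  # hypothetical call
    chunk_files.append(f"chunk_{i:02d}.dfs2")
    written += chunk.shape[0]

print(len(chunk_files), written)  # 13 chunk files covering 525600 timesteps
# Finally, join them into one file:
# mikeio.generic.concat(chunk_files, "year.dfs2")
```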
Hi @ecomodeller
I tried the first solution and it worked like a charm! Thank you!