Memory error with numpy's linear algebra with very large datasets
SimonMolinsky opened this issue · 2 comments
SimonMolinsky commented
If a dataset has more than 10,000 points, the semivariogram calculation crashes with a memory error.
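A likely culprit, assuming the implementation builds a dense pairwise distance matrix, is the quadratic memory cost. A quick back-of-the-envelope check shows why ~10,000 points is where things start to break:

```python
# Memory needed for ONE dense n x n float64 distance matrix.
# (Illustrative estimate; the actual code may allocate several
# such intermediates, multiplying this figure.)
n = 10_000
bytes_needed = n * n * 8  # float64 = 8 bytes per entry
print(bytes_needed / 1e9)  # ~0.8 GB for a single matrix
```

A few temporary arrays of this size during the semivariance computation can easily exhaust available RAM.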
SimonMolinsky commented
The problem is too complex to solve without large changes to the package structure. It remains open and will be reviewed in the future. If you, dear reader, are an expert in computational algebra and memory management, we need you!
SimonMolinsky commented
OK, I'm closing this issue for now, but if it comes up again at some point in the future, I will rewrite more chunks of the code with dask.
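Pending a dask rewrite, one workaround is to process the distance matrix in row blocks so the full n x n array never exists in memory at once. This is a minimal sketch, not the package's actual code; the function name, parameters, and the single-lag experimental semivariance formula are assumptions for illustration:

```python
import numpy as np

def semivariance_chunked(coords, values, lag, tol, chunk=1_000):
    """Experimental semivariance at one lag, computed in row
    blocks of `chunk` points so peak memory stays bounded.
    (Hypothetical helper, not part of the package API.)"""
    n = len(coords)
    total, count = 0.0, 0
    for start in range(0, n, chunk):
        block = coords[start:start + chunk]            # (b, 2)
        # Distances from this block to every point: shape (b, n),
        # instead of the full (n, n) matrix at once.
        d = np.linalg.norm(block[:, None, :] - coords[None, :, :],
                           axis=-1)
        mask = np.abs(d - lag) <= tol                  # pairs near the lag
        diffs = values[start:start + chunk, None] - values[None, :]
        total += np.sum(diffs[mask] ** 2)
        count += mask.sum()
    # Each pair (i, j) is seen twice (once per row block direction),
    # which cancels out in the averaged semivariance.
    return total / (2 * count) if count else np.nan

# Sanity check on synthetic data
rng = np.random.default_rng(0)
coords = rng.random((5_000, 2))
vals = rng.random(5_000)
gamma = semivariance_chunked(coords, vals, lag=0.5, tol=0.05)
```

With `chunk=1_000` and n = 5,000, each block's distance array is only about 40 MB; a dask rewrite would apply the same blocking idea automatically via chunked arrays.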