Memory Error
wangyuyy opened this issue · 1 comment
wangyuyy commented
I tried to read a tfrecord file as shown in Usage; the code is as follows:
import numpy as np
import torch
from tfrecord.torch.dataset import TFRecordDataset
if __name__ == '__main__':
    path = './cater_with_masks_test.tfrecords-00000-of-00100'
    index_path = None
    description = {"camera_matrix": "float32",
                   "image": "uint8",
                   "mask": "uint8",
                   "object_positions": "float32"}
    data = TFRecordDataset(path, index_path, description)
    loader = torch.utils.data.DataLoader(data, batch_size=1, shuffle=False)
    output = next(iter(loader))
    print(output)
but it raised a MemoryError when trying to iterate over the loader.
Traceback (most recent call last):
File "torch_read.py", line 17, in <module>
output = next(iter(loader))
File "/root/miniconda3/envs/dataset/lib/python3.7/site-packages/torch/utils/data/dataloader.py", line 681, in __next__
data = self._next_data()
File "/root/miniconda3/envs/dataset/lib/python3.7/site-packages/torch/utils/data/dataloader.py", line 721, in _next_data
data = self._dataset_fetcher.fetch(index) # may raise StopIteration
File "/root/miniconda3/envs/dataset/lib/python3.7/site-packages/torch/utils/data/_utils/fetch.py", line 32, in fetch
data.append(next(self.dataset_iter))
File "/root/miniconda3/envs/dataset/lib/python3.7/site-packages/tfrecord/reader.py", line 219, in example_loader
for record in record_iterator:
File "/root/miniconda3/envs/dataset/lib/python3.7/site-packages/tfrecord/reader.py", line 80, in tfrecord_iterator
yield from read_records()
File "/root/miniconda3/envs/dataset/lib/python3.7/site-packages/tfrecord/reader.py", line 71, in read_records
datum_bytes = datum_bytes.zfill(int(length * 1.5))
MemoryError
Besides, I wonder how to write the description that is used in TFRecordDataset.
vahidk commented
Description is optional; you don't need to provide it.
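For example, a minimal sketch (reusing the file path from the snippet above) that omits the description entirely; with description=None the reader should return every feature it finds in each record:

import torch
from tfrecord.torch.dataset import TFRecordDataset

# No description: the loader yields a dict with all features present in the record.
dataset = TFRecordDataset('./cater_with_masks_test.tfrecords-00000-of-00100',
                          index_path=None,
                          description=None)
loader = torch.utils.data.DataLoader(dataset, batch_size=1)
print(next(iter(loader)))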
If you want help with your issue, you should spend some time trying to debug it, not just paste the error here. For example, you can check how much memory it is trying to allocate.
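As a starting point, here is a rough sketch (assuming the standard TFRecord wire format: an 8-byte little-endian record length, a 4-byte length CRC, the payload, then a 4-byte payload CRC) that prints each record's length field, which is the value reader.py multiplies by 1.5 before allocating:

import struct

path = './cater_with_masks_test.tfrecords-00000-of-00100'
with open(path, 'rb') as f:
    while True:
        header = f.read(8)  # uint64 length of the next record
        if len(header) < 8:
            break
        length, = struct.unpack('<Q', header)
        print('record length:', length, 'bytes')
        # Skip the length CRC (4 bytes), the payload, and the payload CRC (4 bytes).
        f.seek(4 + length + 4, 1)

If one of those lengths is implausibly large, the file is more likely truncated or corrupted than genuinely containing a record that big.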