[Fatal Error] When the amount of data is too large, errors occur
Opened this issue · 3 comments
#include "../src/memory_pool.h"
// #include "../tmp/MemoryPool.h"
// #include "../src/allocate.h"
#include <vector>
#include <cstdlib>
#include <iostream>

std::vector<int, MemoryPool<int>> v;
// std::vector<int> v;

int main(void) {
    for (int i = 0; i < 100000000; i++) {
        int t = rand() % 10;
        v.emplace_back(t);
        if (i % 100000 == 0)
            std::cout << i << "\n";
    }
    return 0;
}
0
100000
200000
300000
400000
500000
600000
700000
800000
900000
1000000
1100000
1200000
1300000
1400000
1500000
1600000
1700000
1800000
1900000
2000000
[1] 95072 segmentation fault ./test
I have run and tested your code. My debug environment is Ubuntu 22, Linux 5.19, clang++ 14.
The problem may be caused by the default BlockSize being 4096. If you run under AddressSanitizer, you will find that this code always errors at i == 1021 (1021 * 4 bytes plus some utility memory). Before allocating a piece of memory from this pool, you should check whether it has enough space left, or catch the error. Or just set a bigger BlockSize.
If there is not enough memory, shouldn't the pool allocate a new block? Why increase the block size?
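For comparison, a pool that grows on demand chains a fresh block when the current one is exhausted, rather than requiring a larger BlockSize up front. A minimal sketch of that idea (illustrative only, not the memory_pool.h from this repo; alignment handling is simplified for small types like int, and there is no deallocate/reuse path):

```cpp
#include <cstddef>
#include <cstdlib>
#include <new>

template <typename T, std::size_t BlockSize = 4096>
class GrowingPool {
    struct Block { Block* next; };
    Block* blocks_ = nullptr; // list of all blocks, kept for cleanup
    char*  cur_    = nullptr; // next free byte in the current block
    char*  end_    = nullptr; // one past the current block's end

    void grow() {
        // Chain a brand-new block instead of failing when space runs out.
        char* raw = static_cast<char*>(std::malloc(BlockSize));
        if (!raw) throw std::bad_alloc();
        Block* b = reinterpret_cast<Block*>(raw);
        b->next = blocks_;
        blocks_ = b;
        cur_ = raw + sizeof(Block); // payload starts after the header
        end_ = raw + BlockSize;
    }

public:
    ~GrowingPool() {
        while (blocks_) {
            Block* n = blocks_->next;
            std::free(blocks_);
            blocks_ = n;
        }
    }

    T* allocate() {
        // The key check: if the current block can't fit one more T,
        // grow by chaining a new block rather than erroring out.
        if (cur_ == nullptr ||
            end_ - cur_ < static_cast<std::ptrdiff_t>(sizeof(T)))
            grow();
        T* p = reinterpret_cast<T*>(cur_);
        cur_ += sizeof(T);
        return p;
    }
};
```

With this layout the test program above never exhausts the pool; each time a block fills up, `allocate` transparently starts a new one.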