genaray/ZeroAllocJobScheduler

ConcurrentQueue generates garbage

LilithSilver opened this issue · 0 comments

As I discussed in #9, ConcurrentQueue generates garbage.

I did a bit more testing and produced the following table by repeatedly clearing and re-adding a set number of items to a ConcurrentQueue:

| Method | QueueCapacity | Reps | Mean | Error | StdDev | Median | Allocated |
|--- |--- |--- |---: |---: |---: |---: |---: |
| BenchmarkConcurrentQueue | 1 | 32 | 5,011.1 ns | 103.44 ns | 145.00 ns | 5,000.0 ns | 16984 B |
| BenchmarkConcurrentQueueWithDequeue | 1 | 32 | 2,174.3 ns | 47.47 ns | 78.00 ns | 2,200.0 ns | 600 B |
| BenchmarkQueue | 1 | 32 | 648.0 ns | 17.21 ns | 50.22 ns | 600.0 ns | 600 B |
| BenchmarkConcurrentQueue | 32 | 32 | 19,262.5 ns | 203.21 ns | 199.58 ns | 19,200.0 ns | 16984 B |
| BenchmarkConcurrentQueueWithDequeue | 32 | 32 | 28,842.9 ns | 416.75 ns | 369.44 ns | 28,800.0 ns | 600 B |
| BenchmarkQueue | 32 | 32 | 8,573.5 ns | 172.01 ns | 277.77 ns | 8,500.0 ns | 600 B |
| BenchmarkConcurrentQueue | 64 | 32 | 35,466.7 ns | 504.52 ns | 393.89 ns | 35,350.0 ns | 41560 B |
| BenchmarkConcurrentQueueWithDequeue | 64 | 32 | 51,761.5 ns | 398.80 ns | 333.01 ns | 51,700.0 ns | 1368 B |
| BenchmarkQueue | 64 | 32 | 9,938.5 ns | 198.82 ns | 166.02 ns | 10,000.0 ns | 600 B |
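The two patterns being compared can be sketched roughly like this (a hypothetical repro, not the actual benchmark source): `Clear()` abandons the queue's current segment chain, so every rep leaves the old segments as garbage, whereas draining with `TryDequeue` lets the existing segment's slots be recycled.

```csharp
using System;
using System.Collections.Concurrent;

// Enqueue `items` elements, then empty the queue via Clear().
// Clear() discards the current segment(s), which become garbage.
static void RunWithClear(ConcurrentQueue<int> queue, int items)
{
    for (int i = 0; i < items; i++) queue.Enqueue(i);
    queue.Clear();
}

// Enqueue `items` elements, then empty the queue by dequeuing everything.
// The segment's slots are reused, so steady-state reps allocate nothing
// (as long as `items` fits inside the current segment).
static void RunWithDequeue(ConcurrentQueue<int> queue, int items)
{
    for (int i = 0; i < items; i++) queue.Enqueue(i);
    while (queue.TryDequeue(out _)) { }
}

var q = new ConcurrentQueue<int>();
RunWithDequeue(q, 16); // reuses the initial segment on every call
```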

This shows that even if we're just adding and removing a single item, garbage is generated every single time the queue is cleared! However, if we dequeue the whole queue instead of clearing it, it reuses its segment... unless we exceed the segment length.

Looking at the ConcurrentQueue source, the initial segment size is 32 items. So, in the current system, if we ever have more than 32 JobMetas queued at once, we start allocating, and the moment we dequeue things, we generate garbage.

However, there's a hack to increase the initial segment size: if we pass a dummy IEnumerable&lt;JobMeta&gt; to the constructor, the queue sizes its first segment to fit that sequence. So during initialization we should construct the queue with a sequence as long as our limit, and then immediately dequeue (NOT clear!) the whole queue.
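A minimal sketch of that warm-up step, assuming a configurable capacity (`maxJobs` here is an invented name, and `JobMeta` is a placeholder stand-in for the scheduler's real struct): seed the constructor with a dummy array so the first segment is sized to hold it, then drain rather than clear so that large segment stays alive for reuse.

```csharp
using System.Collections.Concurrent;

const int maxJobs = 128; // assumed capacity limit for illustration

// The IEnumerable<T> constructor sizes the initial segment to fit the
// sequence, giving us a segment big enough for maxJobs items up front.
var queue = new ConcurrentQueue<JobMeta>(new JobMeta[maxJobs]);

// Drain, do NOT Clear(): Clear() would throw the oversized segment away
// and put us right back at the default 32-item segment.
while (queue.TryDequeue(out _)) { }

// Placeholder for the scheduler's real JobMeta type.
struct JobMeta { }
```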