mikegoatly/lifti

Chaining multiple SerializeAsync/DeserializeAsync - header data issue

tomcashman opened this issue · 6 comments

I need to serialize two FullTextIndex instances into a single file.

When trying to read the file, I receive the following error:

System.AggregateException: One or more errors occurred. (Unable to read header data from serialized index content.)

My deserialization is done inside a task:

return Task.Run(async () =>
{
   await serializer.DeserializeAsync(index1, stream, disposeStream: false);
   await serializer.DeserializeAsync(index2, stream, disposeStream: true);
});

The indexes are serialized by a separate program, similarly inside a task:

return Task.Run(async () =>
{
   await serializer.SerializeAsync(index1, stream, disposeStream: false);
   await serializer.SerializeAsync(index2, stream, disposeStream: true);
});
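For context, here's a minimal sketch of our setup (assuming LIFTI's `FullTextIndexBuilder<TKey>` and `BinarySerializer<TKey>`; the file path, keys, and sample text are illustrative, not our real data):

```csharp
// Minimal repro sketch - two indexes chained into one file.
// Assumes the LIFTI NuGet package; file name and content are placeholders.
using System.IO;
using System.Threading.Tasks;
using Lifti;
using Lifti.Serialization.Binary;

public static class Repro
{
    public static async Task Main()
    {
        var index1 = new FullTextIndexBuilder<int>().Build();
        var index2 = new FullTextIndexBuilder<int>().Build();
        await index1.AddAsync(1, "some searchable text");
        await index2.AddAsync(1, "other searchable text");

        var serializer = new BinarySerializer<int>();
        using var stream = File.Create("indexes.bin");

        // Chain both indexes into the same stream; only the last call
        // disposes it.
        await serializer.SerializeAsync(index1, stream, disposeStream: false);
        await serializer.SerializeAsync(index2, stream, disposeStream: true);
    }
}
```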

Hi @tomcashman! Thanks for raising an issue - serializing multiple indexes to the same file wasn't an anticipated use case, so if you don't mind sharing your reasoning, I'd be interested to understand why you need to do it.

Without debugging, I think the problem is that there's no guarantee the first call to DeserializeAsync leaves the stream positioned immediately after the last byte of the first index. If it doesn't, the second call reads the second index's header bytes (which identify the version of the serializer that wrote it) from the wrong location, hence the header error. It might be possible to provide this guarantee, but I'd need to spend some time investigating it.

We've been using the library to build a full-text search index for a video game. Due to the nature of game consoles and their I/O restrictions, it's better for us to read one large sequential file.

If it's too big of a task to support this, we can try to find a workaround.
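One workaround we could try: serialize each index into its own MemoryStream and write each chunk to the file with a length prefix, so the reader can position the stream at each index's exact start regardless of where the previous deserialization left it. A rough sketch (the length-prefix framing is our own idea, not part of LIFTI; `index1`, `index2`, and `serializer` are as in the snippets above, and `ReadExactly` requires .NET 7+):

```csharp
// Write side: frame each serialized index with a 4-byte length prefix.
using (var file = File.Create("indexes.bin"))
{
    foreach (var index in new[] { index1, index2 })
    {
        using var buffer = new MemoryStream();
        await serializer.SerializeAsync(index, buffer, disposeStream: false);
        var bytes = buffer.ToArray();
        file.Write(BitConverter.GetBytes(bytes.Length)); // length prefix
        file.Write(bytes);                               // serialized index
    }
}

// Read side: slice each framed chunk into its own stream so every
// DeserializeAsync call starts at a known boundary.
using (var file = File.OpenRead("indexes.bin"))
{
    foreach (var index in new[] { index1, index2 })
    {
        var lenBytes = new byte[4];
        file.ReadExactly(lenBytes);
        var chunk = new byte[BitConverter.ToInt32(lenBytes)];
        file.ReadExactly(chunk);
        await serializer.DeserializeAsync(
            index, new MemoryStream(chunk), disposeStream: true);
    }
}
```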

Thanks for the update - I'll have a quick look at this tonight to see what's involved in making it work.

@tomcashman This should be fixed for you now in v3.1.0 - let me know how you get on!

It works perfectly. Thank you so much! 😃

No problem, @tomcashman - glad it worked for you! I'd love to find out how you get on with it once you've got things fully up and running - feel free to reach out on Twitter (same username there) if you want.