How to pass parameters from preprocessing to postprocessing when using micro-batch operations
pengxin233 opened this issue · 4 comments
📚 The doc issue
I have a variable that is obtained by parsing the image data during preprocessing, but it is not an input to the model. I want to pass it to postprocessing and return it together with the results. How can I pass it from preprocessing to postprocessing?
Suggest a potential alternative/fix
No response
Here is how the handle method in the base handler does it. You can pass the data as a tuple to inference, and pass the tuple from inference to postprocess:
https://github.com/pytorch/serve/blob/master/ts/torch_handler/base_handler.py#L420-L424
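For illustration, here is a minimal sketch of that pattern. This is not a real TorchServe handler (no `ts` imports, and the "model" is a stand-in arithmetic operation); the class and field names are hypothetical. It only shows the tuple packing/unpacking across the three stages:

```python
# Sketch of passing a side value (not consumed by the model) from
# preprocess through inference to postprocess as part of a tuple.
# All names here are illustrative, not TorchServe API.

class TupleHandler:
    def preprocess(self, data):
        # Hypothetical parsing step: "meta" is the extra value obtained
        # while parsing (e.g. original image size), not a model input.
        model_input = [d["value"] * 2 for d in data]
        meta = [d["id"] for d in data]
        return model_input, meta            # pack both into one tuple

    def inference(self, preprocessed):
        model_input, meta = preprocessed    # unpack; only model_input is "inferred"
        output = [x + 1 for x in model_input]  # stand-in for the model call
        return output, meta                 # re-pack with the side value

    def postprocess(self, inferred):
        output, meta = inferred             # unpack and return both together
        return [{"id": m, "result": o} for m, o in zip(meta, output)]

    def handle(self, data, context=None):
        return self.postprocess(self.inference(self.preprocess(data)))
```

In a real handler you would override `preprocess`, `inference`, and `postprocess` on `BaseHandler` the same way, keeping the tuple shape consistent between the stage that returns it and the stage that unpacks it.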
For ordinary requests this does work. But we use micro-batch operations, so aren't preprocessing and postprocessing multi-threaded? This method of passing data seems to cause a problem there: my handler works fine when batchSize is 1, but an error occurs when batchSize is 10.
cc @mreso
Thank you very much for your help. This method of passing parameters also works correctly with micro-batching. The error was caused by my incorrect usage and has been fixed.
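For readers hitting the same batchSize > 1 issue: one way to keep the pairing robust when a batch may be split is to carry the side value per item rather than as one batch-level object. This is a hedged sketch in plain Python, not TorchServe's micro-batching API; all names are illustrative:

```python
# Sketch: keep one (model_input, side_value) pair per request item, so
# that however a batch is split into micro-batches, each output stays
# paired with its own side value. Illustrative only.

def preprocess(batch):
    # One pair per item instead of (all_inputs, all_side_values).
    return [(item["value"], item["id"]) for item in batch]

def inference(pairs):
    inputs = [x for x, _ in pairs]        # strip side values for the model
    outputs = [x * 10 for x in inputs]    # stand-in for the model call
    metas = [m for _, m in pairs]
    return list(zip(outputs, metas))      # re-pair output with side value

def postprocess(pairs):
    return [{"id": m, "result": o} for o, m in pairs]
```

With this shape, slicing the list of pairs for a micro-batch keeps each item's metadata attached to it.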