blue-oil/blueoil

Support dual inference at the same time on FPGA

tk26eng opened this issue · 3 comments

Currently, the runtime supports only one inference at a time on FPGA.
But sometimes running two models is useful.
We might need a feature for that.

Related issue: #666
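As a rough illustration of what "dual inference" could mean in practice (the `Runtime` class and `infer` method below are hypothetical placeholders, not the actual blueoil API): two models could coexist if access to the single FPGA accelerator is serialized with a lock, so requests from either model never hit the hardware concurrently.

```python
import threading

# Hypothetical stand-in for a blueoil-style runtime; the real API differs.
class Runtime:
    def __init__(self, model_name):
        self.model_name = model_name

    def infer(self, data):
        # Placeholder for an FPGA inference call.
        return f"{self.model_name}:{sum(data)}"

# One lock serializes access to the single FPGA accelerator,
# so two models can share it without concurrent hardware access.
fpga_lock = threading.Lock()

def run_inference(runtime, data, results, key):
    with fpga_lock:
        results[key] = runtime.infer(data)

classifier = Runtime("classifier")
detector = Runtime("detector")

results = {}
threads = [
    threading.Thread(target=run_inference,
                     args=(classifier, [1, 2, 3], results, "cls")),
    threading.Thread(target=run_inference,
                     args=(detector, [4, 5], results, "det")),
]
for t in threads:
    t.start()
for t in threads:
    t.join()

print(results["cls"])  # classifier:6
print(results["det"])  # detector:9
```

This only sketches time-sharing of one accelerator; true concurrent execution of two models would need support in the FPGA design itself.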

@tk26eng @primenumber

Just a related question:
Is there a possibility of using multiple FPGAs for inference?
(Basically if the model is large, two FPGAs can be used for inference for two different models and the results are synced for combined output)

@kalpitthakkar-lm

> @tk26eng @primenumber
>
> Just a related question:
> Is there a possibility of using multiple FPGAs for inference?
> (Basically if the model is large, two FPGAs can be used for inference for two different models and the results are synced for combined output)

We currently have no plan to use multiple FPGAs for inference.
Using a bigger FPGA is an easier way to handle a large model than coordinating multiple FPGAs.