neuml/codequestion

Integrate FastAPI for model serving

davidmezzetti opened this issue · 0 comments

Similar to neuml/txtai#12, allow serving codequestion models via FastAPI. Keeping the model loaded in a long-running service will speed up calls from the command line (#4) and also opens up the possibility of remote service calls.
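
A minimal sketch of what this could look like, assuming a prebuilt txtai embeddings index is available on disk. The index path and the endpoint shape below are placeholders, not the final codequestion API.

```python
# Minimal FastAPI sketch wrapping a prebuilt embeddings index.
# Assumption: the index path below is hypothetical; substitute the
# location where codequestion stores its model.
import os

from fastapi import FastAPI
from txtai.embeddings import Embeddings

app = FastAPI()

# Load the embeddings index once at startup instead of per CLI invocation
embeddings = Embeddings()
embeddings.load(os.path.expanduser("~/.codequestion/models/stackexchange"))

@app.get("/search")
def search(query: str, limit: int = 10):
    # Returns the closest matches as (id, score) pairs
    return embeddings.search(query, limit)
```

Could then be run with `uvicorn app:app` and queried with something like `curl "http://localhost:8000/search?query=read+json+file+python"`, with the CLI switching to HTTP calls when the service is up.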