Cornerstone-OnDemand/modelkit

Handle breaking batch behaviour options


Currently, a single failing item in a prediction batch breaks the whole `predict_batch` call and raises the error.
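
For illustration, a minimal sketch of the current behaviour, assuming the usual modelkit pattern of subclassing `Model` and implementing `_predict` (the `DivModel` name and the inputs are made up):

```python
from modelkit import Model

class DivModel(Model):
    def _predict(self, item):
        return 1 / item  # raises ZeroDivisionError when item == 0

model = DivModel()
model.predict_batch([1, 2, 4])  # fine: [1.0, 0.5, 0.25]
model.predict_batch([1, 0, 4])  # the single failing item aborts the whole call
```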

We may want another option, for example returning the results for every item except the failing ones (set to `None` or to the raised exception), together with a mask or something like that indicating which items failed.
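
As a sketch of what such an option could look like, here is a hypothetical external helper (`predict_batch_safe` is not part of modelkit) that predicts items one by one, stores the raised exception in place of the missing result, and returns a boolean success mask alongside:

```python
from typing import Any, List, Tuple

def predict_batch_safe(model, items) -> Tuple[List[Any], List[bool]]:
    """Hypothetical helper: predict each item independently, replacing
    failures with the raised exception and returning a success mask."""
    results: List[Any] = []
    mask: List[bool] = []
    for item in items:
        try:
            results.append(model.predict(item))
            mask.append(True)
        except Exception as exc:  # deliberately broad: catch any per-item failure
            results.append(exc)   # or None, depending on the chosen option
            mask.append(False)
    return results, mask

results, mask = predict_batch_safe(model, [1, 0, 4])
# results -> [1.0, ZeroDivisionError('division by zero'), 0.25]
# mask    -> [True, False, True]
```

Note that predicting item by item gives up the batching benefits of `_predict_batch`; a native option in modelkit would presumably run the batch first and only fall back to per-item calls when it fails.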