Azure/InferenceSchema

Output Schema Doesn't Respect Type


Summary

If you pass a pandas type to the output schema decorator and return a pandas DataFrame from the run function, the returned DataFrame is not automatically encoded.
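A minimal scoring function that reproduces this is sketched below (the sample frames and column names are hypothetical, used only to illustrate the decorators):

```python
import pandas as pd
from inference_schema.schema_decorators import input_schema, output_schema
from inference_schema.parameter_types.pandas_parameter_type import PandasParameterType

# Hypothetical sample frames, used by the decorators to generate the schema
input_sample = pd.DataFrame({'feature': [1.0]})
output_sample = pd.DataFrame({'prediction': [0.0]})

@input_schema('data', PandasParameterType(input_sample))
@output_schema(PandasParameterType(output_sample))
def run(data):
    # The input decorator hands this function a DataFrame, but the
    # DataFrame returned here is not converted back to JSON on the way out.
    return data.rename(columns={'feature': 'prediction'})
```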

What do you expect

Azure ML will encode my output to JSON.

What actually happens

The service reports an error when encoding the DataFrame.

In an effort to keep this library separate from AML, it isn't the job of the AML service to encode the output to JSON (or to decode the JSON input, in the other direction). For input, though, allowing type transformations is one of the value adds of the decorators, and AML users can take advantage of that.
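For illustration, the input-side transformation the decorator provides is roughly what a scoring script would otherwise have to do by hand (a sketch, not the library's actual implementation):

```python
import json
import pandas as pd

# Without the input decorator, the scoring script would have to parse the
# JSON payload and rebuild the DataFrame itself, e.g.:
def run(raw_body):
    payload = json.loads(raw_body)        # e.g. '{"data": [{"feature": 1.0}]}'
    data = pd.DataFrame(payload['data'])  # rebuild the DataFrame by hand
    return data
```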

Our original thinking was that we didn't want to fail a call to the decorated function because of a datatype transformation issue after the function had already run, but I can see the value in allowing for this (as well as in maintaining consistency). Will take a look at adding this in.
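Until output encoding is supported, one workaround (a sketch, reusing the hypothetical sample frames from the snippet above) is to serialize the DataFrame explicitly before returning it:

```python
import json

@input_schema('data', PandasParameterType(input_sample))
@output_schema(PandasParameterType(output_sample))
def run(data):
    result = data.rename(columns={'feature': 'prediction'})
    # Serialize the DataFrame explicitly so the response is plain JSON
    return json.loads(result.to_json(orient='records'))
```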