TexasInstruments/jacinto-ai-devkit

Does the Jacinto7 family support TFLite and Caffe?


The TIDL documentation is a bit confusing. I'm trying to do custom object detection and classification; could you advise me on the following?

  1. Should I use PyTorch?
  2. If TFLite is supported, could you provide some hints on where I can find training code for object detection?
  3. If TFLite is supported, could you provide some hints on where I can find training code for semantic segmentation?

TIDL can import PyTorch (exported to ONNX), TensorFlow, TFLite, and Caffe models.
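
Since PyTorch models reach TIDL via ONNX, a minimal export sketch may help. The MobileNetV2 model, the 224x224 input, the file name, and the opset below are illustrative placeholders rather than TIDL requirements; use your own trained network and check which opset your SDK version accepts.

```python
import torch
import torchvision

# Minimal sketch: export a PyTorch model to ONNX so TIDL can import it.
# MobileNetV2 and the 224x224 input size are placeholders; substitute your
# trained network and its actual input resolution.
model = torchvision.models.mobilenet_v2().eval()
dummy_input = torch.randn(1, 3, 224, 224)  # NCHW dummy tensor used for tracing

torch.onnx.export(
    model,
    dummy_input,
    "model.onnx",            # file you later point the TIDL import config at
    input_names=["input"],
    output_names=["output"],
    opset_version=11,        # choose an opset your TIDL SDK version accepts
)
```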

Object Detection:

Currently TIDL supports the Single Shot MultiBox Detector (SSD); there are plans to add more object detectors later. You can train an SSD object detector in multiple ways. Please consult the latest TIDL documentation for more details: https://software-dl.ti.com/jacinto7/esd/processor-sdk-rtos-jacinto7/latest/exports/docs/tidl_j7_01_02_00_09/ti_dl/docs/user_guide_html/md_tidl_fsg_meta_arch_support.html

As of now, there are basically three ways to do Object Detection:

  1. Using Caffe - please see the documentation at: https://git.ti.com/cgit/jacinto-ai/caffe-jacinto/about/, https://git.ti.com/cgit/jacinto-ai/caffe-jacinto-models/about/

  2. Using Tensorflow Object Detection API: https://github.com/tensorflow/models/tree/master/research/object_detection

  3. Using PyTorch/ONNX. You can use any good implementation of SSD in PyTorch. We recommend mmdetection: https://github.com/open-mmlab/mmdetection
    The TIDL link above describes how the exported ONNX file and a separate meta-architecture prototxt can be provided to TIDL. mmdetection already has training examples for SSD.
    Note: If you use mmdetection's SSD, please remove the L2 normalization operation, as it uses layers that are currently not supported by TIDL (see the inspection sketch after this list for a quick way to check an exported model).
    For an example, you can refer to the following link: https://e2e.ti.com/support/processors/f/791/p/907822/3374066#3374066
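
As a rough way to spot the L2 normalization (or any other questionable layer) in an exported mmdetection SSD, you can list the op types present in the ONNX graph. This is only a sketch: the file name is a placeholder, and the op names flagged below ("LpNormalization", "ReduceL2") are guesses at how L2 normalization might appear, since the exact export depends on the PyTorch/mmdetection versions.

```python
import onnx
from collections import Counter

# Sketch: list the op types in an exported SSD so unsupported layers stand out.
model = onnx.load("ssd_mmdet.onnx")  # placeholder path to the exported model

op_counts = Counter(node.op_type for node in model.graph.node)
for op_type, count in sorted(op_counts.items()):
    print(f"{op_type}: {count}")

# L2 normalization may show up as LpNormalization or a ReduceL2-based pattern,
# depending on how it was exported; adjust this list for your export.
suspects = [n.name for n in model.graph.node
            if n.op_type in ("LpNormalization", "ReduceL2")]
print("possible L2-norm nodes:", suspects)
```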

Semantic segmentation:

Please refer to the following: https://git.ti.com/cgit/jacinto-ai/pytorch-jacinto-ai-devkit/about/

Most of this documentation is accessible from our main page: https://github.com/TexasInstruments/jacinto-ai-devkit

@mathmanu
Thanks for the detailed explanation.

I moved forward with TensorFlow SSD MobileNet-V2 and was able to convert the frozen model.

But how do I create tidl_import_ssd_mobilenet_v2.txt?

"/ti_dl/test/testvecs/config/import/public/tensorflow/tidl_import_ssd_mobilenet_v2.txt"

I was running:
./out/tidl_model_import.out ${TIDL_INSTALL_PATH}/ti_dl/test/testvecs/config/import/public/tensorflow/tidl_import_ssd_mobilenet_v2.txt

which outputs:

TF Model (Proto) File : ../../test/testvecs/models/public/tensorflow/ssd_mobilenet_v2/ssd_v2.pb
TIDL Network File : ../../test/testvecs/config/tidl_models/tensorflow/tidl_ssd_mobilenet_v2_net.bin
TIDL IO Info File : ../../test/testvecs/config/tidl_models/tensorflow/tidl_ssd_mobilenet_v2_io_
Concat is Only suported accorss channels
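
The "Concat is only supported across channels" message typically means some concatenation in the graph runs over an axis other than the channel axis (axis 3, or -1, in TensorFlow's NHWC layout). A rough way to locate such nodes is to dump every Concat/ConcatV2 node and its axis constant from the frozen graph; the sketch below assumes the ssd_v2.pb path from the log above and that each axis input is a plain constant.

```python
import tensorflow as tf
from tensorflow.python.framework import tensor_util

# Sketch: list Concat/ConcatV2 nodes and their axes in the frozen graph, so
# concatenations that do not run over the channel axis can be spotted.
PB_PATH = "ssd_v2.pb"  # adjust to the frozen model path from the import config

graph_def = tf.compat.v1.GraphDef()
with open(PB_PATH, "rb") as f:
    graph_def.ParseFromString(f.read())

nodes_by_name = {n.name: n for n in graph_def.node}
for node in graph_def.node:
    if node.op not in ("Concat", "ConcatV2"):
        continue
    # ConcatV2 takes the axis as its last input; Concat takes it as the first.
    axis_input = node.input[-1] if node.op == "ConcatV2" else node.input[0]
    axis_node = nodes_by_name.get(axis_input.split(":")[0])
    if axis_node is not None and axis_node.op == "Const":
        axis = tensor_util.MakeNdarray(axis_node.attr["value"].tensor)
        print(node.name, node.op, "axis =", int(axis))
```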

a) How can I correctly set up tidl_import_ssd_mobilenet_v2.txt?
b) Where am I going wrong?

Reference: https://e2e.ti.com/support/processors/f/791/t/892409?TDA4VM-import-of-TensorFlow-SSD-ModelNet-v2

Hi arjunskumar,

Please file an issue on the appropriate E2E forum to get your TIDL-specific questions answered by the experts.
https://e2e.ti.com/support/processors/f/791
(Make sure to add the tags TIDL and TDA4 in the tags section.)

Also mention the reference link you listed above, as I see that thread was not answered.

Best regards,

Hi @mathmanu, I've posted to the forum, but it's not publicly visible yet as it requires moderator permission.