Need to figure out what that brain is. It probably subscribes to and publishes on a set of queues to receive and send information.
This module is meant to run the robot's mechanical components.
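A minimal sketch of what that pub/sub brain could look like, assuming a paho-mqtt client (1.x-style callbacks) and a local broker; the /bot/... topics are the ones named later in this file, and the feedback topic is a placeholder.

```python
# Minimal sketch of the MQTT "brain" loop (assumption: paho-mqtt 1.x client,
# broker running on the robot itself).
import json
import paho.mqtt.client as mqtt

BROKER = "localhost"  # assumption: local broker

def on_connect(client, userdata, flags, rc):
    # Subscribe to the command topics the motion module should act on.
    client.subscribe("/bot/base/move/#")
    client.subscribe("/bot/gripper/#")
    client.subscribe("/bot/pickObject")

def on_message(client, userdata, msg):
    # Dispatch each command to the mechanical layer, then publish feedback
    # so the caller knows the bot reached the requested state.
    print(f"command on {msg.topic}: {msg.payload!r}")
    client.publish("/bot/feedback",  # placeholder feedback topic
                   json.dumps({"topic": msg.topic, "status": "done"}))

client = mqtt.Client()
client.on_connect = on_connect
client.on_message = on_message
client.connect(BROKER)
client.loop_forever()
```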
- Validate why a move of X turns by some angle when not using PID... which is strange... perhaps related to the PID controller.
- Finish working on the IK testing
- Make the robot arm position a "set point" and resync to it every X ms, so MQTT would just "set the point" (maybe lower wiggle?); see the sketch after this list.
- Need to move some of this into the parent project
- Motion Feedback (via MQTT pub/sub reversed, so the client knows the bot is in place)
- Error Handling to MQTT error queue (in custom error handlers)
- Go to home position (avoiding obstacles)
- Follow Tracked Object?...
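A rough sketch of the set-point idea from the list above: MQTT callbacks only update the target, and a fixed-rate loop keeps re-commanding the servos toward it. write_servos() and RESYNC_PERIOD are placeholders.

```python
# Set-point resync sketch: MQTT only sets the target, a periodic loop
# re-asserts it to the servos (tune RESYNC_PERIOD for wiggle vs. load).
import threading
import time

RESYNC_PERIOD = 0.05  # seconds between resyncs (placeholder value)

set_point = {"joints": [0.0, 0.0, 0.0, 0.0], "lock": threading.Lock()}

def on_new_set_point(joints):
    # Called from the MQTT callback: just record the target, don't move yet.
    with set_point["lock"]:
        set_point["joints"] = list(joints)

def write_servos(joints):
    # Placeholder for the actual servo driver call.
    pass

def resync_loop():
    while True:
        with set_point["lock"]:
            target = list(set_point["joints"])
        write_servos(target)       # re-assert the same target every cycle
        time.sleep(RESYNC_PERIOD)

threading.Thread(target=resync_loop, daemon=True).start()
```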
- Build the architecture with IoT Hub for deployment/management and event collection, and BrainWave for inferencing
- Events are: position, movement orders, inferencing output
- IoT Edge connection and streaming of events (see the sketch after this list):
- Error Handling
- Motion commands
- Pick commands
- Objects seen
- Wrongly picked object - will feed a DNN training feedback loop (create new training examples)
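A hedged sketch of streaming the event types above through an IoT Edge module with the azure-iot-device SDK; the output name botEvents and the payload layout are assumptions.

```python
# Push events (position, motion/pick commands, objects seen, wrong picks,
# errors) to IoT Hub from an IoT Edge module. Output name is a placeholder.
import json
from azure.iot.device import IoTHubModuleClient, Message

client = IoTHubModuleClient.create_from_edge_environment()
client.connect()

def send_event(event_type, payload):
    # event_type mirrors the list above: "position", "motion", "pick",
    # "objectSeen", "wrongPick", "error".
    msg = Message(json.dumps({"type": event_type, "data": payload}))
    msg.content_type = "application/json"
    client.send_message_to_output(msg, "botEvents")  # assumed output name

send_event("position", {"x": 0.12, "y": 0.30, "phi": 1.57})
```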
- Find Object
- Track Object
- Detect Object
- Track it
- Follow Tracked Object
- Keep tracked object in field of view
- Loop
- If drifting to the left, turn +phi by a few degrees; if drifting to the right, turn -phi.
- Advance towards the object (/bot/base/move/forward) for 1-2 seconds, then re-loop until the distance to the object is within ~10-20 cm (see the sketch after this block)
- Fixed distance if in a static configuration, using /bot/base/move/XYphi
- Loop
- Keep tracked object in field of view
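A sketch of the track-and-approach loop above, assuming the detection module can report a normalized horizontal offset and a distance to the tracked object (both stubbed here); the /bot/base/move topics come from this file.

```python
# Track-and-approach loop: re-center the object with +/-phi turns, creep
# forward in short bursts, stop within the standoff distance.
import json
import time
import paho.mqtt.client as mqtt

client = mqtt.Client()
client.connect("localhost")  # same broker assumption as the earlier sketch

PHI_STEP = 5.0    # degrees to turn per correction (placeholder)
STANDOFF = 0.15   # stop when within ~10-20 cm of the object

def get_object_offset():
    # Placeholder: horizontal offset of the tracked object in the image,
    # normalized to [-1, 1] (negative = object drifting to the left).
    return 0.0

def get_object_distance():
    # Placeholder: distance to the tracked object in meters.
    return 0.0

def approach_tracked_object():
    while True:
        offset = get_object_offset()
        if abs(offset) > 0.1:
            # Keep the tracked object in the field of view:
            # +phi if it drifts left, -phi if it drifts right.
            phi = PHI_STEP if offset < 0 else -PHI_STEP
            client.publish("/bot/base/move/XYphi",
                           json.dumps({"x": 0, "y": 0, "phi": phi}))
        elif get_object_distance() > STANDOFF:
            # Object centered but still far: advance for a short burst.
            client.publish("/bot/base/move/forward",
                           json.dumps({"duration": 1.5}))
        else:
            break  # within the standoff distance, ready to pick
        time.sleep(0.5)
```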
- Pick Object with end effector /bot/pickObject (sequence sketched after this block)
- Move base to standoff
- Run inverse kinematics to find the exact position... to run the next step
- move to object
- pick object (close gripper)
- move back above in standoff
- await further instructions
- Drop object (open gripper /bot/gripper/open)
- Go back to home position
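A hedged sketch of the pick/drop sequence above as a plain ordered script, with a closed-form 2-link planar IK standing in for the inverse kinematics step. /bot/gripper/open comes from this file; the other topics, the link lengths, and the fixed sleep (a crude stand-in for motion feedback) are placeholders.

```python
# Pick sequence: standoff -> IK -> move to object -> close gripper ->
# back to standoff -> (on command) open gripper and go home.
import json
import math
import time
import paho.mqtt.client as mqtt

client = mqtt.Client()
client.connect("localhost")

def publish(topic, payload=None):
    client.publish(topic, json.dumps(payload or {}))
    time.sleep(2.0)  # placeholder for the motion-feedback TODO above

def solve_ik(x, z, l1=0.12, l2=0.12):
    # Closed-form IK for a 2-link planar arm (placeholder link lengths);
    # returns shoulder and elbow angles in radians.
    c2 = (x * x + z * z - l1 * l1 - l2 * l2) / (2 * l1 * l2)
    c2 = max(-1.0, min(1.0, c2))  # clamp unreachable targets
    elbow = math.acos(c2)
    shoulder = math.atan2(z, x) - math.atan2(l2 * math.sin(elbow),
                                             l1 + l2 * math.cos(elbow))
    return shoulder, elbow

def pick_object(obj_x, obj_z, standoff=0.10):
    publish("/bot/arm/goto", {"joints": solve_ik(obj_x, obj_z + standoff)})  # standoff above object
    publish("/bot/arm/goto", {"joints": solve_ik(obj_x, obj_z)})             # move to object
    publish("/bot/gripper/close")                                            # pick (close gripper)
    publish("/bot/arm/goto", {"joints": solve_ik(obj_x, obj_z + standoff)})  # back to standoff, await

def drop_and_go_home():
    publish("/bot/gripper/open")   # drop object
    publish("/bot/base/home")      # go back to home position (placeholder topic)
```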
- For each pair of main steps:
- Generate a list of detailed robot configurations (number of positions per step TBD, will do trial and error)
- For each robot config step, move joints and wheels (by position or by velocity) to the next configuration (end effector + mobile base); see the interpolation sketch after this block
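A small sketch of generating the intermediate configurations between two main steps by linear interpolation; the configuration layout (arm joints + base x, y, phi), move_to_config, and n_steps are assumptions, with n_steps being the trial-and-error knob.

```python
# Linearly interpolate between two full robot configurations and command
# each intermediate configuration in turn.
def interpolate_configs(start, goal, n_steps=10):
    # start and goal are equal-length lists, e.g. [j1, j2, j3, j4, x, y, phi].
    path = []
    for i in range(1, n_steps + 1):
        t = i / n_steps
        path.append([a + t * (b - a) for a, b in zip(start, goal)])
    return path

def execute(start, goal, move_to_config, n_steps=10):
    # move_to_config is a placeholder for the call that drives the joints and
    # wheels to one configuration and waits for motion feedback.
    for config in interpolate_configs(start, goal, n_steps):
        move_to_config(config)
```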
- Need to create a container to run demo Python code to validate it can receive code and use Azure for TTS. Review the containers offered on-prem to see if one works on a Pi or Jetson (TTS sketch after this list).
- Need to test Azure STT.
- Need to test LUIS and LUIS in a container.
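A quick hedged sketch of the Azure TTS check using the Speech SDK for Python (azure-cognitiveservices-speech); the key/region environment variables are assumptions, and the container test would point the config at the on-prem endpoint instead of the cloud region.

```python
# Minimal Azure TTS smoke test with the Speech SDK (cloud endpoint; swap in
# the container host/endpoint for the on-prem variant).
import os
import azure.cognitiveservices.speech as speechsdk

speech_config = speechsdk.SpeechConfig(
    subscription=os.environ["SPEECH_KEY"],   # assumed env var
    region=os.environ["SPEECH_REGION"],      # assumed env var
)
synthesizer = speechsdk.SpeechSynthesizer(speech_config=speech_config)

result = synthesizer.speak_text_async("Robot online.").get()
print(result.reason)  # expect ResultReason.SynthesizingAudioCompleted
```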
- TBD - add personality module