Mouse trajectories from DeepLabCut as input data
Opened this issue · 22 comments
Hi,
I would like to use JAABA to assess mouse grooming behavior.
In our lab we use the open-source software DeepLabCut (DLC) to track mouse body parts, and all the input data required by JAABA and normally prepared by the PrepareJAABAData GUI can be extracted using DLC.
I was wondering how it would be possible to create adequate input for JAABA manually - not using the GUI.
Any suggestions?
Thanks in advance,
Yara.
Dear Yara,
I am trying something similar, but for scratching, and I am not using DLC at the moment because I thought it was too cumbersome to calculate a and b. I was wondering how you are labeling the mice with DLC in order to obtain the JAABA parameters a and b. Thanks a lot.
Cheers,
Augusto
Hi Augusto,
That is also something I am still not sure about.
I was actually wondering whether 'a' and 'b' are parameters of the mouse. Do they refer to the long and short axes of the mouse's body?
If so, I might have an idea: we track both ears of the mouse and the tail base (and the nose, but it is not always visible, so it wouldn't be helpful for these analyses). You can draw a triangle connecting these three body parts, and that should help calculate 'a' and 'b'.
Hi Yara,
According to the JAABA documentation (http://jaaba.sourceforge.net/DataFormatting.html#TracksFileStructure), a is 1/4 of the length of the long axis of an ellipse fitted to the mouse, while b is the same for the short axis. So in theory you would need both the length and the width of the mouse for every frame. Unless Mayank contradicts me, I don't know whether that triangle would be enough for JAABA to extract information later.
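As a rough sketch of that definition (assuming you already have a per-frame body length and width in pixels from your own tracking; the function name here is just illustrative), converting to JAABA's a and b is simply the quarter-axis convention from the docs:

```python
def jaaba_ellipse_params(length_px, width_px):
    """Convert a mouse's per-frame body length and width (pixels) to
    JAABA's a and b: one quarter of the major and minor ellipse axes."""
    return length_px / 4.0, width_px / 4.0

# e.g. a 120 px long, 40 px wide mouse:
a, b = jaaba_ellipse_params(120.0, 40.0)  # a = 30.0, b = 10.0
```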
But if you do that, from what I understand, you will be missing potential information that the detector could use to "learn" the behavior during training, right?
Thanks, that's what I expected.
And that's why I finally dropped the idea of using DLC. Maybe Yara can try setting the points at the head, tail base, right flank, and left flank. I don't know how well DLC would learn to differentiate the right and left flanks, but it might work; then calculating a and b would be straightforward.
That's an option.
However, if b is the distance between the ears, it will be pretty constant for mice, since that doesn't change with the pose of the animal, and so it would not add any important information from which to build a detector, right? But now that I think of it, a and b for flies also don't change much; their bodies are pretty rigid by definition. So how did you expect a and b to be useful for flies when you designed the code? Probably I am missing some understanding of how the code works.
Got it. For sure, I would rather have it than not.
And thanks for the heads up regarding APT, looking forward to the release!
Thanks for all the input. This all sounds very helpful.
I will try to use Mayank's suggestions for 'a' and 'b' with what I have for now, and see if it works well enough.
In any case, looking forward to using APT once released!
Yara.
Hi again,
I noticed that the current version has an option to import from APT. We have therefore moved to working with APT, but I was wondering which body parts I should track on a mouse so that 'a' and 'b' can be estimated correctly.
As a trial run we started by marking both ears and the tail base, in addition to the right and left flank as suggested here. Would that be enough for JAABA to convert these tracked parts into useful input?
Hi Mayank,
As I've mentioned in the beginning, I want to use JAABA to track events of grooming.
It does sound helpful to have the 'a' and 'b' parameters. I am wondering whether we need to change our direction and try to use Motr for this purpose. What do you think?
Hi Yara,
Mayank mentioned you were interested in grooming behavior and wondering about what points on the mouse to label. Motr was developed in my lab, and does an OK job tracking the centroid of an ellipse fit to the mouse, but as I'm sure you've noticed, mice are not ellipses! :)
At the moment, my workflow is to use Motr to track the centroids, and then use that as input to APT to get the nose, tail base, and left and right ears. I'm interested in social behavior and really want to know where the head is. The tricky bit about training any classifier for mouse body parts is how flexible the mouse body is. The parts I mention are the only ones that I'm really certain I can label consistently. However, if your interest is grooming, maybe some points on the paws, and something on the left and right flank?
Roian
P.S. In case it isn't clear, once I have the nose and tail, I can calculate 'a'. If you mark the left and right flank, that could give you something like 'b'.
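A minimal sketch of the P.S. above, assuming per-frame (x, y) pixel coordinates for the nose, tail base, and flanks. The quarter-axis scaling follows the JAABA docs' definition of a and b, and the orientation is the tail-base-to-nose angle in radians; treat this as an illustration, not JAABA's actual ellipse-fitting code:

```python
import math

def pose_to_jaaba_params(nose, tail_base, left_flank, right_flank):
    """Rough per-frame estimates from four labeled points:
    a     ~ quarter of the nose-to-tail-base distance,
    b     ~ quarter of the flank-to-flank distance,
    theta ~ body orientation in radians (tail base -> nose)."""
    a = math.dist(nose, tail_base) / 4.0
    b = math.dist(left_flank, right_flank) / 4.0
    theta = math.atan2(nose[1] - tail_base[1], nose[0] - tail_base[0])
    return a, b, theta

# Mouse facing along +x: nose at (100, 50), tail base at (20, 50).
a, b, theta = pose_to_jaaba_params((100, 50), (20, 50), (60, 40), (60, 60))
# a = 20.0, b = 5.0, theta = 0.0
```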
P.P.S. If you haven't already, it might be useful to look at the early work of Fentress (e.g. Golani & Fentress, 1985, Developmental Psychobiology 18(6): 529-544). That focuses on pups, but is very thorough. If one could have any labeled points, it seems like it would be nice to have: paws (enough points to show open vs. closed paw), back of the head (behind the ears), eyes (mid-head landmark), nose, base of tail, tip of tail, maybe midpoint of tail. Then, on the body, left outside hip and right outside hip would be good (but are often occluded), plus a couple of points down the spine. However, the great thing about machine learning in general, and therefore true of JAABA, is that you don't necessarily need to label the points involved in the behavior to train a good behavior classifier. Case in point: you can train a wing grooming classifier in JAABA with just labeled body parts and no wing labels. So, I guess my high-level suggestion would be to try labeling points that you feel you can label consistently, and then see how it does. Hope that helps!
Hi,
Thanks for your detailed response.
I am actually following this paper: https://doi.org/10.1016/j.jneumeth.2017.05.026 . They suggest using Motr to track the mouse and then training a JAABA classifier to detect grooming behaviour. Since we had already used DeepLabCut to track the mouse body parts I mentioned before, I was wondering whether I could use that as input instead of trajectories from Motr.
I'm finding it a little too complicated to try and optimise what we get from DLC to fit the requirements of the JAABA classifier, and that's why I thought I might just try using Motr, since it seems to work well enough for this purpose.
Basically, these are the input parameters that are keeping me from being able to use DLC for my purpose:
- theta: is it supposed to be in radians or degrees?
- id: the identity number of the trajectory. What does this mean?
- and, most importantly, 'b': how to estimate b (the short axis of the mouse) using DLC?
After trying to use APT as suggested, I found it is not working well enough for us. I am trying to optimise its performance, but this is taking too long. That's why my plan now is to use what I have from DLC, entering 'b' as the distance between the ears; once I get answers to the issues above, I will try to train a classifier and see whether it is successful.
If not, I might consider using Motr as suggested in the paper.
Hi @Yaraat92 ,
Part of the info you need is published on the Motr website, in particular here:
- theta: radians.
- id: the ID of the mouse, in case you have more than one mouse in the cage.
- For the b calculation, you would need to label one point on the right and one point on the left side of the animal's body, at the level of the body center. Then just calculate it as explained in the Motr docs ("m_afB: length of semi-minor axis of ellipse in pixels"). I am pretty confident that DLC will learn to label those points accurately.
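For completeness, here is a sketch of how one might write a single-mouse trx file for JAABA from DLC-derived arrays using scipy. The field names follow the JAABA data-formatting page linked earlier in the thread, but the values (fps, pxpermm, and the synthetic trajectories) are placeholders; double-check the required fields against your JAABA version before relying on this:

```python
import numpy as np
from scipy.io import savemat

# Placeholder per-frame arrays; in practice these come from DLC tracking.
nframes = 100
x = np.linspace(50, 150, nframes)   # centroid x, pixels
y = np.linspace(80, 120, nframes)   # centroid y, pixels
theta = np.zeros(nframes)           # orientation, radians
a = np.full(nframes, 30.0)          # quarter major axis, pixels
b = np.full(nframes, 10.0)          # quarter minor axis, pixels

fps = 30.0      # assumed frame rate
pxpermm = 5.0   # assumed pixels-per-mm calibration

trx = {
    'x': x, 'y': y, 'theta': theta, 'a': a, 'b': b,
    'x_mm': x / pxpermm, 'y_mm': y / pxpermm, 'theta_mm': theta,
    'a_mm': a / pxpermm, 'b_mm': b / pxpermm,
    'nframes': nframes, 'firstframe': 1, 'endframe': nframes,
    'off': 0,                              # off = 1 - firstframe
    'id': 1,                               # single-mouse trajectory
    'fps': fps,
    'dt': np.full(nframes - 1, 1.0 / fps), # seconds between frames
}
# savemat writes the dict as a 1x1 MATLAB struct named 'trx'.
savemat('trx.mat', {'trx': trx})
```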
Good luck!
A
Thanks a lot!! I will give it a try :)