A tool to find the complete video that a short, similar clip comes from, using CNNs, OpenCV, and k-d trees. This project was developed in ~9 hours for the 2019 Facebook Hackathon in Buenos Aires, Argentina.
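The actual pipeline lives in `generate_training.py` and `code.py`; the snippet below is only a minimal sketch of the general idea (per-frame CNN features indexed in a k-d tree, then nearest-neighbour lookup for the frames of an unknown clip). The feature extractor, file paths, and the majority-vote step are illustrative assumptions, not the project's code.

```python
import cv2
import numpy as np
from scipy.spatial import cKDTree

def frame_features(video_path, extract, step=10):
    """Sample every `step`-th frame and return one feature vector per sample.
    `extract` stands in for the CNN forward pass (AlexNet/VGG16 activations
    in the real project); the toy extractor below is just a placeholder."""
    cap = cv2.VideoCapture(video_path)
    feats, idx = [], 0
    while True:
        ok, frame = cap.read()
        if not ok:
            break
        if idx % step == 0:
            feats.append(extract(frame))
        idx += 1
    cap.release()
    return np.array(feats)

toy_extract = lambda frame: frame.mean(axis=(0, 1))  # placeholder for CNN features

# Build one k-d tree over the frame features of every reference video,
# remembering which video each feature vector came from.
reference_videos = ["videos/sprite.mp4", "videos/other_ad.mp4"]  # illustrative paths
all_feats, owners = [], []
for i, path in enumerate(reference_videos):
    feats = frame_features(path, toy_extract)
    all_feats.append(feats)
    owners.extend([i] * len(feats))
tree = cKDTree(np.vstack(all_feats))

# Match an unknown clip: nearest reference frame per query frame, then majority vote.
query_feats = frame_features("clip.mp4", toy_extract)
_, nearest = tree.query(query_feats, k=1)
best = np.bincount([owners[j] for j in nearest]).argmax()
print("Best match:", reference_videos[best])
```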
- Run `yarn install`
- Run `pip install -r requirements.txt`
- Create a folder `tf_models` and download the following models into it: http://www.cs.toronto.edu/~guerzhoy/tf_alexnet/bvlc_alexnet.npy and ftp://mi.eng.cam.ac.uk/pub/mttt2/models/vgg16.npy
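Both weight files are NumPy-pickled dictionaries mapping layer names to weight arrays, so a quick way to sanity-check the downloads is the snippet below (the loading code used inside this repo may look different):

```python
import numpy as np

# Both .npy files are Python-2-era pickled dicts of {layer name: weight arrays},
# hence encoding="latin1" and allow_pickle=True. This is only a sanity check.
alexnet = np.load("tf_models/bvlc_alexnet.npy", encoding="latin1", allow_pickle=True).item()
vgg16 = np.load("tf_models/vgg16.npy", encoding="latin1", allow_pickle=True).item()

for name, params in alexnet.items():
    print(name, [p.shape for p in params])
```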
- To generate the model database from the raw videos, run `python generate_training.py`
- To run the processing server: `FLASK_APP=code.py flask run` (a rough sketch of such an endpoint is shown after these steps)
- To run the public API: `sudo node server.js`
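The processing server referenced above is defined in `code.py`; the following is only a hypothetical sketch of what such an endpoint could look like. The route name, the `match_video` helper, and the response shape are assumptions, not taken from the repository.

```python
from flask import Flask, request, jsonify

app = Flask(__name__)

def match_video(path):
    # Hypothetical stand-in for the real lookup (CNN features + k-d tree query).
    return {"name": "sprite", "link": "https://example.com/video", "description": "..."}

@app.route("/process", methods=["POST"])
def process():
    clip = request.files["video"]       # short clip forwarded by the public API
    clip.save("/tmp/query.mp4")
    return jsonify(match_video("/tmp/query.mp4"))
```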
Then, the endpoint `http://localhost:3000/upload` expects an MP4 video in a FormData body, with the field name `video`. The endpoint returns an object with a link to the complete video:
    {
      "name": "sprite",
      "link": "https://drive.google.com/uc?export=download&id=16frYcF91IxDuHSseZwemWTvWwO3Ynr47",
      "description": "Sprite: Love wins"
    }
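As an example, the endpoint could be called from Python like this (a client-side sketch, assuming the API is running locally on port 3000 as described above):

```python
import requests

# Upload a short clip; the multipart field name must be "video".
with open("clip.mp4", "rb") as clip:
    resp = requests.post(
        "http://localhost:3000/upload",
        files={"video": ("clip.mp4", clip, "video/mp4")},
    )

match = resp.json()
print(match["name"], "->", match["link"])
```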
- Download the Expo app.
- Open https://expo.io/@ramadis/abracadabra
- Send the sample videos to your cellphone and load them through the app