AR application for virtual try on of sneakers
View Demo
·
Report Bug
·
Request Feature
The state of the art in machine learning (ML) has achieved exceptional accuracy on many computer vision tasks solely by training models on photos. Building upon these successes and advancing 3D object understanding has great potential to power a wider range of applications, such as augmented reality, robotics, autonomy, and image retrieval.
Here we develop an AR application that overlays a 3D sneaker model on a user's existing shoes.
3D models can be downloaded from https://app.gazebosim.org/dashboard. Once downloaded, they can be transformed as shown below.
First run

./mediapipe/graphs/object_detection_3d/obj_parser/obj_cleanup.sh [INPUT_DIR] [INTERMEDIATE_OUTPUT_DIR]

and then run

bazel run -c opt mediapipe/graphs/object_detection_3d/obj_parser:ObjParser -- input_dir=[INTERMEDIATE_OUTPUT_DIR] output_dir=[OUTPUT_DIR]

INPUT_DIR should be the folder with the initial asset .obj files to be processed, and OUTPUT_DIR is the folder where the processed asset .uuu file will be placed.
Note: ObjParser combines all .obj files found in the given directory into a single .uuu animation file, ordering frames by sorting the filenames alphanumerically. The ObjParser directory inputs must also be given as absolute paths, not relative paths. See the parser utility library at
mediapipe/graphs/object_detection_3d/obj_parser/
for more details.
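Because ObjParser orders animation frames by an alphanumeric sort of the filenames, unpadded frame numbers can reorder the animation. A quick illustration (the filenames are hypothetical, not from the repo):

```shell
# Unpadded frame numbers sort out of order: frame10.obj comes
# before frame2.obj in an alphanumeric sort.
printf 'frame1.obj\nframe2.obj\nframe10.obj\n' | LC_ALL=C sort
# frame1.obj
# frame10.obj
# frame2.obj

# Zero-padding the numbers restores the intended frame order.
printf 'frame01.obj\nframe02.obj\nframe10.obj\n' | LC_ALL=C sort
# frame01.obj
# frame02.obj
# frame10.obj
```

Zero-padding the frame numbers in your .obj filenames avoids this pitfall.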
- mediapipe
- Android
To get a local copy up and running follow these simple steps.
- mediapipe
- Clone the repo

git clone https://github.com/Princep/congenial-system.git
git submodule update --init
cd mediapipe

- Use mediapipe
- Replace the textures and models
#Update the model_scale and model_transformation
model_scale: [0.45, 0.55, 0.15]
model_transformation: [1.0, 0.0, 0.0, 0.0]
model_transformation: [1.0, 0.0, 1.0, -0.9]
model_transformation: [0.0, 0.0, 0.0, 1.0]
#Animation file
cp mediapipe/examples/android/src/java/com/google/mediapipe/apps/objectdetection3d/assets/sneaker/model.obj.uuu mediapipe/examples/android/src/java/com/google/mediapipe/apps/objectdetection3d/assets/box.obj.uuu
#Texture file
cp mediapipe/examples/android/src/java/com/google/mediapipe/apps/objectdetection3d/assets/sneaker/texture.jpg mediapipe/examples/android/src/java/com/google/mediapipe/apps/objectdetection3d/assets/texture.jpg
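After copying, a quick sanity check confirms the replaced files are in place before building (the assets path is taken from the cp commands above; the loop itself is just a sketch):

```shell
# Verify the replaced animation and texture files exist in the app's
# assets directory before building the APK.
ASSETS=mediapipe/examples/android/src/java/com/google/mediapipe/apps/objectdetection3d/assets
for f in box.obj.uuu texture.jpg; do
  if [ -f "$ASSETS/$f" ]; then
    echo "ok: $f"
  else
    echo "missing: $ASSETS/$f"
  fi
done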
- Prerequisites
cd mediapipe
# Add the Android SDK and NDK repositories to the WORKSPACE file
android_sdk_repository(name = "androidsdk", build_tools_version = "30.0.2")
android_ndk_repository(name = "androidndk", api_level = 20)
# Switch to OpenCV 4
sed -i -e 's:3.4.3/opencv-3.4.3:4.0.1/opencv-4.0.1:g' WORKSPACE
sed -i -e 's:libopencv_java3:libopencv_java4:g' third_party/opencv_android.BUILD
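To see what the first sed rewrite does, you can run it against a sample line (the URL below is illustrative, not the exact line from MediaPipe's WORKSPACE):

```shell
# The pattern swaps both the release directory and the archive prefix
# from OpenCV 3.4.3 to 4.0.1 in one pass.
echo 'path = "opencv/releases/3.4.3/opencv-3.4.3-android-sdk.zip"' \
  | sed -e 's:3.4.3/opencv-3.4.3:4.0.1/opencv-4.0.1:g'
# path = "opencv/releases/4.0.1/opencv-4.0.1-android-sdk.zip"
```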
- Build and install app
bazel build -c opt --config android_arm64 mediapipe/examples/android/src/java/com/google/mediapipe/apps/objectdetection3d:objectdetection3d
adb install -r bazel-bin/mediapipe/examples/android/src/java/com/google/mediapipe/apps/objectdetection3d/objectdetection3d.apk
- Run the app
- Alternatively, download the apk provided in Source code Archive.zip and allow the permissions required to install it.
- Point the camera directly at your feet. NOTE: Make sure your feet already have shoes or sandals on.
See the open issues for a list of proposed features (and known issues).
Contributions are what make the open source community such an amazing place to learn, inspire, and create. Any contributions you make are greatly appreciated.
- Fork the Project
- Create your Feature Branch (git checkout -b feature/AmazingFeature)
- Commit your Changes (git commit -m 'Add some AmazingFeature')
- Push to the Branch (git push origin feature/AmazingFeature)
- Open a Pull Request
Distributed under the MIT License. See LICENSE
for more information.
Prince Patel - @pp_spector - prince.patel.14@gmail.com
Project Link: https://github.com/PrinceP/congenial-system.git