This repository provides an implementation of the BERT model and runs model inference on Android devices.

Prerequisites:
- PyTorch 1.9.0 and torchvision 0.10.0 or later
- Python 3.8 or above
- Android PyTorch libraries pytorch_android_lite:1.9.0 and pytorch_android_torchvision:1.9.0
- Android Studio 4.0.1 or later
To test run the BERT Android app, follow the steps below:
You can train your own BERT-base model or download a pretrained BERT-base model file to the /app/src/main/assets
folder using the link here.
We recommend downloading the model from Hugging Face.
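If you train or fine-tune the model yourself, it has to be converted to the Lite Interpreter format (.ptl) before the app can load it. Below is a minimal conversion sketch, assuming the Hugging Face transformers package; the checkpoint name bert-base-uncased, the 128-token sequence length, and the output path are placeholders to adjust for your own model.

```python
# Hypothetical conversion script (not part of the repo): trace a Hugging Face
# BERT sequence-classification model and save it for the PyTorch Lite Interpreter.
import torch
from torch.utils.mobile_optimizer import optimize_for_mobile
from transformers import BertForSequenceClassification, BertTokenizer

# Placeholder checkpoint; substitute your fine-tuned BERT-base model.
tokenizer = BertTokenizer.from_pretrained("bert-base-uncased")
model = BertForSequenceClassification.from_pretrained("bert-base-uncased", torchscript=True)
model.eval()

# Trace with a dummy input shaped like what the app will feed the model.
example = tokenizer("I like reading.", return_tensors="pt",
                    padding="max_length", max_length=128, truncation=True)
traced = torch.jit.trace(model, example["input_ids"])

# Optimize for mobile and save in the Lite Interpreter (.ptl) format
# that LiteModuleLoader expects.
optimized = optimize_for_mobile(traced)
optimized._save_for_lite_interpreter("app/src/main/assets/BERT.ptl")
```

The saved file name must match the one passed to LiteModuleLoader in MainActivity.java (BERT.ptl here).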
Open the BERT Android project in Android Studio. Note that the app's build.gradle
file includes the following dependencies:
implementation 'org.pytorch:pytorch_android_lite:1.9.0'
implementation 'org.pytorch:pytorch_android_torchvision:1.9.0'
and in MainActivity.java, the following code is used to load the model:
mModule = LiteModuleLoader.load(MainActivity.assetFilePath(getApplicationContext(), "BERT.ptl"));
Select an Android emulator or device, then build and run the app. The demo screenshot is as follows:
Enter some text (e.g., "I like reading.") and tap the Start button to get the binary classification result.
Mode 2 (for advanced users)
The code can also run inference on the SST-2 dev set (a binary classification task). Check the commented parts of the code and adapt them to your requirements. The app gives no feedback until the inference is done, so watch the run console to make sure everything is working.
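If you want to sanity-check the model against the SST-2 dev set on your desktop before running Mode 2 on a device, a rough sketch follows. This is not the app's Mode 2 code; it assumes the Hugging Face transformers and datasets packages, and bert-base-uncased is again a placeholder for your fine-tuned checkpoint.

```python
# Hypothetical desktop check (not the app's Mode 2 code): evaluate a
# fine-tuned BERT classifier on the SST-2 dev (validation) set.
import torch
from datasets import load_dataset
from transformers import BertForSequenceClassification, BertTokenizer

# Placeholder checkpoint; an un-fine-tuned bert-base-uncased will score ~50%.
tokenizer = BertTokenizer.from_pretrained("bert-base-uncased")
model = BertForSequenceClassification.from_pretrained("bert-base-uncased")
model.eval()

dev_set = load_dataset("glue", "sst2", split="validation")
correct = 0
with torch.no_grad():
    for ex in dev_set:
        inputs = tokenizer(ex["sentence"], return_tensors="pt",
                           truncation=True, max_length=128)
        logits = model(**inputs).logits
        correct += int(logits.argmax(dim=-1).item() == ex["label"])
print(f"SST-2 dev accuracy: {correct / len(dev_set):.3f}")
```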
Read the tutorial here to learn the basics of PyTorch Android.
For more information on using Mobile Interpreter in Android, see the tutorial here.