# Avatarify

Avatars for Skype and Zoom. Democratized.
Disclaimer: This project is unrelated to Samsung AI Center.
- 17 April 2020. Created a Slack community. Please join via the invitation link.
- 15 April 2020. Added StyleGAN-generated avatars. Just press `Q` and you will drive a person who never existed. Every time you press the button, a new avatar is sampled.
- 13 April 2020. Added Windows support (kudos to 9of9).
- Requirements
- Install
- Setup avatars
- Run
- Controls
- Driving your avatar
- Configure video meeting app
- Contribution
- Troubleshooting
- Credits
## Requirements

To run Avatarify smoothly you need a CUDA-enabled (NVIDIA) video card. Otherwise it will fall back to the CPU and run very slowly. These are performance metrics for some hardware:
- GeForce GTX 1080 Ti: 33 fps
- GeForce GTX 1070: 15 fps
- Mac OSX (MacBook Pro 2018; no GPU): very slow ~1 fps
Of course, you also need a webcam!
## Install

### Download network weights

Download the model's weights from Dropbox, Mega, Yandex.Disk or Google Drive [716 MB, md5sum `46b26eabacbcf1533ac66dc5cf234c5e`].
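After downloading, you can verify the archive against the md5sum above. Here is a minimal Python sketch; the helper name and the assumption that the archive sits in the current directory are ours, not part of Avatarify's scripts:

```python
import hashlib

# Expected checksum of vox-adv-cpk.pth.tar, taken from the download note above.
EXPECTED_MD5 = "46b26eabacbcf1533ac66dc5cf234c5e"

def md5_of_file(path, chunk_size=8192):
    """Compute the md5 hex digest of a file, reading it in chunks."""
    digest = hashlib.md5()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(chunk_size), b""):
            digest.update(chunk)
    return digest.hexdigest()

# md5_of_file("vox-adv-cpk.pth.tar") should equal EXPECTED_MD5
```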
### Linux

Linux uses `v4l2loopback` to create a virtual camera.
1. Install CUDA.
2. Download Miniconda Python 3.7 and install it:

```bash
bash Miniconda3-latest-Linux-x86_64.sh
```

3. Clone `avatarify` and install its dependencies (sudo privilege is required):

```bash
git clone https://github.com/alievk/avatarify.git
cd avatarify
bash scripts/install.sh
```

4. Download network weights and place the `vox-adv-cpk.pth.tar` file in the `avatarify` directory (don't unpack it).
### Mac

**(!) Note**: we found that in versions after v4.6.8 (March 23, 2020) Zoom disabled support for virtual cameras on Mac. To use Avatarify in Zoom you can choose one of two options:

- Install Zoom v4.6.8, which is the last version that supports virtual cameras.
- Use the latest version of Zoom, but disable library validation:

```bash
codesign --remove-signature /Applications/zoom.us.app
```

On Mac it's quite difficult to create a virtual camera, so we'll use the CamTwist app.
1. Install Miniconda Python 3.7.
2. Clone `avatarify` and install its dependencies:

```bash
git clone https://github.com/alievk/avatarify.git
cd avatarify
bash scripts/install_mac.sh
```

3. Download network weights and place the `vox-adv-cpk.pth.tar` file in the `avatarify` directory (don't unpack it).
4. Download and install CamTwist from here. It's easy.
### Windows

Video tutorial is coming!

This guide is tested on Windows 10.
1. Install CUDA.
2. Install Miniconda Python 3.7.
3. Install Git.
4. Press the Windows button and type "miniconda". Run the suggested Anaconda Prompt.
5. Download and install Avatarify (please copy-paste these commands and don't change them):

```bash
git clone https://github.com/alievk/avatarify.git
cd avatarify
scripts\install_windows.bat
```

6. Download network weights and place the `vox-adv-cpk.pth.tar` file in the `avatarify` directory (don't unpack it).
7. Run `run_windows.bat`. If the installation was successful, two windows, "cam" and "avatarify", will appear. Leave these windows open for the next installation steps. If there are multiple cameras (including virtual ones) in the system, you may need to select the correct one: open `scripts/settings_windows.bat` and edit the `CAMID` variable. `CAMID` is the index number of a camera, like 0, 1, 2, ...
8. Install OBS Studio for capturing Avatarify output.
9. Install the VirtualCam plugin. Choose `Install and register only 1 virtual camera`.
10. Run OBS Studio.
11. In the Sources section, press the Add button ("+" sign), select Windows Capture and press OK. In the window that appears, choose "[python.exe]: avatarify" in the Window drop-down menu and press OK. Then select Edit -> Transform -> Fit to screen.
12. In OBS Studio, go to Tools -> VirtualCam. Check AutoStart, set Buffered Frames to 0 and press Start.
13. Now the `OBS-Camera` camera should be available in Zoom (or other videoconferencing software).

Steps 11-12 are required only once during setup.
## Setup avatars

Avatarify comes with a standard set of avatars of famous people, but you can extend this set by simply copying your avatars into the `avatars` folder.

Follow this advice for better visual quality:

- Make a square crop of your avatar picture.
- Crop the avatar's face so that it's neither too close nor too far. Use the standard avatars as a reference.
- Prefer pictures with a uniform background. It will diminish visual artifacts.
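To illustrate the square-crop advice, here is a hedged sketch of a centered square crop on a raw pixel grid (pure Python for clarity; for real images you would use an image editor or library):

```python
def center_square_crop(pixels):
    """Crop a rows-by-cols pixel grid to a centered square.

    `pixels` is a list of rows; returns a new grid whose side is
    min(height, width), centered on the original image.
    """
    height = len(pixels)
    width = len(pixels[0])
    side = min(height, width)
    top = (height - side) // 2
    left = (width - side) // 2
    return [row[left:left + side] for row in pixels[top:top + side]]

# A 2x4 "image" becomes a 2x2 centered square:
# center_square_crop([[1, 2, 3, 4], [5, 6, 7, 8]]) -> [[2, 3], [6, 7]]
```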
## Run

Your webcam must be plugged in.

Note: run Skype or Zoom only after Avatarify is started.

### Linux

It is assumed that there is only one webcam connected to the computer, at `/dev/video0`. The run script will create the virtual camera `/dev/video9`. You can change these settings in `scripts/settings.sh`.

You can use the command `v4l2-ctl --list-devices` to list all devices in your system. For example, if the web camera is `/dev/video1`, then the device id is 1.
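As an illustration of how the device id relates to the device path, here is a small helper (hypothetical, not part of Avatarify's scripts):

```python
import re

def camera_id_from_device(path):
    """Extract the numeric camera id from a V4L2 device path.

    '/dev/video1' -> 1, which is the value CAMID expects.
    Raises ValueError for paths that are not /dev/videoN.
    """
    match = re.fullmatch(r"/dev/video(\d+)", path)
    if match is None:
        raise ValueError(f"not a V4L2 video device: {path}")
    return int(match.group(1))

# camera_id_from_device("/dev/video1") -> 1
```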
Run:

```bash
bash run.sh
```

The `cam` and `avatarify` windows will pop up. The `cam` window is for controlling your face position and `avatarify` is for the avatar animation preview. Please follow these recommendations to drive your avatars.
### Mac

1. Run:

```bash
bash run_mac.sh
```

2. Go to CamTwist.
3. Choose `Desktop+` and press `Select`.
4. In the `Settings` section choose `Confine to Application Window` and select `python (avatarify)` from the drop-down menu.

The `cam` and `avatarify` windows will pop up. The `cam` window is for controlling your face position and `avatarify` is for the avatar animation preview. Please follow these recommendations to drive your avatars.
### Windows

If there are multiple cameras (including virtual ones) in your system, you may need to select the correct one in `scripts/settings_windows.bat`. Open this file and edit the `CAMID` variable. `CAMID` is the index number of a camera, like 0, 1, 2, ...

1. In Anaconda Prompt:

```bash
cd C:\path\to\avatarify
run_windows.bat
```

2. Run OBS Studio. It should automatically start streaming video from Avatarify to `OBS-Camera`.

The `cam` and `avatarify` windows will pop up. The `cam` window is for controlling your face position and `avatarify` is for the avatar animation preview. Please follow these recommendations to drive your avatars.
Note: To reduce video latency, in OBS Studio right click on the preview window and uncheck Enable Preview.
## Controls

| Keys | Controls |
|---|---|
| 1-9 | Immediately switch between the first 9 avatars. |
| Q | Turn on a StyleGAN-generated avatar. Every time you press the button, a new avatar is sampled. |
| 0 | Toggle avatar display on and off. |
| A/D | Previous/next avatar in folder. |
| W/S | Zoom camera in/out. |
| Z/C | Adjust avatar target overlay opacity. |
| X | Reset reference frame. |
| F | Toggle reference frame search mode. |
| R | Mirror reference window. |
| T | Mirror output window. |
| I | Show FPS. |
| ESC | Quit. |
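The table above can be read as a simple key-to-action dispatch. Here is a hedged sketch of such a mapping; all the action names are illustrative assumptions, not Avatarify's actual code:

```python
# Illustrative key-to-action mapping mirroring the controls table.
# Action names are hypothetical; Avatarify's real handler differs.
KEYMAP = {
    **{str(n): f"switch_to_avatar_{n}" for n in range(1, 10)},
    "q": "sample_stylegan_avatar",
    "0": "toggle_avatar_display",
    "a": "previous_avatar",
    "d": "next_avatar",
    "w": "zoom_in",
    "s": "zoom_out",
    "z": "overlay_opacity_down",
    "c": "overlay_opacity_up",
    "x": "reset_reference_frame",
    "f": "toggle_reference_search",
    "r": "mirror_reference_window",
    "t": "mirror_output_window",
    "i": "show_fps",
}

def dispatch(key):
    """Look up the action bound to a key, or None if unbound."""
    return KEYMAP.get(key.lower())
```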
## Driving your avatar

These are the main principles for driving your avatar:

- Align your face in the camera window as closely as possible in proportion and position to the target avatar. Use the zoom in/out function (W/S keys). Once aligned, hit 'X' to use this frame as the reference for driving the rest of the animation.
- Use the overlay function (Z/C keys) to match your face expression with the avatar's as closely as possible.

Alternatively, you can hit 'F' and the software will attempt to find a better reference frame itself. This will slow down the framerate, but while it is happening you can keep moving your head around: the preview window will flash green when it finds a facial pose that is a closer match to the avatar than the one it is currently using. You will also see two numbers displayed: the first is how closely you are currently aligned to the avatar, and the second is how closely the reference frame is aligned.

You want to get the first number as small as possible; around 10 is usually a good alignment. When you are done, press 'F' again to exit reference frame search mode.

You don't need to be exact, and some other configurations can still yield better results, but this is usually a good starting point.
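The alignment numbers shown in search mode can be thought of as a pose-distance score. Here is a hedged pure-Python sketch of one such metric, the mean Euclidean distance between corresponding facial keypoints; the metric Avatarify actually uses may differ:

```python
import math

def pose_distance(keypoints_a, keypoints_b):
    """Mean Euclidean distance between two equal-length lists of (x, y) keypoints.

    Smaller is better: a low score means the driving face closely
    matches the avatar's pose.
    """
    if len(keypoints_a) != len(keypoints_b):
        raise ValueError("keypoint lists must have the same length")
    total = sum(math.dist(p, q) for p, q in zip(keypoints_a, keypoints_b))
    return total / len(keypoints_a)

def is_better_reference(candidate, reference, avatar):
    """A frame is a better reference if it is closer to the avatar's pose."""
    return pose_distance(candidate, avatar) < pose_distance(reference, avatar)
```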
## Configure video meeting app

### Skype

Go to Settings -> Audio & Video and choose the `avatarify` (Linux), `CamTwist` (Mac) or `OBS-Camera` (Windows) camera.

### Zoom

Go to Settings -> Video and choose `avatarify` (Linux), `CamTwist` (Mac) or `OBS-Camera` (Windows) from the Camera drop-down menu.

### Slack

Make a call, allow the browser to use your cameras, click on the Settings icon, and choose `avatarify` (Linux), `CamTwist` (Mac) or `OBS-Camera` (Windows) in the Video settings drop-down menu.
## Contribution

Our goal is to democratize deepfake avatars. To make the technology even more accessible, we have to tackle two major problems:

- Add support for more platforms (Linux and Mac are already supported).
- Optimize neural network run-time. Running the network in real time on a CPU is a high priority.

Please make pull requests if you have any improvements or bug fixes.
## Troubleshooting

- My avatar is distorted: please follow these recommendations for avatar driving.
- Zoom/Skype doesn't see the `avatarify` camera: restart Zoom/Skype and try again.
- The avatar image is frozen: in Zoom, try Stop and Start Video.
- `bash run_mac.sh` crashes with "Cannot open camera": try changing `CAMID` in `run_mac.sh` from `0` to `1`, `2`, ...
- `pipe:0: Invalid data found when processing input`: make sure `CAMID` in `scripts/settings.sh` is correct. Use `v4l2-ctl --list-devices` to query available devices.
- `ASSERT: "false" in file qasciikey.cpp, line 501`: if you have several keyboard layouts, switch to the English layout.
- `No such file or directory: 'vox-adv-cpk.pth.tar'`: please follow the instructions in Download network weights.
## Credits

- Avatarify uses the First Order Motion Model for generating avatars.