A PyTorch implementation of "LASAFT-Net-v2: Listen, Attend and Separate by Attentively aggregating Frequency Transformation".


LASAFT-Net-v2

Note: I modified the installation dependencies; they are in kino_requirements.txt.

Listen, Attend and Separate by Attentively aggregating Frequency Transformation

Woosung Choi, Yeong-Seok Jeong, Jinsung Kim, Jaehwa Chung, Soonyoung Jung, and Joshua D. Reiss

Demonstration (under construction)

Experimental Results

  • MUSDB18 (SDR, dB)

| model | vocals | drums | bass | other | AVG |
|---|---|---|---|---|---|
| Meta-TasNet | 6.40 | 5.91 | 5.58 | 4.19 | 5.52 |
| AMSS-Net | 6.78 | 5.92 | 5.10 | 4.51 | 5.58 |
| LaSAFT-Net-v1 | 7.33 | 5.68 | 5.63 | 4.87 | 5.88 |
| LASAFT-Net-v2 | 7.57 | 6.13 | 5.28 | 4.87 | 5.96 |
| model | model type | vocals | drums | bass | other | AVG |
|---|---|---|---|---|---|---|
| KUIELab-MDX-Net | dedicated (1 source / 1 model) | 8.901 | 7.173 | 7.232 | 5.636 | 7.236 |
| LaSAFT-Net-v1 (light) | conditioned (4 sources / 1 model) | 7.275 | 5.935 | 5.823 | 4.557 | 5.897 |
| LASAFT-Net-v2 (light) | conditioned (4 sources / 1 model) | 7.324 | 5.976 | 5.884 | 4.642 | 5.957 |

How to reproduce

1. Environment

  • Ubuntu 20.04
  • wandb for logging

You must create a .env file by copying .env.sample to set the environment variables; a short sketch of how these variables can be loaded in Python follows the list below.

```
wandb_api_key=[Your Key] # "xxxxxxxxxxxxxxxxxxxxxxxx"
data_dir=[Your Path] # "/home/ielab/repos/musdbHQ"
```
  • about wandb_api_key
    • we currently only support wandb for logging.
    • to obtain your wandb_api_key, sign in to wandb, go to Settings, and copy your API key.
  • about data_dir
    • the absolute path of the directory where the datasets are stored
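The following is a minimal sketch of how these two variables can be read in Python using python-dotenv; the repository's actual loading code may differ, so treat the mechanism (and the use of python-dotenv) as an assumption rather than this project's API.

```python
# Minimal sketch (assumption: python-dotenv is installed); only the
# variable names wandb_api_key and data_dir come from .env.sample.
import os

from dotenv import load_dotenv

load_dotenv()  # parses the .env file found in or above the working directory

wandb_api_key = os.environ["wandb_api_key"]  # used to authenticate wandb logging
data_dir = os.environ["data_dir"]            # absolute path to the dataset directory

assert os.path.isabs(data_dir), "data_dir must be an absolute path"
```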

2. Installation (CUDA)

```
conda env create -f environment.yaml -n lasaftv2
conda activate lasaftv2
pip install -r requirements.txt
```
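
After installation, a quick sanity check can confirm that the new environment provides a CUDA-enabled PyTorch build before launching any training. This snippet is a generic check, not part of the repository:

```python
# Generic sanity check (not part of this repository): verify that the
# lasaftv2 conda environment exposes a CUDA-enabled PyTorch build.
import torch

print("torch version:", torch.__version__)
print("CUDA available:", torch.cuda.is_available())
if torch.cuda.is_available():
    print("device:", torch.cuda.get_device_name(0))
```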