
AtmaCup 17

Competition page: https://www.guruguru.science/competitions/24

Description

Training, inference, and ensembling code for AtmaCup 17.

Usage

  1. Clone this repository

  2. Download the datasets from the competition page

  3. Unzip the datasets and place them in the input directory

  4. Build and run the Docker container

    docker compose up -d
    # open a shell inside the local-dev container
    docker compose exec local-dev bash
  5. Run the following commands (a sketch of the ensemble step is shown after this list)

    make setup
    python -m src.exp.exp002.train && python -m src.exp.exp002.test
    python -m src.exp.exp004.train && python -m src.exp.exp004.test
    python -m src.exp.exp006.train && python -m src.exp.exp006.test
    python -m ensemble
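
The ensemble step blends the test predictions of the three experiments. The following is a minimal sketch of what such a mean-blend script could look like; the output directory, file names, and the pred column are illustrative assumptions, not the repository's actual layout.

    # Hypothetical mean-blend ensemble: a sketch of what `python -m ensemble`
    # might do. Paths, file names, and the "pred" column are assumptions.
    from pathlib import Path

    import pandas as pd

    EXP_DIRS = ["exp002", "exp004", "exp006"]  # experiments run above
    OUTPUT_DIR = Path("output")                # assumed prediction directory


    def main() -> None:
        # Load each experiment's test predictions (one CSV per experiment).
        preds = [
            pd.read_csv(OUTPUT_DIR / exp / "test_pred.csv")["pred"]
            for exp in EXP_DIRS
        ]
        # Average the three models' predictions and write a submission file.
        blended = sum(preds) / len(preds)
        pd.DataFrame({"pred": blended}).to_csv(
            OUTPUT_DIR / "submission.csv", index=False
        )


    if __name__ == "__main__":
        main()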

References

[1] takaito, [atmaCup17] Tutorial Notebook ① Training the DeBERTa Large model (LB: 0.9718 with a few tweaks), https://www.guruguru.science/competitions/24/discussions/4aee3312-ba40-4934-9035-22c7d2b51c09/

[2] nishimoto, How to improve accuracy in NLP competitions: lessons from the 2023-24 Kaggle competitions, https://zenn.dev/nishimoto/articles/974f2a445f9d74

[3] AI SHIFT, [AI Shift/Kaggle Advent Calendar 2022] Tips for fine-tuning BERT learned from Kaggle (Part 4): Adversarial Training, https://www.ai-shift.co.jp/techblog/2985