KORani

  • KORani: Large Language Models for πŸ‡°πŸ‡· Korean and πŸ‡ΊπŸ‡Έ English using LLaMA 13B and Polyglot 12.8B.
  • Tested which LLM is effective for πŸ‡°πŸ‡· Korean tasks after finetuning.
  • πŸ€— You can download the weights from the Link.

Release

This repository contains inference code for the KORani models, which are based on LLaMA 13B and Polyglot 12.8B. The KORani models are finetuned on the ShareGPT and KoVicuna datasets. This work is heavily influenced by the Vicuna project.

Models

We offer the following three models.

| Model | Base | Train dataset | Huggingface Link |
|---|---|---|---|
| 1️⃣ KORani-v1-13B | Polyglot 12.8B | KoVicuna dataset | Link 1 |
| 2️⃣ KORani-v2-13B | LLaMA 13B | KoVicuna dataset | Link 2 |
| 3️⃣ KORani-v3-13B | LLaMA 13B | ShareGPT & KoVicuna dataset | Link 3 |

Notes

  • We used LLaMA 13B from here.
  • We extracted only the data from Kovicuna that corresponds to the first and second parts of the conversation, which are 'human' and 'GPT'.
  • The model finetuning was conducted on eight A100 40GB GPUs. The code used for training is based on the Fastchat.

Local Setup

  1. Install dependencies
    pip install -r requirements.txt

How to use

  1. Prepare your prompt at prompts/{task_name}.txt
  2. Run inference.py
python inference.py --model_path MODEL_NAME --task TASK_NAME

Command

--model_path (str): path or Hugging Face ID of the model to evaluate (e.g. KRAFTON/KORani-v3-13B).
--task (str): the task to evaluate; only [QA, summarization, translation] are available in this repo.
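If you prefer to call a model directly rather than through inference.py, the checkpoints on the Hugging Face Hub can be loaded with the standard transformers API. The snippet below is a minimal sketch, not the exact code in inference.py; the decoding settings are illustrative.

# Minimal sketch: load a KORani checkpoint and generate a completion.
# The decoding settings are illustrative, not necessarily those used by inference.py.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

MODEL_NAME = "KRAFTON/KORani-v3-13B"

tokenizer = AutoTokenizer.from_pretrained(MODEL_NAME)
model = AutoModelForCausalLM.from_pretrained(
    MODEL_NAME,
    torch_dtype=torch.float16,  # 13B parameters in fp16 need roughly 26 GB of GPU memory
    device_map="auto",
)

prompt = "### Instruction: Translate English sentence into Korean.\nEnglish: Did you finish your work?\nKorean:"
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
output_ids = model.generate(**inputs, max_new_tokens=128, do_sample=False)
completion = tokenizer.decode(output_ids[0][inputs["input_ids"].shape[1]:], skip_special_tokens=True)
print(completion)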

Examples

You can check how the evaluation scores in the tables below are computed in this repository: https://github.com/krafton-ai/AutoEvalGPT

1. Question Answering (QA)

python inference.py --model_path "KRAFTON/KORani-v3-13B" --task "QA"

This is the prompt for the QA task. You can modify it in prompts/QA.txt.

PROMPT = """μš°λ¦¬λŠ” μ•„λž˜μ™€ 같은 정보λ₯Ό κ°–κ³  μžˆμŠ΅λ‹ˆλ‹€.
---------------------
{context}
---------------------
### 주어진 정보에 따라, μ§ˆλ¬Έμ— λ‹΅ν•΄μ£Όμ„Έμš”.: '{question}'
### Assistant:"""

For example,

context = "헨리 κ΅¬μŠ€νƒ€ν”„ λͺ°λž˜μŠ¨(μ˜μ–΄: Henry Gustav Molaison, 1926λ…„ 2μ›” 26일 ~ 2008λ…„ 12μ›” 2일)은 λ‡Œμ „μ¦μ„ μΉ˜λ£Œν•˜κΈ° μœ„ν•΄μ„œ 수술적으둜 ν•΄λ§ˆλ₯Ό ν¬ν•¨ν•œ 내츑츑두엽이 제거된 미ꡭ의 κΈ°μ–΅μž₯μ•  ν™˜μž H.M으둜 μ „λΆ€ν„° 이미 μ•Œλ €μ Έ μžˆμ—ˆλ‹€. 
κ·ΈλŠ” 1957λ…„ 말뢀터 κ·Έκ°€ 죽을 λ•Œ κΉŒμ§€ κ·ΈλŠ” κ΄‘λ²”μœ„ν•˜κ²Œ 연ꡬ가 λ˜μ—ˆλ‹€. 그의 κ²½μš°λŠ” λ‡Œ κΈ°λŠ₯κ³Ό κΈ°μ–΅ μ‚¬μ΄μ˜ 링크λ₯Ό μ„€λͺ…ν•˜λŠ” 이둠 λ°œλ‹¬κ³Ό λ‡ŒμΈμ§€μ‹¬λ¦¬ν•™μ˜ λ°œλ‹¬, λ‡Œμ˜ ꡬ쑰와 κΈ°λŠ₯이 νŠΉμ • 심리적 κ³Όμ •κ³Ό κ΄€ν•œ 이해λ₯Ό λͺ©ν‘œλ‘œ ν•˜λŠ” μ‹¬λ¦¬ν•™μ˜ ν•œ 지점에 맀우 μ€‘μš”ν•œ 역할을 ν•˜μ˜€λ‹€. 
κ·Έκ°€ μ£½κΈ° 전에, κ·ΈλŠ” μ½”λ„€ν‹°μ»· μ£Ό μœˆμ € λ½μŠ€μ— μœ„μΉ˜ν•œ 관리기관에 μ‚΄λ©΄μ„œ μ§€μ†μ μœΌλ‘œ 쑰사가 λ˜μ—ˆλ‹€. 
그의 λ‡ŒλŠ” μ§€κΈˆ 2009λ…„ 12μ›” 4일에 μ‘°μ§ν•™μ μœΌλ‘œ μŠ¬λΌμ΄μŠ€λ˜μ–΄ λ―Έκ΅­ μƒŒλ””μ—μ΄κ³ μ— μžˆλ‹€."
question = "헨리 κ΅¬μŠ€νƒ€ν”„μ˜ λ‡ŒλŠ” ν˜„μž¬ 어디에 μœ„μΉ˜ν•΄ μžˆλŠ”κ°€?"
| Model | Score (averaged over 5 examples) | Output Example |
|---|---|---|
| GPT-4 | - | 헨리 κ΅¬μŠ€νƒ€ν”„μ˜ λ‡ŒλŠ” ν˜„μž¬ λ―Έκ΅­ μƒŒλ””μ—μ΄κ³ μ— μœ„μΉ˜ν•΄ μžˆλ‹€. |
| GPT-3.5-turbo | 7.1 | 헨리 κ΅¬μŠ€νƒ€ν”„μ˜ λ‡ŒλŠ” μ§€κΈˆ 2009λ…„ 12μ›” 4일에 μ‘°μ§ν•™μ μœΌλ‘œ μŠ¬λΌμ΄μŠ€λ˜μ–΄ λ―Έκ΅­ μƒŒλ””μ—μ΄κ³ μ— μžˆλ‹€. |
| Vicuna-13B | 7.2 | 헨리 κ΅¬μŠ€νƒ€ν”„μ˜ λ‡ŒλŠ” 2009λ…„ 12μ›” 4일에 μ‘°μ§ν•™μ μœΌλ‘œ μŠ¬λΌμ΄μŠ€λ˜μ–΄ λ―Έκ΅­ μƒŒλ””μ—μ΄κ³ μ— μžˆμŠ΅λ‹ˆλ‹€. |
| KoAlpaca-13B | 2.6 | Henry Gustav Molaison의 λ‡ŒλŠ” 2008년에 μ‚¬λ§ν•˜μ˜€μŠ΅λ‹ˆλ‹€. 그의 λ‡ŒλŠ” μ½”λ„€ν‹°μ»· μ£Ό μœˆμ € λ½μŠ€μ— μœ„μΉ˜ν•œ 기관에 λ³΄κ΄€λ˜μ–΄ 있으며, ν˜„μž¬λ„ 계속 μ‘°μ‚¬μ€‘μž…λ‹ˆλ‹€. |
| KORani-v1 | 6.6 | 헨리 κ΅¬μŠ€νƒ€ν”„ λͺ°λ ˆμ΄μŠ¨μ˜ λ‡ŒλŠ” ν˜„μž¬ λ―Έκ΅­ μƒŒλ””μ—μ΄κ³ μ— μžˆμŠ΅λ‹ˆλ‹€. 그의 λ‡ŒλŠ” 1957λ…„ 수술둜 제거된 그의 λ‚΄μΈ‘ μΈ‘λ‘μ—½μ˜ 자리λ₯Ό ν™•μΈν•˜κΈ° μœ„ν•΄ μ‘°μ§ν•™μ μœΌλ‘œ μ ˆλ‹¨λ˜μ—ˆμŠ΅λ‹ˆλ‹€. 이 μ—°κ΅¬λŠ” λ‡Œμ˜ ꡬ쑰와 κΈ°λŠ₯이 νŠΉμ • 심리적 κ³Όμ •κ³Ό κ΄€λ ¨λœ 이해λ₯Ό λͺ©ν‘œλ‘œ ν•˜λŠ” μ‹¬λ¦¬ν•™μ˜ ν•œ 뢄야인 인지 신경과학에 λŒ€ν•œ 그의 κ³΅ν—Œμ„ 기리기 μœ„ν•œ κ²ƒμž…λ‹ˆλ‹€. |
| KORani-v2 | 8.0 | 헨리 κ΅¬μŠ€νƒ€ν”„μ˜ λ‡ŒλŠ” ν˜„μž¬ μΊ˜λ¦¬ν¬λ‹ˆμ•„ μƒŒλ””μ—μ΄κ³ μ— μžˆλŠ” μƒŒλ””μ—μ΄κ³  λŒ€ν•™κ΅ μ›¨μŠ€νŠΈ 캠퍼슀의 μ‹ κ²½κ³Όν•™ μ‹€ν—˜μ‹€μ— λ³΄κ΄€λ˜μ–΄ μžˆμŠ΅λ‹ˆλ‹€. |
| KORani-v3 | 6.4 | 헨리 κ΅¬μŠ€νƒ€ν”„μ˜ λ‡ŒλŠ” ν˜„μž¬ μ½”λ„€ν‹°μ»· μ£Ό μœˆμ € λ½μŠ€μ— μžˆλŠ” 관리 기관에 μžˆμŠ΅λ‹ˆλ‹€. |

2. Translation

python inference.py --model_path "KRAFTON/KORani-v3-13B" --task "translation"

This is the prompt for translation. You can modify it in prompts/translation.txt.

PROMPT = """### Instruction: Translate English sentence into Korean. You may leave specific names as they are.
English: How many languages can you speak?
Korean: μ–Όλ§ˆλ‚˜ λ§Žμ€ μ–Έμ–΄λ₯Ό 말할 수 μžˆλ‹ˆ?#
English: Did you finish your work?
Korean: 일 λ‹€ 끝났어?#
English: {source_sentence} 
Korean:"""

For example,

source_sentence = "Hey! I have some Kissflow Legal Review requests that are blocked by Tax and Accounting, can this be looked at?"
| Model | Score (averaged over 5 examples) | Output Example |
|---|---|---|
| GPT-4 | - | 헀이! μ œκ°€ Tax and Accounting에 μ˜ν•΄ μ°¨λ‹¨λœ λͺ‡ 가지 Kissflow 법λ₯  κ²€ν†  μš”μ²­μ΄ μžˆλŠ”λ°, 이것을 확인해 μ£Όμ‹€ 수 μžˆλ‚˜μš”? |
| DeepL | 9.4 | μ•ˆλ…•ν•˜μ„Έμš”! 세무 및 νšŒκ³„μ—μ„œ μ°¨λ‹¨λœ Kissflow 법λ₯  κ²€ν†  μš”μ²­μ΄ λͺ‡ 개 μžˆλŠ”λ°μš”, 이 μš”μ²­μ„ μ‚΄νŽ΄λ³Ό 수 μžˆλ‚˜μš”? |
| GPT-3.5-turbo | 8.6 | μ•ˆλ…•ν•˜μ„Έμš”! 세무 및 νšŒκ³„ λΆ€μ„œμ—μ„œ μ°¨λ‹¨λœ λͺ‡ 가지 Kissflow Legal Review μš”μ²­μ΄ μžˆμŠ΅λ‹ˆλ‹€. 확인해 μ£Όμ‹€ 수 μžˆλ‚˜μš”? |
| Vicuna-13B | 3.8 | μ•ˆλ…•ν•˜μ„Έμš”! μ„ΈκΈˆκ³„μ‚°κ³Ό νšŒκ³„λΆ€μ„œκ°€ 차단해 μžˆλŠ” Kissflow Legal Review μš”μ²­μ΄ λͺ‡ κ°œκ°€ μžˆμŠ΅λ‹ˆκΉŒ? 이것을 μ‚΄νŽ΄λ³΄μ‹œκ² μŠ΅λ‹ˆκΉŒ? |
| KoAlpaca-13B | 5.6 | μ•Ό! μ„ΈκΈˆκ³Ό νšŒκ³„μ— λ§‰νžŒ ν‚€μ‹±ν”Œλ‘œμš° 법적 κ²€ν†  μš”μ²­μ΄ μžˆλŠ”λ°, 이거 검토해쀄 수 μžˆμ–΄? |
| KORani-v1 | 7.5 | μ•ˆλ…•ν•˜μ„Έμš”! μ„ΈκΈˆκ³Ό νšŒκ³„λ‘œ 인해 λ§‰ν˜€ μžˆλŠ” ν‚€μŠ€ν”Œλ‘œμš° 법λ₯  κ²€ν†  μš”μ²­μ΄ λͺ‡ 개 μžˆλŠ”λ°, κ²€ν† ν•΄ μ£Όμ‹€ 수 μžˆλ‚˜μš”? |
| KORani-v2 | 5.4 | μ•ˆλ…•ν•˜μ„Έμš”! μ œκ°€ Kissflow Legal Review μš”μ²­μ„ λͺ©κ²©ν–ˆλŠ”데, 세무 및 νšŒκ³„ λΆ€μ„œμ—μ„œ μ°¨λ‹¨ν•˜κ³  μžˆλŠ”λ° 이 문제λ₯Ό 쑰사해 μ£Όμ‹œκ² μ–΄μš”? |
| KORani-v3 | 7.1 | μ•ˆλ…•ν•˜μ„Έμš”! μ €λŠ” Kissflow Legal Review μš”μ²­μ΄ μ„ΈκΈˆκ³Ό νšŒκ³„μ— μ˜ν•΄ μ°¨λ‹¨λ˜κ³  μžˆλŠ”λ°, 이 λ¬Έμ œκ°€ μ‚΄νŽ΄λ³Ό 수 μžˆμ„κΉŒμš”? |

3. Summarization

python inference.py --model_path "KRAFTON/KORani-v3-13B" --task "summarization"

This is the prompt for summarization. You can modify it in prompts/summarization.txt. Keep in mind that the filled prompt must not exceed the maximum length of 2,048 tokens (a length check is sketched below the prompt).

PROMPT = """# Meeting note
{target_document}

# Summarize the meeting note into 3 Korean sentences.
### Output: 1)"""

For example,

target_document = """# Document
전년도 λŒ€λΉ„ 79λͺ… λŠ˜μ–΄ 1019λͺ…, ν–‰μ •μˆ˜μš” λŒ€μ²˜ κ΄‘μ–‘μ‹œμ˜ 곡무원 정원이 크게 λŠ˜μ–΄λ‚˜ ν–‰μ •μ„œλΉ„μŠ€ ν–₯상이 κΈ°λŒ€λœλ‹€. 
μ‹œλŠ” ν–‰μ •μ•ˆμ „λΆ€μ—μ„œ λ°œν‘œν•œ 2018년도 μžμΉ˜λ‹¨μ²΄ κΈ°μ€€μΈκ±΄λΉ„μ—μ„œ κ΄‘μ–‘μ‹œμ˜ 일반직 정원이 μ§€λ‚œν•΄λ³΄λ‹€ 79λͺ…이 λŠ˜μ–΄λ‚œ 1019λͺ…μœΌλ‘œ 산정됐닀고 λ°ν˜”λ‹€. 
μ§€λ‚œ 1995λ…„ 도농톡합 λ‹Ήμ‹œ 991λͺ…μ΄μ—ˆλ˜ κ΄‘μ–‘μ‹œ 곡무원 정원은 IMFμ‹œμ ˆμ— ν˜Ήλ…ν•œ ꡬ쑰쑰정을 κ±°μΉ˜λ©΄μ„œ 2002λ…„μ—λŠ” 788λͺ…μœΌλ‘œ 200μ—¬λͺ…이 κ°€κΉŒμ΄ μ€„μ–΄λ“€μ—ˆμœΌλ‚˜ 이번 정원 ν™•λ³΄λ‘œ 곡무원 정원 1000λͺ… μ‹œλŒ€λ₯Ό 맞게 됐닀. 
κ·Έλ™μ•ˆ κ΄‘μ–‘μ‹œλŠ” ν¬μŠ€μ½”λ₯Ό μ€‘μ‹¬μœΌλ‘œ ν•œ 산업단지와 μ»¨ν…Œμ΄λ„ˆλΆ€λ‘, κ²½μ œμžμœ κ΅¬μ—­, 택지 개발, λ‹€μ–‘ν•œ 볡지 μ •μ±… λ“± μ‹œλ―Όμ˜ μ‚Άμ˜ 질 ν–₯상을 μœ„ν•œ ν–‰μ •μˆ˜μš”κ°€ λ‚ λ‘œ μ¦ν­ν•˜λŠ” 데에 λΉ„ν•΄ ν•œμ •λœ 곡무원 μ •μ›μœΌλ‘œ λ§Žμ€ 어렀움을 κ²ͺμ–΄ μ™”μ—ˆλ‹€. 
μ‹œμ˜ 이번 정원 좩원은 μ§€κΈˆκΉŒμ§€ 격무에 μ‹œλ‹¬λ €μ˜¨ κ³΅λ¬΄μ›λ“€μ—κ²Œ λ”μš± μ—΄μ‹¬νžˆ 일할 수 μžˆλ„λ‘ ν™œλ ₯을 λΆˆμ–΄λ„£μœΌλ©΄μ„œ 지역과 λ‚˜λΌ λ°œμ „μ„ λ’·λ°›μΉ¨ ν•˜λŠ” κ²½μ œλ„μ‹œλ‘œμ„œμ˜ μœ„μƒμ„ λ“œλ†’μ΄λŠ” μ‹œλ„ˆμ§€ 효과둜 μ΄μ–΄μ§ˆ κ²ƒμœΌλ‘œ κΈ°λŒ€λœλ‹€. 
ν•œνŽΈ, κ΄‘μ–‘μ‹œλŠ” ν•œμ‹œκΈ°κ΅¬μΈβ€˜κΈ°μ—…μœ μΉ˜μΆ”μ§„λ‹¨β€™μ΄ 2017λ…„ μ—°λ§λ‘œ 폐지됨에 따라 μ „λΌλ‚¨λ„λ‘œλΆ€ν„° 4κΈ‰ μƒμ„€κΈ°κ΅¬μΈβ€˜μ‚°λ‹¨λ…Ήμ§€κ΄€λ¦¬μ„Όν„°β€™λ₯Ό 승인 λ°›μ•„ μ˜¬ν•΄ 1μ›”λΆ€ν„° μš΄μ˜ν•˜κ²Œ 됨으둜써 λ„μ‹œ κ°œλ°œμ— ν•œμΈ΅ 탄λ ₯을 받을 κ²ƒμœΌλ‘œ μ „λ§ν•˜κ³  μžˆλ‹€. 
정홍기 총무과μž₯μ€β€œμ°ΈμœΌλ‘œ μ–΄λ ΅κ²Œ κ°€μ Έμ˜¨ 곡무원 정원과 4κΈ‰ μƒμ„€κΈ°κ΅¬λŠ” 격무에 μ‹œλ‹¬λ €μ˜¨ μ§μ›λ“€μ˜ μˆ¨ν†΅μ„ 트이게 ν•΄ 15만 μ‹œλ―Όμ„ μœ„ν•œ ν–‰μ •μ„œλΉ„μŠ€ μ§ˆμ„ ν–₯μƒμ‹œν‚¬ κ²ƒμœΌλ‘œ κΈ°λŒ€λœλ‹€β€λ©°β€œμ΄λ‹¬ μ•ˆμ— 정원 μ‘°λ‘€λ₯Ό κ°œμ •ν•˜κ³ , 직무뢄석을 톡해 λΆ€μ„œλ³„ 정원을 μ‘°μ •ν•  κ³„νšμ΄λ‹€β€κ³  λ§ν–ˆλ‹€.

ν•œκ΅­μ–΄ 3λ¬Έμž₯으둜 μœ„μ˜ Documentλ₯Ό μš”μ•½ν•΄μ€˜.
### Assistant: 1)"""
| Model | Score (averaged over 5 examples) | Output Example |
|---|---|---|
| GPT-4 | - | 1) κ΄‘μ–‘μ‹œμ˜ 곡무원 정원이 전년도 λŒ€λΉ„ 79λͺ… λŠ˜μ–΄λ‚œ 1019λͺ…μœΌλ‘œ μ¦κ°€ν–ˆλ‹€. 2) 이둜 인해 ν–‰μ •μ„œλΉ„μŠ€ ν–₯상이 κΈ°λŒ€λ˜λ©°, 지역과 λ‚˜λΌ λ°œμ „μ„ λ’·λ°›μΉ¨ν•˜λŠ” κ²½μ œλ„μ‹œλ‘œμ„œμ˜ μœ„μƒμ΄ μƒμŠΉν•  κ²ƒμœΌλ‘œ μ „λ§λœλ‹€. 3) ν•œνŽΈ, κ΄‘μ–‘μ‹œλŠ” μ „λΌλ‚¨λ„λ‘œλΆ€ν„° 4κΈ‰ 상섀기ꡬ인 '산단녹지관리센터'λ₯Ό μŠΉμΈλ°›μ•„ λ„μ‹œ κ°œλ°œμ— 탄λ ₯을 받을 κ²ƒμœΌλ‘œ μ˜ˆμƒλœλ‹€. |
| GPT-3.5-turbo | 8.5 | 1) κ΄‘μ–‘μ‹œ 곡무원 정원이 79λͺ… λŠ˜μ–΄ 1019λͺ…μœΌλ‘œ λŠ˜μ–΄λ‚˜ ν–‰μ •μ„œλΉ„μŠ€ ν–₯상이 κΈ°λŒ€λœλ‹€κ³  λ°ν˜”λ‹€.2) κ΄‘μ–‘μ‹œλŠ” 이번 정원 μΆ©μ›μœΌλ‘œ 지역과 λ‚˜λΌ λ°œμ „μ„ λ’·λ°›μΉ¨ ν•˜λŠ” κ²½μ œλ„μ‹œ μœ„μƒμ„ λ“œλ†’μΌ κ²ƒμœΌλ‘œ κΈ°λŒ€λœλ‹€. 3) 정원 μ‘°λ‘€λ₯Ό κ°œμ •ν•˜κ³ , 직무뢄석을 톡해 λΆ€μ„œλ³„ 정원을 μ‘°μ •ν•  κ³„νšμ΄λ‹€. |
| Vicuna-13B | 7.4 | 1) 2018년도 κ΄‘μ–‘μ‹œμ˜ 곡무원 정원 1019λͺ…, μ§€λ‚œν•΄ 79λͺ… 증가. 2) 이번 정원 좩원은 μ§€κΈˆκΉŒμ§€ 격무에 μ‹œλ‹¬λ €μ˜¨ κ³΅λ¬΄μ›λ“€μ—κ²Œ λ”μš± μ—΄μ‹¬νžˆ 일할 수 μžˆλ„λ‘ ν™œλ ₯을 λΆˆμ–΄λ„£μœΌλ©΄μ„œ 지역과 λ‚˜λΌ λ°œμ „μ„ λ’·λ°›μΉ¨ ν•˜λŠ” κ²½μ œλ„μ‹œλ‘œμ„œμ˜ μœ„μƒμ„ λ“œλ†’μ΄λŠ” μ‹œλ„ˆμ§€ 효과둜 μ΄μ–΄μ§ˆ κ²ƒμœΌλ‘œ κΈ°λŒ€λœλ‹€. 3) κ΄‘μ–‘μ‹œλŠ” μ „λΌλ‚¨λ„λ‘œλΆ€ν„° 4κΈ‰ μƒμ„€κΈ°κ΅¬μΈβ€˜μ‚°λ‹¨λ…Ήμ§€κ΄€λ¦¬μ„Όν„°β€™λ₯Ό 승인 λ°›μ•„ μš΄μ˜ν•˜κ²Œ 됨으둜써 λ„μ‹œ κ°œλ°œμ— ν•œμΈ΅ 탄λ ₯을 받을 κ²ƒμœΌλ‘œ μ „λ§ν•˜κ³  μžˆλ‹€. |
| KoAlpaca-13B | 5.7 | 1) κ΄‘μ–‘μ‹œλŠ” μž‘λ…„λ³΄λ‹€ 79λͺ… λŠ˜μ–΄λ‚œ 1019λͺ…μ˜ 곡무원을 λ‘˜ 수 있게 λ˜μ—ˆμŠ΅λ‹ˆλ‹€. 2) μ΄λŠ” ν–‰μ •μ•ˆμ „λΆ€μ˜ 지침에 λ”°λ₯Έ κ²ƒμž…λ‹ˆλ‹€. 3) 이둜써 κ΄‘μ–‘μ‹œλŠ” ν•œμ •λœ μ˜ˆμ‚°μœΌλ‘œ 인해 λ°œμƒν–ˆλ˜ κ³΅λ¬΄μ›λ“€μ˜ 초과 근무 μˆ˜λ‹Ή 지급 λ“±μ˜ 문제λ₯Ό ν•΄κ²° ν•  수 있게 λ˜μ—ˆμŠ΅λ‹ˆλ‹€. |
| KORani-v1 | 7.0 | 1) κ΄‘μ–‘μ‹œλŠ” μž‘λ…„λ³΄λ‹€ 79λͺ…이 λŠ˜μ–΄λ‚œ 1019λͺ…μ˜ 일반직 곡무원 정원을 ν–‰μ•ˆλΆ€λ‘œλΆ€ν„° μŠΉμΈλ°›μ•˜μŠ΅λ‹ˆλ‹€. 2) μ΄λŠ” κ΄‘μ–‘μ‹œμ˜ ν–‰μ • μˆ˜μš” 증가λ₯Ό μˆ˜μš©ν•˜κΈ° μœ„ν•œ κ²ƒμž…λ‹ˆλ‹€. 3) 이 정원 μ¦κ°€λŠ” κ΄‘μ–‘μ‹œ κ³΅λ¬΄μ›λ“€μ—κ²Œ 더 λ§Žμ€ 기회λ₯Ό μ œκ³΅ν•  κ²ƒμž…λ‹ˆλ‹€. |
| KORani-v2 | 6.2 | 1)κ΄‘μ–‘μ‹œμ˜ 곡무원 정원은 이번 정원 ν™•λ³΄λ‘œ 1000λͺ… μ£Όλ‹Ή 곡무원 정원을 λ‹¬μ„±ν–ˆμŠ΅λ‹ˆλ‹€. 2) μ΄λŸ¬ν•œ 좩원은 μ§€κΈˆκΉŒμ§€ 격무에 μ‹œλ‹¬λ €μ˜¨ κ³΅λ¬΄μ›λ“€μ—κ²Œ λ”μš± μ—΄μ‹¬νžˆ 일할 수 μžˆλ„λ‘ ν™œλ ₯을 λΆˆμ–΄λ„£μœΌλ©΄μ„œ μ§€μ—­μ‚¬νšŒμ™€ κ΅­κ°€μ˜ λ°œμ „μ„ μ§€μ›ν•˜λŠ” μ‹œλ„ˆμ§€ 효과둜 μ΄μ–΄μ§ˆ κ²ƒμœΌλ‘œ κΈ°λŒ€λ©λ‹ˆλ‹€. 3) κ΄‘μ–‘μ‹œλŠ” 'κΈ°μ—…μœ μΉ˜μΆ”μ§„λ‹¨'을 νμ§€ν•˜κ³  '산단녹지관리센터'λ₯Ό μŠΉμΈν–ˆμœΌλ©°, μ΄λŠ” λ„μ‹œ κ°œλ°œμ— ν•œμΈ΅ 탄λ ₯을 받을 κ²ƒμœΌλ‘œ κΈ°λŒ€λ©λ‹ˆλ‹€. |
| KORani-v3 | 6.9 | 1)κ΄‘μ–‘μ‹œμ˜ 곡무원 정원은 μ§€λ‚œν•΄λ³΄λ‹€ 79λͺ…이 λŠ˜μ–΄λ‚œ 1019λͺ…μœΌλ‘œ μ‚°μ •λ˜μ—ˆμŠ΅λ‹ˆλ‹€.2) μ΄λŸ¬ν•œ 좩원은 μ§€μ—­μ‚¬νšŒμ™€ κ΅­κ°€ λ°œμ „μ„ μ§€μ›ν•˜λŠ” ν–‰μ •μ„œλΉ„μŠ€μ˜ ν–₯μƒμœΌλ‘œ μ΄μ–΄μ§ˆ κ²ƒμœΌλ‘œ κΈ°λŒ€λ©λ‹ˆλ‹€.3) μ‹œλŠ” '이달 내에 정원 μ‘°λ‘€λ₯Ό κ°œμ •ν•˜κ³  직무뢄석을 톡해 λΆ€μ„œλ³„λ‘œ 정원을 μ‘°μ •ν•  κ³„νš'이며 ν–‰μ •μ„œλΉ„μŠ€ ν–₯상을 μœ„ν•΄ λ…Έλ ₯ν•  것이라고 λ°ν˜”μŠ΅λ‹ˆλ‹€. |

Evaluation

We evaluated model performance using GPT-4; the code and full results are available in AutoEvalGPT.

Limitations

The Korean performance of our models is not as good as the English performance of Vicuna. We believe this is due to the limited Korean capability of the foundation models (compared to LLaMA's English capability) and the quality of the dataset, which consists primarily of translated data. We will continue to release new versions of the KORani models as we achieve better results.

License

Our GitHub repo and models are intended for research purposes and non-commercial use only, subject to the model License of LLaMA, the Terms of Use of the data generated by OpenAI, and the Privacy Practices of ShareGPT. Please contact us if you find any potential violation. The code is released under the Apache License 2.0.