humaneval-xl

[LREC-COLING'24] HumanEval-XL: A Multilingual Code Generation Benchmark for Cross-lingual Natural Language Generalization

Primary Language: Python · License: MIT
