Falcon-7B is a 7B-parameter causal decoder-only model built by TII and trained on 1,500B tokens of RefinedWeb enhanced with curated corpora. It is made available under the TII Falcon LLM License.
More details coming soon.
- Developed by: https://www.tii.ae
- Model type: Causal decoder-only
- Language(s) (NLP): English
- License: TII Falcon LLM License
- Paper: coming soon
- Demo: coming soon
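Pending the full paper and demo, the model can be tried out with the Hugging Face `transformers` library. The sketch below is a minimal, hedged example using the standard `pipeline` API and the public `tiiuae/falcon-7b` repository id; exact loading options (e.g. whether `trust_remote_code=True` is still required) depend on your `transformers` version.

```python
def load_falcon(model_id="tiiuae/falcon-7b"):
    """Build a text-generation pipeline for Falcon-7B.

    Imports are deferred so the sketch can be read without the heavy
    dependencies installed. Loading the 7B checkpoint requires a GPU
    with sufficient memory (bfloat16 weights alone are ~14 GB).
    """
    import torch
    from transformers import AutoTokenizer, pipeline

    tokenizer = AutoTokenizer.from_pretrained(model_id)
    # Older transformers releases may additionally need
    # trust_remote_code=True to load the custom Falcon architecture.
    return pipeline(
        "text-generation",
        model=model_id,
        tokenizer=tokenizer,
        torch_dtype=torch.bfloat16,
        device_map="auto",
    )
```

Typical usage would then be `generator = load_falcon()` followed by `generator("Once upon a time", max_new_tokens=50)`.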
Out-of-scope uses include production use without adequate assessment of risks and mitigation, and any use cases which may be considered irresponsible or harmful.
Falcon-7B is trained on English and French data only, and will not generalize appropriately to other languages. Furthermore, as it is trained on large-scale corpora representative of the web, it will carry the stereotypes and biases commonly encountered online.
More details coming soon in the paper.