The 180B version of the oil-money LLM

@osanseviero
Falcon 180B is out🤯

- 180B params
- Trained on 3.5 trillion tokens + 7 million GPU hours
- Quality on par with PaLM 2; outperforms Llama 2 and GPT-3.5 across 13 benchmarks
- 4-bit and 8-bit precision with similar quality
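The 4-bit/8-bit point refers to quantized loading via Hugging Face Transformers and bitsandbytes. A minimal configuration sketch (not a runnable demo: the full checkpoint is hundreds of GB, and the exact memory savings depend on your hardware):

```python
# Sketch: loading Falcon 180B in 4-bit precision with bitsandbytes.
# Requires `transformers`, `accelerate`, and `bitsandbytes`, plus enough
# GPU memory to hold the quantized weights (~100 GB even at 4-bit).
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer, BitsAndBytesConfig

model_id = "tiiuae/falcon-180B"

# 4-bit NF4 quantization; switch to load_in_8bit=True for 8-bit instead.
bnb_config = BitsAndBytesConfig(
    load_in_4bit=True,
    bnb_4bit_quant_type="nf4",
    bnb_4bit_compute_dtype=torch.bfloat16,
)

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    quantization_config=bnb_config,
    device_map="auto",  # shard the quantized weights across available GPUs
)
```

This is the standard Transformers quantization path, not something specific to this announcement; the blog post linked below covers the actual hardware requirements.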

Demo: https://huggingface.co/spaces/tiiuae/falcon-180b-demo
Blog: https://huggingface.co/blog/falcon-180b