Introducing Llama2-70B-Chat with MosaicML Inference
On July 18th, Meta published Llama2-70B-Chat: a 70B parameter language model pre-trained on 2 trillion tokens of text with a context length of 4096 that outperforms all open source models on many benchmarks, and is comparable in quality to closed proprietary models such as OpenAI's ChatGPT and Google PaLM-Bison. Llama2-70B-Chat was fine-tuned for dialog use cases, carefully optimized for safety and helpfulness by leveraging over 1 million human annotations.

Llama2-70B-Chat is available via MosaicML Inference. To get started, sign up here and check out our inference product page.

Figure 1: Human raters prefer Llama2-70B-Chat to ChatGPT and PaLM-Bison. Adapted from the Llama2 technical paper. See the paper for additional data on model-based evaluation using GPT-4.

Published in www.mosaicml.com · by Hagay Lupesko, Margaret Qian, Daya Khudia, Sam Havens, Daniel King, Erica Ji Yuen · 13 min read · August 28, 2023