databricks-ml-examples


MPT-7B-8k models

MPT-7B-8k is a family of open-source 7B-parameter LLMs with an 8k context length, trained on the MosaicML platform. Two of the models in the family are licensed for commercial use.

MPT-7B-8k FAQ

When would I choose…