databricks-ml-examples

Example notebooks for the Mixtral-8x7B models on Databricks

mistralai/Mixtral-8x7B-v0.1 is a pretrained generative Sparse Mixture of Experts model, and mistralai/Mixtral-8x7B-Instruct-v0.1 is its instruction-tuned variant.

Mixtral 8x7B is a high-quality sparse mixture-of-experts (SMoE) model with open weights, licensed under Apache 2.0.
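
Below is a minimal sketch of loading the Instruct model with Hugging Face `transformers`, separate from the example notebooks themselves. It assumes a GPU cluster with sufficient memory and the `transformers` and `accelerate` packages installed; the prompt text is only illustrative.

```python
# Minimal sketch: load mistralai/Mixtral-8x7B-Instruct-v0.1 and run a chat prompt.
# Assumes enough GPU memory for the full checkpoint (device_map="auto" shards
# the model across available GPUs via accelerate).
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "mistralai/Mixtral-8x7B-Instruct-v0.1"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    device_map="auto",   # place layers across available GPUs
    torch_dtype="auto",  # use the checkpoint's native precision
)

# Example prompt (illustrative only)
messages = [{"role": "user", "content": "What is a sparse mixture of experts?"}]
inputs = tokenizer.apply_chat_template(
    messages, add_generation_prompt=True, return_tensors="pt"
).to(model.device)

outputs = model.generate(inputs, max_new_tokens=128)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```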