Mistral AI models coming soon to Amazon Bedrock

Mistral AI, a French AI startup, aims to push publicly available models to state-of-the-art performance.

AWS is pleased to announce that two high-performing Mistral AI models, Mistral 7B and Mixtral 8x7B, will soon be available on Amazon Bedrock.
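Once the models launch, invoking them should look like any other Amazon Bedrock model call. Below is a minimal sketch using boto3's InvokeModel API; note that the model ID and the request and response schemas shown here are assumptions until the Mistral AI models are generally available on Bedrock, so check the Bedrock documentation for the final values.

```python
# A minimal sketch of invoking a Mistral AI model through the Amazon Bedrock
# Runtime API with boto3. Requires AWS credentials and model access.
import json

import boto3

bedrock = boto3.client("bedrock-runtime", region_name="us-east-1")

# Hypothetical model ID; the final identifier may differ at launch.
MODEL_ID = "mistral.mistral-7b-instruct-v0:2"

# Assumed request schema: a prompt plus common sampling parameters.
body = json.dumps({
    "prompt": "<s>[INST] Write a haiku about cloud computing. [/INST]",
    "max_tokens": 200,
    "temperature": 0.5,
})

response = bedrock.invoke_model(modelId=MODEL_ID, body=body)
payload = json.loads(response["body"].read())

# Assumed response shape: a list of generated outputs.
print(payload["outputs"][0]["text"])
```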

Mistral 7B, Mistral AI's first foundation model, supports English text generation tasks and has natural coding abilities.

Mixtral 8x7B, a popular sparse Mixture-of-Experts (MoE) model, excels at text completion, classification, question answering, and code generation.

One standout feature of Mistral AI’s models is their exceptional balance of cost and performance.

Fast inference is another: Mistral AI models are optimized for low latency and deliver remarkable inference speed.

Mixtral 8x7B speaks multiple languages, has natural coding abilities, and matches or surpasses Llama 2 70B on most benchmarks while delivering six times faster inference.

Mixtral 8x7B has mastered French, German, Spanish, Italian, and English.

Alongside Mixtral 8x7B, Mistral AI is also releasing Mixtral 8x7B Instruct, a version fine-tuned to follow instructions.
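Mistral AI's instruction-tuned models expect prompts wrapped in [INST] ... [/INST] tags. The small helper below sketches that convention; whether Bedrock will require this exact format for Mixtral 8x7B Instruct is an assumption until launch.

```python
# A small helper that wraps a user message in the [INST] tags that Mistral's
# instruction-tuned models expect. Whether Bedrock requires this exact
# convention for Mixtral 8x7B Instruct is an assumption until launch.
def build_instruct_prompt(instruction: str) -> str:
    """Format a single-turn instruction for a Mistral instruct model."""
    return f"<s>[INST] {instruction} [/INST]"

# Example: produce a prompt string ready to place in the request body.
print(build_instruct_prompt("Summarize the key features of Mixtral 8x7B."))
```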