Mistral AI Releases Mistral Small 4: A 119B-Parameter MoE Model that Unifies Instruct, Reasoning, and Multimodal Workloads
Mistral AI has released Mistral Small 4, a 119B-parameter mixture-of-experts (MoE) model that combines instruction following, reasoning, multimodal understanding, and coding in a single deployment. With 128 experts (4 active per token), 6B active parameters, and a 256k-token context window, it eliminates the overhead of switching between specialized models for chat, agentic tasks, and long-document analysis.
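To make the "128 experts, 4 active per token" figure concrete, here is a minimal sparse-MoE routing sketch in PyTorch. Only the expert count and top-4 routing mirror the numbers above; the `TopKMoELayer` class, the toy dimensions (`d_model=64`, `d_ff=128`), and the routing details are illustrative assumptions, not Mistral's actual implementation.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F


class TopKMoELayer(nn.Module):
    """Generic sparse mixture-of-experts layer: each token is routed to
    k of n_experts feed-forward experts, whose outputs are combined with
    softmax-normalized router weights. Illustrative sketch only."""

    def __init__(self, d_model=64, d_ff=128, n_experts=128, k=4):
        super().__init__()
        self.k = k
        # The router scores every expert for every token.
        self.router = nn.Linear(d_model, n_experts, bias=False)
        # Each expert is a small feed-forward block (toy sizes for the demo).
        self.experts = nn.ModuleList(
            nn.Sequential(nn.Linear(d_model, d_ff), nn.GELU(), nn.Linear(d_ff, d_model))
            for _ in range(n_experts)
        )

    def forward(self, x):  # x: (num_tokens, d_model)
        logits = self.router(x)                     # (num_tokens, n_experts)
        weights, idx = logits.topk(self.k, dim=-1)  # top-k expert ids per token
        weights = F.softmax(weights, dim=-1)        # normalize over the chosen k
        out = torch.zeros_like(x)
        for slot in range(self.k):
            # Batch all tokens whose slot-th choice is expert e through e at once.
            for e in idx[:, slot].unique():
                mask = idx[:, slot] == e
                out[mask] += weights[mask, slot].unsqueeze(-1) * self.experts[int(e)](x[mask])
        return out


tokens = torch.randn(8, 64)          # 8 tokens, d_model=64
print(TopKMoELayer()(tokens).shape)  # torch.Size([8, 64])
```

Because only 4 of the 128 experts run per token, the compute per forward pass tracks the active parameters (6B in the article's figures) rather than the full 119B, which is what makes serving a model this large with one deployment tractable.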