Stability AI Unveils Compact SLM Innovation

Stability AI announced Stable LM 2 1.6B, a smaller and faster version of its open source Stable LM small language model (SLM). “Stable LM 2 1.6B is a state-of-the-art 1.6 billion parameter small language model trained on multilingual data in English, Spanish, German, Italian, French, Portuguese, and Dutch,” the announcement post explains. “This model’s compact size and speed lower hardware barriers, allowing more developers to participate in the generative AI ecosystem.”

For those unfamiliar, Stability AI is perhaps the best-known of the open source generative AI companies, and its stated aim is to democratize bias-free AI. Its most famous product is the Stable Diffusion family of text-to-image generative AI models.

Stable LM 2 1.6B is aimed at text and code generation and is among Stability AI’s most compact language models, with 1.6 billion parameters in its base model. That small footprint allows fast experimentation and iteration with moderate resources, the firm says.

Notably, Stability AI claims that Stable LM 2 1.6B outperforms other popular small language models with a similar parameter count, including Microsoft’s Phi-1.5 and Phi-2 as well as TinyLlama and Falcon.

As with those models, the idea is to shrink the model as much as possible with minimal compromises to quality, safety, and performance. SLMs are far more cost effective than Large Language Models (LLMs) such as the GPT-4 models behind ChatGPT, and their small size means they can be run locally and offline. But as Stability AI notes, they can also “exhibit common issues such as high hallucination rates or potential toxic language.”

Stable LM 2 1.6B can be tested on Hugging Face.
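For developers who want to experiment locally, the sketch below shows how a model of this kind might be loaded and prompted with the Hugging Face transformers library. It is a minimal illustration, not an official recipe: the repository id "stabilityai/stablelm-2-1_6b", the dtype, and the generation settings are assumptions, so check the model card for the exact name, requirements, and license terms before running it.

```python
# Minimal sketch of trying a small language model via Hugging Face transformers.
# Assumptions: the model is published under the repo id below, fits in memory on
# a single consumer GPU or CPU, and its model card permits this kind of use.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "stabilityai/stablelm-2-1_6b"  # assumed repository id

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.bfloat16,  # half-precision keeps the 1.6B weights small
    trust_remote_code=True,      # some model cards require custom loading code
)

prompt = "The benefits of small language models include"
inputs = tokenizer(prompt, return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=64, do_sample=True, temperature=0.7)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```

Because the base model is so small, this kind of script can run on a laptop without a dedicated accelerator, which is the hardware-barrier point Stability AI emphasizes in its announcement.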
