Stability AI Unveils Stable Code 3B: An Efficient AI Programming Model for Offline Operation

2024-01-17

Stability AI has released Stable Code 3B, its latest model for AI-assisted software development. It is a 2.7-billion-parameter decoder-only language model pre-trained on 1.3 trillion tokens of diverse text and code. The model focuses on accurate and responsive code completion and performs on par with models 2.5 times its size. Notably, Stable Code 3B can run offline on ordinary laptops without a dedicated GPU, bringing capable AI coding assistance to more developers.

Stable Code 3B is 60% smaller than Code Llama 7B while delivering comparable performance, a substantial efficiency gain. It builds on Stability AI's 3-billion-parameter Stable LM foundation model, which was pre-trained on 4 trillion tokens of natural-language data, and is further trained on software-engineering-specific data, including code. The compact footprint lets developers use AI-assisted coding privately and in real time, even without internet access.

Stable Code 3B supports an extended context length of up to 100,000 tokens, far beyond the 16,384-token sequences it was trained on. The extension relies on rotary position embeddings, whose base can optionally be raised as high as 1,000,000, and enables the model to parse and auto-complete more complex code.

Trained on 18 programming languages, Stable Code 3B achieves state-of-the-art results for models of its size on the MultiPL-E multi-language programming benchmark. It can handle complex, multi-file projects and provide context-aware completions in languages such as Python and JavaScript.

Stable Code 3B is available for both non-commercial and commercial use through a Stability AI Membership, alongside the company's other core models such as SDXL Turbo and Stable Video Diffusion.
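To illustrate the offline, GPU-free claim, here is a minimal sketch of CPU-only code completion using the Hugging Face transformers library. It assumes the model is published under the ID stabilityai/stable-code-3b and that the weights were downloaded ahead of time so the run itself needs no internet access:

```python
# Minimal sketch: CPU-only code completion with Stable Code 3B.
# Assumes the weights were fetched in advance so this runs offline.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "stabilityai/stable-code-3b"  # assumed Hugging Face model ID
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.bfloat16,  # halves memory use; float32 also works on CPU
)  # older transformers versions may additionally need trust_remote_code=True
model.eval()

prompt = "def fibonacci(n):\n"
inputs = tokenizer(prompt, return_tensors="pt")

with torch.no_grad():
    output = model.generate(
        **inputs,
        max_new_tokens=64,
        do_sample=False,  # greedy decoding for a deterministic completion
    )
print(tokenizer.decode(output[0], skip_special_tokens=True))
```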
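Context-aware completion of the kind described above is commonly exposed through fill-in-the-middle (FIM) prompting, where the model fills a gap between existing code. The sentinel token names below (`<fim_prefix>`, `<fim_suffix>`, `<fim_middle>`) follow the convention used by several code models and are an assumption here; verify them against the released tokenizer:

```python
# Sketch of fill-in-the-middle (FIM) prompting: the model generates the code
# that belongs between a given prefix and suffix.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "stabilityai/stable-code-3b"  # assumed model ID, as above
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id, torch_dtype=torch.bfloat16)

# Prefix and suffix surround the gap; the sentinel tokens are assumptions.
fim_prompt = (
    "<fim_prefix>def fib(n):\n"
    "<fim_suffix>\n"
    "    return fib(n - 2) + fib(n - 1)<fim_middle>"
)
inputs = tokenizer(fim_prompt, return_tensors="pt")
with torch.no_grad():
    output = model.generate(**inputs, max_new_tokens=48, do_sample=False)
print(tokenizer.decode(output[0], skip_special_tokens=True))
```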
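The 100,000-token figure depends on raising the rotary embedding base at load time. The sketch below assumes the configuration exposes that base as rope_theta, matching the StableLM-family integration in transformers; the exact field name should be checked against the published config:

```python
# Sketch: load the model with a larger rotary base to extend usable context,
# per the announcement's note that the base can be raised up to 1,000,000.
from transformers import AutoConfig, AutoModelForCausalLM

model_id = "stabilityai/stable-code-3b"  # assumed model ID, as above
config = AutoConfig.from_pretrained(model_id)
config.rope_theta = 1_000_000  # assumed field name; larger base -> longer context
long_context_model = AutoModelForCausalLM.from_pretrained(model_id, config=config)
```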