Mellum: A Niche-Focused, Expertly Crafted, Small-Scale Open Source Model by JetBrains

JetBrains has announced Mellum, an open-source model for code completion. Introduced in 2024 and used across JetBrains’ IDE family, it offers faster performance than general-purpose large language models.

JetBrains describes Mellum as a “focal model”: one designed specifically for code completion, rather than a general-purpose LLM adapted for coding tasks. The company’s strategy is to build multiple specialized Mellum models for different tasks, such as code completion and predicting code differences between versions.

With 4B parameters in its primary model, Mellum delivers code-completion performance comparable to that of much larger models. It currently supports code completion for Java, Kotlin, Python, Go, PHP, C, C++, C#, JavaScript, TypeScript, CSS, HTML, Rust, and Ruby.

JetBrains cites a commitment to transparency and believes that open-sourcing fosters collaborative software development. The model is now available on Hugging Face and can be run in the cloud (via vLLM) or locally (through llama.cpp or Ollama). However, JetBrains cautions that Mellum may not be directly useful to all software developers; it is aimed primarily at AI/ML researchers for study and further development.

TL;DR: JetBrains introduces Mellum, an open-source code-completion model offering faster performance and capabilities targeted at specialized code-writing tasks.
