
[Rumor] Microsoft is Currently Training its Own Large-scale Model MAI-1 to Reduce Dependence on OpenAI

Rumor has it that Microsoft is training its own large-scale language model, called MAI-1, with roughly 500 billion parameters and intended to compete with models from OpenAI, Google, and Anthropic. The project is led by Mustafa Suleyman, co-founder of DeepMind, who recently moved from Inflection AI to become CEO of Microsoft AI.

The news about MAI-1 is significant because it marks Microsoft's first foray into developing its own large language model after relying on OpenAI's GPT models for the past two years. The internal turmoil at OpenAI may serve as a cautionary tale for Microsoft, a reminder not to depend too heavily on an outside company.

While Microsoft has developed its own smaller language models such as Phi (most recently Phi-3), it has never introduced a large-scale model like MAI-1 before. Speculation is rife that Microsoft may officially unveil MAI-1 at Build 2024 later this month.

Source: The Information, Ars Technica

TLDR: Microsoft is reportedly training its own large-scale language model called MAI-1, potentially unveiling it at Build 2024. This move signifies a shift away from reliance on external companies like OpenAI.
