xAI, the AI startup led by Elon Musk, has open-sourced Grok-1. The release includes the model's weights and network architecture, as Musk had previously announced.
Grok-1 is a 314-billion-parameter model built on a Mixture-of-Experts (MoE) architecture. The released checkpoint is the raw base model from the pre-training phase, which ran through October 2023, and it has not been fine-tuned for any specific application or use case.
For more details, visit github.com/xai-org/grok.
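The repository documents Grok-1's actual architecture; purely as an illustration of the general Mixture-of-Experts idea (not xAI's implementation), a minimal top-k routing sketch might look like the following, with all dimensions, names, and expert counts hypothetical:

```python
import numpy as np

def softmax(x, axis=-1):
    x = x - x.max(axis=axis, keepdims=True)
    e = np.exp(x)
    return e / e.sum(axis=axis, keepdims=True)

class TinyMoELayer:
    """Illustrative top-k Mixture-of-Experts layer (not Grok-1's code).

    A learned router scores each token against every expert; only the
    top-k experts run for that token, so per-token compute stays small
    even though the total parameter count across all experts is large.
    """

    def __init__(self, d_model=16, d_hidden=32, n_experts=8, top_k=2, seed=0):
        rng = np.random.default_rng(seed)
        self.top_k = top_k
        self.router = rng.normal(0, 0.02, size=(d_model, n_experts))
        # Each expert is a small two-layer MLP.
        self.w1 = rng.normal(0, 0.02, size=(n_experts, d_model, d_hidden))
        self.w2 = rng.normal(0, 0.02, size=(n_experts, d_hidden, d_model))

    def __call__(self, x):
        # x: (n_tokens, d_model)
        scores = x @ self.router                                # (n_tokens, n_experts)
        top_idx = np.argsort(scores, axis=-1)[:, -self.top_k:]  # chosen experts per token
        top_scores = np.take_along_axis(scores, top_idx, axis=-1)
        gates = softmax(top_scores, axis=-1)                    # weights over chosen experts
        out = np.zeros_like(x)
        for t in range(x.shape[0]):
            for slot in range(self.top_k):
                e = top_idx[t, slot]
                h = np.maximum(x[t] @ self.w1[e], 0)            # ReLU MLP for expert e
                out[t] += gates[t, slot] * (h @ self.w2[e])
        return out

tokens = np.random.default_rng(1).normal(size=(4, 16))
print(TinyMoELayer()(tokens).shape)  # (4, 16)
```

The design choice MoE trades on is sparsity: the full parameter count (here, all eight experts) contributes to model capacity, but each token only pays the compute cost of its top-k routed experts.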
Source: xAI
To mark the open-source release, CEO Elon Musk himself chimed in with a jab of his own:
Tell us more about the “Open” aspect of OpenAI…
– Elon Musk (@elonmusk) March 17, 2024
TLDR: xAI, the AI startup led by Elon Musk, has open-sourced Grok-1, a 314-billion-parameter Mixture-of-Experts model whose released checkpoint comes from the pre-training phase that ran through October 2023. Musk used the occasion to needle OpenAI over the "Open" in its name.