
xAI open sources the Grok model, but without any training code


Elon Musk’s xAI has released the core code and base weights of the Grok AI model on GitHub, but without any training code. The company describes it as a “314 billion parameter Mixture-of-Experts model.”
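To illustrate what a Mixture-of-Experts architecture means in practice, here is a minimal, hedged sketch in Python. The expert functions and gate scores below are toy stand-ins invented for illustration; they do not reflect Grok-1’s actual implementation, which xAI has not detailed beyond the parameter count.

```python
import math

def softmax(xs):
    """Numerically stable softmax over a list of scores."""
    m = max(xs)
    exps = [math.exp(x - m) for x in xs]
    total = sum(exps)
    return [e / total for e in exps]

def moe_forward(x, experts, gate_scores, top_k=2):
    """Route input x to the top_k highest-scoring experts and
    combine their outputs, weighted by renormalized gate scores.
    Only the chosen experts run, so compute per token stays far
    below the model's total parameter count."""
    ranked = sorted(range(len(experts)),
                    key=lambda i: gate_scores[i], reverse=True)
    chosen = ranked[:top_k]
    weights = softmax([gate_scores[i] for i in chosen])
    return sum(w * experts[i](x) for w, i in zip(weights, chosen))

# Toy scalar "experts" standing in for feed-forward blocks.
experts = [lambda x: 2 * x, lambda x: x + 1,
           lambda x: -x,    lambda x: x * x]
y = moe_forward(3.0, experts, gate_scores=[0.1, 2.0, 0.5, 1.5], top_k=2)
```

The key design point is sparsity: although all experts exist in memory (hence the large parameter count), each input activates only a small subset of them.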

In a blog post, xAI said the model is not tuned for any specific application, such as conversational use. The company noted that Grok-1 was trained on a “custom” stack, without specifying details. The model is licensed under the Apache License 2.0, which allows commercial use.

Last week, Musk said on X that xAI intended to open source Grok’s model this week. The company launched Grok as a chatbot last year; it is available to Premium+ subscribers of the X social network. Notably, the chatbot can access some of X’s data, but the open source model does not include any connection to the social network.

Several notable organizations have open-sourced some of their AI models, including Meta (Llama), Mistral, TII (Falcon), and AI2. In February, Google also released two new open models, Gemma 2B and Gemma 7B.

Some AI tool makers are already planning to use Grok in their products. Perplexity CEO Aravind Srinivas posted on X that the company will fine-tune Grok for conversational search and make it available to Pro users.

Musk has been embroiled in a legal battle with OpenAI, filing a lawsuit against the company earlier this month for “betraying” its original nonprofit AI mission. Since then, he has called out OpenAI and Sam Altman on X several times.
