
Mistral Releases Codestral, Its First Generative AI Model For Code

Mistral, the French AI startup backed by Microsoft and valued at $6 billion, has released its first generative AI model for coding, dubbed Codestral. From a report: Codestral, like other code-generating models, is designed to help developers write and interact with code. It was trained on over 80 programming languages, including Python, Java, C++ and JavaScript, explains Mistral in a blog post. Codestral can complete coding functions, write tests and “fill in” partial code, as well as answer questions about a codebase in English. Mistral describes the model as “open,” but that’s up for debate. The startup’s license prohibits the use of Codestral and its outputs for any commercial activities. There’s a carve-out for “development,” but even that has caveats: the license goes on to explicitly ban “any internal usage by employees in the context of the company’s business activities.” The reason could be that Codestral was trained partly on copyrighted content. Codestral might not be worth the trouble, in any case. At 22 billion parameters, the model requires a beefy PC in order to run.
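For readers who want a sense of what "completing coding functions" looks like in practice, below is a minimal sketch of prompting a locally hosted code model with the Hugging Face transformers library. The article does not describe any API; the model ID "mistralai/Codestral-22B-v0.1", the library choice, and the hardware assumptions are illustrative only, and a 22-billion-parameter model will generally need a large GPU or quantization to run at all.

# Hypothetical sketch: plain code completion with a Codestral-sized model.
# Model ID and hardware setup are assumptions, not taken from the article.
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "mistralai/Codestral-22B-v0.1"  # assumed Hugging Face repo name

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id, device_map="auto")

# Ask the model to continue a partial Python function.
prompt = 'def fibonacci(n: int) -> int:\n    """Return the n-th Fibonacci number."""\n'
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=128, do_sample=False)

print(tokenizer.decode(outputs[0], skip_special_tokens=True))

Note that under the license terms described above, even this kind of local experimentation could be restricted if it counts as internal business use.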

Read more of this story at Slashdot.

