
Codestral (Mistral code suggestion) #12519

Open · Solido opened this issue May 31, 2024 · 4 comments

Labels: ai (Improvement related to Assistant, Copilot, or other AI features) · assistant (AI feedback for Assistant) · enhancement

Comments

@Solido

Solido commented May 31, 2024

Check for existing issues

  • Completed

Describe the feature

Support Codestral from Mistral AI as an equivalent to OpenAI.

Codestral supports infill (fill-in-the-middle), and VS Code plugins are already available.

https://mistral.ai/news/codestral/
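
For reference, a rough sketch of what calling Codestral's FIM endpoint could look like; the endpoint path, model name, and response shape below are my reading of Mistral's public API docs and may need adjusting:

```python
# Hedged sketch: call Mistral's FIM (fill-in-the-middle) endpoint with Codestral.
# The endpoint path, model name, and response shape are assumptions based on
# Mistral's published API docs, not anything Zed currently uses.
import os
import requests

def codestral_fim(prompt: str, suffix: str, max_tokens: int = 64) -> str:
    resp = requests.post(
        "https://api.mistral.ai/v1/fim/completions",  # assumed FIM endpoint
        headers={"Authorization": f"Bearer {os.environ['MISTRAL_API_KEY']}"},
        json={
            "model": "codestral-latest",  # assumed model identifier
            "prompt": prompt,             # code before the cursor
            "suffix": suffix,             # code after the cursor
            "max_tokens": max_tokens,
        },
        timeout=30,
    )
    resp.raise_for_status()
    # Response is assumed to mirror the chat completions shape.
    return resp.json()["choices"][0]["message"]["content"]

# Example: ask Codestral to fill in a function body.
print(codestral_fim("def fib(n):\n", "\nprint(fib(10))"))
```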

Thanks!

If applicable, add mockups / screenshots to help present your vision of the feature

No response

Solido added the admin read (Pending admin review), enhancement, and triage (Maintainer needs to classify the issue) labels on May 31, 2024
JosephTLyons added the ai and assistant labels and removed the triage and admin read labels on May 31, 2024
@universalmind303

Additionally, it'd be amazing if we could use this for inline_completions.

@NightMachinery

NightMachinery commented Jul 3, 2024

Don't get too excited. Codestral is terrible at FIM. I have switched to asking Sonnet 3.5 to just fill in the marked part, and it does the job 10x better, even though it is a chat model and not tuned for FIM at all. Codestral can't even match parentheses correctly.
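
For what it's worth, a minimal sketch of that chat-model approach; the model name and prompt wording here are just illustrative assumptions, not a proposal for what Zed should ship:

```python
# Hedged sketch of the "mark the gap and ask a chat model to fill it" approach
# described above. Model name and prompt wording are illustrative assumptions.
import anthropic

client = anthropic.Anthropic()  # reads ANTHROPIC_API_KEY from the environment

def fill_in(before: str, after: str) -> str:
    prompt = (
        "Fill in the code at <FILL_ME>. Reply with only the missing code.\n\n"
        f"{before}<FILL_ME>{after}"
    )
    message = client.messages.create(
        model="claude-3-5-sonnet-20240620",
        max_tokens=256,
        messages=[{"role": "user", "content": prompt}],
    )
    return message.content[0].text

print(fill_in("def fib(n):\n    ", "\nprint(fib(10))"))
```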

@neofob

neofob commented Jul 10, 2024

I was able to use the Codestral model with private-gpt (a fork of zylon-ai's private-gpt) in chat mode, running in Docker with NVIDIA GPU support. So it would be cool if we could get it working with Zed locally.

@seddonm1

I did a basic implementation that works: #15573

There are a few outstanding questions, as I don't know this code base very well.
