
Ollama cannot be used normally #22805

Open
1 task done
qukuqhd opened this issue Jan 8, 2025 · 1 comment
Labels
admin read (Pending admin review) · bug [core label] · triage (Maintainer needs to classify the issue)

Comments


qukuqhd commented Jan 8, 2025

Check for existing issues

  • Completed

Describe the bug / provide steps to reproduce it

When using a non-local Ollama service in the AI configuration, the request Zed sends is incorrect: it issues a GET request to the URI /api/chat, but that endpoint requires the POST method.

The configuration file is as follows.

"language_models": {
"ollama": {
"api_url": "http://ollama_ip:11434/",
"available_models": [
{
"display_name": "deepseek-coder",
"name": "deepseek-coder-v2:16b-lite-instruct-q8_0",
"max_tokens": 10000,
"keep_alive": "30m"
}
]
}
}
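For comparison, Ollama's /api/chat endpoint expects a POST request with a JSON body. A minimal sketch of what a correct request looks like (the model name is taken from the config above; the host is a placeholder, and the request is only constructed here, not sent):

```python
import json
import urllib.request

# Placeholder host; substitute your actual Ollama server address.
api_url = "http://ollama_ip:11434/api/chat"

payload = {
    "model": "deepseek-coder-v2:16b-lite-instruct-q8_0",
    "messages": [{"role": "user", "content": "Hello"}],
    "stream": False,
}

# /api/chat requires POST with a JSON body; a GET to this URI fails,
# which matches the behavior reported above.
req = urllib.request.Request(
    api_url,
    data=json.dumps(payload).encode("utf-8"),
    headers={"Content-Type": "application/json"},
    method="POST",
)

# resp = urllib.request.urlopen(req)  # not executed here; needs a running server
```

The point of the sketch is simply that the method must be POST, regardless of client.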

Zed Version and System Specs

Zed Dev 0.169.0, Windows 11

If applicable, add screenshots or screencasts of the incorrect state / behavior

(screenshot attached)

If applicable, attach your Zed.log file to this issue.

Zed.log

qukuqhd added the admin read, bug [core label], and triage labels on Jan 8, 2025

qukuqhd commented Jan 8, 2025

Ollama version is 0.5.1.
