Ollama's OpenAI endpoint returns a 404 error

vmdwslir · posted 3 months ago · in Other
Follow (0) | Answers (1) | Views (54)

What is the problem?

Apart from this, Ollama is working. I'm not sure what to do.

OS: Linux

GPU:Nvidia

CPU:AMD

Ollama version: 0.2.7

6bc51xsx1#

If you are using curl or wget to test the endpoint, note that by default these tools send a GET request, while /v1/chat/completions requires a POST:

$ curl -D - http://localhost:11434/v1/chat/completions
HTTP/1.1 404 Not Found
Content-Type: text/plain
Date: Sun, 21 Jul 2024 14:31:09 GMT
Content-Length: 18

404 page not found
$ curl -D - -X POST http://localhost:11434/v1/chat/completions
HTTP/1.1 400 Bad Request
Content-Type: application/json; charset=utf-8
Date: Sun, 21 Jul 2024 14:31:44 GMT
Content-Length: 83

{"error":{"message":"EOF","type":"invalid_request_error","param":null,"code":null}}

When you supply data (-d, --data), curl automatically switches to POST:

$ curl -D - http://localhost:11434/v1/chat/completions -d '{"model": "gemma2", "messages": [{"role": "user", "content": "hello"}], "stream": false}'
HTTP/1.1 200 OK
Content-Type: application/json
Date: Sun, 21 Jul 2024 14:36:54 GMT
Content-Length: 326

{"id":"chatcmpl-385","object":"chat.completion","created":1721572614,"model":"gemma2","system_fingerprint":"fp_ollama","choices":[{"index":0,"message":{"role":"assistant","content":"Hello! 👋\n\nHow can I help you today? 😊"},"finish_reason":"stop"}],"usage":{"prompt_tokens":10,"completion_tokens":13,"total_tokens":23}}
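The same GET-vs-POST pitfall applies outside of curl. As a sketch, Python's standard-library urllib behaves like curl's -d flag: attaching a request body automatically switches the method to POST, so no explicit method is needed. (The model name "gemma2" and the localhost URL are taken from the curl example above; actually sending the request of course requires a running Ollama server.)

```python
import json
import urllib.request

# Payload mirroring the successful curl example above.
payload = json.dumps({
    "model": "gemma2",
    "messages": [{"role": "user", "content": "hello"}],
    "stream": False,
}).encode("utf-8")

# Like curl's -d/--data flag, urllib uses POST automatically
# whenever a request body (data=) is supplied.
req = urllib.request.Request(
    "http://localhost:11434/v1/chat/completions",
    data=payload,
    headers={"Content-Type": "application/json"},
)
print(req.get_method())  # POST

# Uncomment to actually send the request (needs Ollama running):
# with urllib.request.urlopen(req) as resp:
#     print(json.load(resp)["choices"][0]["message"]["content"])
```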
