What is the problem?
http://localhost:11434/v1/chat/completions (returns a 404)
http://localhost:11434 (shows "Ollama is running")
Apart from that, Ollama is working fine. I'm not sure what to do.
OS: Linux
GPU: Nvidia
CPU: AMD
Ollama version: 0.2.7
1 Answer
If you use curl or wget to test the endpoint, note that by default these tools send a GET, while /v1/chat/completions requires a POST.
$ curl -D - http://localhost:11434/v1/chat/completions
HTTP/1.1 404 Not Found
Content-Type: text/plain
Date: Sun, 21 Jul 2024 14:31:09 GMT
Content-Length: 18

404 page not found
$ curl -D - -X POST http://localhost:11434/v1/chat/completions
HTTP/1.1 400 Bad Request
Content-Type: application/json; charset=utf-8
Date: Sun, 21 Jul 2024 14:31:44 GMT
Content-Length: 83

{"error":{"message":"EOF","type":"invalid_request_error","param":null,"code":null}}
When you supply data (-d, --data), curl automatically switches to POST:
$ curl -D - http://localhost:11434/v1/chat/completions -d '{"model": "gemma2", "messages": [{"role": "user", "content": "hello"}], "stream": false}'
HTTP/1.1 200 OK
Content-Type: application/json
Date: Sun, 21 Jul 2024 14:36:54 GMT
Content-Length: 326

{"id":"chatcmpl-385","object":"chat.completion","created":1721572614,"model":"gemma2","system_fingerprint":"fp_ollama","choices":[{"index":0,"message":{"role":"assistant","content":"Hello! 👋\n\nHow can I help you today? 😊"},"finish_reason":"stop"}],"usage":{"prompt_tokens":10,"completion_tokens":13,"total_tokens":23}}
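The same GET-vs-POST behavior applies when calling the endpoint from code. As a minimal sketch with Python's standard library (assuming the same local Ollama server and the gemma2 model from the transcript above): urlopen() sends a POST as soon as a data argument is supplied, just like curl -d, while a bare urlopen(url) would send a GET and hit the same 404.

```python
import json
from urllib.request import Request, urlopen

# OpenAI-compatible chat payload, matching the curl example above.
payload = {
    "model": "gemma2",
    "messages": [{"role": "user", "content": "hello"}],
    "stream": False,
}

# Passing data= makes urllib send a POST (curl -d behaves the same way);
# without it the request would be a GET and return "404 page not found".
req = Request(
    "http://localhost:11434/v1/chat/completions",
    data=json.dumps(payload).encode(),
    headers={"Content-Type": "application/json"},
)

def chat():
    # Requires a running Ollama server with the gemma2 model pulled.
    with urlopen(req) as resp:
        return json.load(resp)["choices"][0]["message"]["content"]
```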