What happened?
While trying to create a new brain, the following error appears, more precisely at the last step (after uploading a file and clicking create brain).
I followed the documentation exactly and used the following command to start the application (the last step), and I also tried the docker compose up --build
command, but the error always appears (log output below).
I am using Docker Desktop.
I hope you can help me solve this. Also, is it possible to use llama 2 locally on my Windows machine? I have OpenAI configured, but I would also like to be able to use a local LLM. Thanks.
Relevant log output:
backend-core | 2024-02-11 19:36:57,606 [ERROR] models.databases.supabase.user_usage [162]: {'code': 'XX000', 'details': None, 'hint': None, 'message': 'called `Result::unwrap()` on an `Err` value: InvalidPosition'}
backend-core | 2024-02-11 19:36:57,610 [ERROR] models.databases.supabase.user_usage [163]: Error while checking if user is a premium user. Stripe needs to be configured.
backend-core | 2024-02-11 19:36:57,612 [ERROR] models.databases.supabase.user_usage [166]: {'code': 'XX000', 'details': None, 'hint': None, 'message': 'called `Result::unwrap()` on an `Err` value: InvalidPosition'}
backend-core | 2024-02-11 19:36:58,048:INFO - HTTP Request: POST https://ovbvcnwemowuuuaebizd.supabase.co/functions/v1/telemetry "HTTP/1.1 200 OK"
backend-core | INFO: 172.19.0.1:34340 - "POST /brains/ HTTP/1.1" 500 Internal Server Error
backend-core | ERROR: Exception in ASGI application
backend-core | Traceback (most recent call last):
backend-core | File "/usr/local/lib/python3.11/site-packages/urllib3/connection.py", line 203, in _new_conn
backend-core | sock = connection.create_connection(
backend-core | ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
backend-core | File "/usr/local/lib/python3.11/site-packages/urllib3/util/connection.py", line 85, in create_connection
backend-core | raise err
backend-core | File "/usr/local/lib/python3.11/site-packages/urllib3/util/connection.py", line 73, in create_connection
backend-core | sock.connect(sa)
backend-core | ConnectionRefusedError: [Errno 111] Connection refused
backend-core |
backend-core | The above exception was the direct cause of the following exception:
backend-core |
backend-core | Traceback (most recent call last):
backend-core | File "/usr/local/lib/python3.11/site-packages/urllib3/connectionpool.py", line 791, in urlopen
backend-core | response = self._make_request(
backend-core | ^^^^^^^^^^^^^^^^^^^
backend-core | File "/usr/local/lib/python3.11/site-packages/urllib3/connectionpool.py", line 497, in _make_request
backend-core | conn.request(
backend-core | File "/usr/local/lib/python3.11/site-packages/urllib3/connection.py", line 395, in request
backend-core | self.endheaders()
backend-core | File "/usr/local/lib/python3.11/http/client.py", line 1281, in endheaders
backend-core | self._send_output(message_body, encode_chunked=encode_chunked)
backend-core | File "/usr/local/lib/python3.11/http/client.py", line 1041, in _send_output
backend-core | self.send(msg)
backend-core | File "/usr/local/lib/python3.11/http/client.py", line 979, in send
backend-core | self.connect()
backend-core | File "/usr/local/lib/python3.11/site-packages/urllib3/connection.py", line 243, in connect
backend-core | self.sock = self._new_conn()
backend-core | ^^^^^^^^^^^^^^^^
backend-core | File "/usr/local/lib/python3.11/site-packages/urllib3/connection.py", line 218, in _new_conn
backend-core | raise NewConnectionError(
backend-core | urllib3.exceptions.NewConnectionError: <urllib3.connection.HTTPConnection object at 0x7f6709c9c290>: Failed to establish a new connection: [Errno 111] Connection refused
backend-core |
backend-core | The above exception was the direct cause of the following exception:
backend-core |
backend-core | Traceback (most recent call last):
backend-core | File "/usr/local/lib/python3.11/site-packages/requests/adapters.py", line 486, in send
backend-core | resp = conn.urlopen(
backend-core | ^^^^^^^^^^^^^
backend-core | File "/usr/local/lib/python3.11/site-packages/urllib3/connectionpool.py", line 845, in urlopen
backend-core | retries = retries.increment(
backend-core | ^^^^^^^^^^^^^^^^^^
backend-core | File "/usr/local/lib/python3.11/site-packages/urllib3/util/retry.py", line 515, in increment
backend-core | raise MaxRetryError(_pool, url, reason) from reason # type: ignore[arg-type]
backend-core | ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
backend-core | urllib3.exceptions.MaxRetryError: HTTPConnectionPool(host='host.docker.internal', port=11434): Max retries exceeded with url: /api/embeddings (Caused by NewConnectionError('<urllib3.connection.HTTPConnection object at 0x7f6709c9c290>: Failed to establish a new connection: [Errno 111] Connection refused'))
backend-core |
backend-core | During handling of the above exception, another exception occurred:
backend-core |
backend-core | Traceback (most recent call last):
backend-core | File "/usr/local/lib/python3.11/site-packages/langchain_community/embeddings/ollama.py", line 157, in _process_emb_response
backend-core | res = requests.post(
backend-core | ^^^^^^^^^^^^^^
backend-core | File "/usr/local/lib/python3.11/site-packages/requests/api.py", line 115, in post
backend-core | return request("post", url, data=data, json=json, **kwargs)
backend-core | ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
backend-core | File "/usr/local/lib/python3.11/site-packages/requests/api.py", line 59, in request
backend-core | return session.request(method=method, url=url, **kwargs)
backend-core | ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
backend-core | File "/usr/local/lib/python3.11/site-packages/requests/sessions.py", line 589, in request
backend-core | resp = self.send(prep, **send_kwargs)
backend-core | ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
backend-core | File "/usr/local/lib/python3.11/site-packages/requests/sessions.py", line 703, in send
backend-core | r = adapter.send(request, **kwargs)
backend-core | ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
backend-core | File "/usr/local/lib/python3.11/site-packages/requests/adapters.py", line 519, in send
backend-core | raise ConnectionError(e, request=request)
backend-core | requests.exceptions.ConnectionError: HTTPConnectionPool(host='host.docker.internal', port=11434): Max retries exceeded with url: /api/embeddings (Caused by NewConnectionError('<urllib3.connection.HTTPConnection object at 0x7f6709c9c290>: Failed to establish a new connection: [Errno 111] Connection refused'))
backend-core |
backend-core | During handling of the above exception, another exception occurred:
backend-core |
backend-core | Traceback (most recent call last):
backend-core | File "/usr/local/lib/python3.11/site-packages/uvicorn/protocols/http/h11_impl.py", line 428, in run_asgi
backend-core | result = await app( # type: ignore[func-returns-value]
backend-core | ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
backend-core | File "/usr/local/lib/python3.11/site-packages/uvicorn/middleware/proxy_headers.py", line 78, in __call__
backend-core | return await self.app(scope, receive, send)
backend-core | ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
backend-core | File "/usr/local/lib/python3.11/site-packages/fastapi/applications.py", line 276, in __call__
backend-core | await super().__call__(scope, receive, send)
backend-core | File "/usr/local/lib/python3.11/site-packages/starlette/applications.py", line 122, in __call__
backend-core | await self.middleware_stack(scope, receive, send)
backend-core | File "/usr/local/lib/python3.11/site-packages/starlette/middleware/errors.py", line 184, in __call__
backend-core | raise exc
backend-core | File "/usr/local/lib/python3.11/site-packages/starlette/middleware/errors.py", line 162, in __call__
backend-core | await self.app(scope, receive, _send)
backend-core | File "/usr/local/lib/python3.11/site-packages/starlette/middleware/cors.py", line 91, in __call__
backend-core | await self.simple_response(scope, receive, send, request_headers=headers)
backend-core | File "/usr/local/lib/python3.11/site-packages/starlette/middleware/cors.py", line 146, in simple_response
backend-core | await self.app(scope, receive, send)
backend-core | File "/usr/local/lib/python3.11/site-packages/starlette/middleware/exceptions.py", line 79, in __call__
backend-core | raise exc
backend-core | File "/usr/local/lib/python3.11/site-packages/starlette/middleware/exceptions.py", line 68, in __call__
backend-core | await self.app(scope, receive, sender)
backend-core | File "/usr/local/lib/python3.11/site-packages/fastapi/middleware/asyncexitstack.py", line 21, in __call__
backend-core | raise e
backend-core | File "/usr/local/lib/python3.11/site-packages/fastapi/middleware/asyncexitstack.py", line 18, in __call__
backend-core | await self.app(scope, receive, send)
backend-core | File "/usr/local/lib/python3.11/site-packages/starlette/routing.py", line 718, in __call__
backend-core | await route.handle(scope, receive, send)
backend-core | File "/usr/local/lib/python3.11/site-packages/starlette/routing.py", line 276, in handle
backend-core | await self.app(scope, receive, send)
backend-core | File "/usr/local/lib/python3.11/site-packages/starlette/routing.py", line 66, in app
backend-core | response = await func(request)
backend-core | ^^^^^^^^^^^^^^^^^^^
backend-core | File "/usr/local/lib/python3.11/site-packages/fastapi/routing.py", line 237, in app
backend-core | raw_response = await run_endpoint_function(
backend-core | ^^^^^^^^^^^^^^^^^^^^^^^^^^^^
backend-core | File "/usr/local/lib/python3.11/site-packages/fastapi/routing.py", line 163, in run_endpoint_function
backend-core | return await dependant.call(**values)
backend-core | ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
backend-core | File "/code/modules/brain/controller/brain_routes.py", line 110, in create_new_brain
backend-core | new_brain = brain_service.create_brain(
backend-core | ^^^^^^^^^^^^^^^^^^^^^^^^^^^
backend-core | File "/code/modules/brain/service/brain_service.py", line 144, in create_brain
backend-core | created_brain = self.brain_repository.create_brain(brain)
backend-core | ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
backend-core | File "/code/modules/brain/repository/brains.py", line 20, in create_brain
backend-core | brain_meaning = embeddings.embed_query(string_to_embed)
backend-core | ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
backend-core | File "/usr/local/lib/python3.11/site-packages/langchain_community/embeddings/ollama.py", line 217, in embed_query
backend-core | embedding = self._embed([instruction_pair])[0]
backend-core | ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
backend-core | File "/usr/local/lib/python3.11/site-packages/langchain_community/embeddings/ollama.py", line 192, in _embed
backend-core | return [self._process_emb_response(prompt) for prompt in iter_]
backend-core | ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
backend-core | File "/usr/local/lib/python3.11/site-packages/langchain_community/embeddings/ollama.py", line 192, in <listcomp>
backend-core | return [self._process_emb_response(prompt) for prompt in iter_]
backend-core | ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
backend-core | File "/usr/local/lib/python3.11/site-packages/langchain_community/embeddings/ollama.py", line 163, in _process_emb_response
backend-core | raise ValueError(f"Error raised by inference endpoint: {e}")
backend-core | ValueError: Error raised by inference endpoint: HTTPConnectionPool(host='host.docker.internal', port=11434): Max retries exceeded with url: /api/embeddings (Caused by NewConnectionError('<urllib3.connection.HTTPConnection object at 0x7f6709c9c290>: Failed to establish a new connection: [Errno 111] Connection refused'))
backend-core | INFO: 127.0.0.1:45962 - "GET /healthz HTTP/1.1" 200 OK
Twitter/LinkedIn details: No response
4 answers
z9smfwbn1#
At the moment Ollama only supports Linux and Mac, but it can be installed through WSL.
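For anyone taking the WSL route, a rough sketch of the install (the script URL is Ollama's published installer and llama2 is just an example model tag; check the current docs before relying on either):
$ # inside a WSL Ubuntu shell
$ curl -fsSL https://ollama.com/install.sh | sh
$ ollama pull llama2
$ ollama list   # confirm the model is available locally
After that, ollama serve (or the systemd service the installer sets up) should be listening on port 11434 inside WSL.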
aoyhnmkz2#
@jessedegans OK. Have you tried getting quivr running on Windows? Thanks.
deyfvvtc3#
Thank you for your contribution. We are closing this issue because it has become stale. Feel free to reopen it if you would like to continue the discussion.
yptwkmov4#
Adding the bolded line below (the OLLAMA_HOST environment variable) resolved my issue:
$ cat /etc/systemd/system/ollama.service
[Unit]
Description=Ollama Service
After=network-online.target
[Service]
ExecStart=/usr/local/bin/ollama serve
User=ollama
Group=ollama
Restart=always
RestartSec=3
Environment="PATH=/home/sandbox/miniconda3/condabin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/usr/games:/usr/local/games:/usr/lib/wsl/lib:/mnt/c/Windows/system32:/mnt/c/Windows:/mnt/c/Windows/System32/Wbem:/mnt/c/Windows/System32/WindowsPowerShell/v1.0/:/mnt/c/Windows/System32/OpenSSH/:/mnt/c/Program Files/NVIDIA Corporation/NVIDIA NvDLISR:/mnt/c/Program Files (x86)/NVIDIA Corporation/PhysX/Common:/mnt/c/Users/shanb/AppData/Local/Microsoft/WindowsApps:/mnt/e/VSCode/bin:/snap/bin"
Environment="OLLAMA_HOST=0.0.0.0"
[Install]
WantedBy=default.target
Then restart the ollama service.
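A minimal sketch of that restart using standard systemd commands (assuming systemd is enabled in your WSL distro; the curl check simply hits Ollama's default port):
$ sudo systemctl daemon-reload
$ sudo systemctl restart ollama
$ curl http://localhost:11434   # should reply "Ollama is running"
If the backend container still cannot connect, re-check that the OLLAMA_HOST=0.0.0.0 setting took effect.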