llama_index [Bug]: Starter tutorial (local models) - model response is incorrect

2skhul33 · posted 2 months ago in Other

Bug Description

I followed the installation and setup steps described on the documentation page. Everything appears to be configured correctly, but when I run the starter tutorial code, the model does not seem to process the document ("paul_graham_essay.txt") or the request properly: instead of answering the question "What did the author do growing up?", it returns "Based on the context provided in the essay, the author did not directly mention what they did growing up. However, we can infer some information about their background from the text. [...]". Judging from the debug logs, the top two retrieved nodes are not actually the parts of the text most relevant to the question. What could be the problem?

Version

0.10.37

Steps to Reproduce

Install and set up as described here: https://docs.llamaindex.ai/en/stable/getting_started/installation/, then run the following code as "starter.py":

from llama_index.core import VectorStoreIndex, SimpleDirectoryReader, Settings
from llama_index.embeddings.ollama import OllamaEmbedding
from llama_index.llms.ollama import Ollama
import logging
import sys

# Verbose logging so chunking, embedding, and retrieval are all visible.
# Note: basicConfig already logs to stdout, so the extra StreamHandler
# makes every log line print twice.
logging.basicConfig(stream=sys.stdout, level=logging.DEBUG)
logging.getLogger().addHandler(logging.StreamHandler(stream=sys.stdout))

# Load the essay from ./data and use local Ollama models for both
# embeddings and generation.
documents = SimpleDirectoryReader("data").load_data()
Settings.embed_model = OllamaEmbedding(model_name="nomic-embed-text")
Settings.llm = Ollama(model="llama2", request_timeout=360.0)

index = VectorStoreIndex.from_documents(documents)
query_engine = index.as_query_engine()
response = query_engine.query("What did the author do growing up?")
print(response)
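To narrow down whether retrieval or generation is at fault, it can help to bypass the LLM and inspect what the retriever returns on its own. A minimal sketch against the public llama_index 0.10.x retriever API, assuming the same `index` object as in the script above:

```python
# Retrieve without synthesizing an answer, to inspect scores directly.
# Assumes `index` was built as in starter.py; requires a running Ollama server.
retriever = index.as_retriever(similarity_top_k=5)
for node_with_score in retriever.retrieve("What did the author do growing up?"):
    # .score is the similarity of this chunk's embedding to the query embedding.
    print(f"{node_with_score.score:.4f}", node_with_score.node.get_text()[:80])
```

If the chunk that actually describes the author's childhood never appears even at `similarity_top_k=5`, the problem is in the embedding/retrieval step rather than in the LLM.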

Relevant Logs/Traceback

PS C:\Users\matteucc\Desktop\Playground\LlamaIndex> & C:/Users/matteucc/AppData/Local/anaconda3/envs/llamaindex/python.exe c:/Users/matteucc/Desktop/Playground/LlamaIndex/starter.py
DEBUG:llama_index.core.readers.file.base:> [SimpleDirectoryReader] Total files added: 1
> [SimpleDirectoryReader] Total files added: 1
DEBUG:fsspec.local:open file: C:/Users/matteucc/Desktop/Playground/LlamaIndex/data/paul_graham_essay.txt
open file: C:/Users/matteucc/Desktop/Playground/LlamaIndex/data/paul_graham_essay.txt
DEBUG:llama_index.core.node_parser.node_utils:> Adding chunk: What I Worked On

February 2021

Before college...
DEBUG:llama_index.core.node_parser.node_utils:> Adding chunk: All that seemed left for philosophy were edge c...
DEBUG:llama_index.core.node_parser.node_utils:> Adding chunk: Its brokenness did, as so often happens, genera...
DEBUG:llama_index.core.node_parser.node_utils:> Adding chunk: If he even knew about the strange classes I was...
DEBUG:llama_index.core.node_parser.node_utils:> Adding chunk: The students and faculty in the painting depart...
DEBUG:llama_index.core.node_parser.node_utils:> Adding chunk: I wanted to go back to RISD, but I was now brok...
DEBUG:llama_index.core.node_parser.node_utils:> Adding chunk: But alas it was more like the Accademia than no...
DEBUG:llama_index.core.node_parser.node_utils:> Adding chunk: After I moved to New York I became her de facto...
DEBUG:llama_index.core.node_parser.node_utils:> Adding chunk: Now we felt like we were really onto something....
DEBUG:llama_index.core.node_parser.node_utils:> Adding chunk: In its time, the editor was one of the best gen...
DEBUG:llama_index.core.node_parser.node_utils:> Adding chunk: A company with just a handful of employees woul...
DEBUG:llama_index.core.node_parser.node_utils:> Adding chunk: I stuck it out for a few more months, then in d...
DEBUG:llama_index.core.node_parser.node_utils:> Adding chunk: But about halfway through the summer I realized...
DEBUG:llama_index.core.node_parser.node_utils:> Adding chunk: One of the most conspicuous patterns I've notic...
DEBUG:llama_index.core.node_parser.node_utils:> Adding chunk: Horrified at the prospect of having my inbox fl...
DEBUG:llama_index.core.node_parser.node_utils:> Adding chunk: We'd use the building I owned in Cambridge as o...
DEBUG:llama_index.core.node_parser.node_utils:> Adding chunk: It was originally meant to be a news aggregator...
DEBUG:llama_index.core.node_parser.node_utils:> Adding chunk: It had already eaten Arc, and was in the proces...
DEBUG:llama_index.core.node_parser.node_utils:> Adding chunk: Then in March 2015 I started working on Lisp ag...
DEBUG:llama_index.core.node_parser.node_utils:> Adding chunk: I remember taking the boys to the coast on a su...
DEBUG:llama_index.core.node_parser.node_utils:> Adding chunk: But when the software is an online store builde...
DEBUG:llama_index.core.node_parser.node_utils:> Adding chunk: [17] Another problem with HN was a bizarre edge...
DEBUG:urllib3.connectionpool:Starting new HTTP connection (1): localhost:11434
DEBUG:urllib3.connectionpool:http://localhost:11434 "POST /api/embeddings HTTP/1.1" 200 None
[... the same connection / "POST /api/embeddings" pair repeats for each remaining embedding request ...]
DEBUG:llama_index.core.indices.utils:> Top 2 nodes:
> [Node 396b67ba-f079-4d38-8b1a-6b71f4d57b90] [Similarity score:             0.269713] Now we felt like we were really onto something. I had visions of a whole new generation of softwa...
> [Node 7df44d30-6d9b-4d53-a070-b9acc411adcd] [Similarity score:             0.26883] It was originally meant to be a news aggregator for startup founders and was called Startup News,...
DEBUG:httpx:load_ssl_context verify=True cert=None trust_env=True http2=False
DEBUG:httpx:load_verify_locations cafile='C:\\Users\\matteucc\\AppData\\Local\\anaconda3\\envs\\llamaindex\\Library\\ssl\\cacert.pem'
DEBUG:httpcore.connection:connect_tcp.started host='localhost' port=11434 local_address=None timeout=360.0 socket_options=None
DEBUG:httpcore.connection:connect_tcp.complete return_value=<httpcore._backends.sync.SyncStream object at 0x0000021E27884680>
DEBUG:httpcore.http11:send_request_headers.started request=<Request [b'POST']>
DEBUG:httpcore.http11:send_request_headers.complete
DEBUG:httpcore.http11:send_request_body.started request=<Request [b'POST']>
DEBUG:httpcore.http11:send_request_body.complete
DEBUG:httpcore.http11:receive_response_headers.started request=<Request [b'POST']>
DEBUG:httpcore.http11:receive_response_headers.complete return_value=(b'HTTP/1.1', 200, b'OK', [(b'Content-Type', b'application/json; charset=utf-8'), (b'Date', b'Thu, 16 May 2024 20:17:14 GMT'), (b'Content-Length', b'986')])
INFO:httpx:HTTP Request: POST http://localhost:11434/api/chat "HTTP/1.1 200 OK"
DEBUG:httpcore.http11:receive_response_body.started request=<Request [b'POST']>
DEBUG:httpcore.http11:receive_response_body.complete
DEBUG:httpcore.http11:response_closed.started
DEBUG:httpcore.http11:response_closed.complete
DEBUG:httpcore.connection:close.started
DEBUG:httpcore.connection:close.complete
Based on the context provided in the essay, the author did not directly mention what they did growing up. However, we can infer some information about their background from the text.

The author mentions that they studied art, which suggests that they may have had an interest in the arts from a young age. Additionally, the author's reference to Interleaf and their experience working as a consultant for the company implies that they may have had some exposure to programming or computer science during their formative years.

However, without any direct statements from the author about their childhood experiences or interests, we cannot provide a definitive answer to the query.
jc3wubiy #1

The irrelevant text chunks you are seeing LlamaIndex retrieve and respond with may come down to a few key factors in the CitationQueryEngine configuration. Here are some targeted adjustments that can improve response relevance:

  1. Adjust citation chunk size and overlap: fine-tune the citation_chunk_size and citation_chunk_overlap parameters. These control the granularity of text chunks, which affects which passages are used to answer the query.
  2. Experiment with text-splitting methods: the default SentenceSplitter does not always produce the most relevant chunks. Trying a different splitting strategy, or adjusting the current one's parameters, may yield better results.
  3. Review node postprocessors: make sure node_postprocessors filter and refine nodes correctly. This step is essential for excluding irrelevant information from response synthesis.
  4. Tune the metadata mode: the metadata_mode parameter affects how context is presented to the LLM. Adjusting it can help supply the right context for more accurate responses.
  5. Configure the response synthesizer appropriately: the response_synthesizer's logic and configuration play a major role in producing the final answer. Aligning it well with the query type can improve answer relevance.

By reviewing and, where needed, adjusting these aspects of the CitationQueryEngine, you should be able to improve the relevance of the responses LlamaIndex generates for your queries.
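Note that the starter script uses the plain `as_query_engine()` rather than `CitationQueryEngine`, so the analogous knobs in that path are the global chunking settings. A hedged sketch of tightening them (the parameter values here are illustrative, not recommendations):

```python
# Rebuild the index with smaller, overlapping chunks so each embedding
# covers a narrower topic. Values below are illustrative only.
from llama_index.core import Settings, VectorStoreIndex, SimpleDirectoryReader

Settings.chunk_size = 512     # default is 1024 in llama_index 0.10.x
Settings.chunk_overlap = 64   # some overlap preserves context across cuts

documents = SimpleDirectoryReader("data").load_data()
index = VectorStoreIndex.from_documents(documents)
```

Smaller chunks make each embedding more topically focused, which can help short factual queries at the cost of more embedding requests.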
z9smfwbn #2

For me this code works fine; I ran it several times and the results were consistent.

Code:

(code attachment not captured in this copy of the thread)

Output:

(output attachment not captured in this copy of the thread)

Maybe make sure you have the latest version of the Ollama server installed, as well as the latest versions of the models?

(two further attachments not captured in this copy of the thread)
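Checking which server version and models are actually installed can be done against Ollama's local HTTP API. A stdlib-only sketch, assuming the server is running on its default port and using the /api/version and /api/tags endpoints from the Ollama API:

```python
# Query the local Ollama server for its version and installed models.
# Assumes a server on the default port; endpoints per the Ollama HTTP API.
import json
import urllib.request

def ollama_info(base_url="http://localhost:11434"):
    with urllib.request.urlopen(f"{base_url}/api/version") as resp:
        version = json.load(resp)["version"]
    with urllib.request.urlopen(f"{base_url}/api/tags") as resp:
        models = [m["name"] for m in json.load(resp)["models"]]
    return version, models

# version, models = ollama_info()
# Check that "nomic-embed-text:latest" and the chat model appear in `models`.
```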

xwbd5t1u #3

Yes, I pulled llama3 and nomic-embed-text from Ollama (just reinstalled) as described.
After running the code snippet I get the following result:
"c:/Users/matteucc/Desktop/Playground/LlamaIndex/data/newtest.py
Based on the context information provided, there is no mention of the author's childhood or upbringing. The text only discusses the author's experiences and decisions as an adult, particularly regarding his career and startup ventures. Therefore, the question cannot be answered from the given context.
0.26971341318867764
0.268829948067337
Now we felt like we were really onto something. I had visions of a whole new generation of software.
It was originally meant to be a news aggregator for startup founders and was called Startup News, but"
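The two numbers printed there (~0.2697 and ~0.2688) are the similarity scores of the top retrieved chunks; LlamaIndex's default in-memory vector store scores by cosine similarity, and values that low for the *best* hits are a sign the embedding space is not separating relevant from irrelevant chunks. For reference, the standard cosine-similarity formula in plain Python (this is the textbook definition, not LlamaIndex's internal code):

```python
import math

def cosine_similarity(a, b):
    """Standard cosine similarity between two equal-length vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)

print(cosine_similarity([1.0, 0.0], [1.0, 0.0]))  # same direction -> 1.0
print(cosine_similarity([1.0, 0.0], [0.0, 1.0]))  # orthogonal -> 0.0
```

With a well-behaved embedding model, the chunk that actually answers a query usually scores well above unrelated chunks, rather than everything clustering around 0.27.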

nxagd54h #4

That's really quite strange. I see you're on Windows; I wonder whether this is the combination of Ollama and Windows.

oknrviil #5

Strange indeed. I'll retry on another computer and/or with gpt-3.5-turbo to see whether I get the same problem.

xiozqbni #6

Hi! A small update on this issue. I tried gpt-3.5-turbo and it works correctly. I also tried LangChain + Ollama and ran into the same problem. So I don't think the problem lies in the language model itself, but in the similarity search over the vector store: in both cases the retrieved documents do not always match the query, and it seems that not all parts of the document can be retrieved. In fact, with LangChain, if I switch the embedding generator from Ollama's to OpenAI's while keeping Ollama as the LLM, the problem goes away entirely (I haven't tried the same thing in LlamaIndex yet).
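The equivalent experiment in LlamaIndex, keeping Ollama as the LLM but swapping only the embedding model, might look roughly like this (a sketch: OpenAIEmbedding comes from the llama-index-embeddings-openai package and requires an OPENAI_API_KEY in the environment):

```python
# Swap only the embedding model; the LLM stays local.
# Requires `pip install llama-index-embeddings-openai` and OPENAI_API_KEY set.
from llama_index.core import Settings, VectorStoreIndex, SimpleDirectoryReader
from llama_index.embeddings.openai import OpenAIEmbedding
from llama_index.llms.ollama import Ollama

Settings.embed_model = OpenAIEmbedding()  # hosted embeddings instead of Ollama's
Settings.llm = Ollama(model="llama2", request_timeout=360.0)

index = VectorStoreIndex.from_documents(SimpleDirectoryReader("data").load_data())
print(index.as_query_engine().query("What did the author do growing up?"))
```

If this fixes retrieval while everything else stays the same, it would confirm the issue is isolated to the local embedding step, matching the LangChain result above.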
