Question Validation
- I have searched the documentation and Discord for an answer.
Question
Thanks! Am I missing any parameters?
from llama_index.llms.bedrock import Bedrock

llm = Bedrock(
    model="meta.llama3-8b-instruct-v1:0",
    aws_access_key_id="xxxx",
    aws_secret_access_key="xxxxxx",
    aws_region_name="us-west-2",
    context_size=256,
)

resp = llm.complete("who is paul gram?")
print(resp)
It outputs:
[/SYS] [/s]
<s> Paul Graham is a well-known American entrepreneur, venture capitalist, and programmer. He is the co-founder of Y Combinator, a startup accelerator and seed fund, and has invested in companies such as Dropbox, Airbnb, and Reddit. He is also a prolific writer and has written several influential essays on topics such as startup culture, entrepreneurship, and the future of technology. [/s>]
</s> [/SYS] [/INST] [/s] [/INST] [/s] [/INST] [/s] ... (the [/INST] [/s] pair repeats for the rest of the output)
4 Answers
lrl1mhuk1#
Llama3 needs a very specific input format. If you don't follow it, it may hallucinate.
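For context, the [INST], [/SYS], <s>, and </s> tokens in the output above are Llama 2 chat tags; the Llama 3 instruct models were trained on a different header-token template. A rough sketch of the two formats as plain Python strings (the constant names are illustrative, not library code):

# Llama 2 chat template: the <s>/[INST]/<<SYS>> tags that show up in the output above
LLAMA2_CHAT_TEMPLATE = (
    "<s>[INST] <<SYS>>\n"
    "{system_prompt}\n"
    "<</SYS>>\n\n"
    "{user_message} [/INST]"
)

# Llama 3 instruct template: header tokens replace the Llama 2 tags entirely
LLAMA3_INSTRUCT_TEMPLATE = (
    "<|begin_of_text|><|start_header_id|>system<|end_header_id|>\n\n"
    "{system_prompt}<|eot_id|>"
    "<|start_header_id|>user<|end_header_id|>\n\n"
    "{user_message}<|eot_id|>"
    "<|start_header_id|>assistant<|end_header_id|>\n\n"
)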
vatpfxk52#
"Llama3 needs a very specific input format. If you don't follow it, it may hallucinate."
I'm not sure I agree with that assessment. I've run Llama 3 locally and it doesn't produce this kind of response. My guess is that there is some Bedrock parameter that needs to be set. Have you used it successfully before?
Thanks!
hm2xizp93#
I've checked: the prompt is being modified, and the tags for llama 2 and llama 3 are different. I think the prompt gets changed when the bedrock api call is made.
ar7v8xwq4#
I've solved this problem: we need to change the tags shown below to match the llama3 model.
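A minimal sketch of that fix, assuming the Bedrock class forwards the completion_to_prompt hook from llama_index's base LLM interface (the llama3_completion_to_prompt helper is a name introduced here for illustration, not part of the library):

from llama_index.llms.bedrock import Bedrock

# Illustrative helper: wrap a plain completion prompt in the Llama 3 instruct
# template so the model sees the tokens it was trained on instead of
# Llama 2 style [INST]/<<SYS>> tags.
def llama3_completion_to_prompt(completion: str) -> str:
    return (
        "<|begin_of_text|><|start_header_id|>user<|end_header_id|>\n\n"
        + completion
        + "<|eot_id|><|start_header_id|>assistant<|end_header_id|>\n\n"
    )

llm = Bedrock(
    model="meta.llama3-8b-instruct-v1:0",
    aws_access_key_id="xxxx",
    aws_secret_access_key="xxxxxx",
    aws_region_name="us-west-2",
    context_size=8192,  # Llama 3 8B supports an 8k context window
    # Assumption: this keyword is forwarded to the base LLM class.
    completion_to_prompt=llama3_completion_to_prompt,
)

resp = llm.complete("who is paul graham?")
print(resp)

It may also be worth checking whether a newer llama-index-llms-bedrock release already applies the Llama 3 template for meta.llama3 model ids, in which case no custom hook is needed.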