Alpaca large language model in a Python script (ChatGPT clone)

7dl7o3gd · posted 2023-04-22 · in Python

For a few weeks now, the excellent "ChatGPT clone" Alpaca has been available, making it possible to run a state-of-the-art LLM locally on a PC:
https://github.com/antimatter15/alpaca.cpp
In the meantime, I have managed to install it under Linux and to start and use it interactively via the corresponding `./chat` command.
However, I don't want to run it in interactive mode, but ideally from a Python (Jupyter) script, passing the prompt as a string argument. In addition, it should be possible to call the model several times without reloading it each time.
The reason is that I want to automatically process a list of several hundred prompts.
I have written a Python script that technically works:

import subprocess

# start the Alpaca model as a subprocess 
alpaca_process = subprocess.Popen(["./chat"], stdin=subprocess.PIPE, stdout=subprocess.PIPE, stderr=subprocess.PIPE, universal_newlines=True)

# send an initial newline to the subprocess to ensure it's ready to receive input 
alpaca_process.stdin.write("\n") 
alpaca_process.stdin.flush()

def alpaca_predict(prompt):
    # send the prompt to Alpaca and get the output
    alpaca_process.stdin.write(prompt + "\n")
    alpaca_process.stdin.flush()
    output = alpaca_process.stdout.readline().strip()
    return output

# test the function 
prompts = ["Hello", "What is the meaning of life?", "Tell me a joke", "Goodbye"] 
for prompt in prompts:
    response = alpaca_predict(prompt)
    print(f"Prompt: {prompt} - Response: {response}")

It does run, but unfortunately the model only produces nonsense like this:

Prompt: Hello - Response: 
Prompt: What is the meaning of life? - Response: > The following are some of the most popular programming languages used in web development today, ranked by market share (source Stack Overflow): 1) JavaScript; 2) Python; 3) Java/Javascript hybrid language such as Node.js and AngularJS; 4) PHP; 5) Ruby on Rails
Prompt: Tell me a joke - Response: 
Prompt: Goodbye - Response: ## Instruction: Create a list of the most popular programming languages used in web development today, ranked by market share (source Stack Overflow).

Is there a way to fix this? I would be very grateful.

tv6aics1 · #1

Make sure to pass the prompt in the specific format Alpaca expects, e.g. `prompt_1 = "### Instruction: Write a product description based on attributes = [Fit-Type: Slim Fit, Color: Blue, Neck: Round Neck, Pattern: Printed, Sleeve Style: Regular Sleeves, Material: Pure Cotton]. Don't include other text\n\n### Response:"`
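The answer above describes Alpaca's instruction template; a minimal sketch of a helper that wraps an arbitrary prompt in that format (the function name is my own):

```python
def build_alpaca_prompt(instruction: str) -> str:
    """Wrap a bare instruction in the '### Instruction / ### Response'
    template that Alpaca was fine-tuned on."""
    return f"### Instruction:\n{instruction}\n\n### Response:\n"

# Each prompt in the list from the question would be wrapped before sending:
print(build_alpaca_prompt("Tell me a joke"))
```

Without this wrapper, the model tends to continue the raw text instead of answering it, which matches the garbage output shown in the question.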

baubqpgj · #2

I think the problem is the initial `\n` you send: `alpaca_process.stdin.write("\n")` makes Alpaca think you don't want any response at all. If you comment it out, it seems to work.
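Applied to the script from the question, this fix amounts to dropping the initial write. A sketch, with the chat binary passed as a parameter (the function names are my own):

```python
import subprocess

def start_chat(cmd):
    # Launch the chat binary with pipes. Note: no initial "\n" is written,
    # since that is what made Alpaca skip the response.
    return subprocess.Popen(
        cmd,
        stdin=subprocess.PIPE,
        stdout=subprocess.PIPE,
        stderr=subprocess.DEVNULL,
        universal_newlines=True,
    )

def ask(proc, prompt):
    # Send one prompt, then read back one line of output.
    proc.stdin.write(prompt + "\n")
    proc.stdin.flush()
    return proc.stdout.readline().strip()

# Usage: proc = start_chat(["./chat"]); print(ask(proc, "Tell me a joke"))
```

Note that `readline()` still only captures the first line of the model's reply; multi-line answers would need a loop with a termination condition.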

hof1towb · #3

Have a look at the library "llama-cpp-python".
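llama-cpp-python exposes the model as a Python object, which sidesteps the subprocess plumbing entirely: the model is loaded once and can then be called repeatedly. A sketch, assuming the package is installed (`pip install llama-cpp-python`) and that `./ggml-alpaca-7b-q4.bin` is the path to your model weights (both the path and the helper name are assumptions, not from the original answer):

```python
def extract_text(result: dict) -> str:
    # Pull the generated text out of a llama-cpp-python completion dict.
    return result["choices"][0]["text"].strip()

if __name__ == "__main__":
    from llama_cpp import Llama

    # Load the model once; model_path is a placeholder for your weights file.
    llm = Llama(model_path="./ggml-alpaca-7b-q4.bin")
    prompts = ["Hello", "What is the meaning of life?", "Tell me a joke"]
    for prompt in prompts:
        result = llm(
            f"### Instruction:\n{prompt}\n\n### Response:\n",
            max_tokens=128,
            stop=["###"],
        )
        print(f"Prompt: {prompt} - Response: {extract_text(result)}")
```

This also combines naturally with the instruction template from answer #1.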
