How can a WebSocket handle two ChatGPT requests at the same time?

tpgth1q7 · asked 12 months ago

I have a WebSocket consumer (in Django) that receives requests from a client (ReactJS). Each request calls the ChatGPT API and streams the response back to the client.

import asyncio
import json

import openai
from channels.generic.websocket import AsyncWebsocketConsumer


class GPTAPIConsumer(AsyncWebsocketConsumer):
    async def connect(self):
        await self.accept()

    async def receive(self, text_data):
        data = json.loads(text_data)
        print('Start', data['requestID'])
        asyncio.create_task(self.streamToClient(data['requestID']))

    async def streamToClient(self, requestID):
        completion = openai.ChatCompletion.create(...)
        content = ''
        for chunk in completion:
            if chunk['choices'][0].get('delta', {}).get('function_call'):
                chunkContent = chunk['choices'][0]['delta']['function_call']['arguments']
                if chunkContent is not None:
                    content += chunkContent
                    await self.send(text_data=json.dumps({'text': content}))
        print('End', requestID)

From the client I send two messages, with request IDs 1 and 2. On the server side, request 1 takes about 10 seconds to finish, so the log looks like this:

Start 1
End 1
Start 2
End 2


What I want is:

Start 1
Start 2
End 1 or End 2 (whichever finishes first)


Please help! Thanks!


quhf5bfb1#

I got it working by using the async OpenAI API call acreate:

# acreate is the async variant of create; with stream=True it returns an
# async iterator, so waiting for each chunk does not block the event loop.
completion = await openai.ChatCompletion.acreate(...)
content = ''
async for chunk in completion:
    ...

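For reference, a fuller version of streamToClient along these lines could look like the sketch below. This assumes the pre-1.0 openai SDK; the model and messages values are placeholders for whatever your real call passes, and stream=True is what makes the response arrive in chunks.

    async def streamToClient(self, requestID):
        # The awaitable acreate call plus the async iterator keep the event
        # loop free, so a second receive() can start before this one finishes.
        completion = await openai.ChatCompletion.acreate(
            model='gpt-3.5-turbo',                            # placeholder model
            messages=[{'role': 'user', 'content': 'Hello'}],  # placeholder prompt
            stream=True,
        )
        content = ''
        async for chunk in completion:
            delta = chunk['choices'][0].get('delta', {})
            # The original code reads delta['function_call']['arguments'];
            # plain chat completions put the text in delta['content'].
            chunkContent = delta.get('content')
            if chunkContent is not None:
                content += chunkContent
                await self.send(text_data=json.dumps({'text': content}))
        print('End', requestID)

With the loop no longer blocking, the existing asyncio.create_task call in receive is enough for both streams to make progress side by side.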


bjp0bcyl2#

The problem you're facing is that the streamToClient function waits for each request to finish before moving on to the next one. To process multiple requests concurrently and interleave the responses, you can use asyncio's gather to run the tasks concurrently. Here is an updated version of your code:

class GPTAPIConsumer(AsyncWebsocketConsumer):
    async def connect(self):
        await self.accept()

    async def receive(self, text_data):
        data = json.loads(text_data)
        print('Start', data['requestID'])
        asyncio.create_task(self.streamToClient(data['requestID']))

    async def streamToClient(self, requestID):
        # Use the async acreate call so the stream can be iterated with async for
        # without blocking the event loop.
        completion = await openai.ChatCompletion.acreate(...)
        content = ''
        async for chunk in completion:
            if chunk['choices'][0].get('delta', {}).get('function_call'):
                chunkContent = chunk['choices'][0]['delta']['function_call']['arguments']
                if chunkContent is not None:
                    content += chunkContent
                    await self.send(text_data=json.dumps({'text': content}))
        print('End', requestID)


# You can use asyncio.gather to run tasks concurrently
async def run_multiple_requests(request_ids):
    # Illustrative only: inside Channels each consumer instance is tied to a
    # client connection rather than being instantiated by hand like this.
    tasks = [GPTAPIConsumer().streamToClient(request_id) for request_id in request_ids]
    await asyncio.gather(*tasks)


# Example usage:
request_ids = [1, 2]  # List of request IDs
asyncio.run(run_multiple_requests(request_ids))

This code runs the streamToClient method for each request concurrently via asyncio.gather. It should let the responses interleave instead of waiting for one request to finish before processing the next. Adapt it to your Django setup and whatever components you need.
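If you want to check the interleaving from outside Django, a minimal test-client sketch might look like this. It assumes the websockets package and a hypothetical ws://localhost:8000/ws/gpt/ route; substitute the URL from your own Channels routing, and note the payload shape matches the consumer's data['requestID'] lookup.

import asyncio
import json

import websockets  # assumed dependency: pip install websockets


async def main():
    # Hypothetical URL; replace with your actual WebSocket route.
    async with websockets.connect('ws://localhost:8000/ws/gpt/') as ws:
        # Send both requests up front on the same connection...
        await ws.send(json.dumps({'requestID': 1}))
        await ws.send(json.dumps({'requestID': 2}))
        # ...then print whatever arrives. With non-blocking streaming the
        # chunks of the two responses should interleave rather than come
        # back-to-back.
        try:
            while True:
                message = await asyncio.wait_for(ws.recv(), timeout=15)
                print(message)
        except asyncio.TimeoutError:
            pass


asyncio.run(main())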
