
Error when calling the RESTful API #2720

Open
zhangwonderful opened this issue Dec 29, 2024 · 1 comment
zhangwonderful commented Dec 29, 2024

System Info

Windows 11, transformers 4.46.3, CUDA 12.4, Python 3.11.0

Running Xinference with Docker?

  • docker
  • pip install
  • installation from source

Version info

v1.1.1

The command used to start Xinference

xinference-local -H 192.168.3.7 --port 7861 --auth-config auth.json

Reproduction

1. Start the yi-coder-chat model.
2. Write client code that calls the service through RESTfulClient. The client code:

import json
from typing import List

from xinference.client import RESTfulClient
from xinference.client.restful.restful_client import RESTfulChatModelHandle

base_url = "http://192.168.3.7:7861"  # the server started with the command above


def ask_local_ai(prompt):
    """
    Use the local API to turn a natural-language request into a SQL query.
    """
    print("Natural-language prompt:", prompt)
    # headers = {"Content-Type": "application/json"}
    client = RESTfulClient(base_url)
    client._set_token("sk-35tkasdyGLYMy")
    model = client.get_model("yi-coder-chat")
    chat_history: List["ChatCompletionMessage"] = []
    assert isinstance(model, RESTfulChatModelHandle)
    messages = [dict] #to_chat(flatten(history))
    messages.append(dict(role="user", content=prompt))
    response_content = ""
    data = [dict(role="user", content=prompt)]
    json_string = json.dumps(data)
    response = model.chat(
        messages=data,
        generate_config={
            "max_tokens": 1024,
            "temperature": 0.7,
            "stream": False,
        },
    )
    return response["choices"][0]["message"]

3. Write the prompt:

prompt = "Please generate a SQL query for this request: summarize and rank each salesperson's total contract amount for 2024. The database has two tables, 'orders' and 'products'. The orders table has id (order id), sales_person (salesperson), prodcut_id (product id), contract_amount (contract amount), and contract_date (contract date); the products table has product_id (product id) and product_name (product name). Generate the SQL statement for the requirement above; the result must be an executable SQL statement and must not contain any other characters."

4. Running the program and calling ask_local_ai raises an error. The full error output:

2024-12-29 21:26:52,736 xinference.api.restful_api 31152 ERROR    [address=192.168.3.7:63101, pid=30092] can only concatenate str (not "list") to str
Traceback (most recent call last):
  File "D:\envs\hss-inference\Lib\site-packages\xinference\api\restful_api.py", line 2098, in create_chat_completion
    data = await model.chat(
           ^^^^^^^^^^^^^^^^^
  File "D:\envs\hss-inference\Lib\site-packages\xoscar\backends\context.py", line 231, in send
    return self._process_result_message(result)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "D:\envs\hss-inference\Lib\site-packages\xoscar\backends\context.py", line 102, in _process_result_message
    raise message.as_instanceof_cause()
  File "D:\envs\hss-inference\Lib\site-packages\xoscar\backends\pool.py", line 667, in send
    result = await self._run_coro(message.message_id, coro)
    ^^^^^^^^^^^^^^^^^
  File "D:\envs\hss-inference\Lib\site-packages\xoscar\backends\pool.py", line 370, in _run_coro
    return await coro
  File "D:\envs\hss-inference\Lib\site-packages\xoscar\api.py", line 384, in __on_receive__
    return await super().__on_receive__(message)  # type: ignore
    ^^^^^^^^^^^^^^^^^
  File "xoscar\\core.pyx", line 558, in __on_receive__
    raise ex
  File "xoscar\\core.pyx", line 520, in xoscar.core._BaseActor.__on_receive__
    async with self._lock:
    ^^^^^^^^^^^^^^^^^
  File "xoscar\\core.pyx", line 521, in xoscar.core._BaseActor.__on_receive__
    with debug_async_timeout('actor_lock_timeout',
    ^^^^^^^^^^^^^^^^^
  File "xoscar\\core.pyx", line 526, in xoscar.core._BaseActor.__on_receive__
    result = await result
    ^^^^^^^^^^^^^^^^^
  File "D:\envs\hss-inference\Lib\site-packages\xinference\core\model.py", line 102, in wrapped_func
    ret = await fn(self, *args, **kwargs)
    ^^^^^^^^^^^^^^^^^
  File "D:\envs\hss-inference\Lib\site-packages\xoscar\api.py", line 462, in _wrapper
    r = await func(self, *args, **kwargs)
    ^^^^^^^^^^^^^^^^^
  File "D:\envs\hss-inference\Lib\site-packages\xinference\core\utils.py", line 90, in wrapped
    ret = await func(*args, **kwargs)
    ^^^^^^^^^^^^^^^^^
  File "D:\envs\hss-inference\Lib\site-packages\xinference\core\model.py", line 739, in chat
    return await self.handle_batching_request(
    ^^^^^^^^^^^^^^^^^
  File "D:\envs\hss-inference\Lib\site-packages\xinference\core\model.py", line 721, in handle_batching_request
    result = await fut
    ^^^^^^^^^^^^^^^^^
ValueError: [address=192.168.3.7:63101, pid=30092] can only concatenate str (not "list") to str

Expected behavior

The program should run normally and return a SQL statement instead of raising the error: can only concatenate str (not "list") to str

@XprobeBot XprobeBot added the gpu label Dec 29, 2024
@XprobeBot XprobeBot added this to the v1.x milestone Dec 29, 2024
qinxuye (Contributor) commented Jan 3, 2025

messages = [dict] #to_chat(flatten(history))

Is this line the problem? `dict` is a type, so why is it being appended to `messages`?
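For reference, this is roughly how the messages list is normally built for an OpenAI-style chat call. It is a minimal, untested sketch that reuses the client setup and API key from the report and assumes `prompt` is a plain string; it is not a confirmed fix for this issue.

```python
from xinference.client import RESTfulClient
from xinference.client.restful.restful_client import RESTfulChatModelHandle


def ask_local_ai(prompt, base_url="http://192.168.3.7:7861"):
    client = RESTfulClient(base_url)
    client._set_token("sk-35tkasdyGLYMy")  # API key from the report
    model = client.get_model("yi-coder-chat")
    assert isinstance(model, RESTfulChatModelHandle)

    # Each message is a {"role": ..., "content": ...} dict whose content is a
    # string; the bare `dict` type object should never appear in this list.
    messages = [{"role": "user", "content": prompt}]

    response = model.chat(
        messages=messages,
        generate_config={"max_tokens": 1024, "temperature": 0.7, "stream": False},
    )
    return response["choices"][0]["message"]
```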
