wiseflow/core/llms
Tusik eec15ba037
feat: use the async OpenAI SDK client to improve efficiency (#182)
* feat: async tasks for get_more_related_urls

* feat: max LLM concurrent number

* fix(core/agents/get_info.py): lower the default concurrency

* fix(get_info.py): when response_format is set to json, some models require the JSON format to be stated explicitly in the prompt
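The fix above refers to a quirk of the OpenAI chat completions API: when `response_format={"type": "json_object"}` is requested, the word "json" must appear somewhere in the messages, or the request is rejected. A minimal sketch of how the prompt could be patched before building the request (the helper name and default model are hypothetical, not from the PR):

```python
def build_json_request(prompt: str, model: str = "gpt-4o-mini") -> dict:
    """Build kwargs for client.chat.completions.create when JSON output
    is required. Hypothetical helper: the API rejects
    response_format={"type": "json_object"} unless the word 'json'
    appears in the messages, so we append an explicit hint if missing."""
    if "json" not in prompt.lower():
        prompt += "\nRespond in valid JSON format."
    return {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
        "response_format": {"type": "json_object"},
    }
```

The returned dict would then be passed as `**kwargs` to the async client's `chat.completions.create`.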

* fix: handle text that is too short

* feat: move the concurrency logic into openai_wrapper

* ♻️ refactor(openai_wrapper.py): refactor the async LLM call logic, improving exception handling and logging

- Extract the response into a `resp` variable to avoid duplicated code
- Simplify the exception handling and ensure the semaphore is released in the `finally` block
- Move the debug logging so it runs before the result is returned

* Update openai_wrapper.py

to resolve the error raised when 'logger' is None
(This problem existed in the previous version; it was not caused by your code. I just fixed it.)

Signed-off-by: bigbrother666 <96130569+bigbrother666sh@users.noreply.github.com>

---------

Signed-off-by: bigbrother666 <96130569+bigbrother666sh@users.noreply.github.com>
Co-authored-by: bigbrother666 <96130569+bigbrother666sh@users.noreply.github.com>
2025-01-08 09:56:08 +08:00
__init__.py web dashboard 2024-06-13 21:08:58 +08:00
openai_wrapper.py feat: use the async OpenAI SDK client to improve efficiency (#182) 2025-01-08 09:56:08 +08:00
siliconflow_wrapper.py web dashboard 2024-06-13 21:08:58 +08:00