wiseflow/core
Tusik eec15ba037
feat: use the openai SDK's async client to improve efficiency (#182)
* feat: async tasks for get_more_related_urls
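
The change above fans out the per-URL work in `get_more_related_urls` as concurrent asyncio tasks instead of awaiting each call in sequence. A minimal sketch of the pattern, where `fetch_related` is a hypothetical stand-in for the per-URL LLM call:

```python
import asyncio

async def fetch_related(url: str) -> str:
    # Placeholder for an awaited API call (e.g. an async LLM request).
    await asyncio.sleep(0)
    return f"related:{url}"

async def get_more_related_urls(urls: list[str]) -> list[str]:
    # Schedule one task per URL; gather runs them concurrently
    # and preserves the input order in its result list.
    tasks = [asyncio.create_task(fetch_related(u)) for u in urls]
    return await asyncio.gather(*tasks)

results = asyncio.run(get_more_related_urls(["a", "b", "c"]))
```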

* feat: max LLM concurrent number

* fix(core/agents/get_info.py): lower the default concurrency

* fix(get_info.py): when response_format is set to json, some models require the prompt to explicitly mention the JSON format
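
The fix above reflects an API constraint: some models reject `response_format={"type": "json_object"}` unless the word "json" appears somewhere in the messages. A sketch of guarding for that when building the request payload; the prompt wording and helper name are illustrative assumptions, not the project's actual code:

```python
def build_json_request(user_prompt: str) -> dict:
    # Hypothetical system prompt; already mentions JSON explicitly.
    system = "You are an extractor. Reply ONLY with a valid JSON object."
    if "json" not in (system + user_prompt).lower():
        # Append an explicit instruction so the API call is not rejected.
        user_prompt += "\nRespond in JSON format."
    return {
        "messages": [
            {"role": "system", "content": system},
            {"role": "user", "content": user_prompt},
        ],
        "response_format": {"type": "json_object"},
    }

req = build_json_request("Extract the title and date from this page.")
```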

* fix: handle the case where text is too short

* feat: move the concurrency handling into openai_wrapper

* ♻️ refactor(openai_wrapper.py): restructure the async LLM call logic, improving exception handling and logging

- Extract the response into a `resp` variable to avoid duplicated code
- Simplify the exception handling and ensure the semaphore is released in the `finally` block
- Move the debug logging so it is recorded before the result is returned
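
The three points above can be sketched as a semaphore-bounded async wrapper. This is a hedged reconstruction, not the project's actual code: `_llm_semaphore`, `llm_call`, and the injected `call_api` callable (standing in for the AsyncOpenAI request) are assumed names, and the concurrency cap of 2 is illustrative:

```python
import asyncio
import logging

logger = logging.getLogger("llm")

# Module-level semaphore caps how many LLM requests run concurrently.
_llm_semaphore = asyncio.Semaphore(2)

async def llm_call(messages, call_api, retries: int = 2):
    await _llm_semaphore.acquire()
    try:
        resp = ""
        for attempt in range(retries):
            try:
                # Single extraction point for the response ("resp" variable).
                resp = await call_api(messages)
                break
            except Exception as exc:
                logger.warning("attempt %d failed: %s", attempt + 1, exc)
        logger.debug("LLM response: %s", resp)  # logged before returning
        return resp
    finally:
        _llm_semaphore.release()  # always released, even on errors

async def _demo_call(messages):
    # Dummy stand-in for an async OpenAI request, for demonstration.
    return "ok"

result = asyncio.run(llm_call([{"role": "user", "content": "hi"}], _demo_call))
```

Releasing the semaphore in `finally` rather than after the call means a raised exception can never leak a permit and silently shrink the effective concurrency.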

* Update openai_wrapper.py

to resolve the error raised when 'logger' is None
(This problem existed in the previous version; it was not caused by your code. I just modified it.)

Signed-off-by: bigbrother666 <96130569+bigbrother666sh@users.noreply.github.com>

---------

Signed-off-by: bigbrother666 <96130569+bigbrother666sh@users.noreply.github.com>
Co-authored-by: bigbrother666 <96130569+bigbrother666sh@users.noreply.github.com>
2025-01-08 09:56:08 +08:00
agents feat: use the openai SDK's async client to improve efficiency (#182) 2025-01-08 09:56:08 +08:00
custom_fetchings v0.3.6 release 2025-01-05 18:12:36 +08:00
llms feat: use the openai SDK's async client to improve efficiency (#182) 2025-01-08 09:56:08 +08:00
utils v0.3.6 release 2025-01-05 18:12:36 +08:00
docker_entrypoint.sh V0.3.5 2024-12-10 14:18:03 +08:00
general_process.py v0.3.6fix 2025-01-05 21:54:06 +08:00
requirements.txt v0.3.6 mockup 2025-01-04 23:36:18 +08:00
run_task.sh V0.3.5 2024-12-10 14:18:03 +08:00
run.sh V0.3.5 2024-12-10 14:18:03 +08:00
tasks.py v0.3.6 mockup 2025-01-04 23:36:18 +08:00