Hi!
I have been trying to use Open WebUI as an LLM provider, with no luck so far. I set the following parameters in the config.yaml file and exported the API key, of course:
```yaml
llm:
  api_base: "http://localhost:3000/api/"
  model: "glm-ocr"
```
When running one of the examples (function_minimization), instead of the expected result I get heaps of failed attempts at parsing the LLM's output:
```
2026-03-03 03:48:11,747 - ERROR - LLM generation failed: list index out of range
2026-03-03 03:48:11,747 - ERROR - LLM generation failed: list index out of range
2026-03-03 03:48:11,756 - WARNING - Iteration 39 error: LLM generation failed: list index out of range
2026-03-03 03:48:11,757 - WARNING - Iteration 40 error: LLM generation failed: list index out of range
```
The same endpoint, key, and model produce the expected behaviour when making API calls via the openai Python interface.
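For what it's worth, a "list index out of range" during generation usually means the caller indexes `response.choices[0]` while the server returned an empty or missing `choices` list. A minimal defensive sketch, assuming an OpenAI-style response shape (`extract_text` is a hypothetical helper for illustration, not this project's actual parsing code):

```python
def extract_text(response: dict) -> str:
    """Pull the first completion's text out of an OpenAI-style response dict.

    Indexing response["choices"][0] directly raises IndexError when the
    server returns no choices, which would surface exactly as the
    "list index out of range" errors in the logs above. Guarding first
    turns that into a diagnosable error that includes the raw response.
    """
    choices = response.get("choices") or []
    if not choices:
        raise ValueError(f"LLM returned no choices: {response!r}")
    return choices[0]["message"]["content"]
```

Logging the raw response at this point (rather than just the exception message) would show whether Open WebUI is returning an error payload instead of a completion.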
I was unable to identify the problem beyond what is seen in the logs.
Any idea what the problem is? Is Open WebUI simply not supported?