Open WebUI as LLM provider #427

@bsulyok

Description

Hi!

I have been trying to use Open WebUI as an LLM provider, with no luck. I set the following parameters in config.yaml and, of course, exported the API key:

llm:
  api_base: "http://localhost:3000/api/" 
  model: "glm-ocr"

When running one of the examples (function_minimization), instead of the expected result I receive heaps of failed attempts at parsing the LLM's output:

2026-03-03 03:48:11,747 - ERROR - LLM generation failed: list index out of range
2026-03-03 03:48:11,747 - ERROR - LLM generation failed: list index out of range
2026-03-03 03:48:11,756 - WARNING - Iteration 39 error: LLM generation failed: list index out of range
2026-03-03 03:48:11,757 - WARNING - Iteration 40 error: LLM generation failed: list index out of range
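For what it's worth, this error pattern usually comes from indexing into an empty `choices` list in an OpenAI-style response. A minimal sketch of the failure mode (the response shape and the `extract_text` helper are assumptions for illustration, not the project's actual parsing code):

```python
def extract_text(response: dict) -> str:
    # Typical OpenAI-style parsing: response["choices"][0]["message"]["content"].
    # If "choices" is empty or missing (e.g. the server returned an error body
    # instead of a completion), choices[0] raises IndexError, which would
    # surface as "list index out of range" in the logs above.
    choices = response.get("choices", [])
    if not choices:
        raise ValueError(f"no choices in response: {response}")
    return choices[0]["message"]["content"]
```

So the question may be whether Open WebUI is returning a response body that the parser does not expect (wrong route, error payload, etc.).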

The same endpoint, key, and model produce the expected behaviour when I make API calls via the openai Python client.
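For reproduction, the working call is roughly equivalent to the following standard-library sketch (the `/chat/completions` path under the configured api_base is an assumption based on the usual OpenAI-compatible layout; the API key is read from a hypothetical `OPENAI_API_KEY` environment variable):

```python
import json
import os
import urllib.request

API_BASE = "http://localhost:3000/api/"  # from config.yaml above
MODEL = "glm-ocr"

def build_request(prompt: str) -> urllib.request.Request:
    # Build an OpenAI-style chat completion request against the Open WebUI
    # endpoint. The exact route is an assumption; adjust if your deployment
    # serves the OpenAI-compatible API elsewhere.
    payload = {"model": MODEL, "messages": [{"role": "user", "content": prompt}]}
    return urllib.request.Request(
        API_BASE.rstrip("/") + "/chat/completions",
        data=json.dumps(payload).encode(),
        headers={
            "Content-Type": "application/json",
            "Authorization": f"Bearer {os.environ.get('OPENAI_API_KEY', '')}",
        },
    )

# To actually send it (requires the server to be running):
#   with urllib.request.urlopen(build_request("Say hello")) as resp:
#       print(json.load(resp)["choices"][0]["message"]["content"])
```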

I was unable to identify the problem beyond what the logs show.
Any idea what the problem is? Is Open WebUI simply not supported?
