Feature request: Add support for custom embedding base URL #426

@lesshaste

Description

I apologise if I have missed how to do this, but as far as I can tell the feature doesn't exist yet.

It would be great to add support for a custom base URL for embedding models (like OPENAI_EMBEDDING_BASE_URL or embedding_base_url config option).
Currently, the EmbeddingClient in openevolve/embedding.py only supports:

  • OpenAI embedding models (text-embedding-3-small, text-embedding-3-large) via api.openai.com
  • Azure embedding models

This makes it impossible to use other providers for embeddings, such as:

  • OpenRouter (which offers qwen/qwen3-embedding-8b and other embedding models)
  • Local embedding servers
  • Other OpenAI-compatible APIs
The fix would involve:

  1. Adding a new config option (e.g., embedding_base_url) to DatabaseConfig in openevolve/config.py
  2. Modifying EmbeddingClient.__init__ to accept a base_url parameter
  3. Passing base_url to the OpenAI client initialization

This would allow users to set the OPENAI_EMBEDDING_BASE_URL env var or embedding_base_url in their config to use any OpenAI-compatible endpoint for embeddings.
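As a sketch of what the config might look like (the embedding_base_url key, its placement under a database section, and the embedding_model key shown here are assumptions about the proposed change, not existing options):

```yaml
# Hypothetical config sketch — embedding_base_url does not exist yet;
# the key name and its placement under "database" are assumed.
database:
  embedding_model: "qwen/qwen3-embedding-8b"
  embedding_base_url: "https://openrouter.ai/api/v1"
```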
Example use case: using OpenRouter for embeddings while using a different provider for LLMs.
