
OpenRouter models not visible with OPENROUTER_KEY=<key> #10

Closed
@akaihola

Description


OpenRouter models are listed and work if I run llm keys set openrouter, but not with the OPENROUTER_KEY environment variable.

I've tried this on GitHub Workflows, on a local Ubuntu 22.04 container, and on my Windows 11 laptop. The OpenRouter models aren't visible or usable. In the Ubuntu container, for example, I see this:

$ podman run -it ubuntu:22.04
# apt update ; apt install -y python3-venv
# python3 -m venv venv ; . venv/bin/activate ; pip install llm-openrouter
# export OPENROUTER_KEY=<my-openrouter-key>
# llm models
OpenAI Chat: gpt-3.5-turbo (aliases: 3.5, chatgpt)
OpenAI Chat: gpt-3.5-turbo-16k (aliases: chatgpt-16k, 3.5-16k)
OpenAI Chat: gpt-4 (aliases: 4, gpt4)
OpenAI Chat: gpt-4-32k (aliases: 4-32k)
OpenAI Chat: gpt-4-1106-preview
OpenAI Chat: gpt-4-0125-preview
OpenAI Chat: gpt-4-turbo-2024-04-09
OpenAI Chat: gpt-4-turbo (aliases: gpt-4-turbo-preview, 4-turbo, 4t)
OpenAI Chat: gpt-4o (aliases: 4o)
OpenAI Chat: gpt-4o-mini (aliases: 4o-mini)
OpenAI Completion: gpt-3.5-turbo-instruct (aliases: 3.5-instruct, chatgpt-instruct)

On Windows 11 with Python 3.11.4, in both PowerShell and the mintty terminal installed by Git for Windows:

PS C:\> pip install llm-openrouter
$ export OPENROUTER_KEY=<my-openrouter-key>  # on mintty
PS C:\> $env:OPENROUTER_KEY = '<my-openrouter-key>'  # on PowerShell
PS C:\> llm models
<the same output as above>

Trying to use the models on the CLI or through the Python API doesn't work either; this is what I get from the Python API (here in a GitHub Actions build):

  File "/opt/hostedtoolcache/Python/3.12.4/x64/lib/python3.12/site-packages/llm/__init__.py", line 152, in get_model
    raise UnknownModelError("Unknown model: " + name)
llm.UnknownModelError: 'Unknown model: openrouter/anthropic/claude-3.5-sonnet:beta'
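As a stdlib-only sanity check (a hypothetical helper, not part of llm), it's easy to confirm the variable is actually visible to the Python process before blaming the plugin:

```python
import os

def key_visible(env_var: str) -> bool:
    """Return True if the given environment variable is set and non-empty."""
    return bool(os.environ.get(env_var, "").strip())

# With `export OPENROUTER_KEY=...` in the shell, this should print True.
print(key_visible("OPENROUTER_KEY"))
```

In my runs the variable is set and exported, so the process does see it; the models are still missing from llm models.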

I also tried, without success:

  • running as a user
  • not in a virtualenv
  • pip install llm ; llm install llm-openrouter
  • apt install -y git ; pip install llm@git+https://github.com/simonw/llm@main llm-openrouter@git+https://github.com/simonw/llm-openrouter@main
    • I also tried other llm plugins; here are my results after running llm install llm-<plugin> ; llm models without setting any API keys. 👎 means the plugin's models aren't included in the list, and 💯 means they are.
    • 👎 llm-mistral
    • 💯 llm-gemini
    • 💯 llm-claude
    • 💯 llm-claude-3
    • 💯 llm-command-r
    • 💯 llm-reka
    • 💯 llm-perplexity
    • 💯 llm-groq
    • 👎 llm-anyscale-endpoints
    • 👎 llm-replicate
    • 👎 llm-fireworks
    • 💯 llm-palm
    • 👎 llm-openrouter
    • 💯 llm-cohere
    • 💯 llm-bedrock-anthropic
    • 💯 llm-bedrock-meta
    • 👎 llm-together (llm.errors.NeedsKeyException)
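A guess at the mechanism, based only on the 👎/💯 pattern above: the working plugins may fall back to the process environment when no stored key exists, while the failing ones look only at the stored keys. A minimal sketch of that fallback — resolve_key and its parameters are illustrative, not actual plugin code:

```python
import os

def resolve_key(explicit_key=None, env_var="OPENROUTER_KEY"):
    """Prefer an explicitly stored key; otherwise fall back to the environment.

    A plugin that skips the os.environ fallback would show exactly the
    symptom above: models appear after `llm keys set`, but not when only
    the environment variable is set.
    """
    if explicit_key:
        return explicit_key
    return os.environ.get(env_var)

os.environ["OPENROUTER_KEY"] = "sk-or-example"
print(resolve_key())             # falls back to the environment variable
print(resolve_key("stored-key")) # a stored key takes precedence
```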

Labels

bug (Something isn't working), documentation (Improvements or additions to documentation)
