OpenAI-compatible APIs

Any provider that speaks the OpenAI Chat Completions protocol works with ClipSlop. Common targets:

  • OpenRouter — single key, hundreds of models from different vendors.
  • LM Studio — local model runner with an OpenAI-compatible HTTP server.
  • Together, Groq, Fireworks — cloud inference platforms.
  • Self-hosted — vLLM, Text Generation Inference, or any server exposing /v1/chat/completions.
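All of the targets above accept the same request shape: a POST to `/v1/chat/completions` with a JSON body carrying a model identifier and a list of messages. A minimal sketch of that protocol, using only the Python standard library — the base URL, key, and model name here are placeholders, not specific to any provider:

```python
import json
import urllib.request


def build_request(base_url, api_key, model, prompt):
    # Build the POST /chat/completions request any OpenAI-compatible
    # server accepts. base_url is expected to already end in /v1.
    body = json.dumps({
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
    }).encode()
    headers = {"Content-Type": "application/json"}
    if api_key:  # local servers such as LM Studio often need no key
        headers["Authorization"] = f"Bearer {api_key}"
    return urllib.request.Request(
        f"{base_url.rstrip('/')}/chat/completions", data=body, headers=headers
    )


def chat(base_url, api_key, model, prompt):
    # Send the request and pull the assistant reply out of the response.
    with urllib.request.urlopen(build_request(base_url, api_key, model, prompt)) as resp:
        return json.load(resp)["choices"][0]["message"]["content"]
```

For example, `chat("http://localhost:1234/v1", "", "some-local-model", "Hello")` would target an LM Studio server, and swapping the base URL for `https://openrouter.ai/api/v1` plus a key targets OpenRouter — the payload does not change.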

Setup

  1. Open Settings → Providers → OpenAI-compatible.
  2. Set:
    • Base URL — e.g. https://openrouter.ai/api/v1 or http://localhost:1234/v1 for LM Studio.
    • API key (if the endpoint requires one).
    • Model — pick or type the model identifier.
  3. Click Test connection.
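If Test connection fails, you can check the endpoint by hand. Most OpenAI-compatible servers also expose GET /v1/models; a rough sketch of such a check, assuming the server implements that route (the URLs and key below are placeholders):

```python
import json
import urllib.request


def models_url(base_url):
    # The models listing lives next to /chat/completions under the
    # same /v1 prefix, so base_url should already end in /v1.
    return f"{base_url.rstrip('/')}/models"


def list_models(base_url, api_key=""):
    # Fetch the model identifiers the endpoint serves. Assumes the
    # server implements GET /v1/models, as most compatible providers do.
    req = urllib.request.Request(models_url(base_url))
    if api_key:
        req.add_header("Authorization", f"Bearer {api_key}")
    with urllib.request.urlopen(req) as resp:
        return [m["id"] for m in json.load(resp)["data"]]
```

Any identifier returned here is a valid value for the Model field above.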

CLI providers

ClipSlop can also invoke command-line tools as providers. This is useful for:

  • claude CLI on a developer Mac.
  • gemini CLI.
  • Any custom script that takes a prompt on stdin and emits the response on stdout.

Configure the binary path and arguments in Settings → Providers → CLI.
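The contract for a custom script is the one stated above: read the prompt from stdin, write the response to stdout. A skeleton under that assumption — the uppercasing "model" is a placeholder you would replace with a real inference call:

```python
#!/usr/bin/env python3
"""Skeleton CLI provider: prompt arrives on stdin, response leaves on stdout."""
import sys


def respond(prompt: str) -> str:
    # Placeholder transformation; a real provider script would call a
    # model here and return its reply instead.
    return prompt.upper()


if __name__ == "__main__":
    # Read the whole prompt, emit the whole response, exit 0 on success.
    sys.stdout.write(respond(sys.stdin.read()))
```

Point the binary path at this script (made executable) and ClipSlop will pipe the prompt in and read the reply back.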

When to use this vs the dedicated providers

  Scenario                                  Use
  ChatGPT Plus / Pro / Team account         Sign in with ChatGPT
  OpenAI API key                            OpenAI API
  Anthropic API key                         Anthropic
  Local on the same Mac                     Ollama or LM Studio
  Multiple vendors via one key              OpenRouter (this page)
  Specialised hosting (Groq, Fireworks…)    This page