OpenAI-compatible APIs
Any provider that speaks the OpenAI Chat Completions protocol works with ClipSlop. Common targets:
- OpenRouter — single key, hundreds of models from different vendors.
- LM Studio — local model runner with an OpenAI-compatible HTTP server.
- Together, Groq, Fireworks — cloud inference platforms.
- Self-hosted — vLLM, Text Generation Inference, or any server exposing /v1/chat/completions.
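All of these targets speak the same wire format, which is why a single provider entry covers them. As a minimal sketch of that shared request shape (the endpoint, key, and model name below are placeholders, not ClipSlop defaults):

```python
import json
import urllib.request

def build_request(base_url, model, messages, api_key=None):
    """Assemble a Chat Completions request for any OpenAI-compatible server.

    base_url is whatever you would enter in the provider settings,
    e.g. an OpenRouter or LM Studio endpoint (hypothetical values here).
    """
    headers = {"Content-Type": "application/json"}
    if api_key:  # local servers such as LM Studio often need no key
        headers["Authorization"] = f"Bearer {api_key}"
    return {
        "url": f"{base_url.rstrip('/')}/chat/completions",
        "headers": headers,
        "body": json.dumps({"model": model, "messages": messages}).encode("utf-8"),
    }

def chat(base_url, model, prompt, api_key=None):
    """POST the request and return the assistant's reply text."""
    req = build_request(
        base_url, model, [{"role": "user", "content": prompt}], api_key
    )
    http_req = urllib.request.Request(
        req["url"], data=req["body"], headers=req["headers"]
    )
    with urllib.request.urlopen(http_req) as resp:
        return json.load(resp)["choices"][0]["message"]["content"]
```

Because only the base URL and key change between vendors, swapping providers is a configuration change rather than a code change.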
Setup
- Open Settings → Providers → OpenAI-compatible.
- Set:
  - Base URL — e.g. https://openrouter.ai/api/v1, or http://localhost:1234/v1 for LM Studio.
  - API key (if the endpoint requires one).
  - Model — pick or type the model identifier.
- Click Test connection.
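A connection test against an OpenAI-compatible server typically amounts to fetching the model list. This sketch assumes the endpoint implements the standard GET /v1/models route; the function names are illustrative, not ClipSlop internals:

```python
import json
import urllib.request

def models_url(base_url):
    """Derive the model-listing URL from the configured base URL."""
    return f"{base_url.rstrip('/')}/models"

def check_connection(base_url, api_key=None, timeout=5):
    """Return the model IDs the server advertises, or raise on failure.

    A successful response is a reasonable proxy for "the base URL and
    key are correct", which is all a connection test needs to establish.
    """
    headers = {"Authorization": f"Bearer {api_key}"} if api_key else {}
    req = urllib.request.Request(models_url(base_url), headers=headers)
    with urllib.request.urlopen(req, timeout=timeout) as resp:
        data = json.load(resp)
    return [m["id"] for m in data.get("data", [])]
```

If the check fails, the usual culprits are a missing /v1 suffix in the base URL or a key sent to a server that does not expect one.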
CLI providers
ClipSlop also supports invoking command-line tools as providers. Useful for:
- claude CLI on a developer Mac.
- gemini CLI.
- Any custom script that takes a prompt on stdin and emits the response on stdout.
Configure the binary path and arguments in Settings → Providers → CLI.
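The stdin/stdout contract is simple to sketch. This is not ClipSlop's actual implementation; it just shows how a configured binary path plus arguments can be invoked, using `cat` (which echoes stdin back) as a stand-in tool:

```python
import subprocess

def run_cli_provider(argv, prompt, timeout=120):
    """Invoke a CLI provider: write the prompt to stdin, read stdout.

    argv is the configured binary path plus its arguments, e.g.
    ["cat"] for a trivial echo tool (real tools and flags vary).
    """
    result = subprocess.run(
        argv,
        input=prompt,          # prompt goes in on stdin
        capture_output=True,   # response comes back on stdout
        text=True,
        timeout=timeout,
    )
    result.check_returncode()  # raise if the tool exited non-zero
    return result.stdout.strip()
```

For example, `run_cli_provider(["cat"], "hello")` returns `"hello"`. Any script honouring the same contract works, regardless of what it does internally.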
When to use this vs the dedicated providers
| Scenario | Use |
|---|---|
| ChatGPT Plus / Pro / Team account | Sign in with ChatGPT |
| OpenAI API key | OpenAI API |
| Anthropic API key | Anthropic |
| Local on the same Mac | Ollama or LM Studio |
| Multiple vendors via one key | OpenRouter (this page) |
| Specialised hosting (Groq, Fireworks…) | This page |