# Installation
## TypeScript browser packages

Most browser apps start with @slop-ai/client, then add the framework adapter that matches the UI layer.

```shell
bun add @slop-ai/client
```

| Framework | Package | Install |
|---|---|---|
| React | @slop-ai/react | bun add @slop-ai/client @slop-ai/react |
| Vue 3 | @slop-ai/vue | bun add @slop-ai/client @slop-ai/vue |
| SolidJS | @slop-ai/solid | bun add @slop-ai/client @slop-ai/solid |
| Angular | @slop-ai/angular | bun add @slop-ai/client @slop-ai/angular |
| Svelte 5 | @slop-ai/svelte | bun add @slop-ai/client @slop-ai/svelte |
| Vanilla JS / TS | @slop-ai/client | bun add @slop-ai/client |
Use @slop-ai/core when you need the shared types, helpers, and tree utilities directly:
```shell
bun add @slop-ai/core
```

## TypeScript server and consumer packages
| Package | Use case | Install |
|---|---|---|
| @slop-ai/server | Node.js, Bun, desktop helpers, and CLI providers | bun add @slop-ai/server |
| @slop-ai/consumer | Custom agents, inspectors, bridges, and tests | bun add @slop-ai/consumer |
| @slop-ai/tanstack-start | TanStack Start full-stack integration | bun add @slop-ai/server @slop-ai/tanstack-start |
| @slop-ai/openclaw-plugin | OpenClaw integration | bun add @slop-ai/openclaw-plugin |
## Python

```shell
pip install "slop-ai[websocket]"
```

```python
from fastapi import FastAPI

from slop_ai import SlopServer
from slop_ai.transports.asgi import SlopMiddleware

app = FastAPI()
slop = SlopServer("my-api", "My API")
app.add_middleware(SlopMiddleware, slop=slop)
```

See the Python guide and Python API page for transport and consumer examples.
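Since SlopMiddleware is registered like any other ASGI middleware, its shape can be pictured with a generic sketch. The class below is illustrative only — the names and keyword arguments are assumptions about the standard ASGI middleware pattern, not the real slop_ai internals:

```python
import asyncio

class SketchMiddleware:
    """Generic ASGI middleware shape; names are illustrative, not the slop_ai API."""

    def __init__(self, app, slop=None):
        self.app = app    # the wrapped ASGI application
        self.slop = slop  # server instance, as passed via add_middleware(..., slop=slop)

    async def __call__(self, scope, receive, send):
        # A real implementation would intercept its own protocol routes here
        # and delegate everything else to the wrapped app.
        await self.app(scope, receive, send)

async def plain_app(scope, receive, send):
    # Minimal ASGI app used only to demonstrate the wrapping.
    await send({"type": "http.response.start", "status": 200, "headers": []})
    await send({"type": "http.response.body", "body": b"ok"})

wrapped = SketchMiddleware(plain_app, slop="my-server")

async def demo():
    sent = []

    async def send(message):
        sent.append(message)

    async def receive():
        return {"type": "http.request"}

    await wrapped({"type": "http", "path": "/"}, receive, send)
    return sent

messages = asyncio.run(demo())
```

Requests flow through the middleware first, which is why FastAPI's `add_middleware` is enough to expose the slop server alongside your existing routes.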
## Go

```shell
go get github.com/devteapot/slop/packages/go/slop-ai
```

```go
package main

import (
	"net/http"

	slop "github.com/devteapot/slop/packages/go/slop-ai"
)

func main() {
	server := slop.NewServer("my-app", "My App")
	server.Mount(http.DefaultServeMux)
	http.ListenAndServe(":8080", nil)
}
```

See the Go guide and Go API page for Unix, stdio, and consumer support.
## Rust

```shell
cargo add slop-ai
```

```rust
use serde_json::json;
use slop_ai::SlopServer;

let slop = SlopServer::new("my-app", "My App");
slop.register("status", json!({"type": "status", "props": {"healthy": true}}));
```

See the Rust guide and Rust API page for feature-flag and transport details.
The shipped consumer apps both include AI chat, but chat only works once you have an active model profile configured.
Both apps start with a default local profile:

- provider: Ollama
- endpoint: http://localhost:11434
- model: qwen2.5:14b
If you are using OpenAI, OpenRouter, or Gemini instead, add your endpoint and API key before chatting.
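A profile, in other words, is just a provider, an endpoint, a model, and (for hosted providers) an API key. The dictionaries below sketch that shape; the field names are illustrative assumptions, not the apps' actual storage schema:

```python
# Field names here are assumptions for illustration, not the apps' real schema.
default_profile = {
    "provider": "ollama",
    "endpoint": "http://localhost:11434",  # Ollama's default port
    "model": "qwen2.5:14b",
    "api_key": None,  # a local Ollama server needs no key
}

hosted_profile = {
    "provider": "openai",
    "endpoint": "https://api.openai.com/v1",
    "model": "gpt-4o-mini",
    "api_key": "sk-...",  # set a real key before chatting
}
```

Either app's chat will stay inactive until the selected profile has everything its provider requires.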
## Chrome extension

```shell
cd apps/extension
bun install
bun run build
```

Load the apps/extension directory in chrome://extensions as an unpacked extension.
After loading the extension:
- Open the popup.
- Use LLM Settings to open the options page.
- Add or edit a profile for Ollama, OpenAI, OpenRouter, or Gemini.
- Open the in-page chat overlay and pick the active profile and model.
## Desktop app

```shell
cd apps/desktop
bun install
bun run dev
```

The Tauri app builds the frontend and launches the native desktop shell.
After launching the desktop app:
- Open Settings.
- Add or edit a profile for Ollama, OpenAI, OpenRouter, or Gemini.
- Use the top-bar selectors to switch the active profile and model.