
Installation

Most browser apps start with @slop-ai/client, then add the framework adapter that matches the UI layer.

```shell
bun add @slop-ai/client
```

| Framework | Package | Install |
| --- | --- | --- |
| React | @slop-ai/react | `bun add @slop-ai/client @slop-ai/react` |
| Vue 3 | @slop-ai/vue | `bun add @slop-ai/client @slop-ai/vue` |
| SolidJS | @slop-ai/solid | `bun add @slop-ai/client @slop-ai/solid` |
| Angular | @slop-ai/angular | `bun add @slop-ai/client @slop-ai/angular` |
| Svelte 5 | @slop-ai/svelte | `bun add @slop-ai/client @slop-ai/svelte` |
| Vanilla JS / TS | @slop-ai/client | `bun add @slop-ai/client` |

Use @slop-ai/core when you need the shared types, helpers, and tree utilities directly:

```shell
bun add @slop-ai/core
```

| Package | Use case | Install |
| --- | --- | --- |
| @slop-ai/server | Node.js, Bun, desktop helpers, and CLI providers | `bun add @slop-ai/server` |
| @slop-ai/consumer | Custom agents, inspectors, bridges, and tests | `bun add @slop-ai/consumer` |
| @slop-ai/tanstack-start | TanStack Start full-stack integration | `bun add @slop-ai/server @slop-ai/tanstack-start` |
| @slop-ai/openclaw-plugin | OpenClaw integration | `bun add @slop-ai/openclaw-plugin` |
For Python servers, install the package and mount the ASGI middleware:

```shell
pip install "slop-ai[websocket]"
```

```python
from fastapi import FastAPI
from slop_ai import SlopServer
from slop_ai.transports.asgi import SlopMiddleware

app = FastAPI()
slop = SlopServer("my-api", "My API")
app.add_middleware(SlopMiddleware, slop=slop)
```

See the Python guide and Python API page for transport and consumer examples.

For Go servers:

```shell
go get github.com/devteapot/slop/packages/go/slop-ai
```

```go
package main

import (
	"log"
	"net/http"

	slop "github.com/devteapot/slop/packages/go/slop-ai"
)

func main() {
	server := slop.NewServer("my-app", "My App")
	server.Mount(http.DefaultServeMux)
	log.Fatal(http.ListenAndServe(":8080", nil))
}
```

See the Go guide and Go API page for Unix, stdio, and consumer support.

For Rust servers:

```shell
cargo add slop-ai
```

```rust
use serde_json::json;
use slop_ai::SlopServer;

fn main() {
    let slop = SlopServer::new("my-app", "My App");
    slop.register("status", json!({"type": "status", "props": {"healthy": true}}));
}
```

See the Rust guide and Rust API page for feature-flag and transport details.

Both shipped consumer apps include AI chat, but chat only works once an active model profile is configured.

Both apps start with a default local profile:

  • provider: Ollama
  • endpoint: http://localhost:11434
  • model: qwen2.5:14b

If you are using OpenAI, OpenRouter, or Gemini instead, add your endpoint and API key before chatting.
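As an illustration of the difference between the two cases, the profiles above can be sketched as plain dictionaries. This is a hypothetical shape for explanation only; the field names here are assumptions, not the apps' actual settings schema:

```python
# Hypothetical profile shapes for illustration; the real settings schema
# is defined by each consumer app, not by these field names.

# Default local profile both apps ship with.
default_profile = {
    "provider": "ollama",
    "endpoint": "http://localhost:11434",
    "model": "qwen2.5:14b",
}

# A hosted provider additionally needs an API key before chat works.
# Endpoint and model are example values, not requirements.
openai_profile = {
    "provider": "openai",
    "endpoint": "https://api.openai.com/v1",
    "api_key": "sk-...",  # required for OpenAI, OpenRouter, or Gemini
    "model": "gpt-4o-mini",
}

def is_chat_ready(profile: dict) -> bool:
    """Local providers need an endpoint and model; hosted ones also need a key."""
    if not profile.get("endpoint") or not profile.get("model"):
        return False
    if profile["provider"] != "ollama" and not profile.get("api_key"):
        return False
    return True
```

The point of the sketch: the default Ollama profile is usable as-is, while a hosted provider's profile stays incomplete until an API key is added.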

```shell
cd apps/extension
bun install
bun run build
```

Load the apps/extension directory in chrome://extensions as an unpacked extension.

After loading the extension:

  1. Open the popup.
  2. Click LLM Settings to open the options page.
  3. Add or edit a profile for Ollama, OpenAI, OpenRouter, or Gemini.
  4. Open the in-page chat overlay and pick the active profile and model.

```shell
cd apps/desktop
bun install
bun run dev
```

The Tauri app will build the frontend and launch the native desktop shell.

After launching the desktop app:

  1. Open Settings.
  2. Add or edit a profile for Ollama, OpenAI, OpenRouter, or Gemini.
  3. Use the top-bar selectors to switch the active profile and model.