Add AI/LLM capabilities to deployed applications. Provision an API key for your deployment to access OpenAI-compatible endpoints (GPT, Claude, Gemini) with usage billed to your organization.
Provision an AI Gateway API key to access LLM APIs (OpenAI, Anthropic, Google) through an OpenAI-compatible interface.
IMPORTANT: All API requests require authentication. Get your auth token and API URL by running:

```shell
AUTH_TOKEN=$(/home/user/.local/bin/rebyte-auth)
API_URL=$(python3 -c "import json; print(json.load(open('/home/user/.rebyte.ai/auth.json'))['sandbox']['relay_url'])")
```

Include the token in all API requests as a `Bearer` token, and use `$API_URL` as the base for all API endpoints.
Use this skill when:

- Adding AI/LLM capabilities to an application deployed on Rebyte
- Provisioning or managing an AI Gateway API key for a deployment
- Calling OpenAI, Anthropic, or Google models through an OpenAI-compatible interface
```shell
# Provision AI Gateway key (uses the default deployment for the workspace)
curl -X POST "$API_URL/api/data/aigateway/provision" \
  -H "Authorization: Bearer $AUTH_TOKEN" \
  -H "Content-Type: application/json"
```

Response:

```json
{
  "deployId": "myapp-f33ad2defb",
  "apiKey": "aig_myapp-f33ad2defb_a1b2c3d4e5f6...",
  "baseUrl": "https://api.rebyte.ai/api/ai",
  "isNew": true
}
```
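The response fields can be checked programmatically. A minimal sketch (the helper name is illustrative, not part of the platform API; the sample object mirrors the response above, where `isNew` indicates a fresh key was created and the app must be deployed to pick it up):

```javascript
// Summarize a provision response without exposing the full secret in logs.
function summarizeProvision(resp) {
  return {
    deployId: resp.deployId,
    // Only a short key prefix, so the full key never reaches log output.
    keyPreview: resp.apiKey.slice(0, 12) + '...',
    baseUrl: resp.baseUrl,
    // A newly created key means a (re)deploy is needed to inject it.
    redeployNeeded: resp.isNew,
  };
}

const example = {
  deployId: 'myapp-f33ad2defb',
  apiKey: 'aig_myapp-f33ad2defb_a1b2c3d4e5f6',
  baseUrl: 'https://api.rebyte.ai/api/ai',
  isNew: true,
};

console.log(summarizeProvision(example));
```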
```shell
node /home/user/.skills/rebyteai-rebyte-app-builder/bin/rebyte.js deploy
```

That's it! The `REBYTE_AI_GATEWAY_KEY` and `REBYTE_AI_GATEWAY_URL` environment variables are automatically injected into your Lambda function during deployment.
```typescript
// app/api/chat/route.ts
import { createOpenAI } from '@ai-sdk/openai';
import { streamText } from 'ai';

// createOpenAI (not openai.provider) is how a custom baseURL/apiKey is configured
const client = createOpenAI({
  baseURL: process.env.REBYTE_AI_GATEWAY_URL,
  apiKey: process.env.REBYTE_AI_GATEWAY_KEY,
});

export async function POST(req: Request) {
  const { messages } = await req.json();
  const result = streamText({
    model: client('gemini-3-flash'), // or gpt-4o, claude-sonnet-4.5
    messages,
  });
  return result.toDataStreamResponse();
}
```
```javascript
// functions/chat.func/index.js
const OpenAI = require('openai');

const openai = new OpenAI({
  apiKey: process.env.REBYTE_AI_GATEWAY_KEY,
  baseURL: process.env.REBYTE_AI_GATEWAY_URL,
});

exports.handler = async (event) => {
  const { messages } = JSON.parse(event.body || '{}');
  const response = await openai.chat.completions.create({
    model: 'gemini-3-flash',
    messages,
  });
  return {
    statusCode: 200,
    headers: { 'Content-Type': 'application/json' },
    body: JSON.stringify(response.choices[0].message),
  };
};
```
```javascript
// functions/api.func/index.js
exports.handler = async (event) => {
  const { messages } = JSON.parse(event.body || '{}');
  const response = await fetch(`${process.env.REBYTE_AI_GATEWAY_URL}/chat/completions`, {
    method: 'POST',
    headers: {
      'Authorization': `Bearer ${process.env.REBYTE_AI_GATEWAY_KEY}`,
      'Content-Type': 'application/json',
    },
    body: JSON.stringify({
      model: 'gemini-3-flash',
      messages,
      stream: false,
    }),
  });
  const data = await response.json();
  return {
    statusCode: 200,
    headers: { 'Content-Type': 'application/json' },
    body: JSON.stringify(data.choices[0].message),
  };
};
```
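If you set `stream: true` instead, OpenAI-compatible endpoints return Server-Sent Events: `data: {...}` lines carrying content deltas, terminated by a `data: [DONE]` sentinel. A minimal sketch of extracting the text from such a stream (the helper and the sample wire lines are illustrative, not part of the gateway API):

```javascript
// Extract assistant text from OpenAI-style SSE lines ("data: {...}").
function extractDeltas(sseText) {
  let out = '';
  for (const line of sseText.split('\n')) {
    const trimmed = line.trim();
    if (!trimmed.startsWith('data:')) continue; // skip blanks and comments
    const payload = trimmed.slice(5).trim();
    if (payload === '[DONE]') break;            // end-of-stream sentinel
    const delta = JSON.parse(payload).choices?.[0]?.delta?.content;
    if (delta) out += delta;
  }
  return out;
}

// Sample of what two streamed chunks look like on the wire.
const sample = [
  'data: {"choices":[{"delta":{"content":"Hello"}}]}',
  'data: {"choices":[{"delta":{"content":" world"}}]}',
  'data: [DONE]',
].join('\n');

console.log(extractDeltas(sample)); // "Hello world"
```

In a real handler you would accumulate chunks from the response body stream before parsing, but the line format is the same.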
| Model | Provider | Best For |
|---|---|---|
| `gemini-3-flash` | Google | Fast responses, cost-effective |
| `gemini-3-pro` | Google | Complex reasoning |
| `gpt-4o` | OpenAI | General purpose, vision |
| `gpt-5.2-codex` | OpenAI | Code generation |
| `claude-sonnet-4.5` | Anthropic | Long context, analysis |
| `claude-opus-4.5` | Anthropic | Most capable |
The AI Gateway provides OpenAI-compatible endpoints:

| Endpoint | Description |
|---|---|
| `POST /v1/chat/completions` | Chat completions (streaming and non-streaming) |
| `POST /v1/responses` | Responses API (for Codex models) |
| `GET /v1/models` | List available models |

Base URL: `https://api.rebyte.ai/api/ai`
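Note that the endpoint paths above carry the `/v1` prefix, while the injected `REBYTE_AI_GATEWAY_URL` already ends in `/v1`. A small joining helper is one way to avoid a doubled or missing prefix when code accepts either form of base URL (the helper itself is an illustration, not a platform utility):

```javascript
// Join a gateway base URL and an endpoint path without doubling "/v1".
function gatewayEndpoint(baseUrl, path) {
  const base = baseUrl.replace(/\/+$/, '');             // strip trailing slashes
  const suffix = path.startsWith('/') ? path : '/' + path;
  // If the base already ends with /v1 and the path repeats it, drop one copy.
  if (base.endsWith('/v1') && suffix.startsWith('/v1/')) {
    return base + suffix.slice(3);
  }
  return base + suffix;
}

console.log(gatewayEndpoint('https://api.rebyte.ai/api/ai', '/v1/chat/completions'));
// https://api.rebyte.ai/api/ai/v1/chat/completions
console.log(gatewayEndpoint('https://api.rebyte.ai/api/ai/v1', '/v1/chat/completions'));
// https://api.rebyte.ai/api/ai/v1/chat/completions
```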
```shell
# Create or get existing key for your workspace
curl -X POST "$API_URL/api/data/aigateway/provision" \
  -H "Authorization: Bearer $AUTH_TOKEN" \
  -H "Content-Type: application/json"

# Check if a key exists (does not create a new one)
curl -X POST "$API_URL/api/data/aigateway/info" \
  -H "Authorization: Bearer $AUTH_TOKEN" \
  -H "Content-Type: application/json"

# Revoke the key (call provision to generate a new one)
curl -X POST "$API_URL/api/data/aigateway/revoke" \
  -H "Authorization: Bearer $AUTH_TOKEN" \
  -H "Content-Type: application/json"
```
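The same three management calls can be issued from Node. This sketch only builds the `fetch()` parameters, using the endpoint paths from the commands above (the helper name is illustrative):

```javascript
// Build fetch() parameters for the AI Gateway management endpoints.
function gatewayAdminRequest(apiUrl, authToken, action) {
  const actions = ['provision', 'info', 'revoke'];
  if (!actions.includes(action)) throw new Error(`unknown action: ${action}`);
  return {
    url: `${apiUrl}/api/data/aigateway/${action}`,
    options: {
      method: 'POST', // all three endpoints are POST
      headers: {
        Authorization: `Bearer ${authToken}`,
        'Content-Type': 'application/json',
      },
    },
  };
}

const req = gatewayAdminRequest('https://example.invalid', 'TOKEN', 'info');
// Usage: const res = await fetch(req.url, req.options);
```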
When you provision an AI Gateway key and then deploy a Lambda function, these `REBYTE_*` environment variables are automatically injected:

| Variable | Description |
|---|---|
| `REBYTE_AI_GATEWAY_KEY` | Your AI Gateway API key (`aig_...`) |
| `REBYTE_AI_GATEWAY_URL` | The base URL (`https://api.rebyte.ai/api/ai/v1`) |

These are platform-managed variables. You do not need to set them manually.
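Because the two variables only exist after a provision-then-deploy cycle, a server function can fail fast with a clear message when they are missing. A minimal sketch (the helper name is an illustration, not a platform API):

```javascript
// Fail fast if the platform-injected gateway variables are absent.
function requireGatewayEnv(env) {
  const missing = ['REBYTE_AI_GATEWAY_KEY', 'REBYTE_AI_GATEWAY_URL']
    .filter((name) => !env[name]);
  if (missing.length > 0) {
    throw new Error(
      `Missing ${missing.join(', ')} - provision the AI Gateway key and redeploy`
    );
  }
  return { apiKey: env.REBYTE_AI_GATEWAY_KEY, baseURL: env.REBYTE_AI_GATEWAY_URL };
}

// In a handler: const { apiKey, baseURL } = requireGatewayEnv(process.env);
```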
CRITICAL: The AI Gateway key must ONLY be used in server-side code.

```javascript
// functions/api.func/index.js - SERVER-SIDE, SAFE
exports.handler = async (event) => {
  const response = await fetch(`${process.env.REBYTE_AI_GATEWAY_URL}/chat/completions`, {
    headers: { 'Authorization': `Bearer ${process.env.REBYTE_AI_GATEWAY_KEY}` },
    // ...
  });
};
```

```typescript
// src/App.tsx - CLIENT-SIDE, DANGEROUS
const response = await fetch('https://api.rebyte.ai/api/ai/v1/chat/completions', {
  headers: { 'Authorization': 'Bearer aig_...' } // KEY EXPOSED!
});
```

Why: Frontend code is visible to users. Anyone can copy the key from browser DevTools.

Note: The AI Gateway does NOT allow cross-origin requests (no CORS), so frontend calls will fail anyway. This is intentional security.
```shell
# Create Next.js app with AI
npx create-next-app@latest my-ai-chat --typescript --tailwind
cd my-ai-chat

# Install dependencies
npm install ai @ai-sdk/openai
npm install -D @opennextjs/aws
```
```typescript
// app/api/chat/route.ts
import { createOpenAI } from '@ai-sdk/openai';
import { streamText } from 'ai';

// createOpenAI (not openai.provider) is how a custom baseURL/apiKey is configured
const client = createOpenAI({
  baseURL: process.env.REBYTE_AI_GATEWAY_URL,
  apiKey: process.env.REBYTE_AI_GATEWAY_KEY,
});

export async function POST(req: Request) {
  const { messages } = await req.json();
  const result = streamText({
    model: client('gemini-3-flash'),
    messages,
  });
  return result.toDataStreamResponse();
}
```
```tsx
// app/page.tsx
'use client';
import { useChat } from 'ai/react';

export default function Chat() {
  const { messages, input, handleInputChange, handleSubmit } = useChat();
  return (
    <div className="flex flex-col h-screen p-4">
      <div className="flex-1 overflow-auto">
        {messages.map(m => (
          <div key={m.id} className="mb-4">
            <strong>{m.role}:</strong> {m.content}
          </div>
        ))}
      </div>
      <form onSubmit={handleSubmit} className="flex gap-2">
        <input
          value={input}
          onChange={handleInputChange}
          className="flex-1 border p-2 rounded"
          placeholder="Say something..."
        />
        <button type="submit" className="bg-blue-500 text-white px-4 rounded">
          Send
        </button>
      </form>
    </div>
  );
}
```
```shell
# Build with OpenNext
npx @opennextjs/aws build

# Create .rebyte/ directory
mkdir -p .rebyte/static .rebyte/functions/default.func
cp -r .open-next/assets/* .rebyte/static/
cp -r .open-next/server-functions/default/* .rebyte/functions/default.func/

cat > .rebyte/config.json << 'EOF'
{
  "version": 1,
  "routes": [
    { "handle": "filesystem" },
    { "src": "^/(.*)$", "dest": "/functions/default" }
  ]
}
EOF

# Deploy
node /home/user/.skills/rebyteai-rebyte-app-builder/bin/rebyte.js deploy
```
```shell
# Provision the key
curl -X POST "$API_URL/api/data/aigateway/provision" \
  -H "Authorization: Bearer $AUTH_TOKEN"

# Redeploy to pick up the auto-injected REBYTE_* env vars
node /home/user/.skills/rebyteai-rebyte-app-builder/bin/rebyte.js deploy
```
Your AI chat app is now live at `https://<deploy-id>.rebyte.pro`!
| Issue | Cause | Fix |
|---|---|---|
| "Invalid API key" | Key revoked or incorrect | Run `aigateway/provision` again |
| CORS error in browser | Calling from frontend | Move the API call to a server function |
| 401 Unauthorized | Missing or wrong env var | Provision the key, then redeploy |
| "Model not found" | Invalid model name | Check the available models list |
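A server function can map failed gateway responses to the fixes above before surfacing them. A rough sketch (the status-to-hint mapping mirrors the table and is a heuristic, not documented gateway behavior):

```javascript
// Suggest a likely fix for a failed gateway response, per the table above.
function troubleshootGatewayError(status, bodyText) {
  if (status === 401) {
    return 'Provision the key (aigateway/provision), then redeploy';
  }
  if (/model.*not found/i.test(bodyText)) {
    return 'Invalid model name - check available models via GET /v1/models';
  }
  if (/invalid api key/i.test(bodyText)) {
    return 'Key revoked or incorrect - run aigateway/provision again';
  }
  return `Unhandled gateway error (HTTP ${status}): ${bodyText}`;
}

console.log(troubleshootGatewayError(401, 'Unauthorized'));
```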