What You’ll Build

A customer support chatbot that:
  • Answers questions from your documentation
  • Maintains conversation context
  • Streams responses in real-time
  • Displays source citations
Time: 30 minutes
Prerequisites: Cuadra AI account, Node.js 18+

Step 1: Create a Model

Create an AI model via the Dashboard or API. For API access, authenticate via the M2M OAuth 2.0 client-credentials flow and send the resulting access token as a Bearer header.
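A minimal sketch of fetching that token with a standard client-credentials request, in TypeScript (Node 18+). The token endpoint URL and environment variable names below are assumptions; copy the real values from your M2M client settings in the Dashboard:

// get-token.ts — standard OAuth 2.0 client-credentials exchange.
// ASSUMPTION: the token URL is a placeholder; use the endpoint shown
// with your Cuadra AI M2M credentials.
const tokenRes = await fetch("https://auth.cuadra.ai/oauth2/token", {
  method: "POST",
  headers: { "Content-Type": "application/x-www-form-urlencoded" },
  body: new URLSearchParams({
    grant_type: "client_credentials",
    client_id: process.env.CUADRA_CLIENT_ID!,
    client_secret: process.env.CUADRA_CLIENT_SECRET!,
  }),
});
const { access_token } = await tokenRes.json();
console.log(access_token); // export this as ACCESS_TOKEN for the curl examples

Export the printed token as ACCESS_TOKEN, then create the model: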
curl -X POST https://api.cuadra.ai/v1/models \
  -H "Authorization: Bearer $ACCESS_TOKEN" \
  -H "Content-Type: application/json" \
  -H "Idempotency-Key: create-model-001" \
  -d '{
    "name": "Support Bot",
    "provider": "openai",
    "providerModelId": "gpt-4o-mini"
  }'
Save the returned id (e.g., model_abc123).

Step 2: Create a Knowledge Base

Upload your documentation:

Create Dataset

curl -X POST https://api.cuadra.ai/v1/datasets \
  -H "Authorization: Bearer $ACCESS_TOKEN" \
  -H "Content-Type: application/json" \
  -H "Idempotency-Key: create-dataset-001" \
  -d '{"name": "Product Docs"}'

Upload Documents

Upload files and associate them with the dataset:
# Step 1: Upload the file
FILE_RESPONSE=$(curl -s -X POST https://api.cuadra.ai/v1/files \
  -H "Authorization: Bearer $ACCESS_TOKEN" \
  -H "Idempotency-Key: upload-doc-001" \
  -F "file=@docs/getting-started.pdf")
FILE_ID=$(echo "$FILE_RESPONSE" | jq -r '.id')

# Step 2: Associate with dataset
curl -X POST https://api.cuadra.ai/v1/files/$FILE_ID/associations \
  -H "Authorization: Bearer $ACCESS_TOKEN" \
  -H "Content-Type: application/json" \
  -d '{"datasetId": "ds_xyz"}'

Step 3: Connect the Knowledge Base

Link the dataset to your model:
curl -X POST https://api.cuadra.ai/v1/models/model_abc123/datasets \
  -H "Authorization: Bearer $ACCESS_TOKEN" \
  -H "Content-Type: application/json" \
  -H "Idempotency-Key: link-dataset-001" \
  -d '{"datasetId": "ds_xyz", "usageType": "rag"}'

Step 4: Add a System Prompt

Create a particle for the bot’s behavior:
curl -X POST https://api.cuadra.ai/v1/particles \
  -H "Authorization: Bearer $ACCESS_TOKEN" \
  -H "Content-Type: application/json" \
  -H "Idempotency-Key: create-particle-001" \
  -d '{
    "name": "Support Role",
    "category": "role",
    "content": "You are a helpful support agent. Answer questions based on the provided documentation. If unsure, say so and suggest contacting support@example.com."
  }'
Save the returned id (e.g., particle_xxx); it’s referenced in the system prompt below.

Create System Prompt

curl -X POST https://api.cuadra.ai/v1/system-prompts \
  -H "Authorization: Bearer $ACCESS_TOKEN" \
  -H "Content-Type: application/json" \
  -H "Idempotency-Key: create-sysprompt-001" \
  -d '{
    "name": "Support Prompt",
    "particles": [{"particleId": "particle_xxx", "order": 1}]
  }'
Save the returned id (e.g., sysprompt_yyy).

Attach to Model

curl -X PATCH https://api.cuadra.ai/v1/models/model_abc123 \
  -H "Authorization: Bearer $ACCESS_TOKEN" \
  -H "Content-Type: application/json" \
  -d '{"systemPromptId": "sysprompt_yyy"}'

Step 5: Test via API

Verify everything works:
curl -X POST https://api.cuadra.ai/v1/chats \
  -H "Authorization: Bearer $ACCESS_TOKEN" \
  -H "Content-Type: application/json" \
  -H "Idempotency-Key: test-chat-001" \
  -d '{
    "modelId": "model_abc123",
    "messages": [{"role": "user", "content": "How do I get started?"}]
  }'
You should see a response with content from your documentation and source citations.
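You can also make the same call from Node and inspect the raw payload. This sketch doesn’t assume a response schema; it simply prints whatever comes back so you can see where the answer text and citations live:

// inspect-chat.ts — sends the same test message and prints the raw response.
const res = await fetch("https://api.cuadra.ai/v1/chats", {
  method: "POST",
  headers: {
    Authorization: `Bearer ${process.env.ACCESS_TOKEN}`,
    "Content-Type": "application/json",
    "Idempotency-Key": "test-chat-002",
  },
  body: JSON.stringify({
    modelId: "model_abc123",
    messages: [{ role: "user", content: "How do I get started?" }],
  }),
});
console.log(JSON.stringify(await res.json(), null, 2));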

Step 6: Build the React UI

Install the Cuadra AI UI Kit:
npm install @cuadra-ai/uikit
Create the chat component:
// src/components/SupportChat.tsx
import { CuadraChat } from '@cuadra-ai/uikit';

export function SupportChat({ sessionToken }: { sessionToken: string }) {
  return (
    <div style={{ height: '600px', width: '400px' }}>
      <CuadraChat
        connection={{
          baseUrl: "https://api.cuadra.ai",
          sessionToken: sessionToken
        }}
        chat={{
          modelId: "model_abc123",
          mode: "multiChat"
        }}
      />
    </div>
  );
}

Step 7: User Authentication

Users authenticate via your Stytch B2B integration. The session token from Stytch is passed to the UI Kit.

Frontend Integration

import { useState, useEffect } from 'react';
import { CuadraChat } from '@cuadra-ai/uikit';

export function App() {
  // Session token from your Stytch B2B authentication
  const [sessionToken, setSessionToken] = useState<string | null>(null);

  useEffect(() => {
    // Your auth system provides the session token after user login
    const token = getStytchSessionToken(); // From your auth implementation
    setSessionToken(token);
  }, []);

  if (!sessionToken) {
    return <div>Please log in</div>;
  }

  return (
    <CuadraChat
      connection={{
        baseUrl: "https://api.cuadra.ai",
        sessionToken: sessionToken
      }}
      chat={{
        modelId: "model_abc123",
        mode: "multiChat"
      }}
    />
  );
}
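The getStytchSessionToken call above is a stand-in for your own auth code. As one hypothetical sketch, if your Stytch B2B login flow leaves the session token in a cookie readable by the frontend, it could look like this (the cookie name is an assumption; match it to your setup, or read the token from your own state instead):

// src/auth/getStytchSessionToken.ts
// Hypothetical helper — "stytch_session" is a placeholder cookie name.
export function getStytchSessionToken(): string | null {
  const match = document.cookie.match(/(?:^|;\s*)stytch_session=([^;]+)/);
  return match ? decodeURIComponent(match[1]) : null;
}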

Proxy Mode (Alternative)

If you prefer to handle auth on your backend, route chat requests through it instead of passing a session token to the browser component.

Frontend:
<CuadraChat
  connection={{
    proxyUrl: "/api/chat"  // Your backend handles auth
  }}
  chat={{
    modelId: "model_abc123"
  }}
/>
Backend Proxy:
from fastapi import FastAPI, Request
import httpx
import os

app = FastAPI()

# M2M access token for the Cuadra AI API (keep it server-side only)
M2M_ACCESS_TOKEN = os.environ["M2M_ACCESS_TOKEN"]

@app.post("/api/chat")
async def proxy_chat(request: Request):
    # NOTE: verify the caller's own session (e.g., their Stytch cookie)
    # here before forwarding anything upstream.
    body = await request.json()

    async with httpx.AsyncClient() as client:
        response = await client.post(
            "https://api.cuadra.ai/v1/chats",
            headers={
                "Authorization": f"Bearer {M2M_ACCESS_TOKEN}",
                "Content-Type": "application/json",
            },
            json=body,
        )
        # Returns the complete JSON response (no token-by-token streaming)
        return response.json()
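If your backend runs on Node rather than Python, the same proxy is a few lines of Express. This is a hedged sketch: the /v1/chats endpoint and headers mirror the examples above, while the route, port, and environment variable name are placeholders:

// server.ts — TypeScript/Express equivalent of the FastAPI proxy above.
import express from "express";

const app = express();
app.use(express.json());

// Keep the M2M token on the server; never ship it to the browser.
const M2M_ACCESS_TOKEN = process.env.M2M_ACCESS_TOKEN!;

app.post("/api/chat", async (req, res) => {
  // NOTE: verify the caller's own session here before forwarding.
  const upstream = await fetch("https://api.cuadra.ai/v1/chats", {
    method: "POST",
    headers: {
      Authorization: `Bearer ${M2M_ACCESS_TOKEN}`,
      "Content-Type": "application/json",
    },
    body: JSON.stringify(req.body),
  });
  res.status(upstream.status).json(await upstream.json());
});

app.listen(3001);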

Next Steps