Overview

This example walks through a complete chat application that uses Satori to remember user preferences, context, and past conversations across sessions.

Features

  • Persistent memory across sessions
  • Automatic information extraction
  • Semantic context retrieval
  • Natural conversation flow

Complete Implementation

API Route

app/api/chat/route.ts
import { streamText } from 'ai';
import { openai } from '@ai-sdk/openai';
import { memoryTools, getMemoryContext } from '@satori/tools';
import { auth } from '@clerk/nextjs/server';

export async function POST(req: Request) {
  try {
    // Authenticate user
    const { userId } = await auth();
    
    if (!userId) {
      return new Response('Unauthorized', { status: 401 });
    }
    
    // Parse request
    const { messages } = await req.json();
    const userMessage = messages.at(-1)?.content ?? '';
    
    // Configure memory
    const memoryConfig = {
      apiKey: process.env.SATORI_API_KEY!,
      baseUrl: process.env.SATORI_URL!,
      userId,
    };
    
    // Create memory tools
    const tools = memoryTools(memoryConfig);
    
    // Pre-fetch relevant context
    const memoryContext = await getMemoryContext(
      memoryConfig,
      userMessage,
      { limit: 5 }
    );
    
    // Stream response with memory
    const result = await streamText({
      model: openai('gpt-4o'),
      system: `You are a helpful AI assistant with long-term memory.

${memoryContext ? `What you know about this user:\n${memoryContext}\n` : ''}

When the user shares important information, use the add_memory tool to save it.
Important information includes:
- Personal preferences (favorite things, dislikes)
- Personal details (name, occupation, location)
- Goals and intentions
- Important dates or events
- Opinions and beliefs

Be natural and conversational. Don't explicitly mention that you're saving memories unless asked.`,
      messages,
      tools,
      maxSteps: 5,
    });
    
    return result.toDataStreamResponse();
  } catch (error) {
    console.error('Chat error:', error);
    return new Response('Internal Server Error', { status: 500 });
  }
}
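The system prompt in the route interpolates the pre-fetched memory context, omitting the "What you know about this user" section when retrieval comes back empty. Factored out for clarity, that assembly step looks roughly like this (`buildSystemPrompt` is a hypothetical helper for illustration, not part of @satori/tools):

```typescript
// Hypothetical helper that assembles the system prompt used in the route above.
// The memory context is optional: when retrieval returns nothing, the
// "What you know about this user" section is omitted entirely.
function buildSystemPrompt(memoryContext: string | null): string {
  const contextSection = memoryContext
    ? `What you know about this user:\n${memoryContext}\n`
    : '';
  return `You are a helpful AI assistant with long-term memory.

${contextSection}
When the user shares important information, use the add_memory tool to save it.`;
}
```

Keeping prompt assembly in one place makes it easy to test that stale or empty context never leaks an empty "What you know" header into the prompt.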

Chat Interface

app/chat/page.tsx
'use client';

import { useChat } from 'ai/react';
import { useUser } from '@clerk/nextjs';

export default function ChatPage() {
  const { user } = useUser();
  const { messages, input, handleInputChange, handleSubmit, isLoading } = useChat();

  return (
    <div className="flex flex-col h-screen max-w-4xl mx-auto p-4">
      {/* Header */}
      <div className="mb-4 pb-4 border-b">
        <h1 className="text-2xl font-bold">Chat with Memory</h1>
        <p className="text-sm text-gray-500">
          I'll remember what you tell me across conversations
        </p>
      </div>

      {/* Messages */}
      <div className="flex-1 overflow-y-auto space-y-4 mb-4">
        {messages.length === 0 && (
          <div className="text-center text-gray-500 mt-8">
            <p className="text-lg mb-4">Start a conversation!</p>
            <div className="space-y-2 text-sm">
              <p>Try saying:</p>
              <ul className="space-y-1">
                <li>"My name is Alex and I'm a software engineer"</li>
                <li>"I prefer TypeScript over JavaScript"</li>
                <li>"I love hiking and photography"</li>
                <li>"What do you know about me?"</li>
              </ul>
            </div>
          </div>
        )}
        
        {messages.map((message) => (
          <div
            key={message.id}
            className={`flex ${
              message.role === 'user' ? 'justify-end' : 'justify-start'
            }`}
          >
            <div
              className={`max-w-[80%] p-4 rounded-lg ${
                message.role === 'user'
                  ? 'bg-purple-600 text-white'
                  : 'bg-gray-100 text-gray-900'
              }`}
            >
              <p className="text-sm font-semibold mb-1">
                {message.role === 'user' ? 'You' : 'Assistant'}
              </p>
              <p className="whitespace-pre-wrap">{message.content}</p>
              
              {/* Show when memories are saved */}
              {message.toolInvocations?.map((tool, i) => (
                <div key={i} className="mt-2 text-xs opacity-75 flex items-center gap-1">
                  {tool.toolName === 'add_memory' && (
                    <>
                      <span>💾</span>
                      <span>Saved to memory</span>
                    </>
                  )}
                </div>
              ))}
            </div>
          </div>
        ))}
        
        {isLoading && (
          <div className="flex justify-start">
            <div className="bg-gray-100 p-4 rounded-lg">
              <div className="flex space-x-2">
                <div className="w-2 h-2 bg-gray-400 rounded-full animate-bounce" />
                <div className="w-2 h-2 bg-gray-400 rounded-full animate-bounce delay-100" />
                <div className="w-2 h-2 bg-gray-400 rounded-full animate-bounce delay-200" />
              </div>
            </div>
          </div>
        )}
      </div>

      {/* Input */}
      <form onSubmit={handleSubmit} className="flex gap-2">
        <input
          value={input}
          onChange={handleInputChange}
          placeholder="Type a message..."
          className="flex-1 p-3 border rounded-lg focus:outline-none focus:ring-2 focus:ring-purple-600"
          disabled={isLoading}
        />
        <button
          type="submit"
          disabled={isLoading || !input.trim()}
          className="px-6 py-3 bg-purple-600 text-white rounded-lg hover:bg-purple-700 disabled:opacity-50"
        >
          Send
        </button>
      </form>
    </div>
  );
}

Example Conversations

Conversation 1: Building Context

User: “Hi! My name is Alex and I’m a software engineer at Acme Corp.”
Assistant: “Nice to meet you, Alex! It’s great to connect with a software engineer. What kind of projects do you work on at Acme Corp?”
Behind the scenes: Saves “User’s name is Alex and works as a software engineer at Acme Corp”

User: “I really prefer TypeScript over JavaScript. The type safety is just so much better.”
Assistant: “I completely understand! TypeScript’s type safety does make a huge difference in catching bugs early and improving code maintainability. Do you use TypeScript for all your projects?”
Behind the scenes: Saves “User prefers TypeScript over JavaScript for type safety”

User: “In my free time, I love hiking and photography. I try to get out into nature every weekend.”
Assistant: “That’s a wonderful combination! Photography and hiking complement each other perfectly. Do you have a favorite hiking spot where you like to take photos?”
Behind the scenes: Saves “User enjoys hiking and photography, tries to go every weekend”

Conversation 2: Recalling Context (New Session)

User: “Hey, what’s up?”
Assistant: “Hey Alex! How’s it going? Did you get a chance to go hiking this weekend?”
Behind the scenes: Retrieved memories about name and hiking hobby

User: “I’m looking for a new side project to work on.”
Assistant: “Since you love TypeScript and photography, have you thought about building a photo management app? You could use TypeScript with Next.js for the frontend and maybe add some AI features for automatic tagging. It would combine your technical preferences with your hobby!”
Behind the scenes: Retrieved memories about TypeScript preference and photography hobby

User: “What do you remember about me?”
Assistant: “Let me see… I know that your name is Alex and you work as a software engineer at Acme Corp. You prefer TypeScript over JavaScript because of the type safety. In your free time, you enjoy hiking and photography, and you try to get out into nature every weekend. Is there anything else you’d like me to remember?”
Behind the scenes: Retrieved all relevant memories

Conversation 3: Updating Information

User: “Actually, I just started a new job at TechCo!”
Assistant: “Congratulations on the new position at TechCo! That’s exciting. What will you be working on there?”
Behind the scenes: Saves “User now works at TechCo” (new memory)

User: “Can you forget that I work at TechCo? I’d rather not share work details.”
Assistant: “Of course, I’ve removed that information. Your privacy is important.”
Behind the scenes: Searches for and deletes work-related memories

Key Features Demonstrated

Automatic Extraction

The LLM automatically identifies and saves important information without explicit commands.

Semantic Retrieval

Context is retrieved based on meaning, not just keywords.
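Under the hood, semantic retrieval typically ranks stored memories by embedding similarity rather than keyword overlap. A minimal sketch of that ranking step, assuming the memories have already been embedded (the two-dimensional vectors below are toy stand-ins for real embedding-model output):

```typescript
interface StoredMemory {
  text: string;
  embedding: number[]; // produced by an embedding model in a real system
}

// Cosine similarity between two equal-length vectors.
function cosineSimilarity(a: number[], b: number[]): number {
  let dot = 0;
  let normA = 0;
  let normB = 0;
  for (let i = 0; i < a.length; i++) {
    dot += a[i] * b[i];
    normA += a[i] * a[i];
    normB += b[i] * b[i];
  }
  return dot / (Math.sqrt(normA) * Math.sqrt(normB));
}

// Return the `limit` memories closest in meaning to the query embedding,
// mirroring the { limit: 5 } option passed to getMemoryContext above.
function retrieveContext(
  queryEmbedding: number[],
  memories: StoredMemory[],
  limit: number,
): string[] {
  return [...memories]
    .sort(
      (a, b) =>
        cosineSimilarity(queryEmbedding, b.embedding) -
        cosineSimilarity(queryEmbedding, a.embedding),
    )
    .slice(0, limit)
    .map((m) => m.text);
}
```

This is why "side project ideas" can surface a memory about photography: the query and the memory are close in embedding space even though they share no keywords.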

Natural Flow

Memory operations happen transparently without disrupting the conversation.

User Control

Users can ask what’s remembered and request deletions.
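The deletion flow from Conversation 3 amounts to a search-then-delete over the store. A self-contained sketch of that flow (this `MemoryStore` is purely illustrative; a real deployment would call the Satori client, and its search would be semantic rather than keyword-based):

```typescript
// Illustrative in-memory store; a real deployment would call the Satori API.
class MemoryStore {
  private memories: string[] = [];

  add(text: string): void {
    this.memories.push(text);
  }

  // Crude keyword search standing in for semantic search.
  search(query: string): string[] {
    const q = query.toLowerCase();
    return this.memories.filter((m) => m.toLowerCase().includes(q));
  }

  // "Forget" everything matching the query; returns how many were removed.
  forget(query: string): number {
    const matches = new Set(this.search(query));
    const before = this.memories.length;
    this.memories = this.memories.filter((m) => !matches.has(m));
    return before - this.memories.length;
  }

  all(): string[] {
    return [...this.memories];
  }
}
```

Returning the deletion count lets the assistant confirm to the user exactly what was forgotten, instead of silently succeeding when nothing matched.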

Customization Ideas

Split the generic add_memory tool into category-specific tools so memories carry structured metadata. Note that `tool` comes from the ai package, `z` from zod, and `client` is an initialized Satori client:

import { tool } from 'ai';
import { z } from 'zod';

const tools = {
  add_preference: tool({
    description: 'Save a user preference',
    parameters: z.object({ preference: z.string() }),
    execute: async ({ preference }) => {
      await client.addMemory(preference, {
        metadata: { category: 'preference' },
      });
    },
  }),
  add_fact: tool({
    description: 'Save a factual piece of information',
    parameters: z.object({ fact: z.string() }),
    execute: async ({ fact }) => {
      await client.addMemory(fact, {
        metadata: { category: 'fact' },
      });
    },
  }),
};

Extend the system prompt so the model grades each memory’s importance before saving:

system: `When saving memories, consider importance:
- High: Core identity, strong preferences, critical information
- Medium: Casual preferences, general interests
- Low: Temporary states, minor details

Add importance to metadata when saving.`

Store a summary of each session so later retrieval can draw on whole conversations, not just individual facts:

// After each conversation
const summary = await generateSummary(messages);
await client.addMemory(summary, {
  metadata: {
    type: 'conversation_summary',
    date: new Date().toISOString(),
  },
});
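`generateSummary` is left undefined above; in practice you would ask the model to compress the transcript. A naive, model-free placeholder that just extracts the user's own statements might look like this (the `ChatMessage` shape is an assumption for illustration):

```typescript
interface ChatMessage {
  role: 'user' | 'assistant' | 'system';
  content: string;
}

// Naive extractive "summary": keep only what the user said, most recent last.
// A production version would instead call an LLM to compress the transcript.
function generateSummary(messages: ChatMessage[], maxItems = 3): string {
  const userStatements = messages
    .filter((m) => m.role === 'user')
    .map((m) => m.content)
    .slice(-maxItems);
  return `User said: ${userStatements.join(' | ')}`;
}
```

Even this crude version is enough to make conversation summaries searchable later, since the user's phrasing usually carries the key facts.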

Next Steps