Integrate Memvid with the Vercel AI SDK to build AI-powered web applications. The vercel-ai adapter provides tools formatted for use with generateText, streamText, and other AI SDK functions.

Installation

npm install @memvid/sdk ai @ai-sdk/openai

Quick Start

import { use } from '@memvid/sdk';

// Open with Vercel AI adapter
const mem = await use('vercel-ai', 'knowledge.mv2');

// Access Vercel AI tools
const tools = mem.tools;  // Object with tool definitions

Available Tools

The Vercel AI adapter provides three tools:
Tool          Description
memvid_put    Store documents in memory with title, label, and text
memvid_find   Search for relevant documents by query
memvid_ask    Ask questions with RAG-style answer synthesis

Using with generateText

import { use } from '@memvid/sdk';
import { generateText } from 'ai';
import { openai } from '@ai-sdk/openai';

// Get Memvid tools
const mem = await use('vercel-ai', 'knowledge.mv2');

// Use with generateText
const result = await generateText({
  model: openai('gpt-4o-mini'),
  tools: mem.tools,
  maxSteps: 5,  // Allow multiple tool calls
  system: 'You are a helpful assistant with access to a knowledge base.',
  prompt: 'Search for information about authentication and summarize it.',
});

// Access the result
console.log(result.text);

// View tool calls made
for (const step of result.steps) {
  if (step.toolCalls) {
    for (const call of step.toolCalls) {
      console.log(`Tool: ${call.toolName}, Args: ${JSON.stringify(call.args)}`);
    }
  }
}

Using with streamText

import { use } from '@memvid/sdk';
import { streamText } from 'ai';
import { openai } from '@ai-sdk/openai';

// Get Memvid tools
const mem = await use('vercel-ai', 'knowledge.mv2');

// Stream response with tool use
const result = await streamText({
  model: openai('gpt-4o-mini'),
  tools: mem.tools,
  maxSteps: 3,
  system: 'You are a helpful assistant with access to a knowledge base.',
  prompt: 'What features does the product have?',
});

// Stream the text output
for await (const chunk of result.textStream) {
  process.stdout.write(chunk);
}

Direct Tool Usage

You can also call tools directly without using an LLM:
import { use } from '@memvid/sdk';

const mem = await use('vercel-ai', 'knowledge.mv2');
const tools = mem.tools;

// Store a document
const putResult = await tools.memvid_put.execute({
  title: 'API Documentation',
  label: 'docs',
  text: 'The API supports REST and GraphQL endpoints...',
});
console.log(putResult);  // "Document stored with frame_id: 2"

// Search for documents
const findResult = await tools.memvid_find.execute({
  query: 'API endpoints',
  top_k: 5,
});
console.log(findResult);

// Ask a question
const askResult = await tools.memvid_ask.execute({
  question: 'How do I authenticate with the API?',
  mode: 'auto',
});
console.log(askResult);

Next.js API Route

// app/api/chat/route.ts
import { use } from '@memvid/sdk';
import { streamText } from 'ai';
import { openai } from '@ai-sdk/openai';

export async function POST(req: Request) {
  const { messages } = await req.json();

  // Get Memvid tools
  const mem = await use('vercel-ai', 'knowledge.mv2');

  const result = await streamText({
    model: openai('gpt-4o-mini'),
    tools: mem.tools,
    messages,
    maxSteps: 5,
  });

  return result.toDataStreamResponse();
}

Next.js with useChat

// app/page.tsx
'use client';
import { useChat } from 'ai/react';

export default function Chat() {
  const { messages, input, handleInputChange, handleSubmit } = useChat({
    api: '/api/chat',
  });

  return (
    <div>
      {messages.map((m) => (
        <div key={m.id}>
          <strong>{m.role}:</strong> {m.content}
        </div>
      ))}
      <form onSubmit={handleSubmit}>
        <input
          value={input}
          onChange={handleInputChange}
          placeholder="Ask about your knowledge base..."
        />
        <button type="submit">Send</button>
      </form>
    </div>
  );
}

Tool Parameters

memvid_put

Parameter   Type     Required   Description
title       string   Yes        Title of the document
label       string   Yes        Category or label
text        string   Yes        Text content to store
metadata    object   No         Optional key-value metadata

memvid_find

Parameter   Type     Required   Description
query       string   Yes        Search query string
top_k       number   No         Number of results (default: 5)

memvid_ask

Parameter   Type     Required   Description
question    string   Yes        Question to answer
mode        string   No         'auto', 'lex', or 'sem'

Best Practices

  1. Set maxSteps to allow the model to make multiple tool calls when needed
  2. Use streaming for better user experience with streamText
  3. Handle tool results by checking result.steps for tool call history
  4. Close the memory when done with mem.seal():

import { use } from '@memvid/sdk';
import { generateText } from 'ai';
import { openai } from '@ai-sdk/openai';

const mem = await use('vercel-ai', 'knowledge.mv2');
try {
  // Use tools...
  const result = await generateText({
    model: openai('gpt-4o-mini'),
    tools: mem.tools,
    prompt: 'Search for...',
  });
} finally {
  await mem.seal();
}

Next Steps