Integrate Memvid with the OpenAI SDK to use function calling with your knowledge base. The openai adapter provides function schemas formatted for OpenAI’s chat completions API.

Installation

npm install @memvid/sdk openai

Quick Start

import { use } from '@memvid/sdk';

// Open with OpenAI adapter
const mem = await use('openai', 'knowledge.mv2');

// Access function schemas
const functions = mem.functions;  // Array of function schemas
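
Each entry in functions is a plain JSON schema object (see Function Schemas below), so you can inspect it directly, for example to list the available function names:

// List the function names exposed by the adapter
console.log(functions.map((f: any) => f.name));
// => ['memvid_put', 'memvid_find', 'memvid_ask']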

Available Functions

The OpenAI adapter provides three functions:
Function       Description
memvid_put     Store documents in memory with title, label, and text
memvid_find    Search for relevant documents by query
memvid_ask     Ask questions with RAG-style answer synthesis
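
These map directly to methods on the handle returned by use, which is what the examples below call when the model requests a tool. For example (the document values here are only illustrative):

// Store a document, then query it back - the same operations the model can invoke
await mem.put({ title: 'Auth guide', label: 'docs', text: 'We support OAuth2 and API keys.' });
const results = await mem.find('authentication', { k: 5 });
const answer = await mem.ask('What authentication methods are supported?', { mode: 'auto' });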

Function Calling Example

import { use } from '@memvid/sdk';
import OpenAI from 'openai';

// Get Memvid functions
const mem = await use('openai', 'knowledge.mv2');
const functions = mem.functions;

// Create OpenAI client
const client = new OpenAI();

const messages: OpenAI.Chat.Completions.ChatCompletionMessageParam[] = [
  { role: 'system', content: 'You are a helpful assistant with access to a knowledge base.' },
  { role: 'user', content: 'Search for information about authentication' },
];

// Create completion with function calling
const response = await client.chat.completions.create({
  model: 'gpt-4o',
  messages,
  tools: functions.map((f: any) => ({ type: 'function' as const, function: f })),
  tool_choice: 'auto',
});

// Handle function calls
const message = response.choices[0].message;
if (message.tool_calls) {
  for (const toolCall of message.tool_calls) {
    const funcName = toolCall.function.name;
    const funcArgs = JSON.parse(toolCall.function.arguments);

    let result: any;
    if (funcName === 'memvid_find') {
      result = await mem.find(funcArgs.query, { k: funcArgs.top_k || 5 });
    } else if (funcName === 'memvid_put') {
      result = await mem.put({
        title: funcArgs.title,
        label: funcArgs.label,
        text: funcArgs.text,
      });
    } else if (funcName === 'memvid_ask') {
      result = await mem.ask(funcArgs.question, { mode: funcArgs.mode || 'auto' });
    }

    console.log(`Function ${funcName} result:`, result);
  }
}

Complete Conversation Loop

import { use } from '@memvid/sdk';
import OpenAI from 'openai';

const mem = await use('openai', 'knowledge.mv2');
const functions = mem.functions;

const client = new OpenAI();
const messages: OpenAI.Chat.Completions.ChatCompletionMessageParam[] = [
  { role: 'system', content: 'You have access to a knowledge base. Use the tools to help users.' },
  { role: 'user', content: 'What authentication methods are supported?' },
];

// Function to execute tool calls
async function executeFunction(name: string, args: any): Promise<any> {
  if (name === 'memvid_find') {
    return mem.find(args.query, { k: args.top_k || 5 });
  } else if (name === 'memvid_put') {
    return mem.put({ title: args.title, label: args.label, text: args.text });
  } else if (name === 'memvid_ask') {
    return mem.ask(args.question, { mode: args.mode || 'auto' });
  }
  return null;
}

// Conversation loop
while (true) {
  const response = await client.chat.completions.create({
    model: 'gpt-4o',
    messages,
    tools: functions.map((f: any) => ({ type: 'function' as const, function: f })),
    tool_choice: 'auto',
  });

  const message = response.choices[0].message;
  messages.push(message);

  if (message.tool_calls) {
    for (const toolCall of message.tool_calls) {
      const funcName = toolCall.function.name;
      const funcArgs = JSON.parse(toolCall.function.arguments);
      const result = await executeFunction(funcName, funcArgs);

      messages.push({
        role: 'tool',
        tool_call_id: toolCall.id,
        content: JSON.stringify(result) || 'Function executed',
      });
    }
  } else {
    // No more tool calls, print the response
    console.log(message.content);
    break;
  }
}
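
In practice you may not want a failed tool call (for example, malformed JSON arguments) to end the conversation. One option, sketched here, is to wrap executeFunction so the error text is returned as the tool result instead of being thrown:

// Sketch: surface execution errors to the model instead of aborting the loop
async function safeExecute(name: string, args: any): Promise<string> {
  try {
    const result = await executeFunction(name, args);
    return result != null ? JSON.stringify(result) : 'Function executed';
  } catch (err) {
    return `Error executing ${name}: ${(err as Error).message}`;
  }
}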

Function Schemas

memvid_put

{
  "name": "memvid_put",
  "description": "Store a document in Memvid memory for later retrieval",
  "parameters": {
    "type": "object",
    "properties": {
      "title": { "type": "string", "description": "Title of the document" },
      "label": { "type": "string", "description": "Category or label" },
      "text": { "type": "string", "description": "Text content to store" },
      "metadata": { "type": "object", "description": "Optional metadata" }
    },
    "required": ["title", "label", "text"]
  }
}

memvid_find

{
  "name": "memvid_find",
  "description": "Search Memvid memory for documents matching a query",
  "parameters": {
    "type": "object",
    "properties": {
      "query": { "type": "string", "description": "Search query string" },
      "top_k": { "type": "number", "description": "Number of results (default: 5)" }
    },
    "required": ["query"]
  }
}

memvid_ask

{
  "name": "memvid_ask",
  "description": "Ask a question and get an answer from Memvid memory",
  "parameters": {
    "type": "object",
    "properties": {
      "question": { "type": "string", "description": "Question to answer" },
      "mode": { "type": "string", "enum": ["auto", "lex", "sem"], "description": "Search mode" }
    },
    "required": ["question"]
  }
}
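
If you want type-checked dispatch, you can mirror these schemas as TypeScript interfaces. The type names below are illustrative; the SDK does not export them:

// Illustrative argument types derived from the schemas above
interface MemvidPutArgs { title: string; label: string; text: string; metadata?: Record<string, unknown>; }
interface MemvidFindArgs { query: string; top_k?: number; }
interface MemvidAskArgs { question: string; mode?: 'auto' | 'lex' | 'sem'; }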

Best Practices

  1. Use tool_choice: 'auto' to let the model decide when to use tools (to force a specific function instead, see the sketch below)
  2. Handle multiple tool calls - the model may call multiple functions
  3. Complete the loop - continue until no more tool_calls are returned
  4. Close the memory when done

const mem = await use('openai', 'knowledge.mv2');
try {
  // Use functions...
} finally {
  await mem.seal();
}
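
To force a particular call instead of letting the model choose (point 1 above), OpenAI's Chat Completions API also accepts a specific function in tool_choice. A sketch using the schemas from this adapter:

// Force the model to call memvid_ask rather than letting it decide
const response = await client.chat.completions.create({
  model: 'gpt-4o',
  messages,
  tools: functions.map((f: any) => ({ type: 'function' as const, function: f })),
  tool_choice: { type: 'function', function: { name: 'memvid_ask' } },
});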

Next Steps