You can enable built-in cross-conversational memory by sending Supermemory a user_id or a conversation_id in the request.

How Supermemory Identifies Users

Supermemory will find the user ID in the following places (in order of priority):

For all endpoints:

  1. Query Parameters:
    • user_id in the query string
  2. HTTP Headers:
    • x-supermemory-user-id header

For OpenAI-compatible endpoints:

  1. Request Body:
    • user field
    • metadata.user_id field
    • metadata.conversation_id field (fallback)

For Anthropic-compatible endpoints:

  1. Request Body:
    • metadata.user_id field
    • metadata.conversation_id field (fallback)

For Gemini-compatible endpoints:

  1. Request Body:
    • labels.user_id field
    • labels.conversation_id field (fallback)
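
For example, the two endpoint-agnostic options (query parameter and header) look like this in a raw fetch call. This is a minimal sketch: the proxy URL below is a placeholder rather than an actual Supermemory endpoint, and the request body assumes an OpenAI-compatible route.

// Minimal sketch of identifying the user on any endpoint.
// SUPERMEMORY_PROXY_URL is a placeholder; substitute the proxy endpoint from your setup.
const SUPERMEMORY_PROXY_URL = "https://<your-supermemory-proxy>/v1/chat/completions";

const body = JSON.stringify({
  model: "gpt-4o",
  messages: [{ role: "user", content: "Hello" }],
});

async function identifyUserExamples() {
  // Option 1: user_id in the query string
  await fetch(`${SUPERMEMORY_PROXY_URL}?user_id=user_123`, {
    method: "POST",
    headers: {
      "Content-Type": "application/json",
      Authorization: "Bearer YOUR_API_KEY",
    },
    body,
  });

  // Option 2: x-supermemory-user-id header
  await fetch(SUPERMEMORY_PROXY_URL, {
    method: "POST",
    headers: {
      "Content-Type": "application/json",
      Authorization: "Bearer YOUR_API_KEY",
      "x-supermemory-user-id": "user_123",
    },
    body,
  });
}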

Stateful vs. Stateless Usage

Supermemory is stateful when a user_id is provided. This means:

  • The system will store and retrieve context across multiple conversations for the same user ID
  • The full context of previous interactions is maintained

If no user ID is provided, Supermemory operates in a stateless mode, treating each conversation independently.
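
As a rough sketch of the difference, assuming an OpenAI client already routed through Supermemory (configured as in the OpenAI example further down):

// Sketch only: assumes `openai` is set up as in the OpenAI implementation example below.
async function statefulDemo() {
  // Conversation A: the user states a fact, tied to user_123.
  await openai.chat.completions.create({
    model: "gpt-4o",
    user: "user_123",
    messages: [{ role: "user", content: "My favorite language is TypeScript." }],
  });

  // Conversation B: a brand-new message array. Because the same user ID is sent,
  // Supermemory stays stateful and can surface the earlier fact as context.
  await openai.chat.completions.create({
    model: "gpt-4o",
    user: "user_123",
    messages: [{ role: "user", content: "What is my favorite language?" }],
  });

  // Omit the user ID and each request is handled statelessly, with no shared context.
}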

Important Considerations

  • If no user ID is provided and multiple users’ conversations share identical content in their first 20k tokens, those users may see each other’s data
  • Supermemory deduplicates repeated messages using a hash-matching system
  • For users with many distinct conversations, Supermemory tracks and manages those conversations internally
  • Always provide a user ID so memory retrieval stays consistent across sessions (see the sketch below)
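
One way to keep the user ID consistent is to derive it deterministically from a stable attribute, so every session sends the same value. A minimal sketch; the helper below is hypothetical and not part of Supermemory:

import { createHash } from "node:crypto";

// Hypothetical helper: derive a stable user ID from an account email so the
// same ID is sent on every request, across sessions and devices.
function stableUserId(email) {
  const digest = createHash("sha256").update(email.trim().toLowerCase()).digest("hex");
  return `user_${digest.slice(0, 16)}`;
}

// Always the same value for the same email; usable as the `user` field,
// metadata.user_id, labels.user_id, or the x-supermemory-user-id header.
const userId = stableUserId("ada@example.com");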

Implementation Examples

Google Gemini

import { GoogleGenAI } from "@google/genai";

const ai = new GoogleGenAI({ apiKey: "YOUR_API_KEY" });

async function main() {
  const response = await ai.models.generateContent({
    model: "gemini-2.0-flash",
    contents: "Explain how AI works in a few words",
    config: {
      // Option 1: Using labels in the request body
      labels: {
        user_id: "user_123"
      },
      // Option 2: Using headers
      httpOptions: {
        headers: {
          'x-supermemory-user-id': "user_123"
        }
      }
    },
  });
  console.log(response.text);
}

main();

Anthropic

import Anthropic from "@anthropic-ai/sdk";

const anthropic = new Anthropic({
  apiKey: 'YOUR_API_KEY', // defaults to process.env["ANTHROPIC_API_KEY"]
});

async function main() {
  const msg = await anthropic.messages.create({
    model: "claude-sonnet-4-20250514",
    max_tokens: 1024,
    messages: [{ role: "user", content: "Hello, Claude" }],
    // Option 1: Using metadata in the request body
    metadata: {
      user_id: "user_123"
    }
  }, {
    // Option 2: Using headers
    headers: {
      'x-supermemory-user-id': "user_123"
    }
  });
  
  console.log(msg);
}

main();

OpenAI

import OpenAI from "openai";

const openai = new OpenAI({
  apiKey: "YOUR_API_KEY"
});

async function main() {
  const completion = await openai.chat.completions.create({
    messages: [
      { role: "user", content: "Hello, Assistant" }
    ],
    model: "gpt-4o",
    // Option 1: Using user field
    user: "user_123",
    // Option 2: Using metadata
    metadata: {
      user_id: "user_123"
    }
  }, {
    // Option 3: Using headers
    headers: {
      'x-supermemory-user-id': "user_123"
    }
  });
  
  console.log(completion.choices[0].message);
}

main();