autogen_ext.memory.chromadb#

class ChromaDBVectorMemory(config: ChromaDBVectorMemoryConfig | None = None)[source]#

Bases: Memory, Component[ChromaDBVectorMemoryConfig]

Stores and retrieves memories using vector search powered by ChromaDB.

ChromaDBVectorMemory provides a vector-based memory implementation that uses ChromaDB to store and retrieve content based on semantic similarity. It enhances agents with the ability to recall contextually relevant information during conversations by leveraging vector embeddings to find similar content.

This implementation serves as a reference for more complex memory systems that use vector embeddings. For advanced use cases requiring specialized formatting of retrieved content, users should extend this class and override the update_context() method.

This implementation requires the ChromaDB extra to be installed. Install with:

pip install "autogen-ext[chromadb]"
Parameters:

config (ChromaDBVectorMemoryConfig | None) – Configuration for the ChromaDB memory. If None, defaults to a PersistentChromaDBVectorMemoryConfig with default values. Two configuration types are supported: * PersistentChromaDBVectorMemoryConfig: For local storage * HttpChromaDBVectorMemoryConfig: For connecting to a remote ChromaDB server

Example

import os
import asyncio
from pathlib import Path
from autogen_agentchat.agents import AssistantAgent
from autogen_agentchat.ui import Console
from autogen_core.memory import MemoryContent, MemoryMimeType
from autogen_ext.memory.chromadb import (
    ChromaDBVectorMemory,
    PersistentChromaDBVectorMemoryConfig,
    SentenceTransformerEmbeddingFunctionConfig,
    OpenAIEmbeddingFunctionConfig,
)
from autogen_ext.models.openai import OpenAIChatCompletionClient


def get_weather(city: str) -> str:
    return f"The weather in {city} is sunny with a high of 90°F and a low of 70°F."


def fahrenheit_to_celsius(fahrenheit: float) -> float:
    return (fahrenheit - 32) * 5.0 / 9.0


async def main() -> None:
    # Use default embedding function
    default_memory = ChromaDBVectorMemory(
        config=PersistentChromaDBVectorMemoryConfig(
            collection_name="user_preferences",
            persistence_path=os.path.join(str(Path.home()), ".chromadb_autogen"),
            k=3,  # Return top 3 results
            score_threshold=0.5,  # Minimum similarity score
        )
    )

    # Using a custom SentenceTransformer model
    custom_memory = ChromaDBVectorMemory(
        config=PersistentChromaDBVectorMemoryConfig(
            collection_name="multilingual_memory",
            persistence_path=os.path.join(str(Path.home()), ".chromadb_autogen"),
            embedding_function_config=SentenceTransformerEmbeddingFunctionConfig(
                model_name="paraphrase-multilingual-mpnet-base-v2"
            ),
        )
    )

    # Using OpenAI embeddings
    openai_memory = ChromaDBVectorMemory(
        config=PersistentChromaDBVectorMemoryConfig(
            collection_name="openai_memory",
            persistence_path=os.path.join(str(Path.home()), ".chromadb_autogen"),
            embedding_function_config=OpenAIEmbeddingFunctionConfig(
                api_key=os.environ["OPENAI_API_KEY"], model_name="text-embedding-3-small"
            ),
        )
    )

    # Add user preferences to memory
    await openai_memory.add(
        MemoryContent(
            content="The user prefers weather temperatures in Celsius",
            mime_type=MemoryMimeType.TEXT,
            metadata={"category": "preferences", "type": "units"},
        )
    )

    # Create assistant agent with ChromaDB memory
    assistant = AssistantAgent(
        name="assistant",
        model_client=OpenAIChatCompletionClient(
            model="gpt-4.1",
        ),
        tools=[
            get_weather,
            fahrenheit_to_celsius,
        ],
        max_tool_iterations=10,
        memory=[openai_memory],
    )

    # The memory will automatically retrieve relevant content during conversations
    await Console(assistant.run_stream(task="What's the temperature in New York?"))

    # Remember to close the memory when finished
    await default_memory.close()
    await custom_memory.close()
    await openai_memory.close()


asyncio.run(main())

Output

---------- TextMessage (user) ----------
What's the temperature in New York?
---------- MemoryQueryEvent (assistant) ----------
[MemoryContent(content='The user prefers weather temperatures in Celsius', mime_type='MemoryMimeType.TEXT', metadata={'type': 'units', 'category': 'preferences', 'mime_type': 'MemoryMimeType.TEXT', 'score': 0.3133561611175537, 'id': 'fb00506c-acf4-4174-93d7-2a942593f3f7'}), MemoryContent(content='The user prefers weather temperatures in Celsius', mime_type='MemoryMimeType.TEXT', metadata={'mime_type': 'MemoryMimeType.TEXT', 'category': 'preferences', 'type': 'units', 'score': 0.3133561611175537, 'id': '34311689-b419-4e1a-8bc4-09143f356c66'})]
---------- ToolCallRequestEvent (assistant) ----------
[FunctionCall(id='call_7TjsFd430J1aKwU5T2w8bvdh', arguments='{"city":"New York"}', name='get_weather')]
---------- ToolCallExecutionEvent (assistant) ----------
[FunctionExecutionResult(content='The weather in New York is sunny with a high of 90°F and a low of 70°F.', name='get_weather', call_id='call_7TjsFd430J1aKwU5T2w8bvdh', is_error=False)]
---------- ToolCallRequestEvent (assistant) ----------
[FunctionCall(id='call_RTjMHEZwDXtjurEYTjDlvq9c', arguments='{"fahrenheit": 90}', name='fahrenheit_to_celsius'), FunctionCall(id='call_3mMuCK1aqtzZPTqIHPoHKxtP', arguments='{"fahrenheit": 70}', name='fahrenheit_to_celsius')]
---------- ToolCallExecutionEvent (assistant) ----------
[FunctionExecutionResult(content='32.22222222222222', name='fahrenheit_to_celsius', call_id='call_RTjMHEZwDXtjurEYTjDlvq9c', is_error=False), FunctionExecutionResult(content='21.11111111111111', name='fahrenheit_to_celsius', call_id='call_3mMuCK1aqtzZPTqIHPoHKxtP', is_error=False)]
---------- TextMessage (assistant) ----------
The temperature in New York today is sunny with a high of about 32°C and a low of about 21°C.
component_config_schema#

Alias of ChromaDBVectorMemoryConfig

component_provider_override: ClassVar[str | None] = 'autogen_ext.memory.chromadb.ChromaDBVectorMemory'#

Override the provider string for the component. This should be used to prevent internal module names from becoming part of the module name.

property collection_name: str#

Get the name of the ChromaDB collection.

async update_context(model_context: ChatCompletionContext) UpdateContextResult[source]#

Update the provided model context with relevant memory content.

Parameters:

model_context – The context to update.

Returns:

UpdateContextResult containing relevant memories

async add(content: MemoryContent, cancellation_token: CancellationToken | None = None) None[source]#

Add a new content item to memory.

Parameters:
  • content – The memory content to add

  • cancellation_token – Optional token to cancel the operation

async query(query: str | MemoryContent, cancellation_token: CancellationToken | None = None, **kwargs: Any) MemoryQueryResult[source]#

Query memory and return relevant entries.

Parameters:
  • query – Query content item

  • cancellation_token – Optional token to cancel the operation

  • **kwargs – Additional implementation-specific parameters

Returns:

MemoryQueryResult containing memory entries with relevance scores

async clear() None[source]#

Clear all entries from memory.

async close() None[source]#

Clean up the ChromaDB client and resources.

async reset() None[source]#
pydantic model ChromaDBVectorMemoryConfig[source]#

Bases: BaseModel

Base configuration for the ChromaDB-based memory implementation.

Changed in version v0.4.1: Added support for custom embedding functions via embedding_function_config.

Show JSON schema
{
   "title": "ChromaDBVectorMemoryConfig",
   "description": "Base configuration for ChromaDB-based memory implementation.\n\n.. versionchanged:: v0.4.1\n   Added support for custom embedding functions via embedding_function_config.",
   "type": "object",
   "properties": {
      "client_type": {
         "enum": [
            "persistent",
            "http"
         ],
         "title": "Client Type",
         "type": "string"
      },
      "collection_name": {
         "default": "memory_store",
         "description": "Name of the ChromaDB collection",
         "title": "Collection Name",
         "type": "string"
      },
      "distance_metric": {
         "default": "cosine",
         "description": "Distance metric for similarity search",
         "title": "Distance Metric",
         "type": "string"
      },
      "k": {
         "default": 3,
         "description": "Number of results to return in queries",
         "title": "K",
         "type": "integer"
      },
      "score_threshold": {
         "anyOf": [
            {
               "type": "number"
            },
            {
               "type": "null"
            }
         ],
         "default": null,
         "description": "Minimum similarity score threshold",
         "title": "Score Threshold"
      },
      "allow_reset": {
         "default": false,
         "description": "Whether to allow resetting the ChromaDB client",
         "title": "Allow Reset",
         "type": "boolean"
      },
      "tenant": {
         "default": "default_tenant",
         "description": "Tenant to use",
         "title": "Tenant",
         "type": "string"
      },
      "database": {
         "default": "default_database",
         "description": "Database to use",
         "title": "Database",
         "type": "string"
      },
      "embedding_function_config": {
         "description": "Configuration for the embedding function",
         "discriminator": {
            "mapping": {
               "default": "#/$defs/DefaultEmbeddingFunctionConfig",
               "openai": "#/$defs/OpenAIEmbeddingFunctionConfig",
               "sentence_transformer": "#/$defs/SentenceTransformerEmbeddingFunctionConfig"
            },
            "propertyName": "function_type"
         },
         "oneOf": [
            {
               "$ref": "#/$defs/DefaultEmbeddingFunctionConfig"
            },
            {
               "$ref": "#/$defs/SentenceTransformerEmbeddingFunctionConfig"
            },
            {
               "$ref": "#/$defs/OpenAIEmbeddingFunctionConfig"
            }
         ],
         "title": "Embedding Function Config"
      }
   },
   "$defs": {
      "DefaultEmbeddingFunctionConfig": {
         "description": "Configuration for the default ChromaDB embedding function.\n\nUses ChromaDB's default embedding function (Sentence Transformers all-MiniLM-L6-v2).\n\n.. versionadded:: v0.4.1\n   Support for custom embedding functions in ChromaDB memory.",
         "properties": {
            "function_type": {
               "const": "default",
               "default": "default",
               "title": "Function Type",
               "type": "string"
            }
         },
         "title": "DefaultEmbeddingFunctionConfig",
         "type": "object"
      },
      "OpenAIEmbeddingFunctionConfig": {
         "description": "Configuration for OpenAI embedding functions.\n\nUses OpenAI's embedding API for generating embeddings.\n\n.. versionadded:: v0.4.1\n   Support for custom embedding functions in ChromaDB memory.\n\nArgs:\n    api_key (str): OpenAI API key. If empty, will attempt to use environment variable.\n    model_name (str): OpenAI embedding model name. Defaults to \"text-embedding-ada-002\".\n\nExample:\n    .. code-block:: python\n\n        from autogen_ext.memory.chromadb import OpenAIEmbeddingFunctionConfig\n\n        _ = OpenAIEmbeddingFunctionConfig(api_key=\"sk-...\", model_name=\"text-embedding-3-small\")",
         "properties": {
            "function_type": {
               "const": "openai",
               "default": "openai",
               "title": "Function Type",
               "type": "string"
            },
            "api_key": {
               "default": "",
               "description": "OpenAI API key",
               "title": "Api Key",
               "type": "string"
            },
            "model_name": {
               "default": "text-embedding-ada-002",
               "description": "OpenAI embedding model name",
               "title": "Model Name",
               "type": "string"
            }
         },
         "title": "OpenAIEmbeddingFunctionConfig",
         "type": "object"
      },
      "SentenceTransformerEmbeddingFunctionConfig": {
         "description": "Configuration for SentenceTransformer embedding functions.\n\nAllows specifying a custom SentenceTransformer model for embeddings.\n\n.. versionadded:: v0.4.1\n   Support for custom embedding functions in ChromaDB memory.\n\nArgs:\n    model_name (str): Name of the SentenceTransformer model to use.\n        Defaults to \"all-MiniLM-L6-v2\".\n\nExample:\n    .. code-block:: python\n\n        from autogen_ext.memory.chromadb import SentenceTransformerEmbeddingFunctionConfig\n\n        _ = SentenceTransformerEmbeddingFunctionConfig(model_name=\"paraphrase-multilingual-mpnet-base-v2\")",
         "properties": {
            "function_type": {
               "const": "sentence_transformer",
               "default": "sentence_transformer",
               "title": "Function Type",
               "type": "string"
            },
            "model_name": {
               "default": "all-MiniLM-L6-v2",
               "description": "SentenceTransformer model name to use",
               "title": "Model Name",
               "type": "string"
            }
         },
         "title": "SentenceTransformerEmbeddingFunctionConfig",
         "type": "object"
      }
   },
   "required": [
      "client_type"
   ]
}

Fields:
  • allow_reset (bool)

  • client_type (Literal['persistent', 'http'])

  • collection_name (str)

  • database (str)

  • distance_metric (str)

  • embedding_function_config (autogen_ext.memory.chromadb._chroma_configs.DefaultEmbeddingFunctionConfig | autogen_ext.memory.chromadb._chroma_configs.SentenceTransformerEmbeddingFunctionConfig | autogen_ext.memory.chromadb._chroma_configs.OpenAIEmbeddingFunctionConfig | autogen_ext.memory.chromadb._chroma_configs.CustomEmbeddingFunctionConfig)

  • k (int)

  • score_threshold (float | None)

  • tenant (str)

field client_type: Literal['persistent', 'http'] [Required]#
field collection_name: str = 'memory_store'#

Name of the ChromaDB collection

field distance_metric: str = 'cosine'#

Distance metric for similarity search

field k: int = 3#

Number of results to return in queries

field score_threshold: float | None = None#

Minimum similarity score threshold

field allow_reset: bool = False#

Whether to allow resetting the ChromaDB client

field tenant: str = 'default_tenant'#

Tenant to use

field database: str = 'default_database'#

Database to use

field embedding_function_config: Annotated[DefaultEmbeddingFunctionConfig | SentenceTransformerEmbeddingFunctionConfig | OpenAIEmbeddingFunctionConfig | CustomEmbeddingFunctionConfig, FieldInfo(annotation=NoneType, required=True, discriminator='function_type')] [Optional]#

Configuration for the embedding function

pydantic model PersistentChromaDBVectorMemoryConfig[source]#

Bases: ChromaDBVectorMemoryConfig

Configuration for persistent ChromaDB memory.

Show JSON schema
{
   "title": "PersistentChromaDBVectorMemoryConfig",
   "description": "Configuration for persistent ChromaDB memory.",
   "type": "object",
   "properties": {
      "client_type": {
         "default": "persistent",
         "enum": [
            "persistent",
            "http"
         ],
         "title": "Client Type",
         "type": "string"
      },
      "collection_name": {
         "default": "memory_store",
         "description": "Name of the ChromaDB collection",
         "title": "Collection Name",
         "type": "string"
      },
      "distance_metric": {
         "default": "cosine",
         "description": "Distance metric for similarity search",
         "title": "Distance Metric",
         "type": "string"
      },
      "k": {
         "default": 3,
         "description": "Number of results to return in queries",
         "title": "K",
         "type": "integer"
      },
      "score_threshold": {
         "anyOf": [
            {
               "type": "number"
            },
            {
               "type": "null"
            }
         ],
         "default": null,
         "description": "Minimum similarity score threshold",
         "title": "Score Threshold"
      },
      "allow_reset": {
         "default": false,
         "description": "Whether to allow resetting the ChromaDB client",
         "title": "Allow Reset",
         "type": "boolean"
      },
      "tenant": {
         "default": "default_tenant",
         "description": "Tenant to use",
         "title": "Tenant",
         "type": "string"
      },
      "database": {
         "default": "default_database",
         "description": "Database to use",
         "title": "Database",
         "type": "string"
      },
      "embedding_function_config": {
         "description": "Configuration for the embedding function",
         "discriminator": {
            "mapping": {
               "default": "#/$defs/DefaultEmbeddingFunctionConfig",
               "openai": "#/$defs/OpenAIEmbeddingFunctionConfig",
               "sentence_transformer": "#/$defs/SentenceTransformerEmbeddingFunctionConfig"
            },
            "propertyName": "function_type"
         },
         "oneOf": [
            {
               "$ref": "#/$defs/DefaultEmbeddingFunctionConfig"
            },
            {
               "$ref": "#/$defs/SentenceTransformerEmbeddingFunctionConfig"
            },
            {
               "$ref": "#/$defs/OpenAIEmbeddingFunctionConfig"
            }
         ],
         "title": "Embedding Function Config"
      },
      "persistence_path": {
         "default": "./chroma_db",
         "description": "Path for persistent storage",
         "title": "Persistence Path",
         "type": "string"
      }
   },
   "$defs": {
      "DefaultEmbeddingFunctionConfig": {
         "description": "Configuration for the default ChromaDB embedding function.\n\nUses ChromaDB's default embedding function (Sentence Transformers all-MiniLM-L6-v2).\n\n.. versionadded:: v0.4.1\n   Support for custom embedding functions in ChromaDB memory.",
         "properties": {
            "function_type": {
               "const": "default",
               "default": "default",
               "title": "Function Type",
               "type": "string"
            }
         },
         "title": "DefaultEmbeddingFunctionConfig",
         "type": "object"
      },
      "OpenAIEmbeddingFunctionConfig": {
         "description": "Configuration for OpenAI embedding functions.\n\nUses OpenAI's embedding API for generating embeddings.\n\n.. versionadded:: v0.4.1\n   Support for custom embedding functions in ChromaDB memory.\n\nArgs:\n    api_key (str): OpenAI API key. If empty, will attempt to use environment variable.\n    model_name (str): OpenAI embedding model name. Defaults to \"text-embedding-ada-002\".\n\nExample:\n    .. code-block:: python\n\n        from autogen_ext.memory.chromadb import OpenAIEmbeddingFunctionConfig\n\n        _ = OpenAIEmbeddingFunctionConfig(api_key=\"sk-...\", model_name=\"text-embedding-3-small\")",
         "properties": {
            "function_type": {
               "const": "openai",
               "default": "openai",
               "title": "Function Type",
               "type": "string"
            },
            "api_key": {
               "default": "",
               "description": "OpenAI API key",
               "title": "Api Key",
               "type": "string"
            },
            "model_name": {
               "default": "text-embedding-ada-002",
               "description": "OpenAI embedding model name",
               "title": "Model Name",
               "type": "string"
            }
         },
         "title": "OpenAIEmbeddingFunctionConfig",
         "type": "object"
      },
      "SentenceTransformerEmbeddingFunctionConfig": {
         "description": "Configuration for SentenceTransformer embedding functions.\n\nAllows specifying a custom SentenceTransformer model for embeddings.\n\n.. versionadded:: v0.4.1\n   Support for custom embedding functions in ChromaDB memory.\n\nArgs:\n    model_name (str): Name of the SentenceTransformer model to use.\n        Defaults to \"all-MiniLM-L6-v2\".\n\nExample:\n    .. code-block:: python\n\n        from autogen_ext.memory.chromadb import SentenceTransformerEmbeddingFunctionConfig\n\n        _ = SentenceTransformerEmbeddingFunctionConfig(model_name=\"paraphrase-multilingual-mpnet-base-v2\")",
         "properties": {
            "function_type": {
               "const": "sentence_transformer",
               "default": "sentence_transformer",
               "title": "Function Type",
               "type": "string"
            },
            "model_name": {
               "default": "all-MiniLM-L6-v2",
               "description": "SentenceTransformer model name to use",
               "title": "Model Name",
               "type": "string"
            }
         },
         "title": "SentenceTransformerEmbeddingFunctionConfig",
         "type": "object"
      }
   }
}

Fields:
  • client_type (Literal['persistent', 'http'])

  • persistence_path (str)

field client_type: Literal['persistent', 'http'] = 'persistent'#
field persistence_path: str = './chroma_db'#

Path for persistent storage

pydantic model HttpChromaDBVectorMemoryConfig[source]#

Bases: ChromaDBVectorMemoryConfig

Configuration for HTTP ChromaDB memory.

Show JSON schema
{
   "title": "HttpChromaDBVectorMemoryConfig",
   "description": "Configuration for HTTP ChromaDB memory.",
   "type": "object",
   "properties": {
      "client_type": {
         "default": "http",
         "enum": [
            "persistent",
            "http"
         ],
         "title": "Client Type",
         "type": "string"
      },
      "collection_name": {
         "default": "memory_store",
         "description": "Name of the ChromaDB collection",
         "title": "Collection Name",
         "type": "string"
      },
      "distance_metric": {
         "default": "cosine",
         "description": "Distance metric for similarity search",
         "title": "Distance Metric",
         "type": "string"
      },
      "k": {
         "default": 3,
         "description": "Number of results to return in queries",
         "title": "K",
         "type": "integer"
      },
      "score_threshold": {
         "anyOf": [
            {
               "type": "number"
            },
            {
               "type": "null"
            }
         ],
         "default": null,
         "description": "Minimum similarity score threshold",
         "title": "Score Threshold"
      },
      "allow_reset": {
         "default": false,
         "description": "Whether to allow resetting the ChromaDB client",
         "title": "Allow Reset",
         "type": "boolean"
      },
      "tenant": {
         "default": "default_tenant",
         "description": "Tenant to use",
         "title": "Tenant",
         "type": "string"
      },
      "database": {
         "default": "default_database",
         "description": "Database to use",
         "title": "Database",
         "type": "string"
      },
      "embedding_function_config": {
         "description": "Configuration for the embedding function",
         "discriminator": {
            "mapping": {
               "default": "#/$defs/DefaultEmbeddingFunctionConfig",
               "openai": "#/$defs/OpenAIEmbeddingFunctionConfig",
               "sentence_transformer": "#/$defs/SentenceTransformerEmbeddingFunctionConfig"
            },
            "propertyName": "function_type"
         },
         "oneOf": [
            {
               "$ref": "#/$defs/DefaultEmbeddingFunctionConfig"
            },
            {
               "$ref": "#/$defs/SentenceTransformerEmbeddingFunctionConfig"
            },
            {
               "$ref": "#/$defs/OpenAIEmbeddingFunctionConfig"
            }
         ],
         "title": "Embedding Function Config"
      },
      "host": {
         "default": "localhost",
         "description": "Host of the remote server",
         "title": "Host",
         "type": "string"
      },
      "port": {
         "default": 8000,
         "description": "Port of the remote server",
         "title": "Port",
         "type": "integer"
      },
      "ssl": {
         "default": false,
         "description": "Whether to use HTTPS",
         "title": "Ssl",
         "type": "boolean"
      },
      "headers": {
         "anyOf": [
            {
               "additionalProperties": {
                  "type": "string"
               },
               "type": "object"
            },
            {
               "type": "null"
            }
         ],
         "default": null,
         "description": "Headers to send to the server",
         "title": "Headers"
      }
   },
   "$defs": {
      "DefaultEmbeddingFunctionConfig": {
         "description": "Configuration for the default ChromaDB embedding function.\n\nUses ChromaDB's default embedding function (Sentence Transformers all-MiniLM-L6-v2).\n\n.. versionadded:: v0.4.1\n   Support for custom embedding functions in ChromaDB memory.",
         "properties": {
            "function_type": {
               "const": "default",
               "default": "default",
               "title": "Function Type",
               "type": "string"
            }
         },
         "title": "DefaultEmbeddingFunctionConfig",
         "type": "object"
      },
      "OpenAIEmbeddingFunctionConfig": {
         "description": "Configuration for OpenAI embedding functions.\n\nUses OpenAI's embedding API for generating embeddings.\n\n.. versionadded:: v0.4.1\n   Support for custom embedding functions in ChromaDB memory.\n\nArgs:\n    api_key (str): OpenAI API key. If empty, will attempt to use environment variable.\n    model_name (str): OpenAI embedding model name. Defaults to \"text-embedding-ada-002\".\n\nExample:\n    .. code-block:: python\n\n        from autogen_ext.memory.chromadb import OpenAIEmbeddingFunctionConfig\n\n        _ = OpenAIEmbeddingFunctionConfig(api_key=\"sk-...\", model_name=\"text-embedding-3-small\")",
         "properties": {
            "function_type": {
               "const": "openai",
               "default": "openai",
               "title": "Function Type",
               "type": "string"
            },
            "api_key": {
               "default": "",
               "description": "OpenAI API key",
               "title": "Api Key",
               "type": "string"
            },
            "model_name": {
               "default": "text-embedding-ada-002",
               "description": "OpenAI embedding model name",
               "title": "Model Name",
               "type": "string"
            }
         },
         "title": "OpenAIEmbeddingFunctionConfig",
         "type": "object"
      },
      "SentenceTransformerEmbeddingFunctionConfig": {
         "description": "Configuration for SentenceTransformer embedding functions.\n\nAllows specifying a custom SentenceTransformer model for embeddings.\n\n.. versionadded:: v0.4.1\n   Support for custom embedding functions in ChromaDB memory.\n\nArgs:\n    model_name (str): Name of the SentenceTransformer model to use.\n        Defaults to \"all-MiniLM-L6-v2\".\n\nExample:\n    .. code-block:: python\n\n        from autogen_ext.memory.chromadb import SentenceTransformerEmbeddingFunctionConfig\n\n        _ = SentenceTransformerEmbeddingFunctionConfig(model_name=\"paraphrase-multilingual-mpnet-base-v2\")",
         "properties": {
            "function_type": {
               "const": "sentence_transformer",
               "default": "sentence_transformer",
               "title": "Function Type",
               "type": "string"
            },
            "model_name": {
               "default": "all-MiniLM-L6-v2",
               "description": "SentenceTransformer model name to use",
               "title": "Model Name",
               "type": "string"
            }
         },
         "title": "SentenceTransformerEmbeddingFunctionConfig",
         "type": "object"
      }
   }
}

Fields:
  • client_type (Literal['persistent', 'http'])

  • headers (Dict[str, str] | None)

  • host (str)

  • port (int)

  • ssl (bool)

field client_type: Literal['persistent', 'http'] = 'http'#
field host: str = 'localhost'#

Host of the remote server

field port: int = 8000#

Port of the remote server

field ssl: bool = False#

Whether to use HTTPS

field headers: Dict[str, str] | None = None#

Headers to send to the server

pydantic model DefaultEmbeddingFunctionConfig[source]#

Bases: BaseModel

Configuration for the default ChromaDB embedding function.

Uses ChromaDB's default embedding function (Sentence Transformers all-MiniLM-L6-v2).

Added in version v0.4.1: Support for custom embedding functions in ChromaDB memory.

Show JSON schema
{
   "title": "DefaultEmbeddingFunctionConfig",
   "description": "Configuration for the default ChromaDB embedding function.\n\nUses ChromaDB's default embedding function (Sentence Transformers all-MiniLM-L6-v2).\n\n.. versionadded:: v0.4.1\n   Support for custom embedding functions in ChromaDB memory.",
   "type": "object",
   "properties": {
      "function_type": {
         "const": "default",
         "default": "default",
         "title": "Function Type",
         "type": "string"
      }
   }
}

Fields:
  • function_type (Literal['default'])

field function_type: Literal['default'] = 'default'#
pydantic model SentenceTransformerEmbeddingFunctionConfig[source]#

Bases: BaseModel

Configuration for SentenceTransformer embedding functions.

Allows specifying a custom SentenceTransformer model for embeddings.

Added in version v0.4.1: Support for custom embedding functions in ChromaDB memory.

Parameters:

model_name (str) – Name of the SentenceTransformer model to use. Defaults to "all-MiniLM-L6-v2".

Example

from autogen_ext.memory.chromadb import SentenceTransformerEmbeddingFunctionConfig

_ = SentenceTransformerEmbeddingFunctionConfig(model_name="paraphrase-multilingual-mpnet-base-v2")

Show JSON schema
{
   "title": "SentenceTransformerEmbeddingFunctionConfig",
   "description": "Configuration for SentenceTransformer embedding functions.\n\nAllows specifying a custom SentenceTransformer model for embeddings.\n\n.. versionadded:: v0.4.1\n   Support for custom embedding functions in ChromaDB memory.\n\nArgs:\n    model_name (str): Name of the SentenceTransformer model to use.\n        Defaults to \"all-MiniLM-L6-v2\".\n\nExample:\n    .. code-block:: python\n\n        from autogen_ext.memory.chromadb import SentenceTransformerEmbeddingFunctionConfig\n\n        _ = SentenceTransformerEmbeddingFunctionConfig(model_name=\"paraphrase-multilingual-mpnet-base-v2\")",
   "type": "object",
   "properties": {
      "function_type": {
         "const": "sentence_transformer",
         "default": "sentence_transformer",
         "title": "Function Type",
         "type": "string"
      },
      "model_name": {
         "default": "all-MiniLM-L6-v2",
         "description": "SentenceTransformer model name to use",
         "title": "Model Name",
         "type": "string"
      }
   }
}

Fields:
  • function_type (Literal['sentence_transformer'])

  • model_name (str)

field function_type: Literal['sentence_transformer'] = 'sentence_transformer'#
field model_name: str = 'all-MiniLM-L6-v2'#

SentenceTransformer model name to use
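A sketch of wiring this configuration into the memory itself. The `collection_name` and `embedding_function_config` parameter names are assumed from the `PersistentChromaDBVectorMemoryConfig` referenced at the top of this page:

```python
# Sketch: use a multilingual SentenceTransformer model for the memory's
# embeddings, assuming the embedding_function_config parameter of
# PersistentChromaDBVectorMemoryConfig.
from autogen_ext.memory.chromadb import (
    ChromaDBVectorMemory,
    PersistentChromaDBVectorMemoryConfig,
    SentenceTransformerEmbeddingFunctionConfig,
)

memory = ChromaDBVectorMemory(
    config=PersistentChromaDBVectorMemoryConfig(
        collection_name="multilingual_memory",
        embedding_function_config=SentenceTransformerEmbeddingFunctionConfig(
            model_name="paraphrase-multilingual-mpnet-base-v2"
        ),
    )
)
```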

pydantic model OpenAIEmbeddingFunctionConfig[source]#

Bases: BaseModel

Configuration for OpenAI embedding functions.

Uses OpenAI's embedding API to generate embeddings.

Added in version v0.4.1: Support for custom embedding functions in ChromaDB memory.

Parameters:
  • api_key (str) – OpenAI API key. If empty, will attempt to use the environment variable.

  • model_name (str) – Name of the OpenAI embedding model to use. Defaults to "text-embedding-ada-002".

Example

from autogen_ext.memory.chromadb import OpenAIEmbeddingFunctionConfig

_ = OpenAIEmbeddingFunctionConfig(api_key="sk-...", model_name="text-embedding-3-small")

Show JSON schema
{
   "title": "OpenAIEmbeddingFunctionConfig",
   "description": "Configuration for OpenAI embedding functions.\n\nUses OpenAI's embedding API for generating embeddings.\n\n.. versionadded:: v0.4.1\n   Support for custom embedding functions in ChromaDB memory.\n\nArgs:\n    api_key (str): OpenAI API key. If empty, will attempt to use environment variable.\n    model_name (str): OpenAI embedding model name. Defaults to \"text-embedding-ada-002\".\n\nExample:\n    .. code-block:: python\n\n        from autogen_ext.memory.chromadb import OpenAIEmbeddingFunctionConfig\n\n        _ = OpenAIEmbeddingFunctionConfig(api_key=\"sk-...\", model_name=\"text-embedding-3-small\")",
   "type": "object",
   "properties": {
      "function_type": {
         "const": "openai",
         "default": "openai",
         "title": "Function Type",
         "type": "string"
      },
      "api_key": {
         "default": "",
         "description": "OpenAI API key",
         "title": "Api Key",
         "type": "string"
      },
      "model_name": {
         "default": "text-embedding-ada-002",
         "description": "OpenAI embedding model name",
         "title": "Model Name",
         "type": "string"
      }
   }
}

Fields:
  • api_key (str)

  • function_type (Literal['openai'])

  • model_name (str)

field function_type: Literal['openai'] = 'openai'#
field api_key: str = ''#

OpenAI API key

field model_name: str = 'text-embedding-ada-002'#

OpenAI embedding model name
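Since `api_key` defaults to the empty string and the documentation above states that an empty key falls back to the environment, a hedged sketch of reading the key from `OPENAI_API_KEY` explicitly (the variable name is a common convention, not stated on this page):

```python
# Sketch: read the API key from the environment rather than hard-coding it.
# An empty api_key defers to the environment per the field docs above.
import os

from autogen_ext.memory.chromadb import OpenAIEmbeddingFunctionConfig

_ = OpenAIEmbeddingFunctionConfig(
    api_key=os.environ.get("OPENAI_API_KEY", ""),
    model_name="text-embedding-3-small",
)
```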

pydantic model CustomEmbeddingFunctionConfig[source]#

Bases: BaseModel

Configuration for custom embedding functions.

Allows using a custom function that returns a ChromaDB-compatible embedding function.

Added in version v0.4.1: Support for custom embedding functions in ChromaDB memory.

Warning

Configurations containing custom functions are not serializable.

Parameters:
  • function (Callable) – Function that returns a ChromaDB-compatible embedding function.

  • params (Dict[str, Any]) – Parameters to pass to the function.

Show JSON schema
{
   "title": "CustomEmbeddingFunctionConfig",
   "type": "object",
   "properties": {
      "function_type": {
         "const": "custom",
         "default": "custom",
         "title": "Function Type",
         "type": "string"
      },
      "function": {
         "default": null,
         "title": "Function"
      },
      "params": {
         "description": "Parameters to pass to the function",
         "title": "Params",
         "type": "object"
      }
   }
}

Fields:
  • function (Callable[[...], Any])

  • function_type (Literal['custom'])

  • params (Dict[str, Any])

field function_type: Literal['custom'] = 'custom'#
field function: Callable[[...], Any] [Required]#

Function that returns an embedding function

field params: Dict[str, Any] [Optional]#

Parameters to pass to the function
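Unlike the other configuration models on this page, no example is given for the custom variant. A hedged sketch, based on the `function` and `params` fields documented above; the factory shape (a callable that returns the embedding function, receiving `params` as keyword arguments) is an assumption, while `SentenceTransformerEmbeddingFunction` is a real class in the `chromadb` package:

```python
# Sketch: a factory that builds a ChromaDB-compatible embedding function,
# wired in via CustomEmbeddingFunctionConfig. Note such configs are not
# serializable (see the warning above).
from autogen_ext.memory.chromadb import CustomEmbeddingFunctionConfig


def create_embedding_function(model_name: str = "all-MiniLM-L6-v2"):
    # Deferred import keeps the dependency local to the factory.
    from chromadb.utils import embedding_functions

    return embedding_functions.SentenceTransformerEmbeddingFunction(model_name=model_name)


_ = CustomEmbeddingFunctionConfig(
    function=create_embedding_function,
    params={"model_name": "all-MiniLM-L6-v2"},
)
```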