New Features & Updates:
- OpenMemory:
- Added memory export / import feature
- Added vector store integrations: Weaviate, FAISS, PGVector, Chroma, Redis, Elasticsearch, Milvus
- Added export_openmemory.sh migration script
- Vector Stores:
- Added Amazon S3 Vectors support
- Added Databricks Mosaic AI vector store support
- Added support for OpenAI Store
- Graph Memory: Added support for graph memory using Kuzu
- Azure: Added Azure Identity for Azure OpenAI and Azure AI Search authentication
- Elasticsearch: Added headers configuration support
- Added custom connection client to enable connecting to local containers for Weaviate
- Updated AWS Bedrock configuration
- Fixed dependency issues and tests; updated docstrings
- Documentation:
- Fixed Graph Docs page missing in sidebar
- Updated integration documentation
- Added version param in Search V2 API documentation
- Updated Databricks documentation and refactored docs
- Updated favicon logo
- Fixed typos and TypeScript docs
- Baidu: Added missing provider for Baidu vector DB
- MongoDB: Replaced query_vector args in search method
- Fixed new memory being mistaken for current memory
- AsyncMemory._add_to_vector_store: handled edge case when no facts found
- Fixed missing commas in Kuzu graph INSERT queries
- Fixed inconsistent created and updated properties for Graph
- Fixed missing app_id on client for Neptune Analytics
- Correctly pick AWS region from environment variable
- Fixed Ollama model existence check
- PGVector: Use internal connection pools and context managers
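As a companion to the Kuzu graph-memory support listed above, here is a minimal sketch of how it might be enabled through Mem0's standard config dict. The provider string and the db path key are assumptions based on the feature name, not taken from the docs.

```python
from mem0 import Memory

# Hypothetical config enabling the Kuzu-backed graph memory added above.
# The provider name "kuzu" and the "db" key are assumptions; check the
# graph-memory docs for the exact schema.
config = {
    "graph_store": {
        "provider": "kuzu",
        "config": {"db": "/tmp/mem0-kuzu"},
    },
}

m = Memory.from_config(config)
m.add(
    [{"role": "user", "content": "Alice works with Bob on the search project."}],
    user_id="alice",
)
print(m.search("Who does Alice work with?", user_id="alice"))
```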
New Features & Updates:
- Pinecone: Added namespace support and improved type safety
- Milvus: Added db_name field to MilvusDBConfig
- Vector Stores: Added multi-id filters support
- Vercel AI SDK: Migration to AI SDK V5.0
- Python Support: Added Python 3.12 support
- Graph Memory: Added sanitizer methods for nodes and relationships
- LLM Monitoring: Added monitoring callback support
- Performance:
- Improved async handling in AsyncMemory class
- Documentation:
- Added async add announcement
- Added personalized search docs
- Added Neptune examples
- Added V5 migration docs
- Configuration:
- Refactored base class config for LLMs
- Added sslmode for pgvector
- Dependencies:
- Updated psycopg to version 3
- Updated Docker compose
- Tests:
- Fixed failing tests
- Restricted package versions
- Memgraph:
- Fixed async attribute errors
- Fixed n_embeddings usage
- Fixed indexing issues
- Vector Stores:
- Fixed Qdrant cloud indexing
- Fixed Neo4j Cypher syntax
- Fixed LLM parameters
- Graph Store:
- Fixed LLM config prioritization
- Dependencies:
- Fixed JSON import for psycopg
- Google AI: Refactored from Gemini to Google AI
- Base Classes: Refactored LLM base class configuration
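A short sketch of how the Pinecone namespace option and the multi-id filters listed above could be used together; the namespace key and the filters shape are assumptions inferred from the feature titles, not documented signatures.

```python
from mem0 import Memory

# Assumed shape for the Pinecone namespace option; API key, embedding
# dimensions and other required Pinecone settings are omitted for brevity.
config = {
    "vector_store": {
        "provider": "pinecone",
        "config": {
            "collection_name": "mem0",
            "namespace": "team-alpha",  # namespace support (key name assumed)
        },
    },
}

m = Memory.from_config(config)

# Multi-id filters: querying across several users at once is the new
# capability; the list-valued filter syntax here is illustrative only, and
# depending on the SDK version a session id may also be required.
results = m.search(
    "open action items",
    filters={"user_id": ["alice", "bob"]},
)
```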
New Features & Updates:
- Enhanced project management via client.project and AsyncMemoryClient.project interfaces (see the sketch below)
- Full support for project CRUD operations (create, read, update, delete)
- Project member management: add, update, remove, and list members
- Manage project settings including custom instructions, categories, retrieval criteria, and graph enablement
- Both sync and async support for all project management operations
- Documentation:
- Added detailed API reference and usage examples for new project management methods.
- Updated all docs to use client.project.get() and client.project.update() instead of deprecated methods
- Deprecation:
- Marked the existing get_project() and update_project() methods as deprecated; added warnings to guide users to the new API
- Tests:
- Fixed Gemini embedder and LLM test mocks for correct error handling and argument structure.
- vLLM:
- Fixed duplicate import in vLLM module.
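A hedged sketch of the new project interface described above. The client.project.get() and client.project.update() calls are the ones the updated docs reference; the settings keyword and the member-management method names are assumptions based on the changelog wording.

```python
from mem0 import MemoryClient

client = MemoryClient(api_key="your-api-key")

# Read current project settings via the new interface.
settings = client.project.get()

# Update project-level settings such as custom instructions
# (keyword name assumed from the feature list above).
client.project.update(custom_instructions="Only store durable user preferences.")

# Member management is also part of the interface; the method names below
# are assumptions, shown only to indicate the shape of the API.
# client.project.get_members()
# client.project.add_members(...)
```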
New Features:
- OpenAI Agents: Added OpenAI agents SDK support
- Amazon Neptune: Added Amazon Neptune Analytics graph_store configuration and integration
- vLLM: Added vLLM support
- Documentation:
- Added SOC2 and HIPAA compliance documentation
- Enhanced group chat feature documentation for platform
- Added Google AI ADK Integration documentation
- Fixed documentation images and links
- Setup: Fixed Mem0 setup, logging, and documentation issues
- MongoDB: Fixed MongoDB Vector Store misaligned strings and classes
- vLLM: Fixed missing OpenAI import in vLLM module and call errors
- Dependencies: Fixed CI issues related to missing dependencies
- Installation: Reverted pip install changes
Bug Fixes:
- Gemini: Fixed Gemini embedder configuration
New Features:
- Memory: Added immutable parameter to add method
- OpenMemory: Added async_mode parameter support
- Documentation:
- Enhanced platform feature documentation
- Fixed documentation links
- Added async_mode documentation
- MongoDB: Fixed MongoDB configuration name
- Bedrock: Fixed Bedrock LLM, embeddings, tools, and temporary credentials
- Memory: Fixed memory categorization by updating dependencies and correcting API usage
- Gemini: Fixed Gemini Embeddings and LLM issues
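A minimal sketch of the immutable parameter introduced above, assuming it surfaces as a keyword argument on the client's add() call; OpenMemory's async_mode option from the same release is referenced only in a comment since its placement is not confirmed here.

```python
from mem0 import MemoryClient

client = MemoryClient(api_key="your-api-key")

messages = [{"role": "user", "content": "My passport number never changes."}]

# immutable marks the stored memory as non-updatable; passing it as an
# add() keyword is an assumption based on the entry above.
client.add(messages, user_id="alice", immutable=True)

# OpenMemory's new async_mode option is analogous, e.g. (name/placement assumed):
# client.add(messages, user_id="alice", async_mode=True)
```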
New Features:
- OpenMemory:
- Added OpenMemory augment support
- Added OpenMemory Local Support using new library
- vLLM: Added vLLM support integration
- Documentation:
- Added MCP Client Integration Guide and updated installation commands
- Improved Agent Id documentation for Mem0 OSS Graph Memory
- Core: Added JSON parsing to solve hallucination errors
- Gemini: Fixed Gemini Embeddings migration
New Features:
- Baidu: Added Baidu vector database integration
- Documentation:
- Updated changelog
- Fixed example in quickstart page
- Updated client.update() method documentation in OpenAPI specification
- OpenSearch: Updated logger warning
- CI: Fixed failing CI pipeline
New Features:
- AgentOps: Added AgentOps integration
- LM Studio: Added response_format parameter for LM Studio configuration
- Examples: Added Memory agent powered by voice (Cartesia + Agno)
- AI SDK: Added output_format parameter
- Client: Enhanced update method to support metadata
- Google: Added Google Genai library support
- Build: Fixed Build CI failure
- Pinecone: Fixed pinecone for async memory
New Features:
- MongoDB: Added MongoDB Vector Store support
- Client: Added client support for summary functionality
- Pinecone: Fixed pinecone version issues
- OpenSearch: Added logger support
- Testing: Added python version test environments
Improvements:
- Documentation:
- Updated Livekit documentation migration
- Updated OpenMemory hosted version documentation
- Core: Updated categorization flow
- Storage: Fixed migration issues
New Features:
- Cloudflare: Added Cloudflare vector store support
- Search: Added threshold parameter to search functionality
- API: Added wildcard character support for v2 Memory APIs
- Documentation: Updated README docs for OpenMemory environment setup
- Core: Added support for unique user IDs
- Core: Fixed error handling exceptions
Bug Fixes:
- Vector Stores: Fixed GET_ALL functionality for FAISS and OpenSearch
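A brief sketch of the new search threshold and the v2 wildcard filters mentioned above, assuming threshold is a 0-1 relevance cutoff passed as a keyword and that the wildcard is the "*" character; both details are inferred, not documented here.

```python
from mem0 import MemoryClient

client = MemoryClient(api_key="your-api-key")

# threshold drops low-relevance hits; the 0-1 scale and keyword placement
# are assumptions based on the feature description.
hits = client.search("favorite cuisine", user_id="alice", threshold=0.7)

# Wildcard support in the v2 Memory APIs; using "*" to match any run_id is
# an assumption about the wildcard character and filter shape.
all_runs = client.get_all(version="v2", filters={"user_id": "alice", "run_id": "*"})
```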
New Features:
- LLM: Added support for OpenAI compatible LLM providers with baseUrl configuration
- Documentation:
- Fixed broken links
- Improved Graph Memory features documentation clarity
- Updated enable_graph documentation
- TypeScript SDK: Updated Google SDK peer dependency version
- Client: Added async mode parameter
New Features:
- Examples: Added Neo4j example
- AI SDK: Added Google provider support
- OpenMemory: Added LLM and Embedding Providers support
- Documentation:
- Updated memory export documentation
- Enhanced role-based memory attribution rules documentation
- Updated API reference and messages documentation
- Added Mastra and Raycast documentation
- Added NOT filter documentation for Search and GetAll V2
- Announced Claude 4 support
- Core:
- Removed support for passing string as input in client.add()
- Added support for sarvam-m model
- TypeScript SDK: Fixed types from message interface
- Memory: Prevented saving prompt artifacts as memory when no new facts are present
- OpenMemory: Fixed typos in MCP tool description
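Since client.add() no longer accepts a raw string (noted above), inputs need to be passed as role/content messages; a minimal before/after sketch:

```python
from mem0 import MemoryClient

client = MemoryClient(api_key="your-api-key")

# No longer supported: client.add("I'm allergic to peanuts.", user_id="alice")

# Pass a list of role/content messages instead.
messages = [
    {"role": "user", "content": "I'm allergic to peanuts."},
    {"role": "assistant", "content": "Noted - I'll avoid peanut-based recipes."},
]
client.add(messages, user_id="alice")
```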
New Features:
- Neo4j: Added base label configuration support
- Documentation:
- Updated Healthcare example index
- Enhanced collaborative task agent documentation clarity
- Added criteria-based filtering documentation
- OpenMemory: Added cURL command for easy installation
- Build: Migrated to Hatch build system
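A hedged sketch of the Neo4j base-label option added above, placed in the usual graph_store config; the base_label key name and its boolean form are assumptions taken from the feature title.

```python
from mem0 import Memory

# Hypothetical graph_store config using the new base-label option for Neo4j.
config = {
    "graph_store": {
        "provider": "neo4j",
        "config": {
            "url": "bolt://localhost:7687",
            "username": "neo4j",
            "password": "password",
            "base_label": True,  # key name assumed from the changelog entry
        },
    },
}

m = Memory.from_config(config)
```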
New Features:
- Memory: Added Group Chat Memory Feature support
- Examples: Added Healthcare assistant using Mem0 and Google ADK
- SSE: Fixed SSE connection issues
- MCP: Fixed memories not appearing in MCP clients added from Dashboard
New Features:
- OpenMemory: Added OpenMemory support
- Neo4j: Added weights to Neo4j model
- AWS: Added support for OpenSearch Serverless
- Examples: Added ElizaOS Example
- Documentation: Updated Azure AI documentation
- AI SDK: Added missing parameters and updated demo application
- OSS: Fixed AOSS and AWS Bedrock LLM
New Features:
- Neo4j: Added support for Neo4j database
- AWS: Added support for AWS Bedrock Embeddings
- Client: Updated delete_users() to use V2 API endpoints
- Documentation: Updated timestamp and dual-identity memory management docs
- Neo4j: Improved Neo4j queries and removed warnings
- AI SDK: Added support for graceful failure when services are down
- Fixed AI SDK filters
- Fixed wrong type for new memories
- Fixed duplicated metadata issue while adding/updating memories
New Features:
- HuggingFace: Added support for HF Inference
- Fixed proxy for Mem0
New Features:
- Vercel AI SDK: Added Graph Memory support
- Documentation: Fixed timestamp and README links
- Client: Updated TS client to use proper types for deleteUsers
- Dependencies: Removed unnecessary dependencies from base package
Improvements:
- Client: Fixed ping method to use default org_id and project_id
- Documentation: Updated documentation
- Fixed mem0-migrations issue
New Features:
- Integrations: Added Memgraph integration
- Memory: Added timestamp support
- Vector Stores: Added reset function for VectorDBs
- Documentation:
- Updated timestamp and expiration_date documentation
- Fixed v2 search documentation
- Added “memory” in EC “Custom config” section
- Fixed typos in the json config sample
Improvements:
- Vector Stores: Initialized embedding_model_dims in all vectordbs
- Documentation: Fixed agno link
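A minimal sketch of the timestamp support and the vector-store reset listed above, assuming timestamp is a Unix epoch passed as an add() keyword; reset() clearing the configured store is inferred from the entry.

```python
from mem0 import Memory

m = Memory()

# Backdate a memory with the new timestamp support; treating it as an
# add() keyword taking a Unix epoch is an assumption.
m.add(
    [{"role": "user", "content": "Moved to Berlin."}],
    user_id="alice",
    timestamp=1714521600,
)

# The new reset function clears the stored memories and the underlying
# vector collection.
m.reset()
```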
New Features:
- Memory: Added Memory Reset functionality
- Client: Added support for Custom Instructions
- Examples: Added Fitness Checker powered by memory
- Core: Updated capture_event
- Documentation: Fixed curl for v2 get_all
- Vector Store: Fixed user_id functionality
- Client: Various client improvements
New Features:
- LLM Integrations: Added Azure OpenAI Embedding Model
- Examples:
- Added movie recommendation using grok3
- Added Voice Assistant using Elevenlabs
- Documentation:
- Added Keywords AI documentation
- Reformatted navbar page URLs
- Updated changelog
- Updated openai.mdx
- FAISS: Silenced FAISS info logs
New Features:
- LLM Integrations: Added Mistral AI as LLM provider
- Documentation:
- Updated changelog
- Fixed memory exclusion example
- Updated xAI documentation
- Updated YouTube Chrome extension example documentation
- Core: Fixed EmbedderFactory.create() in GraphMemory
- Azure OpenAI: Added patch to fix Azure OpenAI
- Telemetry: Fixed telemetry issue
New Features:
- Langchain Integration: Added support for Langchain VectorStores
- Examples:
- Added personal assistant example
- Added personal study buddy example
- Added YouTube assistant Chrome extension example
- Added agno example
- Updated OpenAI Responses API examples
- Vector Store: Added capability to store user_id in vector database
- Async Memory: Added async support for OSS
- Documentation: Updated formatting and examples
New Features:
- Upstash Vector: Added support for Upstash Vector store
- Code Quality: Removed redundant code lines
- Build: Updated MAKEFILE
- Documentation: Updated memory export documentation
Improvements:
- FAISS: Added embedding_dims parameter to FAISS vector store
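A small sketch of the FAISS embedding_dims parameter mentioned above inside a vector_store config; whether the key is spelled embedding_dims or embedding_model_dims, and whether a path is required, are assumptions.

```python
from mem0 import Memory

# FAISS vector store with an explicit embedding dimension; key names are
# assumptions based on the entry above.
config = {
    "vector_store": {
        "provider": "faiss",
        "config": {
            "path": "/tmp/mem0-faiss",
            "embedding_model_dims": 1536,
        },
    },
}

m = Memory.from_config(config)
```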
New Features:
- Langchain Embedder: Added Langchain embedder integration
- Langchain LLM: Updated Langchain LLM integration to directly pass the Langchain LLM object
Bug Fixes:
- Langchain LLM: Fixed issues with Langchain LLM integration
New Features:
- LLM Integrations: Added support for Langchain LLMs, Google as new LLM and embedder
- Development: Added development docker compose
- Output Format: Set output_format='v1.1' and updated documentation
- Integrations: Added LMStudio and Together.ai documentation
- API Reference: Updated output_format documentation
- Integrations: Added PipeCat integration documentation
- Integrations: Added Flowise integration documentation for Mem0 memory setup
- Tests: Fixed failing unit tests
New Features:
- FAISS Support: Added FAISS vector store support
New Features:
- Livekit Integration: Added Mem0 livekit example
- Evaluation: Added evaluation framework and tools
- Multimodal: Updated multimodal documentation
- Examples: Added examples for email processing
- API Reference: Updated API reference section
- Elevenlabs: Added Elevenlabs integration example
Bug Fixes:
- OpenAI Environment Variables: Fixed issues with OpenAI environment variables
- Deployment Errors: Added package.json file to fix deployment errors
- Tools: Fixed tools issues and improved formatting
- Docs: Updated API reference section for expiration date
New Features:
- Supabase Vector Store: Added support for Supabase Vector Store
- Supabase History DB: Added Supabase History DB to run Mem0 OSS on Serverless
- Feedback Method: Added feedback method to client
- Azure OpenAI: Fixed issues with Azure OpenAI
- Azure AI Search: Fixed test cases for Azure AI Search
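To close, a hedged sketch of the Supabase vector store and history DB support added above for running Mem0 OSS serverlessly; the provider name and connection/key names are assumptions, and the client's new feedback method is only referenced in a comment since its signature is not given here.

```python
from mem0 import Memory

# Hypothetical config pairing the new Supabase vector store with a
# Supabase-backed history store; key names are assumptions.
config = {
    "vector_store": {
        "provider": "supabase",
        "config": {
            "connection_string": "postgresql://user:pass@host:5432/postgres",
            "collection_name": "mem0",
        },
    },
}

m = Memory.from_config(config)

# The platform client also gained a feedback method in this release, e.g.
# client.feedback(...)  # exact signature not shown in this changelog
```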