From e7b43602c56becbe709d4d31da4f7e1d7636ac42 Mon Sep 17 00:00:00 2001
From: Manik Aggarwal
Date: Tue, 14 Oct 2025 17:07:04 +0530
Subject: [PATCH] Update README.md

Added Self-Hosting section in the readme
---
 README.md | 48 +++++++++++++++++++++++++++++++++++++++---------
 1 file changed, 39 insertions(+), 9 deletions(-)

diff --git a/README.md b/README.md
index 5b86a1f..daef956 100644
--- a/README.md
+++ b/README.md
@@ -63,18 +63,47 @@ Developers waste time re-explaining context to AI tools. Hit token limits in Cla
 CORE is an open-source unified, persistent memory layer for all your AI tools. Your context follows you from Cursor to Claude to ChatGPT to Claude Code. One knowledge graph remembers who said what, when, and why. Connect once, remember everywhere. Stop managing context and start building.
 
-## 🚀 Get Started
+## 🚀 CORE Self-Hosting
+Want to run CORE on your own infrastructure? Self-hosting gives you complete control over your data and deployment.
+
+**Prerequisites:**
+
+- Docker (20.10.0+) and Docker Compose (2.20.0+) installed
+- An OpenAI API key
+
+> **Note on open-source models:** We tested OSS options such as Ollama and GPT-OSS, but their fact extraction and graph quality fell short, so an OpenAI key is required for now.
+
+### Setup
+
+1. Clone the repository:
+```
+git clone https://github.com/RedPlanetHQ/core.git
+cd core
+```
+2. Configure environment variables in `core/.env`:
+```
+OPENAI_API_KEY=your_openai_api_key
+```
+3. Start the service:
+```
+docker-compose up -d
+```
+
+Once deployed, you can configure your AI providers (OpenAI, Anthropic) and start building your memory graph.
+
+👉 [View the complete self-hosting guide](https://docs.heysol.ai/self-hosting/docker)
+
+Note: We are still working out how to get good fact generation from open-source models (Ollama, GPT-OSS); once that improves, we will also support OSS models.
+
+## 🚀 CORE Cloud
 
 **Build your unified memory graph in 5 minutes:**
 
+Don't want to manage infrastructure? CORE Cloud lets you build your personal memory system instantly - no setup, no servers, just memory that works.
+
 1. **Sign Up** at [core.heysol.ai](https://core.heysol.ai) and create your account
-2. **Add your first memory** - share context about yourself
-
-   (image: first-memory)
-
-
-3. **Visualize your memory graph** and see how CORE automatically forms connections between facts
-5. **Test it out** - ask "What do you know about me?" in conversatio section
-6. Connect to your tools:
+2. **Visualize your memory graph** and see how CORE automatically forms connections between facts
+3. **Test it out** - ask "What do you know about me?" in the conversation section
+4. Connect to your tools:
    - [Claude](https://docs.heysol.ai/providers/claude) & [Cursor](https://docs.heysol.ai/providers/cursor) - coding with context
    - [Claude Code CLI](https://docs.heysol.ai/providers/claude-code) & [Codex CLI](https://docs.heysol.ai/providers/codex) - terminal-based coding with memory
    - [Add Browser Extension](https://docs.heysol.ai/providers/browser-extension) - bring your memory to any website
@@ -227,3 +256,4 @@ Have questions or feedback? We're here to help:
+
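
A minimal sketch of how the step-3 deployment from the patch above can be checked, using only standard Docker Compose commands; the compose service names are not specified in the patch, so the log command follows all containers rather than assuming a particular name:

```
# Run from the cloned core/ directory after `docker-compose up -d`.
docker-compose ps        # each container should report an Up/running state
docker-compose logs -f   # follow logs from all services to confirm a clean start
docker-compose down      # stop the stack when finished
```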