From 5970cc0244b11e658a44500d4aae162648e80b44 Mon Sep 17 00:00:00 2001
From: Manik Aggarwal
Date: Wed, 23 Jul 2025 23:06:59 +0530
Subject: [PATCH] Update README.md

Added Ask deepwiki badge in readme

---
 README.md | 345 +++++++++++++++++++++++++++---------------------------
 1 file changed, 174 insertions(+), 171 deletions(-)
CORE logo

### C.O.R.E: Your digital brain for the AI era

DeepWiki Badge

[Documentation](https://docs.heysol.ai/core/overview) • [Discord](https://discord.gg/YGUZcvDjUa)
## 🧠 C.O.R.E.

**Contextual Observation & Recall Engine**

C.O.R.E is a portable memory graph built from your LLM interactions and personal data, making all your context and workflow history accessible to any AI tool, just like a digital brain. This eliminates the need for repeated context sharing. The aim is to provide:

- **Unified, Portable Memory**: Add and recall context seamlessly, and connect your memory across apps like Claude, Cursor, Windsurf and more.
- **Relational, Not Just Flat Facts**: CORE organizes your knowledge, storing both facts and relationships for a deeper, richer memory, like a real brain.
- **User Owned**: You decide what to keep, update or delete, and you can share your memory across the tools you choose, free from vendor lock-in.

## 🎥 Demo Video

[Check C.O.R.E Demo](https://youtu.be/iANZ32dnK60)

Core dashboard

## 🧩 Key Features

- **Memory Graph**: Visualise how your facts and preferences link together
- **Chat with Memory**: Ask questions about your memory for instant insights and understanding
- **Plug n Play**: Instantly use CORE memory in apps like Cursor and Claude

## ☁️ C.O.R.E Cloud Setup

1. Sign up at [Core Cloud](https://core.heysol.ai) and start building your memory graph.
2. Add the text you want to save in memory. After you click the `+ Add` button, your memory graph will be generated.
3. [Connect Core Memory MCP with Cursor](#connecting-core-mcp-with-cursor)

## 💻 C.O.R.E Local Setup

#### Prerequisites

1. Docker
2. OpenAI API Key

> **Note:** We are actively working on improving support for Llama models. At the moment, C.O.R.E does not produce optimal results with Llama-based models, but we are making progress toward better compatibility and output in the near future.

#### Run C.O.R.E locally

1. **Copy Environment Variables**

   Copy the example environment file to `.env`:

   ```bash
   cp .env.example .env
   ```

2. **Start the Application**

   Use Docker Compose to start all required services:

   ```bash
   docker-compose up
   ```

3. **Access the App**

   Once the containers are running, open your browser and go to [http://localhost:3000](http://localhost:3000).

4. **Create Account with Magic Link**

   - To create an account, click the `Continue with email` button.

     Create account

   - Enter your email and click the `Send a Magic Link` button.

     Enter email

   - Copy the magic link from the terminal logs and open it in your browser (see the sketch after this list).

     Magic link

5. **Create Your Private Space & Add Data**

   - In the dashboard, go to the top-right section, type a message, e.g., `I love playing badminton`, and click `+ Add`.
   - Your memory is queued for processing; you can monitor its status in the `Logs` section.

     Core memory logs

   - Once processing is complete, nodes are added to your private knowledge graph and become visible in the dashboard.
   - You can later choose to connect this memory to other tools or keep it private.

6. **Search Your Memory**

   - Use the dashboard's search feature to query your ingested data within your private space.
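If you prefer to run the stack in the background rather than the foreground `docker-compose up` from step 2, the magic link from step 4 still appears in the container logs. Here is a minimal sketch using standard Docker Compose commands; the `grep` pattern is only an assumption about how the link is worded in the logs:

```bash
# Start all services detached instead of the foreground run in step 2
docker-compose up -d

# Follow the combined service logs and watch for the sign-in URL;
# the "magic" filter is an assumption about the log wording — drop it to see everything
docker-compose logs -f | grep -i "magic"
```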
## Connecting CORE MCP with Cursor

1. Open the CORE dashboard and navigate to the API section to generate a new API token.
2. In Cursor, go to: Settings → Tools & Integrations → New MCP Server.
3. Add the CORE MCP server using the configuration format below. Be sure to replace the `API_TOKEN` value with the token you generated in step 1.

   MCP configuration to add in Cursor:

   ```json
   {
     "mcpServers": {
       "memory": {
         "command": "npx",
         "args": ["-y", "@redplanethq/core-mcp"],
         "env": {
           "API_TOKEN": "YOUR_API_TOKEN_HERE",
           "API_BASE_URL": "https://core.heysol.ai",
           "SOURCE": "cursor"
         }
       }
     }
   }
   ```

4. Go to Settings → User Rules → New Rule and add the rule below to ensure all your chat interactions are stored in CORE memory:

   ```
   After every interaction, update the memory with the user's query and the assistant's
   response to core-memory mcp. sessionId should be the uuid of the conversation
   ```

## Documentation

Explore our documentation to get the most out of CORE:

- [Basic Concepts](https://docs.heysol.ai/core/overview)
- [API Reference](https://docs.heysol.ai/core/local-setup)
- [Connect Core Memory MCP with Cursor](#connecting-core-mcp-with-cursor)

## 🧑‍💻 Support

Have questions or feedback? We're here to help:

- Discord: [Join the core-support channel](https://discord.gg/YGUZcvDjUa)
- Documentation: [docs.heysol.ai/core](https://docs.heysol.ai/core/overview)
- Email: manik@poozle.dev

## Usage Guidelines

**Store:**

- Conversation history
- User preferences
- Task context
- Reference materials

**Don't Store:**

- Sensitive data (PII)
- Credentials
- System logs
- Temporary data

## 👥 Contributors