Fix: add note regarding llama

This commit is contained in:
Harshith Mullapudi 2025-06-30 23:35:58 +05:30
parent 934eb7f3e2
commit 13d0c1462e
3 changed files with 25 additions and 24 deletions


@@ -4,8 +4,6 @@
</a>
</div>
## C.O.R.E.
**Contextual Observation & Recall Engine**
@@ -17,9 +15,11 @@ C.O.R.E is built for two reasons:
1. To give you complete ownership of your memory, stored locally and accessible across any app needing LLM context.
2. To help SOL (your AI assistant) access your context, facts, and preferences for more relevant and personalized responses.
> **Note:** C.O.R.E does not yet produce optimal results with Llama-based models. We are actively working on improving compatibility and output quality for them.
## Demo Video
[Check C.O.R.E Demo](https://youtu.be/iANZ32dnK60)
## How is C.O.R.E different from other Memory Providers?
@@ -37,7 +37,6 @@ With C.O.R.E, you see exactly what prices changed, who approved them, the contex
Or ask: “What does Mike know about Project Phoenix?” and get a timeline of meetings, decisions, and facts Mike was involved in, with full traceability to those specific events.
## C.O.R.E Cloud Setup
1. Sign up to [Core Cloud](https://core.heysol.ai) and start building your memory graph.
@@ -97,23 +96,24 @@ Or ask: “What does Mike know about Project Phoenix?” and get a timeline of m
2. In Cursor, go to: Settings → Tools & Integrations → New MCP Server.
3. Add the CORE MCP server using the configuration format below. Be sure to replace the API_TOKEN value with the token you generated in step 1.
MCP configuration to add in Cursor
```json
{
"mcpServers": {
"memory": {
"command": "npx",
"args": ["-y", "@redplanethq/core-mcp"],
"env": {
"API_TOKEN": "YOUR_API_TOKEN_HERE",
"API_BASE_URL": "https://core.heysol.ai",
"SOURCE": "cursor"
}
}
}
}
```
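As a sanity check, the configuration above can be modeled in TypeScript. The interface names here are illustrative only, not part of any Cursor or CORE API:

```typescript
// Illustrative types mirroring the shape of the MCP config shown above.
interface McpServerEntry {
  command: string;
  args: string[];
  env: Record<string, string>;
}

interface McpConfig {
  mcpServers: Record<string, McpServerEntry>;
}

const config: McpConfig = {
  mcpServers: {
    memory: {
      command: "npx",
      args: ["-y", "@redplanethq/core-mcp"],
      env: {
        API_TOKEN: "YOUR_API_TOKEN_HERE", // replace with the token from step 1
        API_BASE_URL: "https://core.heysol.ai",
        SOURCE: "cursor",
      },
    },
  },
};
```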
``` json
{
"mcpServers": {
"memory": {
"command": "npx",
"args": ["-y", "@redplanethq/core-mcp"],
"env": {
"API_TOKEN": "YOUR_API_TOKEN_HERE",
"API_BASE_URL": "https://core.heysol.ai",
"SOURCE": "cursor"
}
}
}
}
```
4. Go to Settings → User rules → New Rule and add the rule below to ensure all your chat interactions are stored in CORE memory.


@@ -1,6 +1,6 @@
{
"name": "@redplanethq/core-mcp",
"version": "0.1.3",
"version": "0.1.5",
"type": "module",
"main": "dist/index.js",
"bin": {


@@ -14,6 +14,7 @@ export async function ingestKnowledgeGraph(args: IngestKG) {
const response = await axiosInstance.post(`/ingest`, {
episodeBody: args.message,
source: process.env.SOURCE,
referenceTime: new Date().toISOString(),
sessionId: undefined,
});
return response.data;
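The change above adds a `referenceTime` field to the ingest request body. A minimal sketch of building that payload, using a hypothetical helper (`buildIngestPayload` is not part of the package; only the field names come from the diff above):

```typescript
// Shape of the request body sent to the /ingest endpoint, per the diff above.
interface IngestPayload {
  episodeBody: string;
  source: string | undefined;
  referenceTime: string; // ISO-8601 timestamp recording when the episode was ingested
  sessionId: undefined;
}

// Hypothetical helper mirroring how the payload is assembled.
function buildIngestPayload(message: string, source?: string): IngestPayload {
  return {
    episodeBody: message,
    source,
    // new Date().toISOString() yields e.g. "2025-06-30T18:05:58.000Z"
    referenceTime: new Date().toISOString(),
    sessionId: undefined,
  };
}
```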