Compare commits

...

83 Commits
0.1.20 ... main

Author SHA1 Message Date
Harshith Mullapudi
f038ad5c61 bump: version 0.1.27 2025-10-30 12:32:59 +05:30
Harshith Mullapudi
4f27d2128b fix: worker logging 2025-10-30 12:32:44 +05:30
Harshith Mullapudi
c869096be8
Feat: Space v3
* feat: space v3

* feat: connected space creation

* fix:

* fix: session_id for memory ingestion

* chore: simplify gitignore patterns for agent directories

---------

Co-authored-by: Manoj <saimanoj58@gmail.com>
2025-10-30 12:30:56 +05:30
Harshith Mullapudi
c5407be54d fix: add check to wait for neo4j 2025-10-28 23:46:55 +05:30
Manik
6c37b41ca4 docs:updated agents.md instruction 2025-10-27 21:11:59 +05:30
Manik
023a220d3e Feat:added windsurf guide, improved other guides 2025-10-27 21:11:59 +05:30
Manoj
b9c4fc13c2
Update README.md 2025-10-27 17:29:21 +05:30
Manoj
0ad2bba2ad
Update README.md 2025-10-27 17:26:57 +05:30
Manoj
faad985e48
Update README.md 2025-10-27 17:25:18 +05:30
Harshith Mullapudi
8de059bb2e
Update README.md 2025-10-27 17:24:43 +05:30
Harshith Mullapudi
76228d6aac
Update README.md 2025-10-27 17:24:12 +05:30
Harshith Mullapudi
6ac74a3f0b feat: added railway based deployment 2025-10-26 17:16:44 +05:30
Harshith Mullapudi
b255bbe7e6 feat: added railway based deployment 2025-10-26 17:13:29 +05:30
Harshith Mullapudi
da3d06782e fix: accept redis password in redis connection 2025-10-26 16:29:48 +05:30
Harshith Mullapudi
a727671a30 fix: build is failing because of bad export 2025-10-26 15:16:41 +05:30
Manoj
e7ed6eb288 feat: track token count in recall logs and improve search query documentation 2025-10-26 12:56:40 +05:30
Manoj
5b31c8ed62 refactor: implement hierarchical search ranking with episode graph and source tracking 2025-10-26 12:56:40 +05:30
Harshith Mullapudi
f39c7cc6d0
feat: remove trigger and run base on bullmq (#126)
* feat: remove trigger and run base on bullmq
* fix: telemetry and trigger deploymen
* feat: add Ollama container and update ingestion status for unchanged documents
* feat: add logger to bullmq workers
* 1. Remove chat and deep-search from trigger
2. Add ai/sdk for chat UI
3. Added a better model manager

* refactor: simplify clustered graph query and add stop conditions for AI responses

* fix: streaming

* fix: docker docs

---------

Co-authored-by: Manoj <saimanoj58@gmail.com>
2025-10-26 12:56:12 +05:30
Harshith Mullapudi
b78713df41 fix: add cascade to all files, return 401 when session is cleared 2025-10-24 22:10:41 +05:30
Harshith Mullapudi
6f1037e8e1 fix: added more cascading tables 2025-10-23 10:42:58 +05:30
Manik
af56d7016e Fix: Updated docs; added agents.md instruction in agents guide 2025-10-22 18:31:20 +05:30
Manoj
3a10ee53e8
refactor: add cascade delete for user and workspace relations to simplify deletion logic (#122) 2025-10-21 18:21:41 +05:30
Harshith Mullapudi
ef1c8eac52 fix: default structured for search 2025-10-21 14:19:06 +05:30
Manoj
33bec831c6 feat: add markdown formatting and session compacts to search results 2025-10-21 14:19:06 +05:30
Manoj
8a6b06383e feat: add session compaction models and search integration with Neo4j 2025-10-21 14:19:06 +05:30
Harshith Mullapudi
60dd4bfa6f bump: new version 0.1.25 2025-10-21 11:32:36 +05:30
Harshith Mullapudi
00f983079f 1. Added onboarding
2. fix: remove custom tool calls in chat and link directly to mcp
2025-10-21 11:32:36 +05:30
Manoj
170eed76fb feat: clean up Neo4j graph nodes when deleting user account 2025-10-21 11:32:36 +05:30
Harshith Mullapudi
1db2628af4 fix: remove attribute login 2025-10-21 11:32:36 +05:30
Harshith Mullapudi
95636f96a8 Feat: ability to delete the account and clean up all resources 2025-10-21 11:32:36 +05:30
Harshith Mullapudi
bcae1bd4a1 fix: change the space docs according to mintlify 2025-10-18 22:29:04 +05:30
Sanyam Suyal
e372a38572 docs: add comprehensive Spaces documentation
- Add complete guide for Spaces feature
- Include API reference with 9 endpoints
- Add use cases, examples, and best practices
- Document troubleshooting and advanced features

Closes #98
2025-10-18 22:26:58 +05:30
Harshith Mullapudi
b0e141c2a2 1. fix: delete api key is not working
2. moved docs into main repo
2025-10-18 22:23:13 +05:30
Harshith Mullapudi
d0126797de fix: return error in mcp and activity when there are no credits 2025-10-18 00:18:35 +05:30
Harshith Mullapudi
6732ff71c5
Feat: deep search for extension and obsidian (#107)
* Feat: add deep search api

* Feat: deep search agent

* fix: stream utils for deep search

* fix: deep search

---------

Co-authored-by: Manoj <saimanoj58@gmail.com>
2025-10-15 23:51:21 +05:30
Sandro Munda
7523c99660 Fix typo "Personal" in the README.md file 2025-10-15 00:02:16 +05:30
Manik Aggarwal
e7b43602c5
Update README.md
Added Self-Hosting section in the readme
2025-10-14 17:07:04 +05:30
Manik Aggarwal
c8252a1c89
Update README.md 2025-10-14 11:42:00 +05:30
Manik Aggarwal
0616c1debd
Update README.md 2025-10-14 03:52:51 +05:30
Harshith Mullapudi
ddb7604fb2 fix: onboarding component loading 2025-10-10 10:03:19 +05:30
Harshith Mullapudi
3bdf051b32 fix: add option to remove episode from space 2025-10-09 16:12:30 +05:30
Harshith Mullapudi
a14b83d66d fix: document view is broken in log view 2025-10-09 15:44:24 +05:30
Harshith Mullapudi
2281dab166 fix: remove space description check 2025-10-09 13:20:50 +05:30
Harshith Mullapudi
ecba7f5aa0 fix: add queue for space-assignemt 2025-10-09 13:03:19 +05:30
Harshith Mullapudi
bcc0560cf0
Feat: Space (#93)
* Feat: change space assignment from statement to episode

* feat: add default spaces and improve integration, space tools discovery in MCP

* feat: change spaces to episode based

* Feat: take multiple spaceIds while ingesting

* Feat: modify mcp tool descriptions, add spaceId in mcp url

* feat: add copy

* bump: new version 0.1.24

---------

Co-authored-by: Manoj <saimanoj58@gmail.com>
2025-10-09 12:38:42 +05:30
Manoj
27f8740691
Fix: Semantic Search issue (#89)
* Fix: normalization prompt

* Fix: improve knowledge graph and better recall

* fix: add user context to search reranking

* fix: in search log the source

* fix: remove harcoded limit

---------

Co-authored-by: Harshith Mullapudi <harshithmullapudi@gmail.com>
2025-10-06 14:06:52 +05:30
Manik Aggarwal
3d1b93d97d
Update billing for cloud (#82) 2025-10-02 16:15:08 +05:30
Harshith Mullapudi
665f98d7bf fix: reduce the graph iterations 2025-10-02 13:20:24 +05:30
Harshith Mullapudi
159e003d2e fix: if profile summary is null then it violates the mcp 2025-10-02 13:16:29 +05:30
Harshith Mullapudi
7c737cf51f fix: mcp tool call failing for get_user_profile 2025-10-02 12:25:19 +05:30
Harshith Mullapudi
f0debd5678 fix: for new mcp connections create billing automatically 2025-10-02 11:26:22 +05:30
Harshith Mullapudi
27762262d2 fix: init billing server for new users 2025-10-02 11:25:52 +05:30
Harshith Mullapudi
489fb5934a
feat: add stripe billing for cloud (#81)
* feat: add stripe billing for cloud

* fix: mcp tools
2025-10-02 10:58:11 +05:30
Manoj
92ca34a02f
Fix: episode normalization with high complexity model (#80) 2025-10-02 08:48:56 +05:30
Manik Aggarwal
46407b0fac
Update README.md
Fix the docs URL.
2025-10-02 03:36:34 +05:30
Manik Aggarwal
5347c7a700
Update README.md
Fix the CODEX CLI guide url
2025-10-01 22:56:40 +05:30
Manik Aggarwal
dc9b149445
Update README.md 2025-10-01 21:41:16 +05:30
Manoj
f539ad1ecd
Feat: AWS bedrock support (#78)
* Feat: add support to AWS bedrock

* Feat: add token counter
Feat: high, low complexity model based on task

* feat: add model complexity selection for batch processing tasks
2025-10-01 11:45:35 +05:30
Harshith Mullapudi
7903dd08c3 fix: graph is not working in chrome 140 2025-09-29 09:48:52 +05:30
Harshith Mullapudi
1509e8d502 fix: show loading in onboarding form 2025-09-22 22:26:18 +05:30
Harshith Mullapudi
62fdf6181a 1. fix: when episode is already deleted then directly remove the ingestionQueue
2. fix: mcp tool description
2025-09-22 22:11:35 +05:30
Manoj
812d7dea51
Improve GitHub notification messages and add README for integrations 2025-09-22 19:58:21 +05:30
Harshith Mullapudi
59620151f2 fix: delete log needs no body 2025-09-20 10:29:02 +05:30
Manoj
5150fab210
fix: increase reranking score threshold (#75) 2025-09-19 11:34:39 +05:30
Harshith Mullapudi
a0b3128329 bump: new version 0.1.23 2025-09-19 09:04:16 +05:30
Manoj
a4b6a4f984 fix: skip unchanged docs, and enhance entity extraction prompts 2025-09-19 08:18:59 +05:30
Harshith Mullapudi
840ca64174 fix: UI for document logs
feat: added logs API to delete the episode
2025-09-19 08:18:59 +05:30
Manoj
43c3482351 Feat: generate space summary by topics 2025-09-19 08:18:59 +05:30
Manoj
e89e7c1024
fix: add includeInvalidated option to search queries to optionally show invalidated statements (#73) 2025-09-15 18:38:15 +05:30
Harshith Mullapudi
15d04fb577 feat: obsidian and figma logos 2025-09-15 12:09:40 +05:30
Harshith Mullapudi
a083e2fccf fix: new functions are added to the trigger 2025-09-15 12:05:04 +05:30
Harshith Mullapudi
3de929cdd1 bump: new version 0.1.22 2025-09-15 12:05:04 +05:30
Harshith Mullapudi
d4c4e16ac2 feat: changed the activity UI 2025-09-15 12:05:04 +05:30
Harshith Mullapudi
c1c93e0cb1 fix: ui fixes 2025-09-15 12:05:04 +05:30
Harshith Mullapudi
654de54ab9 fix: remove console logs and gpt-5 to test 2025-09-09 20:47:34 +05:30
Manoj
952386ca0e refactor: make entity handling type-free and simplify entity resolution in knowledge graph 2025-09-09 20:47:34 +05:30
Harshith Mullapudi
6ddcab873a fix: migration for user metadata 2025-09-09 08:58:58 +05:30
Harshith Mullapudi
db7608d735 bump: new version 0.1.21 2025-09-08 23:47:35 +05:30
Harshith Mullapudi
33986e584a fix: save onboarding answers 2025-09-08 23:45:54 +05:30
Manoj
df711b1af6 feat: reduce chunk size to 1-3k tokens and add user profile memory tool 2025-09-08 23:45:54 +05:30
Harshith Mullapudi
35bb158089 fix: save onboarding answers 2025-09-08 23:45:54 +05:30
Harshith Mullapudi
6b165bfa7f fix: onboarding 2025-09-08 23:45:54 +05:30
Manoj
be64630819 Fix: incremental space summary 2025-09-04 10:23:32 +05:30
350 changed files with 29915 additions and 14608 deletions

View File

@ -1,4 +1,4 @@
VERSION=0.1.20
VERSION=0.1.27
# Nest run in docker, change host to database container name
DB_HOST=localhost
@ -41,17 +41,17 @@ NEO4J_USERNAME=neo4j
NEO4J_PASSWORD=27192e6432564f4788d55c15131bd5ac
OPENAI_API_KEY=
MAGIC_LINK_SECRET=27192e6432564f4788d55c15131bd5ac
NEO4J_AUTH=neo4j/27192e6432564f4788d55c15131bd5ac
OLLAMA_URL=http://ollama:11434
EMBEDDING_MODEL=text-embedding-3-small
MODEL=gpt-4.1-2025-04-14
## Trigger ##
TRIGGER_PROJECT_ID=
TRIGGER_SECRET_KEY=
TRIGGER_API_URL=http://host.docker.internal:8030
## AWS Bedrock ##
AWS_ACCESS_KEY_ID=
AWS_SECRET_ACCESS_KEY=
AWS_REGION=us-east-1
QUEUE_PROVIDER=bullmq

View File

@ -7,32 +7,6 @@ on:
workflow_dispatch:
jobs:
build-init:
runs-on: ubuntu-latest
steps:
- uses: actions/checkout@v2
with:
ref: main
- name: Set up QEMU
uses: docker/setup-qemu-action@v1
- name: Set up Docker Buildx
uses: docker/setup-buildx-action@v1
- name: Login to Docker Registry
run: echo "${{ secrets.DOCKER_PASSWORD }}" | docker login -u "${{ secrets.DOCKER_USERNAME }}" --password-stdin
- name: Build and Push Frontend Docker Image
uses: docker/build-push-action@v2
with:
context: .
file: ./apps/init/Dockerfile
platforms: linux/amd64,linux/arm64
push: true
tags: redplanethq/init:${{ github.ref_name }}
build-webapp:
runs-on: ubuntu-latest

10
.gitignore vendored
View File

@ -46,6 +46,14 @@ registry/
.cursor
CLAUDE.md
AGENTS.md
.claude
.clinerules
.kilocode
.roo
.windsurf
.cursor
.kiro
.qoder
.augment

View File

@ -1,7 +0,0 @@
{
"eslint.workingDirectories": [
{
"mode": "auto"
}
]
}

View File

@ -1,4 +1,4 @@
Sol License
Core License
GNU AFFERO GENERAL PUBLIC LICENSE
Version 3, 19 November 2007

106
README.md
View File

@ -33,7 +33,7 @@
<img src="https://github.com/user-attachments/assets/89066cdd-204b-46c2-8ad4-4935f5ca9edd" width="200px" alt="CORE logo" />
</a>
### CORE: Unified Memory Layer for Claude, Cursor, ChatGPT & All AI Tools
### CORE: Your Personal Memory Layer for AI Apps
<p align="center">
<a href="https://deepwiki.com/RedPlanetHQ/core">
@ -41,21 +41,21 @@
</a>
</p>
<p align="center">
<a href="https://docs.heysol.ai/core/overview"><b>Documentation</b></a>
<a href="https://docs.heysol.ai/introduction"><b>Documentation</b></a>
<a href="https://discord.gg/YGUZcvDjUa"><b>Discord</b></a>
</p>
</div>
## 🔥 Research Highlights
CORE memory achieves **88.24%** average accuracy in Locomo dataset across all reasoning tasks, significantly outperforming other memory providers. Check out this [blog](https://blog.heysol.ai/we-built-memory-for-individuals-and-achieved-sota-on-locomo-benchmark/) for more info.
CORE memory achieves **88.24%** average accuracy in Locomo dataset across all reasoning tasks, significantly outperforming other memory providers. Check out this [blog](https://blog.heysol.ai/core-build-memory-knowledge-graph-for-individuals-and-achieved-sota-on-locomo-benchmark/) for more info.
<img width="6048" height="3428" alt="benchmark" src="https://github.com/user-attachments/assets/2e5fdac5-02ed-4d00-9312-c21d09974e1f" />
(1) Single-hop questions require answers based on a single session; (2) Multi-hop questions require synthesizing information from multiple different sessions; (3) Open-domain knowledge questions can be answered by integrating a speakers provided information with external knowledge such as commonsense or world facts; (4) Temporal reasoning questions can be answered through temporal reasoning and capturing time-related data cues within the conversation;
## Overview
**Problem**
**Problem**
Developers waste time re-explaining context to AI tools. Hit token limits in Claude? Start fresh and lose everything. Switch from ChatGPT/Claude to Cursor? Explain your context again. Your conversations, decisions, and insights vanish between sessions. With every new AI tool, the cost of context switching grows.
@ -63,43 +63,83 @@ Developers waste time re-explaining context to AI tools. Hit token limits in Cla
CORE is an open-source unified, persistent memory layer for all your AI tools. Your context follows you from Cursor to Claude to ChatGPT to Claude Code. One knowledge graph remembers who said what, when, and why. Connect once, remember everywhere. Stop managing context and start building.
## 🚀 Get Started
## 🚀 CORE Self-Hosting
Want to run CORE on your own infrastructure? Self-hosting gives you complete control over your data and deployment.
**Quick Deploy Options:**
[![Deploy on Railway](https://railway.com/button.svg)](https://railway.com/deploy/core?referralCode=LHvbIb&utm_medium=integration&utm_source=template&utm_campaign=generic)
**Prerequisites**:
- Docker (20.10.0+) and Docker Compose (2.20.0+) installed
- OpenAI API key
> **Note on Open-Source Models:** We tested OSS options like Ollama and GPT models, but their fact extraction and graph quality fell short. We're actively looking for options.
### Setup
1. Clone the repository:
```
git clone https://github.com/RedPlanetHQ/core.git
cd core
```
2. Configure environment variables in `core/.env`:
```
OPENAI_API_KEY=your_openai_api_key
```
3. Start the service
```
docker-compose up -d
```
Once deployed, you can configure your AI providers (OpenAI, Anthropic) and start building your memory graph.
👉 [View complete self-hosting guide](https://docs.heysol.ai/self-hosting/docker)
Note: We tried open-source models like Ollama or GPT OSS but facts generation were not good, we are still figuring out how to improve on that and then will also support OSS models.
## 🚀 CORE Cloud
**Build your unified memory graph in 5 minutes:**
Don't want to manage infrastructure? CORE Cloud lets you build your personal memory system instantly - no setup, no servers, just memory that works.
1. **Sign Up** at [core.heysol.ai](https://core.heysol.ai) and create your account
2. **Add your first memory** - share context about yourself
<img width="2088" height="1212" alt="first-memory" src="https://github.com/user-attachments/assets/ecfab88e-e91a-474d-9ef5-fc6c19b655a8" />
3. **Visualize your memory graph** and see how CORE automatically forms connections between facts
5. **Test it out** - ask "What do you know about me?" in conversatio section
6. Connect to your tools:
2. **Visualize your memory graph** and see how CORE automatically forms connections between facts
3. **Test it out** - ask "What do you know about me?" in conversation section
4. Connect to your tools:
- [Claude](https://docs.heysol.ai/providers/claude) & [Cursor](https://docs.heysol.ai/providers/cursor) - coding with context
- [CLaude Code CLI](https://docs.heysol.ai/providers/claude-code) & [Gemini CLI](https://docs.heysol.ai/providers/claude-code) - terminal-based coding with memory
- [CLaude Code CLI](https://docs.heysol.ai/providers/claude-code) & [Codex CLI](https://docs.heysol.ai/providers/codex) - terminal-based coding with memory
- [Add Browser Extension](https://docs.heysol.ai/providers/browser-extension) - bring your memory to any website
- [Linear](https://docs.heysol.ai/integrations/linear), [Github](https://docs.heysol.ai/integrations/github) - add project context automatically
## 🧩 Key Features
### 🧠 **Unified, Portable Memory**:
### 🧠 **Unified, Portable Memory**:
Add and recall your memory across **Cursor, Windsurf, Claude Desktop, Claude Code, Gemini CLI, AWS's Kiro, VS Code, and Roo Code** via MCP
![core-claude](https://github.com/user-attachments/assets/56c98288-ee87-4cd0-8b02-860aca1c7f9a)
### 🕸️ **Temporal + Reified Knowledge Graph**:
### 🕸️ **Temporal + Reified Knowledge Graph**:
Remember the story behind every fact—track who said what, when, and why with rich relationships and full provenance, not just flat storage
![core-memory-graph](https://github.com/user-attachments/assets/5d1ee659-d519-4624-85d1-e0497cbdd60a)
### 🌐 **Browser Extension**:
### 🌐 **Browser Extension**:
Save conversations and content from ChatGPT, Grok, Gemini, Twitter, YouTube, blog posts, and any webpage directly into your CORE memory.
**How to Use Extension**
1. [Download the Extension](https://chromewebstore.google.com/detail/core-extension/cglndoindnhdbfcbijikibfjoholdjcc) from the Chrome Web Store.
2. Login to [CORE dashboard](https://core.heysol.ai)
- Navigate to Settings (bottom left)
@ -108,29 +148,26 @@ Save conversations and content from ChatGPT, Grok, Gemini, Twitter, YouTube, blo
https://github.com/user-attachments/assets/6e629834-1b9d-4fe6-ae58-a9068986036a
### 💬 **Chat with Memory**:
### 💬 **Chat with Memory**:
Ask questions like "What are my writing preferences?" with instant insights from your connected knowledge
![chat-with-memory](https://github.com/user-attachments/assets/d798802f-bd51-4daf-b2b5-46de7d206f66)
### ⚡ **Auto-Sync from Apps**:
### ⚡ **Auto-Sync from Apps**:
Automatically capture relevant context from Linear, Slack, Notion, GitHub and other connected apps into your CORE memory
📖 **[View All Integrations](./integrations/README.md)** - Complete list of supported services and their features
![core-slack](https://github.com/user-attachments/assets/d5fefe38-221e-4076-8a44-8ed673960f03)
### 🔗 **MCP Integration Hub**:
### 🔗 **MCP Integration Hub**:
Connect Linear, Slack, GitHub, Notion once to CORE—then use all their tools in Claude, Cursor, or any MCP client with a single URL
![core-linear-claude](https://github.com/user-attachments/assets/7d59d92b-8c56-4745-a7ab-9a3c0341aa32)
## How CORE create memory
<img width="12885" height="3048" alt="memory-ingest-diagram" src="https://github.com/user-attachments/assets/c51679de-8260-4bee-bebf-aff32c6b8e13" />
@ -144,7 +181,6 @@ COREs ingestion pipeline has four phases designed to capture evolving context
The Result: Instead of a flat database, CORE gives you a memory that grows and changes with you - preserving context, evolution, and ownership so agents can actually use it.
![memory-ingest-eg](https://github.com/user-attachments/assets/1d0a8007-153a-4842-9586-f6f4de43e647)
## How CORE recalls from memory
@ -168,9 +204,11 @@ Explore our documentation to get the most out of CORE
- [Self Hosting](https://docs.heysol.ai/self-hosting/overview)
- [Connect Core MCP with Claude](https://docs.heysol.ai/providers/claude)
- [Connect Core MCP with Cursor](https://docs.heysol.ai/providers/cursor)
- [Connect Core MCP with Claude Code](https://docs.heysol.ai/providers/claude-code)
- [Connect Core MCP with Codex](https://docs.heysol.ai/providers/codex)
- [Basic Concepts](https://docs.heysol.ai/overview)
- [API Reference](https://docs.heysol.ai/local-setup)
- [API Reference](https://docs.heysol.ai/api-reference/get-user-profile)
## 🔒 Security
@ -179,7 +217,7 @@ CORE takes security seriously. We implement industry-standard security practices
- **Data Encryption**: All data in transit (TLS 1.3) and at rest (AES-256)
- **Authentication**: OAuth 2.0 and magic link authentication
- **Access Control**: Workspace-based isolation and role-based permissions
- **Vulnerability Reporting**: Please report security issues to harshith@tegon.ai
- **Vulnerability Reporting**: Please report security issues to harshith@poozle.dev
For detailed security information, see our [Security Policy](SECURITY.md).
@ -212,9 +250,11 @@ Have questions or feedback? We're here to help:
<a href="https://github.com/RedPlanetHQ/core/graphs/contributors">
<img src="https://contrib.rocks/image?repo=RedPlanetHQ/core" />
</a>
<<<<<<< Updated upstream
<<<<<<< HEAD
=======
>>>>>>> Stashed changes
>>>>>>> 62db6c1 (feat: automatic space identification)

51
apps/init/.gitignore vendored
View File

@ -1,51 +0,0 @@
# See https://help.github.com/articles/ignoring-files/ for more about ignoring files.
# Dependencies
node_modules
.pnp
.pnp.js
# Local env files
.env
.env.local
.env.development.local
.env.test.local
.env.production.local
# Testing
coverage
# Turbo
.turbo
# Vercel
.vercel
# Build Outputs
.next/
out/
build
dist
.tshy/
.tshy-build/
# Debug
npm-debug.log*
yarn-debug.log*
yarn-error.log*
# Misc
.DS_Store
*.pem
docker-compose.dev.yaml
clickhouse/
.vscode/
registry/
.cursor
CLAUDE.md
.claude

View File

@ -1,70 +0,0 @@
ARG NODE_IMAGE=node:20.11.1-bullseye-slim@sha256:5a5a92b3a8d392691c983719dbdc65d9f30085d6dcd65376e7a32e6fe9bf4cbe
FROM ${NODE_IMAGE} AS pruner
WORKDIR /core
COPY --chown=node:node . .
RUN npx -q turbo@2.5.3 prune --scope=@redplanethq/init --docker
RUN find . -name "node_modules" -type d -prune -exec rm -rf '{}' +
# Base strategy to have layer caching
FROM ${NODE_IMAGE} AS base
RUN apt-get update && apt-get install -y openssl dumb-init postgresql-client
WORKDIR /core
COPY --chown=node:node .gitignore .gitignore
COPY --from=pruner --chown=node:node /core/out/json/ .
COPY --from=pruner --chown=node:node /core/out/pnpm-lock.yaml ./pnpm-lock.yaml
COPY --from=pruner --chown=node:node /core/out/pnpm-workspace.yaml ./pnpm-workspace.yaml
## Dev deps
FROM base AS dev-deps
WORKDIR /core
# Corepack is used to install pnpm
RUN corepack enable
ENV NODE_ENV development
RUN pnpm install --ignore-scripts --no-frozen-lockfile
## Production deps
FROM base AS production-deps
WORKDIR /core
# Corepack is used to install pnpm
RUN corepack enable
ENV NODE_ENV production
RUN pnpm install --prod --no-frozen-lockfile
## Builder (builds the init CLI)
FROM base AS builder
WORKDIR /core
# Corepack is used to install pnpm
RUN corepack enable
COPY --from=pruner --chown=node:node /core/out/full/ .
COPY --from=dev-deps --chown=node:node /core/ .
COPY --chown=node:node turbo.json turbo.json
COPY --chown=node:node .configs/tsconfig.base.json .configs/tsconfig.base.json
RUN pnpm run build --filter=@redplanethq/init...
# Runner
FROM ${NODE_IMAGE} AS runner
RUN apt-get update && apt-get install -y openssl postgresql-client ca-certificates
WORKDIR /core
RUN corepack enable
ENV NODE_ENV production
COPY --from=base /usr/bin/dumb-init /usr/bin/dumb-init
COPY --from=pruner --chown=node:node /core/out/full/ .
COPY --from=production-deps --chown=node:node /core .
COPY --from=builder --chown=node:node /core/apps/init/dist ./apps/init/dist
# Copy the trigger dump file
COPY --chown=node:node apps/init/trigger.dump ./apps/init/trigger.dump
# Copy and set up entrypoint script
COPY --chown=node:node apps/init/entrypoint.sh ./apps/init/entrypoint.sh
RUN chmod +x ./apps/init/entrypoint.sh
USER node
WORKDIR /core/apps/init
ENTRYPOINT ["dumb-init", "--"]
CMD ["./entrypoint.sh"]

View File

@ -1,197 +0,0 @@
# Core CLI
🧠 **CORE - Contextual Observation & Recall Engine**
A Command-Line Interface for setting up and managing the Core development environment.
## Installation
```bash
npm install -g @redplanethq/core
```
## Commands
### `core init`
**One-time setup command** - Initializes the Core development environment with full configuration.
### `core start`
**Daily usage command** - Starts all Core services (Docker containers).
### `core stop`
**Daily usage command** - Stops all Core services (Docker containers).
## Getting Started
### Prerequisites
- **Node.js** (v18.20.0 or higher)
- **Docker** and **Docker Compose**
- **Git**
- **pnpm** package manager
### Initial Setup
1. **Clone the Core repository:**
```bash
git clone https://github.com/redplanethq/core.git
cd core
```
2. **Run the initialization command:**
```bash
core init
```
3. **The CLI will guide you through the complete setup process:**
#### Step 1: Prerequisites Check
- The CLI shows a checklist of required tools
- Confirms you're in the Core repository directory
- Exits with instructions if prerequisites aren't met
#### Step 2: Environment Configuration
- Copies `.env.example` to `.env` in the root directory
- Copies `trigger/.env.example` to `trigger/.env`
- Skips copying if `.env` files already exist
#### Step 3: Docker Services Startup
- Starts main Core services: `docker compose up -d`
- Starts Trigger.dev services: `docker compose up -d` (in trigger/ directory)
- Shows real-time output with progress indicators
#### Step 4: Database Health Check
- Verifies PostgreSQL is running on `localhost:5432`
- Retries for up to 60 seconds if needed
#### Step 5: Trigger.dev Setup (Interactive)
- **If Trigger.dev is not configured:**
1. Prompts you to open http://localhost:8030
2. Asks you to login to Trigger.dev
3. Guides you to create an organization and project
4. Collects your Project ID and Secret Key
5. Updates `.env` with your Trigger.dev configuration
6. Restarts Core services with new configuration
- **If Trigger.dev is already configured:**
- Skips setup and shows "Configuration already exists" message
#### Step 6: Docker Registry Login
- Displays docker login command with credentials from `.env`
- Waits for you to complete the login process
#### Step 7: Trigger.dev Task Deployment
- Automatically runs: `npx trigger.dev@v4-beta login -a http://localhost:8030`
- Deploys tasks with: `pnpm trigger:deploy`
- Shows manual deployment instructions if automatic deployment fails
#### Step 8: Setup Complete!
- Confirms all services are running
- Shows service URLs and connection information
## Daily Usage
After initial setup, use these commands for daily development:
### Start Services
```bash
core start
```
Starts all Docker containers for Core development.
### Stop Services
```bash
core stop
```
Stops all Docker containers.
## Service URLs
After setup, these services will be available:
- **Core Application**: http://localhost:3033
- **Trigger.dev**: http://localhost:8030
- **PostgreSQL**: localhost:5432
## Troubleshooting
### Repository Not Found
If you run commands outside the Core repository:
- The CLI will ask you to confirm you're in the Core repository
- If not, it provides instructions to clone the repository
- Navigate to the Core repository directory before running commands again
### Docker Issues
- Ensure Docker is running
- Check Docker Compose is installed
- Verify you have sufficient system resources
### Trigger.dev Setup Issues
- Check container logs: `docker logs trigger-webapp --tail 50`
- Ensure you can access http://localhost:8030
- Verify your network allows connections to localhost
### Environment Variables
The CLI automatically manages these environment variables:
- `TRIGGER_PROJECT_ID` - Your Trigger.dev project ID
- `TRIGGER_SECRET_KEY` - Your Trigger.dev secret key
- Docker registry credentials for deployment
### Manual Trigger.dev Deployment
If automatic deployment fails, run manually:
```bash
npx trigger.dev@v4-beta login -a http://localhost:8030
pnpm trigger:deploy
```
## Development Workflow
1. **First time setup:** `core init`
2. **Daily development:**
- `core start` - Start your development environment
- Do your development work
- `core stop` - Stop services when done
## Support
For issues and questions:
- Check the main Core repository: https://github.com/redplanethq/core
- Review Docker container logs for troubleshooting
- Ensure all prerequisites are properly installed
## Features
- 🚀 **One-command setup** - Complete environment initialization
- 🔄 **Smart configuration** - Skips already configured components
- 📱 **Real-time feedback** - Live progress indicators and output
- 🐳 **Docker integration** - Full container lifecycle management
- 🔧 **Interactive setup** - Guided configuration process
- 🎯 **Error handling** - Graceful failure with recovery instructions
---
**Happy coding with Core!** 🎉

View File

@ -1,22 +0,0 @@
#!/bin/sh
# Exit on any error
set -e
echo "Starting init CLI..."
# Wait for database to be ready
echo "Waiting for database connection..."
until pg_isready -h "${DB_HOST:-localhost}" -p "${DB_PORT:-5432}" -U "${POSTGRES_USER:-docker}"; do
echo "Database is unavailable - sleeping"
sleep 2
done
echo "Database is ready!"
# Run the init command
echo "Running init command..."
node ./dist/esm/index.js init
echo "Init completed successfully!"
exit 0

View File

@ -1,145 +0,0 @@
{
"name": "@redplanethq/init",
"version": "0.1.0",
"description": "A init service to create trigger instance",
"type": "module",
"license": "MIT",
"repository": {
"type": "git",
"url": "https://github.com/redplanethq/core",
"directory": "apps/init"
},
"publishConfig": {
"access": "public"
},
"keywords": [
"typescript"
],
"files": [
"dist",
"trigger.dump"
],
"bin": {
"core": "./dist/esm/index.js"
},
"tshy": {
"selfLink": false,
"main": false,
"module": false,
"dialects": [
"esm"
],
"project": "./tsconfig.json",
"exclude": [
"**/*.test.ts"
],
"exports": {
"./package.json": "./package.json",
".": "./src/index.ts"
}
},
"devDependencies": {
"@epic-web/test-server": "^0.1.0",
"@types/gradient-string": "^1.1.2",
"@types/ini": "^4.1.1",
"@types/object-hash": "3.0.6",
"@types/polka": "^0.5.7",
"@types/react": "^18.2.48",
"@types/resolve": "^1.20.6",
"@types/rimraf": "^4.0.5",
"@types/semver": "^7.5.0",
"@types/source-map-support": "0.5.10",
"@types/ws": "^8.5.3",
"cpy-cli": "^5.0.0",
"execa": "^8.0.1",
"find-up": "^7.0.0",
"rimraf": "^5.0.7",
"ts-essentials": "10.0.1",
"tshy": "^3.0.2",
"tsx": "4.17.0"
},
"scripts": {
"clean": "rimraf dist .tshy .tshy-build .turbo",
"typecheck": "tsc -p tsconfig.src.json --noEmit",
"build": "tshy",
"test": "vitest",
"test:e2e": "vitest --run -c ./e2e/vitest.config.ts"
},
"dependencies": {
"@clack/prompts": "^0.10.0",
"@depot/cli": "0.0.1-cli.2.80.0",
"@opentelemetry/api": "1.9.0",
"@opentelemetry/api-logs": "0.52.1",
"@opentelemetry/exporter-logs-otlp-http": "0.52.1",
"@opentelemetry/exporter-trace-otlp-http": "0.52.1",
"@opentelemetry/instrumentation": "0.52.1",
"@opentelemetry/instrumentation-fetch": "0.52.1",
"@opentelemetry/resources": "1.25.1",
"@opentelemetry/sdk-logs": "0.52.1",
"@opentelemetry/sdk-node": "0.52.1",
"@opentelemetry/sdk-trace-base": "1.25.1",
"@opentelemetry/sdk-trace-node": "1.25.1",
"@opentelemetry/semantic-conventions": "1.25.1",
"ansi-escapes": "^7.0.0",
"braces": "^3.0.3",
"c12": "^1.11.1",
"chalk": "^5.2.0",
"chokidar": "^3.6.0",
"cli-table3": "^0.6.3",
"commander": "^9.4.1",
"defu": "^6.1.4",
"dotenv": "^16.4.5",
"dotenv-expand": "^12.0.2",
"esbuild": "^0.23.0",
"eventsource": "^3.0.2",
"evt": "^2.4.13",
"fast-npm-meta": "^0.2.2",
"git-last-commit": "^1.0.1",
"gradient-string": "^2.0.2",
"has-flag": "^5.0.1",
"import-in-the-middle": "1.11.0",
"import-meta-resolve": "^4.1.0",
"ini": "^5.0.0",
"jsonc-parser": "3.2.1",
"magicast": "^0.3.4",
"minimatch": "^10.0.1",
"mlly": "^1.7.1",
"nypm": "^0.5.4",
"nanoid": "3.3.8",
"object-hash": "^3.0.0",
"open": "^10.0.3",
"knex": "3.1.0",
"p-limit": "^6.2.0",
"p-retry": "^6.1.0",
"partysocket": "^1.0.2",
"pkg-types": "^1.1.3",
"polka": "^0.5.2",
"pg": "8.16.3",
"resolve": "^1.22.8",
"semver": "^7.5.0",
"signal-exit": "^4.1.0",
"source-map-support": "0.5.21",
"std-env": "^3.7.0",
"supports-color": "^10.0.0",
"tiny-invariant": "^1.2.0",
"tinyexec": "^0.3.1",
"tinyglobby": "^0.2.10",
"uuid": "11.1.0",
"ws": "^8.18.0",
"xdg-app-paths": "^8.3.0",
"zod": "3.23.8",
"zod-validation-error": "^1.5.0"
},
"engines": {
"node": ">=18.20.0"
},
"exports": {
"./package.json": "./package.json",
".": {
"import": {
"types": "./dist/esm/index.d.ts",
"default": "./dist/esm/index.js"
}
}
}
}

View File

@ -1,14 +0,0 @@
import { Command } from "commander";
import { initCommand } from "../commands/init.js";
import { VERSION } from "./version.js";
const program = new Command();
program.name("core").description("Core CLI - A Command-Line Interface for Core").version(VERSION);
program
.command("init")
.description("Initialize Core development environment (run once)")
.action(initCommand);
program.parse(process.argv);

View File

@ -1,3 +0,0 @@
import { env } from "../utils/env.js";
export const VERSION = env.VERSION;

View File

@ -1,36 +0,0 @@
import { intro, outro, note } from "@clack/prompts";
import { printCoreBrainLogo } from "../utils/ascii.js";
import { initTriggerDatabase, updateWorkerImage } from "../utils/trigger.js";
export async function initCommand() {
// Display the CORE brain logo
printCoreBrainLogo();
intro("🚀 Core Development Environment Setup");
try {
await initTriggerDatabase();
await updateWorkerImage();
note(
[
"Your services will start running:",
"",
"• Core Application: http://localhost:3033",
"• Trigger.dev: http://localhost:8030",
"• PostgreSQL: localhost:5432",
"",
"You can now start developing with Core!",
"",
" When logging in to the Core Application, you can find the login URL in the Docker container logs:",
" docker logs core-app --tail 50",
].join("\n"),
"🚀 Services Running"
);
outro("🎉 Setup Complete!");
process.exit(0);
} catch (error: any) {
outro(`❌ Setup failed: ${error.message}`);
process.exit(1);
}
}

View File

@ -1,3 +0,0 @@
#!/usr/bin/env node
import "./cli/index.js";

View File

@ -1,29 +0,0 @@
import chalk from "chalk";
import { VERSION } from "../cli/version.js";
export function printCoreBrainLogo(): void {
const brain = `
o o o
o o---o---o o
o---o o o---o---o
o o---o---o---o o
o---o o o---o---o
o o---o---o o
o o o
`;
console.log(chalk.cyan(brain));
console.log(
chalk.bold.white(
` 🧠 CORE - Contextual Observation & Recall Engine ${VERSION ? chalk.gray(`(${VERSION})`) : ""}\n`
)
);
}

View File

@ -1,24 +0,0 @@
import { z } from "zod";
const EnvironmentSchema = z.object({
// Version
VERSION: z.string().default("0.1.14"),
// Database
DB_HOST: z.string().default("localhost"),
DB_PORT: z.string().default("5432"),
TRIGGER_DB: z.string().default("trigger"),
POSTGRES_USER: z.string().default("docker"),
POSTGRES_PASSWORD: z.string().default("docker"),
// Trigger database
TRIGGER_TASKS_IMAGE: z.string().default("redplanethq/proj_core:latest"),
// Node environment
NODE_ENV: z
.union([z.literal("development"), z.literal("production"), z.literal("test")])
.default("development"),
});
export type Environment = z.infer<typeof EnvironmentSchema>;
export const env = EnvironmentSchema.parse(process.env);

View File

@ -1,182 +0,0 @@
import Knex from "knex";
import path from "path";
import { fileURLToPath } from "url";
import { env } from "./env.js";
import { spinner, note, log } from "@clack/prompts";
const __filename = fileURLToPath(import.meta.url);
const __dirname = path.dirname(__filename);
/**
* Returns a PostgreSQL database URL for the given database name.
* Throws if required environment variables are missing.
*/
export function getDatabaseUrl(dbName: string): string {
const { POSTGRES_USER, POSTGRES_PASSWORD, DB_HOST, DB_PORT } = env;
if (!POSTGRES_USER || !POSTGRES_PASSWORD || !DB_HOST || !DB_PORT || !dbName) {
throw new Error(
"One or more required environment variables are missing: POSTGRES_USER, POSTGRES_PASSWORD, DB_HOST, DB_PORT, dbName"
);
}
return `postgresql://${POSTGRES_USER}:${POSTGRES_PASSWORD}@${DB_HOST}:${DB_PORT}/${dbName}`;
}
/**
* Checks if the database specified by TRIGGER_DB exists, and creates it if it does not.
* Returns { exists: boolean, created: boolean } - exists indicates success, created indicates if database was newly created.
*/
export async function ensureDatabaseExists(): Promise<{ exists: boolean; created: boolean }> {
const { TRIGGER_DB } = env;
if (!TRIGGER_DB) {
throw new Error("TRIGGER_DB environment variable is missing");
}
// Build a connection string to the default 'postgres' database
const adminDbUrl = getDatabaseUrl("postgres");
// Create a Knex instance for the admin connection
const adminKnex = Knex({
client: "pg",
connection: adminDbUrl,
});
const s = spinner();
s.start("Checking for Trigger.dev database...");
try {
// Check if the database exists
const result = await adminKnex.select(1).from("pg_database").where("datname", TRIGGER_DB);
if (result.length === 0) {
s.message("Database not found. Creating...");
// Database does not exist, create it
await adminKnex.raw(`CREATE DATABASE "${TRIGGER_DB}"`);
s.stop("Database created.");
return { exists: true, created: true };
} else {
s.stop("Database exists.");
return { exists: true, created: false };
}
} catch (err) {
s.stop("Failed to ensure database exists.");
log.warning("Failed to ensure database exists: " + (err as Error).message);
return { exists: false, created: false };
} finally {
await adminKnex.destroy();
}
}
// Main initialization function
export async function initTriggerDatabase() {
const { TRIGGER_DB } = env;
if (!TRIGGER_DB) {
throw new Error("TRIGGER_DB environment variable is missing");
}
// Ensure the database exists
const { exists, created } = await ensureDatabaseExists();
if (!exists) {
throw new Error("Failed to create or verify database exists");
}
// Only run pg_restore if the database was newly created
if (!created) {
note("Database already exists, skipping restore from trigger.dump");
return;
}
// Run pg_restore with the trigger.dump file
const dumpFilePath = path.join(__dirname, "../../../trigger.dump");
const connectionString = getDatabaseUrl(TRIGGER_DB);
const s = spinner();
s.start("Restoring database from trigger.dump...");
try {
// Use execSync and capture stdout/stderr, send to spinner.log
const { spawn } = await import("child_process");
await new Promise<void>((resolve, reject) => {
const child = spawn(
"pg_restore",
["--verbose", "--no-acl", "--no-owner", "-d", connectionString, dumpFilePath],
{ stdio: ["ignore", "pipe", "pipe"] }
);
child.stdout.on("data", (data) => {
s.message(data.toString());
});
child.stderr.on("data", (data) => {
s.message(data.toString());
});
child.on("close", (code) => {
if (code === 0) {
s.stop("Database restored successfully from trigger.dump");
resolve();
} else {
s.stop("Failed to restore database.");
log.warning(`Failed to restore database: pg_restore exited with code ${code}`);
reject(new Error(`Database restore failed: pg_restore exited with code ${code}`));
}
});
child.on("error", (err) => {
s.stop("Failed to restore database.");
log.warning("Failed to restore database: " + err.message);
reject(new Error(`Database restore failed: ${err.message}`));
});
});
} catch (error: any) {
s.stop("Failed to restore database.");
log.warning("Failed to restore database: " + error.message);
throw new Error(`Database restore failed: ${error.message}`);
}
}
export async function updateWorkerImage() {
const { TRIGGER_DB, TRIGGER_TASKS_IMAGE } = env;
if (!TRIGGER_DB) {
throw new Error("TRIGGER_DB environment variable is missing");
}
const connectionString = getDatabaseUrl(TRIGGER_DB);
const knex = Knex({
client: "pg",
connection: connectionString,
});
const s = spinner();
s.start("Updating worker image reference...");
try {
// Get the first record from WorkerDeployment table
const firstWorkerDeployment = await knex("WorkerDeployment").select("id").first();
if (!firstWorkerDeployment) {
s.stop("No WorkerDeployment records found, skipping image update");
note("No WorkerDeployment records found, skipping image update");
return;
}
// Update the imageReference column with the TRIGGER_TASKS_IMAGE value
await knex("WorkerDeployment").where("id", firstWorkerDeployment.id).update({
imageReference: TRIGGER_TASKS_IMAGE,
updatedAt: new Date(),
});
s.stop(`Successfully updated worker image reference to: ${TRIGGER_TASKS_IMAGE}`);
} catch (error: any) {
s.stop("Failed to update worker image.");
log.warning("Failed to update worker image: " + error.message);
throw new Error(`Worker image update failed: ${error.message}`);
} finally {
await knex.destroy();
}
}

Binary file not shown.

View File

@ -1,40 +0,0 @@
{
"include": ["./src/**/*.ts"],
"exclude": ["./src/**/*.test.ts"],
"compilerOptions": {
"target": "es2022",
"lib": ["ES2022", "DOM", "DOM.Iterable", "DOM.AsyncIterable"],
"module": "NodeNext",
"moduleResolution": "NodeNext",
"moduleDetection": "force",
"verbatimModuleSyntax": false,
"jsx": "react",
"strict": true,
"alwaysStrict": true,
"strictPropertyInitialization": true,
"skipLibCheck": true,
"forceConsistentCasingInFileNames": true,
"noUnusedLocals": false,
"noUnusedParameters": false,
"noImplicitAny": true,
"noImplicitReturns": true,
"noImplicitThis": true,
"noFallthroughCasesInSwitch": true,
"resolveJsonModule": true,
"removeComments": false,
"esModuleInterop": true,
"emitDecoratorMetadata": false,
"experimentalDecorators": false,
"downlevelIteration": true,
"isolatedModules": true,
"noUncheckedIndexedAccess": true,
"pretty": true,
"isolatedDeclarations": false,
"composite": true,
"sourceMap": true
}
}

View File

@ -1,8 +0,0 @@
import { configDefaults, defineConfig } from "vitest/config";
export default defineConfig({
test: {
globals: true,
exclude: [...configDefaults.exclude, "e2e/**/*"],
},
});

View File

@ -0,0 +1,50 @@
import Redis, { type RedisOptions } from "ioredis";
let redisConnection: Redis | null = null;
/**
* Get or create a Redis connection for BullMQ
* This connection is shared across all queues and workers
*/
export function getRedisConnection() {
if (redisConnection) {
return redisConnection;
}
// Build the Redis connection options from environment variables
const redisConfig: RedisOptions = {
host: process.env.REDIS_HOST,
port: parseInt(process.env.REDIS_PORT as string),
password: process.env.REDIS_PASSWORD,
maxRetriesPerRequest: null, // Required for BullMQ
enableReadyCheck: false, // Required for BullMQ
};
// Add TLS configuration if not disabled
if (!process.env.REDIS_TLS_DISABLED) {
redisConfig.tls = {};
}
redisConnection = new Redis(redisConfig);
redisConnection.on("error", (error) => {
console.error("Redis connection error:", error);
});
redisConnection.on("connect", () => {
console.log("Redis connected successfully");
});
return redisConnection;
}
/**
* Close the Redis connection (useful for graceful shutdown)
*/
export async function closeRedisConnection(): Promise<void> {
if (redisConnection) {
await redisConnection.quit();
redisConnection = null;
}
}
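
A minimal usage sketch for this helper, assuming it lives at `apps/webapp/app/bullmq/connection.ts` (consistent with the relative imports elsewhere in this diff; the compare view omits file paths): queues and workers share the single ioredis instance, and `closeRedisConnection()` runs last during shutdown. Queue and worker names here are illustrative.

```typescript
import { Queue, Worker } from "bullmq";
import { getRedisConnection, closeRedisConnection } from "./connection";

// Both the queue and the worker reuse the shared ioredis connection.
const exampleQueue = new Queue("example-queue", {
  connection: getRedisConnection(),
});

const exampleWorker = new Worker(
  "example-queue",
  async (job) => {
    console.log("processing", job.id, job.data);
  },
  { connection: getRedisConnection() },
);

// On shutdown, close workers and queues first, then the shared connection.
export async function shutdownExample() {
  await exampleWorker.close();
  await exampleQueue.close();
  await closeRedisConnection();
}
```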

View File

@ -0,0 +1,160 @@
/**
* BullMQ Queues
*
* All queue definitions for the BullMQ implementation
*/
import { Queue } from "bullmq";
import { getRedisConnection } from "../connection";
/**
* Episode ingestion queue
* Handles individual episode ingestion (including document chunks)
*/
export const ingestQueue = new Queue("ingest-queue", {
connection: getRedisConnection(),
defaultJobOptions: {
attempts: 3,
backoff: {
type: "exponential",
delay: 2000,
},
removeOnComplete: {
age: 3600, // Keep completed jobs for 1 hour
count: 1000, // Keep last 1000 completed jobs
},
removeOnFail: {
age: 86400, // Keep failed jobs for 24 hours
},
},
});
/**
* Document ingestion queue
* Handles document-level ingestion with differential processing
*/
export const documentIngestQueue = new Queue("document-ingest-queue", {
connection: getRedisConnection(),
defaultJobOptions: {
attempts: 3,
backoff: {
type: "exponential",
delay: 2000,
},
removeOnComplete: {
age: 3600,
count: 1000,
},
removeOnFail: {
age: 86400,
},
},
});
/**
* Conversation title creation queue
*/
export const conversationTitleQueue = new Queue("conversation-title-queue", {
connection: getRedisConnection(),
defaultJobOptions: {
attempts: 3,
backoff: {
type: "exponential",
delay: 2000,
},
removeOnComplete: {
age: 3600,
count: 1000,
},
removeOnFail: {
age: 86400,
},
},
});
/**
* Session compaction queue
*/
export const sessionCompactionQueue = new Queue("session-compaction-queue", {
connection: getRedisConnection(),
defaultJobOptions: {
attempts: 3,
backoff: {
type: "exponential",
delay: 2000,
},
removeOnComplete: {
age: 3600,
count: 1000,
},
removeOnFail: {
age: 86400,
},
},
});
/**
* BERT topic analysis queue
* Handles CPU-intensive topic modeling on user episodes
*/
export const bertTopicQueue = new Queue("bert-topic-queue", {
connection: getRedisConnection(),
defaultJobOptions: {
attempts: 2, // Only 2 attempts due to long runtime
backoff: {
type: "exponential",
delay: 5000,
},
removeOnComplete: {
age: 7200, // Keep completed jobs for 2 hours
count: 100,
},
removeOnFail: {
age: 172800, // Keep failed jobs for 48 hours (for debugging)
},
},
});
/**
* Space assignment queue
* Handles assigning episodes to spaces based on semantic matching
*/
export const spaceAssignmentQueue = new Queue("space-assignment-queue", {
connection: getRedisConnection(),
defaultJobOptions: {
attempts: 3,
backoff: {
type: "exponential",
delay: 2000,
},
removeOnComplete: {
age: 3600,
count: 1000,
},
removeOnFail: {
age: 86400,
},
},
});
/**
* Space summary queue
* Handles generating summaries for spaces
*/
export const spaceSummaryQueue = new Queue("space-summary-queue", {
connection: getRedisConnection(),
defaultJobOptions: {
attempts: 3,
backoff: {
type: "exponential",
delay: 2000,
},
removeOnComplete: {
age: 3600,
count: 1000,
},
removeOnFail: {
age: 86400,
},
},
});
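
A hedged producer sketch for these queues; the job name and payload fields are assumptions for illustration, not taken from this diff. Jobs added this way inherit the queue's default attempts, exponential backoff, and cleanup options.

```typescript
import { ingestQueue } from "./queues";

// Hypothetical producer: payload shape is illustrative only.
export async function enqueueEpisodeExample(userId: string, episodeBody: string) {
  return ingestQueue.add(
    "ingest-episode",
    { userId, episodeBody },
    {
      // Embedding the userId in the jobId lets the job-finder utilities
      // further below match this job by user.
      jobId: `ingest-${userId}-${Date.now()}`,
    },
  );
}
```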

View File

@ -0,0 +1,154 @@
/**
* BullMQ Worker Startup Script
*
* This script starts all BullMQ workers for processing background jobs.
* Run this as a separate process alongside your main application.
*
* Usage:
* tsx apps/webapp/app/bullmq/start-workers.ts
*/
import { logger } from "~/services/logger.service";
import {
ingestWorker,
documentIngestWorker,
conversationTitleWorker,
sessionCompactionWorker,
closeAllWorkers,
bertTopicWorker,
spaceAssignmentWorker,
spaceSummaryWorker,
} from "./workers";
import {
ingestQueue,
documentIngestQueue,
conversationTitleQueue,
sessionCompactionQueue,
bertTopicQueue,
spaceAssignmentQueue,
spaceSummaryQueue,
} from "./queues";
import {
setupWorkerLogging,
startPeriodicMetricsLogging,
} from "./utils/worker-logger";
let metricsInterval: NodeJS.Timeout | null = null;
/**
* Initialize and start all BullMQ workers with comprehensive logging
*/
export async function initWorkers(): Promise<void> {
// Setup comprehensive logging for all workers
setupWorkerLogging(ingestWorker, ingestQueue, "ingest-episode");
setupWorkerLogging(
documentIngestWorker,
documentIngestQueue,
"ingest-document",
);
setupWorkerLogging(
conversationTitleWorker,
conversationTitleQueue,
"conversation-title",
);
setupWorkerLogging(
sessionCompactionWorker,
sessionCompactionQueue,
"session-compaction",
);
setupWorkerLogging(bertTopicWorker, bertTopicQueue, "bert-topic");
setupWorkerLogging(
spaceAssignmentWorker,
spaceAssignmentQueue,
"space-assignment",
);
setupWorkerLogging(spaceSummaryWorker, spaceSummaryQueue, "space-summary");
// Start periodic metrics logging (every 60 seconds)
metricsInterval = startPeriodicMetricsLogging(
[
{ worker: ingestWorker, queue: ingestQueue, name: "ingest-episode" },
{
worker: documentIngestWorker,
queue: documentIngestQueue,
name: "ingest-document",
},
{
worker: conversationTitleWorker,
queue: conversationTitleQueue,
name: "conversation-title",
},
{
worker: sessionCompactionWorker,
queue: sessionCompactionQueue,
name: "session-compaction",
},
{
worker: bertTopicWorker,
queue: bertTopicQueue,
name: "bert-topic",
},
{
worker: spaceAssignmentWorker,
queue: spaceAssignmentQueue,
name: "space-assignment",
},
{
worker: spaceSummaryWorker,
queue: spaceSummaryQueue,
name: "space-summary",
},
],
60000, // Log metrics every 60 seconds
);
// Log worker startup
logger.log("\n🚀 Starting BullMQ workers...");
logger.log("─".repeat(80));
logger.log(`✓ Ingest worker: ${ingestWorker.name} (concurrency: ${ingestWorker.opts.concurrency})`);
logger.log(
`✓ Document ingest worker: ${documentIngestWorker.name} (concurrency: 3)`,
);
logger.log(
`✓ Conversation title worker: ${conversationTitleWorker.name} (concurrency: 10)`,
);
logger.log(
`✓ Session compaction worker: ${sessionCompactionWorker.name} (concurrency: 3)`,
);
logger.log("─".repeat(80));
logger.log("✅ All BullMQ workers started and listening for jobs");
logger.log("📊 Metrics will be logged every 60 seconds\n");
}
/**
* Shutdown all workers gracefully
*/
export async function shutdownWorkers(): Promise<void> {
logger.log("Shutdown signal received, closing workers gracefully...");
if (metricsInterval) {
clearInterval(metricsInterval);
}
await closeAllWorkers();
}
// If running as standalone script, initialize workers
if (import.meta.url === `file://${process.argv[1]}`) {
initWorkers();
// Handle graceful shutdown
const shutdown = async () => {
await shutdownWorkers();
process.exit(0);
};
process.on("SIGTERM", shutdown);
process.on("SIGINT", shutdown);
}

View File

@ -0,0 +1,132 @@
/**
* BullMQ Job Finder Utilities
*
* Helper functions to find, retrieve, and cancel BullMQ jobs
*/
interface JobInfo {
id: string;
isCompleted: boolean;
status?: string;
}
/**
* Get all active queues
*/
async function getAllQueues() {
const {
ingestQueue,
documentIngestQueue,
conversationTitleQueue,
sessionCompactionQueue,
} = await import("../queues");
return [
ingestQueue,
documentIngestQueue,
conversationTitleQueue,
sessionCompactionQueue,
];
}
/**
* Find jobs by tags (metadata stored in job data)
* Since BullMQ doesn't have native tag support like Trigger.dev,
* we search through jobs and check if their data contains the required identifiers
*/
export async function getJobsByTags(
tags: string[],
taskIdentifier?: string,
): Promise<JobInfo[]> {
const queues = await getAllQueues();
const matchingJobs: JobInfo[] = [];
for (const queue of queues) {
// Skip if taskIdentifier is specified and doesn't match queue name
if (taskIdentifier && !queue.name.includes(taskIdentifier)) {
continue;
}
// Get all active and waiting jobs
const [active, waiting, delayed] = await Promise.all([
queue.getActive(),
queue.getWaiting(),
queue.getDelayed(),
]);
const allJobs = [...active, ...waiting, ...delayed];
for (const job of allJobs) {
// Check if job data contains all required tags
const jobData = job.data as any;
const matchesTags = tags.every(
(tag) =>
job.id?.includes(tag) ||
jobData.userId === tag ||
jobData.workspaceId === tag ||
jobData.queueId === tag,
);
if (matchesTags) {
const state = await job.getState();
matchingJobs.push({
id: job.id!,
isCompleted: state === "completed" || state === "failed",
status: state,
});
}
}
}
return matchingJobs;
}
/**
* Get a specific job by ID across all queues
*/
export async function getJobById(jobId: string): Promise<JobInfo | null> {
const queues = await getAllQueues();
for (const queue of queues) {
try {
const job = await queue.getJob(jobId);
if (job) {
const state = await job.getState();
return {
id: job.id!,
isCompleted: state === "completed" || state === "failed",
status: state,
};
}
} catch {
// Job not in this queue, continue
continue;
}
}
return null;
}
/**
* Cancel a job by ID
*/
export async function cancelJobById(jobId: string): Promise<void> {
const queues = await getAllQueues();
for (const queue of queues) {
try {
const job = await queue.getJob(jobId);
if (job) {
const state = await job.getState();
// Only remove if not already completed
if (state !== "completed" && state !== "failed") {
await job.remove();
}
return;
}
} catch {
// Job not in this queue, continue
continue;
}
}
}
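
A sketch of how these helpers might be combined to cancel a user's pending work; the import path and the use of the userId as a tag are assumptions consistent with the matching logic above.

```typescript
import { getJobsByTags, cancelJobById } from "./job-finder";

// Cancel every job that is not yet completed and whose id or data
// references the given user.
export async function cancelPendingJobsForUser(userId: string): Promise<number> {
  const jobs = await getJobsByTags([userId]);
  let cancelled = 0;
  for (const job of jobs) {
    if (!job.isCompleted) {
      await cancelJobById(job.id);
      cancelled++;
    }
  }
  return cancelled;
}
```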

View File

@ -0,0 +1,184 @@
/**
* BullMQ Worker Logger
*
* Comprehensive logging utility for tracking worker status, queue metrics,
* and job lifecycle events
*/
import { type Worker, type Queue } from "bullmq";
import { logger } from "~/services/logger.service";
interface WorkerMetrics {
name: string;
concurrency: number;
activeJobs: number;
waitingJobs: number;
delayedJobs: number;
failedJobs: number;
completedJobs: number;
}
/**
* Setup comprehensive logging for a worker
*/
export function setupWorkerLogging(
worker: Worker,
queue: Queue,
workerName: string,
): void {
// Job picked up and started processing
worker.on("active", async (job) => {
const counts = await getQueueCounts(queue);
logger.log(
`[${workerName}] 🔄 Job started: ${job.id} | Queue: ${counts.waiting} waiting, ${counts.active} active, ${counts.delayed} delayed`,
);
});
// Job completed successfully
worker.on("completed", async (job, result) => {
const counts = await getQueueCounts(queue);
const duration = job.finishedOn ? job.finishedOn - job.processedOn! : 0;
logger.log(
`[${workerName}] ✅ Job completed: ${job.id} (${duration}ms) | Queue: ${counts.waiting} waiting, ${counts.active} active`,
);
});
// Job failed
worker.on("failed", async (job, error) => {
const counts = await getQueueCounts(queue);
const attempt = job?.attemptsMade || 0;
const maxAttempts = job?.opts?.attempts || 3;
logger.error(
`[${workerName}] ❌ Job failed: ${job?.id} (attempt ${attempt}/${maxAttempts}) | Error: ${error.message} | Queue: ${counts.waiting} waiting, ${counts.failed} failed`,
);
});
// Job progress update (if job reports progress)
worker.on("progress", async (job, progress) => {
logger.log(`[${workerName}] 📊 Job progress: ${job.id} - ${progress}%`);
});
// Worker stalled (job took too long)
worker.on("stalled", async (jobId) => {
logger.warn(`[${workerName}] ⚠️ Job stalled: ${jobId}`);
});
// Worker error
worker.on("error", (error) => {
logger.error(`[${workerName}] 🔥 Worker error: ${error.message}`);
});
// Worker closed
worker.on("closed", () => {
logger.log(`[${workerName}] 🛑 Worker closed`);
});
}
/**
* Get queue counts for logging
*/
async function getQueueCounts(queue: Queue): Promise<{
waiting: number;
active: number;
delayed: number;
failed: number;
completed: number;
}> {
try {
const counts = await queue.getJobCounts(
"waiting",
"active",
"delayed",
"failed",
"completed",
);
return {
waiting: counts.waiting || 0,
active: counts.active || 0,
delayed: counts.delayed || 0,
failed: counts.failed || 0,
completed: counts.completed || 0,
};
} catch (error) {
return { waiting: 0, active: 0, delayed: 0, failed: 0, completed: 0 };
}
}
/**
* Get metrics for all workers
*/
export async function getAllWorkerMetrics(
workers: Array<{ worker: Worker; queue: Queue; name: string }>,
): Promise<WorkerMetrics[]> {
const metrics = await Promise.all(
workers.map(async ({ worker, queue, name }) => {
const counts = await getQueueCounts(queue);
return {
name,
concurrency: worker.opts.concurrency || 1,
activeJobs: counts.active,
waitingJobs: counts.waiting,
delayedJobs: counts.delayed,
failedJobs: counts.failed,
completedJobs: counts.completed,
};
}),
);
return metrics;
}
/**
* Log worker metrics summary
*/
export function logWorkerMetrics(metrics: WorkerMetrics[]): void {
logger.log("\n📊 BullMQ Worker Metrics:");
logger.log("─".repeat(80));
for (const metric of metrics) {
logger.log(
`[${metric.name.padEnd(25)}] Concurrency: ${metric.concurrency} | ` +
`Active: ${metric.activeJobs} | Waiting: ${metric.waitingJobs} | ` +
`Delayed: ${metric.delayedJobs} | Failed: ${metric.failedJobs} | ` +
`Completed: ${metric.completedJobs}`,
);
}
const totals = metrics.reduce(
(acc, m) => ({
active: acc.active + m.activeJobs,
waiting: acc.waiting + m.waitingJobs,
delayed: acc.delayed + m.delayedJobs,
failed: acc.failed + m.failedJobs,
completed: acc.completed + m.completedJobs,
}),
{ active: 0, waiting: 0, delayed: 0, failed: 0, completed: 0 },
);
logger.log("─".repeat(80));
logger.log(
`[TOTAL] Active: ${totals.active} | Waiting: ${totals.waiting} | ` +
`Delayed: ${totals.delayed} | Failed: ${totals.failed} | ` +
`Completed: ${totals.completed}`,
);
logger.log("─".repeat(80) + "\n");
}
/**
* Start periodic metrics logging
*/
export function startPeriodicMetricsLogging(
workers: Array<{ worker: Worker; queue: Queue; name: string }>,
intervalMs: number = 60000, // Default: 1 minute
): NodeJS.Timeout {
const logMetrics = async () => {
const metrics = await getAllWorkerMetrics(workers);
logWorkerMetrics(metrics);
};
// Log immediately on start
logMetrics();
// Then log periodically
return setInterval(logMetrics, intervalMs);
}
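For context, a minimal wiring sketch for these helpers. Illustrative only: the import paths, the "./worker-logging" module name, and the 5-minute interval are assumptions, not taken from this diff.
import { Queue } from "bullmq";
import { getRedisConnection } from "../connection";
import { ingestWorker, documentIngestWorker, closeAllWorkers } from "./workers";
import { startPeriodicMetricsLogging } from "./worker-logging";
// Pair each worker with a Queue handle so the metrics helpers can read job counts
const monitored = [
  {
    worker: ingestWorker,
    queue: new Queue("ingest-queue", { connection: getRedisConnection() }),
    name: "ingest",
  },
  {
    worker: documentIngestWorker,
    queue: new Queue("document-ingest-queue", { connection: getRedisConnection() }),
    name: "document-ingest",
  },
];
// Logs one metrics table immediately, then every 5 minutes
const metricsTimer = startPeriodicMetricsLogging(monitored, 5 * 60 * 1000);
// During shutdown: clearInterval(metricsTimer) stops the logging, closeAllWorkers() drains the workers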

View File

@ -0,0 +1,200 @@
/**
* BullMQ Workers
*
* All worker definitions for processing background jobs with BullMQ
*/
import { Worker } from "bullmq";
import { getRedisConnection } from "../connection";
import {
processEpisodeIngestion,
type IngestEpisodePayload,
} from "~/jobs/ingest/ingest-episode.logic";
import {
processDocumentIngestion,
type IngestDocumentPayload,
} from "~/jobs/ingest/ingest-document.logic";
import {
processConversationTitleCreation,
type CreateConversationTitlePayload,
} from "~/jobs/conversation/create-title.logic";
import {
processSessionCompaction,
type SessionCompactionPayload,
} from "~/jobs/session/session-compaction.logic";
import {
processTopicAnalysis,
type TopicAnalysisPayload,
} from "~/jobs/bert/topic-analysis.logic";
import {
enqueueIngestEpisode,
enqueueSpaceAssignment,
enqueueSessionCompaction,
enqueueBertTopicAnalysis,
enqueueSpaceSummary,
} from "~/lib/queue-adapter.server";
import { logger } from "~/services/logger.service";
import {
processSpaceAssignment,
type SpaceAssignmentPayload,
} from "~/jobs/spaces/space-assignment.logic";
import {
processSpaceSummary,
type SpaceSummaryPayload,
} from "~/jobs/spaces/space-summary.logic";
/**
* Episode ingestion worker
* Processes individual episode ingestion jobs with global concurrency
*
 * Note: BullMQ applies a single global concurrency limit (set via the concurrency option below).
* Trigger.dev uses per-user concurrency via concurrencyKey.
* For most open-source deployments, global concurrency is sufficient.
*/
export const ingestWorker = new Worker(
"ingest-queue",
async (job) => {
const payload = job.data as IngestEpisodePayload;
return await processEpisodeIngestion(
payload,
// Callbacks to enqueue follow-up jobs
enqueueSpaceAssignment,
enqueueSessionCompaction,
enqueueBertTopicAnalysis,
);
},
{
connection: getRedisConnection(),
concurrency: 1, // Global limit: process one job at a time
},
);
/**
* Document ingestion worker
* Handles document-level ingestion with differential processing
*
* Note: Per-user concurrency is achieved by using userId as part of the jobId
* when adding jobs to the queue
*/
export const documentIngestWorker = new Worker(
"document-ingest-queue",
async (job) => {
const payload = job.data as IngestDocumentPayload;
return await processDocumentIngestion(
payload,
// Callback to enqueue episode ingestion for each chunk
enqueueIngestEpisode,
);
},
{
connection: getRedisConnection(),
concurrency: 3, // Process up to 3 documents in parallel
},
);
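To make the per-user jobId note above concrete, a hypothetical enqueue sketch using the raw BullMQ API. The real helpers live in ~/lib/queue-adapter.server and their signatures are not shown in this diff; the userId/documentId parameters and the retry options here are assumptions.
import { Queue } from "bullmq";
const documentQueue = new Queue("document-ingest-queue", {
  connection: getRedisConnection(),
});
async function enqueueDocumentForUser(
  userId: string,
  documentId: string,
  payload: IngestDocumentPayload,
) {
  // Embedding userId in the jobId scopes jobs to a user and de-duplicates
  // repeated submissions of the same document while a copy is still queued.
  await documentQueue.add("ingest-document", payload, {
    jobId: `${userId}:${documentId}`,
    attempts: 3,
    backoff: { type: "exponential", delay: 5000 },
  });
}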
/**
* Conversation title creation worker
*/
export const conversationTitleWorker = new Worker(
"conversation-title-queue",
async (job) => {
const payload = job.data as CreateConversationTitlePayload;
return await processConversationTitleCreation(payload);
},
{
connection: getRedisConnection(),
concurrency: 10, // Process up to 10 title creations in parallel
},
);
/**
* Session compaction worker
*/
export const sessionCompactionWorker = new Worker(
"session-compaction-queue",
async (job) => {
const payload = job.data as SessionCompactionPayload;
return await processSessionCompaction(payload);
},
{
connection: getRedisConnection(),
concurrency: 3, // Process up to 3 compactions in parallel
},
);
/**
* BERT topic analysis worker
* Handles CPU-intensive topic modeling
*/
export const bertTopicWorker = new Worker(
"bert-topic-queue",
async (job) => {
const payload = job.data as TopicAnalysisPayload;
return await processTopicAnalysis(
payload,
// Callback to enqueue space summary
enqueueSpaceSummary,
);
},
{
connection: getRedisConnection(),
concurrency: 2, // Process up to 2 analyses in parallel (CPU-intensive)
},
);
/**
* Space assignment worker
* Handles assigning episodes to spaces based on semantic matching
*
* Note: Global concurrency of 1 ensures sequential processing.
* Trigger.dev uses per-user concurrency via concurrencyKey.
*/
export const spaceAssignmentWorker = new Worker(
"space-assignment-queue",
async (job) => {
const payload = job.data as SpaceAssignmentPayload;
return await processSpaceAssignment(
payload,
// Callback to enqueue space summary
enqueueSpaceSummary,
);
},
{
connection: getRedisConnection(),
concurrency: 1, // Global limit: process one job at a time
},
);
/**
* Space summary worker
* Handles generating summaries for spaces
*/
export const spaceSummaryWorker = new Worker(
"space-summary-queue",
async (job) => {
const payload = job.data as SpaceSummaryPayload;
return await processSpaceSummary(payload);
},
{
connection: getRedisConnection(),
concurrency: 1, // Process one space summary at a time
},
);
/**
* Graceful shutdown handler
*/
export async function closeAllWorkers(): Promise<void> {
await Promise.all([
ingestWorker.close(),
documentIngestWorker.close(),
conversationTitleWorker.close(),
sessionCompactionWorker.close(),
bertTopicWorker.close(),
spaceSummaryWorker.close(),
spaceAssignmentWorker.close(),
]);
logger.log("All BullMQ workers closed");
}
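A minimal shutdown-hook sketch, assuming the worker entrypoint wires the process signals itself; the signal names and exit code are the usual Node conventions, not taken from this diff.
async function shutdown(signal: string) {
  logger.log(`Received ${signal}, closing BullMQ workers...`);
  await closeAllWorkers();
  process.exit(0);
}
process.once("SIGTERM", () => shutdown("SIGTERM"));
process.once("SIGINT", () => shutdown("SIGINT"));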

View File

@ -1,146 +0,0 @@
import React, { useMemo } from "react";
import CalendarHeatmap from "react-calendar-heatmap";
import { cn } from "~/lib/utils";
import { Popover, PopoverAnchor, PopoverContent } from "../ui/popover";
interface ContributionGraphProps {
data: Array<{
date: string;
count: number;
status?: string;
}>;
className?: string;
}
export function ContributionGraph({ data, className }: ContributionGraphProps) {
const [open, setOpen] = React.useState(false);
const [anchor, setAnchor] = React.useState<{ x: number; y: number } | null>(
null,
);
const [active, setActive] = React.useState<any>(null);
const containerRef = React.useRef<HTMLDivElement>(null);
const processedData = useMemo(() => {
const endDate = new Date();
const startDate = new Date();
startDate.setFullYear(endDate.getFullYear() - 1);
return data.map((item) => ({
date: item.date,
count: item.count,
status: item.status,
}));
}, [data]);
const getClassForValue = (value: any) => {
if (!value || value.count === 0) {
return "fill-background dark:fill-background";
}
const count = value.count;
if (count >= 20) return "fill-success";
if (count >= 15) return "fill-success/85";
if (count >= 10) return "fill-success/70";
if (count >= 5) return "fill-success/50";
return "fill-success/30";
};
const getTitleForValue = (value: any) => {
if (!value || value.count === 0) {
return `No activity on ${value?.date || "this date"}`;
}
const count = value.count;
const date = new Date(value.date).toLocaleDateString();
return `${count} ${count === 1 ? "activity" : "activities"} on ${date}`;
};
const endDate = new Date();
const startDate = new Date();
startDate.setFullYear(endDate.getFullYear() - 1);
// Position helpers: convert client coords to container-local coords
const getLocalPoint = (e: React.MouseEvent<SVGRectElement, MouseEvent>) => {
const rect = containerRef.current?.getBoundingClientRect();
if (!rect) return { x: e.clientX, y: e.clientY };
return { x: e.clientX - rect.left, y: e.clientY - rect.top };
};
return (
<div
ref={containerRef}
className={cn("flex w-full flex-col justify-center", className)}
>
<Popover open={open} onOpenChange={setOpen}>
{anchor && (
<PopoverAnchor
// Absolutely position the anchor relative to the container
style={{
position: "absolute",
left: anchor.x,
top: anchor.y,
width: 1,
height: 1,
}}
/>
)}
<PopoverContent
className="shadow-1 bg-background-3 w-fit p-2"
side="top"
align="center"
>
{active ? (
<div className="space-y-1">
<div className="text-sm font-medium">
{new Date(active.date).toDateString()}
</div>
<div className="text-muted-foreground text-sm">
{active.count ?? 0} events
</div>
{active.meta?.notes && (
<p className="mt-2 text-sm">{active.meta.notes}</p>
)}
</div>
) : (
<div className="text-sm">No data</div>
)}
</PopoverContent>
</Popover>
<div className="overflow-x-auto rounded-lg">
<CalendarHeatmap
startDate={startDate}
endDate={endDate}
values={processedData}
classForValue={getClassForValue}
titleForValue={getTitleForValue}
showWeekdayLabels={true}
showMonthLabels={true}
gutterSize={2}
horizontal={true}
transformDayElement={(element: any, value) => {
// React clones the <rect>. We add handlers to open the shared popover.
return React.cloneElement(element, {
onClick: (e: React.MouseEvent<SVGRectElement>) => {
setActive(value);
setAnchor(getLocalPoint(e));
setOpen(true);
},
onMouseEnter: (e: React.MouseEvent<SVGRectElement>) => {
// Hover also opens the shared popover; remove these lines for click-only behavior.
setActive(value);
setAnchor(getLocalPoint(e));
setOpen(true);
},
onMouseLeave: () => {
// For hover behavior, you might want a small delay instead of closing immediately.
setOpen(false);
},
style: { cursor: "pointer" },
});
}}
/>
</div>
</div>
);
}

View File

@ -28,7 +28,8 @@ export const useTokensColumns = (): Array<ColumnDef<PersonalAccessToken>> => {
const [open, setOpen] = React.useState(false);
const onDelete = (id: string) => {
fetcher.submit({ id }, { method: "DELETE", action: "/home/api" });
fetcher.submit({ id }, { method: "DELETE", action: "/settings/api" });
setOpen(false);
};
return [

View File

@ -0,0 +1,71 @@
import { useState } from "react";
import { FileText, Plus } from "lucide-react";
import {
CommandDialog,
CommandGroup,
CommandInput,
CommandItem,
CommandList,
} from "../ui/command";
import { AddMemoryDialog } from "./memory-dialog.client";
import { AddDocumentDialog } from "./document-dialog";
interface AddMemoryCommandProps {
open: boolean;
onOpenChange: (open: boolean) => void;
}
export function AddMemoryCommand({
open,
onOpenChange,
}: AddMemoryCommandProps) {
const [showAddMemory, setShowAddMemory] = useState(false);
const [showAddDocument, setShowAddDocument] = useState(false);
const handleAddMemory = () => {
onOpenChange(false);
setShowAddMemory(true);
};
const handleAddDocument = () => {
onOpenChange(false);
setShowAddDocument(true);
};
return (
<>
{/* Main Command Dialog */}
<CommandDialog open={open} onOpenChange={onOpenChange}>
<CommandInput placeholder="Search" className="py-1" />
<CommandList>
<CommandGroup heading="Add to Memory">
<CommandItem
onSelect={handleAddMemory}
className="flex items-center gap-2 py-1"
>
<Plus className="mr-2 h-4 w-4" />
<span>Add Memory</span>
</CommandItem>
<CommandItem
onSelect={handleAddDocument}
className="flex items-center gap-2 py-1"
>
<FileText className="mr-2 h-4 w-4" />
<span>Add Document</span>
</CommandItem>
</CommandGroup>
</CommandList>
</CommandDialog>
{showAddMemory && (
<AddMemoryDialog open={showAddMemory} onOpenChange={setShowAddMemory} />
)}
{/* Add Document Dialog */}
<AddDocumentDialog
open={showAddDocument}
onOpenChange={setShowAddDocument}
/>
</>
);
}

View File

@ -0,0 +1,27 @@
import { Dialog, DialogContent, DialogHeader, DialogTitle } from "../ui/dialog";
interface AddDocumentDialogProps {
open: boolean;
onOpenChange: (open: boolean) => void;
}
export function AddDocumentDialog({
open,
onOpenChange,
}: AddDocumentDialogProps) {
return (
<Dialog open={open} onOpenChange={onOpenChange}>
<DialogContent className="sm:max-w-[600px]">
<DialogHeader>
<DialogTitle>Add Document</DialogTitle>
</DialogHeader>
{/* TODO: Add document content here */}
<div className="border-border rounded-md border p-4">
<p className="text-muted-foreground text-sm">
Document upload content goes here...
</p>
</div>
</DialogContent>
</Dialog>
);
}

View File

@ -0,0 +1,95 @@
import { Dialog, DialogContent, DialogHeader, DialogTitle } from "../ui/dialog";
import { useEditor, EditorContent } from "@tiptap/react";
import {
extensionsForConversation,
getPlaceholder,
} from "../conversation/editor-extensions";
import { Button } from "../ui/button";
import { SpaceDropdown } from "../spaces/space-dropdown";
import React from "react";
import { useFetcher } from "@remix-run/react";
interface AddMemoryDialogProps {
open: boolean;
onOpenChange: (open: boolean) => void;
defaultSpaceId?: string;
}
export function AddMemoryDialog({
open,
onOpenChange,
defaultSpaceId,
}: AddMemoryDialogProps) {
const [spaceIds, setSpaceIds] = React.useState<string[]>(
defaultSpaceId ? [defaultSpaceId] : [],
);
const fetcher = useFetcher();
const editor = useEditor({
extensions: [
...extensionsForConversation,
getPlaceholder("Write your memory here..."),
],
editorProps: {
attributes: {
class:
"prose prose-sm focus:outline-none max-w-full min-h-[200px] p-4 py-0",
},
},
});
const handleAdd = async () => {
const content = editor?.getText();
if (!content?.trim()) return;
const payload = {
episodeBody: content,
referenceTime: new Date().toISOString(),
spaceIds: spaceIds,
source: "core",
};
fetcher.submit(payload, {
method: "POST",
action: "/api/v1/add",
encType: "application/json",
});
// Clear editor and close dialog
editor?.commands.clearContent();
setSpaceIds([]);
onOpenChange(false);
};
return (
<Dialog open={open} onOpenChange={onOpenChange}>
<DialogContent className="pt-0 sm:max-w-[600px]">
<div className="overflow-hidden rounded-md">
<EditorContent editor={editor} />
</div>
<div className="flex justify-between gap-2 px-4 pb-4">
<div>
<SpaceDropdown
episodeIds={[]}
selectedSpaceIds={spaceIds}
onSpaceChange={(spaceIds) => {
setSpaceIds(spaceIds);
}}
/>
</div>
<div className="flex gap-2">
<Button variant="ghost" onClick={() => onOpenChange(false)}>
Cancel
</Button>
<Button
variant="secondary"
onClick={handleAdd}
isLoading={fetcher.state !== "idle"}
>
Add
</Button>
</div>
</div>
</DialogContent>
</Dialog>
);
}

View File

@ -28,7 +28,7 @@ export interface PageHeaderProps {
actions?: PageHeaderAction[];
actionsNode?: React.ReactNode;
tabs?: PageHeaderTab[];
showBackForward?: boolean;
showTrigger?: boolean;
}
// Back and Forward navigation component
@ -66,7 +66,7 @@ export function PageHeader({
breadcrumbs,
actions,
tabs,
showBackForward = true,
showTrigger = true,
actionsNode,
}: PageHeaderProps) {
const navigation = useNavigation();
@ -95,9 +95,7 @@ export function PageHeader({
</style>
<div className="flex w-full items-center justify-between gap-1 px-4 pr-2 lg:gap-2">
<div className="-ml-1 flex items-center gap-1">
{/* Back/Forward navigation before SidebarTrigger */}
{showBackForward && <NavigationBackForward />}
<SidebarTrigger className="mr-1" />
{showTrigger && <SidebarTrigger className="mr-1" />}
{/* Breadcrumbs */}
{breadcrumbs && breadcrumbs.length > 0 ? (

View File

@ -0,0 +1,219 @@
import ReactMarkdown, { type Components } from "react-markdown";
import { cn } from "~/lib/utils";
const markdownComponents: Components = {
h1: ({ className, ...props }) => (
<h1
className={cn("mt-2 mb-1 text-3xl font-bold tracking-tight", className)}
{...props}
/>
),
h2: ({ className, ...props }) => (
<h2
className={cn(
"mt-2 mb-1 text-2xl font-semibold tracking-tight",
className,
)}
{...props}
/>
),
h3: ({ className, ...props }) => (
<h3
className={cn(
"mt-2 mb-1 text-xl font-semibold tracking-tight",
className,
)}
{...props}
/>
),
h4: ({ className, ...props }) => (
<h4
className={cn(
"mt-1.5 mb-0.5 text-lg font-semibold tracking-tight",
className,
)}
{...props}
/>
),
h5: ({ className, ...props }) => (
<h5
className={cn(
"mt-1.5 mb-0.5 text-base font-semibold tracking-tight",
className,
)}
{...props}
/>
),
h6: ({ className, ...props }) => (
<h6
className={cn(
"mt-1.5 mb-0.5 text-sm font-semibold tracking-tight",
className,
)}
{...props}
/>
),
p: ({ className, ...props }) => (
<p
className={cn(
"mb-1 leading-normal [&:not(:first-child)]:mt-1",
className,
)}
{...props}
/>
),
ul: ({ className, ...props }) => (
<ul
className={cn(
"my-1 ml-5 flex list-disc flex-col space-y-0 marker:text-gray-700 dark:marker:text-gray-400",
className,
)}
{...props}
/>
),
ol: ({ className, ...props }) => (
<ol
className={cn(
"my-1 ml-5 list-decimal space-y-0 marker:text-gray-700 dark:marker:text-gray-400",
className,
)}
{...props}
/>
),
li: ({ className, ...props }) => (
<li className={cn("py-0.5 pl-1 leading-normal", className)} {...props} />
),
blockquote: ({ className, ...props }) => (
<blockquote
className={cn(
"mt-1 mb-1 border-l-4 border-gray-300 pl-4 text-gray-700 italic dark:border-gray-600 dark:text-gray-300",
className,
)}
{...props}
/>
),
code: ({ className, inline, ...props }: any) =>
inline ? (
<code
className={cn(
"rounded bg-gray-100 px-1.5 py-0.5 font-mono text-sm text-gray-800 dark:bg-gray-800 dark:text-gray-200",
className,
)}
{...props}
/>
) : (
<code
className={cn(
"block rounded-lg bg-gray-100 p-4 font-mono text-sm text-gray-800 dark:bg-gray-800 dark:text-gray-200",
className,
)}
{...props}
/>
),
pre: ({ className, ...props }) => (
<pre
className={cn(
"mb-1 overflow-x-auto rounded-lg bg-gray-100 p-4 dark:bg-gray-800",
className,
)}
{...props}
/>
),
a: ({ className, ...props }) => (
<a
className={cn(
"font-medium text-blue-600 underline underline-offset-4 hover:text-blue-800 dark:text-blue-400 dark:hover:text-blue-300",
className,
)}
{...props}
/>
),
hr: ({ className, ...props }) => (
<hr
className={cn(
"my-2 border-t border-gray-300 dark:border-gray-600",
className,
)}
{...props}
/>
),
table: ({ className, ...props }) => (
<div className="mb-1 w-full overflow-auto">
<table
className={cn(
"w-full border-collapse border border-gray-300 dark:border-gray-600",
className,
)}
{...props}
/>
</div>
),
thead: ({ className, ...props }) => (
<thead
className={cn("bg-gray-100 dark:bg-gray-800", className)}
{...props}
/>
),
tbody: ({ className, ...props }) => (
<tbody className={cn("", className)} {...props} />
),
tr: ({ className, ...props }) => (
<tr
className={cn("border-b border-gray-300 dark:border-gray-600", className)}
{...props}
/>
),
th: ({ className, ...props }) => (
<th
className={cn(
"border border-gray-300 px-4 py-2 text-left font-semibold dark:border-gray-600",
className,
)}
{...props}
/>
),
td: ({ className, ...props }) => (
<td
className={cn(
"border border-gray-300 px-4 py-2 dark:border-gray-600",
className,
)}
{...props}
/>
),
strong: ({ className, ...props }) => (
<strong className={cn("font-bold", className)} {...props} />
),
em: ({ className, ...props }) => (
<em className={cn("italic", className)} {...props} />
),
};
interface StyledMarkdownProps {
children: string;
className?: string;
components?: Components;
}
export function StyledMarkdown({
children,
className,
components,
}: StyledMarkdownProps) {
return (
<div
className={cn(
"max-w-none",
"[&_ul_ul]:my-0.5 [&_ul_ul]:ml-4",
"[&_ol_ol]:my-0.5 [&_ol_ol]:ml-4",
"[&_ul_ol]:my-0.5 [&_ul_ol]:ml-4",
"[&_ol_ul]:my-0.5 [&_ol_ul]:ml-4",
className,
)}
>
<ReactMarkdown components={{ ...markdownComponents, ...components }}>
{children}
</ReactMarkdown>
</div>
);
}
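A usage sketch for reference; the wrapper component and markdown string are illustrative, not taken from this diff.
export function RecallSummaryExample() {
  const markdown = "## Recall\n\n- **Session:** compacted\n- `tokenCount`: 1234";
  return <StyledMarkdown className="text-sm">{markdown}</StyledMarkdown>;
}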

View File

@ -1,38 +1,42 @@
import { EditorContent, useEditor } from "@tiptap/react";
import { useEffect } from "react";
import { UserTypeEnum } from "@core/types";
import { type ConversationHistory } from "@core/database";
import { useEffect, memo } from "react";
import { cn } from "~/lib/utils";
import { extensionsForConversation } from "./editor-extensions";
import { skillExtension } from "../editor/skill-extension";
import { type UIMessage } from "ai";
interface AIConversationItemProps {
conversationHistory: ConversationHistory;
message: UIMessage;
}
export const ConversationItem = ({
conversationHistory,
}: AIConversationItemProps) => {
const isUser =
conversationHistory.userType === UserTypeEnum.User ||
conversationHistory.userType === UserTypeEnum.System;
function getMessage(message: string) {
let finalMessage = message.replace("<final_response>", "");
finalMessage = finalMessage.replace("</final_response>", "");
finalMessage = finalMessage.replace("<question_response>", "");
finalMessage = finalMessage.replace("</question_response>", "");
const id = `a${conversationHistory.id.replace(/-/g, "")}`;
return finalMessage;
}
const ConversationItemComponent = ({ message }: AIConversationItemProps) => {
const isUser = message.role === "user";
const textPart = message.parts.find((part) => part.type === "text");
const editor = useEditor({
extensions: [...extensionsForConversation, skillExtension],
editable: false,
content: conversationHistory.message,
content: textPart ? getMessage(textPart.text) : "",
});
useEffect(() => {
editor?.commands.setContent(conversationHistory.message);
if (textPart) {
editor?.commands.setContent(getMessage(textPart.text));
}
// eslint-disable-next-line react-hooks/exhaustive-deps
}, [id, conversationHistory.message]);
}, [message]);
if (!conversationHistory.message) {
if (!message) {
return null;
}
@ -49,3 +53,12 @@ export const ConversationItem = ({
</div>
);
};
// Memoize to prevent unnecessary re-renders
export const ConversationItem = memo(
ConversationItemComponent,
(prevProps, nextProps) => {
// Only re-render when the message reference changes
return prevProps.message === nextProps.message;
},
);

View File

@ -57,9 +57,7 @@ export const ConversationList = ({
limit: "5", // Increased for better density
});
fetcher.load(`/api/v1/conversations?${searchParams}`, {
flushSync: true,
});
fetcher.load(`/api/v1/conversations?${searchParams}`);
},
[isLoading, fetcher],
);

View File

@ -5,28 +5,30 @@ import { Paragraph } from "@tiptap/extension-paragraph";
import { Text } from "@tiptap/extension-text";
import { type Editor } from "@tiptap/react";
import { EditorContent, Placeholder, EditorRoot } from "novel";
import { useCallback, useState } from "react";
import { useCallback, useState, useEffect } from "react";
import { cn } from "~/lib/utils";
import { Button } from "../ui";
import { LoaderCircle } from "lucide-react";
import { Form, useSubmit } from "@remix-run/react";
import { Form, useSubmit, useActionData } from "@remix-run/react";
interface ConversationTextareaProps {
defaultValue?: string;
conversationId: string;
placeholder?: string;
isLoading?: boolean;
className?: string;
onChange?: (text: string) => void;
disabled?: boolean;
onConversationCreated?: (message: string) => void;
stop?: () => void;
}
export function ConversationTextarea({
defaultValue,
isLoading = false,
placeholder,
conversationId,
onChange,
onConversationCreated,
stop,
}: ConversationTextareaProps) {
const [text, setText] = useState(defaultValue ?? "");
const [editor, setEditor] = useState<Editor>();
@ -42,131 +44,99 @@ export function ConversationTextarea({
return;
}
const data = isLoading ? {} : { message: text, conversationId };
submit(data as any, {
action: isLoading
? `/home/conversation/${conversationId}`
: "/home/conversation",
method: "post",
});
onConversationCreated && onConversationCreated(text);
editor?.commands.clearContent(true);
setText("");
editor.commands.clearContent(true);
setText("");
}, [editor, text]);
// Send message to API
const submitForm = useCallback(
async (e: React.FormEvent<HTMLFormElement>) => {
const data = isLoading
? {}
: { message: text, title: text, conversationId };
submit(data as any, {
action: isLoading
? `/home/conversation/${conversationId}`
: "/home/conversation",
method: "post",
});
editor?.commands.clearContent(true);
setText("");
e.preventDefault();
},
[text, conversationId],
);
return (
<Form
action="/home/conversation"
method="post"
onSubmit={(e) => submitForm(e)}
className="pt-2"
>
<div className="bg-background-3 rounded-lg border-1 border-gray-300 py-2">
<EditorRoot>
<EditorContent
// eslint-disable-next-line @typescript-eslint/no-explicit-any
initialContent={defaultValue as any}
extensions={[
Document,
Paragraph,
Text,
HardBreak.configure({
keepMarks: true,
}),
<div className="bg-background-3 rounded-lg border-1 border-gray-300 py-2">
<EditorRoot>
<EditorContent
// eslint-disable-next-line @typescript-eslint/no-explicit-any
initialContent={defaultValue as any}
extensions={[
Document,
Paragraph,
Text,
HardBreak.configure({
keepMarks: true,
}),
Placeholder.configure({
placeholder: () => placeholder ?? "Ask sol...",
includeChildren: true,
}),
History,
]}
onCreate={async ({ editor }) => {
setEditor(editor);
await new Promise((resolve) => setTimeout(resolve, 100));
editor.commands.focus("end");
}}
onUpdate={({ editor }) => {
onUpdate(editor);
}}
shouldRerenderOnTransaction={false}
editorProps={{
attributes: {
class: `prose prose-lg dark:prose-invert prose-headings:font-title font-default focus:outline-none max-w-full`,
},
handleKeyDown(view, event) {
if (event.key === "Enter" && !event.shiftKey) {
// eslint-disable-next-line @typescript-eslint/no-explicit-any
const target = event.target as any;
if (target.innerHTML.includes("suggestion")) {
return false;
}
event.preventDefault();
if (text) {
handleSend();
}
return true;
Placeholder.configure({
placeholder: () => placeholder ?? "Ask sol...",
includeChildren: true,
}),
History,
]}
onCreate={async ({ editor }) => {
setEditor(editor);
await new Promise((resolve) => setTimeout(resolve, 100));
editor.commands.focus("end");
}}
onUpdate={({ editor }) => {
onUpdate(editor);
}}
shouldRerenderOnTransaction={false}
editorProps={{
attributes: {
class: `prose prose-lg dark:prose-invert prose-headings:font-title font-default focus:outline-none max-w-full`,
},
handleKeyDown(view, event) {
if (event.key === "Enter" && !event.shiftKey) {
// eslint-disable-next-line @typescript-eslint/no-explicit-any
const target = event.target as any;
if (target.innerHTML.includes("suggestion")) {
return false;
}
event.preventDefault();
if (text) {
handleSend();
}
return true;
}
if (event.key === "Enter" && event.shiftKey) {
view.dispatch(
view.state.tr.replaceSelectionWith(
view.state.schema.nodes.hardBreak.create(),
),
);
return true;
}
return false;
},
}}
immediatelyRender={false}
className={cn(
"editor-container text-md max-h-[400px] min-h-[40px] w-full min-w-full overflow-auto rounded-lg px-3",
)}
/>
</EditorRoot>
<div className="mb-1 flex justify-end px-3">
<Button
variant="default"
className="gap-1 shadow-none transition-all duration-500 ease-in-out"
type="submit"
size="lg"
>
{isLoading ? (
<>
<LoaderCircle size={18} className="mr-1 animate-spin" />
Stop
</>
) : (
<>Chat</>
)}
</Button>
</div>
if (event.key === "Enter" && event.shiftKey) {
view.dispatch(
view.state.tr.replaceSelectionWith(
view.state.schema.nodes.hardBreak.create(),
),
);
return true;
}
return false;
},
}}
immediatelyRender={false}
className={cn(
"editor-container text-md max-h-[400px] min-h-[40px] w-full min-w-full overflow-auto rounded-lg px-3",
)}
/>
</EditorRoot>
<div className="mb-1 flex justify-end px-3">
<Button
variant="default"
className="gap-1 shadow-none transition-all duration-500 ease-in-out"
onClick={() => {
if (!isLoading) {
handleSend();
} else {
stop && stop();
}
}}
size="lg"
>
{isLoading ? (
<>
<LoaderCircle size={18} className="mr-1 animate-spin" />
Stop
</>
) : (
<>Chat</>
)}
</Button>
</div>
</Form>
</div>
);
}

View File

@ -9,6 +9,7 @@ import TableHeader from "@tiptap/extension-table-header";
import TableRow from "@tiptap/extension-table-row";
import { all, createLowlight } from "lowlight";
import { mergeAttributes, type Extension } from "@tiptap/react";
import { Markdown } from "tiptap-markdown";
// create a lowlight instance with all languages loaded
export const lowlight = createLowlight(all);
@ -136,4 +137,5 @@ export const extensionsForConversation = [
CodeBlockLowlight.configure({
lowlight,
}),
Markdown,
];

View File

@ -17,7 +17,7 @@ export const StreamingConversation = ({
afterStreaming,
apiURL,
}: StreamingConversationProps) => {
const { message, isEnd } = useTriggerStream(runId, token, apiURL);
const { message } = useTriggerStream(runId, token, apiURL, afterStreaming);
const [loadingText, setLoadingText] = React.useState("Thinking...");
const loadingMessages = [
@ -48,13 +48,6 @@ export const StreamingConversation = ({
// eslint-disable-next-line react-hooks/exhaustive-deps
}, [message]);
React.useEffect(() => {
if (isEnd) {
afterStreaming();
}
// eslint-disable-next-line react-hooks/exhaustive-deps
}, [isEnd]);
React.useEffect(() => {
let currentIndex = 0;
let delay = 5000; // Start with 5 seconds for more thinking time

View File

@ -1,5 +1,5 @@
import { useRealtimeRunWithStreams } from "@trigger.dev/react-hooks";
import React from "react";
import React, { useEffect, useState } from "react";
import { EventSource, type ErrorEvent } from "eventsource";
const getTriggerAPIURL = (apiURL?: string) => {
return (
@ -12,102 +12,53 @@ export const useTriggerStream = (
runId: string,
token: string,
apiURL?: string,
afterStreaming?: (finalMessage: string) => void,
) => {
// Need to fix this later
const baseURL = React.useMemo(() => getTriggerAPIURL(apiURL), [apiURL]);
const [error, setError] = useState<ErrorEvent | null>(null);
const [message, setMessage] = useState("");
const { error, streams, run } = useRealtimeRunWithStreams(runId, {
accessToken: token,
baseURL, // Optional if you are using a self-hosted Trigger.dev instance
});
useEffect(() => {
startStreaming();
}, []);
const isEnd = React.useMemo(() => {
if (error) {
return true;
}
const startStreaming = () => {
const eventSource = new EventSource(
`${baseURL}/realtime/v1/streams/${runId}/messages`,
{
fetch: (input, init) =>
fetch(input, {
...init,
headers: {
...init.headers,
Authorization: `Bearer ${token}`,
},
}),
},
);
if (
run &&
[
"COMPLETED",
"CANCELED",
"FAILED",
"CRASHED",
"INTERRUPTED",
"SYSTEM_FAILURE",
"EXPIRED",
"TIMED_OUT",
].includes(run?.status)
) {
return true;
}
eventSource.onmessage = (event) => {
try {
const eventData = JSON.parse(event.data);
const hasStreamEnd =
streams.messages &&
streams.messages.filter((item) => {
// Check if the item has a type that includes 'MESSAGE_' and is not empty
return item.type?.includes("STREAM_END");
});
if (eventData.type.includes("MESSAGE_")) {
setMessage((prevMessage) => prevMessage + eventData.message);
}
} catch (e) {
console.error("Failed to parse message:", e);
}
};
if (hasStreamEnd && hasStreamEnd.length > 0) {
return true;
}
eventSource.onerror = (err) => {
console.error("EventSource failed:", err);
setError(err);
eventSource.close();
if (afterStreaming) {
afterStreaming(message);
}
};
};
return false;
// eslint-disable-next-line react-hooks/exhaustive-deps
}, [run?.status, error, streams.messages?.length]);
const message = React.useMemo(() => {
if (!streams?.messages) {
return "";
}
// Filter and combine all message chunks
return streams.messages
.filter((item) => {
// Check if the item has a type that includes 'MESSAGE_' and is not empty
return item.type?.includes("MESSAGE_");
})
.map((item) => item.message)
.join("");
// eslint-disable-next-line react-hooks/exhaustive-deps
}, [streams.messages?.length]);
// const actionMessages = React.useMemo(() => {
// if (!streams?.messages) {
// return {};
// }
// // eslint-disable-next-line @typescript-eslint/no-explicit-any
// const messages: Record<string, { isStreaming: boolean; content: any[] }> =
// {};
// streams.messages.forEach((item) => {
// if (item.type?.includes("SKILL_")) {
// try {
// const parsed = JSON.parse(item.message);
// const skillId = parsed.skillId;
// if (!messages[skillId]) {
// messages[skillId] = { isStreaming: true, content: [] };
// }
// if (item.type === "SKILL_END") {
// messages[skillId].isStreaming = false;
// }
// messages[skillId].content.push(parsed);
// } catch (e) {
// console.error("Failed to parse message:", e);
// }
// }
// });
// return messages;
// // eslint-disable-next-line react-hooks/exhaustive-deps
// }, [streams.messages?.length]);
return { isEnd, message, actionMessages: [] };
return { error, message, actionMessages: [] };
};

View File

@ -2,25 +2,20 @@ import { NodeViewWrapper } from "@tiptap/react";
import React from "react";
import { getIcon as iconUtil, type IconType } from "../../icon-utils";
import { ChevronDown, ChevronRight } from "lucide-react";
import StaticLogo from "~/components/logo/logo";
// eslint-disable-next-line @typescript-eslint/no-explicit-any
export const SkillComponent = (props: any) => {
const id = props.node.attrs.id;
const name = props.node.attrs.name;
const agent = props.node.attrs.agent;
const [open, setOpen] = React.useState(false);
if (id === "undefined" || id === undefined || !name) {
return null;
}
const getIcon = () => {
const Icon = iconUtil(agent as IconType);
return <Icon size={18} className="rounded-sm" />;
return <StaticLogo size={18} className="rounded-sm" />;
};
const snakeToTitleCase = (input: string): string => {
@ -46,7 +41,7 @@ export const SkillComponent = (props: any) => {
<>
<div className="bg-grayAlpha-100 text-sm-md mt-0.5 flex w-fit items-center gap-2 rounded p-2">
{getIcon()}
<span className="font-mono text-sm">{snakeToTitleCase(name)}</span>
<span className="font-mono text-sm">{snakeToTitleCase(agent)}</span>
</div>
</>
);

View File

@ -1,5 +1,4 @@
import { useState, useMemo, forwardRef } from "react";
import { useTheme } from "remix-themes";
import {
type ClusterData,
GraphClustering,
@ -24,6 +23,7 @@ export interface GraphClusteringVisualizationProps {
selectedClusterId?: string | null;
onClusterSelect?: (clusterId: string) => void;
singleClusterView?: boolean;
forOnboarding?: boolean;
}
export const GraphClusteringVisualization = forwardRef<
@ -41,6 +41,7 @@ export const GraphClusteringVisualization = forwardRef<
selectedClusterId,
onClusterSelect,
singleClusterView,
forOnboarding,
},
ref,
) => {
@ -52,9 +53,6 @@ export const GraphClusteringVisualization = forwardRef<
const [edgePopupContent, setEdgePopupContent] =
useState<EdgePopupContent | null>(null);
const [selectedEntityType, setSelectedEntityType] = useState<
string | undefined
>();
const [searchQuery, setSearchQuery] = useState<string>("");
// Combined filter logic for all filters
@ -70,49 +68,36 @@ export const GraphClusteringVisualization = forwardRef<
);
}
// Entity type filter
if (selectedEntityType) {
filtered = filtered.filter((triplet) => {
const sourceMatches =
triplet.sourceNode.attributes?.type === selectedEntityType;
const targetMatches =
triplet.targetNode.attributes?.type === selectedEntityType;
return sourceMatches || targetMatches;
});
}
// Search filter
if (searchQuery.trim()) {
// Helper functions for filtering
const isStatementNode = (node: any) => {
const isEpisodeNode = (node: any) => {
return (
node.attributes?.fact ||
(node.labels && node.labels.includes("Statement"))
node.attributes?.content ||
node.attributes?.episodeUuid ||
(node.labels && node.labels.includes("Episode"))
);
};
const query = searchQuery.toLowerCase();
filtered = filtered.filter((triplet) => {
const sourceMatches =
isStatementNode(triplet.sourceNode) &&
triplet.sourceNode.attributes?.fact?.toLowerCase().includes(query);
isEpisodeNode(triplet.sourceNode) &&
triplet.sourceNode.attributes?.content
?.toLowerCase()
.includes(query);
const targetMatches =
isStatementNode(triplet.targetNode) &&
triplet.targetNode.attributes?.fact?.toLowerCase().includes(query);
isEpisodeNode(triplet.targetNode) &&
triplet.targetNode.attributes?.content
?.toLowerCase()
.includes(query);
return sourceMatches || targetMatches;
});
}
return filtered;
}, [
triplets,
selectedClusterId,
onClusterSelect,
selectedEntityType,
searchQuery,
]);
}, [triplets, selectedClusterId, onClusterSelect, searchQuery]);
// Convert filtered triplets to graph triplets
const graphTriplets = useMemo(
@ -234,12 +219,9 @@ export const GraphClusteringVisualization = forwardRef<
{/* Graph Filters and Search in same row */}
<div className="flex items-center gap-1">
<GraphFilters
triplets={triplets}
clusters={clusters}
selectedCluster={selectedClusterId}
selectedEntityType={selectedEntityType}
onClusterChange={onClusterSelect as any}
onEntityTypeChange={setSelectedEntityType}
/>
<SpaceSearch
triplets={triplets}
@ -265,6 +247,7 @@ export const GraphClusteringVisualization = forwardRef<
labelColorMap={sharedLabelColorMap}
showClusterLabels={!selectedClusterId} // Show cluster labels when not filtering
enableClusterColors={true} // Always enable cluster colors
forOnboarding={forOnboarding}
/>
) : (
<div className="flex h-full items-center justify-center">

View File

@ -46,6 +46,8 @@ export interface GraphClusteringProps {
labelColorMap?: Map<string, number>;
showClusterLabels?: boolean;
enableClusterColors?: boolean;
// Change this later
forOnboarding?: boolean;
}
export interface GraphClusteringRef {
@ -88,6 +90,7 @@ export const GraphClustering = forwardRef<
labelColorMap: externalLabelColorMap,
showClusterLabels = true,
enableClusterColors = true,
forOnboarding,
},
ref,
) => {
@ -101,6 +104,7 @@ export const GraphClustering = forwardRef<
const selectedNodeRef = useRef<string | null>(null);
const selectedEdgeRef = useRef<string | null>(null);
const selectedClusterRef = useRef<string | null>(null);
const size = forOnboarding ? 16 : 4;
// Create cluster color mapping
const clusterColorMap = useMemo(() => {
@ -188,13 +192,13 @@ export const GraphClustering = forwardRef<
const nodeData = nodeDataMap.get(node.id) || node;
// Check if this is a Statement node
const isStatementNode =
nodeData.attributes.nodeType === "Statement" ||
(nodeData.labels && nodeData.labels.includes("Statement"));
// Check if this is an Episode node
const isEpisodeNode =
nodeData.attributes.nodeType === "Episode" ||
(nodeData.labels && nodeData.labels.includes("Episode"));
if (isStatementNode) {
// Statement nodes with cluster IDs use cluster colors
if (isEpisodeNode) {
// Episode nodes with cluster IDs use cluster colors
if (
enableClusterColors &&
nodeData.clusterId &&
@ -203,7 +207,7 @@ export const GraphClustering = forwardRef<
return clusterColorMap.get(nodeData.clusterId)!;
}
// Unclustered statement nodes use a specific light color
// Unclustered episode nodes use a specific light color
return themeMode === "dark" ? "#2b9684" : "#54935b"; // Teal/Green from palette
}
@ -225,10 +229,10 @@ export const GraphClustering = forwardRef<
triplets.forEach((triplet) => {
if (!nodeMap.has(triplet.source.id)) {
const nodeColor = getNodeColor(triplet.source);
const isStatementNode =
triplet.source.attributes?.nodeType === "Statement" ||
const isEpisodeNode =
triplet.source.attributes?.nodeType === "Episode" ||
(triplet.source.labels &&
triplet.source.labels.includes("Statement"));
triplet.source.labels.includes("Episode"));
nodeMap.set(triplet.source.id, {
id: triplet.source.id,
@ -236,23 +240,23 @@ export const GraphClustering = forwardRef<
? triplet.source.value.split(/\s+/).slice(0, 4).join(" ") +
(triplet.source.value.split(/\s+/).length > 4 ? " ..." : "")
: "",
size: isStatementNode ? 4 : 2, // Statement nodes slightly larger
size: isEpisodeNode ? size : size / 2, // Episode nodes slightly larger
color: nodeColor,
x: width,
y: height,
nodeData: triplet.source,
clusterId: triplet.source.clusterId,
// Enhanced border for visual appeal, thicker for Statement nodes
// Enhanced border for visual appeal, thicker for Episode nodes
borderSize: 1,
borderColor: nodeColor,
});
}
if (!nodeMap.has(triplet.target.id)) {
const nodeColor = getNodeColor(triplet.target);
const isStatementNode =
triplet.target.attributes?.nodeType === "Statement" ||
const isEpisodeNode =
triplet.target.attributes?.nodeType === "Episode" ||
(triplet.target.labels &&
triplet.target.labels.includes("Statement"));
triplet.target.labels.includes("Episode"));
nodeMap.set(triplet.target.id, {
id: triplet.target.id,
@ -260,13 +264,13 @@ export const GraphClustering = forwardRef<
? triplet.target.value.split(/\s+/).slice(0, 4).join(" ") +
(triplet.target.value.split(/\s+/).length > 4 ? " ..." : "")
: "",
size: isStatementNode ? 4 : 2, // Statement nodes slightly larger
size: isEpisodeNode ? size : size / 2, // Episode nodes slightly larger
color: nodeColor,
x: width,
y: height,
nodeData: triplet.target,
clusterId: triplet.target.clusterId,
// Enhanced border for visual appeal, thicker for Statement nodes
// Enhanced border for visual appeal, thicker for Episode nodes
borderSize: 1,
borderColor: nodeColor,
});
@ -290,9 +294,9 @@ export const GraphClustering = forwardRef<
target: triplet.target.id,
relations: [],
relationData: [],
label: "",
label: triplet.relation.value, // Show edge type (predicate for Subject->Object)
color: "#0000001A",
labelColor: "#0000001A",
labelColor: "#000000",
size: 1,
};
}
@ -323,13 +327,13 @@ export const GraphClustering = forwardRef<
graph.forEachNode((node) => {
const nodeData = graph.getNodeAttribute(node, "nodeData");
const originalColor = getNodeColor(nodeData);
const isStatementNode =
nodeData?.attributes.nodeType === "Statement" ||
(nodeData?.labels && nodeData.labels.includes("Statement"));
const isEpisodeNode =
nodeData?.attributes.nodeType === "Episode" ||
(nodeData?.labels && nodeData.labels.includes("Episode"));
graph.setNodeAttribute(node, "highlighted", false);
graph.setNodeAttribute(node, "color", originalColor);
graph.setNodeAttribute(node, "size", isStatementNode ? 4 : 2);
graph.setNodeAttribute(node, "size", isEpisodeNode ? size : size / 2);
graph.setNodeAttribute(node, "zIndex", 1);
});
graph.forEachEdge((edge) => {
@ -519,7 +523,7 @@ export const GraphClustering = forwardRef<
return {
scalingRatio: Math.round(scalingRatio * 10) / 10,
gravity: Math.round(gravity * 10) / 10,
duration: Math.round(durationSeconds * 100) / 100, // in seconds
duration: forOnboarding ? 1 : Math.round(durationSeconds * 100) / 100, // in seconds
};
}, []);
@ -547,19 +551,19 @@ export const GraphClustering = forwardRef<
// Apply layout
if (graph.order > 0) {
// Strong cluster-based positioning for Statement nodes only
// Strong cluster-based positioning for Episode nodes only
const clusterNodeMap = new Map<string, string[]>();
const entityNodes: string[] = [];
// Group Statement nodes by their cluster ID, separate Entity nodes
// Group Episode nodes by their cluster ID, separate Entity nodes
graph.forEachNode((nodeId, attributes) => {
const isStatementNode =
attributes.nodeData?.nodeType === "Statement" ||
const isEpisodeNode =
attributes.nodeData?.nodeType === "Episode" ||
(attributes.nodeData?.labels &&
attributes.nodeData.labels.includes("Statement"));
attributes.nodeData.labels.includes("Episode"));
if (isStatementNode && attributes.clusterId) {
// Statement nodes with cluster IDs go into clusters
if (isEpisodeNode && attributes.clusterId) {
// Episode nodes with cluster IDs go into clusters
if (!clusterNodeMap.has(attributes.clusterId)) {
clusterNodeMap.set(attributes.clusterId, []);
}
@ -636,7 +640,7 @@ export const GraphClustering = forwardRef<
}
// Position Entity nodes using ForceAtlas2 natural positioning
// They will be positioned by the algorithm based on their connections to Statement nodes
// They will be positioned by the algorithm based on their connections to Episode nodes
entityNodes.forEach((nodeId) => {
// Give them initial random positions, ForceAtlas2 will adjust based on connections
graph.setNodeAttribute(nodeId, "x", Math.random() * width);
@ -661,7 +665,11 @@ export const GraphClustering = forwardRef<
});
layout.start();
setTimeout(() => layout.stop(), (optimalParams.duration ?? 2) * 1000);
if (!forOnboarding) {
setTimeout(() => layout.stop(), (optimalParams.duration ?? 2) * 1000);
} else {
setTimeout(() => layout.stop(), 500);
}
}
// Create Sigma instance
@ -673,13 +681,15 @@ export const GraphClustering = forwardRef<
edgeProgramClasses: {
"edges-fast": EdgeLineProgram,
},
renderLabels: false,
renderLabels: true,
labelRenderedSizeThreshold: 15, // labels appear when node size >= 15px
enableEdgeEvents: true,
minCameraRatio: 0.01,
defaultDrawNodeHover: drawHover,
maxCameraRatio: 2,
allowInvalidContainer: false,
allowInvalidContainer: true,
});
sigmaRef.current = sigma;
@ -693,12 +703,6 @@ export const GraphClustering = forwardRef<
}, 100);
}
// Update cluster labels after any camera movement
sigma.getCamera().on("updated", () => {
if (showClusterLabels) {
}
});
// Drag and drop implementation (same as original)
let draggedNode: string | null = null;
let isDragging = false;
@ -841,8 +845,8 @@ export const GraphClustering = forwardRef<
ref={containerRef}
className=""
style={{
width: `${width}px`,
height: `${height}px`,
width: forOnboarding ? "100%" : `${width}px`,
height: forOnboarding ? "100%" : `${height}px`,
borderRadius: "8px",
cursor: "grab",
fontSize: "12px",

View File

@ -12,69 +12,32 @@ import type { RawTriplet } from "./type";
import { type ClusterData } from "./graph-clustering";
import { nodeColorPalette } from "./node-colors";
import { useTheme } from "remix-themes";
import { ScrollArea } from "../ui";
interface GraphFiltersProps {
triplets: RawTriplet[];
clusters: ClusterData[];
selectedCluster?: string | null;
selectedEntityType?: string;
onClusterChange: (cluster?: string) => void;
onEntityTypeChange: (entityType?: string) => void;
}
type FilterStep = "main" | "cluster" | "nodeType" | "entityType";
const nodeTypeOptions = [
{ value: "entity", label: "Entity" },
{ value: "statement", label: "Statement" },
];
export function GraphFilters({
triplets,
clusters,
selectedCluster,
selectedEntityType,
onClusterChange,
onEntityTypeChange,
}: GraphFiltersProps) {
const [themeMode] = useTheme();
const [popoverOpen, setPopoverOpen] = useState(false);
const [step, setStep] = useState<FilterStep>("main");
// Extract unique entity types (primaryLabel values) from triplets
const entityTypeOptions = useMemo(() => {
const entityTypes = new Set<string>();
triplets.forEach((triplet) => {
// Check if node has primaryLabel (indicates it's an entity)
if (triplet.sourceNode.attributes?.type) {
entityTypes.add(triplet.sourceNode.attributes.type);
}
if (triplet.targetNode.attributes?.type) {
entityTypes.add(triplet.targetNode.attributes.type);
}
});
return Array.from(entityTypes)
.sort()
.map((type) => ({
value: type,
label: type,
}));
}, [triplets]);
// Get display labels
const selectedClusterLabel = clusters.find(
(c) => c.id === selectedCluster,
)?.name;
const selectedEntityTypeLabel = entityTypeOptions.find(
(e) => e.value === selectedEntityType,
)?.label;
const hasFilters = selectedCluster || selectedEntityType;
return (
@ -112,13 +75,6 @@ export function GraphFilters({
>
Cluster
</Button>
<Button
variant="ghost"
className="justify-start"
onClick={() => setStep("entityType")}
>
Entity Type
</Button>
</div>
)}
@ -167,40 +123,6 @@ export function GraphFilters({
})}
</div>
)}
{step === "entityType" && (
<div className="flex flex-col gap-1 p-2">
<Button
variant="ghost"
className="w-full justify-start"
onClick={() => {
onEntityTypeChange(undefined);
setPopoverOpen(false);
setStep("main");
}}
>
All Entity Types
</Button>
{entityTypeOptions.map((entityType) => (
<Button
key={entityType.value}
variant="ghost"
className="w-full justify-start"
onClick={() => {
onEntityTypeChange(
entityType.value === selectedEntityType
? undefined
: entityType.value,
);
setPopoverOpen(false);
setStep("main");
}}
>
{entityType.label}
</Button>
))}
</div>
)}
</div>
</PopoverContent>
</PopoverPortal>
@ -218,16 +140,6 @@ export function GraphFilters({
/>
</Badge>
)}
{selectedEntityType && (
<Badge variant="secondary" className="h-7 gap-1 rounded px-2">
{selectedEntityTypeLabel}
<X
className="hover:text-destructive h-3.5 w-3.5 cursor-pointer"
onClick={() => onEntityTypeChange(undefined)}
/>
</Badge>
)}
</div>
)}
</div>

View File

@ -114,36 +114,6 @@ export const GraphVisualization = forwardRef<GraphRef, GraphVisualizationProps>(
return (
<div className={className}>
{/* Entity Types Legend Button */}
<div className="absolute top-4 left-4 z-50">
{/* <HoverCard>
<HoverCardTrigger asChild>
<button className="bg-primary/10 text-primary hover:bg-primary/20 rounded-md px-2.5 py-1 text-xs transition-colors">
Entity Types
</button>
</HoverCardTrigger>
<HoverCardContent className="w-40" side="bottom" align="start">
<div className="space-y-2">
<div className="max-h-[300px] space-y-1.5 overflow-y-auto pr-2">
{allLabels.map((label) => (
<div key={label} className="flex items-center gap-2">
<div
className="h-4 w-4 flex-shrink-0 rounded-full"
style={{
backgroundColor: getNodeColor(
label,
isDarkMode,
sharedLabelColorMap,
),
}}
/>
<span className="text-xs">{label}</span>
</div>
))}
</div>
</div>
</HoverCardContent>
</HoverCard> */}
</div>
{triplets.length > 0 ? (
<Graph

View File

@ -284,52 +284,52 @@ export const Graph = forwardRef<GraphRef, GraphProps>(
// More nodes = need more space to prevent overcrowding
let scalingRatio: number;
if (nodeCount < 10) {
scalingRatio = 15; // Tight for small graphs
scalingRatio = 20; // Slightly wider for small graphs
} else if (nodeCount < 50) {
scalingRatio = 20 + (nodeCount - 10) * 0.5; // Gradual increase
scalingRatio = 30 + (nodeCount - 10) * 1.0; // Faster increase
} else if (nodeCount < 200) {
scalingRatio = 40 + (nodeCount - 50) * 0.2; // Slower increase
scalingRatio = 70 + (nodeCount - 50) * 0.5; // More spread
} else if (nodeCount < 500) {
scalingRatio = 145 + (nodeCount - 200) * 0.3; // Continue spreading
} else {
scalingRatio = Math.min(80, 70 + (nodeCount - 200) * 0.05); // Cap at 80
scalingRatio = Math.min(300, 235 + (nodeCount - 500) * 0.1); // Cap at 300
}
// Calculate optimal gravity based on density and node count
let gravity: number;
if (density > 0.3) {
// Dense graphs need less gravity to prevent overcrowding
gravity = 1 + density * 2;
gravity = 0.5 + density * 1.5;
} else if (density > 0.1) {
// Medium density graphs
gravity = 3 + density * 5;
gravity = 2 + density * 3;
} else {
// Sparse graphs need more gravity to keep components together
gravity = Math.min(8, 5 + (1 - density) * 3);
gravity = Math.min(6, 4 + (1 - density) * 2);
}
// Adjust gravity based on node count
// Adjust gravity based on node count - more aggressive reduction for large graphs
if (nodeCount < 20) {
gravity *= 1.5; // Smaller graphs benefit from stronger gravity
} else if (nodeCount > 100) {
gravity *= 0.8; // Larger graphs need gentler gravity
gravity *= 0.5; // Larger graphs need much gentler gravity
} else if (nodeCount > 200) {
gravity *= 0.3; // Very large graphs need very gentle gravity
}
// Calculate iterations based on complexity
const complexity = nodeCount + edgeCount;
let iterations: number;
if (complexity < 50) {
iterations = 400;
} else if (complexity < 200) {
iterations = 600;
} else if (complexity < 500) {
iterations = 800;
if (complexity < 500) {
iterations = complexity;
} else {
iterations = Math.min(1200, 1000 + complexity * 0.2);
iterations = Math.min(600, 500 + complexity * 0.2);
}
return {
scalingRatio: Math.round(scalingRatio * 10) / 10,
gravity: Math.round(gravity * 10) / 10,
iterations: Math.round(iterations),
iterations: Math.round(complexity),
};
}, []);
@ -378,10 +378,10 @@ export const Graph = forwardRef<GraphRef, GraphProps>(
settings: {
...settings,
barnesHutOptimize: true,
strongGravityMode: true,
strongGravityMode: false, // Disable strong gravity for more spread
gravity: optimalParams.gravity,
scalingRatio: optimalParams.scalingRatio,
slowDown: 3,
slowDown: 1.5, // Reduced slowDown for better spreading
},
});
@ -407,6 +407,7 @@ export const Graph = forwardRef<GraphRef, GraphProps>(
enableEdgeEvents: true,
minCameraRatio: 0.1,
maxCameraRatio: 2,
allowInvalidContainer: true,
});
sigmaRef.current = sigma;

View File

@ -16,7 +16,7 @@ export function SpaceSearch({
triplets,
searchQuery,
onSearchChange,
placeholder = "Search in statement facts...",
placeholder = "Search in episodes...",
}: SpaceSearchProps) {
const [inputValue, setInputValue] = useState(searchQuery);
@ -30,41 +30,42 @@ export function SpaceSearch({
}
}, [debouncedSearchQuery, searchQuery, onSearchChange]);
// Helper to determine if a node is a statement
const isStatementNode = useCallback((node: any) => {
// Check if node has a fact attribute (indicates it's a statement)
// Helper to determine if a node is an episode
const isEpisodeNode = useCallback((node: any) => {
// Check if node has content attribute (indicates it's an episode)
return (
node.attributes?.fact ||
(node.labels && node.labels.includes("Statement"))
node.attributes?.content ||
node.attributes?.episodeUuid ||
(node.labels && node.labels.includes("Episode"))
);
}, []);
// Count statement nodes that match the search
const matchingStatements = useMemo(() => {
// Count episode nodes that match the search
const matchingEpisodes = useMemo(() => {
if (!debouncedSearchQuery.trim()) return 0;
const query = debouncedSearchQuery.toLowerCase();
const statements: Record<string, number> = {};
const episodes: Record<string, number> = {};
triplets.forEach((triplet) => {
// Check if source node is a statement and matches
// Check if source node is an episode and matches
if (
isStatementNode(triplet.sourceNode) &&
triplet.sourceNode.attributes?.fact?.toLowerCase().includes(query)
isEpisodeNode(triplet.sourceNode) &&
triplet.sourceNode.attributes?.content?.toLowerCase().includes(query)
) {
statements[triplet.sourceNode.uuid] = 1;
episodes[triplet.sourceNode.uuid] = 1;
}
// Check if target node is a statement and matches
// Check if target node is an episode and matches
if (
isStatementNode(triplet.targetNode) &&
triplet.targetNode.attributes?.fact?.toLowerCase().includes(query)
isEpisodeNode(triplet.targetNode) &&
triplet.targetNode.attributes?.content?.toLowerCase().includes(query)
) {
statements[triplet.targetNode.uuid] = 1;
episodes[triplet.targetNode.uuid] = 1;
}
});
return Object.keys(statements).length;
return Object.keys(episodes).length;
}, [triplets, debouncedSearchQuery]);
const handleInputChange = (event: React.ChangeEvent<HTMLInputElement>) => {
@ -104,7 +105,7 @@ export function SpaceSearch({
{/* Show search results count */}
{debouncedSearchQuery.trim() && (
<div className="text-muted-foreground shrink-0 text-sm">
{matchingStatements} statement{matchingStatements !== 1 ? "s" : ""}
{matchingEpisodes} episode{matchingEpisodes !== 1 ? "s" : ""}
</div>
)}
</div>

View File

@ -10,6 +10,9 @@ import { Cursor } from "./icons/cursor";
import { Claude } from "./icons/claude";
import { Cline } from "./icons/cline";
import { VSCode } from "./icons/vscode";
import { Obsidian } from "./icons/obsidian";
import { Figma } from "./icons/figma";
import StaticLogo from "./logo/logo";
export const ICON_MAPPING = {
slack: SlackIcon,
@ -23,6 +26,9 @@ export const ICON_MAPPING = {
claude: Claude,
cline: Cline,
vscode: VSCode,
obsidian: Obsidian,
figma: Figma,
core: StaticLogo,
// Default icon
integration: LayoutGrid,

View File

@ -0,0 +1,41 @@
import type { IconProps } from "./types";
export function Figma({ size = 18, className }: IconProps) {
return (
<svg
xmlns="http://www.w3.org/2000/svg"
fill="none"
viewBox="0 0 24 24"
id="Figma"
height={size}
className={className}
width={size}
>
<path
fill="#0acf83"
d="M8.0833 23.750025c2.162 0 3.91665 -1.754675 3.91665 -3.916675V15.916675H8.0833c-2.162 0 -3.916675 1.754675 -3.916675 3.916675s1.754675 3.916675 3.916675 3.916675Z"
strokeWidth="0.25"
></path>
<path
fill="#a259ff"
d="M4.166625 11.999975c0 -2.162 1.754675 -3.91665 3.916675 -3.91665h3.91665v7.833325H8.0833c-2.162 0 -3.916675 -1.754675 -3.916675 -3.916675Z"
strokeWidth="0.25"
></path>
<path
fill="#f24e1e"
d="M4.166625 4.166675C4.166625 2.0046675 5.9213 0.25 8.0833 0.25h3.91665v7.833325H8.0833c-2.162 0 -3.916675 -1.75465 -3.916675 -3.91665Z"
strokeWidth="0.25"
></path>
<path
fill="#ff7262"
d="M11.999875 0.25h3.916675c2.162 0 3.91665 1.7546675 3.91665 3.916675 0 2.162 -1.75465 3.91665 -3.91665 3.91665H11.999875V0.25Z"
strokeWidth="0.25"
></path>
<path
fill="#1abcfe"
d="M19.8332 11.999975c0 2.162 -1.75465 3.916675 -3.91665 3.916675s-3.916675 -1.754675 -3.916675 -3.916675 1.754675 -3.91665 3.916675 -3.91665 3.91665 1.75465 3.91665 3.91665Z"
strokeWidth="0.25"
></path>
</svg>
);
}

View File

@ -0,0 +1,148 @@
import type { IconProps } from "./types";
export function Obsidian({ size = 18, className }: IconProps) {
return (
<svg
xmlns="http://www.w3.org/2000/svg"
fill="none"
viewBox="0 0 96 96"
id="Obsidian-Icon"
height={size}
className={className}
width={size}
>
<path
fill="#000000"
fillOpacity=".3"
d="M70.9714 88.3753c-.579 4.2312-4.7434 7.5494-8.9078 6.3914-5.9015-1.6034-12.7382-4.1422-18.8847-4.6098l-9.4423-.7127c-1.5168-.1067-2.9424-.7641-4.0085-1.8483L13.4713 70.9382c-1.7668-1.8158-2.2437-4.5271-1.2026-6.8368 0 0 10.0213-21.9578 10.4222-23.0936.3563-1.1357 1.737-11.1125 2.5387-16.4572.2248-1.4192.9328-2.7172 2.0043-3.6745L46.4303 3.68397c1.2783-1.14667 2.9729-1.71542 4.6841-1.57211 1.7112.1433 3.2876.98597 4.3574 2.32928L71.6395 24.8401c.9047 1.1649 1.391 2.6004 1.3807 4.0753 0 3.8527.3341 11.8029 2.4942 16.9249 2.1435 4.6066 4.8032 8.9547 7.928 12.961.774 1.0049.8621 2.3785.2227 3.474-1.403 2.3829-4.209 6.9704-8.1507 12.8273-2.4954 4.0271-4.0555 8.5627-4.5652 13.2727h.0222Z"
></path>
<path
fill="#6c31e3"
d="M71.1274 87.6631c-.579 4.2535-4.7434 7.5717-8.9078 6.4359-5.8792-1.6257-12.6937-4.1644-18.8624-4.6321l-9.42-.7126c-1.5204-.1107-2.9471-.7765-4.0085-1.8706L13.6718 70.1814c-1.7841-1.834-2.2614-4.5742-1.2025-6.9035 0 0 10.0436-22.0469 10.4222-23.205.3785-1.1357 1.737-11.1348 2.5387-16.524.2199-1.4272.9283-2.7338 2.0043-3.6968L46.6309 2.59317c1.2811-1.14413 2.9776-1.709094 4.6889-1.56156 1.7114.14753 3.2861.99448 4.3525 2.341L71.8178 23.8607c.9016 1.1665 1.3876 2.6009 1.3807 4.0753 0 3.8749.3341 11.8474 2.4719 16.9917 2.1419 4.6282 4.8015 8.9987 7.928 13.0277.791 1.0049.8881 2.391.245 3.4964-1.4253 2.4051-4.209 6.9926-8.173 12.8941-2.4863 4.0457-4.0383 8.5954-4.543 13.3172Z"
></path>
<path
fill="url(#a)"
d="M31.6655 88.1308c7.5494-15.2992 7.3489-26.2782 4.1198-34.0725-2.9395-7.2154-8.4401-11.7584-12.7604-14.5866-.0891.4231-.2227.824-.4009 1.2025L12.4691 63.2779c-1.0532 2.3337-.5669 5.0746 1.2248 6.9036l16.2345 16.7022c.5122.5122 1.1135.9353 1.7371 1.2471Z"
></path>
<path
fill="url(#b)"
d="M52.1982 57.0867c2.0265.2004 4.0085.6458 5.9682 1.3584 6.191 2.3161 11.8252 7.5272 16.4795 17.5707.3341-.579.6681-1.1357 1.0244-1.6702 2.8444-4.2207 5.5698-8.5205 8.173-12.8941.6495-1.1009.5612-2.4867-.2227-3.4963-3.1343-4.0279-5.8014-8.3985-7.9503-13.0277-2.1378-5.122-2.4496-13.1168-2.4719-16.9917 0-1.4698-.4677-2.9174-1.3807-4.0754L55.6722 3.37237l-.2672-.33404c1.1803 3.89718 1.1135 7.01497.3786 9.84317-.6681 2.6278-1.9152 5.0107-3.2291 7.5271-.4454.8463-.8908 1.7148-1.3139 2.6056-2.0943 3.9999-3.2949 8.4069-3.5186 12.9163-.2227 5.3893.8685 12.137 4.4539 21.1562h.0223Z"
></path>
<path
fill="url(#c)"
d="M52.1763 57.0868c-3.5854-9.0192-4.6766-15.7669-4.4539-21.1561.2227-5.3448 1.7816-9.3533 3.5186-12.9164l1.3362-2.6056c1.2916-2.5164 2.5164-4.8993 3.2068-7.5271.8199-3.24951.6885-6.66623-.3786-9.84317-2.3058-2.533482-6.2237-2.732358-8.7742-.44539L27.4348 19.852c-1.076.963-1.7844 2.2696-2.0043 3.6967l-2.3383 15.4997c0 .1559-.0445.2895-.0668.4454 4.3203 2.806 9.7986 7.349 12.7605 14.542.579 1.4253 1.0689 2.9174 1.4252 4.543 4.8665-1.3495 9.927-1.8616 14.9652-1.5143v.0223Z"
></path>
<path
fill="url(#d)"
d="M62.2424 94.099c4.1421 1.1358 8.3066-2.1824 8.8856-6.4582.4741-4.0557 1.6735-7.9933 3.5408-11.6247-4.6766-10.0436-10.3108-15.2547-16.4795-17.5707-6.5695-2.4497-13.7181-1.6257-20.978.1336 1.6257 7.3712.6681 17.014-5.5228 29.5518.6903.3563 1.4698.5567 2.2492.6235l9.7764.7349c5.3001.3786 13.2059 3.1178 18.5283 4.6098Z"
></path>
<path
fill="url(#e)"
d="M47.7677 35.5966c-.2449 5.3447.4232 11.4466 4.0086 20.4435l-1.1135-.1113c-3.2291-9.3756-3.9417-14.1858-3.6968-19.5973.2227-5.4115 1.982-9.5759 3.7191-13.1391.4454-.8908 1.4698-2.561 1.9151-3.4072 1.2917-2.5165 2.1602-3.8304 2.8951-6.1242 1.0689-3.2068.8462-4.72113.7126-6.23546.824 5.45606-2.316 10.19946-4.6766 15.03196-2.1854 4.0593-3.4763 8.5394-3.7858 13.1391h.0222Z"
></path>
<path
fill="url(#f)"
d="M36.8533 54.4368c.4453.9798.8239 1.7815 1.0912 3.0064l-.9576.2226c-.3786-1.4252-.6681-2.4496-1.2249-3.6744-3.2513-7.6385-8.4624-11.5802-12.6936-14.4753 5.122 2.7614 10.3999 7.104 13.7849 14.9207Z"
></path>
<path
fill="url(#g)"
d="M37.9913 58.4229c1.7816 8.3511-.2227 18.9737-6.1241 29.3068 4.9438-10.244 7.349-20.0649 5.3447-29.1732l.7794-.1559v.0223Z"
></path>
<path
fill="url(#h)"
d="M58.3896 57.6436c9.6873 3.63 13.4286 11.5802 16.2123 18.2388-3.4518-6.9481-8.2398-14.6311-16.5686-17.4816-6.3246-2.1824-11.6693-1.9152-20.8221.1559l-.2005-.8908c9.7096-2.227 14.7871-2.4942 21.3789 0v-.0223Z"
></path>
<defs>
<radialGradient
id="a"
cx="0"
cy="0"
r="1"
gradientTransform="matrix(4256.29 0 0 7970.03 1904.17 4756.18)"
gradientUnits="userSpaceOnUse"
>
<stop stop-color="#fff" stop-opacity=".4"></stop>
<stop offset="1" stop-opacity=".1"></stop>
</radialGradient>
<radialGradient
id="b"
cx="0"
cy="0"
r="1"
gradientTransform="matrix(6963.7 0 0 13892.1 1983.46 6617.11)"
gradientUnits="userSpaceOnUse"
>
<stop stop-color="#fff" stop-opacity=".6"></stop>
<stop offset="1" stop-color="#fff" stop-opacity=".1"></stop>
</radialGradient>
<radialGradient
id="c"
cx="0"
cy="0"
r="1"
gradientTransform="matrix(5949.36 0 0 10290.1 1060.79 5594.09)"
gradientUnits="userSpaceOnUse"
>
<stop stop-color="#fff" stop-opacity=".8"></stop>
<stop offset="1" stop-color="#fff" stop-opacity=".4"></stop>
</radialGradient>
<radialGradient
id="d"
cx="0"
cy="0"
r="1"
gradientTransform="matrix(3957.88 0 0 3444.96 3118.26 3797.7)"
gradientUnits="userSpaceOnUse"
>
<stop stop-color="#fff" stop-opacity=".3"></stop>
<stop offset="1" stop-opacity=".3"></stop>
</radialGradient>
<radialGradient
id="e"
cx="0"
cy="0"
r="1"
gradientTransform="matrix(3096.76 0 0 15981 1149.23 1697.69)"
gradientUnits="userSpaceOnUse"
>
<stop stop-color="#fff" stop-opacity="0"></stop>
<stop offset="1" stop-color="#fff" stop-opacity=".2"></stop>
</radialGradient>
<radialGradient
id="f"
cx="0"
cy="0"
r="1"
gradientTransform="matrix(2283.37 0 0 2785.85 -117.23 197.63)"
gradientUnits="userSpaceOnUse"
>
<stop stop-color="#fff" stop-opacity=".2"></stop>
<stop offset="1" stop-color="#fff" stop-opacity=".4"></stop>
</radialGradient>
<radialGradient
id="g"
cx="0"
cy="0"
r="1"
gradientTransform="matrix(2665.29 0 0 11578.3 733.36 -591.872)"
gradientUnits="userSpaceOnUse"
>
<stop stop-color="#fff" stop-opacity=".1"></stop>
<stop offset="1" stop-color="#fff" stop-opacity=".3"></stop>
</radialGradient>
<radialGradient
id="h"
cx="0"
cy="0"
r="1"
gradientTransform="matrix(7661.99 0 0 4074.12 3771.63 1838.72)"
gradientUnits="userSpaceOnUse"
>
<stop stop-color="#fff" stop-opacity=".2"></stop>
<stop offset=".5" stop-color="#fff" stop-opacity=".2"></stop>
<stop offset="1" stop-color="#fff" stop-opacity=".3"></stop>
</radialGradient>
</defs>
</svg>
);
}

View File

@ -1,7 +1,4 @@
import { Button } from "../ui";
import Logo from "../logo/logo";
import { Theme, useTheme } from "remix-themes";
import { GalleryVerticalEnd } from "lucide-react";
export function LoginPageLayout({ children }: { children: React.ReactNode }) {
return (
@ -10,7 +7,7 @@ export function LoginPageLayout({ children }: { children: React.ReactNode }) {
<div className="flex justify-center gap-2 md:justify-start">
<a href="#" className="flex items-center gap-2 font-medium">
<div className="flex size-8 items-center justify-center rounded-md">
<Logo width={60} height={60} />
<Logo size={60} />
</div>
C.O.R.E.
</a>

View File

@ -14,7 +14,7 @@ export function LoginPageLayout({ children }: { children: React.ReactNode }) {
>
<div className="flex w-full max-w-sm flex-col items-center gap-2">
<div className="flex size-10 items-center justify-center rounded-md">
<Logo width={60} height={60} />
<Logo size={60} />
</div>
<a href="#" className="flex items-center gap-2 self-center font-medium">
<div className="font-mono">C.O.R.E.</div>

View File

@ -1,13 +1,14 @@
export interface LogoProps {
width: number;
height: number;
size: number;
className?: string;
}
export default function StaticLogo({ width, height }: LogoProps) {
export default function StaticLogo({ size, className }: LogoProps) {
return (
<svg
width={width}
height={height}
width={size}
height={size}
className={className}
viewBox="0 0 282 282"
fill="none"
xmlns="http://www.w3.org/2000/svg"

View File

@ -1,19 +1,71 @@
import { useState, useEffect } from "react";
import { useState, useEffect, type ReactNode } from "react";
import { useFetcher } from "@remix-run/react";
import { AlertCircle, Loader2 } from "lucide-react";
import { Dialog, DialogContent, DialogHeader, DialogTitle } from "../ui/dialog";
import { Badge } from "../ui/badge";
import { AlertCircle, File, Loader2, MessageSquare } from "lucide-react";
import { Badge, BadgeColor } from "../ui/badge";
import { type LogItem } from "~/hooks/use-logs";
import Markdown from "react-markdown";
import { getIconForAuthorise } from "../icon-utils";
import { cn, formatString } from "~/lib/utils";
import { getStatusColor } from "./utils";
import { format } from "date-fns";
import { SpaceDropdown } from "../spaces/space-dropdown";
import { StyledMarkdown } from "../common/styled-markdown";
interface LogDetailsProps {
open: boolean;
onOpenChange: (open: boolean) => void;
text?: string;
error?: string;
log: LogItem;
}
interface PropertyItemProps {
label: string;
value?: string | ReactNode;
icon?: ReactNode;
variant?: "default" | "secondary" | "outline" | "status";
statusColor?: string;
className?: string;
}
function PropertyItem({
label,
value,
icon,
variant = "secondary",
statusColor,
className,
}: PropertyItemProps) {
if (!value) return null;
return (
<div className="flex items-center py-1 !text-base">
<span className="text-muted-foreground min-w-[120px]">{label}</span>
{variant === "status" ? (
<Badge
className={cn(
"text-foreground h-7 items-center gap-2 rounded !bg-transparent px-4.5 !text-base",
className,
)}
>
{statusColor && (
<BadgeColor className={cn(statusColor, "h-2.5 w-2.5")} />
)}
{value}
</Badge>
) : (
<Badge
variant={variant}
className={cn(
"h-7 items-center gap-2 rounded bg-transparent px-4 !text-base",
className,
)}
>
{icon}
{value}
</Badge>
)}
</div>
);
}
interface EpisodeFact {
uuid: string;
fact: string;
@ -27,13 +79,15 @@ interface EpisodeFactsResponse {
invalidFacts: EpisodeFact[];
}
export function LogDetails({
open,
onOpenChange,
text,
error,
log,
}: LogDetailsProps) {
function getStatusValue(status: string) {
if (status === "PENDING") {
return formatString("IN QUEUE");
}
return formatString(status);
}
export function LogDetails({ log }: LogDetailsProps) {
const [facts, setFacts] = useState<any[]>([]);
const [invalidFacts, setInvalidFacts] = useState<any[]>([]);
const [factsLoading, setFactsLoading] = useState(false);
@ -41,11 +95,37 @@ export function LogDetails({
// Fetch episode facts when dialog opens and episodeUUID exists
useEffect(() => {
if (open && log.episodeUUID && facts.length === 0) {
if (log.data?.type === "DOCUMENT" && log.data?.episodes?.length > 0) {
setFactsLoading(true);
setFacts([]);
// Fetch facts for all episodes in DOCUMENT type
Promise.all(
log.data.episodes.map((episodeId: string) =>
fetch(`/api/v1/episodes/${episodeId}/facts`).then((res) =>
res.json(),
),
),
)
.then((results) => {
const allFacts = results.flatMap((result) => result.facts || []);
const allInvalidFacts = results.flatMap(
(result) => result.invalidFacts || [],
);
setFacts(allFacts);
setInvalidFacts(allInvalidFacts);
setFactsLoading(false);
})
.catch(() => {
setFactsLoading(false);
});
} else if (log.episodeUUID) {
setFactsLoading(true);
fetcher.load(`/api/v1/episodes/${log.episodeUUID}/facts`);
} else {
setFacts([]);
setInvalidFacts([]);
}
}, [open, log.episodeUUID, facts.length]);
}, [log.episodeUUID, log.data?.type, log.data?.episodes, facts.length]);
// Handle fetcher response
useEffect(() => {
@ -58,116 +138,197 @@ export function LogDetails({
}, [fetcher.data, fetcher.state]);
return (
<Dialog open={open} onOpenChange={onOpenChange}>
<DialogContent className="max-w-4xl">
<DialogHeader className="px-4 pt-4">
<DialogTitle className="flex w-full items-center justify-between">
<span>Log Details</span>
<div className="flex gap-0.5">
{log.episodeUUID && (
<Badge variant="secondary" className="rounded text-xs">
Episode: {log.episodeUUID.slice(0, 8)}...
</Badge>
<div className="flex h-full w-full flex-col items-center overflow-auto">
<div className="max-w-4xl">
<div className="mt-5 mb-5 px-4">
<div className="space-y-1">
<PropertyItem
label="Session Id"
value={log.data?.sessionId?.toLowerCase()}
variant="secondary"
/>
<PropertyItem
label="Type"
value={formatString(
log.data?.type ? log.data.type.toLowerCase() : "conversation",
)}
{log.source && (
<Badge variant="secondary" className="rounded text-xs">
Source: {log.source}
</Badge>
)}
</div>
</DialogTitle>
</DialogHeader>
icon={
log.data?.type === "CONVERSATION" ? (
<MessageSquare size={16} />
) : (
<File size={16} />
)
}
variant="secondary"
/>
<PropertyItem
label="Source"
value={formatString(log.source?.toLowerCase())}
icon={
log.source &&
getIconForAuthorise(log.source.toLowerCase(), 16, undefined)
}
variant="secondary"
/>
<div className="max-h-[70vh] overflow-auto p-4 pt-0">
{/* Log Content */}
<div className="mb-4 text-sm break-words whitespace-pre-wrap">
<div className="rounded-md">
<Markdown>{text}</Markdown>
<PropertyItem
label="Status"
value={getStatusValue(log.status)}
variant="status"
statusColor={log.status && getStatusColor(log.status)}
/>
{/* Space Assignment for CONVERSATION type */}
{log.data?.type?.toLowerCase() === "conversation" &&
log?.episodeUUID && (
<div className="mt-2 flex items-start py-1">
<span className="text-muted-foreground min-w-[120px]">
Spaces
</span>
<SpaceDropdown
className="px-3"
episodeIds={[log.episodeUUID]}
selectedSpaceIds={log.spaceIds || []}
/>
</div>
)}
</div>
</div>
{/* Error Details */}
{log.error && (
<div className="mb-6 px-4">
<div className="bg-destructive/10 rounded-md p-3">
<div className="flex items-start gap-2 text-red-600">
<AlertCircle className="mt-0.5 h-4 w-4 flex-shrink-0" />
<p className="text-sm break-words whitespace-pre-wrap">
{log.error}
</p>
</div>
</div>
</div>
)}
{/* Error Details */}
{error && (
<div className="mb-4">
<h3 className="mb-2 text-sm font-medium">Error Details</h3>
<div className="bg-destructive/10 rounded-md p-3">
<div className="flex items-start gap-2 text-red-600">
<AlertCircle className="mt-0.5 h-4 w-4 flex-shrink-0" />
<p className="text-sm break-words whitespace-pre-wrap">
{error}
</p>
</div>
</div>
</div>
)}
{/* Episode Facts */}
{log.episodeUUID && (
<div className="mb-4">
<h3 className="text-muted-foreground mb-2 text-sm">Facts</h3>
{log.data?.type === "CONVERSATION" && (
<div className="flex flex-col items-center p-4 pt-0">
{/* Log Content */}
<div className="mb-4 w-full break-words whitespace-pre-wrap">
<div className="rounded-md">
{factsLoading ? (
<div className="flex items-center justify-center gap-2 p-4 text-sm">
<Loader2 className="h-4 w-4 animate-spin" />
</div>
) : facts.length > 0 ? (
<div className="flex flex-col gap-2">
{facts.map((fact) => (
<div
key={fact.uuid}
className="bg-grayAlpha-100 rounded-md p-3"
>
<p className="mb-1 text-sm">{fact.fact}</p>
<div className="text-muted-foreground flex items-center gap-2 text-xs">
<span>
Valid: {new Date(fact.validAt).toLocaleString()}
</span>
{fact.invalidAt && (
<span>
Invalid:{" "}
{new Date(fact.invalidAt).toLocaleString()}
</span>
)}
{Object.keys(fact.attributes).length > 0 && (
<Badge variant="secondary" className="text-xs">
{Object.keys(fact.attributes).length} attributes
</Badge>
)}
</div>
</div>
))}
{invalidFacts.map((fact) => (
<div
key={fact.uuid}
className="bg-grayAlpha-100 rounded-md p-3"
>
<p className="mb-1 text-sm">{fact.fact}</p>
<div className="text-muted-foreground flex items-center gap-2 text-xs">
{fact.invalidAt && (
<span>
Invalid:{" "}
{new Date(fact.invalidAt).toLocaleString()}
</span>
)}
{Object.keys(fact.attributes).length > 0 && (
<Badge variant="secondary" className="text-xs">
{Object.keys(fact.attributes).length} attributes
</Badge>
)}
</div>
</div>
))}
</div>
) : (
<div className="text-muted-foreground p-4 text-center text-sm">
No facts found for this episode
</div>
)}
<StyledMarkdown>{log.ingestText}</StyledMarkdown>
</div>
</div>
)}
</div>
)}
{/* Episodes List for DOCUMENT type */}
{log.data?.type === "DOCUMENT" && log.episodeDetails?.length > 0 && (
<div className="mb-6 px-4">
<div className="mb-2 flex w-full items-center justify-between font-medium">
<span>Episodes ({log.episodeDetails.length})</span>
</div>
<div className="flex flex-col gap-3">
{log.episodeDetails.map((episode: any, index: number) => (
<div
key={episode.uuid}
className="bg-grayAlpha-100 flex flex-col gap-3 rounded-md p-3"
>
<div className="flex items-start gap-3">
<div className="flex min-w-0 flex-1 flex-col gap-1">
<span className="text-muted-foreground text-xs">
Episode {index + 1}
</span>
<span className="truncate font-mono text-xs">
{episode.uuid}
</span>
</div>
<div className="flex-shrink-0">
<SpaceDropdown
episodeIds={[episode.uuid]}
selectedSpaceIds={episode.spaceIds || []}
/>
</div>
</div>
{/* Episode Content */}
<div className="border-grayAlpha-200 border-t pt-3">
<div className="text-muted-foreground mb-1 text-xs">
Content
</div>
<div className="text-sm break-words whitespace-pre-wrap">
<StyledMarkdown>{episode.content}</StyledMarkdown>
</div>
</div>
</div>
))}
</div>
</div>
)}
{/* Episode Facts */}
<div className="mb-6 px-4">
<div className="mb-2 flex w-full items-center justify-between font-medium">
<span>Facts</span>
</div>
<div className="rounded-md">
{factsLoading ? (
<div className="flex items-center justify-center gap-2 p-4 text-sm">
<Loader2 className="h-4 w-4 animate-spin" />
</div>
) : facts.length > 0 ? (
<div className="flex flex-col gap-1">
{facts.map((fact) => (
<div
key={fact.uuid}
className="bg-grayAlpha-100 flex items-center justify-between gap-2 rounded-md p-3"
>
<p className="text-sm">{fact.fact}</p>
<div className="text-muted-foreground flex shrink-0 items-center gap-2 text-xs">
<span>
Valid: {format(new Date(fact.validAt), "dd/MM/yyyy")}
</span>
{fact.invalidAt && (
<span>
Invalid:{" "}
{format(new Date(fact.invalidAt), "dd/MM/yyyy")}
</span>
)}
{Object.keys(fact.attributes).length > 0 && (
<Badge variant="secondary" className="text-xs">
{Object.keys(fact.attributes).length} attributes
</Badge>
)}
</div>
</div>
))}
{invalidFacts.map((fact) => (
<div
key={fact.uuid}
className="bg-grayAlpha-100 rounded-md p-3"
>
<p className="mb-1 text-sm">{fact.fact}</p>
<div className="text-muted-foreground flex items-center gap-2 text-xs">
{fact.invalidAt && (
<span>
Invalid: {new Date(fact.invalidAt).toLocaleString()}
</span>
)}
{Object.keys(fact.attributes).length > 0 && (
<Badge variant="secondary" className="text-xs">
{Object.keys(fact.attributes).length} attributes
</Badge>
)}
</div>
</div>
))}
</div>
) : (
<div className="text-muted-foreground p-4 text-center text-sm">
No facts found for this episode
</div>
)}
</div>
</div>
</DialogContent>
</Dialog>
</div>
</div>
);
}

View File

@ -1,10 +1,4 @@
import { EllipsisVertical, Trash } from "lucide-react";
import {
DropdownMenu,
DropdownMenuContent,
DropdownMenuItem,
DropdownMenuTrigger,
} from "../ui/dropdown-menu";
import { Trash, Copy, RotateCw } from "lucide-react";
import { Button } from "../ui/button";
import {
AlertDialog,
@ -17,15 +11,19 @@ import {
AlertDialogTitle,
} from "../ui/alert-dialog";
import { useState, useEffect } from "react";
import { redirect, useFetcher } from "@remix-run/react";
import { useFetcher, useNavigate } from "@remix-run/react";
import { toast } from "~/hooks/use-toast";
interface LogOptionsProps {
id: string;
status?: string;
}
export const LogOptions = ({ id }: LogOptionsProps) => {
export const LogOptions = ({ id, status }: LogOptionsProps) => {
const [deleteDialogOpen, setDeleteDialogOpen] = useState(false);
const deleteFetcher = useFetcher<{ success: boolean }>();
const retryFetcher = useFetcher<{ success: boolean }>();
const navigate = useNavigate();
const handleDelete = () => {
deleteFetcher.submit(
@ -39,43 +37,84 @@ export const LogOptions = ({ id }: LogOptionsProps) => {
setDeleteDialogOpen(false);
};
const handleCopy = async () => {
try {
await navigator.clipboard.writeText(id);
toast({
title: "Copied",
description: "Episode ID copied to clipboard",
});
} catch (err) {
console.error("Failed to copy:", err);
toast({
title: "Error",
description: "Failed to copy ID",
variant: "destructive",
});
}
};
const handleRetry = () => {
retryFetcher.submit(
{},
{
method: "POST",
action: `/api/v1/logs/${id}/retry`,
},
);
};
useEffect(() => {
if (deleteFetcher.state === "idle" && deleteFetcher.data?.success) {
redirect(`/home/logs`);
navigate(`/home/inbox`);
}
}, [deleteFetcher.state, deleteFetcher.data]);
useEffect(() => {
if (retryFetcher.state === "idle" && retryFetcher.data?.success) {
toast({
title: "Success",
description: "Episode retry initiated",
});
// Reload the page to reflect the new status
window.location.reload();
}
}, [retryFetcher.state, retryFetcher.data]);
return (
<>
<DropdownMenu>
<DropdownMenuTrigger
asChild
<div className="flex items-center gap-2">
{status === "FAILED" && (
<Button
variant="secondary"
size="sm"
className="gap-2 rounded"
onClick={handleRetry}
disabled={retryFetcher.state !== "idle"}
>
<RotateCw size={15} /> Retry
</Button>
)}
<Button
variant="secondary"
size="sm"
className="gap-2 rounded"
onClick={handleCopy}
>
<Copy size={15} /> Copy Id
</Button>
<Button
variant="secondary"
size="sm"
className="gap-2 rounded"
onClick={(e) => {
e.stopPropagation();
setDeleteDialogOpen(true);
}}
>
<Button
variant="ghost"
className="mr-0.5 h-8 shrink items-center justify-between gap-2 px-1.5"
>
<div className="flex items-center justify-between gap-2">
<EllipsisVertical size={16} />
</div>
</Button>
</DropdownMenuTrigger>
<Trash size={15} /> Delete
</Button>
</div>
<DropdownMenuContent align="end">
<DropdownMenuItem
onClick={(e) => {
setDeleteDialogOpen(true);
}}
>
<Button variant="link" size="sm" className="gap-2 rounded">
<Trash size={15} /> Delete
</Button>
</DropdownMenuItem>
</DropdownMenuContent>
</DropdownMenu>
<AlertDialog open={deleteDialogOpen} onOpenChange={setDeleteDialogOpen}>
<AlertDialogContent>
<AlertDialogHeader>

View File

@ -1,9 +1,10 @@
import { useState } from "react";
import { cn } from "~/lib/utils";
import { Badge } from "../ui/badge";
import { Badge, BadgeColor } from "../ui/badge";
import { type LogItem } from "~/hooks/use-logs";
import { LogOptions } from "./log-options";
import { LogDetails } from "./log-details";
import { getIconForAuthorise } from "../icon-utils";
import { useNavigate, useParams } from "@remix-run/react";
import { getStatusColor, getStatusValue } from "./utils";
import { File, MessageSquare } from "lucide-react";
interface LogTextCollapseProps {
text?: string;
@ -14,30 +15,9 @@ interface LogTextCollapseProps {
reset?: () => void;
}
const getStatusColor = (status: string) => {
switch (status) {
case "PROCESSING":
return "bg-blue-100 text-blue-800 hover:bg-blue-100 hover:text-blue-800";
case "PENDING":
return "bg-yellow-100 text-yellow-800 hover:bg-yellow-100 hover:text-yellow-800";
case "COMPLETED":
return "bg-success/10 text-success hover:bg-success/10 hover:text-success";
case "FAILED":
return "bg-destructive/10 text-destructive hover:bg-destructive/10 hover:text-destructive";
case "CANCELLED":
return "bg-gray-100 text-gray-800 hover:bg-gray-100 hover:text-gray-800";
default:
return "bg-gray-100 text-gray-800 hover:bg-gray-100 hover:text-gray-800";
}
};
export function LogTextCollapse({
text,
error,
id,
log,
}: LogTextCollapseProps) {
const [dialogOpen, setDialogOpen] = useState(false);
export function LogTextCollapse({ text, log }: LogTextCollapseProps) {
const { logId } = useParams();
const navigate = useNavigate();
// Show collapse if text is long (by word count)
const COLLAPSE_WORD_LIMIT = 30;
@ -61,67 +41,79 @@ export function LogTextCollapse({
displayText = text;
}
const showStatus = (log: LogItem) => {
if (log.status === "COMPLETED") {
return false;
}
return true;
};
const getIngestType = (log: LogItem) => {
const type = log.type ?? log.data?.type ?? "CONVERSATION";
return type === "CONVERSATION" ? (
<MessageSquare size={14} />
) : (
<File size={14} />
);
};
return (
<div className="flex w-full items-center">
<div
className={cn(
"group-hover:bg-grayAlpha-100 flex min-w-[0px] shrink grow items-start gap-2 rounded-md px-4",
"group-hover:bg-grayAlpha-100 flex min-w-[0px] shrink grow items-start gap-2 rounded-md px-2 text-sm",
logId === log.id && "bg-grayAlpha-200",
)}
onClick={() => {
navigate(`/home/inbox/${log.id}`);
}}
>
<div
className={cn(
"border-border flex w-full min-w-[0px] shrink flex-col border-b py-1",
)}
onClick={() => {
setDialogOpen(true);
}}
>
<div className="flex w-full items-center justify-between gap-4">
<div className="inline-flex min-h-[24px] min-w-[0px] shrink cursor-pointer items-center justify-start">
<div className={cn("truncate text-left")}>
{text.replace(/<[^>]+>/g, "")}
<div className="border-border flex w-full min-w-[0px] shrink flex-col gap-1 border-b py-2">
<div className={cn("flex w-full min-w-[0px] shrink flex-col")}>
<div className="flex w-full items-center justify-between gap-4">
<div className="inline-flex min-h-[24px] min-w-[0px] shrink items-center justify-start">
<div className={cn("truncate text-left text-base")}>
{text.replace(/<[^>]+>/g, "")}
</div>
</div>
{showStatus(log) && (
<div className="text-muted-foreground flex shrink-0 items-center justify-end text-xs">
<div className="flex items-center">
<Badge
className={cn(
"!bg-grayAlpha-100 text-muted-foreground rounded text-xs",
)}
>
<BadgeColor className={cn(getStatusColor(log.status))} />
{getStatusValue(log.status)}
</Badge>
</div>
</div>
)}
</div>
</div>
<div className="flex items-center justify-between">
<div className="flex items-center gap-1 font-light">
{getIconForAuthorise(log.source.toLowerCase(), 12, undefined)}
{log.source.toLowerCase()}
</div>
<div className="text-muted-foreground flex shrink-0 items-center justify-end text-xs">
<div className="flex items-center">
<Badge
className={cn(
"bg-grayAlpha-100 text-foreground mr-3 rounded text-xs",
)}
>
{log.source}
</Badge>
<Badge
className={cn(
"mr-3 rounded text-xs",
getStatusColor(log.status),
)}
>
{log.status.charAt(0).toUpperCase() +
log.status.slice(1).toLowerCase()}
</Badge>
<div className="text-muted-foreground mr-3">
{new Date(log.time).toLocaleString()}
</div>
<div onClick={(e) => e.stopPropagation()}>
<LogOptions id={id} />
</div>
</div>
<div className="flex items-center gap-1">
<Badge
className={cn(
"text-muted-foreground rounded !bg-transparent text-xs",
)}
>
{getIngestType(log)}
</Badge>
</div>
</div>
</div>
</div>
<LogDetails
open={dialogOpen}
onOpenChange={setDialogOpen}
text={text}
error={error}
log={log}
/>
</div>
);
}

View File

@ -13,8 +13,10 @@ interface LogsFiltersProps {
availableSources: Array<{ name: string; slug: string }>;
selectedSource?: string;
selectedStatus?: string;
selectedType?: string;
onSourceChange: (source?: string) => void;
onStatusChange: (status?: string) => void;
onTypeChange: (type?: string) => void;
}
const statusOptions = [
@ -23,14 +25,21 @@ const statusOptions = [
{ value: "COMPLETED", label: "Completed" },
];
type FilterStep = "main" | "source" | "status";
const typeOptions = [
{ value: "CONVERSATION", label: "Conversation" },
{ value: "DOCUMENT", label: "Document" },
];
type FilterStep = "main" | "source" | "status" | "type";
export function LogsFilters({
availableSources,
selectedSource,
selectedStatus,
selectedType,
onSourceChange,
onStatusChange,
onTypeChange,
}: LogsFiltersProps) {
const [popoverOpen, setPopoverOpen] = useState(false);
const [step, setStep] = useState<FilterStep>("main");
@ -44,11 +53,14 @@ export function LogsFilters({
const selectedStatusLabel = statusOptions.find(
(s) => s.value === selectedStatus,
)?.label;
const selectedTypeLabel = typeOptions.find(
(s) => s.value === selectedType,
)?.label;
const hasFilters = selectedSource || selectedStatus;
const hasFilters = selectedSource || selectedStatus || selectedType;
return (
<div className="mb-2 flex w-full items-center justify-start gap-2 px-5">
<div className="mb-2 flex w-full items-center justify-start gap-2 px-3">
<Popover
open={popoverOpen}
onOpenChange={(open) => {
@ -85,6 +97,13 @@ export function LogsFilters({
>
Status
</Button>
<Button
variant="ghost"
className="justify-start"
onClick={() => setStep("type")}
>
Type
</Button>
</div>
)}
@ -155,6 +174,40 @@ export function LogsFilters({
))}
</div>
)}
{step === "type" && (
<div className="flex flex-col gap-1 p-2">
<Button
variant="ghost"
className="w-full justify-start"
onClick={() => {
onTypeChange(undefined);
setPopoverOpen(false);
setStep("main");
}}
>
All types
</Button>
{typeOptions.map((type) => (
<Button
key={type.value}
variant="ghost"
className="w-full justify-start"
onClick={() => {
onTypeChange(
type.value === selectedType
? undefined
: type.value,
);
setPopoverOpen(false);
setStep("main");
}}
>
{type.label}
</Button>
))}
</div>
)}
</PopoverContent>
</PopoverPortal>
</Popover>
@ -180,6 +233,15 @@ export function LogsFilters({
/>
</Badge>
)}
{selectedType && (
<Badge variant="secondary" className="h-7 gap-1 rounded px-2">
{selectedTypeLabel}
<X
className="hover:text-destructive h-3.5 w-3.5 cursor-pointer"
onClick={() => onTypeChange(undefined)}
/>
</Badge>
)}
</div>
)}
</div>

View File

@ -0,0 +1,26 @@
import { formatString } from "~/lib/utils";
export const getStatusColor = (status: string) => {
switch (status) {
case "PROCESSING":
return "bg-blue-800";
case "PENDING":
return "bg-warning";
case "COMPLETED":
return "bg-success";
case "FAILED":
return "bg-destructive";
case "CANCELLED":
return "bg-gray-800";
default:
return "bg-gray-800";
}
};
export function getStatusValue(status: string) {
if (status === "PENDING") {
return formatString("In Queue");
}
return formatString(status);
}
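// Illustrative usage sketch (not part of the original file): these helpers pair
// with the Badge/BadgeColor components, mirroring log-text-collapse:
//
//   <Badge className="!bg-grayAlpha-100 text-muted-foreground rounded text-xs">
//     <BadgeColor className={getStatusColor(log.status)} />
//     {getStatusValue(log.status)}
//   </Badge>
//
// getStatusColor supplies the dot color class; getStatusValue renders
// "In Queue" for PENDING and the formatted status otherwise.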

View File

@ -10,6 +10,7 @@ import {
import { type LogItem } from "~/hooks/use-logs";
import { ScrollManagedList } from "../virtualized-list";
import { LogTextCollapse } from "./log-text-collapse";
import { LoaderCircle } from "lucide-react";
interface VirtualLogsListProps {
logs: LogItem[];
@ -139,7 +140,7 @@ export function VirtualLogsList({
{isLoading && (
<div className="text-muted-foreground p-4 text-center text-sm">
Loading more logs...
<LoaderCircle size={18} className="mr-1 animate-spin" />
</div>
)}
</div>

View File

@ -0,0 +1,4 @@
export { OnboardingModal } from "./onboarding-modal";
export { Provider, OnboardingStep } from "./types";
export type { ProviderConfig, OnboardingState } from "./types";
export { PROVIDER_CONFIGS, SUGGESTED_INGESTION_PROMPTS, VERIFICATION_PROMPT } from "./provider-config";

View File

@ -0,0 +1,137 @@
import { useState } from "react";
import { Copy, Check, Loader2, AlertCircle } from "lucide-react";
import { Button } from "../ui";
import { SUGGESTED_INGESTION_PROMPTS } from "./provider-config";
interface IngestionStepProps {
providerName: string;
ingestionStatus: "idle" | "waiting" | "processing" | "complete" | "error";
onStartWaiting: () => void;
error?: string;
}
export function IngestionStep({
providerName,
ingestionStatus,
onStartWaiting,
error,
}: IngestionStepProps) {
const [copiedIndex, setCopiedIndex] = useState<number | null>(null);
const handleCopy = async (text: string, index: number) => {
await navigator.clipboard.writeText(text);
setCopiedIndex(index);
setTimeout(() => setCopiedIndex(null), 2000);
};
return (
<div className="space-y-6">
<div>
<h2 className="mb-2 text-xl font-semibold">
Let's Store Your First Memory
</h2>
<p className="text-muted-foreground text-sm">
Copy one of these prompts and paste it into {providerName} to create
your first memory
</p>
</div>
{ingestionStatus === "idle" && (
<>
<div className="space-y-3">
{SUGGESTED_INGESTION_PROMPTS.map((prompt, index) => (
<div
key={index}
className="group bg-grayAlpha-100 hover:border-primary/50 relative rounded-lg border border-gray-300 p-4 transition-colors"
>
<p className="pr-10 text-sm">{prompt}</p>
<button
onClick={() => handleCopy(prompt, index)}
className="hover:bg-background absolute top-3 right-3 rounded-md p-2 transition-colors"
title="Copy to clipboard"
>
{copiedIndex === index ? (
<Check className="h-4 w-4 text-green-500" />
) : (
<Copy className="text-muted-foreground h-4 w-4" />
)}
</button>
</div>
))}
</div>
<div className="flex items-center justify-between rounded-lg border border-blue-500/20 bg-blue-500/10 p-4">
<div className="flex items-start gap-3">
<AlertCircle className="mt-0.5 h-5 w-5 text-blue-500" />
<div className="text-sm">
<p className="font-medium text-blue-700 dark:text-blue-300">
Important
</p>
<p className="text-blue-600 dark:text-blue-400">
After pasting the prompt in {providerName}, click the button
below to wait for ingestion
</p>
</div>
</div>
</div>
<div className="flex justify-end">
<Button onClick={onStartWaiting} size="lg">
I've Sent the Prompt
</Button>
</div>
</>
)}
{(ingestionStatus === "waiting" || ingestionStatus === "processing") && (
<div className="flex flex-col items-center justify-center space-y-4 py-12">
<Loader2 className="text-primary h-12 w-12 animate-spin" />
<div className="space-y-2 text-center">
<h3 className="text-lg font-medium">
{ingestionStatus === "waiting"
? "Waiting for your first ingestion..."
: "Processing your memory..."}
</h3>
<p className="text-muted-foreground max-w-md text-sm">
{ingestionStatus === "waiting"
? "Make sure you've sent the prompt in your provider app. We're listening for the first memory ingestion."
: "We're storing your information. This usually takes a few seconds."}
</p>
</div>
</div>
)}
{ingestionStatus === "complete" && (
<div className="flex flex-col items-center justify-center space-y-4 py-12">
<div className="flex h-16 w-16 items-center justify-center rounded-full bg-green-500/10">
<Check className="h-8 w-8 text-green-500" />
</div>
<div className="space-y-2 text-center">
<h3 className="text-lg font-medium">Memory stored successfully!</h3>
<p className="text-muted-foreground text-sm">
Your first memory has been ingested. Let's verify it worked.
</p>
</div>
</div>
)}
{ingestionStatus === "error" && (
<div className="flex flex-col items-center justify-center space-y-4 py-12">
<div className="flex h-16 w-16 items-center justify-center rounded-full bg-red-500/10">
<AlertCircle className="h-8 w-8 text-red-500" />
</div>
<div className="space-y-2 text-center">
<h3 className="text-lg font-medium">Something went wrong</h3>
<p className="text-muted-foreground max-w-md text-sm">
{error ||
"We couldn't detect your memory ingestion. Please try again or check your provider connection."}
</p>
</div>
<Button onClick={onStartWaiting} variant="secondary">
Try Again
</Button>
</div>
)}
</div>
);
}

View File

@ -0,0 +1,230 @@
import { useState } from "react";
import { Dialog, DialogContent, DialogHeader, DialogTitle } from "../ui/dialog";
import { type Provider, OnboardingStep } from "./types";
import { ProviderSelectionStep } from "./provider-selection-step";
import { IngestionStep } from "./ingestion-step";
import { VerificationStep } from "./verification-step";
import { PROVIDER_CONFIGS } from "./provider-config";
import { Progress } from "../ui/progress";
interface OnboardingModalProps {
isOpen: boolean;
onClose: () => void;
onComplete: () => void;
}
export function OnboardingModal({
isOpen,
onClose,
onComplete,
}: OnboardingModalProps) {
const [currentStep, setCurrentStep] = useState<OnboardingStep>(
OnboardingStep.PROVIDER_SELECTION,
);
const [selectedProvider, setSelectedProvider] = useState<Provider>();
const [ingestionStatus, setIngestionStatus] = useState<
"idle" | "waiting" | "processing" | "complete" | "error"
>("idle");
const [verificationResult, setVerificationResult] = useState<string>();
const [isCheckingRecall, setIsCheckingRecall] = useState(false);
const [error, setError] = useState<string>();
// Calculate progress
const getProgress = () => {
switch (currentStep) {
case OnboardingStep.PROVIDER_SELECTION:
return 33;
case OnboardingStep.FIRST_INGESTION:
return 66;
case OnboardingStep.VERIFICATION:
return 100;
default:
return 0;
}
};
// Poll for ingestion status
const pollIngestion = async () => {
setIngestionStatus("waiting");
try {
const maxAttempts = 30; // 60 seconds (30 * 2s)
let attempts = 0;
// Store the timestamp when polling starts
const startTime = Date.now();
const poll = async (): Promise<boolean> => {
if (attempts >= maxAttempts) {
throw new Error("Ingestion timeout - please try again");
}
// Check for new ingestion logs from the last 5 minutes
const response = await fetch("/api/v1/logs?limit=1");
const data = await response.json();
// Check if there's a recent ingestion (created after we started polling)
if (data.logs && data.logs.length > 0) {
const latestLog = data.logs[0];
const logTime = new Date(latestLog.time).getTime();
// If the log was created after we started polling, we found a new ingestion
if (logTime >= startTime) {
return true;
}
}
await new Promise((resolve) => setTimeout(resolve, 2000));
attempts++;
return poll();
};
const success = await poll();
if (success) {
setIngestionStatus("complete");
// Auto-advance to verification step after 2 seconds
setTimeout(() => {
setCurrentStep(OnboardingStep.VERIFICATION);
}, 2000);
}
} catch (err) {
setError(err instanceof Error ? err.message : "Unknown error occurred");
setIngestionStatus("error");
}
};
const handleProviderSelect = (provider: Provider) => {
setSelectedProvider(provider);
};
const handleContinueFromProvider = () => {
setCurrentStep(OnboardingStep.FIRST_INGESTION);
};
const handleStartWaiting = () => {
pollIngestion();
};
const handleComplete = () => {
setCurrentStep(OnboardingStep.COMPLETE);
onComplete();
onClose();
};
// Poll for recall logs to detect verification
const pollRecallLogs = async () => {
setIsCheckingRecall(true);
try {
const maxAttempts = 30; // 60 seconds
let attempts = 0;
const startTime = Date.now();
const poll = async (): Promise<string | null> => {
if (attempts >= maxAttempts) {
throw new Error("Verification timeout - please try again");
}
// Check for new recall logs
const response = await fetch("/api/v1/recall-logs?limit=1");
const data = await response.json();
// Check if there's a recent recall (created after we started polling)
if (data.recallLogs && data.recallLogs.length > 0) {
const latestRecall = data.recallLogs[0];
const recallTime = new Date(latestRecall.createdAt).getTime();
// If the recall was created after we started polling
if (recallTime >= startTime) {
// Return the query as verification result
return latestRecall.query || "Recall detected successfully";
}
}
await new Promise((resolve) => setTimeout(resolve, 2000));
attempts++;
return poll();
};
const result = await poll();
if (result) {
setVerificationResult(result);
setIsCheckingRecall(false);
}
} catch (err) {
setError(err instanceof Error ? err.message : "Unknown error occurred");
setIsCheckingRecall(false);
}
};
const getStepTitle = () => {
switch (currentStep) {
case OnboardingStep.PROVIDER_SELECTION:
return "Step 1 of 3";
case OnboardingStep.FIRST_INGESTION:
return "Step 2 of 3";
case OnboardingStep.VERIFICATION:
return "Step 3 of 3";
default:
return "";
}
};
return (
<Dialog open={isOpen} onOpenChange={onClose}>
<DialogContent className="max-h-[90vh] max-w-3xl overflow-y-auto p-4">
<DialogHeader>
<div className="space-y-3">
<DialogTitle className="text-2xl">Welcome to Core</DialogTitle>
<div className="space-y-2">
<div className="flex items-center justify-between">
<p className="text-muted-foreground text-sm">
{getStepTitle()}
</p>
</div>
<Progress
segments={[{ value: getProgress() }]}
className="mb-2"
color="#c15e50"
/>
</div>
</div>
</DialogHeader>
<div>
{currentStep === OnboardingStep.PROVIDER_SELECTION && (
<ProviderSelectionStep
selectedProvider={selectedProvider}
onSelectProvider={handleProviderSelect}
onContinue={handleContinueFromProvider}
/>
)}
{currentStep === OnboardingStep.FIRST_INGESTION &&
selectedProvider && (
<IngestionStep
providerName={PROVIDER_CONFIGS[selectedProvider].name}
ingestionStatus={ingestionStatus}
onStartWaiting={handleStartWaiting}
error={error}
/>
)}
{currentStep === OnboardingStep.VERIFICATION && selectedProvider && (
<VerificationStep
providerName={PROVIDER_CONFIGS[selectedProvider].name}
verificationResult={verificationResult}
isCheckingRecall={isCheckingRecall}
onStartChecking={pollRecallLogs}
onComplete={handleComplete}
/>
)}
</div>
</DialogContent>
</Dialog>
);
}

View File

@ -0,0 +1,166 @@
import { useState, useEffect } from "react";
import { Card, CardContent, CardHeader, CardTitle } from "~/components/ui/card";
import { Button } from "~/components/ui";
import { Checkbox } from "~/components/ui/checkbox";
import { Label } from "~/components/ui/label";
import type { OnboardingQuestion, OnboardingAnswer } from "./onboarding-utils";
interface OnboardingQuestionProps {
question: OnboardingQuestion;
answer?: string | string[];
onAnswer: (answer: OnboardingAnswer) => void;
onNext: () => void;
onPrevious?: () => void;
isFirst: boolean;
isLast: boolean;
currentStep: number;
totalSteps: number;
loading?: boolean;
}
export default function OnboardingQuestionComponent({
question,
answer,
onAnswer,
onNext,
onPrevious,
isFirst,
isLast,
currentStep,
totalSteps,
loading,
}: OnboardingQuestionProps) {
const [selectedValue, setSelectedValue] = useState<string | string[]>(
answer || (question.type === "multi-select" ? [] : ""),
);
// Sync local state when answer prop changes (e.g., when navigating between steps)
useEffect(() => {
setSelectedValue(answer || (question.type === "multi-select" ? [] : ""));
}, [answer, question.type]);
const handleSingleSelect = (value: string) => {
setSelectedValue(value);
onAnswer({ questionId: question.id, value });
};
const handleMultiSelect = (optionValue: string, checked: boolean) => {
const currentValues = Array.isArray(selectedValue) ? selectedValue : [];
const newValues = checked
? [...currentValues, optionValue]
: currentValues.filter((v) => v !== optionValue);
setSelectedValue(newValues);
onAnswer({ questionId: question.id, value: newValues });
};
const isValid = () => {
if (!question.required) return true;
if (question.type === "multi-select") {
return Array.isArray(selectedValue) && selectedValue.length > 0;
}
return selectedValue && selectedValue !== "";
};
return (
<div className="mx-auto w-full max-w-md">
<Card className="bg-background-2 w-full rounded-lg p-3 pt-1">
<CardHeader className="flex flex-col items-start px-0">
<div className="mb-2 flex w-full items-center justify-between">
<span className="text-muted-foreground text-sm">
Step {currentStep} of {totalSteps}
</span>
<div className="bg-grayAlpha-100 h-1.5 w-32 rounded-full">
<div
className="bg-primary h-1.5 rounded-full transition-all duration-300"
style={{ width: `${(currentStep / totalSteps) * 100}%` }}
/>
</div>
</div>
</CardHeader>
<CardContent className="text-base">
<div className="space-y-6">
<div>
<CardTitle className="mb-2 text-xl">{question.title}</CardTitle>
</div>
{question.type === "single-select" && question.options && (
<div className="space-y-3">
{question.options.map((option) => (
<Button
key={option.id}
type="button"
variant={
selectedValue === option.value ? "secondary" : "outline"
}
className="hover:bg-grayAlpha-100 h-auto w-full justify-start px-4 py-3 text-left font-normal"
onClick={() => handleSingleSelect(option.value)}
>
{option.label}
</Button>
))}
</div>
)}
{question.type === "multi-select" && question.options && (
<div className="space-y-3">
{question.options.map((option) => (
<div key={option.id} className="flex items-center space-x-2">
<Checkbox
id={option.id}
checked={
Array.isArray(selectedValue) &&
selectedValue.includes(option.value)
}
onCheckedChange={(checked) =>
handleMultiSelect(option.value, !!checked)
}
className="h-6 w-6 text-xl"
checkboxClassname="h-5 w-5 text-xl"
/>
<Label
htmlFor={option.id}
className="cursor-pointer text-base font-normal"
>
{option.label}
</Label>
</div>
))}
</div>
)}
<div className="flex justify-end gap-2 pt-4">
{!isFirst && (
<Button
type="button"
variant="ghost"
size="xl"
onClick={onPrevious}
disabled={loading}
className="rounded-lg px-4 py-2"
>
Previous
</Button>
)}
<Button
type="button"
variant="secondary"
size="xl"
onClick={onNext}
isLoading={!!loading}
disabled={!isValid() || loading}
className="rounded-lg px-4 py-2"
>
{isLast ? "Complete Profile" : "Continue"}
</Button>
</div>
</div>
</CardContent>
</Card>
</div>
);
}

View File

@ -0,0 +1,567 @@
import type {
Triple,
EntityNode,
EpisodicNode,
StatementNode,
} from "@core/types";
import crypto from "crypto";
export interface OnboardingQuestion {
id: string;
title: string;
description?: string;
type: "single-select" | "multi-select" | "text";
options?: OnboardingOption[];
placeholder?: string;
required?: boolean;
}
export interface OnboardingOption {
id: string;
label: string;
value: string;
}
export interface OnboardingAnswer {
questionId: string;
value: string | string[];
}
// Onboarding questions in order
export const ONBOARDING_QUESTIONS: OnboardingQuestion[] = [
{
id: "role",
title: "What best describes you?",
description: 'Role / identity → anchors the "user" node',
type: "single-select",
options: [
{ id: "developer", label: "Developer", value: "Developer" },
{ id: "designer", label: "Designer", value: "Designer" },
{
id: "product-manager",
label: "Product Manager",
value: "Product Manager",
},
{
id: "engineering-manager",
label: "Engineering Manager",
value: "Engineering Manager",
},
{
id: "founder",
label: "Founder / Executive",
value: "Founder / Executive",
},
{ id: "other", label: "Other", value: "Other" },
],
required: true,
},
{
id: "goal",
title: "What's your primary goal with CORE?",
description: 'Motivation → drives the "objective" branch of the graph',
type: "single-select",
options: [
{
id: "personal-memory",
label: "Build a personal memory system",
value: "Build a personal memory system",
},
{
id: "team-knowledge",
label: "Manage team/project knowledge",
value: "Manage team/project knowledge",
},
{
id: "automate-workflows",
label: "Automate workflows across tools",
value: "Automate workflows across tools",
},
{
id: "ai-assistant",
label: "Power an AI assistant / agent with context",
value: "Power an AI assistant / agent with context",
},
{
id: "explore-core",
label: "Explore core",
value: "Explore core",
},
],
required: true,
},
{
id: "tools",
title: "Which tools do you care about most?",
description: "Context → lets you connect integration nodes live",
type: "multi-select",
options: [
{ id: "claude", label: "Claude", value: "Claude" },
{ id: "claude-code", label: "Claude Code", value: "Claude Code" },
{ id: "cursor", label: "Cursor", value: "Cursor" },
{ id: "windsurf", label: "Windsurf", value: "Windsurf" },
{ id: "zed", label: "Zed", value: "Zed" },
{ id: "github", label: "GitHub", value: "GitHub" },
{ id: "slack", label: "Slack", value: "Slack" },
{ id: "notion", label: "Notion", value: "Notion" },
{ id: "obsidian", label: "Obsidian", value: "Obsidian" },
{ id: "gmail", label: "Gmail", value: "Gmail" },
{ id: "linear", label: "Linear", value: "Linear" },
{
id: "figma",
label: "Figma",
value: "Figma",
},
],
required: true,
},
];
// Helper function to create entity nodes (client-side, no embeddings)
function createEntity(
name: string,
type: string,
userId: string,
space?: string,
): EntityNode {
return {
uuid: crypto.randomUUID(),
name,
type,
attributes: {},
nameEmbedding: [], // Empty placeholder for client-side preview
typeEmbedding: [], // Empty placeholder for client-side preview
createdAt: new Date(),
userId,
space,
};
}
// Helper function to create episodic node (client-side, no embeddings)
function createEpisode(
content: string,
userId: string,
space?: string,
): EpisodicNode {
return {
uuid: crypto.randomUUID(),
content,
originalContent: content,
contentEmbedding: [], // Empty placeholder for client-side preview
metadata: { source: "onboarding" },
source: "onboarding",
createdAt: new Date(),
validAt: new Date(),
labels: ["onboarding"],
userId,
space,
};
}
// Helper function to create statement node (client-side, no embeddings)
function createStatement(
fact: string,
userId: string,
space?: string,
): StatementNode {
return {
uuid: crypto.randomUUID(),
fact,
factEmbedding: [], // Empty placeholder for client-side preview
createdAt: new Date(),
validAt: new Date(),
invalidAt: null,
attributes: {},
userId,
space,
};
}
// Create triplet from onboarding answer using reified knowledge graph structure (client-side, no embeddings)
export function createOnboardingTriplet(
username: string,
questionId: string,
answer: string | string[],
userId: string,
space?: string,
): Triple[] {
const triplets: Triple[] = [];
// Convert array answers to individual triplets
const answers = Array.isArray(answer) ? answer : [answer];
for (const singleAnswer of answers) {
// Get the statement mapping for this question type
const { predicateType, objectType, factTemplate } =
getStatementMapping(questionId);
// Create the statement fact (e.g., "Manoj uses GitHub")
const fact = factTemplate(username, singleAnswer);
// Create entities following CORE's reified structure (client-side preview only)
const subject = createEntity(username, "Person", userId, space);
const predicate = createEntity(
predicateType.toLowerCase().replace("_", " "), // "uses tool" instead of "USES_TOOL"
"Predicate", // Use "Predicate" type instead of "Relationship"
userId,
space,
);
const object = createEntity(singleAnswer, objectType, userId, space);
// Create statement node as first-class object (client-side preview only)
const statement = createStatement(fact, userId, space);
// Create provenance episode (client-side preview only)
const provenance = createEpisode(
`Onboarding question: ${questionId} - Answer: ${singleAnswer}`,
userId,
space,
);
// Create the reified triple structure (no embeddings for client preview)
triplets.push({
statement,
subject,
predicate,
object,
provenance,
});
}
return triplets;
}
// Create initial identity statement for preview using reified knowledge graph structure
export function createInitialIdentityStatement(displayName: string): any {
const timestamp = Date.now();
const now = new Date().toISOString();
// Create the identity statement: "I'm [DisplayName]" using reified structure
const fact = `I'm ${displayName}`;
return {
// Statement node (center)
statementNode: {
uuid: `identity-statement-${timestamp}`,
name: fact,
labels: ["Statement"],
attributes: {
nodeType: "Statement",
type: "Statement",
fact: fact,
source: "onboarding",
validAt: now,
},
createdAt: now,
},
// Subject entity ("I")
subjectNode: {
uuid: `pronoun-${timestamp}`,
name: "I",
labels: ["Entity"],
attributes: {
nodeType: "Entity",
type: "Pronoun",
source: "onboarding",
},
createdAt: now,
},
// Predicate entity ("am")
predicateNode: {
uuid: `predicate-identity-${timestamp}`,
name: "am",
labels: ["Entity"],
attributes: {
nodeType: "Entity",
type: "Predicate",
source: "onboarding",
},
createdAt: now,
},
// Object entity (DisplayName)
objectNode: {
uuid: `user-${timestamp}`,
name: displayName,
labels: ["Entity"],
attributes: {
nodeType: "Entity",
type: "Person",
source: "onboarding",
},
createdAt: now,
},
// Edges connecting statement to subject, predicate, object
edges: {
hasSubject: {
uuid: `identity-has-subject-${timestamp}`,
type: "HAS_SUBJECT",
source_node_uuid: `identity-statement-${timestamp}`,
target_node_uuid: `pronoun-${timestamp}`,
createdAt: now,
},
hasPredicate: {
uuid: `identity-has-predicate-${timestamp}`,
type: "HAS_PREDICATE",
source_node_uuid: `identity-statement-${timestamp}`,
target_node_uuid: `predicate-identity-${timestamp}`,
createdAt: now,
},
hasObject: {
uuid: `identity-has-object-${timestamp}`,
type: "HAS_OBJECT",
source_node_uuid: `identity-statement-${timestamp}`,
target_node_uuid: `user-${timestamp}`,
createdAt: now,
},
},
};
}
// Create progressive episode content as user answers questions
export function createProgressiveEpisode(
username: string,
answers: OnboardingAnswer[],
): string {
// Start with identity
let episodeContent = `I'm ${username}.`;
// Build episode progressively based on answers
for (const answer of answers) {
const values = Array.isArray(answer.value) ? answer.value : [answer.value];
switch (answer.questionId) {
case "role":
episodeContent += ` I'm a ${values[0]}.`;
break;
case "goal":
episodeContent += ` My primary goal with CORE is to ${values[0].toLowerCase()}.`;
break;
case "tools":
if (values.length === 1) {
episodeContent += ` I use ${values[0]}.`;
} else if (values.length === 2) {
episodeContent += ` I use ${values[0]} and ${values[1]}.`;
} else {
// Create a copy to avoid mutating the original array
const toolsCopy = [...values];
const lastTool = toolsCopy.pop();
episodeContent += ` I use ${toolsCopy.join(", ")}, and ${lastTool}.`;
}
break;
}
}
return episodeContent;
}
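// Illustrative sketch (not part of the original module): for hypothetical
// answers, the episode text is built up one sentence at a time.
//
//   createProgressiveEpisode("Manoj", [
//     { questionId: "role", value: "Developer" },
//     { questionId: "goal", value: "Build a personal memory system" },
//     { questionId: "tools", value: ["Claude", "Cursor", "GitHub"] },
//   ]);
//   // => "I'm Manoj. I'm a Developer. My primary goal with CORE is to build
//   //     a personal memory system. I use Claude, Cursor, and GitHub."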
// Create preview statements for real-time visualization (reified structure)
// Including episode hierarchy: Episode → Statements → Entities
export function createPreviewStatements(
username: string,
answers: OnboardingAnswer[],
): { episode: any; statements: any[] } {
const allStatements: any[] = [];
const now = new Date().toISOString();
const baseTimestamp = Date.now();
// Create the cumulative episode content
const episodeContent = createProgressiveEpisode(username, answers);
// Create episode node that contains all statements
const episode = {
uuid: `onboarding-episode-${baseTimestamp}`,
name: username,
content: episodeContent,
labels: ["Episode"],
attributes: {
nodeType: "Episode",
type: "Episode",
source: "onboarding",
content: episodeContent,
validAt: now,
},
createdAt: now,
};
// Create user entity that will be the subject of all statements
const userEntityId = `user-${baseTimestamp}`;
for (let i = 0; i < answers.length; i++) {
const answer = answers[i];
const values = Array.isArray(answer.value) ? answer.value : [answer.value];
for (let j = 0; j < values.length; j++) {
const value = values[j];
const uniqueId = `${baseTimestamp}-${i}-${j}`;
// Get the relationship mapping for this question
const { predicateType, objectType, factTemplate } = getStatementMapping(
answer.questionId,
);
// Create the statement fact (e.g., "Manoj uses GitHub")
const fact = factTemplate(username, value);
// Create statement visualization as a reified structure
const statement = {
// Statement node (center)
statementNode: {
uuid: `statement-${uniqueId}`,
name: fact,
labels: ["Statement"],
attributes: {
nodeType: "Statement",
type: "Statement",
fact: fact,
source: "onboarding",
validAt: now,
},
createdAt: now,
},
// Subject entity (user)
subjectNode: {
uuid: userEntityId,
name: username,
labels: ["Entity"],
attributes: {
nodeType: "Entity",
type: "Person",
source: "onboarding",
},
createdAt: now,
},
// Predicate entity (relationship type)
predicateNode: {
uuid: `predicate-${predicateType}-${uniqueId}`,
name: predicateType.toLowerCase().replace("_", " "),
labels: ["Entity"],
attributes: {
nodeType: "Entity",
type: "Predicate",
source: "onboarding",
},
createdAt: now,
},
// Object entity (the thing being related to)
objectNode: {
uuid: `object-${uniqueId}`,
name: value,
labels: ["Entity"],
attributes: {
nodeType: "Entity",
type: objectType,
source: "onboarding",
},
createdAt: now,
},
// Edges connecting statement to subject, predicate, object
edges: {
hasSubject: {
uuid: `has-subject-${uniqueId}`,
type: "HAS_SUBJECT",
source_node_uuid: `statement-${uniqueId}`,
target_node_uuid: userEntityId,
createdAt: now,
},
hasPredicate: {
uuid: `has-predicate-${uniqueId}`,
type: "HAS_PREDICATE",
source_node_uuid: `statement-${uniqueId}`,
target_node_uuid: `predicate-${predicateType}-${uniqueId}`,
createdAt: now,
},
hasObject: {
uuid: `has-object-${uniqueId}`,
type: "HAS_OBJECT",
source_node_uuid: `statement-${uniqueId}`,
target_node_uuid: `object-${uniqueId}`,
createdAt: now,
},
// Provenance connection: Episode → Statement
hasProvenance: {
uuid: `provenance-${uniqueId}`,
type: "HAS_PROVENANCE",
source_node_uuid: `statement-${uniqueId}`,
target_node_uuid: episode.uuid,
createdAt: now,
},
},
};
allStatements.push(statement);
}
}
return { episode, statements: allStatements };
}
// Helper function to map question types to statement templates with natural English phrasing
function getStatementMapping(questionId: string): {
predicateType: string;
objectType: string;
factTemplate: (subject: string, object: string) => string;
} {
switch (questionId) {
case "role":
return {
predicateType: "IS_A",
objectType: "Role",
factTemplate: (subject, object) =>
`${subject} is a ${object.toLowerCase()}`,
};
case "goal":
return {
predicateType: "WANTS_TO",
objectType: "Goal",
factTemplate: (subject, object) =>
`${subject} wants to ${object.toLowerCase()}`,
};
case "tools":
return {
predicateType: "USES",
objectType: "Tool",
factTemplate: (subject, object) => `${subject} uses ${object}`,
};
default:
return {
predicateType: "HAS",
objectType: "Attribute",
factTemplate: (subject, object) => `${subject} has ${object}`,
};
}
}
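// Illustrative sketch (not part of the original module): each mapping produces
// the fact string stored on the statement node, e.g.
//
//   getStatementMapping("tools").factTemplate("Manoj", "GitHub");
//   // => "Manoj uses GitHub"
//   getStatementMapping("role").factTemplate("Manoj", "Developer");
//   // => "Manoj is a developer"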
// Create main onboarding episode (client-side preview, no embeddings)
export function createOnboardingEpisode(
username: string,
answers: OnboardingAnswer[],
userId: string,
space?: string,
): EpisodicNode {
// Generate progressive episode content
const episodeContent = createProgressiveEpisode(username, answers);
// Create the main onboarding episode for client preview
const episode: EpisodicNode = {
uuid: crypto.randomUUID(),
content: episodeContent,
originalContent: episodeContent, // Same as content for onboarding
contentEmbedding: [], // Empty placeholder for client-side preview
source: "onboarding",
metadata: {
completedAt: new Date().toISOString(),
questionCount: answers.length,
answersData: answers, // Store original answers for reference
},
createdAt: new Date(),
validAt: new Date(),
labels: ["onboarding", "user-profile"],
userId,
space,
sessionId: crypto.randomUUID(), // Generate unique session for onboarding
};
return episode;
}

View File

@ -0,0 +1,54 @@
import { Provider, type ProviderConfig } from "./types";
export const PROVIDER_CONFIGS: Record<Provider, ProviderConfig> = {
[Provider.CLAUDE_CODE]: {
id: Provider.CLAUDE_CODE,
name: "Claude Code CLI",
description: "Connect your Claude Code CLI to CORE's memory system",
docsUrl: "https://docs.heysol.ai/providers/claude-code",
icon: "claude",
},
[Provider.CLAUDE]: {
id: Provider.CLAUDE,
name: "Claude",
description: "Connect your Claude Desktop app to CORE's memory system",
docsUrl: "https://docs.heysol.ai/providers/claude",
icon: "claude",
},
[Provider.CURSOR]: {
id: Provider.CURSOR,
name: "Cursor",
description: "Connect your Cursor Desktop app to CORE's memory system",
docsUrl: "https://docs.heysol.ai/providers/cursor",
icon: "cursor",
},
[Provider.KILO_CODE]: {
id: Provider.KILO_CODE,
name: "Kilo-Code",
description: "Connect Kilo Code Agent to CORE's memory system via MCP",
docsUrl: "https://docs.heysol.ai/providers/kilo-code",
icon: "kilo-code",
},
[Provider.VSCODE]: {
id: Provider.VSCODE,
name: "VS Code (Github Copilot)",
description: "Connect your VS Code editor to CORE's memory system via MCP",
docsUrl: "https://docs.heysol.ai/providers/vscode",
icon: "vscode",
},
[Provider.ZED]: {
id: Provider.ZED,
name: "Zed",
description: "Connect your Zed editor to CORE's memory system via MCP",
docsUrl: "https://docs.heysol.ai/providers/zed",
icon: "zed",
},
};
export const SUGGESTED_INGESTION_PROMPTS = [
"I'm a full-stack developer working on a React and Node.js application. I prefer TypeScript, functional programming patterns, and writing comprehensive tests.",
"I'm working on a machine learning project using Python and PyTorch. I focus on computer vision and prefer Jupyter notebooks for exploration.",
"I'm a DevOps engineer managing Kubernetes clusters. I work primarily with Terraform, Helm, and CI/CD pipelines using GitHub Actions.",
];

export const VERIFICATION_PROMPT = "Who am I? Tell me what you know about me.";

View File

@ -0,0 +1,89 @@
import { Check, ExternalLink } from "lucide-react";
import { Button } from "../ui";
import { PROVIDER_CONFIGS } from "./provider-config";
import { type Provider } from "./types";
import { getIconForAuthorise } from "../icon-utils";
interface ProviderSelectionStepProps {
selectedProvider?: Provider;
onSelectProvider: (provider: Provider) => void;
onContinue: () => void;
}
export function ProviderSelectionStep({
selectedProvider,
onSelectProvider,
onContinue,
}: ProviderSelectionStepProps) {
const providers = Object.values(PROVIDER_CONFIGS);
return (
<div className="space-y-2">
<div>
<h2 className="mb-2 text-xl font-semibold">Choose Your Provider</h2>
<p className="text-muted-foreground text-sm">
Select the application you'll use to connect with Core
</p>
</div>
<div className="grid grid-cols-1 gap-3 sm:grid-cols-2 lg:grid-cols-3">
{providers.map((provider) => {
const isSelected = selectedProvider === provider.id;
return (
<Button
key={provider.id}
variant="outline"
onClick={() => onSelectProvider(provider.id)}
size="2xl"
className={`relative flex flex-col items-start justify-center gap-1 rounded-lg border-1 border-gray-300 p-4 text-left transition-all ${
isSelected
? "border-primary bg-primary/5"
: "hover:border-primary/50 border-gray-300"
}`}
>
<div className="flex h-full items-center gap-2">
{getIconForAuthorise(provider.icon, 20)}
<div className="flex items-center gap-2">
<h3 className="font-medium">{provider.name}</h3>
</div>
</div>
</Button>
);
})}
</div>
{selectedProvider && (
<div className="bg-grayAlpha-100 space-y-4 rounded-lg p-4">
<div className="space-y-3">
<h3 className="font-medium">Next Steps</h3>
<p className="text-muted-foreground text-sm">
Follow our setup guide to connect{" "}
{PROVIDER_CONFIGS[selectedProvider].name} with Core. Once you've
completed the setup, come back here to continue.
</p>
<a
href={PROVIDER_CONFIGS[selectedProvider].docsUrl}
target="_blank"
rel="noopener noreferrer"
className="bg-primary text-primary-foreground hover:bg-primary/90 inline-flex items-center gap-2 rounded-md px-4 py-2 text-sm font-medium transition-colors"
>
Open Setup Guide
<ExternalLink className="h-4 w-4" />
</a>
</div>
</div>
)}
<div className="flex justify-end">
<Button
onClick={onContinue}
disabled={!selectedProvider}
size="lg"
variant="secondary"
>
Continue to Setup
</Button>
</div>
</div>
);
}

View File

@ -0,0 +1,32 @@
export enum Provider {
CLAUDE_CODE = "claude-code",
CLAUDE = "claude",
CURSOR = "cursor",
KILO_CODE = "kilo-code",
VSCODE = "vscode",
ZED = "zed",
}
export enum OnboardingStep {
PROVIDER_SELECTION = "provider_selection",
FIRST_INGESTION = "first_ingestion",
VERIFICATION = "verification",
COMPLETE = "complete",
}
export interface ProviderConfig {
id: Provider;
name: string;
description: string;
docsUrl: string;
icon: string;
}
export interface OnboardingState {
currentStep: OnboardingStep;
selectedProvider?: Provider;
isConnected: boolean;
ingestionStatus: "idle" | "waiting" | "processing" | "complete" | "error";
verificationResult?: string;
error?: string;
}
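A minimal sketch (not from the diff) of how the OnboardingStep enum and OnboardingState above could drive a linear wizard; the step ordering is inferred from the enum declaration order and is an assumption.
import { OnboardingStep, type OnboardingState } from "./types";

// Assumed linear order of the wizard, taken from the enum declaration above.
const STEP_ORDER: OnboardingStep[] = [
  OnboardingStep.PROVIDER_SELECTION,
  OnboardingStep.FIRST_INGESTION,
  OnboardingStep.VERIFICATION,
  OnboardingStep.COMPLETE,
];

export function advanceStep(state: OnboardingState): OnboardingState {
  const index = STEP_ORDER.indexOf(state.currentStep);
  const currentStep = STEP_ORDER[Math.min(index + 1, STEP_ORDER.length - 1)];
  return { ...state, currentStep, error: undefined };
}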

View File

@ -0,0 +1,101 @@
import { useState } from "react";
import {
Copy,
Check,
AlertCircle,
ThumbsUp,
ThumbsDown,
Loader2,
} from "lucide-react";
import { Button } from "../ui";
import { VERIFICATION_PROMPT } from "./provider-config";
interface VerificationStepProps {
providerName: string;
verificationResult?: string;
isCheckingRecall?: boolean;
onStartChecking: () => void;
onComplete: () => void;
}
export function VerificationStep({
providerName,
verificationResult,
isCheckingRecall = false,
onStartChecking,
onComplete,
}: VerificationStepProps) {
const [copied, setCopied] = useState(false);
const handleCopy = async () => {
await navigator.clipboard.writeText(VERIFICATION_PROMPT);
setCopied(true);
setTimeout(() => setCopied(false), 2000);
};
return (
<div className="space-y-6">
<div>
<h2 className="mb-2 text-xl font-semibold">Verify Your Memory</h2>
<p className="text-muted-foreground text-sm">
Let's test if your memory is working correctly by asking the AI about
you
</p>
</div>
{!verificationResult && !isCheckingRecall && (
<>
<div className="group bg-grayAlpha-100 relative rounded-lg border border-gray-300 p-4">
<p className="mb-1 text-sm font-medium">Copy this prompt:</p>
<p className="pr-10 text-sm">{VERIFICATION_PROMPT}</p>
<button
onClick={handleCopy}
className="hover:bg-background absolute top-3 right-3 rounded-md p-2 transition-colors"
title="Copy to clipboard"
>
{copied ? (
<Check className="h-4 w-4 text-green-500" />
) : (
<Copy className="text-muted-foreground h-4 w-4" />
)}
</button>
</div>
<div className="flex items-center gap-3 rounded-lg border border-blue-500/20 bg-blue-500/10 p-4">
<AlertCircle className="h-5 w-5 shrink-0 text-blue-500" />
<div className="flex-1 text-sm">
<p className="text-blue-600 dark:text-blue-400">
Paste this prompt in {providerName}. Once you've asked, click the
button below so we can detect the recall.
</p>
</div>
</div>
<div className="flex justify-end gap-3">
<Button onClick={onComplete} variant="ghost" size="lg">
Skip Verification
</Button>
<Button onClick={onStartChecking} size="lg" variant="secondary">
I've Asked the Question
</Button>
</div>
</>
)}
{isCheckingRecall && !verificationResult && (
<div className="flex flex-col items-center justify-center space-y-4 py-12">
<Loader2 className="text-primary h-12 w-12 animate-spin" />
<div className="space-y-2 text-center">
<h3 className="text-lg font-medium">
Waiting for your recall query...
</h3>
<p className="text-muted-foreground max-w-md text-sm">
Make sure you've asked "{VERIFICATION_PROMPT}" in {providerName}.
We're listening for the recall.
</p>
</div>
</div>
)}
</div>
);
}

View File

@ -1,4 +1,5 @@
import * as React from "react";
import { useHotkeys } from "react-hotkeys-hook";
import {
Sidebar,
@ -9,22 +10,33 @@ import {
SidebarMenuItem,
} from "../ui/sidebar";
import {
Activity,
Columns3,
Inbox,
LayoutGrid,
LoaderCircle,
MessageSquare,
Network,
Plus,
} from "lucide-react";
import { NavMain } from "./nav-main";
import { useUser } from "~/hooks/useUser";
import { NavUser } from "./nav-user";
import Logo from "../logo/logo";
import { ConversationList } from "../conversation";
import { Button } from "../ui";
import { Project } from "../icons/project";
import { AddMemoryCommand } from "../command-bar/add-memory-command";
import { AddMemoryDialog } from "../command-bar/memory-dialog.client";
const data = {
navMain: [
{
title: "Conversation",
title: "Inbox",
url: "/home/inbox",
icon: Inbox,
},
{
title: "Chat",
url: "/home/conversation",
icon: MessageSquare,
},
@ -36,12 +48,7 @@ const data = {
{
title: "Spaces",
url: "/home/space",
icon: Columns3,
},
{
title: "Activity",
url: "/home/logs",
icon: Activity,
icon: Project,
},
{
title: "Integrations",
@ -54,33 +61,57 @@ const data = {
export function AppSidebar({ ...props }: React.ComponentProps<typeof Sidebar>) {
const user = useUser();
return (
<Sidebar
variant="inset"
{...props}
className="bg-background h-[100vh] py-2"
>
<SidebarHeader>
<SidebarMenu>
<SidebarMenuItem>
<div className="mt-1 ml-1 flex w-full items-center justify-start gap-2">
<Logo width={20} height={20} />
C.O.R.E.
</div>
</SidebarMenuItem>
</SidebarMenu>
</SidebarHeader>
<SidebarContent>
<NavMain items={data.navMain} />
<div className="mt-4 flex h-full flex-col">
<h2 className="text-muted-foreground px-4 text-sm"> History </h2>
<ConversationList />
</div>
</SidebarContent>
const [showAddMemory, setShowAddMemory] = React.useState(false);
<SidebarFooter className="px-2">
<NavUser user={user} />
</SidebarFooter>
</Sidebar>
// Open command bar with Meta+K (Cmd+K on Mac, Ctrl+K on Windows/Linux)
useHotkeys("meta+k", (e) => {
e.preventDefault();
setShowAddMemory(true);
});
return (
<>
<Sidebar
variant="inset"
{...props}
className="bg-background h-[100vh] py-2"
>
<SidebarHeader>
<SidebarMenu>
<SidebarMenuItem className="flex justify-center">
<div className="mt-1 ml-1 flex w-full items-center justify-start gap-2">
<Logo size={20} />
C.O.R.E.
</div>
<Button
variant="secondary"
isActive
size="sm"
className="rounded"
onClick={() => setShowAddMemory(true)}
>
<Plus size={16} />
</Button>
</SidebarMenuItem>
</SidebarMenu>
</SidebarHeader>
<SidebarContent>
<NavMain items={data.navMain} />
<div className="mt-4 flex h-full flex-col">
<h2 className="text-muted-foreground px-4 text-sm"> History </h2>
<ConversationList />
</div>
</SidebarContent>
<SidebarFooter className="flex flex-col px-2">
<NavUser user={user} />
</SidebarFooter>
</Sidebar>
{showAddMemory && (
<AddMemoryDialog open={showAddMemory} onOpenChange={setShowAddMemory} />
)}
</>
);
}

View File

@ -53,7 +53,7 @@ export function NavUser({ user }: { user: ExtendedUser }) {
<DropdownMenuSeparator />
<DropdownMenuItem
className="flex gap-2"
onClick={() => navigate("/settings/api")}
onClick={() => navigate("/settings/account")}
>
<Settings size={16} />
Settings
@ -67,6 +67,15 @@ export function NavUser({ user }: { user: ExtendedUser }) {
</DropdownMenuItem>
</DropdownMenuContent>
</DropdownMenu>
<Button
variant="ghost"
onClick={() => {
navigate("/settings/billing");
}}
>
<div>{user.availableCredits} credits</div>
</Button>
</SidebarMenuItem>
</SidebarMenu>
);

View File

@ -17,8 +17,8 @@ interface SpaceCardProps {
createdAt: string;
updatedAt: string;
autoMode: boolean;
statementCount: number | null;
summary: string | null;
contextCount?: number | null;
themes?: string[];
};
}
@ -46,13 +46,17 @@ export function SpaceCard({ space }: SpaceCardProps) {
</div>
<CardTitle className="text-base">{space.name}</CardTitle>
<CardDescription className="line-clamp-2 text-xs">
{space.description || space.summary || "Knowledge space"}
<p
dangerouslySetInnerHTML={{
__html: space.description || space.summary || "Knowledge space",
}}
></p>
</CardDescription>
<div className="text-muted-foreground mt-2 flex items-center justify-between text-xs">
{space.statementCount && space.statementCount > 0 && (
{space.contextCount && space.contextCount > 0 && (
<div>
{space.statementCount} fact
{space.statementCount !== 1 ? "s" : ""}
{space.contextCount} episode
{space.contextCount !== 1 ? "s" : ""}
</div>
)}
</div>

View File

@ -0,0 +1,167 @@
import { useState, useEffect } from "react";
import { Check, Plus, X } from "lucide-react";
import { Button } from "~/components/ui/button";
import {
Popover,
PopoverContent,
PopoverPortal,
PopoverTrigger,
} from "~/components/ui/popover";
import {
Command,
CommandEmpty,
CommandGroup,
CommandInput,
CommandItem,
CommandList,
} from "~/components/ui/command";
import { Badge } from "~/components/ui/badge";
import { cn } from "~/lib/utils";
import { useFetcher } from "@remix-run/react";
import { Project } from "../icons/project";
interface Space {
id: string;
name: string;
description?: string;
}
interface SpaceDropdownProps {
episodeIds: string[];
selectedSpaceIds?: string[];
onSpaceChange?: (spaceIds: string[]) => void;
className?: string;
}
export function SpaceDropdown({
episodeIds,
selectedSpaceIds = [],
onSpaceChange,
className,
}: SpaceDropdownProps) {
const [open, setOpen] = useState(false);
const [selectedSpaces, setSelectedSpaces] =
useState<string[]>(selectedSpaceIds);
const [spaces, setSpaces] = useState<Space[]>([]);
const spacesFetcher = useFetcher<{ spaces: Space[] }>();
const assignFetcher = useFetcher();
// Fetch all spaces
useEffect(() => {
spacesFetcher.load("/api/v1/spaces");
}, []);
// Update spaces when data is fetched
useEffect(() => {
if (spacesFetcher.data?.spaces) {
setSpaces(spacesFetcher.data.spaces);
}
}, [spacesFetcher.data]);
const handleSpaceToggle = (spaceId: string) => {
const newSelectedSpaces = selectedSpaces.includes(spaceId)
? selectedSpaces.filter((id) => id !== spaceId)
: [...selectedSpaces, spaceId];
setSelectedSpaces(newSelectedSpaces);
if (episodeIds) {
assignFetcher.submit(
{
episodeIds: JSON.stringify(episodeIds),
spaceId,
action: selectedSpaces.includes(spaceId) ? "remove" : "assign",
},
{
method: "post",
action: "/api/v1/episodes/assign-space",
encType: "application/json",
},
);
}
// Call the callback if provided
if (onSpaceChange) {
onSpaceChange(newSelectedSpaces);
}
};
const selectedSpaceObjects = spaces.filter((space) =>
selectedSpaces.includes(space.id),
);
const getTrigger = () => {
if (selectedSpaceObjects?.length === 1) {
return (
<>
<Project size={14} /> {selectedSpaceObjects[0].name}
</>
);
}
if (selectedSpaceObjects?.length > 1) {
return (
<>
<Project size={14} /> {selectedSpaceObjects.length} Spaces
</>
);
}
return (
<>
{" "}
<Project size={14} />
Spaces
</>
);
};
return (
<div className={cn("flex flex-wrap items-center gap-2", className)}>
{/* + button to add more spaces */}
<Popover open={open} onOpenChange={setOpen}>
<PopoverTrigger asChild>
<Button
variant="secondary"
size="sm"
role="combobox"
aria-expanded={open}
className="h-7 gap-1 rounded"
>
{getTrigger()}
</Button>
</PopoverTrigger>
<PopoverPortal>
<PopoverContent className="w-[250px] p-0" align="end">
<Command>
<CommandInput placeholder="Search spaces..." />
<CommandList>
<CommandEmpty>No spaces found.</CommandEmpty>
<CommandGroup>
{spaces.map((space) => (
<CommandItem
key={space.id}
value={space.name}
onSelect={() => handleSpaceToggle(space.id)}
>
<Check
className={cn(
"mr-2 h-4 w-4",
selectedSpaces.includes(space.id)
? "opacity-100"
: "opacity-0",
)}
/>
<div className="flex flex-col">
<span className="text-sm">{space.name}</span>
</div>
</CommandItem>
))}
</CommandGroup>
</CommandList>
</Command>
</PopoverContent>
</PopoverPortal>
</Popover>
</div>
);
}
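For reference, the JSON body the component above submits to /api/v1/episodes/assign-space, reconstructed from the fetcher calls; the server-side handler is not part of this diff, so the parsing behaviour noted in the comments is an assumption.
// Shape of one assign/remove submission, as built in handleSpaceToggle above.
const assignSpaceBody = {
  episodeIds: JSON.stringify(["episode-uuid-1", "episode-uuid-2"]),
  spaceId: "space-uuid",
  action: "assign" as "assign" | "remove",
};

// Submitted with encType "application/json"; note that episodeIds arrives as a
// JSON-encoded string inside the JSON body, so the route presumably parses it
// back into an array before assigning episodes to the space.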

View File

@ -0,0 +1,112 @@
import { EllipsisVertical, Trash } from "lucide-react";
import {
DropdownMenu,
DropdownMenuContent,
DropdownMenuItem,
DropdownMenuTrigger,
} from "../ui/dropdown-menu";
import { Button } from "../ui/button";
import {
AlertDialog,
AlertDialogAction,
AlertDialogCancel,
AlertDialogContent,
AlertDialogDescription,
AlertDialogFooter,
AlertDialogHeader,
AlertDialogTitle,
} from "../ui/alert-dialog";
import { useEffect, useState } from "react";
import { useFetcher, useNavigate } from "@remix-run/react";
import { toast } from "~/hooks/use-toast";
interface SpaceEpisodeActionsProps {
episodeId: string;
spaceId: string;
}
export const SpaceEpisodeActions = ({
episodeId,
spaceId,
}: SpaceEpisodeActionsProps) => {
const [removeDialogOpen, setRemoveDialogOpen] = useState(false);
const removeFetcher = useFetcher();
const navigate = useNavigate();
const handleRemove = () => {
removeFetcher.submit(
{
episodeIds: JSON.stringify([episodeId]),
spaceId,
action: "remove",
},
{
method: "post",
action: "/api/v1/episodes/assign-space",
encType: "application/json",
},
);
setRemoveDialogOpen(false);
};
useEffect(() => {
if (removeFetcher.state === "idle" && removeFetcher.data) {
if (removeFetcher.data.success) {
toast({
title: "Success",
description: "Episode removed from space",
});
// Reload the page to refresh the episode list
navigate(".", { replace: true });
} else {
toast({
title: "Error",
description: removeFetcher.data.error || "Failed to remove episode",
variant: "destructive",
});
}
}
}, [removeFetcher.state, removeFetcher.data, navigate]);
return (
<>
<DropdownMenu>
<DropdownMenuTrigger asChild>
<Button
variant="ghost"
className="h-6 w-6 shrink-0 items-center justify-center p-0 opacity-0 transition-opacity group-hover:opacity-100"
onClick={(e) => e.stopPropagation()}
>
<EllipsisVertical size={16} />
</Button>
</DropdownMenuTrigger>
<DropdownMenuContent align="end" onClick={(e) => e.stopPropagation()}>
<DropdownMenuItem onClick={() => setRemoveDialogOpen(true)}>
<Button variant="link" size="sm" className="gap-2 rounded">
<Trash size={15} /> Remove from space
</Button>
</DropdownMenuItem>
</DropdownMenuContent>
</DropdownMenu>
<AlertDialog open={removeDialogOpen} onOpenChange={setRemoveDialogOpen}>
<AlertDialogContent onClick={(e) => e.stopPropagation()}>
<AlertDialogHeader>
<AlertDialogTitle>Remove from space</AlertDialogTitle>
<AlertDialogDescription>
Are you sure you want to remove this episode from the space? This
will not delete the episode itself.
</AlertDialogDescription>
</AlertDialogHeader>
<AlertDialogFooter>
<AlertDialogCancel>Cancel</AlertDialogCancel>
<AlertDialogAction onClick={handleRemove}>
Remove
</AlertDialogAction>
</AlertDialogFooter>
</AlertDialogContent>
</AlertDialog>
</>
);
};

View File

@ -2,12 +2,30 @@ import { Calendar } from "lucide-react";
import { Badge } from "~/components/ui/badge";
import type { StatementNode } from "@core/types";
import { cn } from "~/lib/utils";
import { useNavigate } from "@remix-run/react";
import Markdown from "react-markdown";
import { StyledMarkdown } from "../common/styled-markdown";
import { SpaceEpisodeActions } from "./space-episode-actions";
interface SpaceFactCardProps {
fact: StatementNode;
export interface Episode {
uuid: string;
content: string;
originalContent: string;
source: any;
createdAt: Date;
validAt: Date;
metadata: any;
sessionId: any;
logId?: any;
}
export function SpaceFactCard({ fact }: SpaceFactCardProps) {
interface SpaceFactCardProps {
episode: Episode;
spaceId: string;
}
export function SpaceEpisodeCard({ episode, spaceId }: SpaceFactCardProps) {
const navigate = useNavigate();
const formatDate = (date: Date | string) => {
const d = new Date(date);
return d.toLocaleDateString("en-US", {
@ -17,18 +35,20 @@ export function SpaceFactCard({ fact }: SpaceFactCardProps) {
});
};
const displayText = fact.fact;
const displayText = episode.originalContent;
const recallCount =
(fact.recallCount?.high ?? 0) + (fact.recallCount?.low ?? 0);
const onClick = () => {
navigate(`/home/inbox/${episode.logId}`);
};
return (
<>
<div className="flex w-full items-center px-5 pr-2">
<div className="group flex w-full items-center px-5 pr-2">
<div
className={cn(
"group-hover:bg-grayAlpha-100 flex min-w-[0px] shrink grow items-start gap-2 rounded-md px-3",
"group-hover:bg-grayAlpha-100 flex min-w-[0px] shrink grow cursor-pointer items-start gap-2 rounded-md px-3",
)}
onClick={onClick}
>
<div
className={cn(
@ -37,19 +57,14 @@ export function SpaceFactCard({ fact }: SpaceFactCardProps) {
>
<div className="flex w-full items-center justify-between gap-4">
<div className="inline-flex min-h-[24px] min-w-[0px] shrink items-center justify-start">
<div className={cn("truncate text-left")}>{displayText}</div>
<StyledMarkdown>{displayText.slice(0, 300)}</StyledMarkdown>
</div>
<div className="text-muted-foreground flex shrink-0 items-center justify-end gap-2 text-xs">
{!!recallCount && <span>Recalled: {recallCount} times</span>}
<Badge variant="secondary" className="rounded text-xs">
<Calendar className="h-3 w-3" />
{formatDate(fact.validAt)}
{formatDate(episode.validAt)}
</Badge>
{fact.invalidAt && (
<Badge variant="destructive" className="rounded text-xs">
Invalid since {formatDate(fact.invalidAt)}
</Badge>
)}
<SpaceEpisodeActions episodeId={episode.uuid} spaceId={spaceId} />
</div>
</div>
</div>

View File

@ -9,7 +9,7 @@ import {
} from "~/components/ui/popover";
import { Badge } from "~/components/ui/badge";
interface SpaceFactsFiltersProps {
interface SpaceEpisodesFiltersProps {
selectedValidDate?: string;
selectedSpaceFilter?: string;
onValidDateChange: (date?: string) => void;
@ -22,34 +22,24 @@ const validDateOptions = [
{ value: "last_6_months", label: "Last 6 Months" },
];
const spaceFilterOptions = [
{ value: "active", label: "Active Facts" },
{ value: "archived", label: "Archived Facts" },
{ value: "all", label: "All Facts" },
];
type FilterStep = "main" | "validDate";
type FilterStep = "main" | "validDate" | "spaceFilter";
export function SpaceFactsFilters({
export function SpaceEpisodesFilters({
selectedValidDate,
selectedSpaceFilter,
onValidDateChange,
onSpaceFilterChange,
}: SpaceFactsFiltersProps) {
}: SpaceEpisodesFiltersProps) {
const [popoverOpen, setPopoverOpen] = useState(false);
const [step, setStep] = useState<FilterStep>("main");
const selectedValidDateLabel = validDateOptions.find(
(d) => d.value === selectedValidDate,
)?.label;
const selectedSpaceFilterLabel = spaceFilterOptions.find(
(f) => f.value === selectedSpaceFilter,
)?.label;
const hasFilters = selectedValidDate || selectedSpaceFilter;
return (
<div className="mb-2 flex w-full items-center justify-start gap-2 px-5">
<>
<Popover
open={popoverOpen}
onOpenChange={(open) => {
@ -79,13 +69,6 @@ export function SpaceFactsFilters({
>
Valid Date
</Button>
<Button
variant="ghost"
className="justify-start"
onClick={() => setStep("spaceFilter")}
>
Status
</Button>
</div>
)}
@ -122,40 +105,6 @@ export function SpaceFactsFilters({
))}
</div>
)}
{step === "spaceFilter" && (
<div className="flex flex-col gap-1 p-2">
<Button
variant="ghost"
className="w-full justify-start"
onClick={() => {
onSpaceFilterChange(undefined);
setPopoverOpen(false);
setStep("main");
}}
>
All Facts
</Button>
{spaceFilterOptions.map((option) => (
<Button
key={option.value}
variant="ghost"
className="w-full justify-start"
onClick={() => {
onSpaceFilterChange(
option.value === selectedSpaceFilter
? undefined
: option.value,
);
setPopoverOpen(false);
setStep("main");
}}
>
{option.label}
</Button>
))}
</div>
)}
</PopoverContent>
</PopoverPortal>
</Popover>
@ -172,17 +121,8 @@ export function SpaceFactsFilters({
/>
</Badge>
)}
{selectedSpaceFilter && (
<Badge variant="secondary" className="h-7 gap-1 rounded px-2">
{selectedSpaceFilterLabel}
<X
className="hover:text-destructive h-3.5 w-3.5 cursor-pointer"
onClick={() => onSpaceFilterChange(undefined)}
/>
</Badge>
)}
</div>
)}
</div>
</>
);
}

View File

@ -9,25 +9,26 @@ import {
} from "react-virtualized";
import { Database } from "lucide-react";
import { Card, CardContent } from "~/components/ui/card";
import type { StatementNode } from "@core/types";
import { ScrollManagedList } from "../virtualized-list";
import { SpaceFactCard } from "./space-fact-card";
import { type Episode, SpaceEpisodeCard } from "./space-episode-card";
interface SpaceFactsListProps {
facts: any[];
interface SpaceEpisodesListProps {
episodes: any[];
hasMore: boolean;
loadMore: () => void;
isLoading: boolean;
height?: number;
spaceId: string;
}
function FactItemRenderer(
function EpisodeItemRenderer(
props: ListRowProps,
facts: StatementNode[],
episodes: Episode[],
cache: CellMeasurerCache,
spaceId: string,
) {
const { index, key, style, parent } = props;
const fact = facts[index];
const episode = episodes[index];
return (
<CellMeasurer
@ -38,23 +39,24 @@ function FactItemRenderer(
rowIndex={index}
>
<div key={key} style={style} className="pb-2">
<SpaceFactCard fact={fact} />
<SpaceEpisodeCard episode={episode} spaceId={spaceId} />
</div>
</CellMeasurer>
);
}
export function SpaceFactsList({
facts,
export function SpaceEpisodesList({
episodes,
hasMore,
loadMore,
isLoading,
}: SpaceFactsListProps) {
spaceId,
}: SpaceEpisodesListProps) {
// Create a CellMeasurerCache instance using useRef to prevent recreation
const cacheRef = useRef<CellMeasurerCache | null>(null);
if (!cacheRef.current) {
cacheRef.current = new CellMeasurerCache({
defaultHeight: 200, // Default row height for fact cards
defaultHeight: 200, // Default row height for episode cards
fixedWidth: true, // Rows have fixed width but dynamic height
});
}
@ -62,17 +64,17 @@ export function SpaceFactsList({
useEffect(() => {
cache.clearAll();
}, [facts, cache]);
}, [episodes, cache]);
if (facts.length === 0 && !isLoading) {
if (episodes.length === 0 && !isLoading) {
return (
<Card className="bg-background-2 w-full">
<CardContent className="bg-background-2 flex w-full items-center justify-center py-16">
<div className="text-center">
<Database className="text-muted-foreground mx-auto mb-4 h-12 w-12" />
<h3 className="mb-2 text-lg font-semibold">No facts found</h3>
<h3 className="mb-2 text-lg font-semibold">No Episodes found</h3>
<p className="text-muted-foreground">
This space doesn't contain any facts yet.
This space doesn't contain any episodes yet.
</p>
</div>
</CardContent>
@ -81,7 +83,7 @@ export function SpaceFactsList({
}
const isRowLoaded = ({ index }: { index: number }) => {
return !!facts[index];
return !!episodes[index];
};
const loadMoreRows = async () => {
@ -92,14 +94,14 @@ export function SpaceFactsList({
};
const rowRenderer = (props: ListRowProps) => {
return FactItemRenderer(props, facts, cache);
return EpisodeItemRenderer(props, episodes, cache, spaceId);
};
const rowHeight = ({ index }: Index) => {
return cache.getHeight(index, 0);
};
const itemCount = hasMore ? facts.length + 1 : facts.length;
const itemCount = hasMore ? episodes.length + 1 : episodes.length;
return (
<div className="h-full grow overflow-hidden rounded-lg">
@ -131,7 +133,7 @@ export function SpaceFactsList({
{isLoading && (
<div className="text-muted-foreground p-4 text-center text-sm">
Loading more facts...
Loading more episodes...
</div>
)}
</div>

View File

@ -1,4 +1,4 @@
import { EllipsisVertical, RefreshCcw, Trash, Edit } from "lucide-react";
import { EllipsisVertical, RefreshCcw, Trash, Edit, Copy } from "lucide-react";
import {
DropdownMenu,
DropdownMenuContent,
@ -19,6 +19,7 @@ import {
import { useEffect, useState } from "react";
import { useFetcher, useNavigate } from "@remix-run/react";
import { EditSpaceDialog } from "./edit-space-dialog.client";
import { toast } from "~/hooks/use-toast";
interface SpaceOptionsProps {
id: string;
@ -64,6 +65,23 @@ export const SpaceOptions = ({ id, name, description }: SpaceOptionsProps) => {
// revalidator.revalidate();
};
const handleCopy = async () => {
try {
await navigator.clipboard.writeText(id);
toast({
title: "Copied",
description: "Space ID copied to clipboard",
});
} catch (err) {
console.error("Failed to copy:", err);
toast({
title: "Error",
description: "Failed to copy ID",
variant: "destructive",
});
}
};
return (
<>
<DropdownMenu>
@ -79,6 +97,11 @@ export const SpaceOptions = ({ id, name, description }: SpaceOptionsProps) => {
</DropdownMenuTrigger>
<DropdownMenuContent align="end">
<DropdownMenuItem onClick={handleCopy}>
<Button variant="link" size="sm" className="gap-2 rounded">
<Copy size={15} /> Copy Id
</Button>
</DropdownMenuItem>
<DropdownMenuItem onClick={() => setEditDialogOpen(true)}>
<Button variant="link" size="sm" className="gap-2 rounded">
<Edit size={15} /> Edit

View File

@ -6,6 +6,7 @@ import { useState } from "react";
import { Dialog, DialogContent, DialogHeader, DialogTitle } from "../ui/dialog";
import { Button } from "../ui";
import { useFetcher } from "@remix-run/react";
import { getTailwindColor, getTeamColor } from "../ui/color-utils";
interface SpacePatternCardProps {
pattern: SpacePattern;
@ -46,10 +47,22 @@ export function SpacePatternCard({ pattern }: SpacePatternCardProps) {
<div className={cn("truncate text-left")}>{displayText}</div>
</div>
<div className="text-muted-foreground flex shrink-0 items-center justify-end gap-2 text-xs">
<Badge variant="secondary" className="rounded text-xs">
<Badge
variant="secondary"
className="rounded text-xs"
style={{
color: getTailwindColor(pattern.type),
}}
>
{pattern.type}
</Badge>
<Badge variant="secondary" className="rounded text-xs">
<Badge
variant="secondary"
className="rounded text-xs"
style={{
color: getTailwindColor(pattern.name),
}}
>
{pattern.name}
</Badge>
</div>
@ -66,10 +79,22 @@ export function SpacePatternCard({ pattern }: SpacePatternCardProps) {
<div className="flex flex-col gap-2">
<div className="flex gap-2">
<Badge variant="secondary" className="rounded text-xs">
<Badge
variant="secondary"
className="rounded text-xs"
style={{
color: getTailwindColor(pattern.type),
}}
>
{pattern.type}
</Badge>
<Badge variant="secondary" className="rounded text-xs">
<Badge
variant="secondary"
className="rounded text-xs"
style={{
color: getTailwindColor(pattern.name),
}}
>
{pattern.name}
</Badge>
</div>

View File

@ -9,8 +9,8 @@ interface SpacesGridProps {
createdAt: string;
updatedAt: string;
autoMode: boolean;
statementCount: number | null;
summary: string | null;
contextCount?: number | null;
themes?: string[];
}>;
}

View File

@ -4,10 +4,15 @@ import React from "react";
import { cn } from "../../lib/utils";
interface CheckBoxProps
extends React.ComponentPropsWithoutRef<typeof CheckboxPrimitive.Root> {
checkboxClassname?: string;
}
const Checkbox = React.forwardRef<
React.ElementRef<typeof CheckboxPrimitive.Root>,
React.ComponentPropsWithoutRef<typeof CheckboxPrimitive.Root>
>(({ className, ...props }, ref) => (
CheckBoxProps
>(({ className, checkboxClassname, ...props }, ref) => (
<CheckboxPrimitive.Root
ref={ref}
className={cn(
@ -19,7 +24,7 @@ const Checkbox = React.forwardRef<
<CheckboxPrimitive.Indicator
className={cn("flex items-center justify-center text-white")}
>
<CheckIcon className="h-3 w-3" />
<CheckIcon className={cn("h-3 w-3", checkboxClassname)} />
</CheckboxPrimitive.Indicator>
</CheckboxPrimitive.Root>
));

View File

@ -40,7 +40,7 @@ const CommandDialog = ({
<Dialog {...props}>
<DialogContent className={cn("overflow-hidden p-0 font-sans")}>
<Command
className="[&_[cmdk-group-heading]]:text-muted-foreground [&_[cmdk-group-heading]]:font-medium [&_[cmdk-group]:not([hidden])_~[cmdk-group]]:pt-0 [&_[cmdk-input-wrapper]_svg]:h-5 [&_[cmdk-input-wrapper]_svg]:w-5 [&_[cmdk-input]]:h-12 [&_[cmdk-item]]:px-2 [&_[cmdk-item]]:py-3 [&_[cmdk-item]_svg]:h-5 [&_[cmdk-item]_svg]:w-5"
className="[&_[cmdk-group-heading]]:text-muted-foreground [&_[cmdk-group-heading]]:font-medium [&_[cmdk-group]:not([hidden])_~[cmdk-group]]:pt-0 [&_[cmdk-input-wrapper]_svg]:h-5 [&_[cmdk-input-wrapper]_svg]:w-5 [&_[cmdk-input]]:h-10 [&_[cmdk-item]]:px-2 [&_[cmdk-item]]:py-2 [&_[cmdk-item]_svg]:h-5 [&_[cmdk-item]_svg]:w-5"
{...commandProps}
>
{children}
@ -141,7 +141,7 @@ const CommandItem = React.forwardRef<
<CommandPrimitive.Item
ref={ref}
className={cn(
"command-item aria-selected:bg-accent aria-selected:text-accent-foreground relative flex cursor-default items-center rounded-sm px-2 py-1 outline-none select-none data-[disabled]:pointer-events-none data-[disabled]:opacity-50",
"command-item aria-selected:bg-accent aria-selected:text-accent-foreground relative flex cursor-default items-center rounded px-2 py-1 outline-none select-none data-[disabled]:pointer-events-none data-[disabled]:opacity-50",
className,
)}
{...props}

View File

@ -2,3 +2,5 @@ export * from "./button";
export * from "./tabs";
export * from "./input";
export * from "./scrollarea";
export * from "./toast";
export * from "./toaster";

View File

@ -0,0 +1,52 @@
import * as ProgressPrimitive from "@radix-ui/react-progress";
import * as React from "react";
import { cn } from "~/lib/utils";
interface ProgressSegment {
value: number;
}
type Props = React.ComponentPropsWithoutRef<typeof ProgressPrimitive.Root> & {
color?: string;
segments: ProgressSegment[];
};
const Progress = React.forwardRef<
React.ElementRef<typeof ProgressPrimitive.Root>,
Props
>(({ className, segments, color, ...props }, ref) => {
const sortedSegments = segments.sort((a, b) => b.value - a.value);
return (
<ProgressPrimitive.Root
ref={ref}
className={cn("relative h-2 w-full overflow-hidden rounded", className)}
style={{
backgroundColor: `${color}33`,
}}
{...props}
>
{sortedSegments.map((segment, index) => (
<ProgressPrimitive.Indicator
key={index}
className="bg-primary absolute top-0 h-full transition-all"
style={{
width: `${segment.value}%`,
left: "0%",
backgroundColor: `${color}${Math.round(
90 + ((100 - 30) * index) / (sortedSegments.length - 1),
)
.toString(16)
.padStart(2, "0")}`,
zIndex: sortedSegments.length - index,
}}
/>
))}
</ProgressPrimitive.Root>
);
});
Progress.displayName = "Progress";
export { Progress };
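A hedged usage sketch of the segmented Progress component above. The import path is an assumption; the base color should be a 6-digit hex string because the component appends a computed two-digit alpha byte per segment.
import { Progress } from "~/components/ui/progress";

export function CreditsUsage() {
  return (
    <Progress
      color="#6366f1"
      // Segments are sorted by value inside the component, and each indicator gets
      // an alpha byte derived from its index appended to the base color. Pass at
      // least two segments here: with a single segment the index formula above
      // divides by zero and yields an invalid color.
      segments={[{ value: 80 }, { value: 45 }, { value: 10 }]}
    />
  );
}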

View File

@ -30,7 +30,7 @@ const ResizableHandle = ({
}) => (
<ResizablePrimitive.PanelResizeHandle
className={cn(
"bg-background-1 focus-visible:ring-ring relative flex w-px items-center justify-center after:absolute after:inset-y-0 after:left-1/2 after:w-1 after:-translate-x-1/2 focus-visible:ring-1 focus-visible:ring-offset-1 focus-visible:outline-none data-[panel-group-direction=vertical]:h-px data-[panel-group-direction=vertical]:w-full data-[panel-group-direction=vertical]:after:left-0 data-[panel-group-direction=vertical]:after:h-1 data-[panel-group-direction=vertical]:after:w-full data-[panel-group-direction=vertical]:after:translate-x-0 data-[panel-group-direction=vertical]:after:-translate-y-1/2 [&[data-panel-group-direction=vertical]>div]:rotate-90",
"focus-visible:ring-ring relative flex w-px items-center justify-center bg-gray-300 after:absolute after:inset-y-0 after:left-1/2 after:w-1 after:-translate-x-1/2 focus-visible:ring-1 focus-visible:ring-offset-1 focus-visible:outline-none data-[panel-group-direction=vertical]:h-px data-[panel-group-direction=vertical]:w-full data-[panel-group-direction=vertical]:after:left-0 data-[panel-group-direction=vertical]:after:h-1 data-[panel-group-direction=vertical]:after:w-full data-[panel-group-direction=vertical]:after:translate-x-0 data-[panel-group-direction=vertical]:after:-translate-y-1/2 [&[data-panel-group-direction=vertical]>div]:rotate-90",
className,
)}
{...props}

View File

@ -0,0 +1,133 @@
import { Cross2Icon } from "@radix-ui/react-icons";
import * as ToastPrimitives from "@radix-ui/react-toast";
import { cva, type VariantProps } from "class-variance-authority";
import React from "react";
import { cn } from "../../lib/utils";
const ToastProvider = ToastPrimitives.Provider;
const ToastViewport = React.forwardRef<
React.ElementRef<typeof ToastPrimitives.Viewport>,
React.ComponentPropsWithoutRef<typeof ToastPrimitives.Viewport>
>(({ className, ...props }, ref) => (
<ToastPrimitives.Viewport
ref={ref}
className={cn(
"fixed top-0 z-[100] flex max-h-screen w-full flex-col-reverse p-4 sm:top-auto sm:right-0 sm:bottom-0 sm:flex-col md:max-w-[420px]",
className,
)}
{...props}
/>
));
ToastViewport.displayName = ToastPrimitives.Viewport.displayName;
const toastVariants = cva(
"group pointer-events-auto relative flex w-full items-center justify-between space-x-2 overflow-hidden rounded-md border p-3 pr-6 shadow-lg transition-all data-[swipe=cancel]:translate-x-0 data-[swipe=end]:translate-x-[var(--radix-toast-swipe-end-x)] data-[swipe=move]:translate-x-[var(--radix-toast-swipe-move-x)] data-[swipe=move]:transition-none data-[state=open]:animate-in data-[state=closed]:animate-out data-[swipe=end]:animate-out data-[state=closed]:fade-out-80 data-[state=closed]:slide-out-to-right-full data-[state=open]:slide-in-from-top-full data-[state=open]:sm:slide-in-from-bottom-full",
{
variants: {
variant: {
default: "border bg-background text-foreground",
warning: "warning group border-warning bg-warning text-foreground",
success: "success group border-success bg-success text-foreground",
destructive:
"destructive group border-destructive bg-destructive text-foreground",
},
},
defaultVariants: {
variant: "default",
},
},
);
const Toast = React.forwardRef<
React.ElementRef<typeof ToastPrimitives.Root>,
React.ComponentPropsWithoutRef<typeof ToastPrimitives.Root> &
VariantProps<typeof toastVariants>
>(({ className, variant, ...props }, ref) => {
return (
<ToastPrimitives.Root
ref={ref}
className={cn(
toastVariants({ variant }),
className,
"shadow-1 rounded-md border-0 bg-gray-100 font-sans backdrop-blur-md",
)}
{...props}
/>
);
});
Toast.displayName = ToastPrimitives.Root.displayName;
const ToastAction = React.forwardRef<
React.ElementRef<typeof ToastPrimitives.Action>,
React.ComponentPropsWithoutRef<typeof ToastPrimitives.Action>
>(({ className, ...props }, ref) => (
<ToastPrimitives.Action
ref={ref}
className={cn(
"hover:bg-secondary focus:ring-ring group-[.destructive]:border-muted/40 group-[.destructive]:hover:border-destructive/30 group-[.destructive]:hover:bg-destructive group-[.destructive]:hover:text-destructive-foreground group-[.destructive]:focus:ring-destructive inline-flex h-8 shrink-0 items-center justify-center rounded-md border bg-transparent px-3 text-sm font-medium transition-colors focus:ring-1 focus:outline-none disabled:pointer-events-none disabled:opacity-50",
className,
)}
{...props}
/>
));
ToastAction.displayName = ToastPrimitives.Action.displayName;
const ToastClose = React.forwardRef<
React.ElementRef<typeof ToastPrimitives.Close>,
React.ComponentPropsWithoutRef<typeof ToastPrimitives.Close>
>(({ className, ...props }, ref) => (
<ToastPrimitives.Close
ref={ref}
className={cn(
"text-foreground/50 hover:text-foreground absolute top-1 right-1 rounded-md p-1 opacity-0 transition-opacity group-hover:opacity-100 group-[.destructive]:text-red-300 group-[.destructive]:hover:text-red-50 focus:opacity-100 focus:ring-1 focus:outline-none group-[.destructive]:focus:ring-red-400 group-[.destructive]:focus:ring-offset-red-600",
className,
)}
toast-close=""
{...props}
>
<Cross2Icon className="h-4 w-4" />
</ToastPrimitives.Close>
));
ToastClose.displayName = ToastPrimitives.Close.displayName;
const ToastTitle = React.forwardRef<
React.ElementRef<typeof ToastPrimitives.Title>,
React.ComponentPropsWithoutRef<typeof ToastPrimitives.Title>
>(({ className, ...props }, ref) => (
<ToastPrimitives.Title
ref={ref}
className={cn("font-medium [&+div]:text-xs", className)}
{...props}
/>
));
ToastTitle.displayName = ToastPrimitives.Title.displayName;
const ToastDescription = React.forwardRef<
React.ElementRef<typeof ToastPrimitives.Description>,
React.ComponentPropsWithoutRef<typeof ToastPrimitives.Description>
>(({ className, ...props }, ref) => (
<ToastPrimitives.Description
ref={ref}
className={cn("opacity-90", className)}
{...props}
/>
));
ToastDescription.displayName = ToastPrimitives.Description.displayName;
type ToastProps = React.ComponentPropsWithoutRef<typeof Toast>;
type ToastActionElement = React.ReactElement<typeof ToastAction>;
export {
type ToastProps,
type ToastActionElement,
ToastProvider,
ToastViewport,
Toast,
ToastTitle,
ToastDescription,
ToastClose,
ToastAction,
};

View File

@ -0,0 +1,33 @@
import {
Toast,
ToastClose,
ToastDescription,
ToastProvider,
ToastTitle,
ToastViewport,
} from "~/components/ui/toast";
import { useToast } from "~/hooks/use-toast";
export function Toaster() {
const { toasts } = useToast();
return (
<ToastProvider>
{toasts.map(function ({ id, title, description, action, ...props }) {
return (
<Toast key={id} {...props}>
<div className="grid gap-1">
{title && <ToastTitle>{title}</ToastTitle>}
{description && (
<ToastDescription>{description}</ToastDescription>
)}
</div>
{action}
<ToastClose />
</Toast>
);
})}
<ToastViewport />
</ToastProvider>
);
}

View File

@ -149,7 +149,7 @@ export const ScrollAreaWithAutoScroll = ({
className?: string;
}) => {
const { scrollRef } = useAutoScroll({
smooth: true,
smooth: false,
content: children,
});
@ -161,7 +161,7 @@ export const ScrollAreaWithAutoScroll = ({
className,
)}
>
<div className="flex h-full w-full max-w-[97ch] flex-col pb-4">
<div className="flex h-full w-full max-w-[80ch] flex-col pb-4">
{children}
</div>
</div>

View File

@ -0,0 +1,120 @@
/**
* Billing Configuration
*
* This file centralizes all billing-related configuration.
* Billing is feature-flagged and can be disabled for self-hosted instances.
*/
export const BILLING_CONFIG = {
// Feature flag: Enable/disable billing system
// Self-hosted instances can set this to false for unlimited usage
enabled: process.env.ENABLE_BILLING === "true",
// Stripe configuration (only used if billing is enabled)
stripe: {
secretKey: process.env.STRIPE_SECRET_KEY,
publishableKey: process.env.STRIPE_PUBLISHABLE_KEY,
webhookSecret: process.env.STRIPE_WEBHOOK_SECRET,
meterEventName: process.env.STRIPE_METER_EVENT_NAME || "echo_credits_used",
},
// Plan configurations
plans: {
free: {
name: "Free",
monthlyCredits: parseInt(process.env.FREE_PLAN_CREDITS || "200", 10),
enableOverage: false,
features: {
episodesPerMonth: 200,
searchesPerMonth: 200,
mcpIntegrations: 3,
},
},
pro: {
name: "Pro",
monthlyCredits: parseInt(process.env.PRO_PLAN_CREDITS || "2000", 10),
enableOverage: true,
overagePrice: parseFloat(process.env.PRO_OVERAGE_PRICE || "0.01"), // $0.01 per credit
stripePriceId: process.env.PRO_PLAN_STRIPE_PRICE_ID,
features: {
episodesPerMonth: 2000,
searchesPerMonth: 2000,
mcpIntegrations: -1, // unlimited
prioritySupport: true,
},
},
max: {
name: "Max",
monthlyCredits: parseInt(process.env.MAX_PLAN_CREDITS || "10000", 10),
enableOverage: true,
overagePrice: parseFloat(process.env.MAX_OVERAGE_PRICE || "0.008"), // $0.008 per credit (cheaper than pro)
stripePriceId: process.env.MAX_PLAN_STRIPE_PRICE_ID,
features: {
episodesPerMonth: 10000,
searchesPerMonth: 10000,
mcpIntegrations: -1, // unlimited
prioritySupport: true,
customIntegrations: true,
dedicatedSupport: true,
},
},
},
// Credit costs per operation
creditCosts: {
addEpisode: parseInt(process.env.CREDIT_COST_EPISODE || "1", 10),
search: parseInt(process.env.CREDIT_COST_SEARCH || "1", 10),
chatMessage: parseInt(process.env.CREDIT_COST_CHAT || "1", 10),
},
// Billing cycle settings
billingCycle: {
// When to reset credits (1st of each month by default)
resetDay: parseInt(process.env.BILLING_RESET_DAY || "1", 10),
},
} as const;
/**
* Get plan configuration by plan type
*/
export function getPlanConfig(planType: "FREE" | "PRO" | "MAX") {
return BILLING_CONFIG.plans[
planType.toLowerCase() as keyof typeof BILLING_CONFIG.plans
];
}
/**
* Check if billing is enabled
*/
export function isBillingEnabled(): boolean {
return BILLING_CONFIG.enabled;
}
/**
* Check if Stripe is configured
*/
export function isStripeConfigured(): boolean {
return !!(
BILLING_CONFIG.stripe.secretKey && BILLING_CONFIG.stripe.publishableKey
);
}
/**
* Validate billing configuration
*/
export function validateBillingConfig() {
if (!BILLING_CONFIG.enabled) {
console.log(
" Billing is disabled. Running in self-hosted mode with unlimited credits.",
);
return;
}
if (!isStripeConfigured()) {
console.warn(
"⚠️ ENABLE_BILLING is true but Stripe is not configured. Billing will not work.",
);
}
console.log("✅ Billing is enabled with Stripe integration");
}
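A minimal sketch (not from the diff) of how the helpers above might gate a credit-consuming operation. The import path and the canSpendSearchCredit function are illustrative assumptions; the actual credit accounting lives elsewhere in the repo.
import { BILLING_CONFIG, getPlanConfig, isBillingEnabled } from "~/config/billing";

export function canSpendSearchCredit(
  plan: "FREE" | "PRO" | "MAX",
  creditsUsedThisCycle: number,
): boolean {
  // Self-hosted instances with billing disabled get unlimited usage.
  if (!isBillingEnabled()) return true;

  const planConfig = getPlanConfig(plan);
  const cost = BILLING_CONFIG.creditCosts.search;

  if (creditsUsedThisCycle + cost <= planConfig.monthlyCredits) return true;

  // Paid plans may exceed the monthly allowance at their overage price.
  return planConfig.enableOverage;
}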

View File

@ -2,12 +2,9 @@ import { Prisma, PrismaClient } from "@core/database";
import invariant from "tiny-invariant";
import { z } from "zod";
import { env } from "./env.server";
import { logger } from "./services/logger.service";
import { isValidDatabaseUrl } from "./utils/db";
import { singleton } from "./utils/singleton";
import { type Span } from "@opentelemetry/api";
export { Prisma };
export const prisma = singleton("prisma", getClient);

View File

@ -17,6 +17,7 @@ import { renderToPipeableStream } from "react-dom/server";
import { initializeStartupServices } from "./utils/startup";
import { handleMCPRequest, handleSessionRequest } from "~/services/mcp.server";
import { authenticateHybridRequest } from "~/services/routeBuilders/apiBuilder.server";
import { trackError } from "~/services/telemetry.server";
const ABORT_DELAY = 5_000;
@ -27,6 +28,42 @@ async function init() {
init();
/**
* Global error handler for all server-side errors
* This catches errors from loaders, actions, and rendering
* Automatically tracks all errors to telemetry
*/
export function handleError(
error: unknown,
{ request }: { request: Request },
): void {
// Don't track 404s or aborted requests as errors
if (
error instanceof Response &&
(error.status === 404 || error.status === 304)
) {
return;
}
// Track error to telemetry
if (error instanceof Error) {
const url = new URL(request.url);
trackError(error, {
url: request.url,
path: url.pathname,
method: request.method,
userAgent: request.headers.get("user-agent") || "unknown",
referer: request.headers.get("referer") || undefined,
}).catch((trackingError) => {
// If telemetry tracking fails, just log it - don't break the app
console.error("Failed to track error:", trackingError);
});
}
// Always log to console for development/debugging
console.error(error);
}
export default function handleRequest(
request: Request,
responseStatusCode: number,

View File

@ -3,87 +3,146 @@ import { isValidDatabaseUrl } from "./utils/db";
import { isValidRegex } from "./utils/regex";
import { LLMModelEnum } from "@core/types";
const EnvironmentSchema = z.object({
NODE_ENV: z.union([
z.literal("development"),
z.literal("production"),
z.literal("test"),
]),
POSTGRES_DB: z.string(),
DATABASE_URL: z
.string()
.refine(
isValidDatabaseUrl,
"DATABASE_URL is invalid, for details please check the additional output above this message.",
),
DATABASE_CONNECTION_LIMIT: z.coerce.number().int().default(10),
DATABASE_POOL_TIMEOUT: z.coerce.number().int().default(60),
DATABASE_CONNECTION_TIMEOUT: z.coerce.number().int().default(20),
DIRECT_URL: z
.string()
.refine(
isValidDatabaseUrl,
"DIRECT_URL is invalid, for details please check the additional output above this message.",
),
DATABASE_READ_REPLICA_URL: z.string().optional(),
SESSION_SECRET: z.string(),
ENCRYPTION_KEY: z.string(),
MAGIC_LINK_SECRET: z.string(),
WHITELISTED_EMAILS: z
.string()
.refine(isValidRegex, "WHITELISTED_EMAILS must be a valid regex.")
.optional(),
ADMIN_EMAILS: z
.string()
.refine(isValidRegex, "ADMIN_EMAILS must be a valid regex.")
.optional(),
const EnvironmentSchema = z
.object({
NODE_ENV: z.union([
z.literal("development"),
z.literal("production"),
z.literal("test"),
]),
POSTGRES_DB: z.string(),
DATABASE_URL: z
.string()
.refine(
isValidDatabaseUrl,
"DATABASE_URL is invalid, for details please check the additional output above this message.",
),
DATABASE_CONNECTION_LIMIT: z.coerce.number().int().default(10),
DATABASE_POOL_TIMEOUT: z.coerce.number().int().default(60),
DATABASE_CONNECTION_TIMEOUT: z.coerce.number().int().default(20),
DIRECT_URL: z
.string()
.refine(
isValidDatabaseUrl,
"DIRECT_URL is invalid, for details please check the additional output above this message.",
),
DATABASE_READ_REPLICA_URL: z.string().optional(),
SESSION_SECRET: z.string(),
ENCRYPTION_KEY: z.string(),
MAGIC_LINK_SECRET: z.string(),
WHITELISTED_EMAILS: z
.string()
.refine(isValidRegex, "WHITELISTED_EMAILS must be a valid regex.")
.optional(),
ADMIN_EMAILS: z
.string()
.refine(isValidRegex, "ADMIN_EMAILS must be a valid regex.")
.optional(),
APP_ENV: z.string().default(process.env.NODE_ENV),
LOGIN_ORIGIN: z.string().default("http://localhost:5173"),
APP_ORIGIN: z.string().default("http://localhost:5173"),
POSTHOG_PROJECT_KEY: z.string().default(""),
APP_ENV: z.string().default(process.env.NODE_ENV),
LOGIN_ORIGIN: z.string().default("http://localhost:5173"),
APP_ORIGIN: z.string().default("http://localhost:5173"),
// google auth
AUTH_GOOGLE_CLIENT_ID: z.string().optional(),
AUTH_GOOGLE_CLIENT_SECRET: z.string().optional(),
// Telemetry
POSTHOG_PROJECT_KEY: z
.string()
.default("phc_SwfGIzzX5gh5bazVWoRxZTBhkr7FwvzArS0NRyGXm1a"),
TELEMETRY_ENABLED: z
.string()
.optional()
.default("true")
.transform((val) => val !== "false" && val !== "0"),
TELEMETRY_ANONYMOUS: z
.string()
.optional()
.default("false")
.transform((val) => val === "true" || val === "1"),
ENABLE_EMAIL_LOGIN: z.coerce.boolean().default(true),
//storage
ACCESS_KEY_ID: z.string().optional(),
SECRET_ACCESS_KEY: z.string().optional(),
BUCKET: z.string().optional(),
//Redis
REDIS_HOST: z.string().default("localhost"),
REDIS_PORT: z.coerce.number().default(6379),
REDIS_TLS_DISABLED: z.coerce.boolean().default(true),
// google auth
AUTH_GOOGLE_CLIENT_ID: z.string().optional(),
AUTH_GOOGLE_CLIENT_SECRET: z.string().optional(),
//Neo4j
NEO4J_URI: z.string(),
NEO4J_USERNAME: z.string(),
NEO4J_PASSWORD: z.string(),
ENABLE_EMAIL_LOGIN: z
.string()
.optional()
.default("true")
.transform((val) => val !== "false" && val !== "0"),
//OpenAI
OPENAI_API_KEY: z.string(),
//Redis
REDIS_HOST: z.string().default("localhost"),
REDIS_PORT: z.coerce.number().default(6379),
REDIS_TLS_DISABLED: z
.string()
.optional()
.default("true")
.transform((val) => val !== "false" && val !== "0"),
EMAIL_TRANSPORT: z.enum(["resend", "smtp", "aws-ses"]).optional(),
FROM_EMAIL: z.string().optional(),
REPLY_TO_EMAIL: z.string().optional(),
RESEND_API_KEY: z.string().optional(),
SMTP_HOST: z.string().optional(),
SMTP_PORT: z.coerce.number().optional(),
SMTP_SECURE: z.coerce.boolean().optional(),
SMTP_USER: z.string().optional(),
SMTP_PASSWORD: z.string().optional(),
//Neo4j
NEO4J_URI: z.string(),
NEO4J_USERNAME: z.string(),
NEO4J_PASSWORD: z.string(),
//Trigger
TRIGGER_PROJECT_ID: z.string(),
TRIGGER_SECRET_KEY: z.string(),
TRIGGER_API_URL: z.string(),
TRIGGER_DB: z.string().default("trigger"),
//OpenAI
OPENAI_API_KEY: z.string().optional(),
ANTHROPIC_API_KEY: z.string().optional(),
GOOGLE_GENERATIVE_AI_API_KEY: z.string().optional(),
// Model envs
MODEL: z.string().default(LLMModelEnum.GPT41),
EMBEDDING_MODEL: z.string().default("mxbai-embed-large"),
OLLAMA_URL: z.string().optional(),
COHERE_API_KEY: z.string().optional(),
});
EMAIL_TRANSPORT: z.string().optional(),
FROM_EMAIL: z.string().optional(),
REPLY_TO_EMAIL: z.string().optional(),
RESEND_API_KEY: z.string().optional(),
SMTP_HOST: z.string().optional(),
SMTP_PORT: z.coerce.number().optional(),
SMTP_SECURE: z
.string()
.optional()
.transform((val) => val === "true" || val === "1"),
SMTP_USER: z.string().optional(),
SMTP_PASSWORD: z.string().optional(),
//Trigger
TRIGGER_PROJECT_ID: z.string().optional(),
TRIGGER_SECRET_KEY: z.string().optional(),
TRIGGER_API_URL: z.string().optional(),
TRIGGER_DB: z.string().default("trigger"),
// Model envs
MODEL: z.string().default(LLMModelEnum.GPT41),
EMBEDDING_MODEL: z.string().default("mxbai-embed-large"),
EMBEDDING_MODEL_SIZE: z.string().default("1024"),
OLLAMA_URL: z.string().optional(),
COHERE_API_KEY: z.string().optional(),
COHERE_SCORE_THRESHOLD: z.string().default("0.3"),
AWS_ACCESS_KEY_ID: z.string().optional(),
AWS_SECRET_ACCESS_KEY: z.string().optional(),
AWS_REGION: z.string().optional(),
// Queue provider
QUEUE_PROVIDER: z.enum(["trigger", "bullmq"]).default("trigger"),
})
.refine(
(data) => {
// If QUEUE_PROVIDER is "trigger", then Trigger.dev variables must be present
if (data.QUEUE_PROVIDER === "trigger") {
return !!(
data.TRIGGER_PROJECT_ID &&
data.TRIGGER_SECRET_KEY &&
data.TRIGGER_API_URL
);
}
return true;
},
{
message:
"TRIGGER_PROJECT_ID, TRIGGER_SECRET_KEY, and TRIGGER_API_URL are required when QUEUE_PROVIDER=trigger",
},
);
export type Environment = z.infer<typeof EnvironmentSchema>;
export const env = EnvironmentSchema.parse(process.env);
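A standalone sketch of the cross-field validation pattern the schema above uses for QUEUE_PROVIDER, reduced to two fields so the behaviour of the object-level .refine is easy to see.
import { z } from "zod";

const Example = z
  .object({
    QUEUE_PROVIDER: z.enum(["trigger", "bullmq"]).default("trigger"),
    TRIGGER_PROJECT_ID: z.string().optional(),
  })
  .refine(
    (data) => data.QUEUE_PROVIDER !== "trigger" || !!data.TRIGGER_PROJECT_ID,
    {
      message: "TRIGGER_PROJECT_ID is required when QUEUE_PROVIDER=trigger",
    },
  );

Example.parse({ QUEUE_PROVIDER: "bullmq" }); // ok: Trigger.dev variables not needed
Example.parse({ QUEUE_PROVIDER: "trigger" }); // throws: refine fails without TRIGGER_PROJECT_ID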

View File

@ -10,10 +10,13 @@ export interface LogItem {
status: "PENDING" | "PROCESSING" | "COMPLETED" | "FAILED" | "CANCELLED";
error?: string;
sourceURL?: string;
type?: string;
integrationSlug?: string;
activityId?: string;
episodeUUID?: string;
data?: any;
spaceIds?: string[];
episodeDetails?: any;
}
export interface LogsResponse {
@ -29,9 +32,10 @@ export interface UseLogsOptions {
endpoint: string; // '/api/v1/logs/all' or '/api/v1/logs/activity'
source?: string;
status?: string;
type?: string;
}
export function useLogs({ endpoint, source, status }: UseLogsOptions) {
export function useLogs({ endpoint, source, status, type }: UseLogsOptions) {
const fetcher = useFetcher<LogsResponse>();
const [logs, setLogs] = useState<LogItem[]>([]);
const [page, setPage] = useState(1);
@ -45,12 +49,13 @@ export function useLogs({ endpoint, source, status }: UseLogsOptions) {
(pageNum: number) => {
const params = new URLSearchParams();
params.set("page", pageNum.toString());
params.set("limit", "5");
params.set("limit", "50");
if (source) params.set("source", source);
if (status) params.set("status", status);
if (type) params.set("type", type);
return `${endpoint}?${params.toString()}`;
},
[endpoint, source, status],
[endpoint, source, status, type],
);
const loadMore = useCallback(() => {
@ -99,7 +104,7 @@ export function useLogs({ endpoint, source, status }: UseLogsOptions) {
setHasMore(true);
setIsInitialLoad(true);
fetcher.load(buildUrl(1));
}, [source, status, buildUrl]); // Inline reset logic to avoid dependency issues
}, [source, status, type, buildUrl]); // Inline reset logic to avoid dependency issues
// Initial load
useEffect(() => {

View File

@ -0,0 +1,186 @@
import * as React from "react";
import type { ToastActionElement, ToastProps } from "~/components/ui/toast";
const TOAST_LIMIT = 1;
const TOAST_REMOVE_DELAY = 1000000;
type ToasterToast = ToastProps & {
id: string;
title?: React.ReactNode;
description?: React.ReactNode;
action?: ToastActionElement;
};
const actionTypes = {
ADD_TOAST: "ADD_TOAST",
UPDATE_TOAST: "UPDATE_TOAST",
DISMISS_TOAST: "DISMISS_TOAST",
REMOVE_TOAST: "REMOVE_TOAST",
} as const;
let count = 0;
function genId() {
count = (count + 1) % Number.MAX_SAFE_INTEGER;
return count.toString();
}
type ActionType = typeof actionTypes;
type Action =
| {
type: ActionType["ADD_TOAST"];
toast: ToasterToast;
}
| {
type: ActionType["UPDATE_TOAST"];
toast: Partial<ToasterToast>;
}
| {
type: ActionType["DISMISS_TOAST"];
toastId?: ToasterToast["id"];
}
| {
type: ActionType["REMOVE_TOAST"];
toastId?: ToasterToast["id"];
};
interface State {
toasts: ToasterToast[];
}
const toastTimeouts = new Map<string, ReturnType<typeof setTimeout>>();
const addToRemoveQueue = (toastId: string) => {
if (toastTimeouts.has(toastId)) {
return;
}
const timeout = setTimeout(() => {
toastTimeouts.delete(toastId);
dispatch({
type: "REMOVE_TOAST",
toastId: toastId,
});
}, TOAST_REMOVE_DELAY);
toastTimeouts.set(toastId, timeout);
};
export const reducer = (state: State, action: Action): State => {
switch (action.type) {
case "ADD_TOAST":
return {
...state,
toasts: [action.toast, ...state.toasts].slice(0, TOAST_LIMIT),
};
case "UPDATE_TOAST":
return {
...state,
toasts: state.toasts.map((t) =>
t.id === action.toast.id ? { ...t, ...action.toast } : t,
),
};
case "DISMISS_TOAST": {
const { toastId } = action;
if (toastId) {
addToRemoveQueue(toastId);
} else {
state.toasts.forEach((toast) => {
addToRemoveQueue(toast.id);
});
}
return {
...state,
toasts: state.toasts.map((t) =>
t.id === toastId || toastId === undefined
? {
...t,
open: false,
}
: t,
),
};
}
case "REMOVE_TOAST":
if (action.toastId === undefined) {
return {
...state,
toasts: [],
};
}
return {
...state,
toasts: state.toasts.filter((t) => t.id !== action.toastId),
};
}
};
const listeners: Array<(state: State) => void> = [];
let memoryState: State = { toasts: [] };
function dispatch(action: Action) {
memoryState = reducer(memoryState, action);
listeners.forEach((listener) => {
listener(memoryState);
});
}
type Toast = Omit<ToasterToast, "id">;
function toast({ ...props }: Toast) {
const id = genId();
const update = (props: ToasterToast) =>
dispatch({
type: "UPDATE_TOAST",
toast: { ...props, id },
});
const dismiss = () => dispatch({ type: "DISMISS_TOAST", toastId: id });
dispatch({
type: "ADD_TOAST",
toast: {
...props,
id,
open: true,
onOpenChange: (open) => {
if (!open) dismiss();
},
},
});
return {
id: id,
dismiss,
update,
};
}
function useToast() {
const [state, setState] = React.useState<State>(memoryState);
React.useEffect(() => {
listeners.push(setState);
return () => {
const index = listeners.indexOf(setState);
if (index > -1) {
listeners.splice(index, 1);
}
};
}, [state]);
return {
...state,
toast,
dismiss: (toastId?: string) => dispatch({ type: "DISMISS_TOAST", toastId }),
};
}
export { useToast, toast };
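A hedged usage sketch of the toast store above from an arbitrary component. It assumes the Toaster component added earlier in this diff is mounted once near the root of the app; the import paths mirror the ones used elsewhere in these changes.
import { toast } from "~/hooks/use-toast";
import { Button } from "~/components/ui/button";

export function SaveButton({ onSave }: { onSave: () => Promise<void> }) {
  const handleClick = async () => {
    try {
      await onSave();
      toast({ title: "Saved", description: "Your changes were stored." });
    } catch {
      toast({
        title: "Error",
        description: "Failed to save changes",
        variant: "destructive",
      });
    }
  };

  return <Button onClick={handleClick}>Save</Button>;
}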

View File

@ -6,6 +6,7 @@ import { useOptionalUser, useUserChanged } from "./useUser";
export const usePostHog = (
apiKey?: string,
telemetryEnabled = true,
logging = false,
debug = false,
): void => {
@ -15,6 +16,8 @@ export const usePostHog = (
//start PostHog once
useEffect(() => {
// Respect telemetry settings
if (!telemetryEnabled) return;
if (apiKey === undefined || apiKey === "") return;
if (postHogInitialized.current === true) return;
if (logging) console.log("Initializing PostHog");
@ -27,19 +30,26 @@ export const usePostHog = (
if (logging) console.log("PostHog loaded");
if (user !== undefined) {
if (logging) console.log("Loaded: Identifying user", user);
posthog.identify(user.id, { email: user.email });
posthog.identify(user.id, {
email: user.email,
name: user.name,
});
}
},
});
postHogInitialized.current = true;
}, [apiKey, logging, user]);
}, [apiKey, telemetryEnabled, logging, user]);
useUserChanged((user) => {
if (postHogInitialized.current === false) return;
if (!telemetryEnabled) return;
if (logging) console.log("User changed");
if (user) {
if (logging) console.log("Identifying user", user);
posthog.identify(user.id, { email: user.email });
posthog.identify(user.id, {
email: user.email,
name: user.name,
});
} else {
if (logging) console.log("Resetting user");
posthog.reset();

View File

@ -5,7 +5,8 @@ import { useChanged } from "./useChanged";
import { useTypedMatchesData } from "./useTypedMatchData";
export interface ExtendedUser extends User {
availableCredits?: number;
availableCredits: number;
totalCredits: number;
}
export function useIsImpersonating(matches?: UIMatch[]) {
@ -23,7 +24,11 @@ export function useOptionalUser(matches?: UIMatch[]): ExtendedUser | undefined {
});
return routeMatch?.user
? { ...routeMatch?.user, availableCredits: routeMatch?.availableCredits }
? {
...routeMatch?.user,
availableCredits: routeMatch?.availableCredits,
totalCredits: routeMatch?.totalCredits,
}
: undefined;
}

View File

@ -0,0 +1,250 @@
import { exec } from "child_process";
import { promisify } from "util";
import { identifySpacesForTopics } from "~/jobs/spaces/space-identification.logic";
import { assignEpisodesToSpace } from "~/services/graphModels/space";
import { logger } from "~/services/logger.service";
import { SpaceService } from "~/services/space.server";
import { prisma } from "~/trigger/utils/prisma";
const execAsync = promisify(exec);
export interface TopicAnalysisPayload {
userId: string;
workspaceId: string;
minTopicSize?: number;
nrTopics?: number;
}
export interface TopicAnalysisResult {
topics: {
[topicId: string]: {
keywords: string[];
episodeIds: string[];
};
};
}
/**
* Run BERT analysis using exec (for BullMQ/Docker)
*/
async function runBertWithExec(
userId: string,
minTopicSize: number,
nrTopics?: number,
): Promise<string> {
let command = `python3 /core/apps/webapp/python/main.py ${userId} --json`;
if (minTopicSize) {
command += ` --min-topic-size ${minTopicSize}`;
}
if (nrTopics) {
command += ` --nr-topics ${nrTopics}`;
}
console.log(`[BERT Topic Analysis] Executing: ${command}`);
const { stdout, stderr } = await execAsync(command, {
timeout: 300000, // 5 minutes
maxBuffer: 10 * 1024 * 1024, // 10MB buffer for large outputs
});
if (stderr) {
console.warn(`[BERT Topic Analysis] Warnings:`, stderr);
}
return stdout;
}
/**
* Process BERT topic analysis on user's episodes
* This is the common logic shared between Trigger.dev and BullMQ
*
* NOTE: This function does NOT update workspace.metadata.lastTopicAnalysisAt
* That should be done by the caller BEFORE enqueueing this job to prevent
* duplicate analyses from racing conditions.
*/
export async function processTopicAnalysis(
payload: TopicAnalysisPayload,
enqueueSpaceSummary?: (params: {
spaceId: string;
userId: string;
}) => Promise<any>,
pythonRunner?: (
userId: string,
minTopicSize: number,
nrTopics?: number,
) => Promise<string>,
): Promise<TopicAnalysisResult> {
const { userId, workspaceId, minTopicSize = 10, nrTopics } = payload;
console.log(`[BERT Topic Analysis] Starting analysis for user: ${userId}`);
console.log(
`[BERT Topic Analysis] Parameters: minTopicSize=${minTopicSize}, nrTopics=${nrTopics || "auto"}`,
);
try {
const startTime = Date.now();
// Run BERT analysis using provided runner or default exec
const runner = pythonRunner || runBertWithExec;
const stdout = await runner(userId, minTopicSize, nrTopics);
const duration = Date.now() - startTime;
console.log(`[BERT Topic Analysis] Completed in ${duration}ms`);
// Parse the JSON output
const result: TopicAnalysisResult = JSON.parse(stdout);
// Log summary
const topicCount = Object.keys(result.topics).length;
const totalEpisodes = Object.values(result.topics).reduce(
(sum, topic) => sum + topic.episodeIds.length,
0,
);
console.log(
`[BERT Topic Analysis] Found ${topicCount} topics covering ${totalEpisodes} episodes`,
);
// Step 2: Identify spaces for topics using LLM
try {
logger.info("[BERT Topic Analysis] Starting space identification", {
userId,
topicCount,
});
const spaceProposals = await identifySpacesForTopics({
userId,
topics: result.topics,
});
logger.info("[BERT Topic Analysis] Space identification completed", {
userId,
proposalCount: spaceProposals.length,
});
// Step 3: Create or find spaces and assign episodes
// Get existing spaces from PostgreSQL
const existingSpacesFromDb = await prisma.space.findMany({
where: { workspaceId },
});
const existingSpacesByName = new Map(
existingSpacesFromDb.map((s) => [s.name.toLowerCase(), s]),
);
for (const proposal of spaceProposals) {
try {
// Check if space already exists (case-insensitive match)
let spaceId: string;
const existingSpace = existingSpacesByName.get(
proposal.name.toLowerCase(),
);
if (existingSpace) {
// Use existing space
spaceId = existingSpace.id;
logger.info("[BERT Topic Analysis] Using existing space", {
spaceName: proposal.name,
spaceId,
});
} else {
// Create new space (creates in both PostgreSQL and Neo4j)
// Skip automatic space assignment since we're manually assigning from BERT topics
const spaceService = new SpaceService();
const newSpace = await spaceService.createSpace({
name: proposal.name,
description: proposal.intent,
userId,
workspaceId,
});
spaceId = newSpace.id;
logger.info("[BERT Topic Analysis] Created new space", {
spaceName: proposal.name,
spaceId,
intent: proposal.intent,
});
}
// Collect all episode IDs from the topics in this proposal
const episodeIds: string[] = [];
for (const topicId of proposal.topics) {
const topic = result.topics[topicId];
if (topic) {
episodeIds.push(...topic.episodeIds);
}
}
// Assign all episodes from these topics to the space
if (episodeIds.length > 0) {
await assignEpisodesToSpace(episodeIds, spaceId, userId);
logger.info("[BERT Topic Analysis] Assigned episodes to space", {
spaceName: proposal.name,
spaceId,
episodeCount: episodeIds.length,
topics: proposal.topics,
});
// Step 4: Trigger space summary if callback provided
if (enqueueSpaceSummary) {
await enqueueSpaceSummary({ spaceId, userId });
logger.info("[BERT Topic Analysis] Triggered space summary", {
spaceName: proposal.name,
spaceId,
});
}
}
} catch (spaceError) {
logger.error(
"[BERT Topic Analysis] Failed to process space proposal",
{
proposal,
error: spaceError,
},
);
// Continue with other proposals
}
}
} catch (spaceIdentificationError) {
logger.error(
"[BERT Topic Analysis] Space identification failed, returning topics only",
{
error: spaceIdentificationError,
},
);
// Return topics even if space identification fails
}
return result;
} catch (error) {
console.error(`[BERT Topic Analysis] Error:`, error);
if (error instanceof Error) {
// Check for timeout
if (error.message.includes("ETIMEDOUT")) {
throw new Error(
`Topic analysis timed out after 5 minutes. User may have too many episodes.`,
);
}
// Check for Python errors
if (error.message.includes("python3: not found")) {
throw new Error(`Python 3 is not installed or not available in PATH.`);
}
// Check for Neo4j connection errors
if (error.message.includes("Failed to connect to Neo4j")) {
throw new Error(
`Could not connect to Neo4j. Check NEO4J_URI and credentials.`,
);
}
// Check for no episodes
if (error.message.includes("No episodes found")) {
throw new Error(`No episodes found for userId: ${userId}`);
}
}
throw error;
}
}
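
A minimal sketch of how a BullMQ worker could drive this shared logic; the queue names ("topic-analysis", "space-summary"), the relative import path, and the Redis connection are assumptions, while the callback shape matches the enqueueSpaceSummary parameter of processTopicAnalysis above.

// Sketch only: queue names, import path, and connection details are illustrative.
import { Queue, Worker } from "bullmq";
import {
  processTopicAnalysis,
  type TopicAnalysisPayload,
} from "./topic-analysis.logic";

const connection = { host: "localhost", port: 6379 };
const spaceSummaryQueue = new Queue("space-summary", { connection });

// Per the note above, workspace.metadata.lastTopicAnalysisAt is updated by the
// enqueuing code before the job is added, not inside this worker.
new Worker<TopicAnalysisPayload>(
  "topic-analysis",
  async (job) =>
    processTopicAnalysis(job.data, (params) =>
      spaceSummaryQueue.add("summarize-space", params),
    ),
  { connection },
);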

View File

@ -0,0 +1,82 @@
import { conversationTitlePrompt } from "~/trigger/conversation/prompt";
import { prisma } from "~/trigger/utils/prisma";
import { logger } from "~/services/logger.service";
import { generateText, type LanguageModel } from "ai";
import { getModel } from "~/lib/model.server";
export interface CreateConversationTitlePayload {
conversationId: string;
message: string;
}
export interface CreateConversationTitleResult {
success: boolean;
title?: string;
error?: string;
}
/**
* Core business logic for creating conversation titles
* This is shared between Trigger.dev and BullMQ implementations
*/
export async function processConversationTitleCreation(
payload: CreateConversationTitlePayload,
): Promise<CreateConversationTitleResult> {
try {
let conversationTitleResponse = "";
const { text } = await generateText({
model: getModel() as LanguageModel,
messages: [
{
role: "user",
content: conversationTitlePrompt.replace(
"{{message}}",
payload.message,
),
},
],
});
const outputMatch = text.match(/<output>(.*?)<\/output>/s);
logger.info(`Conversation title data: ${JSON.stringify(outputMatch)}`);
if (!outputMatch) {
logger.error("No output found in recurrence response");
throw new Error("Invalid response format from AI");
}
const jsonStr = outputMatch[1].trim();
const conversationTitleData = JSON.parse(jsonStr);
if (conversationTitleData) {
await prisma.conversation.update({
where: {
id: payload.conversationId,
},
data: {
title: conversationTitleData.title,
},
});
return {
success: true,
title: conversationTitleData.title,
};
}
return {
success: false,
error: "No title generated",
};
} catch (error: any) {
logger.error(
`Error creating conversation title for ${payload.conversationId}:`,
error,
);
return {
success: false,
error: error.message,
};
}
}
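
A hedged usage sketch for the title job; the import path and the sample strings are assumptions, and the <output> example simply illustrates the format the regex above extracts.

// Sketch only: the import path and sample strings are illustrative.
import { processConversationTitleCreation } from "./create-conversation-title.logic";

// The model is prompted to answer with JSON wrapped in <output> tags, e.g.
// <output>{"title": "Debugging Neo4j startup"}</output>.
export async function titleNewConversation(
  conversationId: string,
  firstMessage: string,
) {
  const result = await processConversationTitleCreation({
    conversationId,
    message: firstMessage,
  });
  if (!result.success) {
    console.error(`Title generation failed: ${result.error}`);
  }
  return result.title;
}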

View File

@ -0,0 +1,290 @@
import { type z } from "zod";
import { IngestionStatus } from "@core/database";
import { EpisodeTypeEnum } from "@core/types";
import { logger } from "~/services/logger.service";
import { saveDocument } from "~/services/graphModels/document";
import { DocumentVersioningService } from "~/services/documentVersioning.server";
import { DocumentDifferentialService } from "~/services/documentDiffer.server";
import { KnowledgeGraphService } from "~/services/knowledgeGraph.server";
import { prisma } from "~/trigger/utils/prisma";
import { type IngestBodyRequest } from "./ingest-episode.logic";
export interface IngestDocumentPayload {
body: z.infer<typeof IngestBodyRequest>;
userId: string;
workspaceId: string;
queueId: string;
}
export interface IngestDocumentResult {
success: boolean;
error?: string;
}
/**
* Core business logic for document ingestion with differential processing
* This is shared between Trigger.dev and BullMQ implementations
*
* Note: This function should NOT call trigger functions directly for chunk processing.
* Instead, use the enqueueEpisodeIngestion callback to queue episode ingestion jobs.
*/
export async function processDocumentIngestion(
payload: IngestDocumentPayload,
// Callback function for enqueueing episode ingestion for each chunk
enqueueEpisodeIngestion?: (params: {
body: any;
userId: string;
workspaceId: string;
queueId: string;
}) => Promise<{ id?: string }>,
): Promise<IngestDocumentResult> {
const startTime = Date.now();
try {
logger.log(`Processing document for user ${payload.userId}`, {
contentLength: payload.body.episodeBody.length,
});
await prisma.ingestionQueue.update({
where: { id: payload.queueId },
data: {
status: IngestionStatus.PROCESSING,
},
});
const documentBody = payload.body;
// Step 1: Initialize services and prepare document version
const versioningService = new DocumentVersioningService();
const differentialService = new DocumentDifferentialService();
const knowledgeGraphService = new KnowledgeGraphService();
const {
documentNode: document,
versionInfo,
chunkedDocument,
} = await versioningService.prepareDocumentVersion(
documentBody.sessionId!,
payload.userId,
documentBody.metadata?.documentTitle?.toString() || "Untitled Document",
documentBody.episodeBody,
documentBody.source,
documentBody.metadata || {},
);
logger.log(`Document version analysis:`, {
version: versionInfo.newVersion,
isNewDocument: versionInfo.isNewDocument,
hasContentChanged: versionInfo.hasContentChanged,
changePercentage: versionInfo.chunkLevelChanges.changePercentage,
changedChunks: versionInfo.chunkLevelChanges.changedChunkIndices.length,
totalChunks: versionInfo.chunkLevelChanges.totalChunks,
});
// Step 2: Determine processing strategy
const differentialDecision =
await differentialService.analyzeDifferentialNeed(
documentBody.episodeBody,
versionInfo.existingDocument,
chunkedDocument,
);
logger.log(`Differential analysis:`, {
shouldUseDifferential: differentialDecision.shouldUseDifferential,
strategy: differentialDecision.strategy,
reason: differentialDecision.reason,
documentSizeTokens: differentialDecision.documentSizeTokens,
});
// Early return for unchanged documents
if (differentialDecision.strategy === "skip_processing") {
logger.log("Document content unchanged, skipping processing");
await prisma.ingestionQueue.update({
where: { id: payload.queueId },
data: {
status: IngestionStatus.COMPLETED,
},
});
return {
success: true,
};
}
// Step 3: Save the new document version
await saveDocument(document);
// Step 3.1: Invalidate statements from previous document version if it exists
let invalidationResults = null;
if (versionInfo.existingDocument && versionInfo.hasContentChanged) {
logger.log(
`Invalidating statements from previous document version: ${versionInfo.existingDocument.uuid}`,
);
invalidationResults =
await knowledgeGraphService.invalidateStatementsFromPreviousDocumentVersion(
{
previousDocumentUuid: versionInfo.existingDocument.uuid,
newDocumentContent: documentBody.episodeBody,
userId: payload.userId,
invalidatedBy: document.uuid,
semanticSimilarityThreshold: 0.75, // Configurable threshold
},
);
logger.log(`Statement invalidation completed:`, {
totalAnalyzed: invalidationResults.totalStatementsAnalyzed,
invalidated: invalidationResults.invalidatedStatements.length,
preserved: invalidationResults.preservedStatements.length,
});
}
logger.log(`Document chunked into ${chunkedDocument.chunks.length} chunks`);
// Step 4: Process chunks based on differential strategy
let chunksToProcess = chunkedDocument.chunks;
let processingMode = "full";
if (
differentialDecision.shouldUseDifferential &&
differentialDecision.strategy === "chunk_level_diff"
) {
// Only process changed chunks
const chunkComparisons = differentialService.getChunkComparisons(
versionInfo.existingDocument!,
chunkedDocument,
);
const changedIndices =
differentialService.getChunksNeedingReprocessing(chunkComparisons);
chunksToProcess = chunkedDocument.chunks.filter((chunk) =>
changedIndices.includes(chunk.chunkIndex),
);
processingMode = "differential";
logger.log(
`Differential processing: ${chunksToProcess.length}/${chunkedDocument.chunks.length} chunks need reprocessing`,
);
} else if (differentialDecision.strategy === "full_reingest") {
// Process all chunks
processingMode = "full";
logger.log(
`Full reingestion: processing all ${chunkedDocument.chunks.length} chunks`,
);
}
// Step 5: Queue chunks for processing
const episodeHandlers = [];
if (enqueueEpisodeIngestion) {
for (const chunk of chunksToProcess) {
const chunkEpisodeData = {
episodeBody: chunk.content,
referenceTime: documentBody.referenceTime,
metadata: {
...documentBody.metadata,
processingMode,
differentialStrategy: differentialDecision.strategy,
chunkHash: chunk.contentHash,
documentTitle:
documentBody.metadata?.documentTitle?.toString() ||
"Untitled Document",
chunkIndex: chunk.chunkIndex,
documentUuid: document.uuid,
},
source: documentBody.source,
spaceIds: documentBody.spaceIds,
sessionId: documentBody.sessionId,
type: EpisodeTypeEnum.DOCUMENT,
};
const episodeHandler = await enqueueEpisodeIngestion({
body: chunkEpisodeData,
userId: payload.userId,
workspaceId: payload.workspaceId,
queueId: payload.queueId,
});
if (episodeHandler.id) {
episodeHandlers.push(episodeHandler.id);
logger.log(
`Queued chunk ${chunk.chunkIndex + 1} for ${processingMode} processing`,
{
handlerId: episodeHandler.id,
chunkSize: chunk.content.length,
chunkHash: chunk.contentHash,
},
);
}
}
}
// Calculate cost savings
const costSavings = differentialService.calculateCostSavings(
chunkedDocument.chunks.length,
chunksToProcess.length,
);
await prisma.ingestionQueue.update({
where: { id: payload.queueId },
data: {
output: {
documentUuid: document.uuid,
version: versionInfo.newVersion,
totalChunks: chunkedDocument.chunks.length,
chunksProcessed: chunksToProcess.length,
chunksSkipped: costSavings.chunksSkipped,
processingMode,
differentialStrategy: differentialDecision.strategy,
estimatedSavings: `${costSavings.estimatedSavingsPercentage.toFixed(1)}%`,
statementInvalidation: invalidationResults
? {
totalAnalyzed: invalidationResults.totalStatementsAnalyzed,
invalidated: invalidationResults.invalidatedStatements.length,
preserved: invalidationResults.preservedStatements.length,
}
: null,
episodes: [],
episodeHandlers,
},
status: IngestionStatus.PROCESSING,
},
});
const processingTimeMs = Date.now() - startTime;
logger.log(
`Document differential processing completed in ${processingTimeMs}ms`,
{
documentUuid: document.uuid,
version: versionInfo.newVersion,
processingMode,
totalChunks: chunkedDocument.chunks.length,
chunksProcessed: chunksToProcess.length,
chunksSkipped: costSavings.chunksSkipped,
estimatedSavings: `${costSavings.estimatedSavingsPercentage.toFixed(1)}%`,
changePercentage: `${differentialDecision.changePercentage.toFixed(1)}%`,
statementInvalidation: invalidationResults
? {
totalAnalyzed: invalidationResults.totalStatementsAnalyzed,
invalidated: invalidationResults.invalidatedStatements.length,
preserved: invalidationResults.preservedStatements.length,
}
: "No previous version",
},
);
return { success: true };
} catch (err: any) {
await prisma.ingestionQueue.update({
where: { id: payload.queueId },
data: {
error: err.message,
status: IngestionStatus.FAILED,
},
});
logger.error(`Error processing document for user ${payload.userId}:`, err);
return { success: false, error: err.message };
}
}
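
A minimal sketch of a BullMQ worker wrapping this document logic; queue names, the connection, and the "./ingest-document.logic" path are assumptions. The callback forwards each chunk to an episode-ingestion queue and returns the job id so it can be recorded in episodeHandlers above.

// Sketch only: queue names, import paths, and connection details are illustrative.
import { Queue, Worker } from "bullmq";
import {
  processDocumentIngestion,
  type IngestDocumentPayload,
} from "./ingest-document.logic";

const connection = { host: "localhost", port: 6379 };
const episodeQueue = new Queue("ingest-episode", { connection });

new Worker<IngestDocumentPayload>(
  "ingest-document",
  async (job) =>
    processDocumentIngestion(job.data, async (params) => {
      // Queue one episode-ingestion job per chunk that needs reprocessing.
      const chunkJob = await episodeQueue.add("ingest-episode", params);
      return { id: chunkJob.id };
    }),
  { connection },
);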

View File

@ -0,0 +1,314 @@
import { z } from "zod";
import { KnowledgeGraphService } from "~/services/knowledgeGraph.server";
import { linkEpisodeToDocument } from "~/services/graphModels/document";
import { IngestionStatus } from "@core/database";
import { logger } from "~/services/logger.service";
import { prisma } from "~/trigger/utils/prisma";
import { EpisodeType } from "@core/types";
import { deductCredits, hasCredits } from "~/trigger/utils/utils";
import { assignEpisodesToSpace } from "~/services/graphModels/space";
import {
shouldTriggerTopicAnalysis,
updateLastTopicAnalysisTime,
} from "~/services/bertTopicAnalysis.server";
export const IngestBodyRequest = z.object({
episodeBody: z.string(),
referenceTime: z.string(),
metadata: z.record(z.union([z.string(), z.number(), z.boolean()])).optional(),
source: z.string(),
spaceIds: z.array(z.string()).optional(),
sessionId: z.string().optional(),
type: z
.enum([EpisodeType.CONVERSATION, EpisodeType.DOCUMENT])
.default(EpisodeType.CONVERSATION),
});
export interface IngestEpisodePayload {
body: z.infer<typeof IngestBodyRequest>;
userId: string;
workspaceId: string;
queueId: string;
}
export interface IngestEpisodeResult {
success: boolean;
episodeDetails?: any;
error?: string;
}
/**
* Core business logic for ingesting a single episode
* This is shared between Trigger.dev and BullMQ implementations
*
* Note: This function should NOT call trigger functions directly.
* Instead, return data that indicates follow-up jobs are needed,
* and let the caller (Trigger task or BullMQ worker) handle job queueing.
*/
export async function processEpisodeIngestion(
payload: IngestEpisodePayload,
// Callback functions for enqueueing follow-up jobs
enqueueSpaceAssignment?: (params: {
userId: string;
workspaceId: string;
mode: "episode";
episodeIds: string[];
}) => Promise<any>,
enqueueSessionCompaction?: (params: {
userId: string;
sessionId: string;
source: string;
}) => Promise<any>,
enqueueBertTopicAnalysis?: (params: {
userId: string;
workspaceId: string;
minTopicSize?: number;
nrTopics?: number;
}) => Promise<any>,
): Promise<IngestEpisodeResult> {
try {
logger.log(`Processing job for user ${payload.userId}`);
// Check if workspace has sufficient credits before processing
const hasSufficientCredits = await hasCredits(
payload.workspaceId,
"addEpisode",
);
if (!hasSufficientCredits) {
logger.warn(`Insufficient credits for workspace ${payload.workspaceId}`);
await prisma.ingestionQueue.update({
where: { id: payload.queueId },
data: {
status: IngestionStatus.NO_CREDITS,
error:
"Insufficient credits. Please upgrade your plan or wait for your credits to reset.",
},
});
return {
success: false,
error: "Insufficient credits",
};
}
const ingestionQueue = await prisma.ingestionQueue.update({
where: { id: payload.queueId },
data: {
status: IngestionStatus.PROCESSING,
},
});
const knowledgeGraphService = new KnowledgeGraphService();
const episodeBody = payload.body as any;
const episodeDetails = await knowledgeGraphService.addEpisode(
{
...episodeBody,
userId: payload.userId,
},
prisma,
);
// Link episode to document if it's a document chunk
if (
episodeBody.type === EpisodeType.DOCUMENT &&
episodeBody.metadata.documentUuid &&
episodeDetails.episodeUuid
) {
try {
await linkEpisodeToDocument(
episodeDetails.episodeUuid,
episodeBody.metadata.documentUuid,
episodeBody.metadata.chunkIndex || 0,
);
logger.log(
`Linked episode ${episodeDetails.episodeUuid} to document ${episodeBody.metadata.documentUuid} at chunk ${episodeBody.metadata.chunkIndex || 0}`,
);
} catch (error) {
logger.error(`Failed to link episode to document:`, {
error,
episodeUuid: episodeDetails.episodeUuid,
documentUuid: episodeBody.metadata.documentUuid,
});
}
}
let finalOutput = episodeDetails;
let episodeUuids: string[] = episodeDetails.episodeUuid
? [episodeDetails.episodeUuid]
: [];
let currentStatus: IngestionStatus = IngestionStatus.COMPLETED;
if (episodeBody.type === EpisodeType.DOCUMENT) {
const currentOutput = ingestionQueue.output as any;
currentOutput.episodes.push(episodeDetails);
episodeUuids = currentOutput.episodes.map(
(episode: any) => episode.episodeUuid,
);
finalOutput = {
...currentOutput,
};
if (currentOutput.episodes.length !== currentOutput.totalChunks) {
currentStatus = IngestionStatus.PROCESSING;
}
}
await prisma.ingestionQueue.update({
where: { id: payload.queueId },
data: {
output: finalOutput,
status: currentStatus,
},
});
// Deduct credits for episode creation
if (currentStatus === IngestionStatus.COMPLETED) {
await deductCredits(
payload.workspaceId,
"addEpisode",
finalOutput.statementsCreated,
);
}
// Handle space assignment after successful ingestion
try {
// If spaceIds were explicitly provided, immediately assign the episode to those spaces
if (
episodeBody.spaceIds &&
episodeBody.spaceIds.length > 0 &&
episodeDetails.episodeUuid
) {
logger.info(`Assigning episode to explicitly provided spaces`, {
userId: payload.userId,
episodeId: episodeDetails.episodeUuid,
spaceIds: episodeBody.spaceIds,
});
// Assign episode to each space
for (const spaceId of episodeBody.spaceIds) {
await assignEpisodesToSpace(
[episodeDetails.episodeUuid],
spaceId,
payload.userId,
);
}
logger.info(
`Skipping LLM space assignment - episode explicitly assigned to ${episodeBody.spaceIds.length} space(s)`,
);
} else {
// Only trigger automatic LLM space assignment if no explicit spaceIds were provided
logger.info(
`Triggering LLM space assignment after successful ingestion`,
{
userId: payload.userId,
workspaceId: payload.workspaceId,
episodeId: episodeDetails?.episodeUuid,
},
);
if (
episodeDetails.episodeUuid &&
currentStatus === IngestionStatus.COMPLETED &&
enqueueSpaceAssignment
) {
await enqueueSpaceAssignment({
userId: payload.userId,
workspaceId: payload.workspaceId,
mode: "episode",
episodeIds: episodeUuids,
});
}
}
} catch (assignmentError) {
// Don't fail the ingestion if assignment fails
logger.warn(`Failed to trigger space assignment after ingestion:`, {
error: assignmentError,
userId: payload.userId,
episodeId: episodeDetails?.episodeUuid,
});
}
// Auto-trigger session compaction if episode has sessionId
try {
if (
episodeBody.sessionId &&
currentStatus === IngestionStatus.COMPLETED &&
enqueueSessionCompaction
) {
logger.info(`Checking if session compaction should be triggered`, {
userId: payload.userId,
sessionId: episodeBody.sessionId,
source: episodeBody.source,
});
await enqueueSessionCompaction({
userId: payload.userId,
sessionId: episodeBody.sessionId,
source: episodeBody.source,
});
}
} catch (compactionError) {
// Don't fail the ingestion if compaction fails
logger.warn(`Failed to trigger session compaction after ingestion:`, {
error: compactionError,
userId: payload.userId,
sessionId: episodeBody.sessionId,
});
}
// Auto-trigger BERT topic analysis if threshold met (20+ new episodes)
try {
if (
currentStatus === IngestionStatus.COMPLETED &&
enqueueBertTopicAnalysis
) {
const shouldTrigger = await shouldTriggerTopicAnalysis(
payload.userId,
payload.workspaceId,
);
if (shouldTrigger) {
logger.info(
`Triggering BERT topic analysis after reaching 20+ new episodes`,
{
userId: payload.userId,
workspaceId: payload.workspaceId,
},
);
await enqueueBertTopicAnalysis({
userId: payload.userId,
workspaceId: payload.workspaceId,
minTopicSize: 10,
});
// Update the last analysis timestamp
await updateLastTopicAnalysisTime(payload.workspaceId);
}
}
} catch (topicAnalysisError) {
// Don't fail the ingestion if topic analysis fails
logger.warn(`Failed to trigger topic analysis after ingestion:`, {
error: topicAnalysisError,
userId: payload.userId,
});
}
return { success: true, episodeDetails };
} catch (err: any) {
await prisma.ingestionQueue.update({
where: { id: payload.queueId },
data: {
error: err.message,
status: IngestionStatus.FAILED,
},
});
logger.error(`Error processing job for user ${payload.userId}:`, err);
return { success: false, error: err.message };
}
}
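
A sketch of how the three optional follow-up hooks could be wired to BullMQ queues; the queue names and connection are assumptions, while the callback signatures match the parameters of processEpisodeIngestion above.

// Sketch only: queue names and connection details are illustrative.
import { Queue, Worker } from "bullmq";
import {
  processEpisodeIngestion,
  type IngestEpisodePayload,
} from "./ingest-episode.logic";

const connection = { host: "localhost", port: 6379 };
const spaceAssignmentQueue = new Queue("space-assignment", { connection });
const sessionCompactionQueue = new Queue("session-compaction", { connection });
const topicAnalysisQueue = new Queue("topic-analysis", { connection });

new Worker<IngestEpisodePayload>(
  "ingest-episode",
  async (job) =>
    processEpisodeIngestion(
      job.data,
      (params) => spaceAssignmentQueue.add("assign-spaces", params),
      (params) => sessionCompactionQueue.add("compact-session", params),
      (params) => topicAnalysisQueue.add("analyze-topics", params),
    ),
  { connection },
);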

View File

@ -0,0 +1,455 @@
import { logger } from "~/services/logger.service";
import type { CoreMessage } from "ai";
import { z } from "zod";
import { getEmbedding, makeModelCall } from "~/lib/model.server";
import {
getCompactedSessionBySessionId,
linkEpisodesToCompact,
getSessionEpisodes,
type CompactedSessionNode,
type SessionEpisodeData,
saveCompactedSession,
} from "~/services/graphModels/compactedSession";
export interface SessionCompactionPayload {
userId: string;
sessionId: string;
source: string;
triggerSource?: "auto" | "manual" | "threshold";
}
export interface SessionCompactionResult {
success: boolean;
compactionResult?: {
compactUuid: string;
sessionId: string;
summary: string;
episodeCount: number;
startTime: Date;
endTime: Date;
confidence: number;
compressionRatio: number;
};
reason?: string;
episodeCount?: number;
error?: string;
}
// Zod schema for LLM response validation
export const CompactionResultSchema = z.object({
summary: z.string().describe("Consolidated narrative of the entire session"),
confidence: z
.number()
.min(0)
.max(1)
.describe("Confidence score of the compaction quality"),
});
export const CONFIG = {
minEpisodesForCompaction: 5, // Minimum episodes to trigger compaction
compactionThreshold: 1, // Trigger after N new episodes
maxEpisodesPerBatch: 50, // Process in batches if needed
};
/**
* Core business logic for session compaction
* This is shared between Trigger.dev and BullMQ implementations
*/
export async function processSessionCompaction(
payload: SessionCompactionPayload,
): Promise<SessionCompactionResult> {
const { userId, sessionId, source, triggerSource = "auto" } = payload;
logger.info(`Starting session compaction`, {
userId,
sessionId,
source,
triggerSource,
});
try {
// Check if compaction already exists
const existingCompact = await getCompactedSessionBySessionId(
sessionId,
userId,
);
// Fetch all episodes for this session
const episodes = await getSessionEpisodes(
sessionId,
userId,
existingCompact?.endTime,
);
console.log("episodes", episodes.length);
// Check if we have enough episodes
if (!existingCompact && episodes.length < CONFIG.minEpisodesForCompaction) {
logger.info(`Not enough episodes for compaction`, {
sessionId,
episodeCount: episodes.length,
minRequired: CONFIG.minEpisodesForCompaction,
});
return {
success: false,
reason: "insufficient_episodes",
episodeCount: episodes.length,
};
} else if (
existingCompact &&
episodes.length <
CONFIG.minEpisodesForCompaction + CONFIG.compactionThreshold
) {
logger.info(`Not enough new episodes for compaction`, {
sessionId,
episodeCount: episodes.length,
minRequired:
CONFIG.minEpisodesForCompaction + CONFIG.compactionThreshold,
});
return {
success: false,
reason: "insufficient_new_episodes",
episodeCount: episodes.length,
};
}
// Generate or update compaction
const compactionResult = existingCompact
? await updateCompaction(existingCompact, episodes, userId)
: await createCompaction(sessionId, episodes, userId, source);
logger.info(`Session compaction completed`, {
sessionId,
compactUuid: compactionResult.uuid,
episodeCount: compactionResult.episodeCount,
compressionRatio: compactionResult.compressionRatio,
});
return {
success: true,
compactionResult: {
compactUuid: compactionResult.uuid,
sessionId: compactionResult.sessionId,
summary: compactionResult.summary,
episodeCount: compactionResult.episodeCount,
startTime: compactionResult.startTime,
endTime: compactionResult.endTime,
confidence: compactionResult.confidence,
compressionRatio: compactionResult.compressionRatio,
},
};
} catch (error) {
logger.error(`Session compaction failed`, {
sessionId,
userId,
error: error instanceof Error ? error.message : String(error),
});
return {
success: false,
error: error instanceof Error ? error.message : String(error),
};
}
}
/**
* Create new compaction
*/
async function createCompaction(
sessionId: string,
episodes: SessionEpisodeData[],
userId: string,
source: string,
): Promise<CompactedSessionNode> {
logger.info(`Creating new compaction`, {
sessionId,
episodeCount: episodes.length,
});
// Generate compaction using LLM
const compactionData = await generateCompaction(episodes, null);
// Generate embedding for summary
const summaryEmbedding = await getEmbedding(compactionData.summary);
// Create CompactedSession node using graph model
const compactUuid = crypto.randomUUID();
const now = new Date();
const startTime = new Date(episodes[0].createdAt);
const endTime = new Date(episodes[episodes.length - 1].createdAt);
const episodeUuids = episodes.map((e) => e.uuid);
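// Each session compacts into a single summary node, so this ratio is episodes per compact (N / 1).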
const compressionRatio = episodes.length / 1;
const compactNode: CompactedSessionNode = {
uuid: compactUuid,
sessionId,
summary: compactionData.summary,
summaryEmbedding,
episodeCount: episodes.length,
startTime,
endTime,
createdAt: now,
confidence: compactionData.confidence,
userId,
source,
compressionRatio,
metadata: { triggerType: "create" },
};
console.log("compactNode", compactNode);
// Use graph model functions
await saveCompactedSession(compactNode);
await linkEpisodesToCompact(compactUuid, episodeUuids, userId);
logger.info(`Compaction created`, {
compactUuid,
episodeCount: episodes.length,
});
return compactNode;
}
/**
* Update existing compaction with new episodes
*/
async function updateCompaction(
existingCompact: CompactedSessionNode,
newEpisodes: SessionEpisodeData[],
userId: string,
): Promise<CompactedSessionNode> {
logger.info(`Updating existing compaction`, {
compactUuid: existingCompact.uuid,
newEpisodeCount: newEpisodes.length,
});
// Generate updated compaction using LLM (merging)
const compactionData = await generateCompaction(
newEpisodes,
existingCompact.summary,
);
// Generate new embedding for updated summary
const summaryEmbedding = await getEmbedding(compactionData.summary);
// Update CompactedSession node using graph model
const now = new Date();
const endTime = new Date(newEpisodes[newEpisodes.length - 1].createdAt);
const totalEpisodeCount = existingCompact.episodeCount + newEpisodes.length;
const compressionRatio = totalEpisodeCount / 1;
const episodeUuids = newEpisodes.map((e) => e.uuid);
const updatedNode: CompactedSessionNode = {
...existingCompact,
summary: compactionData.summary,
summaryEmbedding,
episodeCount: totalEpisodeCount,
endTime,
updatedAt: now,
confidence: compactionData.confidence,
compressionRatio,
metadata: { triggerType: "update", newEpisodesAdded: newEpisodes.length },
};
// Use graph model functions
await saveCompactedSession(updatedNode);
await linkEpisodesToCompact(existingCompact.uuid, episodeUuids, userId);
logger.info(`Compaction updated`, {
compactUuid: existingCompact.uuid,
totalEpisodeCount,
});
return updatedNode;
}
/**
* Generate compaction using LLM (similar to Claude Code's compact approach)
*/
async function generateCompaction(
episodes: SessionEpisodeData[],
existingSummary: string | null,
): Promise<z.infer<typeof CompactionResultSchema>> {
const systemPrompt = createCompactionSystemPrompt();
const userPrompt = createCompactionUserPrompt(episodes, existingSummary);
const messages: CoreMessage[] = [
{ role: "system", content: systemPrompt },
{ role: "user", content: userPrompt },
];
logger.info(`Generating compaction with LLM`, {
episodeCount: episodes.length,
hasExistingSummary: !!existingSummary,
});
try {
let responseText = "";
await makeModelCall(
false,
messages,
(text: string) => {
responseText = text;
},
undefined,
"high",
);
return parseCompactionResponse(responseText);
} catch (error) {
logger.error(`Failed to generate compaction`, {
error: error instanceof Error ? error.message : String(error),
});
throw error;
}
}
/**
* System prompt for compaction (for agent recall/context retrieval)
*/
function createCompactionSystemPrompt(): string {
return `You are a session compaction specialist. Your task is to create a rich, informative summary that will help AI agents understand what happened in this conversation session when they need context for future interactions.
## PURPOSE
This summary will be retrieved by AI agents when the user references this session in future conversations. The agent needs enough context to:
- Understand what was discussed and why
- Know what decisions were made and their rationale
- Grasp the outcome and current state
- Have relevant technical details to provide informed responses
## COMPACTION GOALS
1. **Comprehensive Context**: Capture all important information that might be referenced later
2. **Decision Documentation**: Clearly state what was decided, why, and what alternatives were considered
3. **Technical Details**: Include specific implementations, tools, configurations, and technical choices
4. **Outcome Clarity**: Make it clear what was accomplished and what the final state is
5. **Evolution Tracking**: Show how thinking or decisions evolved during the session
## COMPACTION RULES
1. **Be Information-Dense**: Pack useful details without fluff or repetition
2. **Structure Chronologically**: Start with problem/question, show progression, end with outcome
3. **Highlight Key Points**: Emphasize decisions, implementations, results, and learnings
4. **Include Specifics**: Names of libraries, specific configurations, metrics, numbers matter
5. **Resolve Contradictions**: Always use the most recent/final version when information conflicts
## OUTPUT REQUIREMENTS
- **summary**: A detailed, information-rich narrative that tells the complete story
- Structure naturally based on content - use as many paragraphs as needed
- Each distinct topic, decision, or phase should get its own paragraph(s)
- Start with context and initial problem/question
- Progress chronologically through discussions, decisions, and implementations
- **Final paragraph MUST**: State the outcome, results, and current state
- Don't artificially limit length - capture everything important
- **confidence**: Score (0-1) reflecting how well this summary captures the session's essence
Your response MUST be valid JSON wrapped in <output></output> tags.
## KEY PRINCIPLES
- Write for an AI agent that needs to help the user in future conversations
- Include technical specifics that might be referenced (library names, configurations, metrics)
- Make outcomes and current state crystal clear in the final paragraph
- Show the reasoning behind decisions, not just the decisions themselves
- Be comprehensive but concise - every sentence should add value
- Each major topic or phase deserves its own paragraph(s)
- Don't compress too much - agents need the details
`;
}
/**
* User prompt for compaction
*/
function createCompactionUserPrompt(
episodes: SessionEpisodeData[],
existingSummary: string | null,
): string {
let prompt = "";
if (existingSummary) {
prompt += `## EXISTING SUMMARY (from previous compaction)\n\n${existingSummary}\n\n`;
prompt += `## NEW EPISODES (to merge into existing summary)\n\n`;
} else {
prompt += `## SESSION EPISODES (to compact)\n\n`;
}
episodes.forEach((episode, index) => {
const timestamp = new Date(episode.validAt).toISOString();
prompt += `### Episode ${index + 1} (${timestamp})\n`;
prompt += `Source: ${episode.source}\n`;
prompt += `Content:\n${episode.originalContent}\n\n`;
});
if (existingSummary) {
prompt += `\n## INSTRUCTIONS\n\n`;
prompt += `Merge the new episodes into the existing summary. Update facts, add new information, and maintain narrative coherence. Ensure the consolidated summary reflects the complete session including both old and new content.\n`;
} else {
prompt += `\n## INSTRUCTIONS\n\n`;
prompt += `Create a compact summary of this entire session. Consolidate all information into a coherent narrative with deduplicated key facts.\n`;
}
return prompt;
}
/**
* Parse LLM response for compaction
*/
function parseCompactionResponse(
response: string,
): z.infer<typeof CompactionResultSchema> {
try {
// Extract content from <output> tags
const outputMatch = response.match(/<output>([\s\S]*?)<\/output>/);
if (!outputMatch) {
logger.warn("No <output> tags found in LLM compaction response");
logger.debug("Full LLM response:", { response });
throw new Error("Invalid LLM response format - missing <output> tags");
}
let jsonContent = outputMatch[1].trim();
// Remove markdown code blocks if present
jsonContent = jsonContent.replace(/```json\n?/g, "").replace(/```\n?/g, "");
const parsed = JSON.parse(jsonContent);
// Validate with schema
const validated = CompactionResultSchema.parse(parsed);
return validated;
} catch (error) {
logger.error("Failed to parse compaction response", {
error: error instanceof Error ? error.message : String(error),
response: response.substring(0, 500),
});
throw new Error(`Failed to parse compaction response: ${error}`);
}
}
/**
* Helper function to check if compaction should be triggered
*/
export async function shouldTriggerCompaction(
sessionId: string,
userId: string,
): Promise<boolean> {
const existingCompact = await getCompactedSessionBySessionId(
sessionId,
userId,
);
if (!existingCompact) {
// Check if we have enough episodes for initial compaction
const episodes = await getSessionEpisodes(sessionId, userId);
return episodes.length >= CONFIG.minEpisodesForCompaction;
}
// Check if we have enough new episodes to update
const newEpisodes = await getSessionEpisodes(
sessionId,
userId,
existingCompact.endTime,
);
return newEpisodes.length >= CONFIG.compactionThreshold;
}
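
A small sketch of how a caller might gate compaction on the CONFIG thresholds before running the job; the import path and wrapper name are assumptions.

// Sketch only: the import path and helper name are illustrative.
import {
  processSessionCompaction,
  shouldTriggerCompaction,
} from "./session-compaction.logic";

export async function maybeCompactSession(
  userId: string,
  sessionId: string,
  source: string,
) {
  // Only run the LLM compaction once enough (new) episodes have accumulated.
  if (await shouldTriggerCompaction(sessionId, userId)) {
    return processSessionCompaction({
      userId,
      sessionId,
      source,
      triggerSource: "threshold",
    });
  }
  return { success: false, reason: "threshold_not_met" };
}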

Some files were not shown because too many files have changed in this diff.