core/hosting/docker/scripts/ollama-init.sh
Harshith Mullapudi f39c7cc6d0
feat: remove trigger and run based on bullmq (#126)
* feat: remove trigger and run based on bullmq
* fix: telemetry and trigger deployment
* feat: add Ollama container and update ingestion status for unchanged documents
* feat: add logger to bullmq workers
* 1. Remove chat and deep-search from trigger
2. Add ai/sdk for chat UI
3. Added a better model manager

* refactor: simplify clustered graph query and add stop conditions for AI responses

* fix: streaming

* fix: docker docs

---------

Co-authored-by: Manoj <saimanoj58@gmail.com>
2025-10-26 12:56:12 +05:30


#!/bin/bash
set -e
echo "Starting Ollama server..."
ollama serve &
OLLAMA_PID=$!
echo "Waiting for Ollama server to be ready..."
# Poll the server instead of relying on a fixed sleep, which races on slow hosts
until ollama list >/dev/null 2>&1; do
  sleep 1
done
echo "Pulling mxbai-embed-large model..."
ollama pull mxbai-embed-large
echo "Model pulled successfully!"
echo "Ollama is ready to accept requests."
# Keep the Ollama server running as the container's foreground process
wait "$OLLAMA_PID"
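The serve-in-background, poll-for-readiness, then `wait` pattern above can be sketched in isolation. This is a minimal stand-alone illustration, not the real init script: `sleep` stands in for `ollama serve` (assumption: no Ollama binary available), and a liveness check on the PID stands in for a real readiness probe.

```shell
#!/bin/bash
# Sketch of the serve-then-wait pattern used by ollama-init.sh.
# `sleep 2` is a stand-in for the long-running `ollama serve` process.
set -e

sleep 2 &                     # stand-in for: ollama serve &
SERVER_PID=$!

# Poll for readiness instead of a single fixed sleep. Here "ready"
# just means the process exists; the real script polls `ollama list`.
READY=0
for _ in 1 2 3 4 5; do
  if kill -0 "$SERVER_PID" 2>/dev/null; then
    READY=1
    echo "server is up (pid $SERVER_PID)"
    break
  fi
  sleep 1
done

# In the real script, `ollama pull mxbai-embed-large` would run here,
# once the server is accepting requests.

wait "$SERVER_PID"            # block, as PID 1 must in a container
echo "server exited"
```

Running `wait` on the server PID at the end matters in Docker: if the script returned instead, the container's PID 1 would exit and the container would stop even though the server was still healthy.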