Glow Server Package

Server-side logic for Glow AI chat

Overview

@zooly/glow-server provides the server-side logic for the Glow chat system: streaming LLM responses, tool execution, cache lookup, context switching, and chat lifecycle management.

Package Details

  • Package Name: @zooly/glow-server
  • Location: packages/glow-server
  • Type: Server-side module (Node.js)

Key Exports

Request Handlers

  • handleGlowChatStream - Entry point for streaming; validates session, delegates to cache/core, returns stream response
  • handleGlowChatCore - Main LLM flow: prompts, tools, streamText
  • handleGlowChatWithCache - Wraps core; checks question cache when enabled
  • handleGlowChatGenerate - Generate flow: getOrCreateGlowChat + handleGlowChatWithCache + stream
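
The layering described above (entry point validates, cache wrapper short-circuits, core calls the LLM) can be sketched as follows. This is a minimal illustration with stand-in types and stubbed logic, not the real handler signatures from @zooly/glow-server:

```typescript
// Illustrative sketch of the handler layering; shapes and return types
// are simplified stand-ins, not the package's actual API.
type ChatInput = { question: string };

// Hypothetical question cache (the real one lives behind @zooly/db).
const questionCache = new Map<string, string>([["hi", "cached: hello"]]);

// Innermost layer: the LLM call (stubbed here instead of streamText).
function handleGlowChatCore(input: ChatInput): string {
  return `llm: ${input.question}`;
}

// Wraps core; returns a cached answer when caching is enabled and a hit exists.
function handleGlowChatWithCache(input: ChatInput, cacheEnabled: boolean): string {
  if (cacheEnabled) {
    const hit = questionCache.get(input.question);
    if (hit) return hit;
  }
  return handleGlowChatCore(input);
}

// Entry point: validate the session, then delegate down the stack.
function handleGlowChatStream(input: ChatInput, session?: { userId: string }): string {
  if (!session) throw new Error("Unauthorized");
  return handleGlowChatWithCache(input, true);
}
```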

Chat Lifecycle

  • getOrCreateGlowChat - Create new chat or return existing by chatId
  • getGlowChatData - Load glowChat + glowSettings by glowChatId
  • createGlowChat - Create chat with options
  • createNewGlowChat - Create new chat for a flow slug

Context & Tools

  • changeChatContext - Switch chat to new flow by slug
  • returnToContext - Return to previous flow
  • buildContextStack - Build context stack from tool call logs
  • getCurrentNodeTools - Resolve tools for current node
  • getToolsList - All tools for a flow
  • executeApiTool - Call external API for API tools
  • toolsBase - Base tools (moveToNodeId, changeChatContext, returnToContext)
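
The idea behind buildContextStack can be sketched by replaying tool call logs: push a flow when changeChatContext fires, pop when returnToContext fires. The log shape and function signature below are hypothetical, chosen only to illustrate the technique:

```typescript
// Hypothetical reconstruction of buildContextStack's approach; the real
// implementation reads persisted tool call logs and may differ in shape.
type ToolCallLog = { tool: string; flowSlug?: string };

function buildContextStack(initialFlow: string, logs: ToolCallLog[]): string[] {
  const stack = [initialFlow];
  for (const log of logs) {
    if (log.tool === "changeChatContext" && log.flowSlug) {
      // Entering a new flow context: push it onto the stack.
      stack.push(log.flowSlug);
    } else if (log.tool === "returnToContext" && stack.length > 1) {
      // Returning to the previous flow: pop the current one.
      stack.pop();
    }
  }
  return stack;
}
```

The top of the returned stack is the chat's current flow; the initial flow is never popped.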

Prompts & Helpers

  • getGlowChatSystemPrompt - Flow-level system prompt
  • getGlowChatInstructionsSystemPrompt - Node-level instructions
  • getFirstNodeMetadata - First node metadata from flow
  • getCurrentNodeMetadata - Current node metadata
  • moveToNodeIdInstructions - Instructions for moveToNodeId tool
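
A plausible way these two prompt layers combine (an assumption, not documented behavior of the package) is simple concatenation of the flow-level prompt with the current node's instructions:

```typescript
// Hypothetical composition helper; the actual prompt assembly inside
// @zooly/glow-server may interleave these differently.
function composeSystemPrompt(flowPrompt: string, nodeInstructions: string): string {
  // Drop empty segments so a node without instructions adds no blank block.
  return [flowPrompt, nodeInstructions].filter(Boolean).join("\n\n");
}
```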

Utilities

  • extractQuestionFromMessage - Extract question text for cache lookup
  • transformNodesMetadata, transformField, transformFields - Field transformers
  • generateZodSchema - Generate Zod schema from tool fields
  • defaultOnFinish - Default onFinish handler
  • prepareStepHandler - Prepare step handler
  • onStepFinishHandler - Step finish handler
  • saveMessageData - Save message data
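
As an illustration of what extractQuestionFromMessage might do for cache lookup, the sketch below pulls the plain-text parts out of a message. The message shape and logic are assumptions; the real helper lives in @zooly/util:

```typescript
// Illustrative only: collect the text parts of a message into one
// normalized question string suitable for a cache key.
type MessagePart = { type: string; text?: string };
type UIMessage = { role: string; parts: MessagePart[] };

function extractQuestionFromMessage(message: UIMessage): string {
  return message.parts
    .filter((p) => p.type === "text" && typeof p.text === "string")
    .map((p) => p.text!.trim())
    .join(" ")
    .trim();
}
```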

Model

  • defaultModel - Default model config (openai/gpt-4.1)

Dependencies

  • ai - AI SDK (streamText, convertToModelMessages, etc.)
  • @zooly/db - Database access, cache, embeddings
  • @zooly/types - GlowChat, GlowSettings, GlowTool, etc.
  • @zooly/util - extractQuestionFromMessage
  • @zooly/llm-mock - Mock LLM for testing
  • next - Request types
  • next-auth - Session types
  • zod - Schema validation
  • uuid - ID generation

Usage

Streaming Handler

import { handleGlowChatStream } from "@zooly/glow-server";
import { getServerSession } from "next-auth";
import { authOptions } from "./auth"; // your app's NextAuth options (path varies per app)

export async function POST(request: Request) {
  const session = await getServerSession(authOptions);
  return handleGlowChatStream(request, session ?? undefined);
}

Get or Create Chat

import { getOrCreateGlowChat } from "@zooly/glow-server";

const { glowChat, glowSettings } = await getOrCreateGlowChat({
  chatId: "my-chat-id",
  glowSlug: "backstage-mini-app",
  restart: false,
});

See App Integration for full integration steps.