# @zooly/voice-chat-client

React hook for real-time voice conversations with ElevenLabs Conversational AI.
`@zooly/voice-chat-client` is a pure client-side React package that provides the `useVoiceConversation` hook for managing real-time voice conversation sessions with ElevenLabs Conversational AI.

This package is framework-agnostic — it does not depend on Next.js or any other specific framework. API communication is injected via the `voiceCallLogApi` adapter prop.
## Package

- Name: `@zooly/voice-chat-client`
- Path: `packages/voice-chat/client`
- Speaking-state helpers: `isAiSpeaking`, `isUserSpeaking`, `conversationMode`, `getStateLabel`

## Dependencies

- `@elevenlabs/react` — ElevenLabs React SDK
- `@zooly/types` — shared types (`VoiceCallMessage`)
- `@zooly/util` — shared utilities
- `react` (peer) — React 18 or 19

## `UseVoiceConversationProps`

```ts
interface UseVoiceConversationProps {
  accountId: string | null;
  voiceCallLogApi: VoiceCallLogApi;
  onConnect?: () => void;
  onDisconnect?: () => void;
  onError?: (error: any) => void;
  onMessage?: (message: VoiceCallMessage) => void;
  onCallStarted?: (params: { agentId: string }) => void;
  onCallEnded?: (params: { agentId: string }) => void;
}
```
| Prop | Purpose |
|---|---|
| `accountId` | Current account ID, passed to voice call log creation |
| `voiceCallLogApi` | Injected API adapter for creating and updating voice call logs |
| `onConnect` | Called when the ElevenLabs session connects |
| `onDisconnect` | Called when the session disconnects |
| `onError` | Called on session errors |
| `onMessage` | Called with each normalised message (user or assistant) |
| `onCallStarted` | Called after the session starts — wire analytics here |
| `onCallEnded` | Called after the session ends — wire analytics here |
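As a sketch, here is a minimal props object with a stub adapter. The `VoiceCallMessage` shape, the account ID, and the callback bodies below are assumptions for illustration, not part of the package's contract; in an app the types come from `@zooly/voice-chat-client` and `@zooly/types`.

```typescript
// Local re-declarations so this sketch is self-contained.
// Assumed shape of VoiceCallMessage; the real type lives in @zooly/types.
interface VoiceCallMessage { role: "user" | "assistant"; content: string }

interface VoiceCallLogApi {
  create: (params: { accountId: string | null; agentId: string }) => Promise<{ id: string }>;
  update: (params: { id: string; voiceCallMessages: VoiceCallMessage[] }) => Promise<void>;
}

interface UseVoiceConversationProps {
  accountId: string | null;
  voiceCallLogApi: VoiceCallLogApi;
  onMessage?: (message: VoiceCallMessage) => void;
  onCallStarted?: (params: { agentId: string }) => void;
}

// Stub adapter; a real one calls the app's own API routes.
const stubApi: VoiceCallLogApi = {
  create: async () => ({ id: "log_stub" }),
  update: async () => {},
};

// Hypothetical account ID; wire only the callbacks you need.
const props: UseVoiceConversationProps = {
  accountId: "acc_123",
  voiceCallLogApi: stubApi,
  onMessage: (m) => console.log(`${m.role}: ${m.content}`),
  onCallStarted: ({ agentId }) => console.log("call started:", agentId),
};
```

The props object is then passed straight to `useVoiceConversation(props)` inside a component.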
## `VoiceCallLogApi` Adapter

```ts
interface VoiceCallLogApi {
  create: (params: { accountId: string | null; agentId: string }) => Promise<{ id: string }>;
  update: (params: { id: string; voiceCallMessages: VoiceCallMessage[] }) => Promise<void>;
}
```
The consuming app implements this interface to bridge to its own API routes. This avoids hardcoding fetch URLs inside the package.
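For example, a `fetch`-based adapter might look like the sketch below. The route paths (`/api/voice-call-logs`, `/api/voice-call-logs/:id`) and the `VoiceCallMessage` shape are hypothetical; substitute your app's own endpoints and import the real type from `@zooly/types`.

```typescript
// Assumed message shape for this self-contained sketch.
interface VoiceCallMessage { role: "user" | "assistant"; content: string }

interface VoiceCallLogApi {
  create: (params: { accountId: string | null; agentId: string }) => Promise<{ id: string }>;
  update: (params: { id: string; voiceCallMessages: VoiceCallMessage[] }) => Promise<void>;
}

// Hypothetical bridge to the consuming app's API routes.
const voiceCallLogApi: VoiceCallLogApi = {
  create: async (params) => {
    const res = await fetch("/api/voice-call-logs", {
      method: "POST",
      headers: { "Content-Type": "application/json" },
      body: JSON.stringify(params),
    });
    return res.json() as Promise<{ id: string }>;
  },
  update: async (params) => {
    // Fire-and-forget on the caller's side; errors can be logged here.
    await fetch(`/api/voice-call-logs/${params.id}`, {
      method: "PATCH",
      headers: { "Content-Type": "application/json" },
      body: JSON.stringify({ voiceCallMessages: params.voiceCallMessages }),
    });
  },
};
```

Because the adapter is injected, the package never hardcodes URLs and the same hook works under any routing setup.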
## `UseVoiceConversationReturn`

| Field | Type | Description |
|---|---|---|
| `agentId` | `string` | ElevenLabs agent ID for the current session |
| `isCallOnGoing` | `boolean` | Whether a voice session is active |
| `isStartVoiceCall` | `boolean` | UI toggle for starting a voice call |
| `setIsStartVoiceCall` | `(v: boolean) => void` | Setter for the toggle |
| `voiceCallMessages` | `VoiceCallMessage[]` | Accumulated transcript |
| `setVoiceCallMessages` | `(msgs) => void` | Replace the transcript |
| `addToVoiceCallMessages` | `(msg) => void` | Append a message (e.g. synthetic errors) |
| `isInitializingVoiceCall` | `boolean` | True while the session is being established |
| `isPermissionGranted` | `boolean` | Whether mic permission has been granted |
| `startConversation` | `(agentId: string) => Promise<void>` | Start a session |
| `endConversation` | `() => Promise<void>` | End the current session |
| `requestMicrophonePermission` | `() => Promise<void>` | Request mic access |
| `currentVoiceCallLogId` | `string \| null` | DB ID of the current voice call log |
| `getConversationId` | `() => string \| undefined` | ElevenLabs conversation ID |
| `isAiSpeaking` | `boolean` | True when the AI agent is speaking |
| `isUserSpeaking` | `boolean` | True when the AI is not speaking |
| `conversationMode` | `"speaking" \| "listening"` | From the AI's perspective |
| `conversation` | SDK return | Raw `useConversation` return value |
| `getStateLabel` | `() => string` | Human-readable status label |
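To illustrate how the status flags might combine into a label, here is a hypothetical reconstruction of a `getStateLabel`-style helper. The actual label strings returned by the hook are not documented here; treat these as placeholders.

```typescript
// Illustrative subset of the hook's return flags.
type StateFlags = {
  isInitializingVoiceCall: boolean;
  isCallOnGoing: boolean;
  isAiSpeaking: boolean;
};

// Hypothetical label derivation; the hook's real strings may differ.
function stateLabel(f: StateFlags): string {
  if (f.isInitializingVoiceCall) return "Connecting";
  if (!f.isCallOnGoing) return "Idle";
  return f.isAiSpeaking ? "AI speaking" : "Listening";
}
```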
## Session lifecycle

1. Call `startConversation(agentId)` with the ElevenLabs agent ID.
2. The hook starts the SDK session via `conversation.startSession()` and creates a call log via `voiceCallLogApi.create()`.
3. Incoming messages arrive through the `onMessage` callback and are normalised into `VoiceCallMessage` objects.
4. The transcript is persisted via `voiceCallLogApi.update()` (fire-and-forget, best-effort).
5. Call `endConversation()` to tear down the session.

## Audio analysis

When the session is connected and `isCallOnGoing` is true, the hook sets up a Web Audio API analyser on the microphone input. The analyser computes the average volume from frequency data using `fftSize = 256` and `smoothingTimeConstant = 0.8`. The analysis loop runs via `requestAnimationFrame` and is cleaned up when the session ends.
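The analyser wiring described above can be sketched as follows. This is a browser-only illustration, not the hook's actual internals; the function and parameter names are assumptions, and it is typed loosely (via `globalThis`) so the sketch compiles outside a DOM environment. Only `fftSize = 256` and `smoothingTimeConstant = 0.8` come from the documented behaviour.

```typescript
// Pure helper: average of byte frequency data (each value is 0–255).
function averageVolume(data: Uint8Array): number {
  if (data.length === 0) return 0;
  let sum = 0;
  for (let i = 0; i < data.length; i++) sum += data[i];
  return sum / data.length;
}

// Browser-only sketch of the analyser setup (illustrative names).
function setupMicAnalyser(stream: unknown, onVolume: (v: number) => void): () => void {
  const g = globalThis as any; // loose typing: AudioContext etc. exist only in browsers
  const ctx = new g.AudioContext();
  const source = ctx.createMediaStreamSource(stream);
  const analyser = ctx.createAnalyser();
  analyser.fftSize = 256;               // as documented
  analyser.smoothingTimeConstant = 0.8; // as documented
  source.connect(analyser);

  const data = new Uint8Array(analyser.frequencyBinCount);
  let raf = 0;
  const tick = () => {
    analyser.getByteFrequencyData(data);
    onVolume(averageVolume(data));
    raf = g.requestAnimationFrame(tick); // poll once per animation frame
  };
  raf = g.requestAnimationFrame(tick);

  // Cleanup mirrors the hook's teardown when the session ends.
  return () => {
    g.cancelAnimationFrame(raf);
    ctx.close();
  };
}
```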
Note that the current `isUserSpeaking` is a simple inverse of `isAiSpeaking` — it does not use the audio-analysis volume data. This means `isUserSpeaking` is `true` during silence, whenever the AI is not speaking.
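The documented derivation can be written as a pure function of `isAiSpeaking` (the function name here is illustrative):

```typescript
// Both values are pure functions of isAiSpeaking, per the note above.
function deriveSpeakingState(isAiSpeaking: boolean) {
  return {
    isUserSpeaking: !isAiSpeaking, // true even during silence
    conversationMode: isAiSpeaking ? ("speaking" as const) : ("listening" as const),
  };
}
```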
See App Integration for how to wire this hook into `zooly-app`.