Understanding where Server-Sent Events excel is as important as understanding how they work. SSE's one-way, text-based, HTTP-native design makes it ideal for certain use cases and less suitable for others. The difference between a successful SSE implementation and an over-engineered one often comes down to matching the technology to the problem.

In this guide, we'll explore the most common SSE use cases, examining why SSE is the right choice, how to implement each pattern, and what architectural considerations apply at scale.
By the end of this page, you will understand:

- Which applications are best suited for SSE
- Implementation patterns for notifications, live feeds, dashboards, and progress tracking
- Scaling considerations for each use case
- When to use SSE versus alternatives
SSE excels in scenarios where the server needs to push updates to clients, but clients don't need to send frequent messages back through the same channel. Let's categorize the primary use cases:
| Category | Examples | Why SSE Is Ideal |
|---|---|---|
| Real-Time Notifications | Alerts, mentions, system messages | Server pushes; clients just receive. Perfect one-way fit. |
| Live Data Feeds | News, social media, stock tickers | Continuous stream of updates. No client→server needed. |
| Dashboard Updates | Metrics, monitoring, admin panels | Periodic refresh without polling. Low complexity. |
| Progress Tracking | File uploads, job status, builds | Long-running operations with status updates. |
| Collaborative Cursors | Document editing positions | High-frequency updates, one direction per stream. |
| AI/LLM Streaming | ChatGPT-style token streaming | Server generates tokens continuously; client displays. |
**The Common Thread**

All ideal SSE use cases share these characteristics:

- Updates originate on the server; clients primarily listen
- Data flows one way (or mostly one way) per stream
- Payloads are text, typically JSON
- Delivery rides on standard HTTP, tolerating brief reconnection gaps
SSE handles 80% of real-time use cases with 20% of the complexity of WebSockets. Only reach for WebSockets when you genuinely need bidirectional messaging (chat, games) or binary data (audio, video). For most dashboard and notification needs, SSE is the simpler, more maintainable choice.
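To make that complexity claim concrete, the sketch below is essentially the entire client side of an SSE integration. The endpoint path and payload shape are illustrative, not from any specific service in this guide; the parsing step is kept as a pure function only so it can be exercised without a browser.

```typescript
// A complete SSE client: one constructor call and one handler.
// The browser handles reconnection automatically.

// Pure parsing step, separated so it is testable outside a browser.
function parseUpdate(raw: string): { message: string } {
  return JSON.parse(raw);
}

// Subscribes to a (hypothetical) stream endpoint and forwards each update.
function subscribe(onUpdate: (u: { message: string }) => void): EventSource {
  const source = new EventSource('/api/updates/stream');
  source.onmessage = (e) => onUpdate(parseUpdate(e.data));
  return source;
}
```

Compare this with a WebSocket client, which typically needs its own reconnection, backoff, and heartbeat logic before it is production-ready.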
Notifications are perhaps the most common SSE use case. Users expect instant alerts for mentions, messages, system events, and activity updates. SSE delivers these reliably without complex infrastructure.

**Notification System Architecture**
```typescript
// notification-service.ts
import express from 'express';
import Redis from 'ioredis';

const app = express();
const redis = new Redis();
const subscriber = new Redis();

// User -> Response mapping
const userConnections = new Map<string, express.Response[]>();

// Subscribe to notification channel
subscriber.subscribe('notifications');
subscriber.on('message', (channel, message) => {
  const notification = JSON.parse(message);
  deliverToUser(notification.userId, notification);
});

function deliverToUser(userId: string, notification: Notification) {
  const connections = userConnections.get(userId) || [];
  const eventData = formatSSE('notification', notification);
  connections.forEach(res => {
    if (!res.writableEnded) {
      res.write(eventData);
    }
  });
}

function formatSSE(event: string, data: any): string {
  const id = `${Date.now()}-${Math.random().toString(36).slice(2)}`;
  return `id: ${id}\nevent: ${event}\ndata: ${JSON.stringify(data)}\n\n`;
}

// SSE endpoint
app.get('/api/notifications/stream', authenticateUser, (req, res) => {
  const userId = req.user.id;

  // SSE headers
  res.setHeader('Content-Type', 'text/event-stream');
  res.setHeader('Cache-Control', 'no-cache');
  res.setHeader('Connection', 'keep-alive');
  res.setHeader('X-Accel-Buffering', 'no');

  // Register connection
  if (!userConnections.has(userId)) {
    userConnections.set(userId, []);
  }
  userConnections.get(userId)!.push(res);

  // Send unread notifications on connect
  sendUnreadNotifications(userId, res);

  // Keepalive
  const keepalive = setInterval(() => {
    if (!res.writableEnded) {
      res.write(': keepalive\n\n');
    }
  }, 25000);

  // Cleanup
  req.on('close', () => {
    clearInterval(keepalive);
    const connections = userConnections.get(userId);
    if (connections) {
      const index = connections.indexOf(res);
      if (index > -1) connections.splice(index, 1);
      if (connections.length === 0) {
        userConnections.delete(userId);
      }
    }
  });
});

async function sendUnreadNotifications(userId: string, res: express.Response) {
  const unread = await getUnreadNotifications(userId);
  for (const notification of unread) {
    res.write(formatSSE('notification', notification));
  }
}

// Publish notification (called by other services)
async function createNotification(notification: Notification) {
  // Store in database
  await saveNotification(notification);

  // Publish for SSE delivery
  await redis.publish('notifications', JSON.stringify(notification));

  // Also send push notification for mobile/offline users
  await sendPushNotification(notification);
}

interface Notification {
  id: string;
  userId: string;
  type: 'mention' | 'like' | 'follow' | 'system';
  title: string;
  body: string;
  data?: Record<string, any>;
  createdAt: string;
  read: boolean;
}
```

Users often have multiple tabs or devices open. The `userConnections` map stores an array of responses per user, broadcasting to all connected sessions. When a user marks a notification as read on one device, consider publishing a 'read' event to sync all devices.
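The cross-device read sync suggested above can be sketched as a small helper that builds the 'read' event every connected session receives. The event name and payload fields here are assumptions for illustration, not part of the service code:

```typescript
// Builds the SSE frame broadcast to all of a user's sessions when a
// notification is marked read on one device. Field names are illustrative.
interface ReadEvent {
  notificationId: string;
  readAt: string;
}

function formatReadSync(notificationId: string): string {
  const payload: ReadEvent = {
    notificationId,
    readAt: new Date().toISOString(),
  };
  // Same wire format as the notification events: named event + JSON data.
  return `event: read\ndata: ${JSON.stringify(payload)}\n\n`;
}
```

A mark-as-read handler would publish this through the same Redis channel, so every open tab or device updates its unread state at once.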
Live feeds—stock tickers, sports scores, social media timelines, news updates—represent a classic SSE application. The server continuously generates updates that all interested clients receive.

**Feed Architecture Considerations**
```typescript
// stock-feed-service.ts
import express from 'express';
import Redis from 'ioredis';

const app = express();
const subscriber = new Redis();

interface StockUpdate {
  symbol: string;
  price: number;
  change: number;
  changePercent: number;
  volume: number;
  timestamp: string;
}

// Client subscriptions: symbol -> responses
const subscriptions = new Map<string, Set<express.Response>>();

// Subscribe to stock updates from market data service
subscriber.psubscribe('stocks:*');
subscriber.on('pmessage', (pattern, channel, message) => {
  const symbol = channel.split(':')[1];
  const update: StockUpdate = JSON.parse(message);
  broadcastToSymbol(symbol, update);
});

function broadcastToSymbol(symbol: string, update: StockUpdate) {
  const clients = subscriptions.get(symbol);
  if (!clients) return;

  const eventData = formatSSE('stockUpdate', update);
  for (const res of clients) {
    if (!res.writableEnded) {
      res.write(eventData);
    } else {
      clients.delete(res);
    }
  }
}

function formatSSE(event: string, data: any, id?: string): string {
  const eventId = id || `${Date.now()}-${Math.random().toString(36).slice(2)}`;
  return `id: ${eventId}\nevent: ${event}\ndata: ${JSON.stringify(data)}\n\n`;
}

// SSE endpoint with symbol subscription
app.get('/api/stocks/stream', (req, res) => {
  const symbolsParam = req.query.symbols as string;
  const symbols = symbolsParam?.split(',').map(s => s.trim().toUpperCase()) || [];

  if (symbols.length === 0) {
    res.status(400).json({ error: 'Provide symbols parameter' });
    return;
  }

  if (symbols.length > 50) {
    res.status(400).json({ error: 'Maximum 50 symbols per connection' });
    return;
  }

  // SSE headers
  res.setHeader('Content-Type', 'text/event-stream');
  res.setHeader('Cache-Control', 'no-cache');
  res.setHeader('Connection', 'keep-alive');
  res.setHeader('X-Accel-Buffering', 'no');

  // Subscribe to requested symbols
  for (const symbol of symbols) {
    if (!subscriptions.has(symbol)) {
      subscriptions.set(symbol, new Set());
    }
    subscriptions.get(symbol)!.add(res);
  }

  // Send current prices immediately
  sendCurrentPrices(symbols, res);

  // Confirmation
  res.write(`event: subscribed\ndata: ${JSON.stringify({ symbols })}\n\n`);

  // Keepalive
  const keepalive = setInterval(() => {
    if (!res.writableEnded) {
      res.write(': keepalive\n\n');
    }
  }, 25000);

  // Cleanup
  req.on('close', () => {
    clearInterval(keepalive);
    for (const symbol of symbols) {
      subscriptions.get(symbol)?.delete(res);
    }
  });
});

async function sendCurrentPrices(symbols: string[], res: express.Response) {
  for (const symbol of symbols) {
    const price = await getCurrentPrice(symbol);
    if (price) {
      res.write(formatSSE('stockUpdate', price));
    }
  }
}
```

For very high-frequency feeds (100+ updates/second), consider client-side throttling or server-side batching. Sending every tick wastes bandwidth and overwhelms the UI. Batch updates every 100ms or sample to key price levels for a better user experience.
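The batching advice can be sketched as a coalescing step: buffer incoming ticks and, on each flush interval, keep only the latest tick per symbol. This is a minimal sketch with a simplified tick shape; the flush wiring around it is an assumption:

```typescript
// Keeps only the most recent tick per symbol from a buffered burst,
// so a 100 ms flush sends at most one update per symbol.
interface Tick {
  symbol: string;
  price: number;
  timestamp: string;
}

function coalesceTicks(buffer: Tick[]): Tick[] {
  const latest = new Map<string, Tick>();
  for (const tick of buffer) {
    latest.set(tick.symbol, tick); // later ticks overwrite earlier ones
  }
  return [...latest.values()];
}
```

A flush loop would push incoming ticks into an array, call `coalesceTicks` every 100 ms, broadcast the result, and clear the buffer—trading a small delay for far fewer writes per client.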
Operational dashboards—metrics, logs, health status, admin panels—benefit greatly from SSE. Rather than polling for updates, dashboards receive changes as they happen, reducing server load while improving responsiveness.

**Dashboard SSE Patterns**
```typescript
// metrics-stream.ts
import express from 'express';
import os from 'os';

const app = express();

interface SystemMetrics {
  timestamp: string;
  cpu: {
    usage: number;
    load: number[];
  };
  memory: {
    total: number;
    used: number;
    free: number;
    usagePercent: number;
  };
  requests: {
    total: number;
    errorRate: number;
    p50: number;
    p99: number;
  };
}

// Track connected dashboard clients
const dashboardClients = new Set<express.Response>();

// Push metrics to all connected dashboards
function broadcastMetrics(metrics: SystemMetrics) {
  const eventData = `event: metrics\ndata: ${JSON.stringify(metrics)}\n\n`;
  for (const client of dashboardClients) {
    if (!client.writableEnded) {
      client.write(eventData);
    } else {
      dashboardClients.delete(client);
    }
  }
}

// Collect and broadcast metrics every 2 seconds
setInterval(async () => {
  const metrics = await collectMetrics();
  broadcastMetrics(metrics);
}, 2000);

async function collectMetrics(): Promise<SystemMetrics> {
  const cpuUsage = os.loadavg()[0] / os.cpus().length * 100;
  const totalMem = os.totalmem();
  const freeMem = os.freemem();

  return {
    timestamp: new Date().toISOString(),
    cpu: {
      usage: Math.min(100, cpuUsage),
      load: os.loadavg(),
    },
    memory: {
      total: totalMem,
      used: totalMem - freeMem,
      free: freeMem,
      usagePercent: ((totalMem - freeMem) / totalMem) * 100,
    },
    requests: await getRequestMetrics(),
  };
}

// Alert streaming
interface Alert {
  id: string;
  severity: 'info' | 'warning' | 'critical';
  title: string;
  description: string;
  metric: string;
  value: number;
  threshold: number;
  timestamp: string;
}

const alertThresholds = {
  'cpu.usage': { warning: 70, critical: 90 },
  'memory.usagePercent': { warning: 80, critical: 95 },
  'requests.errorRate': { warning: 1, critical: 5 },
  'requests.p99': { warning: 500, critical: 1000 },
};

function checkThresholds(metrics: SystemMetrics) {
  const alerts: Alert[] = [];
  for (const [path, thresholds] of Object.entries(alertThresholds)) {
    const value = getNestedValue(metrics, path);
    if (value >= thresholds.critical) {
      alerts.push(createAlert('critical', path, value, thresholds.critical));
    } else if (value >= thresholds.warning) {
      alerts.push(createAlert('warning', path, value, thresholds.warning));
    }
  }
  for (const alert of alerts) {
    broadcastAlert(alert);
  }
}

function broadcastAlert(alert: Alert) {
  const eventData = `event: alert\ndata: ${JSON.stringify(alert)}\n\n`;
  for (const client of dashboardClients) {
    if (!client.writableEnded) {
      client.write(eventData);
    }
  }
}

// SSE endpoint
app.get('/api/dashboard/stream', requireAdmin, (req, res) => {
  res.setHeader('Content-Type', 'text/event-stream');
  res.setHeader('Cache-Control', 'no-cache');
  res.setHeader('Connection', 'keep-alive');

  dashboardClients.add(res);

  // Send current state immediately
  collectMetrics().then(metrics => {
    res.write(`event: metrics\ndata: ${JSON.stringify(metrics)}\n\n`);
  });

  const keepalive = setInterval(() => {
    res.write(': keepalive\n\n');
  }, 25000);

  req.on('close', () => {
    clearInterval(keepalive);
    dashboardClients.delete(res);
  });
});
```

Send the full current state on connection, then stream only deltas. This ensures dashboards are immediately useful without waiting for the next update cycle, while keeping ongoing bandwidth low.
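The snapshot-plus-deltas pattern can be sketched as a helper that diffs two metric snapshots and returns only the changed fields. This sketch works on flat objects; diffing the nested SystemMetrics shape above would need a recursive version:

```typescript
// Returns only the entries of `next` that differ from `prev`, so the
// server can stream small deltas after the initial full snapshot.
// A flat-object sketch, not part of the service code above.
function diffSnapshot(
  prev: Record<string, number>,
  next: Record<string, number>
): Record<string, number> {
  const delta: Record<string, number> = {};
  for (const [key, value] of Object.entries(next)) {
    if (prev[key] !== value) {
      delta[key] = value;
    }
  }
  return delta;
}
```

On each collection cycle the server would broadcast `diffSnapshot(lastSent, current)` and skip the write entirely when the delta is empty.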
Long-running operations—file uploads, video processing, CI/CD builds, data imports—benefit from real-time progress updates. SSE provides a simple way to stream progress without polling.

**Progress Tracking Architecture**
```typescript
// job-progress.ts
import express from 'express';
import Redis from 'ioredis';
import { v4 as uuid } from 'uuid';

const app = express();
const redis = new Redis();
const subscriber = new Redis();

interface JobProgress {
  jobId: string;
  status: 'pending' | 'processing' | 'completed' | 'failed';
  progress: number; // 0-100
  message?: string;
  result?: any;
  error?: string;
}

// Start a new job
app.post('/api/jobs', async (req, res) => {
  const jobId = uuid();

  // Initialize job status
  await redis.hset(`job:${jobId}`, {
    status: 'pending',
    progress: 0,
    createdAt: Date.now(),
  });

  // Queue the job
  await redis.lpush('job:queue', JSON.stringify({
    id: jobId,
    type: req.body.type,
    params: req.body.params,
  }));

  res.status(202).json({
    jobId,
    progressUrl: `/api/jobs/${jobId}/progress`,
  });
});

// Stream job progress
app.get('/api/jobs/:jobId/progress', async (req, res) => {
  const { jobId } = req.params;

  // Verify job exists
  const exists = await redis.exists(`job:${jobId}`);
  if (!exists) {
    res.status(404).json({ error: 'Job not found' });
    return;
  }

  // SSE headers
  res.setHeader('Content-Type', 'text/event-stream');
  res.setHeader('Cache-Control', 'no-cache');
  res.setHeader('Connection', 'keep-alive');

  // Send current status immediately
  const currentStatus = await redis.hgetall(`job:${jobId}`);
  res.write(`event: progress\ndata: ${JSON.stringify(currentStatus)}\n\n`);

  // If job already complete, close connection
  if (currentStatus.status === 'completed' || currentStatus.status === 'failed') {
    res.end();
    return;
  }

  // Subscribe to job updates
  const jobSubscriber = new Redis();
  await jobSubscriber.subscribe(`job:${jobId}:updates`);

  jobSubscriber.on('message', (channel, message) => {
    const update: JobProgress = JSON.parse(message);

    if (update.status === 'completed') {
      res.write(`event: complete\ndata: ${JSON.stringify(update)}\n\n`);
      cleanup();
      res.end();
    } else if (update.status === 'failed') {
      res.write(`event: error\ndata: ${JSON.stringify(update)}\n\n`);
      cleanup();
      res.end();
    } else {
      res.write(`event: progress\ndata: ${JSON.stringify(update)}\n\n`);
    }
  });

  const keepalive = setInterval(() => {
    if (!res.writableEnded) {
      res.write(': keepalive\n\n');
    }
  }, 25000);

  function cleanup() {
    clearInterval(keepalive);
    jobSubscriber.unsubscribe();
    jobSubscriber.disconnect();
  }

  req.on('close', cleanup);
});

// Worker publishes progress updates
async function updateJobProgress(
  jobId: string,
  progress: number,
  message?: string
) {
  const update: JobProgress = {
    jobId,
    status: 'processing',
    progress,
    message,
  };

  await redis.hset(`job:${jobId}`, {
    status: 'processing',
    progress,
    ...(message && { message }),
  });

  await redis.publish(`job:${jobId}:updates`, JSON.stringify(update));
}

async function completeJob(jobId: string, result: any) {
  const update: JobProgress = {
    jobId,
    status: 'completed',
    progress: 100,
    result,
  };

  await redis.hset(`job:${jobId}`, {
    status: 'completed',
    progress: 100,
    result: JSON.stringify(result),
    completedAt: Date.now(),
  });

  await redis.publish(`job:${jobId}:updates`, JSON.stringify(update));
}
```

Progress SSE connections are naturally short-lived—they close when the job completes. This differs from notification SSE, which stays open indefinitely. Design your server to handle both patterns: some connections last seconds, others last hours.
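On the client side, a consumer of this stream should close its own EventSource once it sees a terminal status, since EventSource otherwise reconnects automatically when the server ends the response. The `watchJob` helper and its event wiring below are an illustrative sketch, not part of the service:

```typescript
type JobStatus = 'pending' | 'processing' | 'completed' | 'failed';

interface ProgressUpdate {
  jobId: string;
  status: JobStatus;
  progress: number;
  message?: string;
}

// Terminal statuses end the stream on the client as well as the server.
function isTerminal(status: JobStatus): boolean {
  return status === 'completed' || status === 'failed';
}

function watchJob(jobId: string, onUpdate: (u: ProgressUpdate) => void) {
  // EventSource is a browser global (also available in recent Node releases).
  const source = new EventSource(`/api/jobs/${jobId}/progress`);

  const handle = (e: MessageEvent) => {
    const update: ProgressUpdate = JSON.parse(e.data);
    onUpdate(update);
    if (isTerminal(update.status)) {
      source.close(); // short-lived by design: done when the job is done
    }
  };

  source.addEventListener('progress', handle);
  source.addEventListener('complete', handle);
  // The server reuses the 'error' event name for failed jobs; transport
  // errors also fire 'error' but carry no data, so guard before parsing.
  source.addEventListener('error', (e) => {
    const me = e as MessageEvent;
    if (typeof me.data === 'string') handle(me);
  });

  return source;
}
```

Without the `source.close()` call, the browser would reconnect after the job finished, receive the final status again, and loop.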
The rise of large language models (LLMs) has created a new killer use case for SSE: streaming AI-generated content token by token. Users expect the ChatGPT-style experience of seeing responses appear in real-time.

**Why SSE for LLM Streaming**
```typescript
// llm-stream.ts
import express from 'express';
import OpenAI from 'openai';
import Anthropic from '@anthropic-ai/sdk';

const app = express();
const openai = new OpenAI();

app.post('/api/chat', async (req, res) => {
  const { messages } = req.body;

  // Set SSE headers
  res.setHeader('Content-Type', 'text/event-stream');
  res.setHeader('Cache-Control', 'no-cache');
  res.setHeader('Connection', 'keep-alive');

  try {
    const stream = await openai.chat.completions.create({
      model: 'gpt-4-turbo-preview',
      messages,
      stream: true,
    });

    let fullContent = '';

    for await (const chunk of stream) {
      const content = chunk.choices[0]?.delta?.content || '';
      if (content) {
        fullContent += content;
        // Stream token to client
        res.write(`event: token\ndata: ${JSON.stringify({ content })}\n\n`);
      }

      // Check for finish reason
      if (chunk.choices[0]?.finish_reason) {
        res.write(`event: done\ndata: ${JSON.stringify({
          finish_reason: chunk.choices[0].finish_reason,
          usage: chunk.usage,
        })}\n\n`);
      }
    }

    // Save conversation to database
    await saveMessage(req.user.id, 'assistant', fullContent);

    res.end();
  } catch (error) {
    console.error('LLM error:', error);
    res.write(`event: error\ndata: ${JSON.stringify({
      error: 'Failed to generate response',
    })}\n\n`);
    res.end();
  }
});

// Alternative: Anthropic Claude streaming
app.post('/api/chat/claude', async (req, res) => {
  const { messages } = req.body;

  res.setHeader('Content-Type', 'text/event-stream');
  res.setHeader('Cache-Control', 'no-cache');

  const anthropic = new Anthropic();

  const stream = await anthropic.messages.stream({
    model: 'claude-3-opus-20240229',
    max_tokens: 4096,
    messages,
  });

  stream.on('text', (text) => {
    res.write(`event: token\ndata: ${JSON.stringify({ content: text })}\n\n`);
  });

  stream.on('message', (message) => {
    res.write(`event: done\ndata: ${JSON.stringify({
      stop_reason: message.stop_reason,
      usage: message.usage,
    })}\n\n`);
    res.end();
  });

  stream.on('error', (error) => {
    res.write(`event: error\ndata: ${JSON.stringify({ error: error.message })}\n\n`);
    res.end();
  });
});
```

Notice that LLM streaming uses POST with an SSE response. While EventSource only supports GET, you can use fetch() with response.body.getReader() to stream POST responses. Libraries like @microsoft/fetch-event-source simplify this pattern.
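A minimal sketch of that fetch-based pattern: a tiny SSE frame parser that buffers partial frames across network chunks, plus a `streamChat` helper that reads the POST response body. The helper name and endpoint are assumptions; the parser handles only the `event:` and `data:` fields used in this guide, not the full SSE grammar:

```typescript
interface SSEEvent {
  event: string;
  data: string;
}

// Returns a stateful parse function: feed it decoded text chunks and it
// yields complete events, buffering any partial frame for the next chunk.
function createSSEParser() {
  let buffer = '';
  return function parse(chunk: string): SSEEvent[] {
    buffer += chunk;
    const events: SSEEvent[] = [];
    let boundary: number;
    // Frames are separated by a blank line.
    while ((boundary = buffer.indexOf('\n\n')) !== -1) {
      const frame = buffer.slice(0, boundary);
      buffer = buffer.slice(boundary + 2);
      let event = 'message';
      const dataLines: string[] = [];
      for (const line of frame.split('\n')) {
        if (line.startsWith('event: ')) event = line.slice(7);
        else if (line.startsWith('data: ')) dataLines.push(line.slice(6));
      }
      if (dataLines.length > 0) {
        events.push({ event, data: dataLines.join('\n') });
      }
    }
    return events;
  };
}

// Streams a POST response, invoking onToken for each generated token.
async function streamChat(messages: unknown[], onToken: (t: string) => void) {
  const res = await fetch('/api/chat', {
    method: 'POST',
    headers: { 'Content-Type': 'application/json' },
    body: JSON.stringify({ messages }),
  });
  const reader = res.body!.getReader();
  const decoder = new TextDecoder();
  const parse = createSSEParser();
  while (true) {
    const { done, value } = await reader.read();
    if (done) break;
    for (const evt of parse(decoder.decode(value, { stream: true }))) {
      if (evt.event === 'token') onToken(JSON.parse(evt.data).content);
    }
  }
}
```

Unlike EventSource, this approach does not reconnect automatically, so wrap `streamChat` in your own retry logic if a dropped generation should resume.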
We've explored the primary use cases for Server-Sent Events: real-time notifications, live data feeds, dashboard updates, progress tracking, and LLM streaming. The common thread across all of them is one-way, server-to-client delivery over plain HTTP.
**Module Complete**

You now have comprehensive knowledge of Server-Sent Events—from protocol mechanics to browser support, reconnection handling, and practical use cases. SSE provides a powerful, simple solution for server-to-client real-time communication that works with standard HTTP infrastructure.
Congratulations! You've mastered Server-Sent Events. You understand when SSE is the right choice, how to implement it across various use cases, and the architectural patterns that make SSE applications scale. Apply this knowledge to build efficient, maintainable real-time features.