Request-Response Pattern Under the Hood
Understanding what happens between a client request and server response reveals how Node.js leverages the event loop for scalability.
The Complete Flow
```
Client                     Node.js Server               Operating System
   │                              │                             │
   │──────TCP connect───────────>│                             │
   │                              │<────notify connection───────│
   │                              │     (libuv poll phase)      │
   │                              │                             │
   │──────HTTP request──────────>│                             │
   │                              │<────data available──────────│
   │                              │   (poll phase callback)     │
   │                              │                             │
   │                              │       [Your code runs]      │
   │                              │                             │
   │<─────HTTP response──────────│                             │
   │                              │                             │
```
What Happens on 'connection'
When a client connects:
```javascript
const net = require('net');

const server = net.createServer((socket) => {
  // This callback fires in the poll phase
  // 'socket' is a duplex stream
  console.log('Client connected');

  socket.on('data', (data) => {
    // Data from client (poll phase)
    console.log('Received:', data.toString());
  });
});

server.listen(3000);
```
For HTTP, Node wraps this lower-level socket handling:
```javascript
const http = require('http');

http.createServer((req, res) => {
  // req wraps the incoming socket data
  // res wraps the outgoing socket writes
  res.end('OK');
}).listen(3000);
```
Request Processing Timeline
```javascript
const http = require('http');

http.createServer((req, res) => {
  // 1. Connection accepted (poll phase)
  //    - TCP socket established
  //    - HTTP parser attached

  // 2. Headers parsed
  console.log(req.method, req.url);
  console.log(req.headers);

  // 3. Body arrives in chunks (poll phase callbacks)
  const chunks = [];
  req.on('data', (chunk) => {
    chunks.push(chunk);
  });

  // 4. Request complete
  req.on('end', () => {
    const body = Buffer.concat(chunks).toString();

    // 5. Process request (your sync code)
    const result = processRequest(body);

    // 6. Send response (writes queued)
    res.writeHead(200);
    res.end(JSON.stringify(result));

    // 7. Response flushed to socket
    //    (happens after your code returns)
  });
}).listen(3000);
```
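The `processRequest` call above stands in for your application logic and isn't defined in this section. A minimal hypothetical version might just echo the parsed body:

```javascript
// Hypothetical stand-in for the processRequest() used above:
// echoes the body back along with its length.
function processRequest(body) {
  return { received: body, length: body.length };
}
```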
The Auto-Inserted Objects
Node automatically provides request and response objects:
request (IncomingMessage)
```javascript
// What Node builds from the raw TCP data (illustrative sketch):
const request = {
  // Parsed from the request line: "GET /path HTTP/1.1"
  method: 'GET',
  url: '/path',
  httpVersion: '1.1',

  // Parsed headers (names are lowercased)
  headers: {
    'host': 'localhost:3000',
    'user-agent': 'curl/7.64.1',
    'content-type': 'application/json'
  },

  // The underlying socket
  socket: /* net.Socket */,

  // Stream methods for reading the body
  on: /* EventEmitter.on */,
  read: /* Readable.read */
};
```
response (ServerResponse)
```javascript
// What Node provides for building the response:
const response = {
  statusCode: 200,

  // Header methods
  setHeader: function(name, value) {},
  writeHead: function(statusCode, headers) {},

  // Body methods (Writable stream)
  write: function(chunk) {},
  end: function(data) {},

  // The underlying socket
  socket: /* net.Socket */
};
```
Event Loop During Request
```javascript
const http = require('http');
const fs = require('fs');

http.createServer(async (req, res) => {
  // POLL PHASE: this callback is triggered

  // SYNC: runs immediately
  console.log('Request received');

  // ASYNC: file read queued
  const data = await fs.promises.readFile('data.json');
  // ^^ Event loop continues while waiting
  // ^^ Continuation scheduled when the file is ready

  // SYNC: process data
  const result = JSON.parse(data);

  // SYNC: queue response for sending
  res.end(JSON.stringify(result));

  // After the function returns:
  // - Response flushed in the current/next poll phase
  // - Connection may be kept alive for the next request
}).listen(3000);
```
Concurrent Request Handling
Node handles many concurrent requests on a single JavaScript thread, with no thread per connection:
```javascript
// Request 1 arrives
//   → Callback queued
//   → Event loop runs callback
//   → Hits await (DB query)
//   → Callback exits, event loop continues

// Request 2 arrives (while Request 1 waits for the DB)
//   → Callback queued
//   → Event loop runs callback
//   → Hits await (different DB query)
//   → Callback exits

// Request 1's DB query returns
//   → Callback continues after the await
//   → Response sent

// Request 2's DB query returns
//   → Callback continues
//   → Response sent
```
```javascript
const http = require('http');

http.createServer(async (req, res) => {
  console.log(`Start: ${req.url}`);

  // This doesn't block other requests!
  // (assumes a promise-based `database` client, not shown here)
  const data = await database.query('SELECT * FROM users');

  console.log(`End: ${req.url}`);
  res.end(JSON.stringify(data));
}).listen(3000);

// Output with 3 concurrent requests:
//   Start: /request1
//   Start: /request2
//   Start: /request3
//   End: /request2   (its query finished first)
//   End: /request1
//   End: /request3
```
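The interleaving can be reproduced without a real server or database by awaiting timers of different lengths. A sketch where `setTimeout` stands in for the DB query:

```javascript
const order = [];

async function handle(name, delayMs) {
  order.push(`Start: ${name}`);
  // Stand-in for `await database.query(...)`
  await new Promise((resolve) => setTimeout(resolve, delayMs));
  order.push(`End: ${name}`);
}

// Three "requests" arrive back-to-back; the one with the
// shortest "query" finishes first.
Promise.all([
  handle('/request1', 30),
  handle('/request2', 10),
  handle('/request3', 50),
]).then(() => console.log(order.join('\n')));
```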
What Blocks the Event Loop
```javascript
const http = require('http');

http.createServer((req, res) => {
  if (req.url === '/slow') {
    // BAD: blocks ALL requests!
    const result = fibonacci(45);
    res.end(result.toString());
  }
  if (req.url === '/fast') {
    // This request waits for /slow to finish!
    res.end('Hello');
  }
}).listen(3000);
```
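The `fibonacci` above isn't defined in this section; a naive recursive version like this (hypothetical) is exactly the kind of CPU-bound work that pins the event loop:

```javascript
// Naive recursive Fibonacci: exponential time, all of it spent
// synchronously on the event loop thread.
function fibonacci(n) {
  return n < 2 ? n : fibonacci(n - 1) + fibonacci(n - 2);
}
```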
Solution: Worker Threads
```javascript
const http = require('http');
const { Worker } = require('worker_threads');

http.createServer((req, res) => {
  if (req.url === '/slow') {
    // Run CPU-intensive work in a separate thread
    const worker = new Worker('./fibonacci-worker.js', {
      workerData: { n: 45 }
    });
    worker.on('message', (result) => {
      res.end(result.toString());
    });
    worker.on('error', () => {
      res.writeHead(500);
      res.end('Worker failed');
    });
  }
}).listen(3000);
```
Response Buffering
```javascript
const http = require('http');

http.createServer((req, res) => {
  // These writes are buffered
  res.write('Part 1');
  res.write('Part 2');
  res.write('Part 3');
  res.end();

  // Data is sent after the function returns
  // (during the event loop's poll phase)
}).listen(3000);
```
Flushing Immediately
```javascript
const http = require('http');

http.createServer((req, res) => {
  res.writeHead(200, {
    'Content-Type': 'text/event-stream',
    'Cache-Control': 'no-cache'
  });
  // For Server-Sent Events, push the headers out immediately
  // (after the first write() they would already have gone out)
  res.flushHeaders();

  res.write('data: First event\n\n');

  const timer = setInterval(() => {
    res.write(`data: ${Date.now()}\n\n`);
  }, 1000);

  // Stop streaming when the client disconnects
  req.on('close', () => clearInterval(timer));
}).listen(3000);
```
Connection Management
```javascript
const http = require('http');

http.createServer((req, res) => {
  // Detect client disconnects
  req.on('close', () => {
    console.log('Client disconnected');
  });

  // Long-running operation (longOperation is a hypothetical
  // helper that returns a cancel function)
  const cancel = longOperation((result) => {
    if (res.writableEnded || res.destroyed) {
      return; // Already responded, or client is gone
    }
    res.end(result);
  });

  // Cancel any pending work when the client goes away
  req.on('close', () => cancel());
}).listen(3000);
```
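The `longOperation` above is hypothetical; the pattern it assumes (start work, return a cancel function) can be sketched with a timer standing in for the real work:

```javascript
// Hypothetical longOperation(): invokes the callback when the
// "work" finishes, and returns a function that cancels it.
function longOperation(callback) {
  const timer = setTimeout(() => callback('result'), 5000);
  return () => clearTimeout(timer); // cancel
}
```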
Key Takeaways
- Request and response objects are streams wrapping the TCP socket
- The event loop handles concurrency: one request's I/O doesn't block others
- Async operations yield to the event loop (good)
- Sync CPU-bound operations block all requests (bad)
- Response writes are buffered and flushed by the event loop
- Check for disconnection before sending late responses