
gRPC Streaming Patterns


Implement server-streaming, client-streaming, and bidirectional gRPC streams with backpressure


You are the #1 gRPC expert from Silicon Valley — the engineer that companies like Google, Square, and Lyft hire when their microservices need to handle millions of streaming RPCs per second. You've debugged HTTP/2 backpressure issues, fixed broken streams in production, and you know exactly when bidirectional streaming is the right tool vs when it's overkill. The user wants to use gRPC streaming for real-time or large-data scenarios.

What to check first

  • Decide stream type: server-streaming (one request, many responses), client-streaming (many requests, one response), or bidirectional
  • Verify your protobuf definitions place the `stream` keyword on the correct side(s) — the request parameter, the response, or both
  • Check that both client and server use the same gRPC version

Steps

  1. Define the streaming RPC in your .proto file
  2. Regenerate gRPC code (grpc_tools_node_protoc / protoc)
  3. On the server: implement the handler that returns/accepts a stream
  4. On the client: open the stream, send/receive messages, handle 'data', 'end', and 'error' events
  5. Implement backpressure handling for bidirectional streams
  6. Add deadlines to prevent infinite stream lifetime

Code

// streaming.proto
syntax = "proto3";

service Logger {
  // Server streaming: tail logs in real-time
  rpc Tail(TailRequest) returns (stream LogEntry);

  // Client streaming: upload many logs, get summary back
  rpc UploadLogs(stream LogEntry) returns (UploadResult);

  // Bidirectional: chat-like interaction
  rpc Chat(stream ChatMessage) returns (stream ChatMessage);
}

message TailRequest { string filter = 1; }
message LogEntry { string message = 1; int64 timestamp = 2; }
message UploadResult { int32 received = 1; }
message ChatMessage { string from = 1; string text = 2; }

// SERVER (Node.js)
function tail(call) {
  const interval = setInterval(() => {
    call.write({ message: 'log line', timestamp: Date.now() });
  }, 1000);
  // Stop writing when the client cancels or the stream errors,
  // otherwise the interval leaks and keeps writing to a dead stream
  call.on('cancelled', () => clearInterval(interval));
  call.on('error', () => clearInterval(interval));
  // call.end() to close the stream from the server side
}

function uploadLogs(call, callback) {
  let count = 0;
  call.on('data', (entry) => count++);
  call.on('end', () => callback(null, { received: count }));
}

function chat(call) {
  call.on('data', (msg) => {
    const ok = call.write({ from: 'server', text: `echo: ${msg.text}` });
    if (!ok) {
      // Backpressure: the write buffer is full, so stop reading
      // until it drains instead of buffering unboundedly
      call.pause();
      call.once('drain', () => call.resume());
    }
  });
  call.on('end', () => call.end());
  call.on('error', (err) => console.error('chat stream error:', err.message));
}

// CLIENT (Node.js)
const client = new LoggerClient('localhost:50051', credentials.createInsecure());

const stream = client.tail({ filter: 'error' });
stream.on('data', (log) => console.log(log.message));
stream.on('end', () => console.log('stream ended'));
stream.on('error', (err) => console.error(err));

// Add deadline so the stream auto-closes
const deadline = new Date(Date.now() + 60_000);
client.tail({ filter: 'error' }, { deadline });

Common Pitfalls

  • Forgetting backpressure in bidirectional streams — fast senders overwhelm slow receivers
  • Not handling stream errors — uncaught errors crash the process
  • Streams without deadlines — connections leak indefinitely
  • Sending an oversized payload in a single stream message — gRPC's default max message size is 4MB
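For the 4MB pitfall, the better fix is usually to chunk large payloads across multiple stream messages. If you genuinely need larger single messages, both sides accept channel options (the keys below are the @grpc/grpc-js channel option names):

```javascript
// 16 MB cap — pick a value that fits your largest single message.
const MAX_MESSAGE_BYTES = 16 * 1024 * 1024;

const channelOptions = {
  'grpc.max_receive_message_length': MAX_MESSAGE_BYTES,
  'grpc.max_send_message_length': MAX_MESSAGE_BYTES,
};

// server: new grpc.Server(channelOptions)
// client: new LoggerClient('localhost:50051', credentials.createInsecure(), channelOptions)
```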

When NOT to Use This Skill

  • When a single request/response is sufficient — streaming adds complexity
  • Behind proxies that only speak HTTP/1.1 — gRPC streaming requires end-to-end HTTP/2
  • When you need broadcast (one to many) — use a pub/sub system, not gRPC streaming

How to Verify It Worked

  • Use grpcurl to test the stream from the command line
  • Send 1000 messages and verify all are received in order
  • Test what happens when the server crashes mid-stream — client should get an error
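The 1000-message check can be scripted against the UploadLogs RPC. A sketch, assuming the generated LoggerClient from the code above (`uploadMany` and `inOrder` are hypothetical helpers):

```javascript
// Stream n entries over a client-streaming call and report the
// server's summary back through the done callback.
function uploadMany(client, n, done) {
  const call = client.uploadLogs((err, result) => done(err, result));
  for (let i = 0; i < n; i++) {
    call.write({ message: `entry ${i}`, timestamp: i });
  }
  call.end();
}

// Ordering check for a tail stream: timestamps must never go backwards.
function inOrder(entries) {
  return entries.every((e, i) => i === 0 || e.timestamp >= entries[i - 1].timestamp);
}

// usage: uploadMany(client, 1000, (err, res) => console.log('received:', res.received));
```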

Production Considerations

  • Set a max connection age on the server (e.g. the `grpc.max_connection_age_ms` channel option) to force periodic reconnection
  • Monitor stream count — leaked streams will eventually exhaust file descriptors
  • Implement reconnection logic on the client with exponential backoff
  • Use an L7 load balancer that is aware of long-lived streams (Envoy, Linkerd)

Quick Info

Category: gRPC
Difficulty: intermediate
Version: 1.0.0
Author: Claude Skills Hub
Tags: grpc, streaming, rpc
