Apache Top-Level Project · Open Source

AI-Native Asynchronous
Communication Engine

Purpose-built for multi-agent collaboration, AI inference scheduling, and MCP/session state management — battle-tested at trillion-message scale.

Proven solutions for classic agentic-AI scenarios

MCP & Long-Session State Continuity

AI tasks are long-running and GPU-expensive. Unstable SSE/WebSocket connections cause session interruption, context loss, and wasted compute.
  • Each session maps to a LiteTopic (chat/{sessionID}), keeping app servers completely stateless
  • On reconnection to any node, subscribe to the same LiteTopic and resume from breakpoint
  • Backend LLM tasks continue running — results persist regardless of frontend disconnection
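The session-to-LiteTopic mapping above can be modeled in a few lines. This is an in-memory sketch only — `LiteTopicBroker`, `publish`, and `resume` are hypothetical stand-ins for the broker and gRPC SDK, not real RocketMQ APIs; only the `chat/{sessionID}` naming and the resume-from-breakpoint idea come from the text:

```python
# In-memory model of per-session LiteTopics with resume-from-offset.
# A real deployment would use the RocketMQ gRPC SDK against a cluster.

class LiteTopicBroker:
    def __init__(self):
        self._topics = {}  # topic name -> ordered message log

    def topic_for_session(self, session_id: str) -> str:
        # One LiteTopic per session keeps app servers stateless.
        return f"chat/{session_id}"

    def publish(self, topic: str, message: str) -> int:
        log = self._topics.setdefault(topic, [])
        log.append(message)
        return len(log) - 1  # offset of the appended message

    def resume(self, topic: str, from_offset: int) -> list:
        # On reconnection to any node, replay everything after the
        # client's last acknowledged offset -- the "breakpoint".
        return self._topics.get(topic, [])[from_offset:]


broker = LiteTopicBroker()
topic = broker.topic_for_session("s-42")
for chunk in ["Hello", ", ", "world"]:
    broker.publish(topic, chunk)  # LLM keeps streaming while client is away

# Client reconnects having acknowledged only offset 0:
missed = broker.resume(topic, from_offset=1)
print(missed)  # -> [', ', 'world']
```

Because results persist in the topic rather than in the app server, the frontend can disconnect and reattach from any node without losing output.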
Session Continuity · Zero Compute Waste · Stateless Scaling
[Diagram: MCP & Long-Session State Continuity architecture]

The underlying capabilities that make enterprise AI communication reliable, fast, and effortless.

LiteTopic — Million-Scale

  • Millions of LiteTopics per cluster, auto-created on demand
  • TTL-based auto-expiration, zero manual maintenance
  • Per-consumer selective subscription within same group
  • Strict ordering within each LiteTopic
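The auto-create and TTL-expiration behavior in the list above can be sketched as a registry of last-activity timestamps. `LiteTopicRegistry`, `touch`, and `sweep` are illustrative names, not SDK API — a real cluster expires idle LiteTopics server-side:

```python
# Sketch of TTL-based auto-expiration for auto-created LiteTopics.

class LiteTopicRegistry:
    def __init__(self, ttl_seconds: float):
        self.ttl = ttl_seconds
        self._last_used = {}  # topic -> last-activity timestamp

    def touch(self, topic: str, now: float) -> None:
        # Topics are auto-created on first use; no manual provisioning.
        self._last_used[topic] = now

    def sweep(self, now: float) -> list:
        # Drop topics idle longer than the TTL -- zero manual maintenance.
        expired = [t for t, ts in self._last_used.items()
                   if now - ts > self.ttl]
        for t in expired:
            del self._last_used[t]
        return expired


reg = LiteTopicRegistry(ttl_seconds=60)
reg.touch("chat/old", now=0)
reg.touch("chat/new", now=100)
expired = reg.sweep(now=120)   # 'chat/old' idle 120s > 60s TTL
print(expired)  # -> ['chat/old']
```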
📦

Batch Consumption

  • Dual-trigger: message count + time window
  • PushConsumer & SimpleConsumer dual-mode
  • Native fit for LLM Batch API paradigms
  • Adaptive pacing for varying throughput
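The dual-trigger rule above — flush a batch when either the message count or the time window is reached — is the pattern LLM Batch APIs expect. A minimal sketch of the trigger logic, with hypothetical names and thresholds (not the PushConsumer/SimpleConsumer API itself):

```python
# Dual-trigger batch buffer: flush on message count OR elapsed time,
# whichever fires first.

class BatchBuffer:
    def __init__(self, max_count: int, max_wait: float):
        self.max_count = max_count  # count trigger
        self.max_wait = max_wait    # time-window trigger, in seconds
        self._items = []
        self._first_at = None       # arrival time of oldest buffered message

    def add(self, item, now: float) -> None:
        if not self._items:
            self._first_at = now
        self._items.append(item)

    def ready(self, now: float) -> bool:
        if not self._items:
            return False
        return (len(self._items) >= self.max_count
                or now - self._first_at >= self.max_wait)

    def flush(self) -> list:
        batch, self._items, self._first_at = self._items, [], None
        return batch


buf = BatchBuffer(max_count=3, max_wait=5.0)
buf.add("m1", now=0.0)
buf.add("m2", now=1.0)
early = buf.ready(now=1.0)   # False: neither trigger has fired
timed = buf.ready(now=6.0)   # True: time-window trigger (6s >= 5s)
buf.add("m3", now=6.0)
count = buf.ready(now=6.0)   # True: count trigger (3 messages)
batch = buf.flush()
print(batch)  # -> ['m1', 'm2', 'm3']
```

The count trigger caps batch size for the downstream API; the time trigger bounds latency when traffic is sparse, which is what "adaptive pacing for varying throughput" amounts to.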
🎛️

Fine-Grained Flow Control

  • Per-LiteTopic Suspend/Resume at consumer level
  • Millisecond-level rate limiting, isolated impact
  • Smooth traffic curves with elastic scaling
  • Cluster-wide top-N queue governance
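Per-topic Suspend/Resume with isolated rate limiting can be modeled as a token bucket keyed by LiteTopic. `TopicFlowControl` and its methods are illustrative names under that assumption, not the consumer SDK:

```python
# Per-LiteTopic flow control: Suspend/Resume plus a token-bucket
# rate limit, with no cross-topic interference.

class TopicFlowControl:
    def __init__(self, rate_per_sec: float):
        self.rate = rate_per_sec
        self.suspended = set()
        self._buckets = {}  # topic -> (tokens, last_refill_time)

    def suspend(self, topic: str) -> None:
        self.suspended.add(topic)

    def resume(self, topic: str) -> None:
        self.suspended.discard(topic)

    def allow(self, topic: str, now: float) -> bool:
        if topic in self.suspended:
            return False  # suspending one topic leaves others untouched
        tokens, last = self._buckets.get(topic, (self.rate, now))
        tokens = min(self.rate, tokens + (now - last) * self.rate)
        if tokens >= 1.0:
            self._buckets[topic] = (tokens - 1.0, now)
            return True
        self._buckets[topic] = (tokens, now)
        return False


fc = TopicFlowControl(rate_per_sec=2.0)
fc.suspend("chat/s-1")
blocked = fc.allow("chat/s-1", now=0.0)  # False: suspended
other   = fc.allow("chat/s-2", now=0.0)  # True: impact is isolated
fc.resume("chat/s-1")
resumed = fc.allow("chat/s-1", now=0.0)  # True: delivery resumes
print(blocked, other, resumed)
```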
🛡️

High Reliability & HA

  • Durable persistence + offset management
  • Multi-replica sync replication, cross-region DR
  • Auto failover, financial-grade transactions
  • Breakpoint recovery for AI workflows
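Breakpoint recovery for AI workflows rests on durable offset management: the consumer checkpoints its offset after each completed step, so after a failure it resumes from the last checkpoint instead of re-running earlier, GPU-expensive steps. A toy model of that pattern (the step names and `run_workflow` helper are hypothetical):

```python
# Offset checkpointing: after a crash, resume from the last durable
# offset rather than restarting the workflow from scratch.

steps = ["embed", "retrieve", "generate", "postprocess"]

def run_workflow(checkpoint: dict, crash_after=None) -> list:
    done = []
    for offset in range(checkpoint.get("offset", 0), len(steps)):
        if crash_after is not None and offset == crash_after:
            raise RuntimeError("node lost")   # simulated failover
        done.append(steps[offset])
        checkpoint["offset"] = offset + 1     # durable ack per step
    return done


ckpt = {}
try:
    run_workflow(ckpt, crash_after=2)  # dies before 'generate'
except RuntimeError:
    pass
recovered = run_workflow(ckpt)  # resumes at the breakpoint, offset 2
print(recovered)  # -> ['generate', 'postprocess']
```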
☁️

Cloud-Native Architecture

  • Compute-storage separation, stateless Proxy
  • Kubernetes-native, infinite elastic scaling
  • gRPC SDK: Java / Go / Python / C++ / Rust
  • Trillion-message scale, proven at 10M+ TPS
🔗

Protocol & Ecosystem

  • MCP (Model Context Protocol) native support
  • A2A (Agent-to-Agent) protocol compatible
  • LangChain, CrewAI, AutoGen, AgentScope
  • Framework-agnostic: Dify, Coze, and more
Also Included — Classic Integration Capabilities
EventBridge

CloudEvents-compatible event bus for cross-platform routing and Serverless integration.

MQTT

Native IoT protocol support for massive device connectivity and cloud-edge collaboration.

RocketMQ Connect

Low-code data integration connecting 50+ sources for streaming ETL pipelines.

RocketMQ Streams

Lightweight stream processing engine with Flink SQL compatibility.

Join Community

Subscribe to the mailing lists, follow blog posts, and participate in events.