

Cloud‑native gateway unifying APIs, LLMs, and MCP traffic
Kong Gateway delivers high‑performance API, AI, and MCP routing with extensible plugins, Kubernetes‑native ingress, and declarative deployment, enabling teams to secure, scale, and manage traffic effortlessly.

Kong Gateway is designed for developers, DevOps teams, and AI engineers who need a single control plane to manage traditional APIs, large language model (LLM) services, and Model Context Protocol (MCP) traffic. It offers a cloud-native, platform-agnostic architecture that scales from small clusters to global deployments.
The gateway provides advanced routing, load balancing, health checks, and a rich set of authentication methods (JWT, OAuth, ACLs, etc.). Its universal LLM API lets you route requests to providers such as OpenAI, Anthropic, GCP Gemini, AWS Bedrock, Azure AI, and more. MCP features add traffic governance, security, and observability, while a vibrant plugin ecosystem extends functionality for rate limiting, transformations, serverless integration, and analytics.
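As a rough illustration of the plugin model, the declarative sketch below fronts a hypothetical REST service with key-auth and rate-limiting; the service name, upstream URL, consumer, and credential are placeholders rather than anything Kong ships.

```yaml
# kong.yml: minimal declarative sketch (placeholder names and addresses)
_format_version: "3.0"

services:
  - name: orders-api                  # placeholder upstream service
    url: http://orders.internal:8080  # placeholder address
    routes:
      - name: orders-route
        paths:
          - /orders
    plugins:
      - name: key-auth                # require an API key from a known consumer
      - name: rate-limiting
        config:
          minute: 60                  # 60 requests per minute per consumer
          policy: local

consumers:
  - username: example-app
    keyauth_credentials:
      - key: example-app-key          # placeholder credential
```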
Run Kong on any infrastructure: Docker Compose for quick local testing, DB-less declarative mode, hybrid control-plane/data-plane separation, or as a native Kubernetes Ingress Controller. For a managed experience, Kong Konnect offers a SaaS control plane with real-time analytics and developer portals.
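For the quick-local-testing path, a minimal Docker Compose sketch might look like the following, assuming the declarative file from the previous sketch is saved as kong.yml alongside it; the image tag and port mappings are illustrative.

```yaml
# docker-compose.yml: a single Kong node in DB-less mode serving ./kong.yml
services:
  kong:
    image: kong:3.9                              # use the tag you actually run
    environment:
      KONG_DATABASE: "off"                       # DB-less mode
      KONG_DECLARATIVE_CONFIG: /kong/kong.yml    # load the declarative file
      KONG_PROXY_LISTEN: "0.0.0.0:8000"
      KONG_ADMIN_LISTEN: "0.0.0.0:8001"
    volumes:
      - ./kong.yml:/kong/kong.yml:ro
    ports:
      - "8000:8000"   # proxy
      - "8001:8001"   # admin API
```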
When teams consider Kong, these hosted platforms usually appear on the same shortlist.

Amazon API Gateway: Fully managed service to create, publish, and secure APIs at any scale for backend access

Fully managed multicloud API management service for publishing, securing, and monitoring APIs across environments

API management platform to publish, secure, and analyze APIs
Unified API and LLM routing
Route requests to REST APIs and multiple LLM providers through a single gateway, simplifying authentication, observability, and policy enforcement.
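As a sketch of what LLM routing can look like in declarative form, the snippet below attaches the ai-proxy plugin to a chat route pointed at OpenAI; the route path, dummy upstream, model name, and API key are assumptions, and field names should be checked against the plugin documentation for your Kong version.

```yaml
# Sketch: a chat route proxied to OpenAI through the ai-proxy plugin
services:
  - name: llm-chat
    url: http://localhost:32000            # dummy upstream; ai-proxy handles the call
    routes:
      - name: llm-chat-route
        paths:
          - /llm/chat
    plugins:
      - name: ai-proxy
        config:
          route_type: llm/v1/chat
          auth:
            header_name: Authorization
            header_value: Bearer <your-openai-key>   # placeholder, never commit real keys
          model:
            provider: openai
            name: gpt-4o                             # illustrative model name
```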
MCP governance for internal services
Apply traffic policies, security controls, and real-time observability to MCP traffic between AI agents and the internal services they call.
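One hedged way to picture this with standard plugins: the sketch below places key-auth, ACL, IP restriction, and Prometheus metrics in front of a hypothetical internal MCP server; all names and addresses are placeholders, and Kong's dedicated MCP features may provide more specific controls than shown here.

```yaml
# Sketch: generic governance in front of a hypothetical internal MCP server
services:
  - name: internal-mcp
    url: http://mcp.internal:3000      # placeholder MCP server address
    routes:
      - name: mcp-route
        paths:
          - /mcp
    plugins:
      - name: key-auth                 # identify the calling agent or app
      - name: acl                      # security: only the "agents" group is allowed
        config:
          allow:
            - agents
      - name: ip-restriction           # policy: internal networks only
        config:
          allow:
            - 10.0.0.0/8
      - name: prometheus               # observability: expose traffic metrics
```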
Serverless function integration
Invoke AWS Lambda, Azure Functions, or other serverless workloads via plugins without custom code.
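A sketch of the Lambda case, using the aws-lambda plugin with placeholder region, function name, and credentials; in practice IAM roles are preferable to static keys.

```yaml
# Sketch: expose an AWS Lambda function behind a Kong route
services:
  - name: lambda-facade
    url: http://localhost:32001        # dummy upstream; the plugin invokes Lambda instead
    routes:
      - name: lambda-route
        paths:
          - /reports
    plugins:
      - name: aws-lambda
        config:
          aws_region: us-east-1              # placeholder region
          function_name: generate-report     # placeholder function name
          aws_key: <access-key-id>           # placeholder; prefer IAM roles where possible
          aws_secret: <secret-access-key>    # placeholder
```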
Hybrid multi‑region deployment
Separate control and data planes to achieve high availability and low latency across geographic regions.
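The sketch below shows the shape of a hybrid setup with one control plane and one data plane wired together through KONG_* environment variables; certificates, hostnames, and the control plane's database settings are placeholders or omitted.

```yaml
# Sketch: one control plane (kong-cp) and one data plane (kong-dp)
services:
  kong-cp:
    image: kong:3.9
    environment:
      KONG_ROLE: control_plane
      KONG_CLUSTER_CERT: /certs/cluster.crt        # shared mTLS cert (placeholder path)
      KONG_CLUSTER_CERT_KEY: /certs/cluster.key
      KONG_CLUSTER_LISTEN: "0.0.0.0:8005"
      # database settings (e.g. KONG_PG_HOST) omitted; the control plane needs them
    volumes:
      - ./certs:/certs:ro

  kong-dp:
    image: kong:3.9
    environment:
      KONG_ROLE: data_plane
      KONG_DATABASE: "off"                         # data planes run DB-less
      KONG_CLUSTER_CONTROL_PLANE: kong-cp:8005     # pull config from the control plane
      KONG_CLUSTER_CERT: /certs/cluster.crt
      KONG_CLUSTER_CERT_KEY: /certs/cluster.key
    volumes:
      - ./certs:/certs:ro
    ports:
      - "8000:8000"   # client traffic enters through the data plane
```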
Kong supports a declarative, DB-less mode where configuration is provided via YAML or JSON files, so no external database is required.
Kong’s universal LLM API abstracts provider specifics, allowing you to route requests to OpenAI, Anthropic, GCP Gemini, AWS Bedrock, Azure AI, and others from a single endpoint.
Kong includes a native Ingress Controller and can be deployed as a control plane or data plane within Kubernetes clusters.
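In a cluster running the Kong Ingress Controller, a plain Kubernetes Ingress is handed to Kong through its ingress class; the sketch below uses a placeholder host, namespace, and backend Service.

```yaml
apiVersion: networking.k8s.io/v1
kind: Ingress
metadata:
  name: orders                        # placeholder name
  namespace: default
  annotations:
    konghq.com/strip-path: "true"     # Kong-specific behaviour via annotation
spec:
  ingressClassName: kong              # route this Ingress through the Kong controller
  rules:
    - host: api.example.com           # placeholder host
      http:
        paths:
          - path: /orders
            pathType: Prefix
            backend:
              service:
                name: orders          # placeholder backend Service
                port:
                  number: 80
```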
Kong Gateway is released under the Apache‑2.0 license.
Core features are open source; advanced analytics, developer portals, and managed control‑plane (Konnect) are part of Kong’s commercial offerings.
Project at a glance
Active. Last synced 4 days ago.