🔌 MCP Intermediate

Kong

by Kong

Kong: Cloud-Native API & AI Gateway for Microservices

Production-ready gateway managing APIs, AI/LLM traffic, and microservices with plugins for auth, rate limiting, and observability.

42,591 Stars
5,053 Forks
132 Issues

About This Project

Kong is a high-performance, scalable API gateway built on Nginx and OpenResty that sits in front of your APIs and microservices to handle routing, authentication, rate limiting, and traffic control. Written in Lua, it provides a plugin-based architecture that lets you extend functionality without modifying the core.
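To make the routing model concrete, here is a minimal sketch of the two objects Kong's Admin API (served on port 8001 by default) uses to put the gateway in front of an upstream: a service (where the upstream lives) and a route (the public path on the proxy). The payloads are built as plain dicts; the service name, URL, and path are illustrative placeholders, not values from this project.

```python
import json

def service_payload(name: str, upstream_url: str) -> dict:
    """Payload for POST /services: registers the upstream with Kong."""
    return {"name": name, "url": upstream_url}

def route_payload(path: str) -> dict:
    """Payload for POST /services/{name}/routes: the public path Kong
    listens on; strip_path removes it before proxying upstream."""
    return {"paths": [path], "strip_path": True}

if __name__ == "__main__":
    svc = service_payload("orders-api", "http://orders.internal:8080")
    rt = route_payload("/orders")
    # With Kong running, these would be sent with any HTTP client, e.g.
    #   curl -X POST localhost:8001/services -d name=orders-api ...
    print(json.dumps(svc))
    print(json.dumps(rt))
```

Once both objects exist, requests to the proxy port (8000 by default) under `/orders` are forwarded to the upstream, with any plugins applied along the way.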

As cloud-native applications grow more complex, Kong addresses critical challenges like securing API endpoints, managing traffic between hundreds of microservices, and now routing AI/LLM requests efficiently. It works seamlessly with Kubernetes as an Ingress controller, making it ideal for containerized environments and DevOps workflows.

What sets Kong apart is its dual focus on traditional API management and modern AI gateway capabilities. You can proxy requests to OpenAI, manage LLM operations, and apply consistent policies across both REST APIs and AI services. The extensive plugin ecosystem covers everything from OAuth2 and JWT authentication to Prometheus monitoring and custom transformations.
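Those consistent policies are applied by attaching plugins to services or routes through the same Admin API. As a hedged sketch, the payloads below enable two of the bundled plugins (`rate-limiting` and `key-auth`) on a single service via `POST /plugins`; the service name is a placeholder.

```python
def rate_limit_plugin(service_name: str, per_minute: int) -> dict:
    """Enable `rate-limiting` on one service. `policy: local` keeps
    counters in the Kong node itself rather than a shared backend."""
    return {
        "name": "rate-limiting",
        "service": {"name": service_name},
        "config": {"minute": per_minute, "policy": "local"},
    }

def key_auth_plugin(service_name: str) -> dict:
    """Enable `key-auth`, which rejects requests that lack a valid
    consumer API key (sent in the `apikey` header by default)."""
    return {"name": "key-auth", "service": {"name": service_name}}

# Kong's AI plugins (e.g. `ai-proxy`) are configured through this
# same /plugins mechanism, which is how one set of policy tooling
# covers both REST and LLM traffic.
```

The same attach-a-plugin pattern scales from a single route up to global policies applied to every request through the gateway.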

Whether you're running on-premises, in the cloud, or across hybrid environments, Kong delivers enterprise-grade reliability with declarative configuration, real-time analytics, and the flexibility to handle millions of requests per second while maintaining low latency.

Key Features

  • Plugin-based architecture with 50+ official plugins for auth, security, traffic control, and monitoring
  • Native Kubernetes Ingress controller with custom resource definitions (CRDs)
  • AI gateway capabilities for routing, rate limiting, and observability of LLM/AI API calls
  • High-performance Lua/Nginx foundation handling millions of requests with millisecond latency
  • Declarative configuration via YAML/JSON with GitOps support and database or DB-less modes
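The declarative, DB-less mode mentioned above replaces Admin API calls with a single versioned file. The sketch below assembles that structure as a plain dict; in practice it is written as YAML in `kong.yml` and loaded by Kong running with `database = off`, or synced with decK. All names and URLs here are placeholders.

```python
import json

def declarative_config() -> dict:
    """A minimal DB-less config: one service, one route, one plugin,
    mirroring the shape of a kong.yml file."""
    return {
        "_format_version": "3.0",
        "services": [
            {
                "name": "orders-api",
                "url": "http://orders.internal:8080",
                "routes": [
                    {"name": "orders-route", "paths": ["/orders"]}
                ],
                "plugins": [
                    {
                        "name": "rate-limiting",
                        "config": {"minute": 60, "policy": "local"},
                    }
                ],
            }
        ],
    }

if __name__ == "__main__":
    # Serialized (here as JSON for illustration), this is the whole
    # gateway state — easy to diff, review, and keep in Git.
    print(json.dumps(declarative_config(), indent=2))
```

Because the entire gateway state lives in one file, it fits naturally into GitOps pipelines: config changes go through pull requests like any other code.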

How You Can Use It

1. Centralized API gateway for microservices architectures with authentication and rate limiting
2. Kubernetes Ingress controller managing traffic routing and load balancing for containerized apps
3. AI/LLM gateway proxying and monitoring requests to OpenAI, Anthropic, and other AI providers
4. Multi-cloud API management with consistent security policies across different environments
5. Serverless function gateway orchestrating AWS Lambda, Google Cloud Functions, and Azure Functions
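From the consumer's side, all of these use cases look the same: the client calls Kong's proxy port instead of the upstream directly. The sketch below only constructs such a request (it is never sent), showing a key-auth-protected call with the key in the `apikey` header; the host, path, and key are placeholders.

```python
from urllib.request import Request

def gateway_request(proxy_base: str, path: str, api_key: str) -> Request:
    """Build a request aimed at Kong's proxy port (8000 by default).
    Kong validates the `apikey` header before forwarding upstream."""
    return Request(proxy_base + path, headers={"apikey": api_key})

# Placeholder values for illustration only:
req = gateway_request("http://localhost:8000", "/orders", "my-secret-key")
```

The upstream service never sees the key exchange: authentication, rate limiting, and logging all happen in the gateway before the request is proxied.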

Who Is This For?

Platform engineers, DevOps teams, backend developers, and architects building microservices, API-first applications, or AI-powered systems that require robust traffic management and security.