Autoscale any workload using Kafka, RabbitMQ, Prometheus, AWS, GCP, Azure, Redis, PostgreSQL, MongoDB, and Datadog

Kedify autoscales any Kubernetes workload using proven open source technology.

  • 7.6k GitHub Stars
  • 3.5k Contributors
  • 50+ Organizations
  • 65+ Scalers

Unlock Intelligent Autoscaling, Powered by KEDA

Dynamically adapt to demand - up, out, or down - while prioritizing cluster performance and efficiency. Powered by KEDA, a Cloud Native Computing Foundation graduate, Kedify is built on popular and battle-tested open source technologies.

Start Scaling
Autoscaling

Scale Elastically Up or Down to Zero

Resource adjustment tailored to your workload and cluster: automatically scale applications with greater precision. Effortlessly handle sudden traffic surges, processing demands, expected holiday spikes, or any other type of event.

Adopt Cloud-Agnostic, Event-Driven Scaling

  • Avoid autoscaling implementations that result in cloud provider lock-in
  • Scale any workload: E-Commerce, AI & Machine Learning, Data & Analytics, IoT, Gaming, and more
  • Choose from 65+ open source scalers, bring your own, or use Kedify to build a custom scaler (a sample configuration follows below)
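
To make this concrete, here is a minimal KEDA ScaledObject sketch of the kind Kedify manages. It scales a hypothetical `order-processor` Deployment on Kafka consumer lag and drops to zero replicas when the topic is idle; the Deployment name, broker address, topic, and threshold are placeholder values for illustration only.

```yaml
# Minimal ScaledObject sketch (illustrative values only)
apiVersion: keda.sh/v1alpha1
kind: ScaledObject
metadata:
  name: order-processor-scaler
spec:
  scaleTargetRef:
    name: order-processor        # hypothetical Deployment to scale
  minReplicaCount: 0             # scale down to zero when the topic is idle
  maxReplicaCount: 20            # cap for traffic surges
  triggers:
    - type: kafka                # any of the 65+ scalers could be used here
      metadata:
        bootstrapServers: kafka.default.svc:9092   # placeholder broker address
        consumerGroup: order-processors            # placeholder consumer group
        topic: orders                              # placeholder topic
        lagThreshold: "50"                         # target lag per replica
```
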
Get Started
Highlights

Scalers

Support for 65+ proven scalers including AWS SQS, Azure Service Bus, GCP Tasks, Redis, Prometheus, Apache Kafka, RabbitMQ, Selenium and many more.

Explore All Scalers

Getting started is simple

  • Step 1

    Use `kubectl` to install the Kedify Agent and KEDA on your Kubernetes cluster.

  • Step 2

    Configure a workload and application to autoscale using Kedify (see the sketch after this list).

  • Step 3

    Monitor and optimize the autoscaling of your applications and workloads.
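
As a sketch of Step 2, the example below assumes a hypothetical `web-api` Deployment and an in-cluster Prometheus at `prometheus.monitoring.svc:9090`; it scales on request rate, and the replica bounds and threshold are the knobs you would revisit while monitoring in Step 3.

```yaml
# Step 2 sketch: scale a hypothetical web-api Deployment on request rate
apiVersion: keda.sh/v1alpha1
kind: ScaledObject
metadata:
  name: web-api-scaler
spec:
  scaleTargetRef:
    name: web-api                # hypothetical Deployment configured in Step 2
  minReplicaCount: 1
  maxReplicaCount: 10            # revisit these bounds while monitoring in Step 3
  triggers:
    - type: prometheus
      metadata:
        serverAddress: http://prometheus.monitoring.svc:9090   # placeholder Prometheus address
        query: 'sum(rate(http_requests_total{app="web-api"}[2m]))'
        threshold: "100"         # target requests per second per replica
```

Applying the manifest with `kubectl apply -f scaledobject.yaml` hands the scaling loop to KEDA, which Kedify installs in Step 1.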

Latest from our Blog