Applying a Rate Limit in Sidekiq

I discovered an easy way to implement job throttling using the ruby-limiter gem and the new “capsules” feature of Sidekiq 7.


I’ve been working on a Rails side-project where I need to frequently poll an external API for data in response to user activity. The polling takes place in the background via Sidekiq jobs.

The challenge is that this external API is notoriously strict about enforcing a rate limit: if the API is called too frequently, it will start failing with HTTP 400 errors. In the past I have tried solving this using the sidekiq-throttled gem. It sometimes works, but maintenance of the gem is spotty, and new Sidekiq releases often cause the gem to break.

I’ve now landed on a more robust approach using a new official feature of Sidekiq, paired with a gem from Shopify. I’ll walk you through the following solution in this post:

  1. How to use a Sidekiq capsule to limit concurrency, so that rate-limited jobs are constrained to a single thread.
  2. How to enforce a rate limit with the ruby-limiter gem, so that API calls don’t exceed a certain number per minute.

Have you also implemented something like this using a different approach? Let me know!


Sidekiq capsules

With the release of Sidekiq 7, a new concept called capsules can be used to define concurrency limits.

A capsule is a way to explicitly allocate threads to one or more Sidekiq queues.

Capsules are configured in the Sidekiq.configure_server block, which in a Rails app typically lives in config/initializers/sidekiq.rb.

How to make a worker single-threaded

To limit a worker so that only one of its jobs runs at a time (i.e. concurrency=1), first place that worker’s jobs in a dedicated queue, like this:

class MyRateLimitedWorker
  include Sidekiq::Worker
  sidekiq_options queue: "limited"
  # ...
end
This worker’s jobs will run in the limited queue.

Now you can configure a Sidekiq capsule to run the limited queue with a concurrency of 1:

Sidekiq.configure_server do |config|
  config.capsule("limited") do |cap|
    cap.concurrency = 1
    cap.queues = %w[limited]
  end
end
Instruct Sidekiq to run the limited queue one job at a time.

With this setup, Sidekiq will process MyRateLimitedWorker jobs one at a time, but still run them as fast as possible. In other words, the concurrency is limited, but the execution rate is not throttled: as soon as one job finishes, the next begins.

In my case, it was also important to limit the rate at which jobs are executed. I layered on another gem for this, as explained in the next section.


The ruby-limiter gem

The ruby-limiter gem, maintained by Shopify, is a pure-Ruby mechanism for throttling the rate at which a block of code is executed within a single Ruby process.

How to enforce an arbitrary rate limit

First, install the gem:

gem "ruby-limiter"

The gem provides a limit_method macro for applying throttling rules to arbitrary Ruby methods. This also works within a Sidekiq worker class.

class MyRateLimitedWorker
  include Sidekiq::Worker
  sidekiq_options queue: "limited"

  extend Limiter::Mixin
  limit_method :expensive_api_call, rate: 20, interval: 60, balanced: true

  def perform
    # ...
  end

  def expensive_api_call
    # ...
  end
end
This throttles expensive_api_call to 20 calls per 60 seconds. Specifying balanced: true means those 20 calls are spaced evenly to prevent bursting.

Together with the Sidekiq capsule configuration, this ensures that MyRateLimitedWorker jobs are processed one at a time, and the API calls do not exceed one every three seconds.
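To build intuition for how evenly spaced throttling works, here is a minimal, gem-free sketch of a balanced limiter. This is my own illustration of the concept, not the ruby-limiter gem’s actual implementation; the class name and API are invented for this example.

```ruby
# Illustrative sketch: an evenly spaced ("balanced") rate limiter.
# Not the ruby-limiter gem's internals -- just the underlying idea.
class EvenlySpacedLimiter
  def initialize(rate:, interval:)
    @spacing = interval.to_f / rate # seconds between permitted calls
    @next_allowed = Process.clock_gettime(Process::CLOCK_MONOTONIC)
    @mutex = Mutex.new
  end

  # Blocks (sleeps) until the next evenly spaced slot, then yields.
  def limit
    delay = @mutex.synchronize do
      now = Process.clock_gettime(Process::CLOCK_MONOTONIC)
      start_at = [now, @next_allowed].max
      @next_allowed = start_at + @spacing
      start_at - now
    end
    sleep(delay) if delay > 0
    yield
  end
end

# 4 calls per second means one call every 0.25 seconds
limiter = EvenlySpacedLimiter.new(rate: 4, interval: 1)
t0 = Process.clock_gettime(Process::CLOCK_MONOTONIC)
calls = []
3.times { |i| limiter.limit { calls << i } }
elapsed = Process.clock_gettime(Process::CLOCK_MONOTONIC) - t0
```

With rate: 20, interval: 60, the same logic yields one slot every three seconds, which is exactly the spacing described above.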

Keep in mind that the ruby-limiter gem blocks (i.e. sleeps) to enforce its rate limits. This is effective, but it means that while the rate limit is being observed, the Sidekiq thread is “stuck” and unable to work on other jobs. That is why placing rate-limited jobs in their own dedicated Sidekiq capsule is important: queues not governed by the capsule can still proceed normally.

Other considerations

Although this solution has worked great for me, there are some important caveats to consider:

  • The ruby-limiter gem works within a single Ruby process. If you deploy multiple Sidekiq processes, each will enforce its own independent rate limit, so the combined rate can exceed the API’s limit, which may not be what you want.
  • Sidekiq Enterprise has its own rate limiting system. I haven’t tried it, but if your team can afford the price tag, it might be worth considering (plans start at $229 per month).
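If the number of Sidekiq processes is known and stable at deploy time, one simple hedge for the multi-process caveat is to split the per-minute budget evenly across processes. A sketch, assuming a hypothetical SIDEKIQ_PROCESS_COUNT environment variable that you set yourself to match your deployment (this is plain arithmetic, not a gem feature):

```ruby
# Hypothetical: divide a global budget of 20 calls/minute across N
# identical Sidekiq processes so the combined rate stays under the limit.
PROCESS_COUNT = Integer(ENV.fetch("SIDEKIQ_PROCESS_COUNT", "1"))

class MyRateLimitedWorker
  include Sidekiq::Worker
  sidekiq_options queue: "limited"

  extend Limiter::Mixin
  # Each process gets an equal share; guard against rounding down to zero.
  limit_method :expensive_api_call,
               rate: [20 / PROCESS_COUNT, 1].max, interval: 60, balanced: true
end
```

Note that integer division rounds the per-process share down, which keeps the combined rate under budget; if PROCESS_COUNT ever exceeds the budget itself, the floor of 1 call per process could still overshoot.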


Hi! 👋 I’m Matt Brictson, a software engineer in San Francisco. This site is my excuse to practice UI design, fuss over CSS, and share my interest in open source. I blog about Rails, design patterns, and other development topics.
