I’ve been working on a Rails side-project where I need to frequently poll an external API for data in response to user activity. The polling takes place in the background via Sidekiq jobs.
The challenge is that this external API is notoriously strict about enforcing a rate limit: if the API is called too frequently, it will start failing with HTTP 400 errors. In the past I have tried solving this using the sidekiq-throttled gem. It sometimes works, but maintenance of the gem is spotty, and new Sidekiq releases often cause the gem to break.
I’ve now landed on a more robust approach using a new official feature of Sidekiq, paired with a gem from Shopify. I’ll walk you through the following solution in this post:
- How to use a Sidekiq capsule to limit concurrency, so that rate-limited jobs are constrained to a single thread.
- How to enforce a rate limit with the ruby-limiter gem, so that API calls don’t exceed a certain number per minute.
Have you also implemented something like this using a different approach? Let me know!
With the release of Sidekiq 7, a new concept called capsules can be used to define concurrency limits.
A capsule is a way to explicitly allocate threads to one or more Sidekiq queues.
Capsules are configured in the `Sidekiq.configure_server` block, which in a Rails app typically lives in a Sidekiq initializer file.
To limit a worker to one job at a time (i.e. concurrency = 1), first place that worker’s jobs in a dedicated queue, like this:
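A sketch of such a worker, assuming Sidekiq 7’s `Sidekiq::Job` API; the worker and queue names (`MyRateLimitedWorker`, `limited`) are illustrative:

```ruby
# Hypothetical worker whose jobs all land on a dedicated "limited" queue.
class MyRateLimitedWorker
  include Sidekiq::Job

  # Route every job from this worker to the dedicated queue.
  sidekiq_options queue: :limited

  def perform(record_id)
    # ... call the external API for this record ...
  end
end
```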
Now you can configure a Sidekiq capsule to run the `limited` queue with a concurrency of 1:
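A minimal version of that configuration might look like this (the initializer path and the capsule/queue name `limited` are assumptions; adjust to your app):

```ruby
# config/initializers/sidekiq.rb (typical location in a Rails app)
Sidekiq.configure_server do |config|
  # This capsule reserves exactly one thread, dedicated to the
  # "limited" queue, so its jobs can never run concurrently.
  config.capsule("limited") do |cap|
    cap.concurrency = 1
    cap.queues = %w[limited]
  end
end
```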
With this setup, Sidekiq will process `MyRateLimitedWorker` jobs one at a time, but still run them back to back as fast as possible. In other words, the concurrency is limited, but the execution speed is not throttled.
In my case, it was also important to limit the rate at which jobs are executed. I layered on another gem for this, as explained in the next section.
The ruby-limiter gem, maintained by Shopify, is a pure-Ruby mechanism for throttling the rate at which a block of code is executed within a single Ruby process.
First, install the gem:
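One way to do this (note that the gem is published as `ruby-limiter` but required as `limiter`):

```shell
bundle add ruby-limiter
```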
The gem provides a `limit_method` macro for applying throttling rules to arbitrary Ruby methods. This also works within a Sidekiq worker class.
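A sketch of how the two pieces could fit together, reusing the hypothetical `MyRateLimitedWorker` from above; the method name `call_external_api` is also illustrative:

```ruby
require "limiter"

class MyRateLimitedWorker
  include Sidekiq::Job
  extend Limiter::Mixin

  sidekiq_options queue: :limited

  # At most 20 calls per 60-second window, i.e. roughly one every
  # three seconds on average. Calls over the limit sleep until allowed.
  limit_method :call_external_api, rate: 20, interval: 60

  def perform(record_id)
    call_external_api(record_id)
  end

  private

  def call_external_api(record_id)
    # ... the actual HTTP request goes here ...
  end
end
```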
Together with the Sidekiq capsule configuration, this ensures that `MyRateLimitedWorker` jobs are processed one at a time, and that API calls do not exceed one every three seconds.
Keep in mind that the ruby-limiter gem blocks (i.e. sleeps) to enforce its rate limits. This is effective, but it means the Sidekiq thread is “stuck” while it waits and cannot work on other jobs. That is why placing the rate-limited jobs in their own dedicated Sidekiq capsule is important: queues not governed by the capsule can still proceed normally.
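To make the blocking behavior concrete, here is a toy sliding-window limiter in plain Ruby. It illustrates the sleep-until-allowed idea, but it is not the gem’s actual implementation:

```ruby
# Illustrative sliding-window limiter (NOT the ruby-limiter internals):
# allows at most `rate` calls per `interval` seconds and sleeps when
# the window is full.
class BlockingLimiter
  def initialize(rate:, interval:)
    @rate = rate
    @interval = interval
    @timestamps = []
  end

  def throttle
    now = Process.clock_gettime(Process::CLOCK_MONOTONIC)
    # Drop timestamps that have aged out of the window.
    @timestamps.reject! { |t| t <= now - @interval }
    if @timestamps.size >= @rate
      # Block until the oldest call leaves the window.
      sleep((@timestamps.shift + @interval) - now)
    end
    @timestamps << Process.clock_gettime(Process::CLOCK_MONOTONIC)
    yield
  end
end

limiter = BlockingLimiter.new(rate: 2, interval: 1)
start = Process.clock_gettime(Process::CLOCK_MONOTONIC)
3.times { |i| limiter.throttle { i } }
elapsed = Process.clock_gettime(Process::CLOCK_MONOTONIC) - start
# The third call has to wait, so the total elapsed time is about one second.
```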
Although this solution has worked great for me, there are some important caveats to consider:
- The ruby-limiter gem works within a single Ruby process. If you run multiple Sidekiq processes, each enforces its own independent limit, so the effective global rate is multiplied by the number of processes, which may not be what you want.
- Sidekiq Enterprise has its own rate limiting system. I haven’t tried it, but if your team can afford the price tag, it might be worth considering (plans start at $229 per month).