April 7

Faster API Batch Requests

We recently released v1.22.0 of the AerisWeather API. Along with several minor bug fixes and improvements, this release also includes a significant speed improvement for batch requests.

Clients are seeing as much as a 15x speed improvement with batch requests! That’s a huge win, especially for mobile applications.

What are batch requests?

When developing an application, you will often need data from multiple endpoints to provide a complete weather overview to your users. With the AerisWeather API, you can conveniently combine multiple data requests into a single query via batch requests.

Multiple Endpoints for a Single Location

An example use case would be creating a weather overview for a mobile application. For the overview, you need the location information, current conditions, a seven-day forecast, any active alerts, storm threats, and the minute-by-minute precipitation forecast for the next hour.

You could make six individual API queries or use a single batch request. Batch requests allow up to 25 data requests within a single API query.

An example batch request for our application would be similar to:
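The original example request is not reproduced here, so the following is a sketch only: the endpoint names and the shape of the batch URL (a `requests` parameter listing comma-separated endpoint paths, plus `client_id`/`client_secret` credentials) are assumptions based on typical AerisWeather usage, not confirmed paths. Check the Batch Request documentation for the exact endpoint names. In Python, such a URL could be assembled like this:

```python
from urllib.parse import urlencode

# Hypothetical endpoint paths for the six data sets in the overview;
# verify the exact names against the AerisWeather endpoint docs.
requests_list = [
    "/places",
    "/observations",
    "/forecasts",
    "/alerts",
    "/threats",
    "/forecasts/minutelyprecip",  # illustrative name for the minutely precip data
]

params = {
    "p": "minneapolis,mn",
    "requests": ",".join(requests_list),
    "client_id": "CLIENT_ID",          # placeholder credentials
    "client_secret": "CLIENT_SECRET",
}

# safe=",/" keeps the commas and slashes in the requests list readable
url = "https://api.aerisapi.com/batch?" + urlencode(params, safe=",/")
print(url)
```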


The location – “Minneapolis, MN” in the string above – can be replaced with any supported place format the API allows.

For mobile applications, the location will typically be the phone’s GPS coordinates (latitude, longitude). For example:
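A sketch of the coordinate-based variant, under the same assumptions as above (the `batch` URL shape and the `limit` parameter on `/forecasts` are illustrative, not confirmed). Note how the per-request query string is percent-encoded so its `?` and `=` don't clash with the outer query string:

```python
from urllib.parse import quote

lat, lon = 44.97, -93.26  # e.g., the device's GPS coordinates

# Per-request parameters are appended to the endpoint path with their
# "?" and "=" percent-encoded ("limit=7" here is illustrative).
forecast_request = "/forecasts" + quote("?limit=7", safe="")

requests = ",".join(["/places", "/observations", forecast_request])
url = (
    "https://api.aerisapi.com/batch"
    f"?p={lat},{lon}"
    f"&requests={requests}"
    "&client_id=CLIENT_ID&client_secret=CLIENT_SECRET"  # placeholder credentials
)
print(url)
```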


Notice that the batch endpoint allows you to pass query parameters specific to a single endpoint request, though the parameter string must be URL-encoded. Review the Batch Request documentation for more details.

Forecasts for Multiple Locations

Batch requests can also be used to fetch data for multiple locations, a common client use case. For example, when fetching forecasts every few hours for all US zip codes, you can combine up to 25 requests into a single query and significantly reduce the total time spent fetching the data.

An example that fetches a seven-day forecast for five US zip codes:
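The original example is not reproduced here, so this is a sketch under stated assumptions: that each request entry can carry its own percent-encoded `p` (location) parameter, overriding any top-level location, and that `limit=7` yields a seven-day forecast. Verify both against the Batch Request documentation. The five zip codes are arbitrary examples:

```python
from urllib.parse import quote

zip_codes = ["55403", "10001", "60601", "77001", "94103"]  # example US zip codes

# One /forecasts request per zip code; each carries its own "p"
# (location) parameter, percent-encoded within the request entry.
requests = ",".join(
    "/forecasts" + quote(f"?p={z}&limit=7", safe="") for z in zip_codes
)
url = (
    "https://api.aerisapi.com/batch"
    f"?requests={requests}"
    "&client_id=CLIENT_ID&client_secret=CLIENT_SECRET"  # placeholder credentials
)
print(url)
```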


Speeding Up Batch Requests

Motorsports racer Mario Andretti once said: “If you wait, all that happens is that you get older.” No one wants to grow old waiting, especially in a mobile application.

Historically, while batch requests have often been faster and more convenient than making individual requests, they were still perceived as slow. The slowness was understandable: previously, an API batch request was processed synchronously, i.e., one data request at a time.

We revisited batch requests based on this feedback, refactoring the codebase with speed in mind. Batch requests are now handled asynchronously, with all data requests processed concurrently. This change provides a tremendous speed improvement.
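The difference is easy to see in miniature. The sketch below is not AerisWeather's server code; it simulates each sub-request with a 0.1-second `asyncio.sleep` to show why running them concurrently rather than one at a time collapses the total latency to roughly the slowest single request:

```python
import asyncio
import time

async def fetch(endpoint: str) -> str:
    # Stand-in for a real sub-request; each takes ~0.1 s.
    await asyncio.sleep(0.1)
    return f"{endpoint}: ok"

async def batch_sequential(endpoints):
    # Old behavior: one sub-request at a time (total ≈ n * 0.1 s).
    return [await fetch(e) for e in endpoints]

async def batch_concurrent(endpoints):
    # New behavior: all sub-requests at once (total ≈ 0.1 s).
    return await asyncio.gather(*(fetch(e) for e in endpoints))

endpoints = ["/places", "/observations", "/forecasts", "/alerts"]

start = time.perf_counter()
asyncio.run(batch_sequential(endpoints))
sequential_s = time.perf_counter() - start

start = time.perf_counter()
asyncio.run(batch_concurrent(endpoints))
concurrent_s = time.perf_counter() - start

print(f"sequential: {sequential_s:.2f}s, concurrent: {concurrent_s:.2f}s")
```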

As seen in the above graph, once v1.22.0 of the AerisWeather API was released, the maximum latency for batch requests dropped significantly, by as much as 15x.

Additionally, the average latency for batch requests dropped by roughly 40%.

Batch Requests for the Win!

These speed improvements are significant wins! If you are using batch requests today, they became faster the moment the recent API release went live. If you are not using batch requests yet, read more about them and start taking advantage.

Not an AerisWeather API user? Contact our sales team or sign up for a developer account to get started today.
