How I went from 2000+ ms to under 150 ms for API calls
APIs, APIs, APIs everywhere! One major thing we as developers run into when using external services through REST APIs is latency, and if the remote API goes down, our users are left hanging in the middle of requests.
This particular project was a mobile-client based one, where the mobile client was the only consumer of an external API. The external API used OAuth for authentication, and the mobile client had to make sure the token was refreshed properly on each request. Simple setup, simple app.
Though it’s simple enough, the main dependency and the major component of the app is out of our control: the API. Midway through the project, the API calls started to slow down and users started to notice the “lag”.
To gain some control over the API, the option is to build a middleware between the real API and the mobile client. But since the application is already deployed, we have to make sure the transition happens with minimal work for our mobile developers. Given these requirements, the best approach is to build a proxy that maintains the formats and endpoints of the existing API.
Once the proxy is built, client developers only have to swap the domain of the API, and the proxy will take care of the rest.
Bringing in Caching
With a cache sitting behind the proxy, responses can be cached for some time, offering a significant decrease in waiting time for the client.
Using PHP, here’s what we need. Make sure to run the latest and greatest version of PHP.
Check out the working code at https://github.com/sahanh/api-proxy
There’s an excellent Composer package to build out a proxy: https://github.com/jenssegers/php-proxy
Redis for Cache
For the cache, I’m using Redis. Redis is an in-memory key-value store/data structure server, which fits our use case perfectly. http://redis.io/topics/quickstart
The proxy package uses Symfony HttpFoundation, a popular component of the Symfony project used by other frameworks including Laravel. HttpFoundation allows us to work with the HTTP layer in an object-oriented way.
In the public directory we have our serving index.php. In index.php, we create a Request object and pass it to RequestManager, which makes sure caching and authentication happen.
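As a rough sketch, the entry point could look like this. The Request class is Symfony HttpFoundation’s, but RequestManager’s namespace, constructor, and handle() method here are assumptions for illustration, not the repo’s exact API:

```php
<?php
// public/index.php — a minimal wiring sketch.
require __DIR__ . '/../vendor/autoload.php';

use Symfony\Component\HttpFoundation\Request;

// Build a Request object from PHP's superglobals ($_GET, $_POST, etc.).
$request = Request::createFromGlobals();

// RequestManager checks the cache, handles token refresh, and proxies the
// call to the vendor API when there is no cached response. (Hypothetical
// class/method names.)
$manager = new App\RequestManager();
$response = $manager->handle($request);

// Send the cached or fresh response back to the mobile client.
$response->send();
```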
RequestManager uses our proxy package and Redis caching. Keep in mind that the incoming Request object has all the details related to our final API call. Our client sends a URI (eg: /sample/endpoint) and, if it’s a POST/PUT, hopefully a request body.
Managing the cache
As soon as the RequestManager receives the request, it generates a unique identifier for that request so it can check the cache for its existence.
When generating this key, it takes the path (URI) the client called and the request body. This makes sure that when the client sends a different request body, it’s treated as a new cacheable request. These two strings are concatenated and a hash is generated; this will be the “index” used to manage the cached response for that request.
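A minimal sketch of that key generation (the choice of hash function here is an assumption; any stable hash works):

```php
<?php
// Derive a cache key from the request path and body.
// md5 is illustrative; sha256 or any stable hash works the same way.
function cacheKey(string $uri, string $body): string
{
    return md5($uri . $body);
}

// The same path with a different body yields a different key, so each
// distinct request gets its own cache slot.
$a = cacheKey('/sample/endpoint', '');
$b = cacheKey('/sample/endpoint', '{"page":2}');
```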
Using the generated key, RequestManager looks up a response in our cache; if it exists, a Symfony HTTP Response object is returned to the caller. Otherwise, the request flow continues.
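The lookup step can be sketched like this. $cache stands in for the Redis client, and the serialized payload format is an assumption, not the repo’s exact storage scheme; in the real app the returned payload gets wrapped in a Symfony Response:

```php
<?php
// Sketch: short-circuit with a cached payload when the key exists.
// $cache is any object with a get() method (a Redis client in practice).
function lookupCached(object $cache, string $key): ?array
{
    $cached = $cache->get($key);
    if ($cached === null) {
        return null; // cache miss: continue the flow and call the vendor API
    }
    // Hit: e.g. ['status' => 200, 'body' => '...'] — ready to be turned
    // into a Response object and sent straight back to the client.
    return unserialize($cached);
}
```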
If no response exists in the cache, we call the vendor API. We have defined the vendor API host in the class, and using the client’s request path we build the final URL to call.
After building the request to the vendor API, we call it. If the result is a 401 (in our API’s case, the request token is invalid), we attempt it again after refreshing our token. Note that the generated token is also stored in the cache. After the retried request, if the status is 200 (success), we store the response in the cache with the key we generated.
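The retry-on-401 logic boils down to something like the following sketch. The callables standing in for the vendor HTTP call and the OAuth refresh are hypothetical; the real code lives in RequestManager and the proxy package:

```php
<?php
// Sketch of the retry-on-401 flow. $callVendor performs the proxied HTTP
// request with a given token; $refreshToken obtains (and caches) a fresh
// OAuth token. Both are hypothetical stand-ins, injected for clarity.
function fetchWithRetry(callable $callVendor, callable $refreshToken, string $token): array
{
    $response = $callVendor($token);

    if ($response['status'] === 401) {
        // Token expired: refresh it, then retry the request once.
        $token = $refreshToken();
        $response = $callVendor($token);
    }

    // Only a 200 response would then be written to the cache.
    return $response;
}
```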
This simple approach can give a huge boost to the overall user experience. When caching, always make sure to pay attention to the changing variables of the request life cycle. In my case, treating URI + body as unique was sufficient. Also check out full-fledged caching systems like Varnish.
Check out a demo of the API once it’s implemented.