
Question

I need to interface with a 3rd party API. With this API I make a GET request from within the end user's browser and receive an XML response. This data is to be used in a browser-based application where the user can search through it, use it to make decisions, etc. The main issue is that most browsers have locked down cross-domain XMLHttpRequest use, so I can't simply get the XML from the API.

The overall data, though, is basically broken into two sets.

The first set of data is public and only needs to be updated every so often, so it can be cached for all users on the server side, lightening the traffic considerably.
The second set of data is private and individual to each user. This data is also updated in the API more frequently. This leads caching to be much less effective.
For scalability reasons I would like to keep the server's load as small as possible.

I see two options before me:

1. Provide a proxy that routes XML requests to the 3rd party server, passing them back and forth directly between the client and the 3rd party API.
2. Have the server convert the XML to JSON and strip out unnecessary information. This essentially means creating a new API on our server, which translates the client's requests into requests to the 3rd party API.
What would be the best way to provide the data to the user? (Does not have to be one of the two options)

Explanation / Answer

The proxy option is the easiest one to implement. You don't have any custom development to do; the only thing needed is to set up a proxy. It's also straightforward: there is no additional code to maintain, and if the API changes, you have no changes to make on your side. (A minimal sketch of this approach is shown after the points below.)

A proxy would be a preferred choice:

If you need to ship working software fast. This makes it a good choice, for example, if you were about to ship a feature, but found during the implementation phase of the project that you can't just make cross-domain AJAX requests.

Or if the current API is well designed: the architecture is good, the calls are very clear, the documentation is complete and easy to understand.

Or if the current API is subject to change. If it changes, you just need to change the JavaScript implementation. If instead of a proxy you are parsing the results and generating your own JSON, there is a risk that changes to the API will also require changes in your server-side code.
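To make the proxy option concrete, here is a minimal sketch, assuming a Node 18+ / TypeScript server and a hypothetical upstream base URL (https://api.example.com); in practice a standard reverse proxy such as nginx or Apache can do the same job through configuration alone, with no code to write:

```typescript
// Minimal same-origin proxy sketch (Node 18+, TypeScript).
// Assumption: only GET requests need to be forwarded; the upstream base URL
// and the /proxy/ route prefix are placeholders to adapt to the real API.
import { createServer } from "node:http";

const UPSTREAM = "https://api.example.com"; // hypothetical 3rd party base URL

const server = createServer(async (req, res) => {
  // Only forward GET requests under /proxy/, e.g. /proxy/products?category=books
  if (req.method !== "GET" || !req.url || !req.url.startsWith("/proxy/")) {
    res.writeHead(404).end();
    return;
  }

  try {
    // Rebuild the upstream URL from the incoming path and query string.
    const upstreamUrl = UPSTREAM + req.url.replace(/^\/proxy/, "");
    const upstream = await fetch(upstreamUrl, {
      headers: { Accept: "application/xml" },
    });

    // Relay status, content type and body unchanged; the browser now talks to
    // our own origin, so the cross-domain restriction no longer applies.
    const body = await upstream.text();
    res.writeHead(upstream.status, {
      "Content-Type": upstream.headers.get("content-type") ?? "application/xml",
    });
    res.end(body);
  } catch {
    res.writeHead(502).end("Upstream request failed");
  }
});

server.listen(8080);
```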

On the other hand, parsing the result has the benefit of completely abstracting the API away from the client side. This is a slower alternative, since it requires designing a new interface (if the original API is not well designed) and implementing the extract, transform and load steps, but it may be a good long-term choice for a large project (a rough sketch follows the points below). This is a preferred choice:

If you need additional features. You can add features that weren't available in the original API, such as caching at a level not supported by an ordinary proxy server, or encryption, or a different authentication model.

For example, if the number of AJAX requests becomes an issue or if a two-way communication model makes sense, you can implement WebSockets.

Or if the current API is not well designed. Like a facade pattern, this approach enables you to redesign the API. If the original one is poor, having a facade makes it possible to correct the bad design choices made by the original authors of a legacy API. You can act on large aspects, such as the overall architecture of the API, as well as on details, such as the names of arguments or the error messages.

While modifying an existing API is sometimes impossible, having a facade can make it possible to work with a piece of clean code which abstracts away the drawbacks and errors of the original design.

Or if the current API is subject to change. Indeed, you may prefer to change server-side code instead of JavaScript if the API changes over time, while keeping the public interface of your facade unaffected. It may be easier because you're more experienced with server-side programming, because you know more server-side refactoring tools, or because it's easier in your project to manage server-side code versioning.
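To illustrate this second approach, here is a rough sketch of a small facade that fetches the XML, strips it down to what the client needs, and serves JSON, with a shared server-side cache for the public data set. The upstream URL, the element and field names, and the use of the fast-xml-parser package are assumptions made for the example, not part of the real API:

```typescript
// Facade sketch (Node 18+, TypeScript): fetch XML from the 3rd party API,
// strip it down, and expose a smaller JSON endpoint of our own.
import { createServer } from "node:http";
import { XMLParser } from "fast-xml-parser"; // one possible XML parser choice

const UPSTREAM = "https://api.example.com"; // hypothetical 3rd party base URL
const parser = new XMLParser();

// Simple in-memory cache for the public, slow-changing data set.
let publicCache: { body: string; fetchedAt: number } | null = null;
const PUBLIC_TTL_MS = 10 * 60 * 1000; // refresh at most every 10 minutes

async function fetchPublicAsJson(): Promise<string> {
  if (publicCache && Date.now() - publicCache.fetchedAt < PUBLIC_TTL_MS) {
    return publicCache.body; // serve every user from the shared cache
  }

  const xml = await (await fetch(`${UPSTREAM}/public-data`)).text();
  const doc = parser.parse(xml);

  // Keep only what the client actually needs; the element and field names
  // here are hypothetical and depend on the real XML structure.
  const raw = doc.catalog?.item ?? [];
  const items = Array.isArray(raw) ? raw : [raw];
  const slimmed = items.map((item: any) => ({
    id: item.id,
    name: item.name,
    price: item.price,
  }));

  const body = JSON.stringify(slimmed);
  publicCache = { body, fetchedAt: Date.now() };
  return body;
}

createServer(async (req, res) => {
  if (req.method === "GET" && req.url === "/api/public") {
    res.writeHead(200, { "Content-Type": "application/json" });
    res.end(await fetchPublicAsJson());
  } else {
    res.writeHead(404).end();
  }
}).listen(8080);
```

The private, per-user data set would go through a similar handler, but without the shared cache (or with a much shorter, per-user one), since it changes more often and differs between users.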

You may notice that I omitted talking about JSON, performance, caching, etc. There is a reason for that:

JSON vs. XML: it's up to you to pick the right technology. You do that by objectively measuring the overhead of XML over JSON, the time it takes to serialize the data, and the ease of parsing.
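As an illustration, one rough way to start measuring, assuming Node and the same fast-xml-parser package as above; the sample payloads are placeholders standing in for real API responses:

```typescript
// Rough measurement sketch: compare payload size and parse time for the same
// data represented as XML and as JSON.
import { XMLParser } from "fast-xml-parser";
import { performance } from "node:perf_hooks";

const xmlSample = "<items><item><id>1</id><name>Widget</name></item></items>";
const jsonSample = '{"items":[{"id":1,"name":"Widget"}]}';

console.log("XML bytes :", Buffer.byteLength(xmlSample));
console.log("JSON bytes:", Buffer.byteLength(jsonSample));

const parser = new XMLParser();
const runs = 10_000;

let t = performance.now();
for (let i = 0; i < runs; i++) parser.parse(xmlSample);
console.log(`XML parse : ${(performance.now() - t).toFixed(1)} ms for ${runs} runs`);

t = performance.now();
for (let i = 0; i < runs; i++) JSON.parse(jsonSample);
console.log(`JSON parse: ${(performance.now() - t).toFixed(1)} ms for ${runs} runs`);
```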

Performance: benchmark different implementations, pick the fastest one, then profile it and optimize it based on the results from the profiler. Stop when you achieve the performance specified in the non-functional requirements.

Also, understand what you are trying to achieve. There are several parts interacting with each other: the original API, the bandwidth between your server and the API's server, the performance of your server, the bandwidth between your server and the end users, and the performance of their machines. If you're asked to respond to a request within 30 ms, but the original API spends 40 ms processing the request, no matter what you do, you won't be able to reach the required performance.

Caching: caching is one of the techniques to make your web application feel faster, reduce bandwidth usage, etc.

Make sure you use client-side caching as well (server-side caching won't reduce bandwidth usage between you and your users), keeping in mind that setting the HTTP caching headers properly is often tricky.

Make sure you determine correctly what to cache, for how long, and when to invalidate it: if the description of a product changed 10 seconds ago but the customers of an e-commerce website still see the old version, that's acceptable. If the owner changed the description, submitted it, and still sees the previous version because of caching, that's a problem.
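As a sketch of the client-caching side, assuming the Node facade from above, the two data sets can simply be given different Cache-Control policies; the max-age values below are placeholders to be chosen based on how stale each set is allowed to be:

```typescript
// Client-caching sketch: different Cache-Control policies per data set.
import { createServer } from "node:http";

createServer((req, res) => {
  if (req.url === "/api/public") {
    // Shared, slow-changing data: let browsers and intermediaries cache it.
    res.writeHead(200, {
      "Content-Type": "application/json",
      "Cache-Control": "public, max-age=600",
    });
    res.end(JSON.stringify({ placeholder: true }));
  } else if (req.url === "/api/private") {
    // Per-user, frequently updated data: only the user's own browser may
    // cache it, and only briefly.
    res.writeHead(200, {
      "Content-Type": "application/json",
      "Cache-Control": "private, max-age=15",
    });
    res.end(JSON.stringify({ placeholder: true }));
  } else {
    res.writeHead(404).end();
  }
}).listen(8080);
```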

Don't focus only on caching. Minification, for example, is important as well. Reducing the number of requests can also be beneficial.