When making an AJAX JSONP request with jQuery, you rely on the jQuery.parseJSON method to parse the JSONP response into an object. While jQuery.parseJSON works most of the time, I’ve seen it behave as if a memory leak were occurring. This isn’t new to jQuery.ajax(), which has somewhat of a history with memory leaks. The version I am currently using is jQuery 1.9.0, and in a particular project I am seeing a “stair-step” memory leak pattern: the usual ebb and flow of memory being allocated and deallocated appears as a predictable pattern, but at certain points there is a spike that never comes back down, which in effect sets the normal ebb-and-flow pattern at a higher baseline than before. This happens four times within my web app, each spike establishing a higher and higher average level of memory usage.

The Application

To understand what this is all about, I should give a brief description of this particular application. What I have is an app that will be wrapped in PhoneGap. When the app launches on a mobile device it downloads an XML document. That document contains data referencing all of the app’s downloadable assets. The app then checks whether the data already exists locally (if it does, it will have been stored on the app via WebSQL). If it doesn’t, or if an asset has been updated as determined by its timestamp, the app determines which files need to be downloaded and commences the download operation.
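To illustrate that check, here is a minimal sketch of the timestamp comparison. The function name and data shapes are my assumptions for illustration, not the app’s actual code:

```javascript
// Given the manifest entries parsed from the XML document and a map of
// locally stored timestamps (from WebSQL), return the assets that must
// be (re)downloaded: anything missing locally or with a newer timestamp.
function assetsToDownload(manifest, localTimestamps) {
  return manifest.filter(function (asset) {
    var local = localTimestamps[asset.name];
    return !local || local < asset.timestamp; // missing or stale
  });
}
```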

The “download” consists of AJAX calls to individual JSONP-type documents which have a structure similar to {"date":value,"name":value,"data":value}, where “data” is an image serialized to base64.
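For illustration, a single asset document might look like the following. The callback name and field values are made up; jQuery normally strips the callback wrapper for you, but doing it by hand shows what the response actually contains:

```javascript
// A JSONP response is just the JSON payload wrapped in a callback
// invocation. Everything here is illustrative -- the real documents are
// far larger, with "data" holding 2+ million characters of base64.
var jsonpResponse =
  'handleAsset({"date":"2013-01-15","name":"logo.png","data":"iVBORw0KGgoAAA..."})';

// Strip the callback wrapper to recover the raw JSON text,
// then parse it into an object.
var rawJson = jsonpResponse.slice(
  jsonpResponse.indexOf('(') + 1,
  jsonpResponse.lastIndexOf(')')
);
var asset = JSON.parse(rawJson);
```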

There are hundreds of these asset files. The individual files allow for the creation of a download/progress indicator where the file that is currently being downloaded is of course a percentage of the total.

The rest of this discussion deals with the downloading of the application’s assets as that is where the memory issue occurs.

jQuery.parseJSON

Anyone who uses jQuery.ajax() to work with JSON/JSONP will (or at least *should*) be using jQuery.parseJSON. The jQuery framework takes the JSON/JSONP server response and transparently transforms it into a JavaScript object to do with as the developer sees fit.

Initially I implemented the AJAX request exactly as I’ve done many times before not seeing the need to deviate from what has been a predictable pattern of interacting with JSONP server responses. Here is a code snippet of such a setup:

...
$.ajax({
   url:       service_path_here,
   type:      'GET',
   dataType:  'JSONP',
   timeout:    30000
...

Simple stuff, we’re requesting a JSONP response from the server and we have a timeout of 30 seconds.

Observe the following memory usage timeline obtained from Chrome via a script using the above AJAX configuration. The script “loops” in a synchronous fashion through an array of URLs in an effort to cache the data within the browser. The workflow is: make request > download > parse > store via WebSQL > on txn success/failure, move to the next URL and start the process over:
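The loop itself can be sketched like this. The function names are my assumptions, not the app’s actual code; downloadAndStore stands in for the real $.ajax request plus the WebSQL storage transaction:

```javascript
// Process URLs one at a time, starting the next request only after the
// previous one has been stored (or has failed). downloadAndStore must
// invoke its success or failure callback when finished.
function processQueue(urls, downloadAndStore, done) {
  var index = 0;
  function next() {
    if (index >= urls.length) {
      done();
      return;
    }
    var url = urls[index];
    index += 1;
    // Advance on success OR failure, so one bad asset
    // doesn't stall the whole download.
    downloadAndStore(url, next, next);
  }
  next();
}
```

Because each request starts only after the previous one completes, a progress indicator falls out naturally: the current index over urls.length.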

[Figure: jquery_memory_leak_complete_download]

You can see the pattern here quite plainly, and it is evident that we have a memory leak. Upon investigating the timeline I can see the exact AJAX calls where the leaks occur. The data contained within those JSONP payloads is the aforementioned base64 string, and their lengths exceed 2 million characters. I now know the culprit, though not why the leak happens; I suspect something is wrong with jQuery’s handling of the data, in particular jQuery.parseJSON.

JSON.parse

Some Googling on the topic reveals prior history of memory leaks when using jQuery for AJAX calls. One note in particular mentioned the eval method as the root cause. jQuery’s JSON parser is based on work by Douglas Crockford, who is well known within the JavaScript community. All web app developers should be aware of him and his JSON parser script, as it is commonly used and referenced by many frameworks. Anyway, I know from having visited Crockford’s site many times that he offers more than one JSON parser, and that one of them explicitly avoids using eval in its parsing routines.

Before I proceed to json_parse (the “eval-less” parser) I want to try JSON.parse. If eval really is the root cause, it follows that his JSON.parse script will also exhibit the memory leak when presented with JSON exceeding 2 million characters (and thus, by extension, any JavaScript library’s JSON parser based on his work, of which there are many). Here is the config for that test:

...
$.ajax({
   url:       service_path_here,
   type:      'GET',
   dataType:  'JSONP',
   timeout:    30000,
   converters: {'text json':JSON.parse} 
...

And the memory profile while using JSON.parse:

[Figure: json.parse_memory_leak_complete_download]

We can see the same thing happening – time to use Douglas Crockford’s alternative parser – json_parse.

json_parse

Douglas Crockford describes json_parse as “an alternative JSON parse function that uses recursive descent instead of eval.” Perfect – we want to get away from eval. Here’s its implementation within jQuery:

...
$.ajax({
   url:       service_path_here,
   type:      'GET',
   dataType:  'JSONP',
   timeout:    30000,
   converters: {'text json':json_parse} 
...

And the resulting memory usage:

[Figure: json_parse_memory_leak_non_optimized_complete_download]

That’s what we’re looking for – you can see the spikes as the multi-million-character JSON payloads get parsed, and you can see GC happening afterwards. Memory gets reclaimed just the way we want it to.

From here we can delve further into optimizing the code responsible for these results – which is what I’ve done. A week or so of optimization, and in some cases rewriting code, results in the following final download memory footprint for my web app:

[Figure: json_parse_memory_leak_complete_download]

Through streamlining a number of things – including removing all my attempts at debugging the issue – I’ve cut the download time, and the overall memory footprint has been further improved, thanks in large part to json_parse.

Conclusion

jQuery.parseJSON is useful for the vast majority of applications. However, if you are dealing with large JSON payloads – in excess of 1.8 to 2+ million characters in length – you should use an alternative JSON parser such as json_parse. Doing so ensures that memory is reclaimed as needed, avoiding app sluggishness and crashes.
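As a practical takeaway, here is a hedged sketch of choosing a parser by payload size. The threshold reflects the observations above, and json_parse is assumed to be loaded globally from the JSON-js repository; neither the function name nor the cutoff is anything official:

```javascript
// Roughly where the leak started appearing in my profiling (characters).
var LARGE_PAYLOAD = 1800000;

// Use Crockford's recursive-descent json_parse for very large payloads,
// falling back to the ordinary parser for everything else.
function parsePayload(text) {
  if (text.length > LARGE_PAYLOAD && typeof json_parse === 'function') {
    return json_parse(text); // avoids the leak seen with jQuery.parseJSON
  }
  return JSON.parse(text);
}
```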

Here’s the link to get both JSON.parse and json_parse: https://github.com/douglascrockford/JSON-js.