StreamTable.js – The next generation search filter

So, you think you should search data? Or filter it? And now, stream … what?

Searching data comes at a performance price – it requires heavy-duty server resources and responses are delayed. See LinkedIn’s search.

Filtering data requires us to load the entire dataset on the client side (via JSON) and then filter the results. So, it’s slow to load but has faster search results.

StreamTable.js is a JavaScript library that helps us stream large tabular data to the client and filter it there. This gives us the best of both worlds – super-fast rendering of pages and tables, no more “Loading data…” delays, no more waiting for the page to load, and fast client-side search filtering.

Move over, DataTables

DataTables does client-side filtering but doesn’t quite get it right. The problem with DataTables is that it is too heavy and has complex JavaScript options (with all its aaData and aoData and fnRender, to name a few). To add to our woes, the data load via Ajax (sAjaxSource) can be rather slow in rendering, especially if the table requires a LOT of information – say 10,000 records or more.

Here is a ‘simple’ example of using dataTable:

$('#ClientsList').dataTable({
  "sAjaxSource": '/accounts.json',
  "aaSorting": [[1, "desc"]],
  "aoColumns": [
    {'sTitle': 'Client Type'},
    {'sTitle': 'Reports Due',
     'fnRender': function(obj) {
       return '<span class="badge badge-important">' + obj.aData[4] + '</span>';
     }
    },
    {'sTitle': 'Actions',
     'fnRender': function(obj) {
       return "<a href='#' class='btn btn-primary btn-mini'>View</a> " +
              '<a href="/accounts/' + obj.aData[5] + '/detail" data-remote="true" class="btn btn-primary btn-mini load_view">Filings</a>';
     }
    }
  ],
  "bFilter": true,
  "sPaginationType": "bootstrap",
  "oLanguage": {
    "sLengthMenu": "_MENU_ records per page"
  }
});

Now, to me that was congested, confusing and prone to error. Furthermore, I have no control over the sAjaxSource. So, if the server is sending a large amount of data, we are done for!

StreamTable.js to the rescue

Now, not only can we stttrrreeeaaam data to our tables silently and populate them, but we also have live filters showing instant results. The page never ‘hangs’, we get millisecond responses, and we can manage the table configuration easily, with a lot of control.

StreamTable uses Mustache templates for rendering, but can work seamlessly with any other templating mechanism too.

StreamTable can fetch data to the table in chunks, or fetch all of it silently. So, whether there are 50 rows or 5,000 rows, the page loads with data instantly.

StreamTable can manage various JSON data formats: arrays of arrays or arrays of objects. This makes it easy to integrate with Rails’ to_json or as_json methods.
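For illustration, the two shapes and their matching view functions might look like this (the data and function names are made up for the example; a view function receives a record plus its index and returns the row’s HTML):

```javascript
// Two row shapes StreamTable accepts. The data is illustrative.
var rowsAsArrays  = [["Acme Corp", "LLC", 3]];                   // array of arrays
var rowsAsObjects = [{name: "Acme Corp", type: "LLC", due: 3}];  // array of objects

// A view function for the array-of-arrays shape.
function arrayView(record, index) {
  return '<tr><td>' + index + '</td><td>' + record[0] + '</td><td>' + record[1] + '</td></tr>';
}

// A view function for the array-of-objects shape.
function objectView(record, index) {
  return '<tr><td>' + index + '</td><td>' + record.name + '</td><td>' + record.type + '</td></tr>';
}
```

Both produce the same row markup, so the rest of the table configuration stays identical whichever format the server emits.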

StreamTable has live filtering, so a user can see filtering results even while the data is streaming.
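Conceptually, the live filter just re-runs a substring match over whatever rows have arrived so far. A simplified sketch (StreamTable’s real matcher also handles pagination and indexing; this only illustrates the idea):

```javascript
// Match the current query against the rows received so far.
function liveFilter(receivedRows, query) {
  var q = query.toLowerCase();
  return receivedRows.filter(function (row) {
    return row.some(function (cell) {
      return String(cell).toLowerCase().indexOf(q) !== -1;
    });
  });
}

// As each streamed chunk arrives, append it to receivedRows and re-run the
// filter: the visible results grow while the data is still loading.
var receivedRows = [["The Avengers", 2012], ["Avatar", 2009]];
```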

Here is an example of using StreamTable.js

var options = {
  view: view,                    // View function to render rows.
  data_url: 'clients/all.json',  // Data-fetching URL.
  stream_after: 2,               // Start streaming after 2 secs.
  fetch_data_limit: 500          // Stream data in batches of 500.
};

$('#clients_table').stream_table(options, data);

And here is the JavaScript

# app/assets/javascripts/clients.js
var template = Mustache.compile($.trim($("#template").html()));

var view = function(record, index){
  return template({record: record, index: index});
};

and the HTML template

# app/views/clients/index.html.haml
<script id="template" type="text/html">
  <tr>
    <td>{{index}}</td>
    {{#record}}
      <td>{{name}}</td>
      <td>{{type}}</td>
      <td><span class='badge badge-info'>{{due}}</span></td>
    {{/record}}
  </tr>
</script>

A clean and fast approach. It’s important to note that the first request should have the first page of data ready to render. So we can, for example, fetch the first 10 records and render them directly in HAML.

# app/views/clients/index.html.haml
:coffeescript
  @data = #{@entities.to_json}

For the remaining data, StreamTable sends chunked Ajax requests with the offset and limit parameters.

clients/all.json?q="search text"&limit=500&offset=1000
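On the server side, answering such a request only requires slicing the (optionally filtered) dataset. A framework-agnostic sketch in plain JavaScript – the function name and the crude JSON.stringify matcher are assumptions for illustration, not part of StreamTable:

```javascript
// Return one chunk of rows for a StreamTable request.
// q, limit and offset come from the query string shown above.
function chunkForRequest(rows, q, limit, offset) {
  var matched = q
    ? rows.filter(function (row) {
        // Crude full-row match: stringify the row and look for the query.
        return JSON.stringify(row).toLowerCase().indexOf(q.toLowerCase()) !== -1;
      })
    : rows;
  return matched.slice(offset, offset + limit);
}
```

A handler for the URL above would then respond with chunkForRequest(allRows, 'search text', 500, 1000) serialised as JSON.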

Benchmarks

Did I just hear you say benchmarks? Check it out for yourself. We have tested up to 100,000 records loading in chunks of 500. The server responses were in milliseconds and the page renders completely in under 2 seconds. The rest of the data is then silently loaded into the tables.

Check out the demos here (stream.html simulates chunking):

http://jiren.github.io/StreamTable.js/stream.html and http://jiren.github.io/StreamTable.js/index.html

You can fork the repository on GitHub. STREAM AWAY!

This entry was posted in Javascript and Search.

35 Responses to StreamTable.js – The next generation search filter

  1. Abhishek says:

    Hi Jiren, I’m a B.Tech CSE student from India. This is just amazing. I have used your library filter.js before for one of my projects and it was amazing. Now streaming tables is just mind-blowing. Two days ago I was contemplating a similar problem, and here I have a solution in front of me. Thanks for your contributions. However, I’d like to know how I can raise my JavaScript skills to start writing a library of my own, tailoring it to meet my needs. Can you please help me with a road map? If you cannot discuss it here, we’ll take it offline if you please.

  2. Gernot says:

    Why would you want to stream > 10’000 rows to the client? That makes no sense for the user. Maybe you should rethink the use case instead.

    • jiren says:

      This is just a benchmark we have tested. The main use case is a table with many rows: in that situation there is no need to wait until the whole table is rendered.

    • Gautam Rege says:

      Ah! But in your question lies the answer :) Simply answered – why not? Till now, it was assumed that for large datasets, you should search, not filter. Here is another use-case:

      Assume a clerk at a bank desk using an internet-based banking system (hope to see that one day) who gets a bunch of cheques (or checks, depending on your country). He would either be adding them to the system or editing them (approve / reject etc.). In such a case, it’s actually easier for him to filter rather than enter a search criterion every time — why, you ask? Because currently people are used to desktop apps, not online apps – and the expectation with web app results is that they should be ‘as fast as a desktop app’. So, if the clerk is doing data entry, he doesn’t care about how many records are loaded in the table. If he is ‘filtering’, he wouldn’t mind waiting ONCE for data to load, and if it’s a live filter, he can probably find what he is looking for even before all the data loads. In either case, his productivity has improved because there is no latency!

      Now, you may argue that this is a one-off case and that, in any case, the system should be intelligent enough to not load ALL the data. But that’s the beauty of it – I say why not, when we can and at no additional resource cost. If it’s going to improve productivity (i.e. save time by client-side processing), I say why not!

      In the example above, I use a ‘bunch’ – it’s a relative term – even if that bunch is 50 or 500, the performance boost in human productivity is noteworthy.

    • gbuchanan08 says:

      10000+ rows to the client is exactly what I’m looking for. Although I agree that (for most public sites) this is overkill, I am working on a company internal log viewer for hardware/OS test results. Each test run can produce 35000+ lines. Test analysts are the only people viewing these logs and they will be happy to have this much capability.

      Use cases come in all forms.

  3. Jim O'Keefe says:

    How does this compare to your filter.js? That has a stream option as well, right?

    • jiren says:

      StreamTable renders rows per page, i.e. with 10 rows per page it renders only 10 HTML elements, so while streaming it does not render all records. filter.js has no pagination, so it renders all HTML elements; while streaming, it keeps adding elements to the page.

  4. Nice script! Congratulations! I’m trying to use it with a Node.js app that returns JSON via REST (data_url: ‘http://localhost:3000/nodes’) but I get the error “http://localhost is not allowed by Access-Control-Allow-Origin.” How can I fix it? Thanks

  5. hardin81 says:

    Hi! Thank you for this amazing script!

    I have a question though…
    Is it possible to use this along with more input fields?

    Case: Let’s say you have a huge table of apartments. Each apartment has an array of available dates and also an array of facilities (these attributes can be hidden). Is it possible to set an arrival date and a departure date and some facilities, and then have TableStream find matching apartments in the table?

    Date from: [2013-10-10] (datepicker)
    Date to: [2013-10-15] (datepicker)
    Facilities: [x] aircondition [x] beach [x] pool [x] dishwasher
    Words [Type here...]

    —————————–
    TableStream goes here….

    …lists all the apartments that are available from 2013-10-10 to 2013-10-15 and have aircondition, beach, pool and dishwasher
    —————————–

    Thank you once again for an amazing script! Keep up the good work!

    • jiren says:

      Currently filtering by criteria is not available in StreamTable.js, but you can use filter.js (https://github.com/jiren/filter.js), which has both streaming and filtering. The only problem is that filter.js renders all records and then does hide/show, so there is a performance hit.
      So, can I know how many records you are going to list on the page?

      • hardin81 says:

        Thanks for fast reply and your tips :)
        It has about 1000-1500 records

    • jiren says:

      You can use filter.js. In the render function you can use a table view like StreamTable.js does.
      Also, I am going to implement filtering by criteria in StreamTable.js in a few days.
      If you have a performance issue, tell me so I can help you out with customisation of the JS.

      • hardin81 says:

        Thank you very much for this!
        I’m playing around with it right now and this is exactly what I’m looking for :)
        The tricky thing now is to set a date span and match that to an array of available dates :)
        Thank you for your helpfulness ★★★★★

  6. Eddie Shipman says:

    Need a way to omit some columns from searches. How would I do that?

    • jiren says:

      There is an option called fields, so you can select fields like fields: ['name', 'address', ...]

      • Eddie Shipman says:

        Not exactly sure what you are saying. How would I have all my fields (columns) but omit some of them from the indexing for searches?

      • Eddie Shipman says:

        Never mind. I re-read the documentation.

  7. How would I combine this with a near real-time updating of the table data? Let’s say, I fetch the current alldata.json file with 1000+ records, and each record is updated at individual/unknown times, and I want to subscribe to diff updates from the server and update the table when some records have changed……?

    • jiren says:

      We can implement this using WebSockets; for this, both the server app and the browser should support WebSockets. Which web framework and server are you using?

  8. drfrankinfurter says:

    I couldn’t get it to work. :/

    I was able to initialize the table in my view but it doesn’t grab the data I’m supplying to it.

  9. Thanks for the excellent tutorial

  10. Jason says:

    Hey Jiren – this is really fantastic! Nice Work!

    Is there any way or syntax to search (filter) two columns? Using your dataset, I want to find all movies in 2010 that have a rating of 7.3.

    Is that possible?

  11. Alex says:

    How about a case where we start off with only a single record and that array gradually increases over time? Would you be able to load the first set of data and then have it start streaming when you reach a certain number of records? From what I can tell, you load data from two different places – one for the initial load and then another to stream from – is that correct? In the use case above, would we need a mechanism for writing to the first file up until a certain number of records and then add any further updates to the second stream file source?

    Thanks

    • jiren says:

      Yes, it is possible. This needs some customisation: defining multiple sources, plus event notification when data is available (by long polling or WebSocket).

  12. Alex says:

    Hi Jiren,

    Thanks for getting back to me.

    Just to be clear it would be preferable to have the data loaded from a single source and simply stream any data over say the first 50 records. The site example uses two separate sources, which works fine with a fixed data set, but mine will be dynamic.

    Could you specify the stream source the same as the initial data load source, and ask it to stream from the 50th record onwards to avoid duplication for example?

    I’ll be using node.js so socket.io is a nice option for the event notifications.

  13. jiren says:

    Yes, you can directly pass an objects array initially. After that it will request further data from the specified URL. The only problem is that right now it periodically sends Ajax requests to load data.

  14. JT says:

    Hello – I currently have an implementation of datatables using socket.io (using flashsockets and websockets) for streaming of results. Think stock ticker like. Each result that needs to go in the table streams back one “row” of data a time. There is never a concept of “here is 50 rows at first” and then stream the remaining. My source of information streams in one JSON structured item at a time. I would like these results to stream in and also sort by name field after each new rows comes in. Does this use case makes sense to you and does StreamTable support this type of use?

    Thanks in advance.

    • jiren says:

      Currently StreamTable does not support WebSockets, but it fetches data at a certain interval. I would like to add WebSocket support.
      Right now, one way to implement this is: after creating the stream table object, connect to the WebSocket server and add data when a message is received:

      ```
      var st = StreamTable('#stream_table', { view: view, auto_sorting: true });
      var socket = io.connect('http://localhost');
      socket.on('stock ticker', function (data) {
        st.addData(data);
      });
      ```

      auto_sorting: true will sort after new data is added.

  15. popotam2 says:

    Josh! The idea and realization are perfect. Thanks a lot; I’m going to use your component in my start-up. Does your component support DataTables options like fixed headers (https://datatables.net/extras/fixedheader/)? Or give me a hint on how I could combine those functions.
    I am using Yii – is there an extension for Yii?

  16. Murthy says:

    Dear Jiren,

    Thanks a lot for the brilliant effort by you and your team. Congratulations on creating this wonderful script! Can you please give me step-by-step instructions on how to use remote JSON data (via the URL of a PHP file that generates JSON data) with streamtable.js? I mean, I want a complete working example of StreamTable with JSON data fetched from a PHP file using MySQL.

    Thanks & Regards,

    Murthy

  17. Raj says:

    Hi Jiren,

    The example provided below is duplicating records

    http://jiren.github.io/StreamTable.js/stream.html

    Can this be stopped? I only want the updated records to be streamed. Why does it keep repeating the same records over and over again?

    Please help.

    Thanks,
    Raj

  18. Sri says:

    Hello Jiren,

    I only came to know about StreamTable.js today; it’s really a brilliant idea by you and your team. Congratulations on creating this wonderful script! Can you please give me step-by-step instructions on how to use remote JSON data (via the URL of a Java service that generates JSON data) with streamtable.js? I mean, I want a complete working example of StreamTable with JSON data fetched from a database (OTSDB) using Java and the OTSDB REST APIs. I spent a lot of time trying to use this in my project, but finally I’m seeking your help to move forward.

    Thanks,
    Sri.
