
I have a location text field hooked up to the jQuery plugin EasyAutocomplete:

[Screenshot: the location text field showing autocomplete suggestions]

It's working, but the dataset is large (around 100,000 records), and it takes an unacceptably long time to match any results.

I have a JSON endpoint in my app that loads the records for the autocomplete. The code in the Rails controller looks like this:

def index
  @locations = Location.select(:id,:canonical_name)

  respond_to do |format|
    format.json { render json: @locations }
  end
end

So far what I've tried is putting an index on the canonical_name column, and putting a Rails.cache.fetch around the loading of the records, but neither of these things helped much.
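
Roughly, those two attempts looked like this (simplified; the exact migration and the "locations/all" cache key are just placeholders):

# Migration adding an index on the search column
add_index :locations, :canonical_name

# Caching the record load in the controller
@locations = Rails.cache.fetch("locations/all") do
  Location.select(:id, :canonical_name).to_a
end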

What can be done to speed up this operation?

  • I'm assuming the filtering of the results is done on the server and not on the client? (not very familiar with RoR so trying to infer).
    – Taplar
    Commented Apr 10, 2020 at 17:07
  • You are loading all the locations at once, which makes the application slow. You should use AJAX to filter results as the user types. Check out the example easyautocomplete.com/examples#examples-ddg
    – Amit Patel
    Commented Apr 10, 2020 at 17:24
  • @Taplar no, there is no evidence of server-side filtering at all. It's just dumping the whole table into JSON.
    – max
    Commented Apr 10, 2020 at 17:55
  • Welp, so that's the problem :)
    – Taplar
    Commented Apr 10, 2020 at 17:55

1 Answer


The problem here is not searching through a large JSON array. It's that you're not doing the search in the database in the first place.

You're first selecting 100,000 rows from your database and using them to initialize 100,000 model objects. That will not only take a lot of time but also eat through a gigantic chunk of RAM.

You then serialize that collection of 100,000 records into one enormous JSON string, which eats even more RAM.

This is then sent across the internet to the poor client, which gets a monster JSON response and burns through its mobile data plan. Even if you implemented some phenomenally fast client-side search, the cost of just getting the data there is staggering.

Caching might slightly speed up the server side, but it does nothing for the initial cold-cache hit, the amount of data sent across the wire, or the amount of RAM this uses on the client.

Implementing a server-side search like the ones provided by GeoNames or Google is actually quite an undertaking, and you might want to investigate the existing options.

A really naive implementation would be:

@locations = Location.where(
  "canonical_name like ?", "%#{params[:query]}%"
).select(:id, :canonical_name)
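
To make this concrete, here is a sketch of how that query might slot into the index action from your question, with a limit so only a handful of matches ever leave the server. The params[:query] name is an assumption; it has to match whatever parameter your autocomplete plugin actually sends:

def index
  # params[:query] is assumed to be the term typed into the autocomplete field.
  # For production use you'd also want to escape LIKE wildcards (% and _) in it.
  @locations = Location
    .where("canonical_name like ?", "%#{params[:query]}%")
    .select(:id, :canonical_name)
    .limit(10) # never ship more than a handful of suggestions

  respond_to do |format|
    format.json { render json: @locations }
  end
end

On the client side you would then point EasyAutocomplete at this endpoint as a remote data source (roughly the pattern in the example linked in the comments above), so the filtering happens in the database instead of in the browser.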
  • In fact, if you ever find yourself asking for the fastest way to search an array, there is a 99% chance you're doing it completely wrong.
    – max
    Commented Apr 10, 2020 at 18:31
