
To build the array of strings needed for jQuery autocomplete, I'm traversing an array of objects and flattening it into an array of unique strings. I'm doing this about how you might expect:

var input = [{ prop1: 'value1', prop2: 'value2' }];
var keywords = [];

$.each(input, function (index, value) {
    $.each(value, function (i, v) {
        if (v != undefined && v != null && v != '' && ($.inArray(v, keywords) < 0)) {
            keywords.push(v);
        }
    });
});

The problem is that it performs poorly for large input sets, and blocks the UI while it's running. How can I make this better?

2 Answers


You've got an O(n^2) algorithm there. Looking through that keywords array over and over again is going to get really slow (as you've noticed).

Instead, keep the keywords as an object:

var keywords = {};

// ... inside your ".each()":
if (v && !keywords[v])
    keywords[v] = true;

Note that your tests of "v" against undefined, null, and the empty string can be replaced by letting JavaScript check its truthiness. That's not always the right thing, but it's fine here.

You can also keep the keywords in an array, if you like. Just don't search the array to determine whether a keyword has been seen yet.
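Putting the answer's suggestion together with the original loop, a minimal sketch might look like this (the `seen` object and the sample `input` are illustrative assumptions, and plain ES5 `forEach`/`Object.keys` stand in for `$.each` so the snippet has no jQuery dependency):

```javascript
// Hypothetical input shaped like the question's data, with one duplicate value.
var input = [
    { prop1: 'value1', prop2: 'value2' },
    { prop1: 'value1', prop2: 'value3' }
];

var seen = {};      // object used as a hash set: O(1) membership checks
var keywords = [];  // final array of unique strings for autocomplete

input.forEach(function (obj) {
    Object.keys(obj).forEach(function (key) {
        var v = obj[key];
        if (v && !seen[v]) {  // truthiness check replaces the undefined/null/'' tests
            seen[v] = true;
            keywords.push(v);
        }
    });
});
// keywords → ['value1', 'value2', 'value3']
```

This keeps the answer's point intact: the array still exists for autocomplete, but membership is checked against the object, so the whole pass is O(n) instead of O(n^2).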

  • +1 it should be noted that arrays and objects are handled internally the same way by JavaScript. This way at least the engine does the uniqueness checking instead of your code. But the performance of keywords[100] and keywords['some-string'] for an array and an object with the same number of properties/elements is probably similar. Commented Jun 30, 2011 at 15:13
  • Well yes, but that's not really the point - the key is to index by the string itself, to take advantage of the fast hashtable implementation internal to the JavaScript runtime.
    – Pointy
    Commented Jun 30, 2011 at 15:15
  • Right, I agree. I just wanted to point out that there's not really a downside to doing this (e.g. arrays are not a faster storage mechanism than objects anyway). Intuitively, one would expect array access by index to be faster than objects by key, but it's not the case. So in some situations a programmer may think they should use an array (because later access would be by numeric index). Interestingly a quick test I set up shows objects FASTER than arrays for like uses: jsperf.com/array-vs-object-jamie Commented Jun 30, 2011 at 15:25
  • Whoah - just ran that test in firefox. Results COMPLETELY different - arrays more than 10 times faster. And both results faster than Chrome. That is unexpected. Firefox must optimize arrays in some way. Interesting. Test probably not perfect though (as I create a variable def in the loop). Commented Jun 30, 2011 at 15:30
  • I get the same results. That's pretty amazing.
    – Pointy
    Commented Jun 30, 2011 at 15:33
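The cost difference the answer and comments are debating can be sketched with a rough timing comparison between an object lookup and a `$.inArray`-style linear scan (the collection size, key names, and `timeIt` helper are all arbitrary assumptions; as the comments note, absolute numbers vary widely by engine):

```javascript
// Build the same keys into an object (hash set) and an array.
var N = 2000;
var seen = {};
var list = [];
for (var i = 0; i < N; i++) {
    var key = 'keyword' + i;
    seen[key] = true;
    list.push(key);
}

// Tiny helper: wall-clock milliseconds for a function call.
function timeIt(fn) {
    var start = Date.now();
    fn();
    return Date.now() - start;
}

// N object lookups: O(1) each, O(n) total.
var objMs = timeIt(function () {
    for (var i = 0; i < N; i++) {
        if (!seen['keyword' + i]) throw new Error('missing');
    }
});

// N linear scans: O(n) each, O(n^2) total -- the question's pattern.
var arrMs = timeIt(function () {
    for (var i = 0; i < N; i++) {
        if (list.indexOf('keyword' + i) < 0) throw new Error('missing');
    }
});

console.log('object lookups:', objMs + 'ms, array scans:', arrMs + 'ms');
```

The gap grows quadratically with N, which is why the original code "performs poorly for large input sets" even though each individual `$.inArray` call looks cheap.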

I think you're trying to fix the wrong problem.

Why is your autosuggestion list so big that JavaScript performs poorly? Just send the top 10 autosuggestions to the client and you'll be fine.

  • I'm creating a UI widget that has a built-in autocomplete. I don't want developers to have to create a specialized indexing service just to generate the autocomplete data. Commented Jun 30, 2011 at 15:01
  • So what should a developer do if they have 10 million records with data that has to be suggested by the autocomplete widget?
    – keymone
    Commented Jun 30, 2011 at 15:04
  • Set {autocomplete: false} in my widget and use their own solution. This will work for the vast majority of cases, assuming I can get the performance I want... Commented Jun 30, 2011 at 16:08
