Conversation

@tommelo commented Jun 25, 2017

As mentioned in Issue #6, I have added support for returning multiple results:

var scraper = require('google-search-scraper');

var options = {
  query: 'nodejs',
  limit: 10,
  fullResult: true
};

scraper.search(options, function(err, results) {
  if (err) throw err;
  // Array of URLs
  console.log(results);
});

@steventsao

Thanks @tommelo. This prevents an issue where a crawler gets stuck when there are fewer results than the set limit. The original per-result callback never tells the caller that there are no more URLs, which makes the following implementation fragile:

let urls = [];
scraper.search(options, (err, url) => {
  urls.push(url);
  // handleResults is never called if there are fewer results than the limit.
  if (urls.length === options.limit) {
    handleResults(name, urls);
  }
});
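To illustrate why the single-callback shape fixes this, here is a minimal self-contained sketch. The `search` function below is a stub standing in for `scraper.search` (the real module hits Google over the network); it simulates a query that finds only 3 results despite `limit: 10`. With the `fullResult` callback shape, the handler still fires exactly once with everything that was found, so no length check against `options.limit` is needed:

```javascript
// Stub for scraper.search: invokes the callback once with the full
// array of results, even when fewer exist than options.limit.
function search(options, callback) {
  // Pretend the engine only found 3 URLs for this query.
  var found = [
    'https://a.example',
    'https://b.example',
    'https://c.example'
  ];
  callback(null, found.slice(0, options.limit));
}

search({ query: 'nodejs', limit: 10, fullResult: true }, function(err, results) {
  if (err) throw err;
  // Runs even though results.length (3) < options.limit (10),
  // which is exactly the case the per-result callback mishandled.
  console.log(results);
});
```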

