Getting started with Crawlers

Yesterday, Chris A and I ran our second webinar, where we showed the more than 50 people in attendance how to build a Crawler! If you weren’t able to make it, don’t worry! We recorded the whole thing so you can re-live the excitement all over again. We also have a great Crawler Tutorial, if you’re more of the reading type.

After I demonstrated the power of the Crawler, we tried to answer some of your questions about crawlers. Unfortunately, we weren’t able to get through all of them, so Chris and I have pulled out the most popular ones and answered them for you here!

Questions from the Webinar

What is a Dataset?

In simple terms, a Dataset is a combination of one or more data sources. Every time you build an Extractor, Crawler or Connector on our platform and click that awesome “Show me the data” button, you are taken to a Dataset page for that source.

Once you’re on a Dataset page there are loads of cool things you can do such as combine it with other data sources, share it via social media, download it in a number of formats or visit our integrate page to access the code for it. You can learn more about how Datasets work and what you can do with them here.

What is the Dashboard?

The import•io Dashboard is really just another way of viewing your data. The ‘spreadsheet view’ that the Dataset page gives you isn’t always the most exciting, so we created Dashboards as a more visually appealing way to see your data. For more info on how to use and create Dashboards click here.

How often does the crawler run?

The crawler runs only when you tell it to. You can re-run a crawler you’ve built at any time by clicking the ‘Edit’ button on the Crawler on the My Data page, checking the settings and then hitting ‘Go’ again. Bear in mind, though, that re-running the crawler will replace all your previously crawled data, so if you want to keep it you need to download it to your own machine first.

If you only need to refresh a part of your crawled data, you can use this awesome web app Chris built, which shows you how to use the Crawler over the API to update only the rows you need changed.
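For the curious, the idea behind refreshing individual rows looks roughly like this. This is only a minimal sketch: the endpoint URL, the `_apikey` query parameter and the `webpage/url` input name are assumptions for illustration, so check the API docs (or Chris’s web app) for the real values. The sketch just builds one query request per page you want refreshed, rather than re-crawling the whole site.

```python
import json

# Hypothetical query endpoint for a crawler -- verify against the real API docs.
API_URL = "https://api.import.io/store/connector/{crawler_id}/_query"

def build_refresh_queries(crawler_id, api_key, urls):
    """Build one API query per URL so only those rows get refreshed.

    Returns a list of {"url": ..., "body": ...} dicts ready to POST
    with your HTTP client of choice.
    """
    queries = []
    for url in urls:
        queries.append({
            # Assumed auth style: API key passed as a query parameter.
            "url": API_URL.format(crawler_id=crawler_id) + "?_apikey=" + api_key,
            # Assumed input name "webpage/url" for the page to re-crawl.
            "body": json.dumps({"input": {"webpage/url": url}}),
        })
    return queries

queries = build_refresh_queries("my-crawler-id", "MY_API_KEY", [
    "http://example.com/page1",
    "http://example.com/page2",
])
print(len(queries))
```

Each returned entry is one request to the crawler, so only the rows for those two pages would be updated; everything else in your Dataset stays untouched.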

Where do I find the video?

Glad you asked! I’ve uploaded the video to the YouTube channel and embedded it here as well!

Even more fun with Crawlers

For those of you who are a bit more technical and want to get the most out of your crawler, I recommend you read this Advanced Crawling tutorial, which will show you all the awesome features available in our Advanced mode.

Hope to see you at the next one!

Can’t get enough of our webinars? You’re in luck! The next one will be on 8 April at 4pm GMT. Chris and I will be showing you how to use Extractors and Connectors, plus walking you through a BRAND NEW feature! Sign up here!

If you have any topics or questions you want us to cover in future webinars, email me!
