A bit too much madness…

So this week's webinar didn't exactly go as planned… I guess the internet couldn't handle that much awesome and decided to cut out halfway through, which is a real shame, because it was shaping up to be one for the record books! Anyway, I apologise for the outage – rest assured, our internet provider got a right talking-to – it's one of the hazards of having a hipster London office.

I know you guys were loving the Google Sheets madness (brought to you by the man, the myth, the legend: Andrew Fogg), and while I wasn't able to salvage the recording, I have written down a brief outline of everything we talked about (and were planning to talk about), along with links to relevant tutorials.

Integrating Live Data into Google Sheets

The first thing we need to do is get our data into a Google Sheet. Head over to your My Data page and select the extractor you want to integrate – you'll see that there's a Google Sheets tab.

Before you can access the formula in full, you'll need to get your API key. Click on the link that says "API KEY" and enter your import.io password. Your API key is unique to you, and it's what connects your account to your data.

Once you've put in your API key, you can copy the full formula and paste it into a cell in Google Sheets. The sheet will go to our API, which will call the page and return all the data into the sheet.
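To give you a rough picture of the shape of the formula: the real query URL is generated for your specific extractor and account on the Google Sheets tab, so the angle-bracket part below is just a placeholder.

=ImportHtml("<your import.io API query URL, with your API key filled in>", "table", 1)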

Let’s look at that formula in a bit more detail…

First is =ImportHtml, which is a Google Sheets formula to (as you'd imagine) import HTML.

The next bit is the query to the import.io API – the long URL inside the formula.

If you were to paste that URL into your browser's address bar (with your API key in, obviously), it would return all the data.

I also want to draw your attention to the URL I've bolded – that's the page the data is being pulled from. If I were to change it to another Ikea URL, I'd get the data for that page back instead.

The last part, "table", 1, just finishes off the Google Sheets formula by telling it to look for a table in what the query returns and to pull the first one it finds.

We can make it much easier to change the URL by inserting a row above the table and pasting the relevant Ikea URL into cell A1. Then we edit the formula, taking out the Ikea URL and replacing it with "&A1&" – the cell reference for where we pasted the URL. Now all we have to do to get new data from Ikea is paste a new URL into cell A1, instead of changing the formula each time.
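For illustration, assuming the original query URL contained the Ikea page URL somewhere inside it, the edited formula just splits that string around the page URL and splices in the cell reference – something like this (the angle-bracket parts are placeholders for your real query):

=ImportHtml("<query up to the page URL>"&A1&"<rest of the query>", "table", 1)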

Refreshing your data

The question that always gets asked at this point is: How often does the data refresh?

You can manually refresh the data by re-pasting the value into A1, which forces a recalculation of your API data. The formula is also supposed to refresh automatically on a schedule; Google doesn't say how often that happens, but people on the web guesstimate it's about once an hour.

Connecting two extractors

Now that we have the building blocks of integrating data into Sheets, we can start to do some more complex things with it. To start with, we'll combine two extractors. This is useful if the data you want is spread across two pages.

[Screenshots: links to contracts (1st extractor) and contract details (2nd extractor)]

You’ll need two extractors for this, one to pull the links from the first page and then a second one to get the information from the second page.

Paste the formula for each extractor into a separate tab in Google Sheets. Then, in the formula of the second one (the one with the page information), change the URL to a cell reference pointing at the relevant URL in the first tab – just like you did in the first example.
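As a sketch, assuming the first extractor's output sits in a tab I'm calling Links, with the first contract URL in cell A2 (rename these to match your own sheet), the second extractor's formula would look something like:

=ImportHtml("<query up to the page URL>"&Links!A2&"<rest of the query>", "table", 1)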

Batch Search

That’s all great, but what if you want to run all the links you gathered in the first extractor through the second one? That’s where batch search comes in! This one took a few scripts to manage, so Andrew has made it into a handy spreadsheet for you.

You'll need to fill in your User ID and API key (which you can get from your account page) and the GUID of the extractor you want to run the URLs through (which you can get from the URL bar on the My Data page).

[Screenshot: the GUID highlighted in the URL bar]

Once you've plugged those in, hit the "Get Source" button, which will go and find your extractor at import.io and let you know which inputs you can use. Next, paste your list of URLs into column A, starting at cell A20 and going down (one URL per cell). Then hit "Get Results" and the sheet will run a script that passes each URL through the extractor and returns the results into a new tab.

Join us next time

Our next webinar will, hopefully, have enough internet to make it all the way through – I've already started stockpiling! To get you all in the holiday spirit, we're doing a Getting Started webinar with a Christmas twist! There'll be ugly Christmas jumpers, figgy pudding and carolling! Sign up now.

