How to build a web app with data

An import.io Webinar Production

Hello again, lovely people. For this week’s webinar I mostly handed the reins over to our resident Developer Experience Engineer, Chris A, who showed you how to build an app with data!

What is an app?

So, first of all, why would you want to build an app? Generally, it’s because you want to solve a problem. In Chris’s case, he wanted to build his own computer and he needed computer parts. He can buy these parts online from a number of different sites, but there is no good way to compare all of them. So Chris decided to build an app that would let him compare products from all the sites at once.

When you think about it, an app is really just a cool idea, wrapped up in a bit of code, powered by some data. At import.io, we help you take care of the data part, as long as it’s on the web… and a little bit of the coding too! The great thing about using our tool is that not only do we let you get data in a number of different ways, we also generate the code to help you integrate it.

For this webinar, we didn’t actually show you how to build a data source (not enough time), but you can see us build this particular computer parts Connector (and Mix) in our previous webinar.

Integrating your Data

When building your app, the first thing to do is to start from the Dataset page with your data source in it and hit the “Integrate” button. The next step is to pick the programming language you want to use from the list in the middle. For this example, Chris used JavaScript, since he was building a web app.

Once you’ve chosen your language, you will need to retrieve your API key by entering your password into the box. This will give you your API key and your User GUID, and it will also update all the code snippets to include the data source you have chosen to integrate.

The easiest way to see how an integration will work is to scroll down and hit “Run this example”. This opens a new tab; when you hit the “Query” button, it runs a live query against your source and displays the results in a simple HTML format on the page.

From there, Chris showed you how to quickly modify the source code of that page to include a search box, which you can use to query the data source with new inputs.
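If you want to try the same thing yourself, here’s a minimal sketch of that pattern. The endpoint URL, GUIDs and the “name” and “price” columns are placeholders; the real values (and the exact request shape) come from the snippet your own Integrate page generates.

```html
<!-- A minimal sketch of wiring a search box to a Connector query.
     The endpoint URL, GUIDs and column names below are placeholders;
     copy the real snippet from your own Integrate page. -->
<input id="search" type="text" placeholder="Search for a part...">
<button id="go">Query</button>
<ul id="results"></ul>

<script>
var API_KEY = "YOUR_API_KEY";               // from the Integrate page
var USER_GUID = "YOUR_USER_GUID";           // from the Integrate page
var CONNECTOR_GUID = "YOUR_CONNECTOR_GUID"; // the source to query

document.getElementById("go").onclick = function () {
  var term = document.getElementById("search").value;
  var xhr = new XMLHttpRequest();
  // Placeholder URL; use the one from your generated snippet.
  xhr.open("POST", "https://api.import.io/store/connector/" + CONNECTOR_GUID +
    "/_query?_user=" + USER_GUID + "&_apikey=" + encodeURIComponent(API_KEY));
  xhr.setRequestHeader("Content-Type", "application/json");
  xhr.onload = function () {
    // Assumes the response carries a "results" array, as the generated
    // example page does; adjust to match your own snippet.
    var results = JSON.parse(xhr.responseText).results || [];
    var list = document.getElementById("results");
    list.innerHTML = "";
    results.forEach(function (r) {
      var li = document.createElement("li");
      li.textContent = r.name + " - " + r.price; // example columns
      list.appendChild(li);
    });
  };
  xhr.send(JSON.stringify({ input: { "search": term } }));
};
</script>
```

The important part is that the query input comes from the text box instead of being hard-coded, so every search fires a fresh live query against the source.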

That’s all pretty neat, but it’s still only one data source. So next, Chris showed you how to combine a bunch of different computer parts Connectors to create a Mix, so that he could search for one term across all six sources.

Here’s the Particle Mix.

The best thing about making a Mix is that you can integrate it just like a Connector. Chris showed you where to find the query function and demonstrated that it really is as simple as copying and pasting it into the app he wrote.

You can access the code Chris wrote for his web app here (you will need your own API key and User GUID).

Once you’ve got the data through import.io and into your app, you can run a bit of extra analysis to make the data more relevant. In this example, Chris used an algorithm to filter the responses by relevancy and price.
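Chris’s exact algorithm wasn’t shown step by step, but a simple version of that kind of filter might look like the sketch below; the “name” and “price” fields are illustrative, and you’d use whichever columns your own sources return.

```javascript
// A rough sketch of relevance-and-price ranking (not Chris's exact
// code): score each result by how many words of the search term
// appear in its name, drop the non-matches, then sort by score and
// price. "name" and "price" are placeholder column names.
function rankResults(results, term) {
  var words = term.toLowerCase().split(/\s+/);
  return results
    .map(function (r) {
      var name = (r.name || "").toLowerCase();
      var score = words.filter(function (w) {
        return name.indexOf(w) !== -1;
      }).length;
      return { item: r, score: score };
    })
    .filter(function (s) { return s.score > 0; }) // drop irrelevant rows
    .sort(function (a, b) {
      return (b.score - a.score) ||                       // most relevant first
             ((a.item.price || 0) - (b.item.price || 0)); // then cheapest
    })
    .map(function (s) { return s.item; });
}
```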

For more developer tools and open-source code, visit our company GitHub page; we have loads of helpful web apps to let you do more with your data. Or visit our Tech Blog for ideas, use cases and how-tos using some of our more advanced features; there’s also a dedicated Technical Tutorials section. Finally, if you have a question, visit our Developer Forum or use our tag on StackOverflow.

Question Time

What do I do if my Crawler gets stuck at page 4 or 5?

If your Crawler appears to be getting stuck after the first couple of pages, this is most likely a mapping issue. We only require that you train 5 pages, but if pages vary slightly, it is a good idea to train more than that. For example, when he was crawling his blog, Chris trained 10 pages to make sure he covered all the possible variations of the data. This isn’t strictly necessary, but the more training you give the Crawler, the better it will work. If the problem persists, send the 5 URLs over to us at support@import.io and we can take a look for you!

Can I slip you a tenner to get my APIs working?

Surprisingly, we get this question a lot; and while I would love a little extra beer money, I’m afraid I have to decline. We try to help everyone as much as we can, and sometimes that means we may not be able to respond to your request immediately – especially for those of you in the US. That’s why we’ve started offering paid fast-track support for those of you whose businesses depend on your APIs working. We’re still working through all the details, but if you’re interested in this service, get in touch with our sales team via our enterprise page.

What do I do if I get a “#ERROR” message when pasting the code into Google Sheets?

This is an internal Google error message, so it is difficult to know exactly what has happened. The best thing to do is copy the URL of the Integrate page you got the code from and send it to our support team so we can look into it for you.

Can I connect an Extractor to a Connector?

This is something we support through our API, but not currently on our site. You can find some example source code for doing it on our GitHub page.
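As a rough illustration of the idea (not the exact code from our GitHub page): you query the Connector first, then feed each URL it returns into the Extractor. The querySource() helper below is hypothetical shorthand for the query snippet the Integrate page generates.

```javascript
// Hypothetical sketch of chaining an Extractor onto a Connector via
// the API. querySource(), the GUIDs and the "url" field are all
// placeholders standing in for your generated query code.
function querySource(sourceGuid, input, callback) {
  // A real version would POST { input: input } to the import.io
  // query API for sourceGuid and hand the results to callback.
  callback([]); // stubbed so the sketch runs as-is
}

querySource("CONNECTOR_GUID", { "search": "graphics card" }, function (rows) {
  rows.forEach(function (row) {
    // Assume each Connector row carries a product page URL.
    querySource("EXTRACTOR_GUID", { "webpage/url": row.url }, function (details) {
      console.log(details); // full product details from the Extractor
    });
  });
});
```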

Can I schedule import.io to crawl sites regularly?

Right now, you can schedule your Crawler to run automatically using the command line. We are also working on a way to let users do this through the UI.
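For instance, if your query code lives in a Node.js script, a simple timer can stand in until UI scheduling arrives; runQuery() below is a placeholder for whatever command or snippet actually fires your query.

```javascript
// A minimal Node.js sketch: fire the query once a day. runQuery()
// is a placeholder for your actual query code or command-line call.
var DAY_MS = 24 * 60 * 60 * 1000;

function runQuery() {
  console.log("Running scheduled query at", new Date().toISOString());
  // ...call the import.io query API here...
}

runQuery();                    // run once at startup
setInterval(runQuery, DAY_MS); // then repeat daily
```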

How do the content owners feel about this scraping of data?

At import.io, we sit between the content owners and the people who need data. Our aim is to make the exchange of data as painless as possible for both parties. In most cases, content owners actually benefit from the reuse of their data. For example, when Graham created his affiliate site, he started sending loads more traffic and sales to the sites he got the data from. Data scraping only becomes a problem for the content owner when the person taking the data is breaching the site’s copyright. import.io is a pipeline for data; we don’t store or index any of it, so it is the responsibility of each user to understand and obey the copyright rules of each site they pull data from.

Join us next time…

You’re in luck! Next week we’ll be doing two webinars! On Tuesday, Chris and I will be back to show you what sort of cool app our team has built during the internal “hack day” (Sign up). And on Thursday, Andrew will teach you how to use Google Sheets to combine an Extractor and a Connector (Sign up).

Comments

Hi
I like the JavaScript example done by Chris above. Thanks, the video is a great tutorial.

2 questions:

  1. How would I create a Connector that is more advanced, in that I have to input 2 fields? E.g., on rightmove.co.uk I want to query by postcode AND type of property at the same time.

  2. Do you have a video/tutorial/blog post on how I can format my query results? I.e., make them look pretty?

Thanks again

Hi Faisal,
Glad you liked it! To answer your questions…

  1. You should be able to record multiple inputs using the URL pattern. Here is a tutorial on doing just that: http://support.import.io/knowledgebase/articles/249415-advanced-connector-url-patterns

  2. I believe you would need to write your own JavaScript to format your results. You can have a look at our GitHub page, which has a bunch of open-source projects that might be what you’re looking for: https://github.com/import-io

Hope that helps! If you have any more questions, please email support@import.io and we’ll do everything we can to help you out 🙂

Hi,
Thanks for recording this webinar, guys. You know there are a lot of people who are not professional coders but still like to make simple apps for their own use [or for their clients].

So this intro webinar was really useful for the rest of us 🙂
Thanks again for taking the time to produce this.
