For venture capital (VC) firms, staying on top of fast-moving technology trends is critical. By leveraging Import.io, Madding King III at Camp One Ventures was able to establish a useful and objective way to track trending buzzwords in the VC sector. Read on to find out what he discovered, and…
What is Artificial Intelligence? Our Chief Scientist Louis Monier gives you the straight dope on AI. Artificial Intelligence, always a very polarizing subject, is back on top of the news. Unless you have been on a deep space mission for the past year, you have been exposed to opinions ranging…
If you haven’t heard of R before, you should know that it’s one of the most popular statistical programming languages in the world, used by millions of people. Its open-source nature fosters a great community that helps make data analysis accessible to everyone. If you want a better understanding of how R works and its syntax, we recommend taking this free Introduction to R tutorial by DataCamp.
While import.io gives you access to millions of data points, R gives you the means to perform powerful analysis on that data and to turn it into beautiful visualizations. It’s a pretty nifty combo!
In this post, you’ll learn 3 easy ways to get your import.io data into R. This is a beginner tutorial so don’t worry if you’re not that familiar with R or import.io’s advanced features.
Let’s get started!
The word “crawling” has become synonymous with any way of getting data from the web programmatically. But true crawling is actually a very specific method of finding URLs, and the term has become somewhat confusing.
Before we go into too much detail, let me just say that this post assumes the reason you want to crawl a website is to get data from it, and that you are not technical enough to code your own crawler from scratch (or you’re looking for a better way). If one (or both) of those things are true, then read on, friend!
In order to get data from a website programmatically, you need a program that can take a URL as an input, read through the underlying code, and extract the data into a spreadsheet, JSON feed, or other structured data format you can use. These programs – which can be written in almost any language – are generally referred to as web scrapers, but we prefer to call them Extractors (it just sounds friendlier).
A crawler, on the other hand, is one way of generating the list of URLs you then feed through your Extractor. But crawling isn’t always the best way.
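To make the distinction concrete, here is a minimal sketch in Python using only the standard library. The HTML snippet and the `<h2>`-based “extraction rule” are invented for illustration – this is not how import.io works under the hood, just a toy version of the two roles: a crawler collects URLs, an Extractor turns a page into structured data.

```python
import re
from html.parser import HTMLParser

class LinkCrawler(HTMLParser):
    """A 'crawler' in the strict sense: all it does is collect URLs from a page."""
    def __init__(self):
        super().__init__()
        self.urls = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.urls.append(value)

def extract_titles(html):
    """A toy 'Extractor': turns raw HTML into structured data (a list of dicts)."""
    return [{"title": t} for t in re.findall(r"<h2>(.*?)</h2>", html)]

# Hypothetical page content, just for demonstration.
page = '<a href="/page/1">one</a><a href="/page/2">two</a><h2>Hello</h2>'

crawler = LinkCrawler()
crawler.feed(page)
print(crawler.urls)          # ['/page/1', '/page/2']
print(extract_titles(page))  # [{'title': 'Hello'}]
```

In a real project you would feed each URL the crawler finds back into the Extractor – which is exactly the pipeline described above, and why a crawler on its own doesn’t get you any data.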
The latest company to integrate with our Magic API is the always innovative Silk.co. Now you can extract data from a website and create an interactive visualization automatically. All you do is choose a website. It really is that easy.
Don’t believe us? Read the rest of the post and see for yourself!
If you’re a data nerd like us, you’re surely familiar with the powerful data visualization software that is Tableau. And if you’re not, you should totally check them out. Anyway, the folks at Tableau have just released version 9.1, and with it comes a very exciting update.
Tableau 9.1 includes a Web Data Connector feature that lets you push data into Tableau via an API!
Alex here, and thanks to everyone who joined yesterday’s webinar. For those who missed it, we took a sneak peek at a couple of brand-new features that quite a few people have been asking for – Bulk Extract and the new Bookmarklet are coming to make your data extraction projects with import.io even smoother.
This is a recap of our most recent webinar where we looked at advanced crawling techniques using import.io. Follow us down the garden XPath as we check out some features for confident users looking to get the most out of their crawlers.
This webinar is all about our advanced features. If you’re new to import, I recommend you watch this Getting Started webinar first, because we’ll be skipping some of the basics to get down into the real meat of what import can do. Advanced crawling, XPaths, URL templates – this webinar’s got all that and more.
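For readers new to the XPath idea the webinar covers, here is a small, self-contained illustration using Python’s standard library. The product catalog and the path expression are invented examples (import.io’s own XPath support is richer than ElementTree’s limited subset); the point is simply that an XPath describes *where* in a document the data you want lives.

```python
import xml.etree.ElementTree as ET

# A hypothetical page structure, simplified to XML for the example.
doc = ET.fromstring("""
<products>
  <product><name>Owl Mug</name><price>9.99</price></product>
  <product><name>Owl Tee</name><price>19.99</price></product>
</products>
""")

# ElementTree supports a subset of XPath; this path selects every product name.
names = [e.text for e in doc.findall(".//product/name")]
print(names)  # ['Owl Mug', 'Owl Tee']
```

The same idea scales up: once you can express “every product name on this page” as a path, a crawler can apply that one rule across hundreds of URLs.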
This morning I stumbled across Tagul, an online word cloud creation tool with a couple of really cool features. The first is that you can upload your own image for it to fit your words into (I obviously immediately uploaded our pink owl, Owen). The second is that you can give it a URL to pull the words from.