Originally posted on February 25th, 2016. If you’re one of the 125k KimonoLabs users who got this message last week… “After almost 2 years of building and growing kimono, we couldn’t be happier to announce that the kimono team is joining Palantir.” …you’re probably wondering what to do next. KimonoLabs…
Once the sole purview of academics and a few of the largest high-tech companies, deep learning is now poised for rapid and widespread growth across a range of companies and industries. Artificial intelligence (AI) was first introduced as a concept in the 1950s. Initially, AI was inherently…
For venture capital (VC) firms, staying on top of fast-moving technology trends is critical. By leveraging Import.io, Madding King III at Camp One Ventures was able to establish a useful and objective way to track trending buzzwords in the VC sector. Read on to find out what he discovered, and…
What is Artificial Intelligence? Our Chief Scientist Louis Monier gives you the straight dope on AI. Artificial Intelligence, always a very polarizing subject, is back on top of the news. Unless you have been on a deep space mission for the past year, you have been exposed to opinions ranging…
If you haven’t heard of R before, you should know that it’s one of the most popular statistical programming languages in the world, used by millions of people. Its open-source nature fosters a great community that helps make data analysis accessible to everyone. If you want a better understanding of how R works and its syntax, we recommend taking this free Introduction to R tutorial by DataCamp.
While import.io gives you access to millions of data points, R gives you the means to perform powerful analysis on that data and to turn it into beautiful visualizations. It’s a pretty nifty combo!
In this post, you’ll learn 3 easy ways to get your import.io data into R. This is a beginner tutorial so don’t worry if you’re not that familiar with R or import.io’s advanced features.
Let’s get started!
The word “crawling” has become synonymous with any way of getting data from the web programmatically. But true crawling is actually a very specific method of finding URLs, and the term has become somewhat confusing.
Before we go into too much detail, let me just say that this post assumes that the reason you want to crawl a website is to get data from it, and that you are not technical enough to code your own crawler from scratch (or you’re looking for a better way). If one (or both) of those things are true, then read on, friend!
In order to get data from a website programmatically, you need a program that can take a URL as an input, read through the underlying code, and extract the data into a spreadsheet, JSON feed, or other structured data format you can use. These programs – which can be written in almost any language – are generally referred to as web scrapers, but we prefer to call them Extractors (it just sounds friendlier).
A crawler, on the other hand, is one way of generating the list of URLs you then feed through your Extractor. But crawlers aren’t always the best way to build that list.
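To make the Extractor idea above concrete, here is a minimal sketch in Python. It is not import.io's implementation – just an illustration of the pattern the post describes: page code goes in, structured rows come out. The HTML is inlined so the sketch runs offline (a real extractor would first fetch it from a URL, e.g. with `urllib.request`), and the class names like `product` are hypothetical.

```python
# A toy Extractor: given page HTML, pull out structured records.
# The sample HTML stands in for a fetched page; the "product",
# "name", and "price" classes are made up for this illustration.
from html.parser import HTMLParser

SAMPLE_HTML = """
<ul>
  <li class="product"><span class="name">Widget</span><span class="price">9.99</span></li>
  <li class="product"><span class="name">Gadget</span><span class="price">14.50</span></li>
</ul>
"""

class ProductExtractor(HTMLParser):
    def __init__(self):
        super().__init__()
        self.rows = []     # extracted records, one dict per product
        self.field = None  # field we are currently inside, if any

    def handle_starttag(self, tag, attrs):
        cls = dict(attrs).get("class")
        if tag == "li" and cls == "product":
            self.rows.append({})          # start a new record
        elif tag == "span" and cls in ("name", "price"):
            self.field = cls              # capture text into this field

    def handle_endtag(self, tag):
        if tag == "span":
            self.field = None

    def handle_data(self, data):
        if self.field and self.rows:
            self.rows[-1][self.field] = data.strip()

parser = ProductExtractor()
parser.feed(SAMPLE_HTML)
print(parser.rows)
# → [{'name': 'Widget', 'price': '9.99'}, {'name': 'Gadget', 'price': '14.50'}]
```

The output is exactly the kind of structured data format the post mentions: from here it is one short step to a spreadsheet row or a JSON feed.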
The latest company to integrate with our Magic API is the always innovative Silk.co. Now you can extract data from a website and create an interactive visualization automatically. All you do is choose a website. It really is that easy.
Don’t believe us? Read the rest of the post and see for yourself!
If you’re a data nerd like us, you’re surely familiar with the powerful data visualization software that is Tableau. And if you’re not, you should totally check them out. Anyway, the guys at Tableau have just released version 9.1 and with that comes a very exciting update.
Tableau 9.1 includes a Web Data Connector feature that lets you push data into Tableau via an API!
We are super excited to announce that Blockspring has just released an integration with us that lets you automatically pull data from the web into a spreadsheet. They’ve created an awesome solution that gives you access to a range of great APIs from nothing more than a spreadsheet. Their Excel and Google Sheets plugin enables you to bring data into your spreadsheet, run text analysis, and much more.
So today we want to show you how you can use live web data from a spreadsheet to do a bunch of cool things in just a few minutes. Currently, Blockspring uses our Magic API, which automatically generates a table of data from a URL. You just provide the Blockspring integration with a URL and it pulls the data from that site into a nice, orderly table – all without leaving your spreadsheet.
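The flow described above boils down to: a URL goes in, a table of rows comes back. This Python sketch illustrates only the second half – turning a table of extracted rows into CSV that any spreadsheet can open. The rows here are inlined stand-ins for what the Magic API would return; it is not Blockspring's or import.io's actual client code.

```python
# Turn a table of extracted rows into CSV for a spreadsheet.
# The rows below are placeholder data standing in for an API response.
import csv
import io

rows = [
    {"title": "Post A", "views": 120},
    {"title": "Post B", "views": 87},
]

buf = io.StringIO()
writer = csv.DictWriter(buf, fieldnames=["title", "views"])
writer.writeheader()       # first line: title,views
writer.writerows(rows)     # one CSV line per extracted row

print(buf.getvalue())
```

In the real integration this serialization step happens behind the scenes – the plugin drops the table straight into your sheet – but the shape of the data is the same.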
In my job at Silk.co, I help lots of journalists, NGO workers, and marketers build data visualizations from spreadsheets. Often we use Import.io to extract data from the public Internet into a spreadsheet, which we then upload into Silk for further analysis and visualization. (Here’s one we did with Import.io about Uber jobs, which was picked up by Mashable.) I spend considerable time thinking about how best to represent data with visualizations, though I am by no means an expert in data visualization on the level of Alberto Cairo or Edward Tufte.
That said, I do have some basic visualization guidelines that I use. These guidelines enable anyone to quickly match the goal of their data visualization to the visualization type (or types) that should work best for their data.