One of the best things about running the User Success team at import•io is all the cool use cases you come across on a daily basis. The innovative ways our users find to benefit from and integrate data they get from our platform never cease to amaze me. It seemed a bit selfish to keep all these awesome ideas to myself, so I’m going to be showcasing some use cases that catch my attention.
Hopefully, these use cases will give you a little inspiration for how you too can use data to do some pretty cool stuff. After all, it’s easy to use the import•io browser to extract data, but what happens next? What do you do with the data?
This case study shows how to filter a massive data set into a more manageable set of targeted results.
Searching for a python developer in New York City
Jason runs a company called Plug.dj. With his business expanding, Jason needed to grow his team by hiring a Python developer. He wanted to compile a list of all the potential candidates from a list of NYC-based developers and then filter it by his criteria to find the perfect dev for the job. He followed the same process on a number of different sites, but as an example we’ll show you how he did it with angel.co.
Achievable data output
Jason wants a filtered CSV file of a number of suitable candidates to contact.
As always, the first step is to collect the data you need. Jason started a crawler on https://angel.co/developer to collect the name, investments, number of followers, location, role and history of every listed developer.
Angel.co presented an unusual crawling problem, because it uses AJAX calls instead of new URLs to “load” its pages. If you run into a site like this that you want to crawl, you can generally get around the issue by putting the corresponding URL pattern into the “Where to Crawl” field in your crawler’s advanced options, like so:
and so on…
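If you need more than a handful of these paginated URLs, you can generate the list programmatically instead of typing each one out. The sketch below is a hypothetical illustration — the URL template and page count are assumptions, not angel.co’s actual endpoint:

```python
# Build a list of paginated AJAX URLs to feed into the "Where to Crawl" field.
# The URL template below is a made-up example, not angel.co's real endpoint.

def page_urls(template, pages):
    """Expand a URL template containing {page} into one URL per results page."""
    return [template.format(page=p) for p in range(1, pages + 1)]

urls = page_urls("https://example.com/developers?page={page}", 5)
for u in urls:
    print(u)
# prints five URLs, from ...?page=1 through ...?page=5
```

You can then paste the generated URLs (or the underlying pattern) straight into the crawler’s advanced options.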
After the crawl had finished, Jason was left with thousands of results, which was a great start but not particularly useful when looking for a targeted list of candidates. To filter all this data into something he could actually use, he first downloaded the data as a CSV and opened it in a Google spreadsheet.
You do this by clicking the Download button on the data set page and choosing CSV.
Then, using the filter tool on the location column, Jason narrowed his search down based on his specific requirements, giving him a much more manageable list of 24 candidates, from which he was able to find the perfect Python dev!
Data is just the beginning
Often, getting data is only the first step in solving your problem or answering your question. Jason’s recruitment issue is a classic example of how a few simple filtering steps can take you from an overwhelming data set to a succinct and helpful list.
At import•io we’re not just about getting you data, we want to help you do really cool stuff with it too! If you’re not sure what data you need or what to do with it, just get in touch with us on firstname.lastname@example.org and we’ll help you figure it out.
A little extra info
You can check out Jason’s full data set here.
Once it’s open, you can clone it by clicking the cogs to the right of the dataset’s name. Then you can edit it yourself or re-crawl for fresh data.