Use Case

How to build a lists crawler using Byteline Web Scraper


Here is the Byteline flow used for this use case. You can clone it by following the link.

Configure Scheduler with URLs list
The Scheduler node is configured with a Google Spreadsheet containing the list of URLs to crawl.
Web scraper with URL expression
The Web Scraper node scrapes data from Coinbase by using an expression for the URL. The expression points to a URL record in the Google Spreadsheet.
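Conceptually, the URL expression works like template substitution: on each run, the current record from the spreadsheet fills the expression to produce the URL the scraper visits. Here is a minimal sketch of that idea in Python; the `${url}` placeholder syntax and the record shape are illustrative assumptions, not Byteline's exact expression syntax.

```python
from string import Template

# Hypothetical URL expression: a placeholder filled from the
# current spreadsheet record on each run of the flow.
url_expression = Template("${url}")

# One record from the Google Spreadsheet, modeled as a dict.
record = {"url": "https://www.coinbase.com/price/bitcoin"}

# Substituting the record into the expression yields the URL
# that the Web Scraper node would fetch.
resolved = url_expression.substitute(record)
print(resolved)
```

Each row in the spreadsheet produces a different resolved URL, which is what lets a single Web Scraper node crawl an entire list.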

Step by Step Instructions


Byteline lets you crawl a list of URLs from any site by using an expression for the URL in its Web Scraper node. In this guide, we scrape a list of URLs from the Coinbase web page using the Web Scraper node. This makes it an effective lists crawler for scraping data from any site and pushing the results directly to a cloud service.

Here we will configure the following nodes to scrape a list of URLs:

Scheduler Trigger Node - First, we'll configure the Scheduler node to run the flow at a regular interval. A Google Spreadsheet with the list of URLs is configured on the scheduler.
Web Scraper Node - Next, we'll configure the Web Scraper node to scrape data from a webpage. Here, we will scrape a list of URLs from Coinbase (a centralized exchange for buying and selling cryptocurrencies).
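The two nodes above form a simple loop: the scheduler reads one URL per spreadsheet row and hands each one to the scraper. The sketch below models that flow in plain Python, with the spreadsheet represented as CSV text and `scrape` standing in for the Web Scraper node; both names and the column layout are illustrative assumptions, not Byteline's API.

```python
import csv
import io

def scrape(url: str) -> dict:
    # Stand-in for the Web Scraper node: a real flow would fetch
    # the page and extract fields; here we just record the URL.
    return {"url": url, "status": "scraped"}

def run_flow(sheet_csv: str) -> list:
    # Models the Scheduler node's role: read one URL per row from
    # the spreadsheet and invoke the scraper for each record.
    results = []
    for row in csv.DictReader(io.StringIO(sheet_csv)):
        results.append(scrape(row["url"]))
    return results

# The Google Spreadsheet of URLs, modeled as CSV with a "url" column.
sheet = "url\nhttps://www.coinbase.com/price\nhttps://www.coinbase.com/explore\n"
print(run_flow(sheet))
```

In Byteline itself this loop is declarative rather than hand-written: adding a row to the spreadsheet adds a URL to the crawl, with no change to the flow.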

Let’s get started.

Create flow
Configure scheduler with URLs list
Configure Web Scraper
Test run