How Consulting Firms Analyze Labor Market Trends Using Web Scraping

Labor market trend analysis helps businesses grow faster and gain recognition in their markets sooner.

Understanding labor market trends is critical to designing effective strategies for job creation. Global and regional employment estimates and projections show how well different regions of the world can absorb a steadily growing global workforce.

Nobody can be certain of what the future holds. However, patterns in the labor market hint at what is likely to happen. When making education or career decisions, it is important to understand these trends and exercise sound judgment based on this data.

How Consulting Firms Analyze Labor Market Trends

Roughly, the entire process can be divided into two stages: collecting the data and visualizing it.

The most common way for a consulting firm to collect data is to scrape it online, usually from job boards such as Indeed and Glassdoor.

To predict supply-and-demand relationships or analyze labor market trends further, a consultant needs data science skills and ML/AI techniques. Methods such as exploratory data analysis (EDA) are used to surface trends visually.

This article walks you through the entire labor market trend analysis process, suitable for a small business or a startup, from data gathering with web scraping to data processing.

Section 1: Web Scraping

The data we need for labor market trend analysis is as follows:

  • Title: the job profile name.
  • Title_Url: the URL of the job posting; the page it points to contains the full job description.
  • Company: the hiring company’s name.
  • Location: the location(s) where the job is open for the listed profile.
  • Description: full details on job type, work experience, working hours, and so on.
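
The five attributes above can be sketched as a simple record type. This is a minimal illustration: the field names mirror the list, and the sample values are purely made up.

```python
from dataclasses import dataclass

@dataclass
class JobPosting:
    title: str        # Title: job profile name
    title_url: str    # Title_Url: link to the full job description page
    company: str      # Company: hiring company's name
    location: str     # Location: where the job is open
    description: str  # Description: job type, experience, hours, etc.

# A purely illustrative sample record
sample = JobPosting(
    title="Data Scientist",
    title_url="https://example.com/jobs/data-scientist-123",
    company="ExampleCorp",
    location="Bangalore",
    description="Full-time role, 3+ years of experience with Python and ML.",
)
print(sample.title, "-", sample.location)
```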

Without a doubt, a consulting firm can hire a tech team to handle web scraping and gather data from multiple job boards, covering everything from building crawlers to maintaining them daily or monthly. However, if you are at a startup or a small company without that kind of budget, try using a web scraping tool instead.


Automatic Data Scraping Crawler (Octoparse):

You can use any automation tool for data scraping, but I found Octoparse to be the best solution because of its easy-to-use interface and because the tutorials and blog posts provided by its team make getting started very easy.

It is not only good for non-developers but also saves time for developers. Developers can write Python scripts to scrape any of the job boards listed above, but they have to adapt the code to each site because every page layout is different. With Octoparse, you can scrape both static and dynamic sites in seconds.
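
For contrast, here is a sketch of the hand-rolled alternative: parsing job cards out of a page's HTML with BeautifulSoup. The markup and CSS selectors below are hypothetical; real job boards each use different layouts, which is exactly why such scripts need per-site adjustments.

```python
from bs4 import BeautifulSoup

def parse_listings(html: str) -> list[dict]:
    """Extract job records from listing-page HTML (hypothetical markup)."""
    soup = BeautifulSoup(html, "html.parser")
    jobs = []
    for card in soup.select("div.job-card"):  # hypothetical class name
        jobs.append({
            "Title": card.select_one("h2.title").get_text(strip=True),
            "Company": card.select_one("span.company").get_text(strip=True),
            "Location": card.select_one("span.location").get_text(strip=True),
        })
    return jobs

# Inline sample HTML standing in for a fetched page
sample_html = """
<div class="job-card">
  <h2 class="title">Data Scientist</h2>
  <span class="company">ExampleCorp</span>
  <span class="location">Bangalore</span>
</div>
"""
print(parse_listings(sample_html))
```

In a real script you would fetch the page first (e.g. with `requests`) and rewrite the selectors for each site you target.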

For market trend analysis, you have to scrape the detailed description behind each job title in order to analyze it with ML techniques. Octoparse provides a simple way to scrape data from the page that opens on a click.

I prefer Octoparse over other tools on the market because of these characteristics:

  • Easy management: create scraping agents to extract details from job aggregator websites, as many as you want; their team can help create 1,000+ agents in one week.
  • Timeliness with API connection: set schedules to keep up with market dynamics, scraping and uploading to the database at the same time.
  • Complete and accurate data attributes collected directly from websites: the scraper monitors attributes including job title, company name, location, reviews, ratings, and the detailed job description for a specific area.
  • Intuitive interface: visualize the workflow with no coding prerequisites.

Scrape Jobs in a Particular Location to See Trends

You can scrape any of the job boards mentioned above; here I am scraping Glassdoor. Scraping these pages regularly gives you the most up-to-date list of job openings.

Here I am using Octoparse version 8. To get started, install the software and register yourself; after that, simply log in any time you want to use its free access.

You can use task templates to scrape job board sites directly, use auto-detection to gather information automatically, or choose specific elements manually in advanced mode.

Anyone can scrape a website in just three simple steps, as listed below:

Step 1: Set Up the Octoparse Crawler
  • Click the “+” button on the leftmost panel → choose the Advanced option → paste the URL into the field provided, as shown in the figure below → click Save.
[Image: Octoparse Advanced mode]

Step 2: Set Up the Workflow

  • Choose either Auto-detect web page data or Edit task workflow manually (if you want to select specific elements).
[Image: Octoparse auto-detect]
  • After auto-detecting the web page, the Octoparse bot scrapes the useful content itself, displays it in the data preview section, and offers switches in the Tips panel.
  • Check the next loop item in the Tips panel and make changes if required. If you want to scrape data from the page that opens when a URL is clicked, check the box and choose the attribute whose linked page you want to scrape.
[Image: Tips panel in Octoparse]
  • If you are happy with the settings → click Save settings in the Tips panel.
  • A new window then opens, showing the content of the page behind the clicked URL, along with the workflow.
  • Select the required text → a pop-up Tips panel appears → choose the first highlighted option to extract the text of the selected portion.
[Image: Octoparse data extraction]
  • Now everything is done → save and run the task.

Step 3: Data Extraction
  • When you think you have extracted enough data, stop the run.
[Image: Glassdoor data extraction]
  • Now export the data. In this run, the tool scraped 136 rows in just 11 minutes, including the content behind each click-through link.
[Image: Export data]
  • After exporting, it asks which format you want to save the data in on your system; I am exporting it as Excel.
[Image: Export data in Excel]
  • This is what the final dataset looks like:
[Image: Labor market dataset]
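
Once exported, the dataset is ready to load for analysis. In the real workflow you would read the Excel file Octoparse produced, e.g. `pd.read_excel("job_postings.xlsx")` (the filename is an assumption); a small in-memory sample stands in below so the cleaning steps are runnable.

```python
import pandas as pd

# Real workflow (filename assumed):
# df = pd.read_excel("job_postings.xlsx")
# Stand-in sample mirroring the exported columns:
df = pd.DataFrame({
    "Title": ["Data Scientist", "ML Engineer", None],
    "Company": ["Accenture", "PayPal", "ExampleCorp"],
    "Location": [" bangalore ", "Mumbai", "Pune"],
})

df = df.dropna(subset=["Title"])                         # drop rows with no title
df["Location"] = df["Location"].str.strip().str.title()  # normalize city names
print(df.shape)
```

A quick cleaning pass like this (dropping incomplete rows, normalizing location strings) saves headaches before the EDA step.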

Section 2: Data Visualization

EDA using Python to visualize the results:

  • Analyzing location-wise Data Scientist jobs.
[Image: Location-wise data]
  1. The plot above shows that almost 43% of the jobs are located in Bangalore.
  2. The top five cities, namely Bangalore, Mumbai, Gurgaon, Hyderabad, and Pune, account for almost 82% of the data science jobs in the country.
  3. So if you are in one of these cities, your chances of landing a data scientist job are probably higher than elsewhere.
  • Analyzing the Company attribute to rank companies by their share of job postings.
[Image: Companies with more job postings]
  1. From the graph above, one can see that Accenture, PayPal, BlueJeans, and Linesight top the list with the highest percentage of job postings.
  • Analyzing the most frequent roles in the data science field.
[Image: Data scientist roles]
  • This step matters because, after a few results, job portals often start showing jobs irrelevant to the original search. To be sure we are looking at the right roles, we checked the ten most frequently mentioned roles.

Full Code
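
The original full code is not reproduced here; the following is a minimal sketch in the spirit of the analysis above: location share, top companies, and most frequent roles, plus one bar chart. The tiny DataFrame is a stand-in for the scraped dataset.

```python
import matplotlib
matplotlib.use("Agg")          # render off-screen; save the chart to a file
import matplotlib.pyplot as plt
import pandas as pd

# Stand-in for the scraped job postings
df = pd.DataFrame({
    "Title": ["Data Scientist", "Data Scientist", "ML Engineer", "Data Analyst"],
    "Company": ["Accenture", "PayPal", "Accenture", "Linesight"],
    "Location": ["Bangalore", "Bangalore", "Mumbai", "Pune"],
})

# 1. Location-wise share of postings, as percentages
loc_pct = df["Location"].value_counts(normalize=True).mul(100).round(1)
print(loc_pct)

# 2. Companies with the most postings
print(df["Company"].value_counts().head(10))

# 3. Most frequent roles -- a sanity check that results stay relevant
print(df["Title"].value_counts().head(10))

# Bar chart of location share, the kind of plot shown above
ax = loc_pct.plot(kind="bar", title="Job postings by location")
ax.set_ylabel("% of postings")
plt.tight_layout()
plt.savefig("jobs_by_location.png")
```

Swap the stand-in DataFrame for your exported dataset and the same three `value_counts` calls reproduce each of the views discussed above.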


After this analysis, we can tell which locations have the most job postings, which companies top the list of data science openings, and, through exploratory data analysis, which data science roles are in demand.

Cecilia W

I am a content writer and digital marketer in data science who believes in the power data can bring to people’s businesses. I work on writing that conveys real value. If you have any feedback or ideas about web scraping and data analytics, talk to me at