
Crowdsourced data: full disclosure or half-truths?

December 17, 2019

Crowdsourced vs. traditional market data

These days, the internet is our constant companion and a reliable provider of abundant information right at our fingertips. But when it comes to compensation data, is the information reliable? Well, that depends.

Crowdsourced compensation sites are analogous to Zillow in the housing market. Each of these sites aims to unveil information to the masses. Zillow began as little more than a crude guess at home values, but over time a home’s “Zestimate” has become quite relevant in determining the asking price and, ultimately, the selling price.

In their infancy, crowdsourced data sites were simply open public platforms where employees could anonymously share information that was previously reserved for whispers at the water cooler. But, just like water cooler gossip, the validity of the information was suspect. Modern crowdsourced sites such as PayScale, LinkedIn, Glassdoor, Fishbowl, and Salary.com are far more formalized and are now widely accepted as valid resources thanks to increased user participation, data vetting, and report refinements.

With the growth of digital technology, businesses seem to be in a state of constant rebirth, and companies are being forced to adopt new technologies to stay relevant. HR is no exception as companies strive to run leaner and more efficiently than before. Crowdsourced data is quickly accessible and frequently cheaper, more flexible, and more responsive than traditional compensation surveys. You may have a limited budget and only be able to purchase traditional market data every two or three years. Crowdsourced surveys can help fill the gaps, especially for rapidly changing positions.

Traditional market data

Pros
  • Rigorous vetting and data refinement
  • Data sourced from knowledgeable professionals
  • Company/manager viewpoint
  • Correlation with job family ranges
  • Large employee participant data
  • Consistency year after year

Cons
  • Higher priced
  • Typically updated once a year
  • Managerial bias
  • Survey structure and questions slow to change
  • Limited viewpoint dependent on number of participating companies
  • Slow to respond to market changes

Crowdsourced data

Pros
  • Lower-cost, quick, and timely data
  • Data sourced directly from, and available to, employees
  • Employee viewpoint
  • New or trendy jobs typically reported
  • Broad scope of employees from various companies
  • More rural areas reported
  • Quick to respond to market changes

Cons
  • Fluctuations in reported compensation
  • Data can be under- or over-reported
  • Employee bias; tends to skew toward younger respondents
  • Jobs may stand alone without job family positions for context
  • Greater possibility of small employee sample size
  • Bigger cities still dominate
  • Less consistency year after year

Balance is the key

Since crowdsourcing relies entirely on voluntary information, a person must virtually, though anonymously, “raise their hand” to share. From the start, this reduces the participation numbers compared with what a compensation professional reporting for an entire company would capture.

Often, the employees who contribute to these online sites are the ones seeking data for validation, a promotion, or a job search, which leads to a self-selection bias. There may also be only a small number of respondents for a particular role.

Traditional surveys are typically selected by weighing the organization’s needs, industry insights, company size, and compensation philosophy. They provide stability and consistency amid all the chaos flying around the internet.

With Zillow, when you look at a home’s Zestimate, you assume certain pieces of information to be true, such as the location, the number of bedrooms, and whether it has a pool. These key details can sway a home’s price significantly. The same applies to compensation data.

There are myriad ways to describe the same job, and most compensation professionals will eagerly profess that job matching is both an art and a science. So when employees describe their own jobs on one of these sites, they may be prone to over- or under-estimating particular skills and, in doing so, shift the suitable match.

In some scenarios, the job itself may be evolving thanks to the adoption of technology; the skills required two years ago may not match the skills necessary today. A traditional survey would recognize these changes, update the job descriptions accordingly, and have all participants adjust at the same time. With self-reported data, the changes are often smaller and more frequent because of the continuous update cycle. Therefore, in an online environment, you may not be speaking the same language as someone who held the same position just a year earlier.

So which one should you use? Both!

The best practice is to use at least three distinct sources when pricing a job. Crowdsourced data can be a wonderful complement to traditional market data reports. By comparing the results against each other, you can validate the data points that agree and flag the ones that differ for closer attention.

At the end of the day, Zillow can give you a close approximation of a home’s price, but the agent provides the insights and nuances that can’t be found in the numbers alone. Similarly, you provide the “rest of the story” when it comes to pay. You know the company goals, pay philosophies, and personal stories beyond the numbers that help make compensation amounts meaningful.

Come see what Mercer has to offer with our selection of surveys, including our most popular standard survey, the Mercer Benchmark Database, or stay ahead of the trends with our Hot Topics & Reports.