“Terrifying rumors initially propelled by Facebook’s algorithms have sparked fears that men driving white vans are kidnapping women all across the United States for sex trafficking and to sell their body parts,” CNN reported on December 4. Got your attention? Although no evidence exists to support the claims, a series of these posts went viral and prompted the mayor of Baltimore to issue dire warnings based on the unsubstantiated threats. What does this have to do with business? It illustrates our sometimes blind reliance on “data” in the absence of analysis and validation. Every industry and market is consumed by an urgent need to amass more and more data. This is particularly true in our industry. The persistent issue, however, has little to do with accessing or collecting information—it’s about how we interpret and respond to it.
Lies, Damn Lies, and Statistics
Lauded American author Mark Twain famously quipped in his autobiography, “There are three kinds of lies: lies, damned lies and statistics.” The intent was to demonstrate the sheer power of data, particularly the use of statistics to bolster otherwise weak or dubious arguments. The quote was not Twain’s. He attributed it to Benjamin Disraeli, though researchers traced the citation back to an 1895 article written by Leonard H. Courtney. Regardless, Twain popularized the concept and brought the potentially misleading aspect of statistics into the public discussion.
About 60 years later, in 1954, Darrell Huff would expand on the idea with the publication of his book “How to Lie with Statistics,” which went on to become one of the most famous business primers in history. His goal was to explain, in simple terms, the abstract concepts of statistical methods, their increasing presence in commerce and society, and how they’re interpreted. The work was ahead of its time when released and remains relevant in our current culture—one obsessed with Big Data.
Let’s go back to the situation with the white vans. As CNN’s Donie O’Sullivan explained:
The latest online-induced panic shows how viral Facebook posts can stoke paranoia and make people believe that spotting something as common as a white van, can be deemed suspicious and connected to a nationwide cabal.
“Don’t park near a white van,” Baltimore Mayor Bernard “Jack” Young said in a TV interview on Monday. “Make sure you keep your cellphone in case somebody tries to abduct you.”
The mayor said he had not been told of the apparent threat by Baltimore Police but said it was “all over Facebook.”
The key phrase here is “all over Facebook.” Somehow, social networks have grown to capture more attention and belief than actual news media. The presumption is that if enough people are posting, sharing, and reposting stories, they must be true. Data has much to do with articles going viral, because algorithms behind the scenes are promoting them.
O’Sullivan further described how the warnings rose to widespread notoriety, which seemed to begin with a Baltimore resident named Saundra Murray who shared pictures of a “suspicious” van at a gas station on Instagram:
Murray's post racked up more than 3,200 likes on Instagram. A few days later, on November 17, another woman in Baltimore posted screenshots of Murray's Instagram post to Facebook. That Facebook post had been shared more than 2,000 times by this Tuesday.
A separate Facebook post from another woman in Baltimore on November 18 that was shared more than 5,000 times showed a stock image of a white van and warned: "When you come out into the mall parking lot, and you see a van like this parked next to your car, DO NOT GO TO YOUR CAR."
No police reports of actual incidents have been recorded, yet van drivers said they had been harassed.
Big Data, Broad Interpretations
Huff’s book brims with similar scenarios. In some cases, he discovered, data were intentionally misconstrued or manipulated: “The secret language of statistics, so appealing in a fact-minded culture, is employed to sensationalize, inflate, confuse, and oversimplify.” The rest of the time, the conclusions contained bias or overly broad interpretations.
In one of the book’s first examples, Huff cited an infamously flawed statistic from Time Magazine about the Yale graduating class of 1924. “THE AVERAGE Yaleman, Class of ‘24,” the article began, “makes $25,111 a year.” It was a weirdly specific number. It was also inaccurate. Adjusted for inflation, that salary today would be $343,316.72, an astonishing amount of money. Huff went on to enumerate the problems with this figure.
- The probability that the “average” income of any group could be reckoned down to the exact dollar was vanishingly slim.
- The incomes reported were not entirely based on salary; “people in that bracket are likely to have well-scattered investments.”
- Most importantly, Huff noted, “This lovely average is undoubtedly calculated from the amounts the Yale men said they earned.” Not, however, what they may actually have earned.
- Not every graduate was willing to report his income, decreasing the size and composition of the sampling group.
- Averages are broad and misleading things. Bill Gates, for example, earned $11.5 billion this year. The typical software developer in his state took home about $75,000. When combined, one could trot out a statistic that claims their average salary is somewhere near $5.7 billion. And we know that isn’t true.
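The distortion Huff describes is easy to demonstrate. Here is a minimal Python sketch, using illustrative figures that echo the hypothetical numbers above, showing how a single outlier drags the mean far from the typical value while the median stays put:

```python
# Hypothetical figures only: nine typical developer salaries plus one
# extreme outlier, echoing the Gates example above.
from statistics import mean, median

salaries = [75_000] * 9 + [11_500_000_000]

avg = mean(salaries)     # skewed enormously by the single outlier
mid = median(salaries)   # still reflects the typical earner

print(f"mean:   ${avg:,.0f}")    # over a billion dollars
print(f"median: ${mid:,.0f}")    # $75,000
```

The mean lands above a billion dollars even though nine of the ten people earn $75,000, which is why a median (or a distribution) is usually the more honest summary when outliers are present.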
Many companies consider themselves data driven, and they rely heavily on information gathered from a variety of sources: their clients, workers, suppliers, and more. Yet, too often we find that their interpretations of the data are biased, oversimplified, overly broad, blindly embraced, or inductively reasoned to prove a hypothesis rather than deductively analyzed to uncover a reality. In short, people still have a proclivity to accept information that justifies and supports preexisting beliefs.
Developing Meaningful and Reliable Analytics
As hiring managers and HR leaders struggle to grasp the complexities of people analytics and big data, they frequently revert to relying on “gut instincts.” Yet, the problem is larger than that. It’s not merely that our biases get in the way, but that past performance can’t predict results with a comfortable degree of certainty.
Before embarking down the path to utilizing big data, we need to prepare for a mindset shift. This is the first step contingent workforce professionals should consider when approaching clients. People analytics are not reactive: used properly, they provide illumination rather than support. That means we should approach data with curiosity and impartiality, not as a vehicle to prove something we already believe, or that others believe. In the end, the results of a careful interpretation might not be what we had hoped, but they will point us in the best direction.
Identify the Goal
Know the objectives and what could be different or changed because of the results. Ask your team a few simple questions.
- What are we trying to achieve?
- What information do we need to make a decision or course-correct our current direction?
- What is the real business problem we’re trying to tackle?
By identifying the answers to these questions, we can work backward to uncover the data our clients need.
Build Thoughtful Samples
- Drive thinking that extends beyond a single department or group. Consider how the datasets affect the organization and its talent as a whole.
- Protect against confirmation biases that can arise from like-minded colleagues or people who think the way we do. Approach the analysis the way the researchers on “MythBusters” would: attempt to disprove accepted norms. Be receptive to risks, failures, and unexpected outcomes. All of these situations are critical learning experiences that will improve the process.
- Use good data: reliable, valid, clean, and complete. The data should be objective, not based on a specific business group, job category, company division, or hiring manager.
- Design comparisons across groups and over time.
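The sampling guidance above can be sketched in a few lines of Python. All metric names, groups, and figures below are hypothetical, chosen only to illustrate the pattern of validating records first and then comparing across groups and over time rather than reading one blended number:

```python
# Toy example: validate completeness, then compare a metric
# (hypothetical "time to fill," in days) across groups AND quarters.
from statistics import mean
from collections import defaultdict

records = [
    {"group": "Engineering", "quarter": "Q1", "time_to_fill": 32},
    {"group": "Engineering", "quarter": "Q2", "time_to_fill": 28},
    {"group": "Finance",     "quarter": "Q1", "time_to_fill": 45},
    {"group": "Finance",     "quarter": "Q2", "time_to_fill": 41},
]

# 1. Use good data: only complete records count toward the analysis.
required = {"group", "quarter", "time_to_fill"}
clean = [r for r in records
         if required <= r.keys() and r["time_to_fill"] is not None]

# 2. Compare across groups and over time, not one overall average.
by_cell = defaultdict(list)
for r in clean:
    by_cell[(r["group"], r["quarter"])].append(r["time_to_fill"])

for (group, quarter), values in sorted(by_cell.items()):
    print(f"{group} {quarter}: avg time-to-fill {mean(values):.1f} days")
```

A single company-wide average would hide the fact that the two groups perform very differently, and that both are trending in the same direction quarter over quarter.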
Enlist Stakeholders Early
Even the most thoughtful and expertly performed analysis can fail if stakeholders are not informed and included in the process. Decision makers are more likely to participate, review the research, understand its value, and implement the recommended changes when they’re engaged from the outset. Otherwise, the entire effort can be jeopardized. Without prior knowledge and inclusion, stakeholders may feel they’re being told how to do their jobs, especially if they believed things were going well right up until they were handed a report outlining everything they need to change.
Despite the best intentions, recipients in this scenario will feel blindsided. And when that happens, crucial plans languish unimplemented on a shelf, collecting dust: wasted opportunities, squandered time, and sunk costs.
Assemble the Right Team
Designing the right team is imperative and should take place before any data collection or analysis occurs. Although MSPs and contingent workforce program managers have mountains of useful data in their systems, the effort must be more expansive and collaborative to succeed. The best teams include a broad swath of representatives. In an outsourced workforce program, that would incorporate professionals from the client organization, the MSP, the VMS and staffing partner firms. These subject matter experts will be required to address the Whys, the Whats and the Hows of the project.
- Why: hiring managers, operational leaders and executives to provide the business expertise.
- What: staffing partners, procurement leaders and HR officers to provide expertise on the talent.
- How: data analytics specialists from the MSP, staffing firm, client organization or technology provider (e.g., VMS) who understand the information, how to gather it, and how to interpret it into meaningful results that decision makers can act upon.
Verify, Then Trust
The benefits delivered by people analytics are unparalleled. And while the process might seem foreign and overwhelming at first, Big Data can open our eyes to a world of exceptional talent and innovators we didn’t see before. We just need to make sure we’re looking in the right places, keeping our eyes and minds open, turning over the proper stones, and validating conclusions before we leap to them.