The combination of social media and predictive analytics could transform the way we assess leadership potential. But at what cost?
How many times have you bought a gift for a loved one online? Perhaps you’re busy at work and don’t have time to go shopping. Or, if you’re anything like me, perhaps you’ve forgotten how soon a gift is due and you’re looking for a quick solution. Either way, you can select, buy, and pay for an item in less time than it takes to walk to the shops. And with every purchase, the website becomes a little more intelligent. How many times have you been surprised by the accuracy of Amazon or Alibaba’s recommendations to you? I sometimes wonder whether Amazon knows my family better than I do!
Now imagine that those algorithms could be used in executive search and leadership consulting, both for search firms and for candidates. Imagine that an algorithm in a database or on a social networking site could provide you with valuable information about cultural fit, required skills, and leadership traits. Or for candidates, imagine that a social networking site could provide you with a probability score for a promotion, based on various career and training opportunities available to you.
Many executive search and leadership consulting firms are already gathering tremendous swathes of data that will yield this depth of intelligence, not to mention the vast amounts of information being amassed by consumer social networking sites like LinkedIn and Facebook. “There is a whole field of science that is interested in how you assemble the most effective teams,” says Jennifer Golbeck, director of the University of Maryland’s Human-Computer Interaction Lab. “People want to know how well an individual is going to perform in their job. There are algorithms to infer personality traits around effectiveness that can get results without a search firm even interviewing someone.”
The science of executive search is evolving rapidly, with research and development in areas such as psychology, sociology, and anthropology. Golbeck estimates that in the next five years we will see companies launching off-the-shelf products that leverage Big Social Data algorithms to predict individual performance. But she cautions that there will be “growing pains” during the implementation of these tools.
Growing pains
Back in 2012 the American retailer Target hit the headlines when it predicted a teenage girl’s pregnancy and started marketing to her before she had told her father. The retailer can predict whether an individual is pregnant with 87% accuracy if they purchase three seemingly random items: a large handbag, vitamins, and a brightly colored rug. It is a startling insight into the depth of consumer intelligence these algorithms provide, and into the slightly scary, Orwellian marketing tools now available.
Through analyzing a range of social media sites, executive search and leadership consulting firms will benefit from the same depth of information about candidates. But is there any tension with discrimination, privacy, civil rights, and equal opportunity laws? Social media is increasingly being used to screen candidates, and is also being used during the background-checking stage. Employment law hasn’t yet caught up with this, as Jim Hostetler, AESC’s general counsel, explains. “It is a rapidly emerging issue from a legal point of view. There is no question that when you access information on the internet all of the usual laws apply.” But what happens if you check a shortlisted candidate’s social media profile and discover they have an undisclosed drinking problem, that they are having an extra-marital affair, or that they routinely get into online arguments? Or what if the candidate reveals an ugly opinion – racism, anti-Semitism? How do you walk the tightrope of presenting that information to the client without breaking discrimination laws?
The right to privacy?
Some would argue that those who use social media sites forgo some degree of privacy – that by sharing their holiday photos with their friends on Facebook instead of in a physical photo album, they are opening themselves up to intrusion. It’s an evolution of the argument that celebrities forgo the right to privacy once they become famous.
Executives have two options: they can continue posting online and ignore any privacy concerns, or they can purge everything they have on existing sites and avoid social media use in the future.
In the first scenario executives open themselves up to conclusions being drawn about them that they have no control over. Golbeck explains that social algorithms can already predict a range of things about an individual – political opinions, religious beliefs, who their spouse is, whether they drink, smoke, or use recreational drugs – without any of this being explicitly stated anywhere online. These predictions don’t currently have a profound impact on our lives because they are mainly used to tweak the adverts we see online, but if these technological leaps were adopted in executive search, search firms would carry a tremendous responsibility to use the information maturely, accurately, and with care.
In the second scenario, where executives purge everything and ignore social media, an individual becomes conspicuous by their absence. Many people will be reluctant to follow this option. Personal brand is increasingly a differentiator for candidates, as well as a way to get the attention of executive search firms, and social media is the biggest branding amplifier available today. But there is an even more uncomfortable reality with this option. Last year Janet Vertesi, a Princeton sociology professor, decided to try to hide her pregnancy from the world. She had seen the stories about Target predicting pregnancy and, along with her husband, undertook an experiment. They didn’t post about the pregnancy on social media (in fact, when her uncle sent her a Facebook message to wish her well, she deleted the message and unfriended him – more on messages later), they paid for everything in cash, and they used Tor, an anonymizing web browser often associated with buying drugs and other illegal activity. The breaking point came when they wanted to purchase a stroller on Amazon. Her husband bought $500 worth of gift cards for the online retailer, paid for in cash. The transaction triggered a suspicious-activity alert to the authorities. “Many people say that the solution to this discomfiting level of personal-data collection is simple: if you don’t like it, just opt out,” Vertesi wrote recently in Time magazine. “But as my experience shows, it’s not as simple as that. And it may leave you feeling like a criminal.”
Human interpretation
Although employment law has been slow to react to social media as a recruitment tool, and even more sluggish to anticipate what predictive analytics could mean, these questions are already being tested in other areas of the law. In Europe, the debate around “the right to be forgotten” – the idea that an individual can request that something unfavorable about them be removed from search engine results – has forced the likes of Google to change the way their search engines function. In this instance the debate has pitted freedom of expression against the right to privacy. Google and Facebook are also defending themselves against class action lawsuits in the United States, accused of using the information in private Gmail emails and Facebook messages to inform targeted marketing. Both cases are relevant because if privacy rules tighten as the use of predictive algorithms increases, the data feeding those algorithms becomes less complete – and the predictions less accurate. The phrase ‘you don’t know what you don’t know’ comes to mind.
The final concern around predictive data is that it can override human judgment. Golbeck describes an incident from the 1990s in the healthcare sector, when new technology was released that analyzed kidneys and diagnosed disease with reasonably good accuracy. As its use increased, doctors favored it over their own judgment in marginal cases, leading to misdiagnoses. “Every algorithm has error in it,” Golbeck explains. “As computer scientists we are fine with that, but if you start using these algorithms to predict people’s potential and their leadership traits, you need to keep in mind that you’re getting a similar quality of results as your Amazon or Netflix recommendations. The decision on whether or not to hire someone should never come from just one algorithm.”
Just as the profession has evolved to embrace sourcing and candidate identification through social media, it will most likely adapt again if predictive social algorithms become mainstream over the next five years, as Golbeck predicts. There is no doubt that this technology could have a positive impact on the quality of executive placements. It will be incumbent upon the search firms themselves to walk the tightrope between technological evolution and legal risk. “It is a balancing act,” Hostetler says. “But the really good search firms will understand how to use this technology, avoid legal vulnerability, and navigate risk.” As ever, the firms that behave with integrity and excellence, while avoiding conflicts of interest, will reap the rewards of this exciting (and scary) new technology.