
Reflections on HR West 2016

by Joseph Murphy

I had the honor of speaking at HR West in Oakland, CA last week. The conference was well attended by about 1,000 HR professionals, speakers, and service providers from the greater Bay Area and beyond. The Northern California Human Resources Association pulled together an exceptional lineup of concurrent sessions and keynote speakers. I met many fascinating individuals on a quest to learn and grow and took away some key concepts that apply to talent acquisition and improving quality of hire.

Evidence-based practices

Stanford professor and leadership luminary Jeffrey Pfeffer shared a wealth of data on leaders, leadership effectiveness, and the gap between inspiration and execution. Inspiration is ethereal and fleeting. Execution is objective and enduring.

Pfeffer argued that business has received very little return on billions of dollars in investment in leadership development. This apparent lack of value is due largely to an absence of objective metrics and the use of satisfaction ratings instead of a practical focus on operational and bottom-line outcomes. He called for documenting results with evidence-based practices.

Evidence-based practices in talent acquisition can relate quality of hire to operational outcomes. Objective candidate evaluation can speed up time to fill, meaning fewer days of vacancy and more workflow throughput. Objective candidate evaluation can lead to better retention rates, reducing staffing process waste and re-work, dropping real dollars to the bottom line.
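
To make that concrete, here is a back-of-the-envelope sketch in Python. Every figure in it is a hypothetical assumption, not a number from the conference, but it shows how days of vacancy and turnover convert to dollars:

    # Hypothetical back-of-the-envelope model: all inputs below are
    # illustrative assumptions, not figures cited in this post.
    daily_vacancy_cost = 500      # est. lost productivity per open seat per day ($)
    hires_per_year = 200
    days_saved_per_hire = 7       # faster time to fill via objective evaluation

    replacement_cost = 4_000      # est. cost to recruit, onboard, and train one hire ($)
    baseline_turnover = 0.30      # first-year turnover before the change
    improved_turnover = 0.22      # first-year turnover after the change

    vacancy_savings = daily_vacancy_cost * days_saved_per_hire * hires_per_year
    retention_savings = replacement_cost * hires_per_year * (baseline_turnover - improved_turnover)

    print(f"Vacancy-day savings:  ${vacancy_savings:,.0f}")
    print(f"Re-hire cost savings: ${retention_savings:,.0f}")
    print(f"Total annual impact:  ${vacancy_savings + retention_savings:,.0f}")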

Bank of America won the prestigious Human Resource Management Impact Award in 2015 based on evidence that its Virtual Job Tryout candidate evaluation improved employee retention and cut recruiting, onboarding, and training costs by a whopping $6.8 million.

Human-machine collaboration

Results-oriented, innovative business leader Eva Sage-Gavin (former CHRO of Gap, Inc.) shared her take on several trends and on research into the core traits of highly effective CHROs. One of the striking trends she singled out was the rise of the enabled worker. Enabled workers capitalize on task automation and harness the potent combination of empirical evidence, algorithms, and human judgment to solve problems.

The day after Sage-Gavin’s presentation, AlphaGo, a program developed by the artificial intelligence firm Google DeepMind, won the first game of its match against world Go champion Lee Sedol. Chess crossed this threshold long ago: since IBM’s Deep Blue defeated Garry Kasparov, chess engines have become unbeatable by a human alone, yet in freestyle chess a human-machine team routinely beats an engine playing on its own. Evidence-enabled humans make better decisions. It won’t be long before a human-machine collaboration dominates the Go playing field.

The heavy reliance on random and unstructured data in the talent acquisition process impedes the meaningful human-machine collaborations that could lead to superior hiring decisions. The mainstay in talent screening is words: resumes, job applications, social media profiles, and digital footprint scrapings. These data are passively sourced, often created with editorial support focused on search-engine optimization, and inconsistent in accuracy and relevance. Two people who have had the same experiences are—owing to the nature of humans—likely to use different words to characterize these experiences, while two people who have had different experiences—owing to the nature of chance—may use very similar words to describe them. Discernment of job relevance is a challenge; systematization, impossible.
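
A quick illustration of the problem. In the hypothetical sketch below, two candidates describe the identical job in free text and the words barely overlap, while a structured record of the same experience compares cleanly:

    # Two hypothetical candidates describe the same retail job in free text.
    a = "managed daily store operations and coached a team of eight associates"
    b = "ran the shop floor day to day and mentored eight direct reports"

    tokens_a, tokens_b = set(a.split()), set(b.split())
    jaccard = len(tokens_a & tokens_b) / len(tokens_a | tokens_b)
    print(f"Word overlap (Jaccard): {jaccard:.2f}")  # low, despite identical experience

    # A structured record of the same experience is directly comparable:
    record_a = {"role": "retail_supervisor", "direct_reports": 8, "years": 3}
    record_b = {"role": "retail_supervisor", "direct_reports": 8, "years": 3}
    print("Structured records match:", record_a == record_b)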

To capitalize on recruiter-machine collaboration, talent acquisition practices must move beyond seeking insights from analysis of passively collected random and unstructured data. Predictive modeling for improved hiring outcomes requires intentionally collected, structured candidate data. And the evidence from over 100 years of hiring decisions proves that recruiters supported with data from multimethod assessment produce better results. Period.
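
For readers who want to picture what that looks like, here is a minimal sketch of predictive modeling on structured assessment data. The feature names, the synthetic data, and the scikit-learn model are my illustrative assumptions, not a description of any vendor’s actual scoring method:

    # Minimal sketch: predict a hiring outcome from structured assessment scores.
    import numpy as np
    from sklearn.linear_model import LogisticRegression
    from sklearn.model_selection import cross_val_score

    rng = np.random.default_rng(0)

    # Each row: stand-in scores from a multimethod assessment
    # [situational_judgment, work_sample, biodata, cognitive]
    X = rng.normal(size=(500, 4))
    # Stand-in outcome: 1 = met first-year performance target
    y = (X @ np.array([0.8, 0.6, 0.3, 0.5]) + rng.normal(size=500) > 0).astype(int)

    model = LogisticRegression()
    scores = cross_val_score(model, X, y, cv=5, scoring="roc_auc")
    print(f"Cross-validated AUC: {scores.mean():.2f}")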

Pfeffer also warned against being seduced by unique and unusual outliers. In every situation, a handful of outstanding exceptions occur. Casual observers are commonly captivated by the extreme successes of such giants as Apple and Google, but such marveling often eclipses the remarkable evolution of innumerable other highly successful firms just as deserving of attention. Calling out the misdirection of such overplay, Pfeffer invited us to focus on the preponderance of evidence, lest we be swayed by eloquence and confidence that can offer only one data point as argument.

If passing a certification exam is essential for performing a job, the candidate evaluation should be designed to predict exam success rates. If faster time to proficiency equates to greater revenue, faster service, or less reliance on supervisors, then the candidate evaluation should be designed to predict learning effectiveness and efficiency.
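
Validating against the criterion that matters might look like the following sketch. The data are synthetic and the pass mark is hypothetical; a real study would use actual assessment scores and exam records:

    # Sketch: criterion-related validation against certification-exam outcomes.
    import numpy as np
    from scipy.stats import pointbiserialr

    rng = np.random.default_rng(1)

    assessment = rng.normal(50, 10, size=300)  # candidate evaluation scores
    # Synthetic exam result: score plus noise against a hypothetical pass mark
    passed = (assessment + rng.normal(0, 12, size=300) > 55).astype(int)

    r, p = pointbiserialr(passed, assessment)
    print(f"Point-biserial validity coefficient: r = {r:.2f} (p = {p:.3g})")

    # Compare pass rates for candidates above vs. below the median score
    hi = passed[assessment >= np.median(assessment)].mean()
    lo = passed[assessment < np.median(assessment)].mean()
    print(f"Pass rate, top half:    {hi:.0%}")
    print(f"Pass rate, bottom half: {lo:.0%}")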

Using human-machine collaboration, our clients collect, analyze, and score candidate data that predict new-hire performance outcomes. One client improved its license exam pass rate from 82 percent to 98 percent in one year, saving over $1 million in recruiting and training costs. Another identified candidates who completed their self-study product and service training 47 percent faster. That better performance correlated with greater customer engagement, higher share of wallet, and a broader range of products and services sold.

Human-machine collaboration will require increased analytic capacity. Identifying where to begin and what to do next is an opportunity to learn new skills and add new resources.

Workflow digitization

Monika Fahlbusch, chief employee experience officer at BMC Software, came out swinging with demands for accountability, responsibility, and upping our game for digitizing workflows. She asserted that millennials will reject non-digital methods and that, for high engagement, work routines must have elements that mirror the social-digital interface. Monika urged us to consider how to blur the lines between what work and non-work look and feel like. And to get us moving in that direction, she left us with a call to action: “Go back and kill a process that sucks productivity!”

The job application experience, largely controlled by enterprise database systems, is one area where upgrading the quality of the digital experience can kill a process that, honestly, just plain sucks for candidates. Research from the Talent Board’s Candidate Experience Awards survey indicates applicants want an opportunity to perform. Mindless resume-based activities—upload, copy, paste, retype, complete required fields, etc.—are largely viewed as sucky activities. Stop making your candidates do them.

Nobody likes a test, but everybody loves a test-drive. Virtual Job Tryout technology lets your candidates experience the job firsthand by taking it for a spin. Candidates navigate a range of day-to-day activities and show their stuff via simulated tasks. And candidate reactions indicate the Virtual Job Tryout is the anti-suck. Over 90 percent of candidates say they are willing to refer others to apply based on the quality of the experience. Candidates also say completing the Virtual Job Tryout puts them in a much better position to decide if the job is right for them.

Fahlbusch compels us to meet the emerging workforce on their digital playing field. Simulation-based assessments allow candidates to play the job. Now that is engagement.

Analytic capacity development

In several sessions, speakers asked for a show of hands from those who felt comfortable claiming statistical or analytical competence. I issued the same informal survey in my own session. The scarcity of hands in the air was evidence of the growing need to own the skills and infrastructure at the heart of evidence-based decision making. In her session, Improving Selection Science, Whitney Martin of ProActive Consulting shared data from a Bersin by Deloitte study finding that only 14 percent of companies have achieved some level of advanced talent analytics. In my session, I shared an even deeper insight: according to the same Bersin study, only four percent of companies are engaged in predictive modeling with human resource data. All of Shaker’s clients are in that four percent.

Virtual Job Tryout technology is a practical example of bringing together task automation, digitized workflow, and a big-data approach that enables evidence-based human-machine collaboration. Adding analytic capacity can convert quality of hire from myth to measure of meaning.

Each of the HR West speakers, in their own way, urged participants to up their recruiting game—add new skills, improve core processes, and above all, document contribution. Where will you begin?

Want to know more about big data and talent analytics maturity models? Read our white paper.