Data Miners Guide to Quality of Hire - Crowdsourcing HR.com Content

by Joseph Murphy

Debbie McGrath at HR.com has been intrigued by our work helping organizations improve their quality of hire. Debbie invited me to submit a presentation for the Quality of Hire track at the Talent Acquisition Conference. My submission is the Data Miners Guide to Quality of Hire. I am asking for your support to make this content part of the conference. Please read on.

I recently presented a version of this session to the Cleveland Staffing Management Association. Here is what Frank Zupan, SMA Chair, had to say about the session:

"Joe Murphy absolutely nailed it with his recent presentation 'Miners Guide to Quality of Hire in the Era of Big Data'. He drilled into all layers of HR and recruiting data, leading our audience into the 'Big Dig' of current and future requirements of critical organizational information. This track was extremely well received by the Cleveland SHRM SMA audience, and we've had great ongoing discussions prompted by Joe's mining expedition."

Getting the presentation on the slate is up to you; I need your vote. The conference content is being determined by crowdsourcing, which is an interesting approach. Each person voting on content has to make a decision, and it is said that decision quality is directly related to the data that supports that decision. So I have provided an overview and some details of my presentation so you can make an informed decision. The link to the voting site is at the end of this post.

Reference Point

The word quality implies a standard, a reference point to move toward or exceed. Quality of hire is therefore concerned with defining that reference point and measuring how an outcome relates to it. In a staffing process, the reference point is the performance requirements, or quality standards, of the job; the outcome is the performance of the individual. Quality of hire is determined by the relationship of performance to standards. This affects the time horizon over which data is collected: when can you measure performance? The critical element to consider is time to proficiency.
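To make the idea concrete, here is a minimal sketch (not from the presentation) of scoring a hire's performance against a job standard only after an assumed time-to-proficiency window has passed. The names, rating scale, and thresholds are all hypothetical.

```python
from dataclasses import dataclass
from datetime import date, timedelta
from typing import Optional

# Hypothetical illustration: quality of hire as performance relative to the
# job's standard, scored only once the hire has had time to reach proficiency.

JOB_STANDARD = 4.0                          # assumed performance standard (0-5 scale)
TIME_TO_PROFICIENCY = timedelta(days=180)   # assumed ramp-up period

@dataclass
class Hire:
    name: str
    start_date: date
    performance_rating: float               # observed performance on the same 0-5 scale

def quality_of_hire(hire: Hire, as_of: date) -> Optional[float]:
    """Return performance relative to standard, or None if it is too early to measure."""
    if as_of - hire.start_date < TIME_TO_PROFICIENCY:
        return None                          # not yet proficient; do not score
    return hire.performance_rating / JOB_STANDARD

hires = [Hire("A", date(2013, 1, 15), 4.4), Hire("B", date(2013, 9, 1), 3.2)]
for h in hires:
    print(h.name, quality_of_hire(h, date(2013, 11, 1)))   # A scores 1.1; B is too recent
```

A score above 1.0 simply means performance exceeded the standard; the point of the sketch is that the score is undefined until the time horizon allows performance to be measured.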

Measurement Categories

Kirkpatrick's Evaluation Model is widely known for distinguishing the methods and relative value of various forms of measuring learning outcomes. Recently, Josh Bersin took that same framework and applied it to talent analytics maturity. My session explores recruiting metrics for all four levels of data gathering and talent analytics suggested by these two models.

At the core are the Three Os of Quality of Hire:

  1. Opinions - data commodities
  2. Observed Behaviors - data rare minerals
  3. Objective Metrics - data gems

In addition, the session places recruiting data into a leading-lagging indicator framework, which shows that quality of hire is the lagging indicator of a measurement and analysis process. How many sources of data feed your quality of hire metrics? There may be data in your backyard ready for mining.
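One practical way to act on that question, sketched below with hypothetical source names, is to inventory each recruiting data source by which of the three Os it falls under and whether it is a leading or lagging indicator, then count what actually feeds your quality of hire metric.

```python
from collections import Counter

# Hypothetical inventory of recruiting data sources, tagged by the three Os
# (opinion / observed behavior / objective metric) and by indicator type.
DATA_SOURCES = [
    {"source": "hiring manager satisfaction survey", "o": "opinion",   "indicator": "lagging"},
    {"source": "structured interview ratings",       "o": "observed",  "indicator": "leading"},
    {"source": "work sample or simulation results",  "o": "observed",  "indicator": "leading"},
    {"source": "first-year performance rating",      "o": "objective", "indicator": "lagging"},
    {"source": "first-year retention",               "o": "objective", "indicator": "lagging"},
]

# How many sources of each kind currently feed the quality of hire metric?
print(Counter(s["o"] for s in DATA_SOURCES))          # e.g. Counter({'observed': 2, 'objective': 2, 'opinion': 1})
print(Counter(s["indicator"] for s in DATA_SOURCES))  # e.g. Counter({'lagging': 3, 'leading': 2})
```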

Big Data

This catchy buzzword overshadows the rigorous work that selection scientists have been doing for about 100 years. Quality of hire can be documented and improved over time with evidence-supported hiring decisions. The Data Miners Guide to Quality of Hire will help you audit your current practices and invite you to add more rigor to your staffing process improvement.

The session examines three stages of evolution in the quality of hire journey:

  1. Prospector
  2. Developer
  3. Operator

It also defines three different specialist roles and skill sets that are required along the way.

Thanks for your consideration and the time you took to read this. I would appreciate your vote of support. Follow the link below and find my session in the Quality of Hire track.

Talent Acquisition Conference Content Voting

Thank you.

I look forward to seeing you in San Francisco in January.