If you are beginning to feel paranoid, that might be a good thing, for there are invisible forces pulling the strings and making decisions that affect every aspect of our lives.
This is not some grand conspiracy or obscure magic. It is math.
Algorithms now recommend the movies we watch, decide which news headlines we see, influence our choice of romantic partner, and even determine employment opportunities. Yes, whether you get the job or not is now in the hands -- or the calculations -- of an algorithm.
Currently, 72% of resumes never see human eyeballs; they are processed, reviewed, and evaluated by Human Capital Management systems. These data-driven decision-making mechanisms hold incredible promise for employers and applicants alike: increased efficiency in processing applications, and objective decisions uninfluenced by emotions, personal interests, or other human foibles.
Despite this promise, actual use of algorithms in HR recruitment has produced some rather disturbing and unintended results. As Xerox recently discovered, seemingly simple and objective data points, like an applicant’s ZIP Code, can unintentionally encode bias into the algorithmic process.
The problem is not only that algorithms reproduce human prejudice. There is more to it. The problem is that we implicitly trust the algorithm because it is, we think, simply churning through numeric data, and numbers do not lie.
And to make matters worse, these algorithms are proprietary black boxes, impervious to outside review and oversight.
So if there is bias, we might never know how or why.
I’m David Gunkel, and that is my perspective.