A few days ago, John Hollon wrote a short blurb about the growing population of HR "experts." I could not agree more. I blame it on the public exposure the Internet provides. The web has made it possible for almost anyone with a computer and an opinion to claim expertise. So how do we separate expertise from strong opinion? It's not easy. In my case, it took studying jobs and developing selection tests to discover the clues. It's embarrassing to admit they were there all the time … I just never thought about them until I had to measure them.
Rungs of Expertise
Expertise is ladder-like. The first rung is a pair of hands; i.e., people who make a living doing what the client asks. Usually they have some practical experience with the subject (i.e., they know slightly more than their clients), but they are actually just skilled individual contributors. You might think of them as knowing how to use the most common Word features.
The next type of consultant is a facilitator. He or she is not necessarily more knowledgeable than a pair of hands, but knows how to manage groups. Facilitators usually start life as a pair of hands but learn a little more with every client engagement. Eventually they learn to help people solve their own problems and keep them on track through group processes.
The next rung is where the herd really thins out. This is where we find subject-matter experts. Not only do they have all the practical experience possible, but they also understand the theory that supports it. You could think of them as knowing how to use all the functions of Word as well as being able to teach anyone else how to use any of them.
Lastly, we have the experts who not only know all the functions of Word, but can also tell you what's working, what's not, and why. Given the opportunity, they are capable of actually redesigning the software to make it more useful and efficient. These folks are few and far between. The developers of the Internet protocol and the people who developed the concept of integrated office software belong to this group.
As a side note, clients should know clearly whether they need a pair of hands or a facilitator who can help them move from A to B, or an expert who can move them from A to Z. For example, people who develop their own selection or competency systems often become very upset when a subject-matter expert suggests major changes. This is unfortunate, because in most cases a wrongly designed system will inevitably fail within a few years. There is really only one best-practice way to select managers or individual contributors; like it or not, the rest are flawed in one way or another.
Careers, Degrees, and Experience
Moving up the expertise ladder involves a combination of practical experience (i.e., actually performing the work), mastering its theory and technical aspects, and working with a variety of clients to broaden exposure to multiple situations. In my profession, for example, one has to be thoroughly proficient in job analysis, competency measurement, the ADA, validation, multitrait-multimethod measurement practices, the 1978 DOL selection guidelines, and APA test-development protocols. If someone has not mastered these basics, they are not qualified to do the work.
Another indication of a non-expert is someone who tries to enter the field with an unrelated degree. For example, all test developers are expected to complete graduate courses in test design, job analysis, assessment, validation, experimental design, statistics, and so forth. The objective of these programs is to minimize error and maximize accuracy. If your consultant's qualifications and experience are limited to recruiting, training, or an SPHR designation, their expertise is probably limited to being a pair of hands or a facilitator. Don't misunderstand: there is nothing wrong with hiring a pair of hands or a facilitator if that is all you need; but if you want to test candidates for job skills, build an integrated competency-based system that actually works, or head off legal challenges at the pass, it is not enough.
People are also usually unaware that it takes more than a psychology degree to be a qualified job psychometrician. Out of a 72-hour graduate program, for example, industrial, clinical, and counseling psychologists have only about 25% of their courses in common; the other 75% are specialized. So while psychologists might have the same letters on their diplomas, counseling and clinical psychologists are trained to help people function in society, not to predict job skills for the workplace. If your hiring expert provides you with a report that looks like a mental health evaluation, it's a clue you are dealing with the wrong kind of psychologist. Even within my field (i.e., the practical application of psychological principles to solve business problems), only a small percentage of graduates are true test experts.
As someone who both worked in the business school of a large urban university and earned two business degrees, I have also found that many business professors are better at theory than application (with the notable exceptions of accounting and computer science). For example, while my management professors often treated the MBTI, the Hawthorne studies, and Maslow as sacred cows, my psychology professors conducted controlled studies and looked for proof. It was an eye-opening experience to read study after study debunking many business theories I thought were rock-solid.
As a case in point, you might recall Professor Mike Hammer's popular book on redesigning jobs. Hammer's expertise and education were in engineering and computer science. His co-author, Jim Champy, was an engineer and lawyer. It does not take a rocket scientist to realize that when jobs change, so do the skills necessary to perform them, yet these authors devoted only a page or two to selecting people who could do the job (a field industrial psychologists have spent the last 100 years studying). If you share Hammer and Champy's assumption that people are sufficiently plastic to do almost any job, ask a few bankers how much time and effort they invest getting tellers to cross-sell, or ask yourself how many successful individual contributors you know who went on to become successful managers.
Bloggers are cropping up everywhere with opinions that are generally poorly informed. For example, one person commented on an article I wrote some time ago, implying they had said much the same thing in a presentation. I had never heard of the blogger and was curious about their qualifications to make this statement. I did a little background research and within about five minutes learned the blogger's competency qualifications were limited to working in HR and being a trainer. Sorry, folks: being a trainer or working in an HR department does not qualify one as an expert. Be wary of bloggers who have an ALL CAPS opinion but lower-case expertise. That's why you will never see my name on an article about organizational development, reengineering, recruiting, compensation, or neurosurgery.
I would like to offer a few thoughts I have learned along the way:

- One cannot identify a technical expert unless one already is an expert (that means we are all pretty dumb about a lot of things).
- The more expert people become, the more they realize what they don't know (yep, it's OK to feel really stupid most of the time).
- An expert will have a combination of advanced academic education (to master theory) and practical experience (to separate theory from reality).
- Experts belong to professional associations where expertise is a condition of membership.
- An expert in one subject is usually painfully ignorant of another.
- Real experts are the first to admit they don't know much.
- The more certain someone is about his or her expertise, the dumber he or she usually is.
- Experts can always produce legitimate third-party proof of their claims.
- Strong (or loud) opinions are no indication of expertise; everyone has an opinion about something.
- Be VERY wary of the person who does not know what he or she does not know (sometimes you need an expert to spot an expert).
Thanks to ERE Media