For many years, I taught a series of workshops at a university on topics such as designing training, conducting interactive workshops, and conducting training needs assessments. The audience of Human Resource Managers and Training Managers was mostly new to their roles. Others in the program were managers aspiring to become HR and training professionals who wanted to be in on the so-called secrets of our profession.
When we covered how to conduct a training needs assessment, I noticed that I often got tangled in one particular section: selecting the method to use to gather data. I would share all manner of options, including:
- One-on-one structured interviews
- Observation checklists of specific work tasks in critical areas
- Written surveys of needs within audience groups
- Knowledge-testing against procedures or policies
- Mock scenarios or other simulations
- Topic-based focus groups
- Benchmarking of similar or related roles in related industries
- Documentation reviews such as performance appraisals, internal resumes or profiles
- Multi-rater (360-feedback) survey results
- Observation comparisons of stellar vs. under-performing employees
We would then practice: I’d give scenarios and they’d choose methods that might be effective. We’d gather situations they faced in their organizations and advise each other on how to be sure we were addressing real needs. After all, as I preached, it was a waste of time and resources to create a solution to a problem that wasn’t really relevant to the audience.
Rarely did we as a group disagree on the methods to use in a particular situation. In fact, we were so aligned that I wondered why these groups of ‘newbies’ even needed a program—they knew what they were doing.
This was where my tangling happened.
I believed that in-depth needs assessment was valuable—so much so that in our consultancy I was reluctant to complete instructional design with clients who didn’t conduct one. But I also knew that in the “real world,” needs assessment could rarely happen the way we all just knew it should. There never seemed to be enough time, enough access to people, enough interest from stakeholders. There was never enough _________ (<<—– you fill in that blank).
Yet, here I was teaching what I knew the group would have difficulty pulling off. And this is where I made a change.
I continued to introduce the various methods, but would end that segment of the workshop by telling them the truth: all of this is good, but when it comes down to it, just ask. Ask the one doing the job. Ask their managers what they see their staff needing. Ask, and if you then have enough _____________ (<<—–time, access, interest, etc.) you can validate another way. Otherwise, run with what they tell you. Believe them. Conduct the training on what they say is needed and sprinkle in what your other data points tell you is relevant.
Yes, this advice flew in the face of what the other experts would recommend. But in the end, the simple act of asking about needs did two things that were equally important to the success of the training:
- Built buy-in and interest from the audience or their managers—something that is universally difficult to gain.
- Reallocated scarce instructional design resources to designing, developing, and implementing training—rather than determining needs.
I no longer teach that training needs assessment program at the university, but I do practice what I preach: when doing a needs assessment, I ask.
Now, how I ask—that’s an entirely different workshop.
PPS International Limited offers a free, online, manager-driven needs assessment to select organizations. This tool asks managers what they need to be successful in their roles and links this to how skillful they perceive themselves to be in each area. If you have a group that would benefit from this needs assessment, please let us know. This information can be especially helpful as you plan and establish budgets for your upcoming calendar year.