As part of a professional development initiative, the Boise State University (BSU) Human Resources department approached the Organizational Performance and Workplace Learning department to request a two-hour training program that would help BSU staff learn how to conduct a needs assessment within their respective teams. The training program would be delivered to two sessions of learners: one in person and one virtual, with the virtual session recorded and archived for future use.
Since the topic of the learning program had already been identified by BSU, we did not conduct any assessment or analysis to verify that the training program would fulfill the organizational goals of the BSU Human Resources department. Instead, we conducted a learner assessment to identify potential advantages or obstacles that might arise later in the project. While the identities of specific learners were not available to us prior to implementation, we were able to identify common systems and workflows that all learners would be familiar with due to their organizational roles. This analysis informed both the depth of the information we provided and the technology we would use to develop, implement, and evaluate the learning program.
Our first step in the design process was to identify which needs assessment models and frameworks we wanted to use. After that, we needed to decide on the vehicle we would use to present the information to the learners. We decided early on that a plausible scenario would be introduced near the beginning of the training program, and that learners would alternate between receiving information from the facilitators and practicing new skills throughout the session.
As we transitioned into the development phase, we discovered that our team had some radically different approaches to developing the learning materials. One approach was to flesh out the outline and script before starting the storyboard. Another was to start storyboarding from the rough outline, then refine the outline, script, and storyboard together as each informed the others.
These different approaches seemed to echo the differences between the ADDIE model and the SAM (Successive Approximation Model) approach that I generally prefer.
The traditional ADDIE model has instructional designers progress in a linear manner from one phase to the next. Once the Analyze phase is done and all the preparatory materials and information have been collected, the instructional designer proceeds to the Design phase, where they begin organizing the information collected and creating an outline.
The biggest problem with this approach is that there is no explicit evaluation of the materials until after the learning program has been implemented.
Many instructional designers I have worked with generally implement the ADDIE model in a more non-linear manner. After each phase, an evaluation is conducted to make sure that the data collected or materials produced will help to achieve the primary goals of the learning program.
By taking this modified, non-linear approach, designers can make changes earlier in the design and development process, which often saves time over the course of the project.
The Successive Approximation Model, or SAM, is more explicit about the need for regular evaluation during the design and development phases. Indeed, the core of SAM is rapid prototyping and iterative implementation.
I have found this method more effective at surfacing potential gaps earlier in the process.
Because different team members were taking different approaches, we ran into a few delays during the development period. Some team members wanted to spend more time perfecting the storyboard before developing the final learning materials, adhering to a more traditional ADDIE approach. Others wanted to start prototyping learning materials for evaluation by the project lead. Fortunately, our team didn't have much time to debate the issue: self-imposed project deadlines were quickly approaching, and we simply couldn't continue perfecting the design documents at the expense of developing the learning materials.
The implementation format had been dictated by the client, Boise State University, which in turn affected many of our creative choices during the development of the learning materials. We had been asked to facilitate two sessions, one in person and one via video conferencing software. As such, the learning activities we developed needed to work in both classroom and virtual settings. This shaped the software we used, the learning activities we employed, and how we encouraged engagement throughout the learning experience.
For both sessions, only about a third of the prospective learners who pre-registered actually participated. From both a designer and a facilitator perspective, this was frustrating. Some of our activities were built with the expectation of roughly a 50% participation rate, and several planned activities had to be adapted on the fly simply because we didn't have enough participants to run them as designed. We also learned after our virtual session that a gas leak had been discovered on campus during the session; one building had to be evacuated, forcing several of our participants to drop out early.
However, despite the lower-than-expected participation numbers, engagement levels were excellent, and we were able to adapt each of our activities with minimal impact on the overall learning goals.
To measure how effective our learning materials were, we developed entrance and exit confidence checks, asking participants how comfortable they were with the topics and tools we planned to cover. On average, we found that confidence levels with the targeted topics increased by over 120%. Unfortunately, a long-term study into how participants applied the topics within their respective teams was not part of the scope of this project.
I found this an extremely interesting project to work on. Not only are needs assessments an interesting topic from a performance improvement perspective, but approaching the topic from an instructional design perspective caused me to take a deep look at the tools and models I have been introduced to. Throughout the design and development phases, we were constantly evaluating which needs assessment models would translate well to a two-hour training session. In the end, some models that would normally have made more sense from a needs assessment perspective were rejected in favor of models that fit better within the instructional design constraints we were working under. However, the finished product accomplished the learning goals established by the client, and the materials we created will be applied in future course offerings at BSU.