Is ADDIE still relevant or does it need a refresh?
Working in the L&D world I have been exposed to many models for creating learning, usually impactful and “Gold-Star” learning. Some have come and gone, and whilst working for companies that are not learning companies there is a careful balance in how we engage with our stakeholders. Earlier in my career I recall sitting in a meeting and asking the project sponsors, “Ideally, what level of Kirkpatrick’s Evaluation Model would you like to see reflected in this project?” After the meeting my line manager provided great feedback: our success relies on speaking with our sponsors in a relevant language that reflects their understanding. Anything else makes it look like we are trying to minimise their ability to understand, undermining our ability to effectively partner with them.
Our partners in most industries rely upon us to make these decisions for them and, at the end of the day, to provide them with the greatest tangible impact we can (e.g. I found many obstacles to achieving L3 evaluations due to system issues or leadership buy-in, hence the need to be a greater partner to get these roadblocks removed).
So in reviewing ADDIE, I took some time to pen my thoughts on a way we might approach this model. I must say this doesn’t close my eyes to alternatives, but over the years I have altered my approach – particularly in roles where I lead teams and manage the end-to-end learning intervention.
A – (Assess) – still a critical need to uncover the training gap (if it is a training gap). This has now evolved into performance consulting. Do I have buy-in from the business? Do they support my decisions, and have they demonstrated that support by allowing me to take subject matter experts’ time away from their main job?
A quick win: what low-hanging fruit can I achieve in my first six months to demonstrate impact and my value-add?
D – (Design) – Items to consider: what modalities have we considered, with technology changes happening so rapidly? Have we added quality through a learner analysis to deliver a format that works for their learning need? During the design, are we considering the ways we intend to measure the successes identified in the assessment, and do we have the infrastructure to realistically deliver this? Getting people to training can be difficult, but what are ways we can gather critical feedback, especially for L1–3, to ensure we are getting tangible results (more than “the coffee was great”)? Finally, do our leaders understand that getting these levels of evaluation done is more than a box-checking activity – hence our need to reinvent how we measure success.
D – (Develop) – During the development phase, how engaged are our project sponsor and SMEs as we showcase new ways to realise the design (i.e. more than a talking PowerPoint – instead, FLIP)? Using the design, are we working with the design team to ensure the product remains learner-centric, and introducing the designer to technologies they may not be aware of (again, a solid partnership, not a one-person effort)? Finally, are we aware of infrastructure challenges, and are we documenting these to build a future business case for change?
I – (Implement) – Gold-star training or not, if the learner is not aware of the training then you will never see impact. During the initial phase (Assess/Performance Consulting), how do we intend to work with our other partners, like communications, to ensure awareness of key critical training and of the investment made to support our company’s upskilling? Learning alone cannot own this key function.
E – (Evaluate) – As discussed, this should never be an afterthought, nor rely on a last-minute handout of an L1 sheet to participants as they scramble, in most cases, to finish the day. Going back to design and development: are these measures being used along the way? Furthermore, do we have the infrastructure in place to conduct more than an L2 (once again, partnering and awareness of what the company can deliver, including leadership buy-in, specifically when it comes to ROI)?
Lessons learned:
In my personal experience, success with ADDIE relies on a few key areas:
• Can we do more around awareness and communication (i.e. more than a newsletter or email)?
• Technology – is it available, and what is the ease of use (for the learning and for creating the reporting)?
• Do you have senior leadership support and buy-in? As performance consultants, are we taking the time to identify whom learning can impact, through scorecards, small wins, and support from champions in the business?
• Are we, as learning champions, keeping apprised of the latest technology whilst considering the learner analysis (i.e. not something we personally like, but something feasible for our business)?
• Are we engaging our leadership to demonstrate the documented KPIs from our performance consulting phase, demonstrating the change?
• Ensure that ADDIE isn’t working in silos.
Whilst I see areas for ADDIE to evolve – as they say, “the only constant is change” – in many ways I have revisited ADDIE, adding my experiences over the years since I was first introduced to it. This is not to suggest ADDIE will be something I carry throughout my career, but I wanted to demonstrate how I personally have evolved it in each company I have joined, each with its own challenges or roadblocks. However, this keeps me on my toes and open-minded to new ways of doing things.
Always welcome to thoughts and ideas as we are forever learning (yes, even L&D professionals)!
Just love this. Love ADDIE, and I believe it’s how you adapt it that still makes it so relevant despite its 1970s roots in the US military.
Great points, Patrick. I agree ‘evaluate’ should be used throughout design and development.