How do you increase the impact of your live interventions?
When you’re in front of a group, they’re inspired. They want to put the ideas they’ve learned into practice when they leave the room. This is, after all, why your clients hire a professional facilitator: not just to transfer knowledge, but to help participants find the relevance in the content so that they want to change. It’s a powerful skill, and you’re good at it.
And yet.
As soon as your participants leave the room, that good intention begins to erode. It’s not your fault. It’s not their fault. The busy, demanding schedules we all live with, combined with the alluring pull of the status quo, can quickly drag your participants back to the way things always were. And your training becomes a happy (if hazy) memory.
So what do you do?
How can you, as a facilitator, increase the likelihood that they will actually follow through on their good intentions? How do you galvanize their plans, providing them with a shield to protect the fragile beginnings of changed behavior from the onslaught of demands on their time and attention?
Remember – you’ve inspired them to want to create change; change that will impact their business, their team, their boss – a change that can have – will have! – incredible ripple effects… if only they can stick with it long enough to establish new habits.
There are 11 factors that we study here at Actionable that support this behavioral change. One of those factors is what we call “micro-reflections”. They’re simple, quick and incredibly powerful. And the best part? You can implement them in your very next session.
Here are some specific tips you can implement with your own clients to help them put ideas into action, and make them stick through the power of micro-reflection.
First off, what is micro-reflection? Our definition: 30 seconds of reflection on a single commitment to a new habit.
30. Second. Reflection.
It’s a quick pulse check – not an exhaustive analysis of how things have been going, but rather a reminder of the commitment someone made and a check-in on how that commitment is going so far.
And, as it turns out, there is no greater element for making change stick than deliberately focusing on micro-reflections following a learning event.
At Actionable we measure this in two ways. One, we use a numeric score from 1 to 10 for participants to rate their change. Two, we ask them to populate a “micro-journal”: a quick note on why they rated their behavior change that particular score.
For example, if they rated their improvement on establishing that new habit a 4 one day, and a 6 the next, the journal prompts them to ask the simple question of “why?”, and to record it in the moment.
In a recent study, we examined over 25,000 commitments individuals made in the Actionable platform following a session. We looked across that data for trends related to micro-reflections and identified 4 ways that facilitators can directly improve change stickiness for their participants.
1. Reflection Cadence
Micro-reflection is important. Regular micro-reflection is even more important. In the Actionable platform, the data below show a positive correlation between how often participants check in on a commitment and that commitment’s resulting impact.
In short: the more often individuals engage with their reflection, the greater the impact. The sweet spot appears to be between 5 and 10 check-ins in a 30-day cycle (roughly twice a week).
When you are encouraging your participants to make a commitment, try to build into your program a weekly reflection at minimum (2x/week is better!) for them to have the opportunity to pause and reflect on their progress to date.
2. Commitment Duration
Within the Actionable platform, facilitators have the option to set up a commitment to run anywhere from 5 to 45 days. When we looked deeper into the varying commitment durations and normalized* the data, we saw that commitments should run a minimum of 10 days.
The normalized data revealed that commitments lasting at least 10 days saw an average rating change 0.6 points higher than those with a duration of less than 10 days (p<0.0001)†.
Of note, commitments longer than 29 days do not see any significant further increase in rating change: a diminishing return. On the other hand, if individuals aren’t given enough time, they don’t have the runway to build momentum. They never get into a cycle of micro-reflections, and the improvement is not as great.
To implement this in your programs, take a multi-session program as an example: set a check-in call for 3-4 weeks after your main session, and encourage participants to stick with their commitment until then.
3. Notifications
The Actionable platform has built-in nudge technology. This “Habit Builder” sends participants a reminder to check in on a schedule that they choose. As you would expect, not everyone who receives a nudge will complete a check-in each time.
There is, however, a direct correlation, shown in the graph below: the more notifications participants receive, the higher their rating change when evaluating the new behavior.
We had anticipated that too many notifications might lead to overload or annoyance, but the data show this is not the case. The more notifications participants receive, the more engaged they are in the process.
Whether you are using the Actionable platform or not, consider how frequently you are sending nudges or reminders for your participants to engage in those micro-reflections. If you are already encouraging them to do this weekly, that’s great! If it’s less often, you will likely need to increase your notification frequency to encourage more micro-reflections and, thus, more impactful behavior change.
Provide a mechanism for daily reflection.
4. Time of Day of Notifications
The graph below shows the different times of day individuals are being prompted to engage in the micro-reflection process.
After 4PM is more effective than any other timeframe (p<0.0001 to p=0.002)†. Perhaps that isn’t surprising: people get caught up in their day, and if we assume a large percentage are on a nine-to-five schedule, it makes sense to have these reflections bookend the workday. While the data show ideal time frames, the most important aspect to highlight is participant autonomy. When participants take the additional step of actively selecting a time that works for them, rather than passively accepting the default, they gain another small way to personalize the experience and set themselves up for success.
When asking your participants to set a commitment at the end of your session, encourage them to set a daily reminder for themselves – ideally after 4PM – to spend 30 seconds in micro-reflection.
Regardless of whether you use the Actionable system to streamline this process, these are all strategies you can implement right away, starting in the room during your session. Get participants into the headspace of the one thing they are going to commit to. Help them plan how often they will reflect on it. Consider whether you have a system that lets them do that reflection efficiently, how often you will nudge them, and what time of day those nudges will arrive. All of this can have a large impact on their realized behavior change.
Across the 180,000 data points we have accumulated over the last year, there is a direct, linear correlation between the number of engagements in micro-reflection and the impact achieved.
If you would like to see how your own data could show up in these reports, please book a discovery call with us today. Someone from our team would be happy to have a conversation with you. We can chat about how your programs are currently structured and where (or if) Actionable could bring a layer of measurement and visibility to your process, further supporting your learners and helping you demonstrate the behavior change your programs are creating. You’re having an impact. Let’s help prove it.
Actionable is on a mission to help boutique consultancies scale their business by giving them the tools to prove and amplify their impact.
We can’t wait to meet you.
*When we say “normalize the data”, we mean extrapolating the number of check-ins in a fixed commitment window to what would have happened over 30 days. For example: if someone had set up a five-day commitment and checked in once in that five-day period (1 in 5), we’d “normalize” five days into a 30-day month, for the equivalent of 6 check-ins (1 in 5 = 6 in 30).
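For readers who like to see the arithmetic spelled out, the extrapolation in that footnote can be sketched in a few lines of Python. This is an illustrative sketch only; the function and constant names are ours, not part of the Actionable platform.

```python
# Illustrative sketch of the normalization described in the footnote:
# scale a raw check-in count from a commitment of any length up (or
# down) to its 30-day equivalent. Names here are hypothetical.

NORMALIZED_WINDOW_DAYS = 30

def normalize_check_ins(check_ins: int, duration_days: int) -> float:
    """Scale a raw check-in count to its 30-day equivalent."""
    if duration_days <= 0:
        raise ValueError("duration_days must be positive")
    return check_ins * NORMALIZED_WINDOW_DAYS / duration_days

# The worked example from the footnote: 1 check-in over a 5-day
# commitment normalizes to 6 check-ins over a 30-day month.
print(normalize_check_ins(1, 5))  # 6.0
```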
†To illustrate the strength of our conclusions, we include the p-value along with the raw difference between categories. The p-value is the probability that a difference at least as large as the one observed could arise by chance alone. P-values lower than 0.05 are generally considered statistically significant, and the lower the p-value, the lower the probability that chance explains the difference.
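To make the idea of “chance alone” concrete, here is a toy permutation test in Python. It is not Actionable’s analysis, and every number in it is invented for illustration: it simply counts how often randomly shuffling two made-up groups of rating changes produces a gap as large as the one observed.

```python
# Toy permutation test illustrating what a p-value measures.
# All data below are invented; this is not Actionable's analysis.
import random

short = [1.2, 0.8, 1.0, 1.5, 0.9]   # hypothetical <10-day commitments
long_ = [1.9, 1.6, 2.1, 1.4, 1.8]   # hypothetical 10+-day commitments

# The observed gap between the two group means.
observed = sum(long_) / len(long_) - sum(short) / len(short)

random.seed(0)
pooled = short + long_
extreme = 0
trials = 10_000
for _ in range(trials):
    # Shuffle the pooled ratings and split them into two random groups.
    random.shuffle(pooled)
    diff = sum(pooled[5:]) / 5 - sum(pooled[:5]) / 5
    if diff >= observed:
        extreme += 1

# The p-value: the fraction of random shuffles that produce a gap
# at least as large as the observed one, i.e. "chance alone".
p_value = extreme / trials
print(p_value)
```

The smaller this fraction, the less plausible it is that the observed difference is a fluke, which is exactly what a low p-value conveys.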