Are you an impact-focused consultant?
Is your driving mission to create lasting, demonstrable change inside your client’s organization?
If the answer is a resounding yes, I want to introduce you to two words that will soon become the most powerful in your vocabulary.
So.
What.
You received a 9/10 average approval rating on your feedback form. So what?
100% of participants completed their homework assignments between sessions. So what?
“So what?” – directed inwards – isn’t typically a comfortable space to occupy. After all, high approval ratings and completion rates feel good (and they should!). But do they demonstrate impact?
How does this actually move the needle?
If the goal is true impact, then satisfaction scores and activity metrics aren’t where the story ends. Important steps, no question. But they represent the beginning of the journey, not the end.
We need to dig deeper.
In the world of corporate training and people development, far too much of what’s measured is measured for compliance or as a CYA exercise.
In the absence of demonstrable impact, we focus on activity.
We focus on a new, broader set of tactics instead of asking the hard questions about the impact of the ones we already use.
And I get it. We were guilty of it here at Actionable in the early days.
When we first launched the Habit Builder, we were incredibly focused on the number of interactions in the platform. Watching participants “check in” on their behavior change commitments is exciting, and kind of addictive.
As we designed and developed the platform, one of our most closely watched metrics was “how many times are people checking in on the commitment that they made?”
Our conversations at the time were something like:
“We have a facilitator who has a client who has checked in an average of 12 times.”
“Amazing – people are using it!”
In hindsight, I think we really just wanted to justify (to ourselves and our clients) that investing in Actionable to sustain learning impact was worth it. “More learners using the platform = greater justification”, right?
It was a couple of discerning clients who started politely challenging our focus on activity. (The best ideas often come from clients, don’t they?)
If our goal truly is impact (and not just activity), does it matter how often they’re interacting with our technology?
So we started including a follow-up question in our metrics discussions: “So what?”
The conversation shifted to:
“We have a facilitator who has a client who has checked in an average of 12 times.”
“Amazing – but wait, so what? What is the impact of that? What difference does that actually make?”
In our case, the number of interactions with a commitment, as a data point by itself, was just a vanity metric. A metric that told us people were using the platform. But it didn’t tell us how, why, or to what end.
After some reflection, we realized that if additional activity isn’t creating value in someone’s life, then it’s essentially creating harm. It’s just distracting them with more things, and – I believe we can all agree – the last thing we need in our lives is more distraction.
When we expanded our focus to include the “so what?” question, we identified something really interesting in the data set.
Looking at a data set of ~30,000 behavior change commitments, we plotted the number of interactions (how often a learner checked in on the platform) against the rating change at the end of the commitment (the shift in their self-assessed rating on establishing the new behavior).
And when we took the time to ask “so what?” – with impact as our north star – here’s what we found. The data was exceptionally clear:
The more times people interact with the platform, the greater the rating change on their commitment.
In other words, the more they engaged, the more likely the new behavior was to stick and drive greater change.
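If you want to run that kind of gut check on your own data, here’s a minimal sketch of what it might look like. The file name, column names, and tooling are all hypothetical (this isn’t our actual analysis pipeline); the point is simply to line an activity metric up against the outcome it’s supposed to drive and see whether the relationship actually holds.

```python
# A minimal sketch, not Actionable's actual analysis. Assumes a hypothetical CSV
# export with one row per behavior change commitment and two columns:
#   interactions   - how many times the learner checked in
#   rating_change  - end-of-commitment self-rating minus starting self-rating
import pandas as pd

commitments = pd.read_csv("commitments.csv")

# Average outcome at each level of activity: does more checking in
# actually line up with a bigger rating change?
summary = (
    commitments.groupby("interactions")["rating_change"]
    .agg(["mean", "count"])
    .sort_index()
)
print(summary)

# One-number sanity check: a positive correlation supports the
# "more interaction, more change" story; a flat one suggests a vanity metric.
print(commitments["interactions"].corr(commitments["rating_change"]))
```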
In this case, we were lucky – our earlier desire for “more interactions!” happened to align with the end goal of creating greater impact. But there have been numerous cases where this wasn’t true; times when we have been too swept up in the navel-gazing exercise of celebrating metrics that had no material impact on our end goal. At best, those situations were a waste of time. At worst, they were actually taking us further from our north star.
“So what?” reframed our product design strategy. It informed our new trainer onboarding program. We brought (and still bring) a discerning eye to the “why” behind anything we produce.
“Is this advancing our goal of creating greater, lasting change?”
“Yes?” How can we prove it? How can we improve it?
“No?” Cut it. The training world doesn’t need another shiny object.
I appreciate that this is an Actionable “so what” story, but I’d encourage you to try the question for yourself.
The next time you’re celebrating a stellar metric, or considering a new tool (or content), ask yourself the question, “so what?”
It will keep you tethered to the ultimate goal, and help you rise above the noise.
Actionable is on a mission to help boutique consultancies scale their business by giving them the tools to prove and amplify their impact.
If you’re serious about focusing on impact, we’d love to show you how we can help. Book a time to talk with us.
We can’t wait to meet you.