Recently I’ve been seeing a growing trend of IT teams and data leaders preaching the importance of ‘data literacy’. Don’t get me wrong, there needs to be a better understanding all round of how data can be applied or misapplied to almost any problem. BUT, and this is a big but, data is the means to an end; it’s the kinetic change in an outcome, driven by that data and calculus, that creates value.
So why the big push for data literacy, and if it’s not the right push, just what do I mean by solution literacy?
Data Literacy — a placebo for the questions you should have asked
Once again this series is quickly becoming a confessional for the sins of my past. As technologists we love to design a technical solution and define the path to get there as a data problem. How many times have you sat there, presented your ‘must-be-perfect’ solution, and watched your customer’s soul drain as they try to understand just what it is you’re saying to them? I can admit I’ve done this more than once.
We all need to recognize that to our customers, success is rarely a technical outcome, but rather something that creates a kinetic change in how a decision, process, or outcome is executed.
If you interrogate data long enough it will give any answer you want it to
The truth is that it is far easier to implement a solution if we define the problem as a data problem. After all, that’s what most of us spent hours upon hours studying to do. We understand the tools and the teams needed, and most of us enjoy seeing it all come together. And while there is 100% a need to define the problem at the data layer at some stage, without clearly understanding what is being impacted it gets very hard to actually measure or optimize for performance, and to relate that back to the customer.
It’s at this point we need to recalibrate how we understand the transformation being asked for, and tackle it in a new way: enter Solution Literacy.
Yes, I made up Solution Literacy — but it makes sense when you think about it
Recently I’ve been spending a lot of time talking to executives who are working towards implementing Applied AI use cases, or who are at the very beginning of their journey creating or recreating an Applied AI strategy. Unfortunately there is a common thread among all of them I’ve spoken to: too much focus on the tools and tech, and not enough focus on the problem itself. So how do we judge our teams’ level of solution understanding?
We just might have known the answer all along
It turns out there is a well-documented framework for validating literacy and understanding. Many of us will remember this approach from our childhood. If you were ever asked in school what a book was about, or what the author intended by a passage of text, you were participating in summarizing, or more broadly, reciprocal teaching.
Reciprocal teaching is a structured dialog between teacher and student. It can be used to evaluate multiple levels of understanding, specifically through summarizing, questioning, clarifying, and predicting.
To apply this, try this experiment. Bring your project team into a meeting and then set about workshopping the following four questions:
- Write a 150-word outline of the project without mentioning any technology (summarize)
- Ask the team to identify the areas that require clarification, and what options may exist to resolve them (clarify)
- Give the team room to think bigger: what else might be possible? Are there areas that may have been missed? (questioning)
- Finally ask the team to predict outcomes. What might come next? (predicting)
At this point, review the four stages with the team. Does the team agree? Ask the team: if they showed this to the project stakeholders, would they understand it? If the answers are yes and yes, write it up and share it.
Congratulations, you’re on your way to developing clear solution literacy and understanding across your team. Now it’s time to unleash all the data literacy you can find.