How to Assess What You’re Working With (Without Judgement)
Part 2 of a six-part series on what happens when a capable Finance Manager gets handed risk management and no one gives them a map.

TL;DR
In Part 1, "So, You Own Risk Now", Jess worked out something most accidental risk managers learn pretty quickly: the job is not to rush out a polished document and hope that counts as progress. The job is to understand the business properly first.
Now she is at the next point in the story. She knows more than she did a few weeks ago, but not enough to start building with confidence. Before she refreshes the risk register, rewrites the framework, or promises the Board better reporting, she needs a baseline. She needs to know what is actually working, what is patchy, and what level of uplift makes sense for this organisation.
A couple of weeks have passed since that Board meeting
Jess is back at her desk with the shared drive open and a familiar feeling in the pit of her stomach.
On one screen there is an old risk register. On another, a policy that sounds tidy enough at a high level. In the folders around it are half-finished templates, old Board papers, and the usual collection of documents that look more complete than they really are. Somewhere in all of that, there is risk management happening. The question is whether any of it adds up to a system people can rely on.
She has done the listening. She has sat with people. She has asked questions. She has looked at what exists and, just as importantly, how people talk about it when nobody is trying to sound impressive.
So she knows more now. That matters.
But this is also the point where people usually get themselves into trouble.
Because once you have done the first round of discovery, the temptation is obvious. Start building. Tidy the register. Refresh the framework. Put some structure around reporting. Get something into the Board pack so it feels like movement is happening.
I wouldn’t do that yet.
The thing is, Jess still does not know what she is really working with. She knows there are documents, but she does not yet know whether they are any good. She can see that managers are handling issues, but she does not yet know whether that handling is consistent, repeatable, or visible beyond the people closest to the work. She knows the Board wants better line of sight, but she does not yet know whether the real gap sits in governance, process, culture, or some uncomfortable combination of all three.
That is not a small distinction. If you do not establish a baseline properly, you usually build the wrong thing. And once the wrong thing gets baked into a register, a framework, a dashboard, or a monthly paper, it becomes surprisingly hard to undo.
This is where maturity assessment earns its keep
I know the phrase "risk maturity assessment" can sound a bit consulting-heavy. It can sound like the sort of exercise that ends with a spider chart, a polished PDF, and a room full of people nodding their way through words like "developing" and "embedded".
Done badly, that is exactly what it is.
Done properly, it is one of the most useful things you can do at this stage.
Jess is trying to answer a very practical question, even if the language around it sometimes gets dressed up: how mature is our current approach to risk management, really, and what should we do next because of that?
That is a much better question than asking whether the organisation has a risk register, a policy, or a quarterly discussion about risk. Most organisations have some kind of risk activity. That is not the point. The point is whether risk management is deliberate, aligned, usable, and influencing decisions in the way it needs to.
There is a big difference between “we have some risk artefacts” and “we have a functioning approach to risk management”.
I see organisations blur those two all the time. A spreadsheet in a shared drive is not maturity. A sensible manager who spots issues early is not maturity. A risk paper once a quarter is not maturity either, at least not by itself. One capable person carrying half the control environment in their head is definitely not maturity.
Maturity is when the organisation can do this work in a way that is structured, repeatable, understood, and genuinely useful.
That is what Jess needs to assess next.
What she can already see
By now, Jess can see the broad shape of things.
The organisation is not starting from zero. Very few do. Risk is already being managed, just not in a tidy or consistent way.
The CEO is carrying strategic risks in her head, because CEOs usually are. Finance has a pretty good feel for funding pressure, control weak points, and where the organisation would feel strain first. Operations knows where the near misses keep happening. IT has a mental list of exposures that nobody has really translated into Board language. And the Board is asking for better visibility, which usually means it can feel that the picture is incomplete even if nobody has named the gaps cleanly yet.
So there is capability here. Real capability.
The thing is, it is scattered.
And scattered capability is not the same as maturity.
That is one of the traps at this point. A decent policy, a few good instincts, and a manager who knows where the bodies are buried can make an organisation look more mature than it really is. Sometimes that holds together for years. Then someone leaves, a major decision gets made without the right conversation, or an incident lands in the wrong forum, and suddenly everybody realises how much of the system was running on memory and goodwill.
That is the gap Jess is trying to make visible.
This is not a vibe check
I wanted to be careful here because this step gets watered down all the time.
Jess is not wandering around the business collecting impressions and calling that a maturity assessment. She is not trying to get a general feel for whether risk management seems okay. She needs a structured way to test what is really there, how well it is working, and how consistently it is understood across the organisation.
At StartRisk, the way we do that is to use ISO 31000 as the backbone, then assess maturity through three practical lenses: principles, framework, and process. If you want the deeper methodology, I have written about it here: Risk Maturity Assessment for Small and Medium Businesses.
That matters because it stops the whole exercise turning into opinion.
Otherwise one executive says, “I think we’re doing okay.” Another says, “No, we’re miles off.” Someone points to a policy and says, “Well, we do have a framework.” None of that gets you very far.
A proper assessment asks better questions.
The first is whether risk management is actually creating and protecting value here, or whether it has drifted into a compliance side quest. That is the principles question. Does the organisation treat risk management as something that helps it make better decisions, protect the strategy, and avoid being blindsided? Or is it something people remember when the Board pack is due?
The second is whether there is a real framework behind the work, or just pieces of one. That is the framework question. Are roles clear? Is leadership actually engaged? Is there a shared understanding of how risk is governed, escalated, and reported? Or is most of that being assumed?
The third is whether the day-to-day process is systematic enough to rely on. That is the process question. Can people identify, assess, treat, monitor, and report risk in a way that is repeatable, or does the quality of the whole thing depend on who happens to be in the room that day?
Those three lenses tell you far more than a yes-or-no answer to “Do we have a policy and a register?”
Where the evidence actually comes from
If this article drifts too far into theory, it stops being useful. Jess is not writing a dissertation. She is trying to work out what she should do next month.
So she needs evidence.
Most of that evidence comes from three places.
First, she talks to people across the organisation
Not just the executive team. Across levels.
This is where maturity gaps often show themselves, because the differences between what leaders think is happening and what staff experience in practice can be enormous. A Board may believe it has good visibility. Executives may believe reporting is broadly fine. Managers may describe escalation as patchy. Frontline staff may tell you bad news gets softened on the way up.
That is not just a process problem. That is culture telling on itself.
So Jess needs to ask the sort of questions that expose consistency, or the lack of it. What are the biggest risks to achieving this year’s objectives? How are those risks discussed? What gets escalated and what does not? If something starts going wrong, how quickly does leadership hear about it? How clear is ownership? Where do people go when they are unsure?
She is not just listening for content. She is listening for alignment. When different parts of the organisation describe the same system in completely different ways, that tells you something important.
Then she reviews the documents
This is the obvious part, but it is not the whole job.
Jess will look at the risk register, framework documents, Board papers, committee minutes, an appetite statement if one exists, relevant policies, audit actions, and incident reports. Those artefacts tell her what the organisation says it is doing. They do not, by themselves, tell her whether any of it is alive.
That is the trap.
I have seen organisations with beautifully written documentation and very low maturity. The words are all there. The integration is not. Nobody is using the material to make better decisions. It is sitting in a folder performing seriousness.
So Jess is not asking, “Do we have documents?” She is asking whether they are clear, whether they are consistent, whether they line up with how people talk about risk in real conversations, and whether they show up when important decisions get made.
That is the harder test. It is also the only one worth doing.
Then she looks for operational proof
This is the part people skip when they are in a hurry, and it is usually the part that would have told them the truth.
You cannot assess maturity properly if you never look at how risk management actually shows up in the operating rhythm of the business. So Jess needs to look for evidence in the real world. How were recent incidents handled? What happened with audit findings? Where did escalation work well, and where did something sit too low for too long? How are key decisions recorded? How much of the control environment depends on memory, goodwill, or a long-serving manager who just knows how things work around here?
That is often where the real maturity level reveals itself.
Not in the framework diagram. In the operating rhythm.
Naming the current state matters
Once Jess has gathered that evidence, she needs a way to interpret it.
This is where a maturity model becomes useful. Not because every organisation needs a fancy scoring exercise, but because naming the current state stops everyone talking in vague generalities.
At StartRisk, we use a five-level model. At the bottom end, risk management is reactive, undocumented, and inconsistent. People deal with issues as they arise and there is no real structure holding it together. A step up from that, some framework elements exist, but they are patchy. A register may be there. Reporting may happen. It just is not consistent, and it is not well embedded. At the higher levels, the organisation has clear roles, strong alignment to recognised good practice, structured reporting, and risk management is visibly part of governance and decision-making.
That progression matters because Jess is not trying to answer, “Are we good or bad?” She is trying to answer, “Where are we now?” And then, just as importantly, “What level do we actually need next?”
That second question gets missed all the time.
More maturity is not always better
A good maturity assessment gives you something that is surprisingly valuable: restraint.
Not every 40-person NFP needs gold-plated enterprise risk machinery. Some absolutely need stronger governance, clearer escalation, and better reporting. Very few need a heavyweight layer of bureaucracy for its own sake.
That is why target maturity matters just as much as current maturity.
Jess is working in a real organisation, with real constraints. Time matters. Budget matters. Capability matters. Board appetite matters. Operational pressure matters. The goal is not maximum maturity. The goal is fit-for-purpose maturity.
Enough structure that the Board can govern with confidence. Enough clarity that management can act early. Enough consistency that the system survives staff turnover, growth, pressure, and scrutiny. But not so much machinery that people spend their week feeding the framework instead of running the business.
That balance is where the real judgement sits.
What Jess is likely to find
If I am honest, most organisations in Jess's position land somewhere between ad hoc and emerging. That is not a criticism. It is just common.
Usually there are decent people, decent instincts, and a few useful artefacts. What is missing is consistency across principles, framework, and process. The organisation cares about risk, but mostly when pressure is present. Leadership wants better visibility, but has not defined clearly what good visibility looks like. Managers are carrying real exposures, but ownership is more assumed than explicit. The risk register exists, but people do not fully trust it. Escalation happens, but not always early and not always cleanly. Board reporting touches risk, but does not yet give a clear line of sight.
That is usually the real starting point.
Not failure. Not chaos. Just an honest current state.
And that honesty is incredibly useful, because now Jess has something real to build from.
The output should be a roadmap, not a score
This part matters more than the label.
An assessment without a practical next step is just admin with better formatting. Jess does not need to walk away with a maturity score and nothing else. She needs to understand current maturity by element, what target maturity actually makes sense, where the biggest gaps are, what can be lifted quickly, and what has to happen first if the rest of the work is going to stick.
Sometimes the first job is governance clarity. Sometimes it is reporting. Sometimes it is risk ownership. Sometimes it is rebuilding the register because the current one is not usable. Sometimes it is all of those, but not all at once.
That sequencing is where the value sits.
Why this comes before the register
This is the point of the whole exercise.
Jess has not done this assessment so she can admire the diagnosis. She has done it because the next thing she builds has to match the organisation she actually has, not the one she wishes she had.
That is especially true for the risk register.
If she builds a register for a Level 3 or Level 4 environment when the organisation is still operating at an emerging level, it will collapse under its own weight. Too many fields. Too much precision. Too much reporting discipline assumed too early. On the other hand, if the Board genuinely needs stronger visibility and the organisation already has some decent operating discipline, then a very lightweight register can undershoot the need just as badly.
That is why maturity assessment comes first.
Not because it is theoretical. Because it changes what good enough should look like in practice.
Final thought
By this point in the story, Jess has done something that matters more than it might look from the outside. She has stopped confusing movement with progress.
Plenty of organisations respond to the handover of risk by producing artefacts at speed. A refreshed policy. A new register. A dashboard. A workshop. A demo of some shiny software. Motion everywhere.
Jess is doing something better than that. She is establishing a baseline properly. She is testing what is really there, how embedded it is, where the gaps are, and what level of maturity the organisation actually needs next. That work is not glamorous, but it is the reason the next step has a chance of being useful instead of performative.
And once she has that, she is finally ready to build something practical.
That is where we go next. Because the next challenge is not the assessment. It is turning that insight into a risk register that people will actually use.
If your organisation is at this point right now, this is exactly the kind of work we help with at StartRisk. Sometimes that means a structured maturity assessment with interviews, evidence-based scoring, and a clear uplift roadmap. Sometimes it means sitting down with management and the Board and working out what level of maturity you actually need, instead of chasing an enterprise model that was never built for your size or context. And sometimes it means putting the platform underneath the framework so the whole thing does not slide back into spreadsheets and good intentions.
If that sounds like your organisation, reach out. A conversation is usually the best place to start.