Part 3: Deploy, Measure & Improve

How I learn and adapt:

This is the part of the process where I often see the most gaps. It’s easy to focus all our energy on building the training, but if we don’t launch it well, measure the right things, or make space for iteration, we’re undermining the value of our work.


I make sure deployment is clear and user-friendly, measurement is planned from day one, and improvement is baked in – not squeezed in. I’ve built tools, templates, and habits that make it easier for teams to do this well and do it consistently.

Deploy:

Deployment is more than a tick-box exercise (e.g. a final QA review or go-live checklist), but it’s often treated like one. I look at it differently: the way we launch the learning is the user’s first experience of the learning – and that experience is crucial.


It needs to build trust, remove confusion, and set the tone for everything that follows. Therefore, I treat Deployment as a design and comms task, not just a technical one. That means I ensure there is:


- An LMS/VLE design that makes the experience easy to find, navigate, and return to

- A clear comms strategy that introduces the learning in a timely and engaging way (see the image on the right for an example of a comms strategy brainstorm)

- Templated emails and instructions that are simple, visual, and easy to follow

- A cadence that makes sense – launch timing that aligns with team capacity, rhythms, and wider company priorities

Example: Created a Learning Hub

Our LMS didn’t support blended learning in a way that made sense for learners. Content was scattered across emails, calendars, and platforms – making it hard for people to know what applied to them, when, or why.


To fix this, I designed and built a Learning Hub. The UX and structure were shaped by user interviews and expert input, resulting in a one-stop hub with all learning curricula, a training calendar, and – most importantly – role-specific pages that surfaced only the relevant training, news, and resources for each learner. This approach boosted completion rates by 20%.

Example of a Learning Hub

Example of a Role-Specific Hub Page

Building on the hub’s success, I expanded its use by adding full training programmes. As shown below, the pages go beyond basic access and contact information, clearly outlining the content, dates, time commitment, and all relevant links.


This made access easier for learners, improved collaboration with SMEs, and streamlined our workflows (saving approx. 40 hours of admin per launch).

Examples of Training Programmes hosted in the Learning Hub

Measure:

As you can see from the blended learning approach above, measuring success isn’t always straightforward – the data we need to consider is usually high in volume and complex to stitch together, as it comes from multiple sources.


Therefore, I like to align metrics with business and performance goals from the start, so we have a clear, shared understanding of what success looks like, how we’ll collect the data to support it, and when we’ll stop tracking.


This approach helps ensure I’m not just gathering data for the sake of it – but using it to inform decisions, iterate where needed, and prove impact in a way that’s grounded and practical.


Examples of data I have looked at and built into a measurement strategy:

- LMS data: Completions, drop-off points, time spent, quiz results

- Assessments: Test scores or pass marks for certifications

- Manager input: Coaching feedback from line managers

- Passive and active feedback: Surveys, in-platform ratings, direct learner comments

- External metrics: Behavioural or performance indicators from platforms like Salesforce and Gong


To bring this all together, I partnered with data experts to design dashboards that surfaced the most relevant insights, then created a simple how-to guide for managers to access data for their team members. I also built a template to share high-level insights with leadership, making it easier to connect learning data to business outcomes.
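To make the “stitching together” concrete, here is a minimal sketch of how data like this could be joined into a single view. The file names and columns are hypothetical placeholders for illustration, not our actual LMS or survey exports:

```python
# A minimal sketch of stitching learning data from multiple sources.
# All file names and column names are hypothetical examples, not the
# actual exports from our LMS, assessment, or survey tooling.
import pandas as pd

lms = pd.read_csv("lms_completions.csv")       # learner_id, module, completed_at, time_spent_min
scores = pd.read_csv("assessment_scores.csv")  # learner_id, module, score
surveys = pd.read_csv("survey_ratings.csv")    # learner_id, module, rating, comment

# Join on learner and module so each row is one learner's journey through one module
merged = (
    lms.merge(scores, on=["learner_id", "module"], how="left")
       .merge(surveys, on=["learner_id", "module"], how="left")
)

# Roll up to module level: completion volume, average score, rating, and time spent
summary = merged.groupby("module").agg(
    completions=("completed_at", "count"),
    avg_score=("score", "mean"),
    avg_rating=("rating", "mean"),
    avg_time_min=("time_spent_min", "mean"),
)
print(summary.sort_values("completions", ascending=False))
```

A rolled-up view like this is the kind of thing the dashboards surfaced, so managers didn’t have to do the joining themselves.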

Improve:

Continuous improvement is a fundamental part of how I work and what I value. Every project gives us insight into what worked, what didn’t, and what could be done better next time. But without structure, those insights often get lost in the rush to move on. To make improvement sustainable, I’ve created:


- Retrospective templates to guide honest, constructive team reflections

- A framework for identifying what to improve, when to improve it, and how to prioritise competing needs

- Simple project management tactics to schedule and deliver improvements without disrupting live experiences


I consider this more of an operational process, so I won’t go into the detail of every tactic or workflow here. But I will share an example of a retrospective to show how I turn reflection into something tangible and actionable.

How Did We Do?

An example of an internal retro detailing the challenges, what went well, and what we’d like to see improved for the next launch project.

Top 3 Shifts

Following a prioritisation exercise and discussion, we came up with the top 3 ‘shifts’ we’d like to see and did some scoping on how we’d make each change (e.g. mapping critical dependencies and identifying stakeholders).
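To give a flavour of the prioritisation logic without detailing the full framework, here is a toy sketch – the shifts, scores, and scoring rule are invented for illustration, not the real outputs of our retro:

```python
# A toy impact-vs-effort prioritisation. The shifts and scores below are
# made up for illustration; the real exercise was a facilitated team
# discussion, not a script.
shifts = [
    {"name": "Earlier SME sign-off", "impact": 8, "effort": 3},
    {"name": "Automated launch emails", "impact": 6, "effort": 2},
    {"name": "Single source of truth for dates", "impact": 9, "effort": 5},
    {"name": "Rework quiz question bank", "impact": 4, "effort": 6},
]

# Rank by impact per unit of effort, then keep the top 3 shifts
for shift in shifts:
    shift["priority"] = shift["impact"] / shift["effort"]

top_three = sorted(shifts, key=lambda s: s["priority"], reverse=True)[:3]
for s in top_three:
    print(f'{s["name"]}: priority {s["priority"]:.1f}')
```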