Reflections on Embedded Health Workforce Research & Evaluation
By Caroline Chamberland-Rowe | October 2025
I have now been working as an embedded scientist within Nova Scotia Health for almost four years, first as a Health Outcomes Scientist, and now as the Scientific Director overseeing research and evaluation activities associated with the organization’s health workforce initiatives. In this blog, I reflect on some of the lessons I have learned using embedded research and evaluation to study (and support) workforce strengthening in a province that is actively pursuing health system transformation.
1. A critical challenge we face when evaluating workforce interventions lies in our (in)ability to isolate the true impact of a single intervention in an environment where so many complex interventions and contextual changes are occurring simultaneously. My experience in this space has highlighted the importance of 1) complementing quantitative key performance indicators with direct qualitative engagement with care teams, patients and families, and leaders to learn from their experiences, expertise, perceptions, and insights; 2) measuring the combined performance of the suite of interventions at the system level, while evaluating the implementation and impact of individual interventions at the micro and meso levels to inform their refinement; and 3) allowing the data gaps highlighted by evaluation limitations to spur data infrastructure strengthening and progressively build organizational capacity for robust health workforce research, evaluation, and planning.
2. Despite significant gains, the full scope of our system’s health workforce challenges will not be addressed overnight. As such, it is essential that we celebrate the smaller victories along the way. In the space of monitoring and evaluation, this means complementing long-term goals with short-term targets. These targets not only provide more appropriate benchmarks for gauging and recognizing the significant progress being made across the system, but also serve as a communication tool to establish clear, achievable expectations with partners and leaders.
3. Beyond recognizing progress, adopting a learning health system approach means not shying away from reporting on the challenges experienced, but rather contextualizing them. I often prompt team members to keep asking “Why?” until we have uncovered the system problems rather than the people problems. This not only encourages comprehensive and balanced findings but also promotes the inclusion of all perspectives, improving the acceptability of findings for all those involved in responsive actions and associated change management processes. Beyond Nova Scotia Health, transparent and contextualized reporting has increased the relevance, relatability, transferability, and utility of our evaluation findings for knowledge users, and has encouraged crucial conversations about current health workforce challenges shared across organizations and jurisdictions.
4. When completed as an integrated component of a learning cycle, embedded research and evaluation can be leveraged as a workforce intervention in and of itself, serving as a process of engagement through which local care teams, patient and family partners, and system-level leaders alike can have their voices heard. To realize the full potential of this engagement and to sustain employee buy-in for research and evaluation activities, however, knowledge user partners must have the readiness and capacity to respond to the feedback collected.
5. Embedded science has a crucial role to play in moving the needle on health workforce evidence. Co-production comes naturally when knowledge users are your colleagues and system partners who invite you to their working meetings and ask for your assistance in identifying emerging workforce challenges, designing responsive solutions, and integrating best evidence in real time. If health system organizations, which are uniquely positioned to report comprehensively on the breadth and depth of the implementation of health workforce interventions, invest in sustainable, embedded scientific support and publicly report on their evaluation activities, our field will have the opportunity to learn not only from the featured success stories, but also from their experiences of adaptation when interventions do not go as planned.
6. The pace and impetus for research are fundamentally different in an embedded role than in an academic setting. Pragmatic approaches are needed to generate and mobilize rigorous evidence at the speed required to match the pace of action within health systems. This is made feasible, however, by the data infrastructure and access afforded to embedded scientists as internal members of the organization they are supporting. While external funding agencies have developed targeted opportunities to support embedded, impact-oriented, and co-produced research, misalignments remain in the timelines associated with these opportunities (i.e., when a need for evidence is identified, embedded researchers cannot wait six or more months before evidence generation is even initiated) and in reviewers’ understanding of the embedded research context (e.g., concerns about the feasibility of projects that reflect the data collection and acquisition timelines faced by scientists external, rather than internal, to the system). Furthermore, the impetus for research in this type of role will always be system-driven rather than investigator-driven. This requires embedded scientists to reorient their sources of job satisfaction and professional fulfillment toward their ability to directly support knowledge users with the evidence they require to advance health workforce strengthening and broader health system objectives.
Four years in, I am fulfilled in my embedded science role, and I am excited about the opportunities that embedded science presents for supporting the development of learning health workforce systems.