Cailie S. McGuire1, Kelsey Saizew1, Alex J. Benson2, Jean Côté1, Karl Erickson3, Alex Maw1, Alex Murata1, Mitch C. Profeit1, Meredith Wolff4, Brandy Ladd4, & Luc J. Martin1
1 Queen’s University, 28 Division St., Kingston, Canada, K7L 3N6
2 Western University, 1151 Richmond St., London, Canada, N6G 2V4
3 York University, 4700 Keele St., Toronto, Canada, M3J 1P3
4 Ladd Foundation, Toronto, Canada
Citation:
McGuire, C.S., Saizew, K., Benson, A.J., Côté, J., Erickson, K., Maw, A., Murata, A., Profeit, M.C., Wolff, M., Ladd, B., & Martin, L.J. (2025). A RE-AIM evaluation of the 1616 sport-based positive youth development program. Journal of Sport for Development. Retrieved from https://jsfd.org/
ABSTRACT
This study used the RE-AIM framework to evaluate the full-scale implementation of a sport-based positive youth development (PYD) program—the 1616 Program. The 16-week program was delivered to 88 ice hockey teams from North America, whose athletes were introduced to PYD principles via storytelling by professional ice hockey players. Quantitative (retrospective pretest-posttest questionnaires [RPP]) and qualitative (e.g., focus groups) methods were used to collect outcome and process data, which were subsequently mapped onto each RE-AIM dimension. Reach – In total, over 1400 youth were registered in the program (participants were primarily boys who self-identified as White). Effectiveness – Although few significant pretest-posttest changes were observed, within-program RPP evaluations completed by a subsample of participants (n = 111) demonstrated significant changes in multiple dimensions of connection, confidence, and character. Adoption – Whereas the majority of participants watched the program videos and completed the reflection activities, a 78% attrition rate for questionnaire completion was observed from pretest (n = 727) to posttest (n = 161). Implementation – Participants described having fun throughout the program and thought the content was relatable to their lives. Maintenance – Facilitators of program participation included the online delivery, while the length of the athlete surveys served as a barrier. This evaluation will inform future iterations, and we put forth recommendations for similar program evaluation initiatives.
INTRODUCTION
Although sport participation can serve as an avenue to foster positive youth development (PYD; e.g., enhanced psychological, social, and physical well-being), mere involvement does not guarantee such outcomes (Côté & Fraser-Thomas, 2016; Eime et al., 2013). As such, sport programs should be designed to be developmentally appropriate and culturally relevant (Luguetti et al., 2022), facilitate quality social dynamics (e.g., coach-athlete relationship), and enable athletes to engage in meaningful activities that promote enjoyment and interest (Côté et al., 2020). In doing so, engagement in sport can promote short- (e.g., enhanced competence, confidence) and long-term outcomes (e.g., enhanced participation, personal development; Côté et al., 2016). By intentionally attending to these factors and deliberately providing sport programs grounded in PYD principles, researchers and practitioners can limit the challenges associated with adult-centric, professionalized youth sport models (e.g., injury, burnout; Bergeron et al., 2015; Coakley, 2011; Erdal, 2018).
With increased calls to develop evidence-informed sport programming that is intended to support the achievement of PYD (Bean & Forneris, 2016; Bergeron et al., 2015), it is important to determine whether these programs are promoting their anticipated outcomes (Gould, 2019; Holt et al., 2016). Program evaluation—including assessing relevant processes and outcomes—is critical to sport program development and implementation (Allan et al., 2024; Hummell et al., 2023; Shaikh et al., 2020). More specifically, program evaluation can provide both researchers and practitioners with insight regarding the degree to which a program is achieving its intended or potentially unintended outcomes, as well as how (e.g., program components) and why (e.g., program strengths and areas for improvement) those outcomes are being achieved (Patton, 2018).
The RE-AIM framework is an evaluation model that can guide researchers and practitioners through program assessment (Glasgow et al., 1999; Glasgow et al., 2019). This framework consists of five evaluative dimensions that assess both the implementation (i.e., process) and impact (i.e., outcome) of a program: (a) Reach (i.e., number, proportion, and representativeness of individuals willing to participate), (b) Effectiveness (i.e., positive and negative effects of intervention on outcomes of interest), (c) Adoption (i.e., percentage and representativeness of organizational adoption influenced by factors such as cost, resources, and location), (d) Implementation (i.e., cost, quality, and consistency of intervention delivery, and adaptations made), and (e) Maintenance (i.e., sustainability—long-term individual behaviour change; the extent to which organizations institutionalize the program).
Whereas the RE-AIM framework originated in the fields of public health and behaviour change research, it is well suited for use across a variety of physical activity settings given its numerous transferable strengths, including the consideration of individual and organizational factors, as well as the balanced approach to examining both efficacy and effectiveness (Gaglio et al., 2013). Indeed, the framework has been employed across a range of diverse contexts (e.g., community-based interventions; Jung et al., 2018) and with specific populations (e.g., Indigenous Peoples; Baillie et al., 2017). Recently, RE-AIM has been adapted for the purpose of sport program evaluation by several research groups (e.g., Lawrason et al., 2021; Saizew et al., 2022; Shaikh & Forneris, 2023). As an example, Lawrason and colleagues (2021) operationalized the RE-AIM indicators specific for multisport service organizations due to the framework’s applicability to sport, and in recognition of the broader need for evaluation tools that are both evidence-informed and practical within the sporting space. These authors created a template that provides researchers and practitioners with a tailored and pragmatic tool to conduct comprehensive, sport-based program evaluations.
Current Research Context: The 1616 PYD Program
Given sustained calls to conduct research with knowledge users rather than on or for them (Leggat et al., 2023; Smith et al., 2023)—while also considering the many benefits associated with conducting program evaluations—we (i.e., the research team) collaborated with a non-profit organization to co-create, implement, and assess a unique PYD program. The 1616 Program is a free online story-based PYD program targeted to youth ice hockey players aged 10 to 12 years across North America. This program is the primary initiative advanced by the Ladd Foundation, instigated by Andrew Ladd—a former National Hockey League (NHL) player—alongside his partner Brandy Ladd. Through the adoption of an integrated knowledge translation (iKT) approach (Graham et al., 2006), the 1616 Program was co-created by an interdisciplinary team (e.g., researchers, the Ladd Foundation, media specialists, story-telling experts) to provide free, accessible, and adaptive messaging to youth within the ice hockey context. Of note, the name of this program stems from Andrew’s playing number (i.e., 16), and 1616 was the year the term ‘buffalo’ was coined for the American Bison. The theme of the 1616 Program is to develop a ‘buffalo mindset’, as buffalo band together and move through a storm with their herd to overcome challenges more effectively. An in-depth description of the partnership process and the development and design of the 1616 Program can be found elsewhere (Martin et al., 2023).
The program is delivered online through interactive videos featuring ice hockey athletes from the highest levels of competition (i.e., Professional Women’s Hockey League, NHL, Olympics). These athletes, serving as role models, share personal stories and anecdotes about evidence-informed PYD topics to inspire program participants, their parents/guardians, and coaches. Importantly, 1616 is grounded in the Personal Assets Framework (Côté et al., 2020), which suggests that through the optimal interaction of three dynamic elements (i.e., personal engagement in activities, quality social dynamics, and appropriate settings), sport participation can result in both positive short-term (i.e., the 4Cs: competence, confidence, connection, and character) and long-term outcomes (i.e., the 3Ps: participation, performance, and personal development).
As a brief overview of program content and delivery, a concept aligning with the 4Cs of PYD is introduced weekly through a short video (~five minutes) from an ice hockey role model, delivered to each athlete, their parent/guardian, and coach via the 1616 online platform. Through story-telling, the ice hockey role model discusses their connection to the week’s concept. Following each video, athletes are given pre-developed reflection and action-based activities (e.g., ‘Live it Outs’) through the online platform to reinforce and consolidate their understanding of the weekly concept. Parents/guardians and coaches are also given pre-developed resources (e.g., tip sheets, videos) to support their athlete. These resources were developed collaboratively between the Ladd Foundation, the research team, and the creative committee (i.e., digital media content specialists: Anthem Creative [https://anthemcreative.ca/], Banner [https://www.banner.tv/], and The Post Game [http://www.thepostgame.com/]). Although simultaneously including athletes, parents/guardians, and coaches is a novel programmatic feature for PYD sport programs, this study reviews only the data collected for the athlete component of the program. Examples of program resources are available at the following link (McGuire et al., 2025): https://osf.io/2e3vr.
The 1616 Program underwent initial proof-of-concept testing during the 2021/2022 season to assess its preliminary impact and effectiveness (Côté et al., 2023). This condensed 5-week evaluation was conducted with 11 ice hockey teams (n = 160 youth, 93 parents/guardians, and 11 coaches). It included both quantitative (e.g., retrospective pretest-posttest [RPP] questionnaires) and qualitative (e.g., focus group sessions) outcome and process assessments. Overall, the results of the evaluation demonstrated that the program positively impacted the intended PYD outcomes (i.e., 4Cs) and that engaging in the program was deemed valuable and enjoyable for participants (i.e., athletes, parents/guardians, and coaches). Given the purpose of proof-of-concept testing, continued program development, full-scale implementation, and a more robust evaluation were justified.
Purpose: Full-Scale Implementation and Evaluation of the 1616 Program
In October 2022, the 1616 Program launched its inaugural season across North America. Over 16 weeks, athletes, their parents/guardians, and coaches were exposed to important PYD concepts using a role model story-telling approach. Within the remainder of the article, we (a) detail the RE-AIM indicators used for the program evaluation of the athlete segment, (b) present the results of the evaluation, and (c) discuss implications for improving and refining future iterations of this program and for youth sport programming more generally.
METHODS
This study adopted a partnership-based pragmatic approach to program evaluation. Pragmatism is a research philosophy of knowledge construction that focuses on the development of solutions for practical problems within a specific context (Giacobbi et al., 2005). Unlike positivism (which seeks to determine which version of the truth is more accurate) or constructivism (which examines the existence of multiple realities), pragmatism considers the practical concerns of human lived experience. As such, we aimed to work collaboratively with invested partners to (a) understand participant experiences with the current offering and (b) develop practical, relevant, and feasible recommendations to enhance future iterations of the program (Giacobbi et al., 2005). In alignment with this approach, we used a mixed-method design to comprehensively evaluate relevant indicators within the RE-AIM dimensions (Holtrop et al., 2018).
Participants and Procedure
Following institutional research ethics approval, youth ice hockey teams across North America were recruited to participate in the full-scale implementation of the 1616 Program. Through purposeful (e.g., existing ice hockey partnerships), snowball, and word-of-mouth (e.g., via 1616 sponsors) sampling techniques, coaches of youth ice hockey teams were contacted directly (e.g., via e-mail) and invited to participate in the program. If interested, coaches enrolled their teams and subsequent messaging was delivered to the parents/guardians of the athletes. To participate, athletes had to be between the ages of 8 and 14 years and registered with an ice hockey team located in North America. Parental/guardian consent and athlete assent were obtained through an online survey platform (i.e., Qualtrics) prior to participating in the program, completing the study questionnaires, and/or engaging in a post-program interview. For a breakdown of participant demographic data, please see the Reach section of the Results.
Data Collection and Measures
The program spanned October 2022 to February 2023. Indicators for the RE-AIM dimensions and corresponding modes of data collection (Table 1) were collaboratively determined alongside the 1616 Program partners based on relevance (e.g., program needs) and feasibility (e.g., available resources). The entire program took place via the 1616 online platform, with videos released weekly alongside supplemental activities and resources. Once released, athletes accessed the material independently at their own convenience. The research team was responsible for collecting quantitative (e.g., questionnaires) and qualitative (e.g., focus groups) data before, during (i.e., at three time points), and after the 16 weeks via Qualtrics and Zoom to provide a robust assessment of program efficacy and effectiveness (Holtrop et al., 2018).
Table 1 – The Operationalized RE-AIM Indicators for the Full Evaluation of the 1616 Program
| Indicator | Definition | Modes of Data Collection |
| --- | --- | --- |
| Reach: The number, proportion, and representativeness of individuals who engaged in the 1616 Program | | |
| Indirect Reach | How 1616 indirectly reaches the target population | Total number of followers on social media platforms (Instagram, X) |
| Intended Reach | How 1616 intends to reach its target population | Total number of subscribers (via SMS text message, e-mail, or both) |
| Direct Reach | How 1616 directly reaches the target population | Number of end-users (e.g., teams, athletes) who participated in 1616 |
| Description of End-Users | Demographic characteristics of the 1616 target population | Characteristics including number of years in sport, geographic location, gender, age, and race |
| Effectiveness: Positive and negative outcomes of the 1616 Program | | |
| Member Belief in Effectiveness | Whether 1616 members believe 1616 helped them improve based on program goals | Percentage of members believing 1616 improves initiative goals in members (e.g., through questionnaires and/or interviews) |
| Other Benefits for Members | Whether 1616 members believe the initiative has additional benefits for members (e.g., outside of hockey) | Percentage of members believing 1616 has additional benefits for members (e.g., through questionnaires and/or interviews) |
| Member Outcomes | The effectiveness of 1616 programming for short-term member outcomes | Definitions of effectiveness for 1616 members (4Cs, well-being/enjoyment, sport commitment) |
| Negative Outcomes | Negative outcomes associated with 1616 membership | Post-program interviews to elicit feedback on program limitations/weaknesses |
| Adoption: The number, proportion, and representativeness of members who have agreed to participate in the 1616 Program | | |
| Adherence | Extent to which 1616 adheres to the initiative goals | Questionnaires assessing athlete engagement with program content/resources; post-program interviews; total video views and devices used to watch videos |
| Attrition Rate | The rate at which 1616 members are no longer participating | Percent attrition for the initiative to date |
| Implementation: Extent to which the 1616 Program was delivered as intended | | |
| Compatibility | Evaluate actual implementation of 1616 compared to the initiative guidelines | Interview participants about intended outcomes and how they engaged with the program |
| Delivery | Measure the skills used by 1616 implementers to deliver initiative goals | Conduct interviews with program participants about implementation of program goals |
| Contact | Count the number of direct contacts 1616 has with members | Count total number of weekly resources sent out to end-users |
| Maintenance: Degree that the 1616 Program is sustained, and participant outcomes are maintained over time | | |
| Facilitators and Barriers | Members’ experiences with 1616 and any factors that promote or inhibit their ability to participate | Examine the personal facilitators and barriers to being an initiative member (post-program interviews) |

Note: This table is adapted from the RE-AIM template developed by Lawrason et al. (2021).
Quantitative Data
All athletes were asked to complete pre- and post-program questionnaires (~20-25 minutes in length). At the beginning of Week 1, general well-being/enjoyment, connection, confidence, and character were assessed via existing validated questionnaires (see https://osf.io/2e3vr; McGuire et al., 2025). Of note, ‘competence’ was not assessed via questionnaire, as it was targeted across the entirety of the program as a typical objective for youth ice hockey teams. Demographic information (e.g., age, race, team location, years of hockey experience) was also collected to contextualize participant responses and to link pre- and post-program data. The same questions were asked post-program with the addition of items targeting program processes (e.g., delivery) spanning (a) the general program experience (e.g., how many videos were watched), (b) the 3Ps (e.g., whether athletes learned something about themselves), and (c) program production and implementation (e.g., whether athletes spoke to their teammates about the program). In total, 727 athletes responded to the pre-program questionnaire and 161 responded to the post-program questionnaire.
A subset of teams (n = 12; 111 athletes)—termed the ‘super-user’ group—was randomly identified and asked to complete three additional questionnaires, one after the completion of each ‘C’ component (every ~four weeks). This decision was shaped by results of the pilot testing (e.g., survey completion rates) and feasibility considerations (e.g., resources required to collect data at multiple time-points). For these time-points, the Cs (i.e., connection, confidence, character) were assessed using the same pre-program items but in RPP format. RPP has participants rate items at a single posttest time-point based on two instances, ‘before’ and ‘now’. This format more accurately gauges the perceived degree of change between two time-points while being less time consuming for the participant (Little et al., 2020). Accordingly, athletes were asked to respond to each item based on how they felt ‘before’ being involved with the 1616 Program and ‘now’ after having completed each ‘C’ section. For each C, program process questions were also asked (e.g., ‘How many [connection/confidence/character] videos did you watch?’). The complete RPP questionnaires can be found here: https://osf.io/2e3vr (McGuire et al., 2025). Youth responses to the RPP connection (n = 77), confidence (n = 48), and character (n = 26) questionnaires varied across time.
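The RPP scoring described above can be sketched as follows. This is a minimal illustration, assuming a simple list-of-dictionaries structure for responses; the item values are invented and are not the study’s actual data or instruments:

```python
# Hypothetical RPP responses for one questionnaire item: each athlete rates
# the same item twice at a single posttest sitting -- how they felt 'before'
# the program and how they feel 'now'. Values are illustrative only.
rpp_responses = [
    {"before": 3, "now": 4},
    {"before": 4, "now": 4},
    {"before": 2, "now": 4},
    {"before": 3, "now": 5},
]

# Perceived change per athlete is simply now - before; the mean of these
# differences corresponds to the mean difference (Mdiff) reported per measure.
changes = [r["now"] - r["before"] for r in rpp_responses]
mean_change = sum(changes) / len(changes)  # → 1.25 for these invented values
```

Because both ratings are collected at the same sitting, no linking of separate pretest and posttest records is needed, which is part of what makes the format less burdensome for participants.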
Qualitative Data
Eleven athletes (n = 9 boys, 2 girls) were recruited via convenience sampling to participate in a focus group (n = 3 athletes) or individual semi-structured interviews (n = 8) online via Zoom. These interviews took place between February and March 2023 and aimed to explore participants’ experiences after having completed the program. The interview guide was developed based on the goals of the 1616 Program and PYD/evaluation literature (e.g., ‘What have you learned from the program?’; ‘What are your thoughts on how the program was delivered?’; Patton, 2014) and corresponded with the pre-determined RE-AIM indicators. The interview guide can be found here: https://osf.io/2e3vr (McGuire et al., 2025).
Research Team
Given that multiple partners were involved in the creation and implementation of the program (e.g., Ladd Foundation, research team), it is worth clarifying who was responsible for the primary data collection. The research team comprised nine sport psychology researchers with a range of experience (e.g., master’s and doctoral students, Ph.D. researchers) and expertise (e.g., qualitative and quantitative methods). CM, KS, AMa, AMu, and MP were responsible for data collection and the qualitative analysis, whereas AB conducted the quantitative analysis. CM, KS, AMa, and MP are team dynamics researchers, and AMu examines coach-athlete and parent-athlete dynamics. LM and AB are team dynamics researchers, and JC and KE are experts in PYD through sport and coaching—all of whom oversaw the data collection and analysis processes. Altogether, interpretations of the data were shaped by team dynamics, PYD, coaching, and leadership theory/research, as well as ongoing conversations between the entire research team and Program partners.
Analysis
Quantitative data from the demographic, pre/post-program, and the three RPP questionnaires were analyzed using R statistical software (v4.2.2; R Core Team, 2022). Descriptive and frequency statistics (e.g., means, standard deviations) were calculated. Paired-samples t-tests were conducted to assess within-person change for the pre/post-program data and the three RPP responses. Given the pragmatic orientation of the evaluation, frequency of responses, means, and percentages of favourable responses were calculated and reported as an indication of program quality.
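As a rough sketch of the paired-samples analysis described above (the study itself used R; this Python version with invented scores is only illustrative), the mean difference and t-statistic for matched scores can be computed as:

```python
import math
import statistics

def paired_t(pre, post):
    """Paired-samples t-test on matched scores from the same athletes.
    Returns (mean difference, t-statistic, degrees of freedom)."""
    diffs = [b - a for a, b in zip(pre, post)]
    n = len(diffs)
    mean_d = statistics.mean(diffs)
    sd_d = statistics.stdev(diffs)  # sample SD of the difference scores
    t = mean_d / (sd_d / math.sqrt(n))
    return mean_d, t, n - 1

# Invented 5-point-scale scores for eight athletes (not study data)
pre = [3.0, 3.5, 4.0, 3.0, 2.5, 4.5, 3.5, 3.0]
post = [3.5, 4.0, 4.0, 3.5, 3.0, 4.5, 4.0, 3.5]
mean_d, t_stat, df = paired_t(pre, post)
```

The resulting t-statistic would then be compared against a t-distribution with n − 1 degrees of freedom to obtain p-values and confidence intervals of the kind reported in Tables 3 and 5.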
Interviews were audio/video recorded and transcribed verbatim. Using the Word comment/highlight functions, interviews were first inductively analyzed using reflexive thematic analysis (Braun & Clarke, 2021), focusing on intended program outcomes (e.g., impact on the 3Cs) and processes (e.g., perceived strengths and limitations of the program). Results were then deductively mapped onto the pre-determined RE-AIM dimensions and corresponding indicators. Representative quotes are presented with the participants’ assigned numbers (e.g., Athlete 4 = A4).
RESULTS
The Results are organized by the five RE-AIM dimensions and their indicators supported by quantitative and qualitative data (see Table 1 for the definition of each dimension and its indicators). A summary of key findings from each dimension is provided at the end of each section.
Reach
Reach was assessed using quantitative demographic data and web-based information. We computed indirect, direct, and intended reach as well as a description of end-users (Table 1). Pertaining to the latter, while the target population of the 1616 Program was youth aged 10 to 12 years, we allowed youth between the ages of 8 and 14 to enroll to enhance reach. In relation to indirect reach, 1616 had 1200 followers on Instagram and 43 followers on X (formerly Twitter) at the time of the study. Pertaining to direct reach, over 1400 athletes from 88 teams were enrolled in the program. A total of 727 athletes completed pretest measures. Of the 727 athletes (91% self-identifying boys), 79 athletes belonged to a girls’ team, 393 to a boys’ team, and 243 to a mixed team (n = 12 did not report; Table 2). The majority of athletes self-identified their race as White (83.4%) and fell within the target age range (10 = 27%, 11 = 34%, 12 = 24.8%). Seventy-eight percent of athletes were Canadian, while 22% were American.
The 111 ‘super-user’ athletes (72% self-identifying boys) had similar demographic trends, with the majority of athletes belonging to a boys’ team (n = 56; 50%), followed by mixed (n = 32; 29%), and a girls’ team (n = 12; 11%). Eleven athletes did not report (10%). Athletes primarily self-identified as White (68.5%), and most were 10 (24.3%), 11 (47.7%), and 12 years old (18.9%). Pertaining to the post-program interviews, athletes were 9 (n = 3), 11 (n = 5), and 12 years old (n = 3), with 36% being Canadian and 64% American. With regard to intended reach, parents/guardians registered their youth athletes to receive program communications via a text message, e-mail, or both. In total, 1,015 athletes were subscribed to texting and 1,191 to e-mail notifications. Key message: Whereas the 1616 Program successfully reached the target population (i.e., youth ice hockey players across North America), participants were primarily boys who self-identified as White. As such, purposeful restructuring and recruitment efforts are required to enhance accessibility of the program and develop content that is relevant and meaningful for diverse populations (see Discussion).
Table 2 – Participant Demographic Data
| | Sample (n = 727) | ‘Super-user’ Subsample (n = 111) |
| --- | --- | --- |
| Self-identified gender | Boy 91%; Girl 9% | Boy 72%; Girl 28% |
| Racial identity | White 83.4%; Multiracial 4.8%; Asian (Eastern) 3.7%; Preferred not to disclose 3.4%; Indigenous 1.9%; Hispanic or Latinx 0.6%; Black or African American 0.4%; Asian (Indian) 0.4%; Did not respond 1.4% | White 68.5%; Multiracial 4.5%; Asian (Eastern) 3.6%; Preferred not to disclose 2.7%; Hispanic or Latinx 1.8%; Indigenous 0.9%; Did not respond 18% |
| Age (years) | 8 = 1.5%; 9 = 12.1%; 10 = 27.0%; 11 = 34.0%; 12 = 24.8%; 13 = 0.6% | 8 = 1.0%; 9 = 6.3%; 10 = 24.3%; 11 = 47.7%; 12 = 18.9%; 13 = 1.8% |
| Nationality | Canadian 78%; American 22% | Canadian 60%; American 40% |
Effectiveness
Effectiveness was measured using athletes’ subjective perceptions of program benefits (i.e., beliefs in effectiveness via post-program interviews) and quantitative assessments of change associated with participating in the program (i.e., pretest-posttest and RPP questionnaires).
Pretest-Posttest Outcome and Process Scores
Table 3 provides the mean difference, t-test statistic, p-value, and confidence interval (CI) for questionnaire responses from pretest to posttest. Although most measures remained unchanged, perceptions of closeness to the coach (Mdiff: -0.13, p < 0.05, CI: [-0.22, -0.04]) and sport enjoyment (Mdiff: -0.10, p < 0.05, CI: [-0.17, -0.03]) decreased significantly from pretest to posttest, while moral values increased significantly (Mdiff: 0.15, p < 0.05, CI: [0.01, 0.30]).
Table 3 – Youth Mean Difference, t-statistic, p-value, Confidence Interval for Pre-Post responses
| Variable Name | Mean Difference | t-statistic | p-value | 95% CI |
| --- | --- | --- | --- | --- |
| Emotional wellbeing | 0.02 | 0.22 | .833 | [-0.12, 0.15] |
| Psychological wellbeing | 0.07 | 1.31 | .194 | [-0.04, 0.19] |
| Sport enjoyment | -0.10 | 2.74 | .007** | [-0.17, -0.03] |
| Ingroup ties | 0.03 | 0.40 | .691 | [-0.11, 0.17] |
| Cognitive centrality | 0.17 | 1.65 | .101 | [-0.03, 0.36] |
| Ingroup affect | -0.09 | 1.59 | .114 | [-0.20, 0.02] |
| Coach closeness | -0.13 | 2.79 | .006** | [-0.22, -0.04] |
| Positive parental involvement | 0.01 | 0.25 | .801 | [-0.07, 0.09] |
| Sport self-confidence | 0.01 | 0.16 | .873 | [-0.09, 0.11] |
| Task-orientation | 0.04 | 1.02 | .308 | [-0.04, 0.13] |
| Competitive excitement | -0.07 | 1.26 | .211 | [-0.18, 0.04] |
| Mental toughness | 0.04 | 0.88 | .383 | [-0.06, 0.16] |
| Sportspersonship | 0.01 | 0.42 | .673 | [-0.05, 0.07] |
| Moral values | 0.15 | 2.15 | .033* | [0.01, 0.30] |

Note: * p < .05, ** p < .01.
As shown in Table 4 for the post-program evaluation, the majority of athletes responded with a four (yes) or five (very much) on a 5-point scale when asked whether they had fun (78%), learned something about themselves (85%), and improved at hockey (75%). In addition, 76% responded with a 4 or 5 indicating they would recommend the program to a friend.
Table 4 – Question, Mean (SD) and Percentage of Responses 4 and Above for Youth Post-program Process Questionnaires
| Question Type | Question | # of responses | Mean/5 (SD) | % of Response 4 or 5 |
| --- | --- | --- | --- | --- |
| General Experience Questions | 1. Did you have fun? | 152 | 4.14 (1.05) | 78% |
| | 2. Did you learn something about yourself? | 149 | 4.26 (1.00) | 85% |
| | 3. Do you think you improved at hockey? | 150 | 4.01 (1.09) | 75% |
| | 4. If a friend asked about the 1616 Program, would you recommend it? | 151 | 4.21 (1.11) | 76% |
| | 5. If you had the chance, would you do the 1616 Program again? | 151 | 4.20 (1.18) | 79% |
| | 6. Now that the program is done, do you think you will keep doing the things you learned? | 151 | 4.35 (1.01) | 83% |
| | 7. Overall, how excited were you to take part in the 1616 Program every week? | 155 | 3.83 (1.12) | 66% |
| | 8. Overall, how enjoyable were the videos? | 154 | 4.19 (0.96) | 82% |
| | 9. Overall, how relevant were the stories to your life? | 154 | 4.06 (0.95) | 77% |
| | 10. How much did the reflection items help you understand the main messages of the stories? | 152 | 4.18 (0.96) | 80% |
| | 11. How much did the ‘live it outs’ help you develop a ‘buffalo mindset’? | 153 | 4.06 (1.13) | 75% |

| Question Type | Question | # of responses | Mean/16 (SD) | % of Response 4 or 5 |
| --- | --- | --- | --- | --- |
| Process/Engagement Questions | 1. How many of the 16 videos have you watched? | 150 | 13.29 (4.19) | 69% |
| | 2. How many of the reflection questions did you listen to? | 149 | 12.88 (4.57) | 67% |
| | 3. How many of the ‘live it out’ videos did you watch? | 148 | 12.98 (4.41) | 67% |
| | 4. How many of the ‘live it out’ tasks did you complete? | 143 | 11.00 (5.00) | 45% |

| Question Type | Question | # of responses | Mean/5 (SD) | % of Response 4 or 5 |
| --- | --- | --- | --- | --- |
| Production/Implementation Questions | 1. Did you talk to your teammates about the 1616 Program? | 153 | 3.12 (1.38) | 44% |
| | 2. Did you talk to your parents/guardians about the 1616 Program? | 152 | 4.12 (1.18) | 80% |
| | 3. Do you think being part of the program was cool? | 151 | 4.07 (1.16) | 74% |
| | 4. Did you like hearing from the professional athletes in the program? | 152 | 4.60 (0.85) | 91% |
| | 5. What do you think about the amount of stuff that we shared? (Too much, not enough, just right) | 152 | Too much = 12%; Just right = 88% | |
| | 6. How was the length of the weekly player videos? (Too long, too short, just right) | 151 | Too long = 11%; Too short = 7%; Just right = 81% | |
3Cs RPP Outcome and Process Scores (beliefs in effectiveness/participant outcomes)
Table 5 provides the mean difference, t-test statistic, p-value, and confidence interval (CI) for the RPP measures and corresponding process scores for the super-user group. After the Connection segment, athletes reported a significant increase in perceptions of ingroup ties (Mdiff: 0.37, p < 0.001, CI: [0.21, 0.52]) and cognitive centrality (Mdiff: 0.24, p < 0.001, CI: [0.12, 0.36]). After the Confidence segment, athletes reported a significant increase in sport self-confidence (Mdiff: 0.46, p < 0.001, CI: [0.31, 0.60]), task orientation (Mdiff: 0.19, p < 0.05, CI: [0.01, 0.38]), and competitive excitement (Mdiff: 0.22, p < 0.05, CI: [0.05, 0.40]). After the Character segment, significant increases were observed for sportspersonship (Mdiff: 0.29, p < 0.001, CI: [0.14, 0.43]) and moral values (Mdiff: 0.35, p = 0.001, CI: [0.14, 0.54]).
Table 5 – Youth Mean Difference, t-statistic, p-value, and Confidence Interval for RPP Responses of the Super-User Group

| Variable name | Mean difference | t-statistic | p-value | 95% CI |
|---|---|---|---|---|
| **Connection Questionnaire** | | | | |
| Ingroup ties | 0.37 | 4.71 | < .001*** | [0.21, 0.52] |
| Cognitive centrality | 0.24 | 3.96 | < .001*** | [0.12, 0.36] |
| Ingroup affect | 0.04 | 0.48 | .633 | [-0.11, 0.19] |
| Coach closeness | 0.04 | 1.07 | .289 | [-0.03, 0.11] |
| Positive parental involvement | 0.05 | 0.88 | .383 | [-0.06, 0.15] |
| **Confidence Questionnaire** | | | | |
| Sport self-confidence | 0.46 | 6.32 | < .001*** | [0.31, 0.60] |
| Task orientation | 0.19 | 2.06 | .045* | [0.01, 0.38] |
| Competitive excitement | 0.22 | 2.55 | .013* | [0.05, 0.40] |
| **Character Questionnaire** | | | | |
| Mental toughness | 0.27 | 1.11 | .276 | [-0.23, 0.78] |
| Sportspersonship | 0.29 | 4.01 | < .001*** | [0.14, 0.43] |
| Moral values | 0.35 | 3.58 | .001** | [0.14, 0.54] |

Note. \*p < .05; \*\*p < .01; \*\*\*p < .001.
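For readers who wish to reproduce this style of analysis, the mean differences, t-statistics, and confidence intervals reported in Table 5 follow from a standard paired-samples t-test on athletes' retrospective 'then' and 'now' ratings. The sketch below uses hypothetical 5-point ratings for eight athletes, not the study's data:

```python
import math
from statistics import mean, stdev

def paired_t(pre, post):
    """Paired t-test for retrospective pretest-posttest (RPP) ratings.

    Returns (mean difference, t-statistic, standard error, degrees of freedom).
    """
    diffs = [b - a for a, b in zip(pre, post)]
    n = len(diffs)
    md = mean(diffs)                   # mean 'now' minus 'then' difference
    se = stdev(diffs) / math.sqrt(n)   # standard error of the mean difference
    return md, md / se, se, n - 1

# Hypothetical ratings (illustrative only, not the study's raw data)
then_scores = [3, 4, 3, 2, 4, 3, 3, 4]
now_scores  = [4, 4, 4, 3, 5, 3, 4, 4]

md, t, se, df = paired_t(then_scores, now_scores)

# 95% CI around the mean difference; 2.365 is the two-tailed
# critical t value for df = 7
ci = (md - 2.365 * se, md + 2.365 * se)
```

As in Table 5, a 95% confidence interval that excludes zero corresponds to a significant change at p < .05.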
After each segment, athletes responded to process questions spanning their perceptions of enjoyment, whether they learned something new, improved at hockey, and were excited for the next section of the program (see Table 6). On a 5-point scale (from not at all to very much), athletes rated enjoyment on average as 4.13 (SD = 0.70) after the Connection segment, 4.32 (SD = 0.71) after the Confidence segment, and 4.03 (SD = 1.0) after the Character segment. For learning something new, they rated 4.14 (SD = 0.86) after Connection, 4.34 (SD = 0.80) after Confidence, and 4.20 (SD = 0.96) after Character. Athletes also believed they improved at hockey after the Connection (M = 3.92; SD = 0.94), Confidence (M = 4.34; SD = 0.73), and Character (M = 4.03; SD = 1.0) segments. In general, athletes were excited about each program segment (Connection: M = 4.15, SD = 0.99; Confidence: M = 4.35, SD = 0.69; Character: M = 4.03, SD = 1.30).
Table 6 – Mean (SD) for 3Cs Within-Program Process Questionnaires

| Question Type | Question | Connection Mean (SD) | Confidence Mean (SD) | Character Mean (SD) |
|---|---|---|---|---|
| General Experience Questions | 1. Did you have fun during this part of the program? | 4.13 (0.70) | 4.32 (0.71) | 4.03 (1.0) |
| | 2. Did you learn something new about yourself? | 4.14 (0.86) | 4.34 (0.80) | 4.20 (0.96) |
| | 3. Do you think you improved at hockey during this part of the program? | 3.92 (0.94) | 4.34 (0.73) | 4.03 (1.0) |
| | 4. Are you excited about the next section from the 1616 Program? | 4.15 (0.99) | 4.35 (0.69) | 4.03 (1.30) |
| | 5. How enjoyable were the videos on average? | 4.16 (1.85) | 4.39 (0.61) | 4.23 (0.97) |
| | 6. How relevant were the stories to your life? | 3.93 (1.95) | 3.95 (0.82) | 3.97 (1.10) |
| | 7. How much did the reflection items help you understand the story? | 4.20 (2.14) | 4.34 (0.84) | 4.20 (1.03) |
| | 8. How much did the reflection questions help you understand relevance to you? | 4.11 (1.80) | 4.13 (0.75) | 4.17 (0.99) |
| | 9. How much did the ‘live it outs’ help you feel connected to the people in your herd? | 4.04 (2.28) | 4.20 (0.84) | 4.20 (0.92) |
| Process/Engagement Questions | 1. How many of the videos did you watch? | 4.26 (1.23)/5 | 3.59 (0.87)/4 | 4.17 (1.58)/5 |
| | 2. How many of the reflection questions did you listen to? | 4.11 (1.38)/5 | 3.34 (1.13)/4 | 3.90 (1.52)/5 |
| | 3. How many of the ‘live it outs’ did you do? | 3.86 (1.56)/5 | 3.19 (1.23)/4 | 4.31 (0.97)/5 |

Note. General experience questions were rated out of 5; process/engagement questions were rated out of 5 for the Connection and Character segments and out of 4 for the Confidence segment (indicated by the /5 and /4 suffixes).
Post-program Interviews
Overall, participants found the program to be fun and engaging. They spoke about how much they liked the program’s theme (i.e., to develop a ‘buffalo mindset’) and how the motto could be readily applied within their teams. One athlete noted, “The buffalo mindset had the greatest impact on me because we go into and leave the storm together—meaning we go in as a team and we come out as a team. We are together all the time” (A4). Athletes described not only sport-specific benefits (e.g., enhanced hockey-related skills) but also the development of transferable skills (e.g., resilience, teamwork, mental toughness) that had implications within and beyond the sport context (e.g., school): “I’m going to continue to do everything that the program taught me…I feel like I became a better person. Now I try to get consistent 4s on my report card, and I don’t let anything get in my way of being me” (A2). Finally, they also thought that the program enhanced teammate connections including trusting one another more as well as aiming to serve as leaders for their younger teammates.
In relation to what could be improved, some athletes noted that the lack of a team component (i.e., athletes completed the program independently and on their own time) resulted in less discussion between teammates about the program. By providing opportunities for team-level engagement (e.g., watching the videos and discussing content together as a team), a greater sense of a ‘1616 community’ could be achieved: “It would be fun if we could all watch it together [the weekly videos] and reflect together…do it in the locker room, maybe before practice, we go early, and we all watch it” (A8)…“Yeah, being able to do it with your teammates” (A9). Additional suggestions included wanting ‘check-ins’ during the season to provide ongoing feedback (e.g., with Andrew Ladd and other role models from the videos) and to shorten the pre/post-program surveys as they were seen as too long and cumbersome. Key message: Despite minimal pretest-posttest changes, data from the RPP questionnaires (i.e., super-user group) and post-program interviews showed that athletes enjoyed participating in the program and believed they improved across all 3Cs.
Adoption
Adoption was measured via adherence and attrition rates in the post-program and 3Cs RPP questionnaires as well as web-based information. Pertaining to adherence, 69% of athletes indicated that they watched 13 or more of the 16 videos (see Table 4). Similarly, 67% of athletes noted responding to 13 or more of the reflective questions, while only 45% completed 13 or more of the LIO activities. For the 3Cs process questions, the super-user group responded to how many videos they watched after each section, reflection items they listened to, and LIOs they completed (see Table 6). For all three sections, on average, almost all videos were watched (4.26/5, 3.59/4, and 4.17/5 respectively). The majority of athletes also completed the reflection items and LIO activities. Pertaining to video analytics, there was a total of 12,199 video views with 75% of athletes viewing the videos on a mobile device, 21% on a desktop computer, 3% on a tablet, and 1% on a television. In relation to attrition, while 727 athletes completed the pretest questionnaire, only 161 completed the posttest, resulting in a 78% attrition rate. Key message: Although there was a high level of initial adoption (i.e., most athletes engaged with the weekly videos and reflection questions), adherence to the post-program questionnaire was reduced, as evidenced by the attrition rate.
Implementation
Implementation was measured using compatibility, delivery, and contact (see Table 1 for the description of each indicator). For compatibility, post-program responses indicated that the content was very relatable to the athletes’ lives (see Table 4). More specifically, 66% rated four or five (on a five-point scale) on their excitement to take part every week. Most athletes also found the videos enjoyable (82%) and the stories relevant to their lives (77%). The majority (75%) also thought the LIOs helped develop a buffalo mindset. Finally, 80% ranked the reflection questions as a four or five (on a five-point scale) for helping them to understand the main message of each video. The RPP questionnaires asked similar process questions but were tailored to each ‘C’ segment. Athletes, on average, found the videos very enjoyable and relevant to their lives (see Table 6). Generally, reflection items were seen as helpful for understanding the main messages and stories. The LIOs helped the athletes feel connected to their team, confident in themselves, and assisted athletes in working on their character.
Pertaining to delivery, 44% rated a four or five that they spoke to their teammates about the program, while 80% rated a four or five that they spoke to their parents/guardians (see Table 4). The majority of athletes thought being part of the program was ‘cool’ (74% rated a four or five), while almost all (91%) rated a four or five that they really liked hearing from professional athletes. Eighty-eight percent thought the amount of ‘stuff’ shared was ‘just right,’ and 81% thought the length of the weekly player videos was also ‘just right’. In terms of weekly cadence (i.e., contact with participants), athletes received two weekly communications—one on Sunday which included a graphic and video featuring an elite ice hockey player, and the second on Tuesday which included a LIO graphic and video. In total, there were 35 player communication outreach messages sent and 69 graphics or videos shared with the athletes over the course of the program. Key message: The program was delivered as intended, with athletes describing the resources (i.e., videos, reflection items, LIOs) as enjoyable and relevant, and the amount of content shared as ‘just right’. Whereas athletes engaged with their parents/guardians throughout the program, fewer teammate conversations about the program occurred.
Maintenance
Maintenance was assessed via facilitators and barriers (i.e., factors that promoted or hindered participation over time). In relation to facilitators, athletes enjoyed the online format. Given their busy schedules, the flexibility of an online delivery enabled them to watch the videos and complete the activities at their own pace. One athlete noted, “Sometimes, I was busy during the week, so I sat down and watched 2 or 3 videos a day. I had a big schedule this year. But I tried to watch the videos when I had free time” (A2). The athletes also described finding the player stories, reflection items, and LIOs to be age-appropriate, which enhanced the applicability of the material learned and their engagement with the resources/content. Moreover, the length of the videos was described as ‘just right’ in that they were long enough that the athletes could digest the PYD concept while also maintaining interest. One athlete stated:
I think the [length of the videos] was perfect because you don’t want it to be too long because then they can get boring. But if they’re too short, it’s like you’re not working [enough]. If it’s the middle mark, then you get the information you need, and it gets you. I really wanted the next week to come out. (A5)
An additional facilitator that promoted engagement was the incentives provided (e.g., a professional athlete-signed jersey). Indeed, athletes said that when they spoke to their teammates about the program, it was most often around the incentives.
Pertaining to barriers, while the online delivery was a facilitator for some (e.g., flexible scheduling), it was also associated with challenges. Some athletes were confused about how to receive the weekly e-mails/text messages with links to the videos. For example, some received the links directly whereas others’ parents/guardians opted to receive the link first; thus, the athletes depended on their parent/guardian to grant them access. Moreover, some athletes said that teammates received new e-mails/phone numbers throughout the year and as a result, were no longer receiving the video links: “I know a couple of kids were trying to do it [watch the videos], but they couldn’t…this one kid changed his phone number in the middle of it. So, he wasn’t getting sent the videos anymore” (A10). In terms of post-program data collection, the largest barrier experienced by the athletes was the length of the surveys. During the post-program interviews, athletes said that the surveys were interesting but were too overwhelming to complete in their entirety: “I think the whole [program] is really good. Besides the surveys, [they were] really long…they were sometimes difficult to understand” (A7).
In relation to future intentions, 79% of athletes rated a four or five (out of five) that they would want to engage in the program again if they could (see Table 4). Upon program completion, 83% rated a four or five that they would continue to do the things they had learned in this program. Key message: Athletes identified both facilitators (e.g., online delivery) and barriers (e.g., length of surveys) to fully engaging with the program across the 16 weeks. Despite the identified barriers, athletes demonstrated an eagerness to participate in the program again.
DISCUSSION
Researchers have called for evidence-informed evaluations of youth sport programs to monitor whether (and how) their intended outcomes are being achieved (e.g., PYD; Gould, 2019; Holt et al., 2016). In support of these calls, the purpose of this study was to conduct a RE-AIM evaluation (Glasgow et al., 1999) of the novel story-based 1616 Program. The results of this evaluation provide comprehensive insight pertaining to who participated in the program and youths’ perceptions of the outcomes of participating, program adoption and delivery, and the barriers/facilitators to engaging in the program. In the subsequent sections, we discuss the findings in relation to sport-based PYD programming and evaluation literature and highlight study strengths, limitations, and future directions for the 1616 Program. It is also our hope that researchers interested in youth development across contexts will find our process and recommendations useful for their research and evaluation practices.
Pertaining to reach, 88 teams with over 1400 athletes were enrolled in the program, with most falling within the targeted 10-12 age range. With regard to gender and race, more boys than girls participated in the program, and participants primarily self-identified as White. Whereas there were targeted efforts to ensure that the program content remained inclusive, diverse, and accessible (e.g., representation of men and women professional, Olympic, and Paralympic athletes; free of cost; Martin et al., 2023), our sample reflects the general demographic trends across North America, with girls and Black, Indigenous, and People of Colour continuing to be underrepresented in ice hockey (e.g., Kaida et al., 2021; Tous-Rovirosa et al., 2023; Wong & Dennie, 2021).
With this in mind, it is critical that the research team and program partners consider the ways in which 1616 could be designed, implemented, and evaluated in culturally informed ways to address barriers to participation (e.g., Forsyth et al., 2021; Kabetu et al., 2021). As one example, researchers suggest that creating and maintaining trusting and respectful relationships with community partners who represent diverse perspectives, values, and beliefs is critical for enhancing program accessibility and sustainability (Whitley et al., 2015). Specific to 1616, collaborating with community sport organizations that seek to enhance representation and diversity both within ice hockey (e.g., Hockey Diversity Alliance: https://hockeydiversityalliance.org) and beyond (e.g., Aboriginal Sport Circle: https://www.aboriginalsportcircle.ca) could support the development of culturally-informed recruitment strategies and program content.
Providing opportunities for the voices of underrepresented groups (as indicated by the reach results) to be intentionally included within the research and curriculum teams could also assist in developing more inclusive and culturally relevant material (Strachan et al., 2018). One potential avenue to achieve this is through community-based participatory research (CBPR). Within this partnership approach, academic and non-academic partners collaboratively engage in all phases of the research process to better understand and address community needs (Israel et al., 1998). By engaging in such an approach, program partners, community members, and researchers can draw upon mutual capacity-building (i.e., co-learning) to share and gain knowledge that best supports the development and implementation of programs that meet the priorities of a community (Coppola et al., 2020). For 1616, developing meaningful relationships with community sport partners could support the identification of mutual interests with regard to sport programming for underrepresented youth, while providing opportunities for community members to share resources to inform sustainable and culturally-relevant sport programming. Implementing culturally competent evaluation practices is also an important consideration. This could include the research team engaging in reflexive practices (e.g., considering their positions of power; Hall, 2020) as well as employing diverse theories and methods that are responsive to participants’ cultures (e.g., non-Western approaches such as talking circles; Kurtz, 2013).
With regard to effectiveness, few changes were observed from pretest to posttest. Whereas coach closeness and enjoyment decreased, moral value increased. Pertaining to the former, this program was self-directed and depended on the initiative of the coaches to discuss the program’s content in a team setting. In future iterations, developing coach-athlete specific content and activities may help to enhance coach closeness. Regarding the decrease in enjoyment, this could be due to a high baseline level of excitement for the program. The ‘welcome’ video was released to athletes before completing the baseline questionnaire. The eagerness to participate in the program upon receiving video content and the exposure to professional athletes before baseline questionnaire completion could have resulted in a ceiling effect, which left little room for improvement (Šimkovic & Träuble, 2019). Moving forward with this program specifically, but also for those interested in evaluating change in their programming, distributing the baseline questionnaire before accessing tailored resources may provide a more accurate representation of pre-program scores. In contrast, an increase in moral value was observed. Research suggests that when exposed to social agents who promote teamwork and prosocial behaviours, engagement in moral behaviours increases (McLaren et al., 2021). Accordingly, having athlete role models discuss the importance of these concepts with the youth participants could have positively influenced their moral value.
More broadly, it is important to highlight that there was a 78% attrition rate for pretest-posttest questionnaire completion and, thus, a small sample size for analyses. Given that efficacy measures often underestimate the impact of an intervention when attrition rates are high (Eysenbach, 2005), exploring ways to improve post-program questionnaire completion is critical for future iterations of the program (e.g., condensing the questionnaires). Of note, this lack of post-program questionnaire completion does not automatically suggest that the athletes were not engaged across the 16 weeks, as evidenced by the process assessments both within and post-program in addition to the total video views.
Along these lines, despite the lack of significant pretest-posttest outcomes, process-related questions and the super-user RPP data highlight that athletes enjoyed the program and thought they improved across all 3Cs. As one example, athletes reported feeling more connected with their teammates (ingroup ties) and that their team was now more important to them (cognitive centrality). Within this section of the program, activities and reflection questions targeted developing new friendships with teammates as well as team-building activities (e.g., committing to a team mantra), all of which could have positively impacted connection. Similarly, there were also increases in indicators of confidence (i.e., self-confidence, task orientation, competitive excitement) and character (i.e., sportspersonship and moral values). These findings align with our previous proof-of-concept testing (Côté et al., 2023), whereby significant increases were observed only in RPP scores. Importantly, RPP questionnaires have been found to elicit greater self-awareness pertaining to the degree of change that has occurred between time points and, thus, overcome various pretest drawbacks (e.g., unclear pretest frame of reference, response shift bias; Little et al., 2020).
Pertaining to adoption, almost three quarters of respondents watched 13 or more videos (out of 16) and completed the corresponding reflection questions (i.e., adherence), yet fewer than half completed the same number of LIO activities. Whereas this evaluation examined individual member adoption, future evaluations should examine adoption at the team and community sport league/organizational levels (e.g., Koorts & Gillison, 2015). In doing so, program partners may be able to identify context-specific factors that either supported or hindered individual-level adoption (e.g., coach and sport league ‘buy-in’).
Within the post-program questionnaires and interviews, athletes stated they were excited about each segment of the 1616 program but that each week required more time than expected to engage with all three resources (i.e., video, reflection, and LIO). Given that athletes found the program to be fun but could not fully engage due to time constraints (e.g., balancing athletics with school), altering the cadence of the program in the future would be worthwhile. For instance, implementing biweekly versus weekly programming could reduce participant burden while providing more flexibility as to when athletes choose to engage with the content. Moreover, having ‘check-ins’ throughout the season with the professional athletes could give the youth more opportunities to share iterative feedback that could be integrated during the program (rather than post-program) to meet the needs of current end-users (Shaikh et al., 2020).
In relation to implementation, across both the post-program and 3Cs RPP questionnaires, athletes reported that the content of each video was very relatable to their own lives, that the videos supported them in completing the goals of the program (e.g., developing a buffalo mindset), and that the reflection questions were useful for understanding the main message of each video. In addition, although athletes spoke about the program with their parents/guardians, little communication occurred with teammates. Given that the underlying theme of 1616 is to band together with teammates, athletes did request more team-based program components (e.g., group activities and opportunities to watch the videos together). To address this feedback, developing additional coach/athlete leader-led activities and reflection questions to be completed within a team setting may help to facilitate teammate interaction. As one example, when introducing the concept of cohesion, various team-building strategies could be employed, such as developing team goals or identifying team members’ roles and responsibilities (Bloom et al., 2008). Finally, despite athletes discussing the time constraints of their schedules, they reported that the amount of material shared was ‘just right’. Thus, and as previously alluded to, providing athletes with more time to complete the weekly activities (i.e., biweekly rather than weekly content) or shortening the length of the program overall may improve engagement.
For maintenance, athletes discussed both facilitators and barriers to fully participating in the program. In terms of facilitators, the opportunity to win prizes from the program (e.g., a hockey jersey signed by a professional ice hockey player) was a highlight. Considering more frequent prizes in future iterations could enhance engagement throughout the program. In relation to barriers, given that this program involved athletes, parents/guardians, and their coaches, there was sometimes miscommunication between parties pertaining to who was to be receiving the online material. Streamlining the program’s online content/resource portal to clearly highlight how the athlete content would be delivered and to whom could enhance the online user experience. Specific to post-program questionnaires, athletes described completing the surveys as daunting and at times, difficult to understand. Whereas the research team sought to employ a rigorous, evidence-informed evaluation, the length and wording of the validated questionnaires may have negatively influenced post-program completion rates. A common challenge when conducting partnered research is balancing researchers’ goals (i.e., producing high quality research) with partners’ practical objectives and supporting the needs of end-users (Ettekal et al., 2017). As such, prior to the next iteration of 1616, it would be beneficial for the invested sport partners (e.g., Ladd Foundation, research team, youth athletes) to reflect on the ways in which the integrity of the research can be upheld without negatively impacting participant experiences. Reassuringly, despite the various barriers discussed, the majority of athletes stated they would engage in the program again and continue to implement what they had learned in their lives outside the program.
Strengths, Limitations, and Future Directions
A key strength of this study was the evidence-informed nature of the design, implementation, and evaluation of the 1616 Program. More specifically, researchers have called for sport programs that use role modelling to be: (a) underpinned by theoretical constructs, (b) co-created with invested partners, and (c) objectively assessed using appropriate evaluation methods (Kelly et al., 2023). Of note, 1616 supports all three recommendations. For instance, this program is grounded in existing sport PYD literature and more specifically, the Personal Assets Framework (e.g., Côté et al., 2020). Given that integrating theoretical frameworks within sport role model-led program designs has been found to positively elicit impact (Cleland et al., 2012; Kelly et al., 2023), within this type of programming it is important to consider which existing framework(s) may best support the intended outcomes from the outset of intervention development. With regard to co-creation, our partners (i.e., the Ladd Foundation) were included throughout program development, implementation, and evaluation. By engaging in a collaborative decision-making process (see Martin et al., 2023 for more detail), target knowledge user needs were integrated from the outset of program creation, thereby enhancing its relative impact (Kelly et al., 2023).
In relation to program evaluation, researchers have identified a lack of programmatic assessment for sport offerings that specifically use role modelling (Kelly et al., 2023). To address this concern within the current evaluation, we adopted a mixed-methods design using the well-established RE-AIM framework (Glasgow et al., 1999). In doing so, all five indicators were assessed resulting in a comprehensive evaluation of both program efficacy and effectiveness (Kessler et al., 2013). Given that not addressing all RE-AIM indicators has been cited as a limitation of sport-based program evaluations (e.g., Lawrason et al., 2021; Saizew et al., 2022), using both qualitative and quantitative methods throughout an evaluation is an important consideration.
Notwithstanding the many strengths of this study, there are nevertheless limitations that need to be addressed. First, while the adoption of a theoretical framework (the Personal Assets Framework) served as a study strength, it is also important to acknowledge the limitations of this framework and its generalizability across contexts and cultures. For instance, whereas the 4Cs of PYD have received support within sport literature, this research has been typically conducted in a Western context (Strachan et al., 2018). As a result, these outcomes may not resonate with the lived experiences of youth across diverse cultures, values, and belief systems. As one example, Strachan et al. (2018) engaged in talking circles with Indigenous youth to shed light on the meanings they attached to the 5Cs of PYD. Specific to connection, youth spoke about not only their relationships with teammates, but also a connection to land and culture. As such, including activities that incorporate diverse belief systems and ways of knowing, such as reflection questions that centre around youths’ connections to the land, may assist in developing relevant and inclusive 1616 programming. Similarly, such an inclusive approach to program development should be considered for any partnerships interested in facilitating youth development, regardless of context.
Second, even though purposeful efforts were made to reduce time constraints (e.g., shortened RPP questionnaires), we had large attrition rates across the pretest-posttest and 3Cs RPP measures. While condensing the questionnaires and shortening the total length of the program remain key priorities moving forward, it is also important for the researchers and invested partners to determine how to best integrate the evaluation into the program itself. For instance, by embedding questionnaire items into the end of each weekly video, end-users may be more likely to provide feedback. In addition, while focus groups were selected as the ideal method for post-program interviews with youth (e.g., enhanced feelings of safety, reduced power imbalance; Adler et al., 2019), scheduling conflicts resulted in the majority of interviews being completed individually. Thus, it is integral for researchers to discuss with program partners and end-users how to best facilitate post-program engagement. For instance, in this evaluation, athletes highlighted that providing incentives (e.g., signed jerseys) as well as athlete ‘check-ins’ (e.g., online video calls with program role models) could support participation in a more iterative approach to post-program feedback. Altogether, by harnessing the strengths of this study while addressing its limitations, we hope that this work can serve as a guide for researchers interested in co-producing and evaluating PYD sport programming in the future.
CONCLUSION
Sport-based PYD programs have the potential to foster many beneficial outcomes for youth. A key process for determining whether these programs are achieving their intended benefits is program evaluation. This study provides a comprehensive RE-AIM evaluation of the 1616 Program. Youth who participated in the program highlighted numerous strengths, including the free online delivery and the benefits of participating (e.g., enhanced perceptions of their 3Cs). Importantly, the evaluation also shed light on various limitations of the program in its current form that must be addressed. These include, but are not limited to, developing team-oriented content to enhance teammate interactions, implementing evaluation strategies that are culturally informed, and restructuring the program to provide participants with the opportunity to meaningfully engage with the weekly content/activities. It is our hope that by addressing these limitations in future iterations, 1616 can serve as an avenue for positive youth development for young ice hockey players.
AUTHOR NOTE
This work was supported by the Mitacs Accelerate Program (Application Ref. IT32637) to fund four internships (Cailie McGuire, Kelsey Saizew, Alex Maw, and Alex Murata). The authors report no conflicts of interest.
REFERENCES
Adler, K., Salanterä, S., & Zumstein-Shaha, M. (2019). Focus group interviews in child, youth, and parent research: An integrative literature review. International Journal of Qualitative Methods, 18(1), 1-15. https://doi.org/10.1177/1609406919887274
Allan, V., Bean, C., Kerr, B., & Gassewitz, D. (2024). Partnering for impact: A blueprint for knowledge translation initiatives in the Canadian sport sector. Quest, 76(1), 21-38. https://doi.org/10.1080/00336297.2023.2209331
Baillie, C. P., Galaviz, K. I., Emiry, K., Bruner, M. W., Bruner, B. G., & Lévesque, L. (2017). Physical activity interventions to promote positive youth development among indigenous youth: A RE-AIM review. Translational Behavioural Medicine, 7(1), 43-51. https://doi.org/10.1007/s13142-016-0428-2
Bean, C., & Forneris, T. (2016). Examining the importance of intentionally structuring the youth sport context to facilitate positive youth development. Journal of Applied Sport Psychology, 28(4), 410-425. https://doi.org/10.1080/10413200.2016.1164764
Bergeron, M. F., Mountjoy, M., Armstrong, N., Chia, M., Côté, J., Emery, C. A., Faigenbaum, A., Hall Jr., G., Kriemler, S., Léglise, M., Malina, R. M., Pensgaard, A. M., Sanchez, A., Soligard, T., Sundgot Borgen, J., van Mechelen, W., Weissensteiner, J. R., & Engebretsen, L. (2015). International Olympic Committee consensus statement on youth athletic development. British Journal of Sports Medicine, 49(13), 843-851. https://doi.org/10.1136/bjsports-2015-094962
Bloom, G. A., Loughead, T. M., & Newin, J. (2008). Team building for youth sport. Journal of Physical Education, Recreation & Dance, 79(9), 44-47. https://doi.org/10.1080/07303084.2008.10598246
Braun, V., & Clarke, V. (2021). Can I use TA? Should I use TA? Should I not use TA? Comparing reflexive thematic analysis and other pattern‐based qualitative analytic approaches. Counselling and Psychotherapy Research, 21(1), 37-47. https://doi.org/10.1002/capr.12360
Cleland, C.L., Tully, M.A., Kee, F., & Cupples, M.E. (2012). The effectiveness of physical activity interventions in socio-economically disadvantaged communities: A systematic review. Preventive Medicine, 54(6), 371–380. https://doi.org/10.1016/j.ypmed.2012.04.004
Coakley, J. (2011). Youth sports: What counts as “positive development?”. Journal of Sport and Social Issues, 35(3), 306-324. https://doi.org/10.1177/019372351141731
Coppola, A. M., Holt, N. L., & McHugh, T. L. F. (2020). Supporting Indigenous youth activity programmes: A community-based participatory research approach. Qualitative Research in Sport, Exercise and Health, 12(3), 319-335. https://doi.org/10.1080/2159676X.2019.1574880
Côté, J., Coletti, J., McGuire, C. S., Erickson, K., Saizew, K., Maw, A., Primeau, C., Wolff, M., Ladd, B., & Martin, L. J. (2023). A proof-of-concept evaluation of the 1616 story-based positive youth development program. Children, 10(5), 799-815. https://doi.org/10.3390/children10050799
Côté, J. & Fraser-Thomas, J. (2016). Youth involvement and positive development in sport. In P. Crocker (Ed.), Sport psychology: A Canadian perspective (pp. 256-287). Pearson.
Côté, J., Turnnidge, J., Murata, A., McGuire, C., & Martin, L. (2020). Youth sport research: Describing the integrated dynamic elements of the personal assets framework. International Journal of Sport Psychology, 51(6), 562-578. https://doi.org/10.7352/IJSP.2020.51.562
Côté, J., Turnnidge, J., & Vierimaa, M. (2016). A personal assets approach to youth sport. In A. Smith & K. Green (Eds.), Handbook of youth sport (pp. 243–256). Routledge.
Eime, R. M., Young, J. A., Harvey, J. T., Charity, M. J., & Payne, W. R. (2013). A systematic review of the psychological and social benefits of participation in sport for children and adolescents: Informing development of a conceptual model of health through sport. International Journal of Behavioral Nutrition and Physical Activity, 10(1), 1-21. https://doi.org/10.1186/1479-5868-10-98
Erdal, K. (2018). The adulteration of children’s sports: Waning health and well-being in the age of organized play. Rowman & Littlefield Publishing Group, Inc.
Ettekal, A. V., Konowitz, L. S., Agans, J. P., Syer, T., & Lerner, R. M. (2017). Researcher-Practitioner collaborations: Applying developmental science to understand sport participation and positive youth development. Journal of Community Engagement and Higher Education, 9(2), 29-45.
Eysenbach, G. (2005). The law of attrition. Journal of Medical Internet Research, 7(1), 1-9. https://doi.org/10.2196/jmir.7.1.e11
Forsyth, J., McKee, T., & Benson, A. (2021). Data, development discourse, and decolonization: Developing an indigenous evaluation model for indigenous youth hockey in Canada. Canadian Ethnic Studies, 53(3), 121-140. https://doi.org/10.1353/ces.2021.0022
Gaglio, B., Shoup, J. A., & Glasgow, R. E. (2013). The RE-AIM framework: A systematic review of use over time. Public Health, 103(1), 38-46.
Giacobbi, P. R., Poczwardowski, A., & Hager, P. (2005). A pragmatic research philosophy for sport and exercise psychology. The Sport Psychologist, 19(1), 18-31. https://doi.org/10.1123/tsp.19.1.18
Glasgow, R., Harden, S., Gaglio, B., Rabin, B., Smith, M., Porter, G., Ory, M., & Estabrooks, P. (2019). RE-AIM planning and evaluation framework: Adapting to review science and practice with a 20-year review. Frontiers in Public Health, 7(64), 1-9. https://doi.org/10.3389/fpubh.2019.00064
Glasgow, R. E., Vogt, T. M., & Boles, S. M. (1999). Evaluating the public health impact of health promotion interventions: The RE-AIM framework. American Journal of Public Health, 89(9), 1322-1327.
Gould, D. (2019). The current youth sport landscape: Identifying critical research issues. Kinesiology Review, 8(3), 150-161. https://doi.org/10.1123/kr.2019-0034
Graham, I. D., Logan, J., Harrison, M. B., Straus, S. E., Tetroe, J., Caswell, W., & Robinson, N. (2006). Lost in knowledge translation: Time for a map?. Journal of Continuing Education in the Health Professions, 26(1), 13-24. https://doi.org/10.1002/chp.47
Hall, J. N. (2020). The other side of inequality: Using standpoint theories to examine the privilege of the evaluation profession and individual evaluators. American Journal of Evaluation, 41(1), 20-33. https://doi.org/10.1177/1098214019828485
Holt, N. L., Deal, C. J., & Smyth, C. L. (2016). Future directions for positive youth development through sport. In N. Holt (Ed.), Positive youth development through sport (pp. 229-240). Routledge. https://doi.org/10.4324/9781315709499-19
Holtrop, J. S., Rabin, B. A., & Glasgow, R. E. (2018). Qualitative approaches to use of the RE-AIM framework: Rationale and methods. BMC Health Services Research, 18(177), 1-10. https://doi.org/10.1186/s12913-018-2938-8
Hummell, C., Shaikh, M., & Bean, C. (2023). Current state and future directions for youth sport evaluation practices: An empirical study. Managing Sport and Leisure, 1-22. https://doi.org/10.1080/23750472.2023.2184714
Israel, B. A., Schulz, A. J., Parker, E. A., & Becker, A. B. (1998). Review of community-based research: assessing partnership approaches to improve public health. Annual Review of Public Health, 19(1), 173-202. https://doi.org/10.1146/annurev.publhealth.19.1.173
Jung, M. E., Bourne, J. E., & Gainforth, H. L. (2018). Evaluation of a community-based, family focused healthy weights initiative using the RE-AIM framework. International Journal of Behavioral Nutrition and Physical Activity, 15(13), 1-16.
Kabetu, V., Snelgrove, R., Lopez, K. J., & Wigfield, D. (2021). Hockey is not for everyone, but it could be. Case Studies in Sport Management, 10(1), 7-14. https://doi.org/10.1123/cssm.2020-0020
Kaida, L., Kitchen, P., & Stick, M. (2021). The rise of big-city hockey players and its implications for diversity in the National Hockey League. Canadian Ethnic Studies, 53(3), 141-161. https://doi.org/10.1353/ces.2021.0023
Kelly, E., Liston, K., Dowd, K., & Lane, A. (2023). A review of the impact of sporting role model-led interventions on physical activity and sport participation of female youth. Women in Sport and Physical Activity Journal, 32(1), 1-11. https://doi.org/10.1123/wspaj.2023-0010
Kessler, R. S., Purcell, E. P., Glasgow, R. E., Klesges, L. M., Benkeser, R. M., & Peek, C. J. (2013). What does it mean to “employ” the RE-AIM model?. Evaluation & the Health Professions, 36(1), 44-66. https://doi.org/10.1177/0163278712446066
Koorts, H., & Gillison, F. (2015). Mixed method evaluation of a community-based physical activity program using the RE-AIM framework: Practical application in a real-world setting. BMC Public Health, 15(1102), 1-10.
Kurtz, D. L. (2013). Indigenous methodologies: Traversing Indigenous and Western worldviews in research. AlterNative: An International Journal of Indigenous Peoples, 9(3), 217-229.
Lawrason, S., Turnnidge, J., Tomasone, J., Allan, V., Côté, J., Dawson, K., & Martin, L. J. (2021). Employing the RE-AIM framework to evaluate multisport service organization initiatives. Journal of Sport Psychology in Action, 12(2), 87-100. https://doi.org/10.1080/21520704.2020.1773592
Leggat, F. J., Wadey, R., Day, M. C., Winter, S., & Sanders, P. (2023). Bridging the know-do gap using integrated knowledge translation and qualitative inquiry: A narrative review. Qualitative Research in Sport, Exercise and Health, 15(2), 188-201. https://doi.org/10.1080/2159676X.2021.1954074
Little, T. D., Chang, R., Gorrall, B. K., Waggenspack, L., Fukuda, E., Allen, P. J., & Noam, G. G. (2020). The retrospective pretest-posttest design redux: On its validity as an alternative to traditional pretest-posttest measurement. International Journal of Behavioral Development, 44(2), 175-183. https://doi.org/10.1177/0165025419877973
Luguetti, C., Singehebhuye, L., & Spaaij, R. (2022). Towards a culturally relevant sport pedagogy: Lessons learned from African Australian refugee-background coaches in grassroots football. Sport, Education and Society, 27(4), 449-461. https://doi.org/10.1080/13573322.2020.1865905
Martin, L. J., Coletti, J., Saizew, K., McGuire, C. S., Maw, A., Primeau, C., Côté, J., Erickson, K., Wolff, M., & Ladd, B. (2023). An integrated knowledge translation approach to developing a story-based positive youth development program in sport: The 1616 program. Journal of Character Education, 19(1-2), 129-147.
McGuire, C., Saizew, K., Benson, A. J., Côté, J., Erickson, K., Maw, A., Murata, A., Profeit, M., & Martin, L. (2025, January 16). A RE-AIM evaluation of the “1616” sport-based positive youth development program. Retrieved from https://osf.io/2e3vr
McLaren, C. D., Boardley, I. D., Benson, A. J., Martin, L. J., Fransen, K., Herbison, J. D., … & Bruner, M. W. (2021). Follow the leader: Identity leadership and moral behaviour in social situations among youth sport teammates. Psychology of Sport and Exercise, 55, 101940. https://doi.org/10.1016/j.psychsport.2021.101940
Patton, M. Q. (2018). Evaluation science. American Journal of Evaluation, 39(2), 183-200. https://doi.org/10.1177/10982140187631
Patton, M. Q. (2014). Qualitative research & evaluation methods: Integrating theory and practice. Sage Publications.
Saizew, K., Turnnidge, J., Luciani, A., Côté, J., & Martin, L. J. (2022). Positive youth development in community sport: A program evaluation using the RE-AIM framework. Sinéctica, 59(1), e1422. https://doi.org/10.31391/s2007-7033(2022)0059-010
Shaikh, M., Bean, C., & Forneris, T. (2020). Six recommendations for youth sport stakeholders when evaluating their programs. Journal of Sport Psychology in Action, 11(3), 165-182. https://doi.org/10.1080/21520704.2020.1746709
Shaikh, M., & Forneris, T. (2023). A RE-AIM evaluation of a sport-based trauma-sensitive youth development programme. International Journal of Sport and Exercise Psychology, 22(4), 845-865. https://doi.org/10.1080/1612197X.2023.2180068
Šimkovic, M., & Träuble, B. (2019). Robustness of statistical methods when measure is affected by ceiling and/or floor effect. PLoS ONE, 14(8), 1-47. https://doi.org/10.1371/journal.pone.0220889
Smith, B., Williams, O., Bone, L., & Collective, T. M. S. W. C. P. (2023). Co-production: A resource to guide co-producing research in the sport, exercise, and health sciences. Qualitative Research in Sport, Exercise and Health, 15(2), 159-187. https://doi.org/10.1080/2159676X.2022.2052946
Strachan, L., McHugh, T. L., & Mason, C. (2018). Understanding positive youth development in sport through the voices of Indigenous youth. Journal of Sport and Exercise Psychology, 40(6), 293-302. https://doi.org/10.1123/jsep.2018-0035
Tous-Rovirosa, A., Prat, M., & Dergacheva, D. (2023). The Hockey Girls. The creation of a new collective subject: Sisterhood and the empowerment of women. Feminist Media Studies, 23(5), 2063-2084. https://doi.org/10.1080/14680777.2022.2029526
Whitley, M. A., Forneris, T., & Barker, B. (2015). The reality of sustaining community-based sport and physical activity programs to enhance the development of underserved youth: Challenges and potential strategies. Quest, 67(4), 409-423. https://doi.org/10.1080/00336297.2015.1084340
Wong, L. L., & Dennie, M. (2021). “I feel more Canadian with hockey.” Identity and belonging via ice hockey in a diverse Canada. Canadian Ethnic Studies, 53(3), 183-217. https://doi.org/10.1353/ces.2021.0025