Three Things to Know
Demonstrating the impact of workshops is increasingly crucial for securing funding, boosting attendance, and showcasing positive outcomes. Researchers, drawing on the experience of seasoned organizers, have developed ten practical rules to guide this process.
These ten rules emphasize setting clear, pre-workshop measurement goals, employing effective metrics, and utilizing diverse question types, including Likert scales and open-ended responses. Gamification can also enhance survey engagement.
Effective surveys should minimize bias and prioritize clarity. Further improvements to measurement include gathering multi-stage feedback and assessing changes in participant confidence and skills.
For Dog Welfare Practitioners
Dog welfare practitioners, particularly humane societies and trainers, frequently conduct educational workshops for various audiences, including dog owners, children, and youth. However, standardized methods for measuring the impact of these initiatives, especially on promoting responsible dog ownership, are lacking. This research offers a foundation for education providers to develop and refine their own post-course evaluations, ultimately contributing to a shared, standardized approach for assessing the impact on responsible dog ownership.
The Full Picture
This research, “Ten Simple Rules for Measuring the Impact of Workshops,” distills the experience of seasoned organizers into ten practical rules for measuring workshop impact effectively. Measuring impact is crucial for securing funding, boosting attendance, and demonstrating positive outcomes. Consistent measurement also shows whether a workshop is improving or which areas need adaptation. Funders increasingly recognize workshops, especially for training and dissemination, as key pathways to impact, and effective measurement lets organizers demonstrate that value to stakeholders.
This paper focuses on three workshop types:
- Exploratory: Analyzing ideas, exploring challenges, and identifying actions. Examples include keynotes, lightning talks, and discussion sessions.
- Learning: Teaching skills, applications, or techniques to increase knowledge, competence, or confidence. Examples include Software and Data Carpentry workshops.
- Creating: Collaboratively developing solutions (software, standards, resources, research outputs) by bringing together individuals with shared interests. Examples include hackathons and humanities translation/annotation workshops.
The Ten Rules
Planning & Context (Rules 1-2)
- Rule 1: Effective Goal Setting: Define clear outputs (what’s produced) and outcomes (intended impact). Involve attendees in goal setting (e.g., pre-workshop questionnaires) and adapt based on feedback.
- Rule 2: Balancing Resources: Impact measurement should align with the workshop’s scale, cost, and objectives. Short, free workshops may only need a simple post-event survey, while longer, more complex workshops require more detailed assessment. Partial evaluation is better than none.

Measurement Methods (Rules 3-5)
- Rule 3: Purposeful Metrics: Translate abstract concepts (e.g., “satisfaction”) into measurable data with a specific purpose like improving specific areas of the workshop or demonstrating knowledge gain. Effective workshop evaluation should focus on gathering meaningful data — whether through scoring, categorization, or free-text responses — without leading participants toward a predetermined conclusion.
- Rule 4: Understanding Bias: Recognize common biases in workshop evaluations and use careful survey designs to minimize them.
  - Confirmation bias – Favoring data that supports pre-existing beliefs. Countered by ensuring questions allow for all possible responses.
  - Sampling bias – When responses do not accurately represent the full workshop audience. Checking demographic distribution can help detect this bias.
  - Social desirability bias – Participants providing answers they think are expected rather than truthful. Anonymous responses and framing questions to emphasize honesty can help mitigate this.
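To make the sampling-bias check concrete, here is a minimal sketch of comparing the demographic distribution of survey respondents against the distribution of all attendees (e.g., from registration data). All numbers, group names, and the 15-percentage-point threshold are hypothetical illustrations, not from the paper:

```python
from collections import Counter

def representation_gaps(attendees, respondents, threshold=0.15):
    """Return groups whose share among survey respondents differs from
    their share among all attendees by more than `threshold`."""
    total_att = sum(attendees.values())
    counts = Counter(respondents)
    total_resp = len(respondents)
    flagged = []
    for group, n in attendees.items():
        gap = abs(n / total_att - counts[group] / total_resp)
        if gap > threshold:
            flagged.append(group)
    return flagged

# Hypothetical career-stage breakdown: registration vs. survey responses
attendees = {"student": 20, "postdoc": 12, "faculty": 8}
respondents = ["student"] * 5 + ["postdoc"] * 8 + ["faculty"] * 5
print(representation_gaps(attendees, respondents))  # students are under-represented
```

A flagged group does not prove bias, but it signals that conclusions drawn from the survey may not generalize to the whole audience.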
- Rule 5: Effective Survey Design: Use surveys at various stages (pre-, during, post-workshop, follow-ups). Balance quantitative (Likert scales) and qualitative (open-ended) questions. Avoid common pitfalls: compound / overly complex / leading questions, poor multiple-choice options, absolute wording (e.g. always, never), subjective terms (e.g. far, good), and lack of open-ended questions. Pretest surveys.
Improving Measurement (Rules 6-10)
(Rules 7 and 9 are especially relevant for learning workshops.)
- Rule 6: Participant Confidence: Measuring participants’ confidence before and after a workshop helps evaluate its impact. A simple question like “How confident are you about [workshop topic]?” provides insights into perceived changes in understanding or skill levels. Address potential fluctuations (e.g., decreased confidence due to increased awareness) by analyzing open-ended responses.
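A simple way to act on Rule 6 is to pair each participant's pre- and post-workshop answers to the confidence question and summarize the shift. The sketch below uses hypothetical 1–5 Likert responses; it also counts decreases, which (per the rule) are worth investigating via open-ended answers rather than dismissing:

```python
# Hypothetical 1-5 Likert answers to "How confident are you about [topic]?"
# collected before and after the workshop, paired by participant.
pre  = [2, 3, 1, 4, 2, 3, 2, 1]
post = [4, 4, 3, 4, 1, 4, 3, 3]

shifts = [b - a for a, b in zip(pre, post)]
mean_shift = sum(shifts) / len(shifts)
decreased = sum(1 for s in shifts if s < 0)

print(f"Mean confidence shift: {mean_shift:+.2f}")
print(f"Participants reporting lower confidence: {decreased}")
```

A participant whose confidence dropped may simply have become more aware of what they do not yet know, so pairing this count with their free-text comments gives the fuller picture.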
- Rule 7: Specific Skills: Assess skill acquisition related to learning objectives. Frame questions based on workshop type (learning, exploring, creating). Using Likert scale responses (e.g., strongly agree to strongly disagree), you can assess specific outcomes:
  - I understand the purpose of [a particular technique].
  - I can describe the [process].
  - I can apply the [technique] to my work.
  - I have a firm plan to integrate what I learned into my work.
Measuring skill retention months after a workshop can be challenging; the “write to their future self” approach (Rule 8) can help address this.
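Statements like those above are typically scored by mapping Likert labels to numbers and reporting an agreement rate per statement. A minimal sketch, with hypothetical responses and a conventional 1–5 coding (not specified by the paper):

```python
# Conventional 1-5 coding of a five-point Likert scale
LIKERT = {"strongly disagree": 1, "disagree": 2, "neutral": 3,
          "agree": 4, "strongly agree": 5}

def agreement_rate(responses):
    """Fraction of respondents who agree or strongly agree (score >= 4)."""
    return sum(1 for r in responses if LIKERT[r] >= 4) / len(responses)

# Hypothetical answers to "I can apply the [technique] to my work."
answers = ["agree", "strongly agree", "neutral", "agree", "disagree", "agree"]
print(f"{agreement_rate(answers):.0%}")  # 4 of 6 agree -> 67%
```

Comparing agreement rates across the four statements shows where learning stalled, e.g., participants may understand a technique but lack a firm plan to apply it.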
- Rule 8: Multi-Stage Feedback: Gather feedback before (for tailoring content based on demographics, expectations), during (for real-time adjustments), immediately after (post-course surveys, “write to future self” activity), and long-term (4-6 month follow-ups, interviews, cohort meet-up recordings). Use online links and reminders.
- Rule 9: Gamification: Use games (e.g., Treasure Explorers) to assess skill acquisition in a less stressful way. Incorporate into long-term learning. Leverage existing toolkits.
- Rule 10: Measuring Wider Impact: Track impact beyond attendees via social media (hashtags, engagement), post-event reports (DOIs, Altmetric, Google Analytics), and referrals.
Conclusion
Measuring workshop impact requires thoughtful planning, execution, and evaluation. A well-balanced approach ensures that workshops not only meet their objectives but also demonstrate their value to funders and potential participants. Effective measurement validates the effort invested and highlights the broader benefits of well-designed workshops.
Miscellaneous
Data From Study:
–
Year of Publication:
2018
External Link:
Sufi S, Nenadic A, Silva R, et al. Ten simple rules for measuring the impact of workshops. PLoS Comput Biol. 2018;14(8):e1006191. Published 2018 Aug 30. https://doi.org/10.1371/journal.pcbi.1006191