Pragmatic evaluation for large events with youth: use of a brief visual analog scale measure.
Abstract: Health educators and program evaluators struggle to balance collecting meaningful data with the pragmatics of evaluation data collection, particularly for large youth events. A local day-long community-based youth summit was held with 289 middle school students. Youth attended combinations of mass and break-out sessions. Planners and evaluators assessed proximal student outcomes throughout the day. A 2-question visual analog scale was developed and utilized. Each session-specific evaluation form was color-coded and distributed to participants prior to, and collected at the end of, each session. The evaluation tool assessed students' perceptions of learning and enjoyment. This method enabled evaluators to accurately calculate response rates and refusals. Importantly, the method was efficient, inexpensive, and simple.
Subject: Teenagers
Authors: McKyer, E. Lisako J.
Outley, Corliss
Smith, Matthew Lee
Pub Date: 09/22/2009
Publication: Name: American Journal of Health Studies Publisher: American Journal of Health Studies Audience: Professional Format: Magazine/Journal Subject: Health Copyright: COPYRIGHT 2009 American Journal of Health Studies ISSN: 1090-0500
Issue: Date: Fall, 2009 Source Volume: 24 Source Issue: 4
Product: Product Code: E121930 Youth
Accession Number: 307670894


An emerging body of research reveals multiple benefits of integrating youth participation in community health efforts (Breitbart & Kepes, 2007; Cahill, 2007; Checkoway, Dobbie, & Richards-Schuster, 2003; Goodyear & Checkoway, 2003). Targeting youth through community-based programs that reduce risk, promote assets, and build social capital shows great promise. Organized youth activities have been shown to be popular and effective in delivering health education content and providing experiences associated with positive youth development (Dworkin, Larson, & Hansen, 2003). Youth summits enable health professionals and volunteers to collaborate with youth to accurately identify relevant psychological and socioecological health concerns, express opinions about health programs and initiatives, and offer recommendations for program improvement.

One-day youth summits attract several hundred participants who convene in a single location. Often, large youth summits cater to participants in different grade levels and from different schools. Even with the most careful of planning, the nature of these youth gatherings may be chaotic, challenging to manage, and a logistical nightmare for event planners and volunteers. Imagine several hundred children and adolescents on a field trip! To complicate this potentially overwhelming environment, youth summit formats frequently incorporate multiple brief sessions hosted throughout the day, with participants rotating in attendance across sessions and venues.


Various evaluation approaches have been used by summit planners. Summits hosted for high school-aged youth often utilize questionnaires given to participants at the end of each session. While detailed questionnaires may work with older youth, these methods may not be practical with younger groups (e.g., primary and middle school ages) due to event-related time constraints and literacy issues. Other approaches have included post-summit follow-ups with youth participants (i.e., retrospective approaches; Smith, Genry, & Ketring, 2005). Although retrospective approaches allow for more in-depth collection of information, such methods are vulnerable to session recall bias and to small sample sizes resulting from limited resources, and may include data from sub-samples that are neither representative of nor generalizable to the greater sample (Smith, Genry, & Ketring, 2005). In short, retrospective evaluation formats may threaten evaluators' ability to determine the internal validity of youth summits, which makes any hope of establishing external validity to similar populations nearly impossible.

Ideally, data should be collected from summit participants at the end of each session. Evaluators with less stringent restrictions on funding and resources have accomplished this by providing youth participants with hand-held computer devices (Proescholdbell, Scott & Placona, 2005). Using hand-held computer devices to collect information is quick and effective; however, costs associated with this highly technical approach are prohibitive for most communities, which primarily rely on local contributions to meet basic summit needs. Although beneficial in some situations, these evaluations may detract from the effectiveness of youth programs due to designating a large proportion of allocated funds toward overly-sophisticated methods of data collection.

In consideration of various challenges to youth summit evaluation, a need exists for a simple, quick, inexpensive, yet practical and useful outcome evaluation tool, which can be used with large numbers of participants--regardless of participant literacy levels. This paper describes a successful, timely, simple and inexpensive method implemented by the Youth Summit Evaluation Team to assess summit participants' perceptions of how much they learned and their level of enjoyment during each session.



The Bryan/College Station 2008 Youth Summit (BCS Youth Summit) was based on the principles of America's Promise Alliance (1997), and aimed to deliver the Alliance's "five promises" to targeted disadvantaged youth. The Alliance's research-based framework for youth development focuses on every child having: 1) Caring Adults, 2) Safe Places, 3) A Healthy Start, 4) Effective Education, and 5) Opportunities to Help Others. The 2008 BCS Youth Summit included five sessions based on the aforementioned promises (i.e., one mass session and four break-out sessions; see Table 1). Each session lasted between 30 minutes and one hour. Sessions were held in different rooms throughout the summit's venue. Sessions integrated lecture, discussion, and activities to elicit engagement and allow participants to apply the content and concepts presented. The 289 summit participants and volunteers were given approximately ten minutes to transition between sessions.


Participants were identified as eligible to attend the BCS Youth Summit by their middle school teachers. Teachers were provided a checklist using objective criteria to identify disadvantaged students. Based upon teachers' selections, students' parents were sent recruitment letters and consent forms to enroll their child in the summit. Active parental consent was required for students to attend the event. A total of 289 6th, 7th, and 8th grade participants from 4 local middle schools attended the BCS Youth Summit. Additionally, approximately 40 community volunteers and 20 teacher volunteers donated their time to the day-long event. All four middle schools were rated as academically acceptable based on their campuses' 2007-2008 TAKS performance scores.



Youth summit planners recognized the need for collecting evaluation data beyond merely tallying the number of participants who attended the event. Of equal or greater importance was the need to collect (1) information to assist in the improvement of subsequent summits, (2) data to document goal achievement, and (3) evidence that may be parlayed into securing additional event sponsorship, support, and funding in future years.

Brief Visual Analog Measure for Youth (Proximal Outcome Measure). A Brief Visual Analog Survey Measure for Youth (VAS) instrument was developed and utilized to assess participants' perceptions following each of the four youth summit break-out sessions. The instrument consisted of two questions: (1) How much did I learn this session? and (2) How much did I enjoy this session? Response options were on a 3-point Likert-type scale enhanced by a visual analog to facilitate comprehension by children with wide-ranging reading abilities (see Figure 1). Nine forms fit on a standard 8 1/2 x 11 sheet of paper, thereby keeping printing expenses to a minimum.


Items needed to conduct the evaluation are low cost and easily obtainable. Materials needed include (1) Brief VAS Measure for Youth, (2) small colored gift bags (one differing color for each session type), and (3) small "golf" pencils.

Prior to the youth summit, volunteers attended an orientation to familiarize them with the BCS Youth Summit timeline, the event venue, and the evaluation protocol. Event evaluators were instructed about the most effective times and locations to disseminate and collect evaluation materials to maximize efficiency. Following the orientation, volunteers organized the Brief VAS Measures for Youth into color-specific stacks of twenty for easy dissemination. Each session's VAS was printed on different colored paper (i.e., Ready for 21 on white paper, Safe Places on blue paper, Healthy Start on green paper, and Job Skills on yellow paper; see Table 2). This enabled evaluators to quickly conduct visual checks to prevent cross-session data contamination.

On the day of the event, youth were handed the VAS upon entering each session. Disseminating VAS at the beginning of each session enabled evaluators to conduct a quick count of the number of evaluation forms distributed. At the conclusion of each session, youth were reminded to complete the VAS and submit it upon exiting the room. Participants were instructed to drop the completed VAS form into the colored bags, which were held by the evaluator(s) stationed in strategic locations. After all VAS forms were collected, the collection bags were sealed and labeled (see Table 2). Participants were then physically directed to the next session. Comparing the number of VAS forms disseminated with the number of forms collected enabled evaluators to simply calculate evaluation completion rates.
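The completion-rate bookkeeping described above reduces to a simple ratio of forms collected to forms distributed for each session. The following sketch is a minimal illustration of that arithmetic; the session names mirror the summit's, but the tallies are hypothetical, not the actual counts.

```python
def completion_rate(distributed: int, collected: int) -> float:
    """Percentage of disseminated VAS forms that were completed and returned."""
    if distributed <= 0:
        raise ValueError("at least one form must be distributed")
    return 100.0 * collected / distributed

# Hypothetical per-session tallies (distributed, collected), for illustration only.
session_counts = {
    "Job Skills": (300, 220),
    "Safe Places": (300, 219),
}

for session, (given, returned) in session_counts.items():
    print(f"{session}: {completion_rate(given, returned):.2f}% completed")
```

Because distribution counts were taken at the door of every session, the same two numbers also yield per-session refusal counts (distributed minus collected).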


All analyses were performed in SPSS (version 16). Frequencies were calculated for participant perceptions of learning and enjoyment. Crosstabulations were conducted to identify frequencies of these perceptions for each session. A Spearman's rank correlation coefficient was calculated to identify the strength and direction of the relationship between these ordinally scaled perceptions of learning and enjoyment.
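The correlation analysis does not require SPSS. Because the responses are 3-point ordinal data with many ties, Spearman's rho amounts to a Pearson correlation computed on average (midrank) ranks. The pure-Python sketch below illustrates that calculation on made-up response codes (1 = not a lot, 2 = some, 3 = a lot), not the summit's data.

```python
def average_ranks(values):
    """Rank observations 1..n, assigning tied values their average rank."""
    order = sorted(range(len(values)), key=lambda i: values[i])
    ranks = [0.0] * len(values)
    i = 0
    while i < len(order):
        j = i
        # Extend j over the run of tied values.
        while j + 1 < len(order) and values[order[j + 1]] == values[order[i]]:
            j += 1
        avg = (i + j) / 2 + 1  # average of the 1-based ranks i+1 .. j+1
        for k in range(i, j + 1):
            ranks[order[k]] = avg
        i = j + 1
    return ranks

def pearson(x, y):
    """Pearson product-moment correlation of two equal-length sequences."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    var_x = sum((a - mx) ** 2 for a in x)
    var_y = sum((b - my) ** 2 for b in y)
    return cov / (var_x * var_y) ** 0.5

def spearman_rho(x, y):
    """Spearman's rank correlation, with average ranks handling ties."""
    return pearson(average_ranks(x), average_ranks(y))

# Made-up ordinal responses: 1 = not a lot, 2 = some, 3 = a lot.
learned = [3, 3, 2, 3, 1, 2, 3, 2]
enjoyed = [3, 2, 2, 3, 1, 1, 3, 2]
print(f"rho = {spearman_rho(learned, enjoyed):.2f}")
```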



VAS forms were disseminated to participants prior to each summit session, for a total of 884 forms distributed across all sessions. Of the 289 middle school students who attended the BCS Youth Summit, a majority completed the brief VAS measure for each session. Evaluation completion rates ranged from 73.01% for the Safe Places session to 93.43% for the Ready for 21 session (see Table 3).


Most participants reported having learned "some" or "a lot" during the Job Skills, Healthy Start, Ready for 21, and Safe Places sessions (82.3%, 86.9%, 95.9%, and 96.2%, respectively). Further, a majority of students reported having enjoyed the sessions "some" or "a lot" (75.5%, 84.8%, 90.3%, and 94.3%, respectively). Table 4 provides a detailed description of the session-specific learning and enjoyment as reported by participants.

The VAS included only two items, measuring two independent concepts (i.e., learning and enjoyment). Although it is inappropriate to assess the internal consistency reliability of a 2-item instrument, a correlation coefficient was calculated to determine if perceived learning was related to enjoyment. A strong positive association was identified between perceptions of learning and enjoyment ([rho] = 0.69, p < 0.001).

The internal validity of the VAS (i.e., whether the VAS accurately measured what the evaluators intended it to measure) was supported through the use of behavioral observations (McKyer, Outley, & Smith, 2009). Evaluators used a systematic approach to observe youth during sessions, which resulted in consensus that participants generally enjoyed the sessions and were generally engaged for their duration. Behavioral observations were more favorable for engagement and enjoyment in the Ready for 21 and Safe Places sessions than in the Job Skills session, which was consistent with the self-report data collected with the VAS.


Perceptions persist regarding the disparate concerns of researchers and practitioners, particularly as they relate to balancing the needs of program evaluation. Researchers seek measurement validity, and although practitioners appreciate this need for validity, they must also deal with the realities of potential chaos commensurate with large-scale events involving youth. The authors conclude the VAS meets these oft-competing needs, demonstrating that feasibility and measurement quality are not mutually exclusive. The VAS protocol resulted in high completion rates, yielded no data collection errors, enabled investigators to calculate participation and response rates, and afforded investigators and event coordinators the ability to gauge the program's effects on youth learning and enjoyment.

There are limitations to this tool. The current VAS included two items intended to globally capture two concepts: youths' perceived learning and enjoyment. Two items, particularly when a single item is used to measure each concept, are usually inadequate for establishing measurement quality as determined by internal consistency reliability. While using only two items was appropriate for the BCS Youth Summit, the authors recommend that additional items be added in future administrations of this VAS (i.e., at least two items per concept) to improve the quality of the measures. This recommendation is accompanied, however, by a warning to refrain from adding so many items that the purpose of a brief, quickly completed evaluation tool is defeated.

The formative application of a Brief VAS Measure in a large youth-focused setting has allowed the authors to illustrate both the benefits and limitations of its use (see Table 5).

A major strength of the VAS is its visual analog format. Research has shown that pictorial representations (i.e., visual analog scales) provide children and adolescents an easy way to match responses to their internal feelings concerning a specific context or state (Chambers & Craig, 1998), although most such scales have been used in clinical and medical settings. From the lessons learned during the BCS Youth Summit evaluation process, the authors conclude the visual analog format is appropriate for large-scale events as well.

A strength of the BCS Youth Summit evaluation design and protocol was the ability to triangulate other measures with the VAS. In this case, behavioral observations enabled the Youth Summit Evaluation Team to corroborate the accuracy of the VAS in measuring outcomes associated with the youth summit's effectiveness (e.g., enjoyment). The high completion rates obtained serve as additional support for the VAS's utility, and emphasize the importance of careful advance planning by the evaluation team. The completion rates were easily computed because the summit evaluation protocol allowed evaluators to efficiently track and compare the number of distributed VAS forms to the number of completed and collected forms.

The utilization of the VAS as an evaluation tool demonstrated several critical lessons. First, an extremely simple data collection method (such as the VAS) is a feasible, easy, and efficient means of collecting evaluation data. Second, the VAS's visual format, brevity, and the data collection plan maximized participants' completion of evaluation materials, ensured participants' comprehension of evaluation items regardless of literacy level, and enabled evaluators to accurately track participants' compliance without disrupting transitions between sessions. Finally, color-coding the VAS and strategically positioning members of the evaluation team during data collection were instrumental in eliminating potential summit fidelity issues and ensuring the accurate calculation of evaluation completion rates.

This VAS tool shows great utility and potential for use in an array of social contexts (studies or fieldwork) involving diverse participants of varying verbal or cognitive abilities. The ultimate usefulness of the brief VAS measure is dependent upon its application and protocol fidelity. The inexpensive nature and simplicity of the measure makes this, and similar, VAS a desirable research tool in most settings and contexts. The authors encourage other interested researchers to use the scale in their particular fields of interest.


America's Promise Alliance (1997). Available online at:

Breitbart, M. M. & Kepes, I. (2007). The Youth Power story: How adults can better support young people's sustained participation in community-based planning. Children, Youth and Environments, 17(2), 226-253.

Cahill, C. (2007). Doing research with young people: Participatory research and the rituals of collective work. Children's Geographies, 5(3), 297-312.

Cahill, C. & Hart, R. A. (2007). Re-thinking the boundaries of civic participation by children and youth in North America. Children, Youth and Environments 17(2): 213-225.

Chambers, C. T., & Craig, K. D. (1998). An intrusive impact of anchors in children's faces pain scales. Pain, 78, 27-37.

Checkoway, B., Dobbie, D., & Richards-Schuster, K. (2003). Involving young people in community evaluation research. Community Youth Development Journal, 4, 7-11.

Checkoway, B. & Richards-Schuster, K. (2003). Youth participation in community evaluation research. American Journal of Evaluation, 24, 21-33.

Checkoway, B. & Richards-Schuster, K. (2004). Youth participation in evaluation and research as a way of lifting new voices. Children, Youth and Environments, 14(2), 84-98.

Dworkin, J. B., Larson, R., & Hansen, D. (2003). Adolescents' accounts of growth experiences in youth activities. Journal of Youth and Adolescence, 32, 17-26.

Ginwright, S., Noguera, P., & Cammarota, J. (eds.) (2006). Beyond resistance! Youth activism and community change: New democratic possibilities for practice and policy for America's youth. New York: Routledge.

Goodyear, L. & Checkoway, B. (2003). Establishing the importance of youth participation in community evaluation research. Community Youth Development Journal, 4, 5.

McKyer, E. L. J., Outley, C., & Smith, M. L. (2009). The use of behavioral observations for large youth focused events. Unpublished manuscript.

Proescholdbell, S. K., Scott, S. A., & Placona, M. L. (2005). Above and beyond "it was great": Evaluation of youth summits using innovative methods. National Conference on Tobacco or Health, Chicago, IL.

Smith, T., Genry, L., & Ketring, S. (2005). Evaluating a youth leadership life skills development program. Journal of Extension, 43(2). Available at:

E. Lisako J. McKyer, PhD, MPH, is an Assistant Professor and Director of Child & Adolescent Health Research Lab, Department of Health & Kinesiology, Texas A&M University. Corliss Outley, PhD, is an Assistant Professor and Faculty Affiliate of Youth Development Program, Department of Recreation, Park & Tourism Sciences, Texas A&M University. Matthew Lee Smith, PhD, MPH, CHES, is a Research Associate and Faculty Affiliate of Child & Adolescent Health Research Lab, School of Rural Public Health, Texas A&M Health Science Center. Please send all correspondence to E. Lisako J. McKyer, PhD, MPH, Assistant Professor and Director, Child & Adolescent Health Research Lab, Department of Health & Kinesiology, Texas A&M University, 4243 TAMU, Dulie Bell Bldg, Rm 217, College Station, TX 77843-4243. Phone: 979-845-9280. Fax: 979-845-4941.
Table 1. BCS Youth Summit Sessions and Descriptions

Youth Summit     Description & Purpose

Ready for 21     Introduce participants to caring adults
                 working in professions of interest to youth.
                 Adults were invited to attend and discuss
                 their job responsibilities and day-to-day work.

Healthy Start    Teach participants the importance of nutrition
                 and of quick, healthy, low-cost snacks.
                 Participants were provided a snack while
                 learning the benefits of low-fat diets and of
                 consuming recommended portion sizes.

Job Skills       Introduce effective education to participants
                 by focusing on the concept of personal finances
                 and techniques to manage their money. Lecture,
                 discussion, and a board game were used to allow
                 participants to apply knowledge and skills learned.

Safe Places      Teach participants the importance of internet
                 safety and skills to identify internet threats
                 and avoid internet predators. Facilitators led
                 participants through discussions to identify
                 inappropriate content in Facebook profiles.

The Call         Introduce the concept of service learning to
                 participants and provide an avenue for
                 opportunities to help others. This session
                 encouraged participants to volunteer.

Table 2. Youth Summit Session Logistics

                                Ready    Safe     Healthy   Job
                                for 21   Places   Start     Skills

Number of Rotations             1        3        3         3
Number of Classrooms            1        6        1         2
Total Sessions                  1        18       3         6
Number of Room Exits Per Room   3        1        1         1

Total Number of Bags Needed *   3        18       3         6

Paper & Bag Color               White    Blue     Green     Yellow

* For dual-wide doors, evaluators may want a bag on each side of the door,
thereby doubling the number of bags needed per door.

Table 3. Youth Summit Session Evaluation Completion Rates

Evaluation Completion Rates (n = 289)

                Completion Rates (%)

Job Skills      73.34
Healthy Start   84.43
Ready for 21    93.43
Safe Places     73.01

Table 4. Reported Learning and Enjoyment Levels by Youth
Summit Participants

Session Frequencies: I learned during this session.

                 A Lot          Some          Not A Lot     Missing

Job Skills        89 (42.0%)    85 (40.1%)    30 (14.2%)     8 (3.8%)
Healthy Start    154 (63.1%)    58 (23.8%)    19 (7.8%)     13 (5.3%)
Ready for 21     182 (67.4%)    77 (28.5%)     5 (1.9%)      6 (2.2%)
Safe Places      170 (80.6%)    33 (15.6%)     2 (0.9%)      6 (2.8%)

Session Frequencies: I enjoyed this session.

                 A Lot          Some         Not A Lot      Missing

Job Skills        68 (32.1%)    92 (43.4%)   46 (21.7%)     6 (2.8%)
Healthy Start    146 (59.8%)    61 (25.0%)    22 (9.0%)     15 (6.1%)
Ready for 21     161 (59.6%)    83 (30.7%)    10 (3.7%)     16 (5.9%)
Safe Places      151 (71.6%)    48 (22.7%)     3 (1.4%)      9 (4.3%)

Table 5. Strengths and Limitations of the Brief VAS Measure


Easy to administer  The protocol utilized is very easy to oversee,
                    quick to complete, required very little
                    instruction, and did not disrupt summit sessions

                    Providing simple instructions and materials
High completion     to volunteer data collectors and the youth
rate                participants resulted in very high completion
                    rates. The color-coded sheets and collection
                    bags also increased the visibility of the VAS
                    measure and prompted its completion

Less recall bias    By conducting the evaluation immediately
                    after each session participants did not have
                    to recall their perceptions of the sessions

                    The only expenses were paper, copying,
Inexpensive         and collection bags, making the VAS less
                    expensive than other measurement approaches.
                    This is especially true given that a standard
                    8.5 x 11 inch sheet could contain
                    multiple VAS forms

Can calculate       The ability to determine the exact number
response rates      of VAS measures disseminated and collected
                    is vital in program evaluation. The protocol
                    used for this study enabled the researchers
                    to calculate exact response rates

Ability to assess   The use of "behavioral observations"
internal validity   enabled evaluators to triangulate data to
                    confirm the internal validity of data
                    collected with the VAS

Used for varying    The scale is beneficial for use in low
literacy levels     literacy or non-English speaking participants


3-point Likert-     The 3-point scale does not provide as much
type scales         variance in responses as a 5- or 7-point
                    scale, so some levels of psychological
                    states or perceptions are not easily
                    captured

Session likes vs.   The VAS measure focused only on whether
Program             participants enjoyed the sessions, not on
objectives          whether the summit met its programmatic
                    goals. The form could be expanded with two
                    or more additional questions that
                    specifically measure program objectives

No comparison       Because this was a one-time event, no
group (internal     comparison group was available, so these
and external        aspects of validity could not be assessed
validity)

                    The intent of the Youth Summit was to
No pre/post-test    introduce America's Promise Alliance's five
of behavioral       promises to the participants. Given the
change              event logistics, incorporating a pre-test/
                    post-test to measure changes in knowledge
                    and behavioral intent was not possible. As
                    a result, little is known about how the
                    event changed participants' behavior

Lack of             The VAS measure was designed to be very
demographic data    brief and did not include information such
                    as school or sex that would allow
                    comparisons in analysis