OBJECTIVE: Poster quality at academic conferences has varied. Furthermore, the few poster-quality rubrics in the literature have limited psychometric evidence. Thus, we compared holistic versus mixed-approach scoring using a recently created poster rubric, scored by multiple raters, to evaluate validation evidence and time-to-score utility. METHODS: Sixty research posters were randomly selected from an academic conference's online poster repository. Using a previously created rubric (and without rubric training), 4 pharmacy education faculty members with varying levels of poster-related experience scored each poster. Initially, each rater holistically scored the posters, providing a single overall score for each. Approximately 1 month later, the raters scored the posters again using a mixed approach, assigning 4 sub-scores and a new overall score. We used Generalizability Theory to assess the effect of rater experience and the Rasch Measurement Model to examine rating scale effectiveness and construct validation. Time-to-score was also compared for each poster. RESULTS: Generalizability analyses showed greater reliability with more experienced raters or when the mixed approach was used. Rasch analysis indicated that rating scales functioned better with the mixed approach, and Wright maps of the construct provided useful measurement validation evidence. Raters reported scoring more quickly (30-60 s per poster) with holistic scoring, though differences in rater experience affected reliability. Meanwhile, mixed-approach scoring was slightly slower (60-90 s per poster), but the impact of rater experience was reduced. CONCLUSION: Scoring was slightly faster with the holistic approach than with the mixed-approach rubric;
however, differences in rater experience were lessened with the mixed approach. The mixed approach was preferable because it allowed for quick scoring while reducing the need for prior rater training. This rubric could be used by students and new faculty when creating posters, or by poster-competition judges. Furthermore, mixed-approach rubrics may be applied beyond posters, such as to oral presentations or objective structured clinical examination stations.
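
For readers unfamiliar with the two measurement frameworks named in the Methods, the standard forms below sketch what such analyses estimate. The abstract does not specify the study's exact facet design or rating-scale category structure, so these equations are generic illustrations (a single rater facet and a shared threshold structure are assumptions), not the authors' exact specifications.

% Relative generalizability (G) coefficient for an assumed posters-by-raters design:
% poster (object of measurement) variance over itself plus relative error,
% where the error term combines the poster-by-rater interaction and residual,
% averaged over n_r raters.
\[
  E\rho^{2} \;=\; \frac{\sigma^{2}_{p}}{\sigma^{2}_{p} + \dfrac{\sigma^{2}_{pr,e}}{n_{r}}}
\]

% Rasch rating scale model: probability that poster n receives category k (of m)
% on item i, given poster measure \theta_n, item difficulty \delta_i, and shared
% category thresholds \tau_j (with \tau_0 defined as 0). Misfitting thresholds
% signal a rating scale that is not functioning as intended.
\[
  P(X_{ni} = k) \;=\;
  \frac{\exp\!\left[\sum_{j=0}^{k}\left(\theta_{n} - \delta_{i} - \tau_{j}\right)\right]}
       {\sum_{h=0}^{m}\exp\!\left[\sum_{j=0}^{h}\left(\theta_{n} - \delta_{i} - \tau_{j}\right)\right]}
\]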