The Teagle Assessment Project: A Study of the Learning Outcomes for Majors in Classics

Michael Arnush and Kenny Morrell

This presentation will discuss a project to assess the cognitive outcomes of undergraduate students who major in classics, which has been underway since 2005 with funding from the Teagle Foundation. During the first phase of the project (2005-2007), "Measuring Undergraduate Cognitive Outcomes from a Disciplinary Perspective," a group of faculty members in classics representing both undergraduate and research institutions, in consultation with a social scientist, identified a set of intellectual skills associated with the study of classics. Broadly speaking, these skills fall into the categories of critical thinking and postformal reasoning. The group then collaborated on the design of an instrument to measure those outcomes. This instrument comprised essay questions, portions of the Sternberg Triarchic Abilities Test, and a survey of demographic and educational information. In the fall of 2006 the project administered the instrument to a group of forty-six undergraduate students and developed an evaluative rubric.

This work formed the basis for a second phase (2008-2013), "A Longitudinal Study of Critical Thinking and Postformal Reasoning: Assessing Undergraduate Outcomes Within Disciplinary Contexts," which differed from the first in four ways: (1) it included majors in political science to determine whether instruments that feature disciplinary content can better measure the ability to think critically than interdisciplinary methods such as the Collegiate Learning Assessment; (2) it called for administering the instrument at two times, once during the students' first or second years and again during their senior years, to measure change in the development of the intellectual skills; (3) the initial cohort of participants expanded to include students at twelve colleges and universities; and (4) the study explored the connection between critical thinking and postformal reasoning through the use of the Reasoning about Current Issues (RCI) questionnaire.

In the fall of 2009 the study administered the assessment to over 900 students. To maintain contact with the initial group of participants and maximize the response rate for the second administration, the project collected information in the spring of 2011 about the types of courses the students had completed and other campus-based educational experiences, which could account for changes in the cognitive outcomes. In the fall of 2011, the project re-assessed those who were sophomores in 2009, and in the fall of 2012, those who were first-year students at the time of the first administration. To compensate for attrition among the initial group of participants and allow for cross-sectional analysis, the project recruited additional first-year and senior students in 2011 and 2012. This group of respondents included majors in classics and political science who had not participated in the first administration.

This presentation will address three topics. First, we will discuss the process of identifying the outcomes and formulating the essay questions, which were to provide novel stimuli and require students to make interpretations, derive meaning, draw conclusions, and make persuasive arguments without assuming any prior knowledge of the discipline. We will also describe the steps in developing rubrics for evaluating the essays; this will be of particular interest to those who wish to collect baseline information about their students. The second topic will concern issues associated with administering the assessments, such as complications related to institutional review boards, calibrating incentives, and the participants' levels of interest, engagement, and "assessment fatigue." Finally, we will present some of the findings from the study, offer a few interpretive observations, and comment on potential directions for further research.

Session/Panel Title

Demystifying Assessment

Session/Paper Number

58.3
