Entry Date: January 25, 2017

The Role of Instructor and Peer Feedback in Improving the Cognitive, Interpersonal and Intrapersonal Competencies of Student Writers in STEM Courses

Principal Investigator Suzanne T Lane

Project Start Date September 2015

Project End Date August 2018


The Promoting Research and Innovation in Methodologies for Evaluation (PRIME) program seeks to support research on evaluation with special emphasis on: (1) exploring innovative approaches for determining the impacts and usefulness of STEM education projects and programs; (2) building on and expanding the theoretical foundations for evaluating STEM education and workforce development initiatives, including translating and adapting approaches from other fields; and (3) growing the capacity and infrastructure of the evaluation field.

This project will have critical significance for Science, Technology, Engineering, and Mathematics (STEM) educators by improving students' writing and collaboration skills, areas of importance to the economy, science, and national security. The study focuses on how teacher and peer interactions relate to the quality and improvement of student writing in undergraduate STEM courses. Specifically, the project will map the development of three competency domains (cognitive, interpersonal, and intrapersonal) by researching the effects of teacher and peer response on writing improvement and knowledge adaptation in STEM courses. The project uses a web-based assessment tool called My Reviewers (MyR), which will be piloted by STEM faculty in college-level introductory biology or chemistry courses at the University of South Florida (USF), North Carolina State University (NCSU), Dartmouth, the Massachusetts Institute of Technology (MIT), and the University of Pennsylvania (UPenn). Research domains include both academic performance and interpersonal/intrapersonal competencies. Project deliverables will provide new tools and procedures to assist in assessing students' knowledge, skills, and attitudes for project and program evaluation.

Approximately 10,000 students enrolled in STEM courses at USF, NCSU, Dartmouth, MIT, and UPenn will upload their course-based writing to My Reviewers and use the tool to conduct peer reviews and team projects. This information will be supplemented by surveys of demographics and dispositions, along with click patterns recorded within the toolset. Researchers will then analyze this data using predictive modeling of student writing ability and improvement, including text-based methods to identify useful features of comments, papers, peer reviews, student evaluations of peers' reviews, and instructor and student meta-reflections. Outcome goals are to (1) demonstrate ways the assessment community can use real-time assessment tools to create valid measures of writing development; (2) provide quantitative evidence regarding the likely effects of particular commenting and scoring patterns on cohorts of students; (3) offer a domain map to help STEM educators better understand student success in the STEM curriculum; and (4) inform STEM faculty regarding the efficacy of peer review.
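
To make the analysis plan concrete, the sketch below shows one way such text-based predictive modeling could look in practice. It is not the project's actual pipeline: the data is synthetic, the column names (comment, score_gain) are hypothetical, and scikit-learn is assumed only for illustration. The sketch fits a linear model on bag-of-words features of instructor comments to predict draft-to-revision rubric score gains, the kind of relationship between commenting patterns and writing improvement the project aims to quantify.

    # A minimal sketch (not the project's actual method) of predicting writing
    # improvement from instructor-comment text. All data is synthetic and the
    # column names are illustrative assumptions.
    import pandas as pd
    from sklearn.feature_extraction.text import TfidfVectorizer
    from sklearn.linear_model import Ridge
    from sklearn.model_selection import cross_val_score
    from sklearn.pipeline import make_pipeline

    # Hypothetical records: an instructor comment paired with the change in a
    # student's rubric score between draft and revision.
    data = pd.DataFrame({
        "comment": [
            "Clarify your hypothesis in the introduction.",
            "Strong methods section; tighten the abstract.",
            "Figures need labeled axes and units.",
            "Good revision of the discussion; cite sources for claims.",
            "Restructure paragraphs so each makes one point.",
            "Define key terms before using them.",
        ],
        "score_gain": [0.5, 0.25, 0.75, 0.25, 1.0, 0.5],
    })

    # TF-IDF features over comment text feed a linear model: which comment
    # features co-occur with larger draft-to-revision gains?
    model = make_pipeline(TfidfVectorizer(ngram_range=(1, 2)), Ridge(alpha=1.0))
    scores = cross_val_score(model, data["comment"], data["score_gain"],
                             cv=3, scoring="neg_mean_absolute_error")
    print("Mean absolute error across folds:", -scores.mean())

At the project's scale (tens of thousands of comments with demographic and click-pattern covariates), the same basic design generalizes: text features of feedback become predictors, and measured writing improvement is the outcome.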