Assessment informs improvements in Brandywine’s Psychology program

The program learning assessment process helps ensure students are gaining the knowledge, skills, and abilities they need to succeed both in and beyond the classroom.

At Penn State, all undergraduate, graduate, and for-credit certificate programs are required to assess how well their students are achieving key PLOs (program learning objectives). Credit: Getty Images / PeopleImages. All Rights Reserved.

MEDIA, Pa. — Penn State Brandywine’s Psychology baccalaureate program recently reexamined student learning related to a key program learning objective (PLO): the ability to differentiate among and apply psychological research methods, as well as analyze and interpret quantitative data. PLOs state what students should know and be able to do by the end of the program. This reassessment followed changes to both the program’s assessment method and instructional practices, based on insights from previous assessment findings. 

At Penn State, all undergraduate, graduate, and for-credit certificate programs are required to assess how well their students are achieving key PLOs. Each year, programs identify at least one PLO to assess. They collect and analyze data to determine how well students are meeting that objective and then use those findings to inform any changes — whether in pedagogy, curriculum, instruction, student support, or assessment methods. This helps ensure that students are gaining the knowledge, skills, and abilities they need to succeed both in and beyond the classroom. 

To gather evidence of student learning, the program administered a multiple-choice survey focused on core research concepts to students approaching the end of their degree. Joshua Marquit, a teaching professor of psychology, noted that the program's learning assessment strategy aims to make sure that graduates are well-equipped to apply their knowledge and skills in both professional and academic settings.

“Our approach emphasizes immediate action based on assessment findings,” he said. “When practical, we reassess the following year to determine whether our changes are having a measurable impact.”  

Using assessment to guide meaningful changes 

Informed by previous assessment data, program faculty made several targeted changes to strengthen students’ research skills. First, they revised several survey items to improve clarity after identifying that some questions were “double-barreled” — asking about more than one concept at once. 

“We applied the same critical-thinking principles we aim to teach our students and realized that some survey items were inadvertently testing multiple concepts simultaneously,” Marquit explained. “We revised those to better isolate specific skills.” 

In addition, Marquit said, the program revised upper-level course content by integrating more frequent, focused engagement with research concepts. These adjustments were intended to reinforce foundational material introduced in earlier methods and statistics courses and to promote longer-term retention. 

“Our goal is to help students confidently apply research, statistical, and psychometric concepts in real-world contexts after graduation,” said Marquit. “We’ve added authentic experiences, such as student research presentations at symposia, as well as more opportunities for formative feedback on work in progress to address areas where students have historically struggled.”

Ongoing use of assessment to inform changes 

According to Marquit, the most recent assessment findings showed a modest but meaningful increase in student performance following the implemented changes. Assessment remains central to the program’s ongoing instructional decisions. Faculty reported that reinforcing key concepts has begun to mitigate challenges some students have faced when applying foundational knowledge in more complex settings. 

“By closely examining how students performed on individual survey items before and after making pedagogical changes, we’ve gained a clearer understanding of where they’re excelling and where further support is needed,” Marquit said. “This continuous cycle of assessment, reflection, and revision underscores our commitment to using evidence-based practices to improve student learning and ensure that the content of the program’s courses stays aligned with our program-level objectives.”

About program learning assessment at Penn State  

The assessment success stories featured in this series highlight how Penn State programs are using assessment findings to improve student learning. These stories typically involve a full cycle of assessment: identifying an area for change, implementing an action plan, and reassessing a program learning objective to see whether there’s evidence that the change or changes made a difference. This process plays a central role in Penn State’s commitment to continuous academic improvement and is commonly referred to as “closing the loop.”  

For more information about the program learning assessment process, email assessment@psu.edu with any questions.

Last Updated December 8, 2025