IT instructor earns best in track at international conference

Penn State Scranton Lecturer of Information Sciences and Technology Fred Aebli’s contribution to the OLC Accelerate 2025 Conference — “How ChatGPT Turned My Final Exams Into AI-Powered Learning Labs” — received a Best-in-Track designation in the Innovative and Effective Digital Learning Design track. As a result, he will give his presentation a second time via a live OLC webinar on Feb. 17 as part of a series this month featuring other OLC Best-in-Track honorees. Credit: Morgan Sewack / Penn State. Creative Commons

DUNMORE, Pa. — Penn State Scranton Lecturer of Information Sciences and Technology Fred Aebli’s contribution to the Online Learning Consortium (OLC) Accelerate 2025 Conference — “How ChatGPT Turned My Final Exams Into AI-Powered Learning Labs” — received a Best-in-Track designation in the Innovative and Effective Digital Learning Design track. As a result, Aebli’s session was identified as one that OLC leadership felt would resonate with the consortium’s community, and he will give his presentation a second time via a live OLC webinar on Feb. 17, as part of a series featuring OLC Best-in-Track honorees.

OLC Accelerate showcases groundbreaking research and highly effective practices in online and digital learning across K-12, higher education and corporate learning and development.

Aebli, who has taught at Penn State Scranton for over 20 years, came up with the subject matter of his presentation while grappling with the question: How do we get students to a fundamental knowledge level to use artificial intelligence (AI) effectively without using AI?

“Since the arrival of OpenAI’s ChatGPT, and having spent a great deal of time teaching and using tech, I realized we still need to have good creative thinkers and problem solvers,” he said. “Although we all have an AI in our pocket with our phones, our prior knowledge of certain subject matter allows us to use AI quicker to solve problems. I then looked at how we assessed knowledge with exams and then looked at the goals and objectives of the course.”

Aebli subsequently built exams that met the goals and objectives of the course, “but then, I also felt that since [the students] are heading into the working world, they need to be challenged while being allowed to use AI,” he explained. “So, the split exam/quiz idea surfaced.”

Students first completed a closed-resource exam component that tested their fundamental knowledge, then moved to a second part in which they could use any resource, from internet searches to AI chatbots, to formulate a solution, with the stipulation that they reflect on the solution those tools provided.

Administering the exam was daunting at first, Aebli said.

“I had never used the Respondus LockDown Browser in a fully in-person setting, so I wasn’t sure how it would behave with multiple students working simultaneously in the room," he said. "That uncertainty added some tension going in. However, once the exam began, it ran smoothly. The technology held up, the logistics worked, and the experience was far more seamless than I anticipated.”

One of the most striking things he said he observed, particularly in his upper-division courses, was how the students engaged with the reflective portions of the exam.

“When students reached those questions, they didn’t rush,” he said. “They leaned back in their chairs rather than leaning forward to simply ‘get it done’. They paused. They thought. I even allowed students to step into the hallway, provided they did not interact with one another, and many took advantage of that space to reflect. Watching that happen in real time confirmed what I was hoping to achieve: deeper thinking rather than reactive answering.”

When asked if he had any concerns with students cheating, or not learning the material, Aebli emphasized that while there is a significant amount of discourse online about that topic, the reality is that even when students use external tools, they must still understand what those tools produce and, more importantly, how to apply that information correctly. Without foundational knowledge, the outputs are meaningless, he said.

Based on the successful outcome he saw with his students, Aebli plans to use this approach again.

“This approach is now a standard component across my courses. I still require assessments that test foundational knowledge — such as short, unassisted quizzes or longer ‘big quizzes’ delivered through lockdown browsers," he said. "Those fundamentals matter. At the same time, students also need to learn how to leverage AI appropriately, treating it as a smart assistant rather than a shortcut. Helping students develop that balance is now an intentional part of my teaching philosophy.”

Aebli said he sees AI as a valuable tool for both educators and students, and one that, if utilized with care and responsibility, can enhance the educational experience.

“This tool has the ability to be a great helper and like many tools has the ability to transform the way we think and do many of the things we are doing in the classroom and out of the classroom," he said.

Aebli currently teaches courses in database systems, programming, cybersecurity, project management and human-computer interaction, while also serving as the information technology program co-coordinator and overseeing its internship program.

In addition to those duties, he is a member of Penn State Scranton’s Speaker’s Bureau and has served as a guest speaker to educators, community leaders, faith-based organizations and professional audiences, with the aim of demystifying AI, exploring its ethical and societal implications, and helping organizations adopt AI thoughtfully without losing the human element.
