Tuesday, November 28, 2023
12:45 p.m. Online via Zoom
Attending Voting Members: Promila Dhar (BME), David Gidalevitz (PHYS), Boris Glavic (CS), Erin Hazard (HUM), Stephen Kleps (CAE), Kathiravan Krishnamurthy (FDSN), Eva Kultermann (ARCH), Yuting Lin (BIOL), Yuri Mansury (SSCI), David Maslanka (AMAT), Erdal Oruklu (ECE), Victor Perez-Luna (CHBE), Jacob Thomas (SGA), Ray Trygstad (ITM/Secretary), John Twombly (SSB), Murat Vural (MMAE), Fred Weening (Chair)
Also Attending: Anri Brod (Libraries), Roland Calia (SSB), Jennifer DeWinter (LCSL), BJ Engelhardt (Career Services), Diane Fifles (University Accred), Natalia Gallardo (Registrar), Kyle Hawkins (AMP), Mary Haynes (UGAA), Pam Houser (INTM), Hao Huang (SSCI), Jasmine Johnson (Pathways & Bridging), Melanie Jones (Armour Academy), Sang Baum Kang (SSB), Carly Kocurek (LCSL), Christopher Lee (Registrar), Melisa Lopez (Student Success & Retention), Gabriel Martinez (Armour Academy), Abby McGrath (Enrollment Services), Nick Menhart (DVP Accreditation), Kathleen Nagle (ARCH), Nichole Novak (Libraries), Shamiah Okhai (LCSL), Joseph Orgel (VPAA), Ayesha Qamer (Registrar), Kelly Roark (CLI), Zipporah Robinson (Academic Success), Gabrielle Smith (AA), Katie Spink (BIOL/Past Chair), Mary Jorgenson Sullivan (ELS), Liad Wagman (SSB/CSL), Jeff Wereszczynski (PHYS/BIOL)
Approval of minutes from the 11/14/2023 meeting.
Since the minutes were posted late, after some discussion it was decided to delay approval of the minutes until the next meeting. Joseph Orgel suggested approving the areas of the minutes constituting policy approval so they could be reported to the University Faculty Council.
Ray Trygstad moved that the 11/14/2023 Minutes section 3, concerning a program revision to the BS in Materials Science and Engineering, and section 4, concerning the elimination of the Minor in Game Studies and Design, be approved. Kathiravan Krishnamurthy seconded. The motion to approve the 11/14/2023 Minutes sections 3 and 4 was approved by common consent.
Updates from Academic Affairs.
Joseph Orgel’s remarks in summary:
I suggest that this committee spend some time discussing approaches to the continuing situation developing with academic honesty. This semester a dozen or so Teaching Assistants were reported for Academic Honesty violations, so coupled with generative AI this has become an increasing concern. But if discussions about dealing with academic honesty focus only on detection and discipline, our efforts are going to fail. We must educate students about academic honesty, which should be a strong component of the education our students receive. Our focus should be on learning and on ensuring students understand that it is not just about grades. As we discuss, we learn more about each other's practices across disciplines, not only with academic honesty but also with the use of tools such as AI, which students can use to generate really mediocre long answers, but which faculty can use as a tool to assess student work and provide feedback to students.

A second issue is how various units are dealing with the struggle and adjustment of increased enrollment. There is increased use of co-terminal students as Teaching Assistants, which, technically speaking, is not allowed right now. But Nick Menhart and I are seeking workarounds that would allow this without running afoul of HLC, to include the possibility of demonstrating that fifth-year co-terminal students are the equivalent of graduate students for these purposes. I am trying rapidly to see if we can update our policy to make that completely, clearly allowable, especially since I am aware a number of academic units are depending on it at this moment.
Kathiravan Krishnamurthy had a question about what types of academic dishonesty we are observing.
Joseph Orgel responded that it is a good and important question. Low-level cheating is making use of assignments that other people have submitted. Medium-level cheating is simple cut-and-paste from uncited sources; in addition, some programs have what looks like a substantial part of their curriculum, including assessment materials, on open websites that some students are making use of. Higher-level, large-scale cheating is the use of generative AI to answer short-format questions. There are challenges that involve the way in which we construct assignments and grade them; if we are reliant only on short-answer formats and never change them, that is one kind of challenge. Before generative AI really hit the scene, the lifespan of an original question was about 30 minutes, but it is now down to less than 5, depending on how original the question was. By lifespan, I mean the time within which we can detect a posted answer for a posed question.
Kathiravan Krishnamurthy asked for specifics about the TAs.
Joseph Orgel replied that he cannot be more specific; that is a matter of policy.
Boris Glavic asked why co-terminal students are not considered graduate students for TA assignment.
Joseph Orgel responded that they are explicitly undergraduate students of this institution. But we are seeking exceptions to our own rules that would allow this.
Nick Menhart commented that he agreed with Joseph that we need a clear policy. Some people enter co-terms in their third year; that would probably be inappropriate, so we need to define a better, clearer policy on that. He then asked Joseph Orgel about the use of AI for grading and assessment, noting that we need to begin to investigate this a lot more. Do you have any specific resources on that? There are companies out there selling such tools as well, so we might follow an approach we have already used with a plagiarism detector, a batch process.
Joseph Orgel responded that he has personally bench-tested a number of services and keeps coming back to ChatGPT. Generative AI is less good at creating something new, but has proven to be very good at recognizing and evaluating, and is able to give great feedback. There is training for faculty on how to use these tools for this purpose. A pilot program to train faculty is underway, and if it is judged successful I would open it up immediately to people who are interested, at a scale we can manage to afford right now. Secondarily, I would work with a couple of people in the pilot group to stand up a tool for the rest of us to access, much like we have the batch processing tool for plagiarism detection.
Ray Trygstad commented that he had had students submit AI-generated papers, and that the fabricated citations made them easy to detect. Joseph Orgel replied that if students are willing to pay a little bit of money, they get access to a tool that will change that. A possibly greater concern, though, is AI generation of code comments, and that is a real challenge for assessment.
Fred Weening asked what is the course of action if you feel fairly certain that somebody's used one of these tools?
Joseph Orgel replied that in any instance where, as an instructor, you have decided to take an action because you have detected a problem, the questions are: are you going to take disciplinary action? Are you going to dock points? If you are going to dock points, then you have to report it to my office; that is university policy, and my office will follow up with the student in addition to whatever you do. But the first thing we will do is follow up with you and ask whether you sat down with the student and gave them a chance to explain themselves, because that is our policy. We need to have that interaction. It is good practice to make sure that students understand how to do things appropriately. After that, it is pretty rare that we see a student come in for a second citation. First citations are more numerous than I want to name right now, but second and third citations are still extraordinarily rare, which suggests that once a student has received official notice that continued violations will bring severe action, they take it seriously.
Report from the Core Curriculum Assessment Subcommittee on (S) Assessment as presented by Mary Jorgenson Sullivan.
Mary Jorgenson Sullivan remarked that this is the fifth assessment completed in this cycle of core curriculum assessment; the CS and ITP assessments were completed earlier, and this was the most recent, completed in the spring of 2023. At the beginning of the report are the Core Curriculum learning goals corresponding to the social science learning outcomes, followed by a discussion of the assessment methodology. This assessment generalized the courses to include courses from economics, social science, and psychology, and removed identifying markers for the classes to shift the focus to the data rather than attributing it to any particular class, instructor, or other factors. It was noted for the S designation that there was a large issue with receiving artifacts that were aligned with rubrics, so that only 50% of students could be evaluated within the parameters that were established. The report indicates the reasons and also follows up with some recommendations. While compliance was much better than it has been for other designations, there seemed to be a larger issue with understanding of the assessment process, how the rubrics and artifacts needed to be aligned, and how they needed to be evaluated. Specific details of the assessment results in the report were discussed. It was determined that there were a large number of artifacts and rubrics that did not align sufficiently, and that it is necessary to increase awareness of the learning objectives, specifically the one on scientific study and the one related to communication. There is evidence that for many people there were issues related to understanding of the assessment process as well as the criteria for artifacts and rubric development. Consequently, a revised assessment website is being developed in which instructors can access Frequently Asked Questions as well as a glossary and an overview of the process.
The Committee has also reached out to academic unit heads to ask for support, underscoring the need to follow the process and to ask for help where it is needed.

Fred Weening asked whether there are any further developments in notifying instructors assigned to teach a course in a Core Curriculum category that these are the learning objectives.
Mary Jorgenson Sullivan replied that prior to the semester in which assessment takes place, we invite all the instructors who are teaching a Core Curriculum course in that designation to a reach-out in which we walk them through the learning objectives. We talk about the artifacts, the criteria for artifacts, and then also give guidance on developing rubrics. Then we have members of the CCAC who liaise with those faculty members to check in with them, to ensure that they are coming along in the process and are aware of what they need to do, and to help them troubleshoot. We have ongoing support from the CCAC. We also do another reach-out in the semester when the assessment takes place. Our next assessment will be of IPRO, and so Nick and Eduardo are working actively on creating framing documents and tools that people can use for assessments, and on making sure that the learning objectives are clarified.
This was followed by some discussion of how each discipline is involved in the revision of the Core Curriculum learning objectives related to its courses.
Kathiravan Krishnamurthy moved to approve the report and Eva Kultermann seconded.
The motion to approve the Report from the Core Curriculum Assessment Subcommittee on (S) Assessment passed unanimously with a vote of 15 to 0.
(S) Subcommittee recommendation on course SSCI 492 seeking (S) designation was presented by Fred Weening. There was some discussion as to why a Core Curriculum course in a particular discipline had to be approved by the Undergraduate Studies Committee and not just within the faculty for that discipline. It was pointed out that while we may not have done approvals this way in the past, this is being done to comply with specific expectations of, and instructions from, our institutional accrediting agency. Boris Glavic moved to approve the (S) designation for the course and Yuri Mansury seconded.
The vote was 13-0 to approve the proposed (S) designation of SSCI 492.
As time for the meeting was at an end, Ray Trygstad moved to adjourn and the Chair adjourned the meeting at 1:44 pm.