Comment Analysis with U-M Maizey
Overview

U-M Maizey is a tool that enables U-M faculty, staff, and students to use custom data sets to enhance their GenAI experience, helping them extract insights, discover patterns, and gain deeper knowledge from the data. It is part of a generative AI platform being offered by U-M ITS. The Office of the Registrar created the Comment Analysis project in U-M Maizey to help instructors glean productive and meaningful insights from student feedback. 

Using the Comment Analysis tool 

  1. Navigate to https://umgpt.umich.edu/maizey/commentanalysis
  2. Greet the AI or simply get down to business.
  3. Provide the question text and the comments given in response to that question. We recommend copying the question and responses directly from the Instructor Report.
  4. Repeat for each open-ended question.
  5. Provide feedback on your experience. 

FAQ

Q. What is the purpose of this tool?

A. The purpose of the Comment Analysis tool is to help instructors digest student feedback in productive ways. Individual comments may sometimes contain strong language or specific experiences that can overshadow broader, more common sentiments among the class. Additionally, the sheer volume of comments left by a large class may make identifying common themes difficult. This tool is intended to assist instructors in gleaning meaningful and high-level insights from student feedback.

Q. I would like to keep these comments private. Should I be concerned with feeding them into this tool?

A. No. What you feed into the tool cannot be seen by anyone else, not even the project owners. 

Q. Why would I use the Comment Analysis tool instead of U-M GPT?

A. The Comment Analysis tool has been lightly coached on how to understand and analyze the content you provide. Its job is to provide a summary of the comments, frequently mentioned positive impressions, and frequently mentioned suggestions for improvement. We've also asked the tool to indicate whether there was sufficient data to make generalizations about the course. In this way, the Comment Analysis tool has been tailored to provide a more nuanced and controlled analysis. That said, you may find it interesting to try both the Comment Analysis tool and U-M GPT to see the differences and similarities in their performance.

Q. I only have a handful of student comments. Should I use the Comment Analysis tool? 

A. You are welcome to use the Comment Analysis tool for a small number of responses, but bear in mind that its job is to provide generalizations. A small data set may already be clear and specific enough that generalization isn't beneficial to you.

Q. Can I copy and paste my entire Instructor Report into the tool?

A. No. The tool is designed to analyze only feedback on open-ended questions. It is not built to analyze responses to single-selection questions. Additionally, the tool is designed to consider the question when analyzing responses. Therefore, we strongly recommend analyzing one question (and its responses) at a time. For example, if your report includes Q900 (with 20 responses) and Q908 (with 22 responses), we recommend feeding Q900 and its 20 responses to the AI tool. Once the analysis is complete, then feed the AI tool Q908 and its 22 responses. 

Q. I asked the AI tool a follow-up question and it asked me to provide comments, which I already have. What's happening?

A. The AI tool is structured as one query, one response: it does not incorporate or reference the chat history in its responses. The ability to ask follow-up questions is a feature currently being developed by ITS.

Q. The AI analysis is completely inaccurate. Now what?

A. We appreciate that you've taken the time to use the tool and make a critical assessment of its efficacy. If you have a moment, please complete the feedback form. Our use of this tool is exploratory and evolving, and your feedback may help it develop into something more useful and accurate.

Q. The AI analysis is spot on! Now what?

A. That's great to hear! Please consider completing the feedback form, so that we have a comprehensive understanding of the user experience.

Q. Is this tool the official AI companion to teaching evaluation feedback moving forward?

A. No. The Office of the Registrar is exploring multiple artificial intelligence and machine learning resources. This project is a dip of the toe into AI waters and is in very early stages. If the Office of the Registrar does establish an AI companion, it could be a more developed version of this tool or a different one altogether. 

Details

Article ID: 13195
Created
Thu 12/19/24 9:56 AM
Modified
Mon 12/23/24 9:22 AM