Guidance on AI Settings in U-M Maizey

Environment

U-M Maizey

Issue

How do the different options under AI Settings work in a Maizey project?

Resolution

System Prompt Augmentation

System Prompt Augmentation allows you to define and fine-tune how the system responds to user prompts. 

  • You can assign a “persona” for the system (e.g., You are a biology professor and expert in evolutionary biology).
  • You can define how the system should answer questions (e.g., Suggest other related resources; Include supporting web links and prove your work; If you don’t know the answer, do not guess).

View the Maizey System Prompt Library for additional guidance and examples.

A default System Prompt is provided automatically and is suitable for most general use cases.

Depending on the option you select for Choose a Chain (see below), certain elements must be included for the tool to function properly. If you edit the System Prompt, be sure to include the following text at the end of your prompt (a complete example follows the required text).

For Retrieval QA, copy and paste the following:
{context}
Question: {question}
Helpful Answer:
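
For example, a complete edited System Prompt for a Retrieval QA project might look like the following. The persona and answering instructions are illustrative only; the final three lines are the required text:

You are a biology professor and expert in evolutionary biology. Answer questions using the provided context, suggest other related resources where relevant, and if you don’t know the answer, do not guess.
{context}
Question: {question}
Helpful Answer: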

Temperature

Temperature allows you to adjust how random or predictable the model’s output is.

  • A medium temperature is recommended for most use cases, but you are encouraged to experiment with temperature settings to fit your needs.
  • The higher the temperature, the more “creative” a response the model might provide to a query (i.e., you’re giving it permission to “think outside the box”). However, a high temperature might result in answers that are too unusual. Higher temperatures are generally better suited to use cases where creativity is valued (e.g., generating song lyrics, poetry, etc.).
  • The lower the temperature, the more “conservative” a response the model will provide to a query (i.e., it sticks to safe answers with a high probability of being accurate). However, a low temperature might cause the model to miss some correct answers because it is not certain enough. Lower temperatures are generally better suited to use cases where accuracy is valued (e.g., responding to questions with one definitive answer). A brief illustration of how temperature affects the model’s output follows this list.
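
Maizey applies the temperature value for you, but the following Python sketch illustrates, in general terms, how temperature rescales a model’s word scores before one is sampled. The function name and scores are hypothetical and are not part of Maizey:

import math

def sampling_probabilities(logits, temperature):
    # Scale the raw word scores by the temperature, then apply softmax.
    # Higher temperature flattens the distribution (more randomness);
    # lower temperature sharpens it (more predictable output).
    scaled = [score / temperature for score in logits]
    peak = max(scaled)
    exps = [math.exp(s - peak) for s in scaled]
    total = sum(exps)
    return [e / total for e in exps]

scores = [2.0, 1.0, 0.5]  # hypothetical scores for three candidate words
print(sampling_probabilities(scores, 0.2))  # low temperature: the top word dominates
print(sampling_probabilities(scores, 1.5))  # high temperature: the choices even out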

Choose a Chain

The Chain is how the system tracks and responds to previous queries in a chat conversation. 

  • Retrieval QA is essentially one query, one response. The chat tool does not incorporate or reference the chat history in its response. Best option for a simple information-retrieval chat tool.
  • Conversational QA allows the chat tool to reference and incorporate the chat history into its response. Best option for more advanced chat tools, allowing you to ask follow-up questions that reference previous interactions from within a single chat conversation (the sketch after this list contrasts the two options).
    • Note that with Conversational QA, the system can only reference interactions from within a single chat conversation, not from previous conversations saved in the left side-bar of the chat tool.
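
Maizey manages the chain internally, but the following hypothetical Python sketch (not Maizey’s actual implementation; the function names are illustrative) contrasts how the two options treat chat history when building the prompt that is sent to the model:

def retrieval_qa_prompt(context, question):
    # Retrieval QA: each prompt is built from the current question alone;
    # earlier questions and answers are ignored.
    return f"{context}\nQuestion: {question}\nHelpful Answer:"

def conversational_qa_prompt(context, question, chat_history):
    # Conversational QA: prior turns from the same conversation are included,
    # so follow-up questions such as "Can you expand on that?" make sense.
    history = "\n".join(f"Human: {q}\nAI: {a}" for q, a in chat_history)
    return f"{history}\n{context}\nQuestion: {question}\nHelpful Answer:"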

Choose a Chain Type

Stuff: Stuffing is a straightforward way to pass all relevant data to the model when generating a response, as illustrated below. It is the most basic and fastest Chain Type. Currently, Stuff is the only Chain Type option, but future updates will include additional options.
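
As a rough, hypothetical sketch (not Maizey’s actual implementation), “stuffing” concatenates every retrieved document into a single context block and sends it to the model in one request:

def stuff_documents(documents, question):
    # Combine all retrieved documents into one context block, then
    # build a single prompt in the format used by Retrieval QA.
    context = "\n\n".join(documents)
    return f"{context}\nQuestion: {question}\nHelpful Answer:"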

Return data sources in responses?

Choose whether the AI model includes, in its reply, the names of the source documents or source URLs it used to respond to the query.

Additional Information

Need additional information or assistance? Contact the ITS Service Center.

