
Generative AI services at UW–‍Madison

The University of Wisconsin–‍Madison is committed to responsibly harnessing the power of generative artificial intelligence to enhance teaching, learning, research and university operations. The Division of Information Technology (DoIT) oversees university-wide AI initiatives focused on providing secure enterprise AI tools.

Explore this page to learn about generative AI tools, resources, policies, events and support that are available to the UW–‍Madison community. Whether you’re an AI novice or an expert, we invite you to explore how AI can accelerate your work and studies.

Enterprise AI tools

UW–‍Madison has vetted and secured contracts for the generative AI services below. These tools are available university-wide for free and provide higher data security and privacy protection than public services. Please consider these options before exploring unvetted generative AI services for university work.


Microsoft Copilot

Microsoft Copilot is an AI-powered digital assistant that can help you with tasks like answering questions, creative writing, drafting sample code and creating images. When you log in with your NetID, Copilot offers commercial data protection, meaning it won’t use your prompt data to train its large language models.

OK to use with: Public and internal data only

Explore Copilot

Webex AI Assistant

Webex AI Assistant is an AI-powered meeting tool that offers real-time translation, transcription, voice commands and post-meeting summaries, allowing you to stay on top of the conversation and make virtual meetings more efficient and inclusive.

OK to use with: Public, internal, sensitive and restricted data (know your unit’s guidance before using)

Explore Webex AI Assistant

Zoom AI Companion

Zoom AI Companion can automatically create meeting summaries, identify action items and quickly answer questions about what happened during meetings. UW–‍Madison offers two tiers of Zoom accounts: a standard tier for most students and staff, and a Secure Zoom tier with elevated security for employees working with HIPAA-protected health data.

OK to use with: Public, internal and sensitive data (know your unit’s guidance before using); Secure Zoom is also OK to use with restricted data, including HIPAA health data

Explore Zoom AI Companion

Amazon Web Services (AWS) Bedrock

Amazon Bedrock is a cloud service that makes it easy to build AI applications using foundation models from leading AI companies. It provides a single interface for accessing these models, along with tools to customize them with public or internal data and to create AI assistants that can interact with university systems and information.

OK to use with: Public and internal data; sensitive or restricted data requires evaluation

Explore AWS
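
For a sense of what working with Bedrock looks like, here is a minimal sketch that sends a prompt to a Bedrock-hosted model using the AWS SDK for Python (boto3). It assumes you already have UW-provisioned AWS credentials and access to at least one foundation model; the region and model ID shown are illustrative placeholders, so substitute whatever your account is approved for, and keep inputs to public or internal data unless your use has been evaluated.

```python
# Minimal sketch: one inference request to a Bedrock-hosted model via boto3.
# Assumes AWS credentials are already configured (e.g., via your provisioned account)
# and that the model below has been enabled for you -- both are assumptions, not
# UW-specific guidance.
import boto3

# The Bedrock runtime client handles inference requests to hosted foundation models.
client = boto3.client("bedrock-runtime", region_name="us-east-1")  # example region

response = client.converse(
    modelId="anthropic.claude-3-haiku-20240307-v1:0",  # example model ID
    messages=[
        {"role": "user", "content": [{"text": "Summarize the key points of this meeting agenda."}]}
    ],
    inferenceConfig={"maxTokens": 512, "temperature": 0.5},
)

# The Converse API returns the assistant reply as a list of content blocks.
print(response["output"]["message"]["content"][0]["text"])
```

The same client works across the different model families Bedrock exposes, which is the "single way to access and use these models" the description above refers to; only the model ID changes.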

Microsoft Azure AI Services

UW–‍Madison’s Azure service offers cloud computing resources, including access to OpenAI’s API, for processing and storing public or internal data. It’s available to faculty and staff, but uses involving sensitive or restricted data require a Cybersecurity evaluation.

OK to use with: Public and internal data; sensitive or restricted data requires evaluation

Explore Azure
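
As a rough illustration of using OpenAI’s API through an Azure deployment, the sketch below uses the openai Python package (v1+). The endpoint, API version, deployment name and environment variable are placeholder assumptions for this example rather than UW-specific values; use the details provided when your access is provisioned, and keep inputs to public or internal data unless your use has been evaluated.

```python
# Minimal sketch: a chat completion against an Azure OpenAI deployment.
# Endpoint, API version, key variable and deployment name are placeholders.
import os
from openai import AzureOpenAI

client = AzureOpenAI(
    azure_endpoint="https://<your-resource>.openai.azure.com",  # placeholder endpoint
    api_key=os.environ["AZURE_OPENAI_API_KEY"],                 # placeholder variable name
    api_version="2024-06-01",                                   # example API version
)

response = client.chat.completions.create(
    model="gpt-4o-mini",  # the name of *your* deployment, not necessarily the model name
    messages=[
        {"role": "system", "content": "You are a helpful assistant."},
        {"role": "user", "content": "Draft a short announcement about office hours."},
    ],
)

print(response.choices[0].message.content)
```

Because Azure OpenAI bills pay-as-you-go by usage, requests like this run against your unit’s Azure subscription rather than a personal account.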

AI and institutional data

Never enter sensitive or restricted information like student records, health data or unpublished research into unvetted AI services. If you have questions about data classification, contact the appropriate data steward.

Contact university data stewards

Policies & ethics


University policies

University policies safeguard institutional data, which everyone at UW is legally and ethically obligated to protect. Remember, misuse of AI could violate university rules and legal obligations.

UW AI policies

Ethical considerations

The ethics of using generative AI tools in higher education are evolving. Employees and students should reflect on how AI impacts academic integrity, attribution, privacy and equity.

Keep in mind:

  • AI tools can be biased and make mistakes
  • You are responsible and accountable for what you produce when using these tools
  • AI tools should enhance, not replace, your own analysis and reasoning

Generative AI ethics


Cybersecurity & data protection

When misused, generative AI tools can expose institutional data that the university must protect. For guidance about protecting university systems and data, read the statement from Jeffrey Savoy, UW–‍Madison chief information security officer. If you have questions about classifying data, contact the relevant data steward.

CISO statement on generative AI

Departmental restrictions

UW employees are responsible for knowing and following their department or unit’s rules for using AI tools, in addition to following university policies. Make sure you understand the rules before using AI for work. If you’re unsure, ask your supervisor or department head for clarification.

Events & community

Upcoming AI events

Recent AI updates

Frequently asked questions


What AI tools are provided by UW–‍Madison?

UW–‍Madison provides access to Amazon Web Services Bedrock, Microsoft Azure AI Studio, Microsoft Copilot, Webex AI Assistant and Zoom AI Companion. Azure AI Studio and Amazon Web Services offer pay-as-you-go access to tools like OpenAI’s API and Anthropic’s Claude, while Copilot is an AI-powered digital assistant for various tasks. Webex AI Assistant and Zoom AI Companion provide AI-enhanced features for virtual meetings.

How can I access and use AI tools for my research or coursework?

You can use Microsoft Copilot by logging in with your NetID at copilot.microsoft.com. You can find more detailed instructions and documentation for each tool in the university’s KnowledgeBase: Getting started with Zoom AI Companion and Getting started with Cisco AI Assistant in Meetings. You can also request access to Microsoft Azure AI Studio and AWS Bedrock.

Are there any restrictions on using AI tools like ChatGPT for teaching, learning and research?

Yes, there are restrictions on using AI tools at UW–‍Madison, particularly regarding data protection. You should only enter public information into these tools unless the specific AI service has been officially reviewed and approved by the university. Never input any sensitive, restricted or protected data into generative AI tools to prevent unauthorized sharing of confidential university information.

What policies govern the use of AI at UW–‍Madison?

UW–‍Madison has specific policies governing the use of AI, focusing on responsible use and data protection. These policies emphasize the need to protect university data, use only public information in AI tools unless officially approved, and consider ethical implications. Read our Generative AI @ UW–‍Madison: use & policies page for more.

What should I do if I’m concerned about my data privacy when using AI tools?

If you’re concerned about data privacy when using AI tools, only use university-approved services and follow the guidelines provided by the university. Do not input any sensitive, restricted or protected data into AI tools. For specific guidance, refer to the statement from Jeffrey Savoy, UW–‍Madison’s chief information security officer.

What kind of data can I use with these services?

Below is a summary of the data types the university has approved for enterprise-licensed AI tools. If a tool is not in this list, then it is approved for public data only.

Your department may have additional restrictions on when, how and what kinds of data you can use in enterprise generative AI tools. You are responsible for knowing and following your department’s rules and guidelines.

Enterprise service   | Public | Internal | Sensitive           | Restricted
AWS Bedrock          | OK     | OK       | Requires evaluation | Requires evaluation
Microsoft Azure      | OK     | OK       | Requires evaluation | Requires evaluation
Microsoft Copilot    | OK     | OK       | Not approved        | Not approved
Webex AI Assistant*  | OK     | OK       | OK                  | OK
Zoom AI Companion*   | OK     | OK       | OK                  | Only Secure Zoom accounts

* Your unit may have specific guidance about when and how you may use AI tools in Zoom and Webex.

How do I request a risk assessment to use sensitive or restricted data in AWS Bedrock or Microsoft Azure?

Before using sensitive or restricted data in Bedrock or Azure, you must complete the Cybersecurity and Public Cloud risk assessments and receive permission from your unit’s risk executive (typically your dean or chief administrator).

Request Cybersecurity review

Request Public Cloud review

Are there any AI-focused organizations or groups at UW–‍Madison?

Yes, there are several AI-focused groups at UW–‍Madison. These include AI@UW (a student organization), the AI interest subgroup within the Instructional Technologists Group, and the Machine Learning @ UW–‍Madison community of practice. These groups host events and projects for members interested in AI from various backgrounds and experience levels.

What support is available if I encounter issues with university-supported AI tools?

Check the KnowledgeBase articles for help with Azure, Amazon Web Services, Webex and Zoom. If you’re having trouble with university-supported services, contact the DoIT Help Desk.

Contact & support

Contact the Help Desk