Artificial intelligence tools can help you work smarter — drafting emails, summarizing documents, brainstorming ideas. But when you’re working with university data, choosing the right AI tool matters.
Here’s what you need to know to use AI responsibly at UW–Madison.
Enterprise AI tools: better for you and the university
UW–Madison provides free, powerful AI tools for students and employees, including Microsoft Copilot and Google Gemini.
These aren’t just free alternatives to consumer AI services. They come with university-approved contracts that include crucial safeguards like enhanced data security and privacy protections. That means they won’t keep, use or share what you enter into them.
Consumer versions of the same tools will use what you give them to improve their systems. Entering university data into unvetted AI services puts UW–Madison’s students, employees and intellectual property at risk.
Explore AI tools, policies & guidelines
How to protect university data and your privacy
- Make sure you’re logged in with your NetID when using AI tools like Gemini and Copilot. The tools will display a confirmation that enhanced data protection is active.
- Never enter sensitive or restricted information into AI tools—not even the approved tools listed here—unless you’ve received explicit permission for research purposes. Contact your university data steward if you’re not sure what’s allowed.
- Use approved AI meeting tools (Zoom and Webex) in virtual meetings instead of third-party bots like otter.ai, read.ai and fireflies.ai. (Related: How to protect meetings from AI assistant bots)
- Follow university policies and your department’s rules for AI use. If you’re unsure, ask your supervisor or department head for clarification.
You may use AI tools for:
- Drafting a public-facing article or webpage
- Summarizing meeting notes
- Brainstorming project ideas
- Editing your own writing
- Improving accessibility of digital content
You may not use AI tools with:
- Student records or assignments with names or grades
- Health information protected by HIPAA
- Unpublished research or intellectual property
- Employee performance information
- Confidential proposals or reviews
Remember: even data that seems anonymous can be risky. Federal privacy law requires removing all direct and indirect identifiers, and technology makes individuals increasingly identifiable over time.
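Why can seemingly anonymous data be risky? A toy example helps: combine a few indirect identifiers (such as a zip code and a birth year) and a nameless record can often be narrowed to a single person. The sketch below uses entirely hypothetical data.

```python
# Toy illustration: "anonymized" records with no names can still be
# re-identified by combining indirect identifiers (quasi-identifiers).
# All data here is hypothetical.

records = [
    {"zip": "53703", "birth_year": 1999, "major": "Physics", "grade": "A"},
    {"zip": "53703", "birth_year": 2001, "major": "History", "grade": "B"},
    {"zip": "53715", "birth_year": 1999, "major": "Physics", "grade": "C"},
]

def matches(record, **quasi_identifiers):
    """Return True if a record matches every known quasi-identifier."""
    return all(record.get(k) == v for k, v in quasi_identifiers.items())

# An outside observer who knows only someone's zip code and birth year
# (say, from a public profile) can single out one record and learn the
# sensitive attribute attached to it.
hits = [r for r in records if matches(r, zip="53703", birth_year=1999)]
print(len(hits))         # 1 -> the record is unique, hence identifiable
print(hits[0]["grade"])  # sensitive attribute revealed: "A"
```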
When in doubt, don’t enter it. Contact the appropriate data steward at data.wisc.edu for guidance.
Understand AI’s limitations
AI tools can produce errors, biases and completely fabricated information presented as fact. They often accept information as true without evaluation and can sometimes agree with you at the expense of accuracy.
University employees have reported finding AI summaries of their own work that contained significant errors. This happens because most popular generative AI tools generate text by predicting what word should come next based on patterns in training data — not by understanding meaning or verifying facts.
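For a rough intuition, here is a minimal sketch of pattern-based next-word prediction (a toy bigram model, far simpler than real generative AI, but with the same core behavior): it chooses words by frequency in its training text, and no step anywhere checks whether the result is true.

```python
from collections import Counter, defaultdict

# Minimal sketch of pattern-based next-word prediction (a bigram model).
# Real generative AI uses vastly larger neural networks, but the core idea
# holds: pick a likely next word, with no step that verifies facts.

training_text = (
    "the report was accurate the report was finished "
    "the summary was wrong the summary was short"
)

# Count which word follows which in the training data.
follows = defaultdict(Counter)
words = training_text.split()
for current_word, next_word in zip(words, words[1:]):
    follows[current_word][next_word] += 1

def predict_next(word):
    """Return the most frequent follower of `word` in the training data."""
    candidates = follows.get(word)
    return candidates.most_common(1)[0][0] if candidates else None

# Generate text by repeatedly predicting the next word.
word, output = "the", ["the"]
for _ in range(4):
    word = predict_next(word)
    if word is None:
        break
    output.append(word)
print(" ".join(output))  # "the report was accurate the" -- fluent,
                         # but chosen by frequency, not by truth
```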
Always verify AI output before using or sharing it. Check facts, review for accuracy and use your judgment. AI should enhance your work, not replace your analysis and critical thinking.
Best practices for responsible AI use at work
- Log in with your NetID: Access enterprise versions to protect institutional data.
- Know your data classification: Stick with public or internal data only. Your department may have additional restrictions on when, how and what kinds of data you can use with AI tools.
- Verify everything: Never trust AI-generated content without review.
- Consider context: AI summaries can miss important nuances or introduce errors.
- Be transparent: Consider how and when to acknowledge AI use in work products, following your department’s guidelines.
- Stay informed: AI policies and tools continue to evolve. Check it.wisc.edu for updates.
Get help
Having trouble logging into AI tools? Contact the DoIT Help Desk.
Check out Learn@UW's free virtual Demystifying AI workshop on November 12 at noon.