Generative artificial intelligence (GAI) has arrived and has affected virtually every field, including higher education. ChatGPT, the chatbot built on a large language model (LLM), reached 100 million users just over two months after its release, a growth rate without precedent for any technology or application.
At UWM, instructors, students and researchers are already using various forms of GAI in their work. To best leverage AI and guide its campus-level implementation, UWM formed an AI task force in spring 2024. The group comprises 50 campus stakeholders charged with identifying existing and potential “use cases,” developing standards to ensure the responsible use of AI, identifying the tools needed to implement AI solutions, defining pilot projects and recommending GAI tools that UWM should make available to students, faculty and staff.
Town hall on Friday
The campus community has a chance to hear more about this work during a Town Hall on Generative AI on Friday, Oct. 11, from 10:30 a.m. to noon in NWQ Building D, Rooms 3835/45. You can also join virtually with this Teams link. To submit questions in advance, email ai-uwm@uwm.edu.
The task force, led by Scott Genung, associate vice chancellor and chief information officer, and Purush Papatla, Northwestern Mutual Data Science Institute professor of marketing and co-director of NMDSI, is organized into six work groups:
- Education & Curriculum: This group built on work in the Center for Educational Technology and Learning to identify existing use cases in instruction, as well as opportunities and future needs. Lead: Sarah Riforgiate.
- Student Experience: This group identified ways that UWM could use AI to improve the student experience, from streamlined access to information to improved connections with advisors and resources. Lead: Dave Clark.
- Research: This group examined existing research in AI and tools needed for UWM to continue discovery with AI. Lead: Kris O’Connor.
- Business Processes: This group identified opportunities to streamline our internal processes to improve efficiency and leverage UWM talent. Lead: Amanda Obermeyer.
- Infrastructure and Data: This group built on the various use cases identified to determine which of the quickly evolving tools could be implemented while ensuring that data is protected. Lead: Beth Schaefer.
- Responsible AI: This group explored ethical considerations and best practices at other institutions to propose governance frameworks, policies and practices at UWM. Lead: Thomas Malaby.
Proposed tools, policies and practices
The task force identified nearly 60 existing and proposed applications for GAI by surveying members of the campus community who already use it. These ranged from navigation bots that help students find resources, such as financial aid information, to research tools that identify funding opportunities.
Task force members also talked with and researched other institutions, such as UW-Madison, the University of Michigan and Washington University in St. Louis, about their deployments of AI. The task force leadership has also held ongoing meetings with key vendors, such as Microsoft, OpenAI, Apple, Adobe, Dell and Google, to understand the platforms’ features and costs.
The task force will recommend a pilot set of tools to be made available to campus. Selection will depend on several factors: data security; the types of data involved (publicly available vs. privileged); and a platform’s ability to ensure that its use of data is consistent with UWM’s requirements and guidelines. For example, some LLMs integrate data from users’ queries back into the model to train it further. While this is acceptable for publicly available data, it is not acceptable for much of UWM’s data, which is privileged and sensitive information that shouldn’t be used to train models.
The appropriate GAI tools will depend on the level of security required for the data:
- Low risk – data that is publicly available, such as information found on the UWM website. This data can form the basis for AI chatbots.
- Moderate risk – confidential or proprietary UWM data. Tools that handle this data should typically offer enterprise-level security.
- High risk – information that UWM is obligated to protect, such as data covered by FERPA (Family Educational Rights and Privacy Act) and HIPAA (Health Insurance Portability and Accountability Act).
Tool tiers
AI tools from various vendors are evolving quickly, and the task force is working to identify which tools it can offer and support. UWM must balance the availability and cost of service against the benefits to campus. The proposed offerings will include three service tiers:
- Equitable – baseline tools made broadly available to UWM users. These tools are expected to be part of the Teams/Windows license offerings for any user with UWM credentials.
- Targeted – expanded tools and services deployed for targeted applications, including business operations.
- Special – customized services and tools for research and other applications. These tools might be supported through grants or offered on a limited basis using campus or department funds.
Tools should not only ensure the security of moderate- and high-risk data but also be consistent with UWM’s mission and values. This was the focus of the Responsible AI work group, which has completed its initial review of ethical considerations in the use of GAI at UWM. The task force leadership is reviewing that group’s recommendations for policy, governance and education. UWM also has an IT and data work group that is reviewing the use of GAI and working on adding areas of responsibility to the Information Technology Advisory Council, including subcommittees on data oversight and data governance.
Recommendations
This fall, the task force leaders will present recommendations on pilot tools to support various use cases. This toolset would be phased in over time and may include:
- Microsoft Copilot for enterprise, which would be broadly available to UWM users.
- Ida, which would support enterprise-level chatbots.
- ChatGPT Edu, which offers additional functionality around generative AI.
- Maizy, which provides a framework for instructors to create their own course-specific chatbots.
- Azure AI/OpenAI, which would provide computational resources to support many of these services.