Generative artificial intelligence (AI) tools, like ChatGPT, have the potential to transform our world.
They can draft a story or essay, explain quantum physics in simple terms, create a new piece of music, code a new computer program, and even create an image, drawing upon all of the data from the internet to form their response to a given prompt.
So naturally, when these tools became available last fall, they prompted instructors across the world to question the originality of their students’ work. But rather than fret about this novel technology or require students to complete all their assignments and tests in person, University of Miami leaders hope that faculty members will embrace generative AI tools in their classes to fully prepare students for the future.
“This is an important change in our technology ecosystem, and it’s crucial for us to understand and talk about these new tools with our students—to learn about their affordances and constraints—instead of pretending they’re not there,” said Kathi Kern, the University’s vice provost for educational innovation, who also oversees the Platform for Excellence in Teaching and Learning (PETAL). “We’re in a sandbox moment, so let’s dip our toes in and be part of that conversation, because we don’t know yet all of the ways that this might reshape our work lives, cultures, and our educational practices.”
Faculty members across the University already are experimenting with ways to incorporate generative AI into their classes, said Allan Gyorke, associate vice president for information technology and assistant provost for educational innovation. Computer science instructors in the College of Arts and Sciences have asked students to create code using generative AI, and then during class, they discuss whether the code is accurate, optimized, and how it could be improved. Marketing courses at the Miami Herbert Business School are planning to test the strength of AI tools to generate campaign or product pitches, and Frost School of Music faculty members are using the tools to deconstruct, analyze, and compose music.
However, Gyorke and Kern caution that these tools are not foolproof. At times they offer inaccurate results and cannot distinguish between fact and fiction. Free tools like ChatGPT also collect personal data from everyone using them, which is why the University currently encourages the use of software like Adobe Firefly and Bing Chat Enterprise, since they do not collect a user’s personal information.
“With a tool like ChatGPT, you don’t have control over how your data is used,” Gyorke said, adding that Bing Chat Enterprise uses similar technology to ChatGPT but instead keeps users’ data private. “We wanted to provide an AI chat tool that could be used in a more secure way.”
Still, there are plenty of ways to safely utilize the new tools. And both Gyorke and Kern look forward to hearing how other faculty members are exploring generative AI. When two of Kern’s students missed an in-class quiz in a course she was teaching in Rome last spring, she asked them to use AI tools to respond to a written prompt she had given the rest of the class. Then, she challenged the students to critique the response as if they were teaching assistants and to offer suggestions about how to expand on it.
Kern also envisions students using image-generating tools—like Adobe Firefly—to create illustrations of abstract topics, like photosynthesis or the theme of a novel.
“Teaching students how to create prompts for generative AI tools is something we can do right now that’s exciting,” Kern said. “But the availability of this technology puts the onus on us as instructors to make our assessments of student learning more authentic and less task-oriented. This means it’s more important than ever to teach critical thinking skills, content knowledge, and powers of analysis, so that students can evaluate the output of generative AI for themselves.”
Because the technology is still evolving, administrators also want to offer guidelines for students, faculty, and staff members about the use of generative AI on campus. Chiefly, they caution anyone in the University community against putting personal student or patient information into these tools. They also ask faculty members to avoid using generative AI detection tools on assessments like papers and tests, because these programs are often prone to error and unfairly target students whose home language is not English, Kern pointed out.
The rapid availability of new generative AI tools has sparked new collaborations among University faculty members. The following are some generative AI resources for students, faculty, and staff members, as well as opportunities to join the campus conversation about these new tools.
- Navigating AI, from the Division of Continuing and International Education, includes general information about AI tools as well as recommendations about the safest ways to use them for teaching and learning.
In addition, University faculty members can join PETAL’s monthly working group called the AI Teaching Exchange or attend some related events to learn more about utilizing these tools for teaching, learning, and research.
- Friday, Nov. 3: The University’s Faculty Showcase at the Lakeside Expo Center will feature keynote speaker Betsy Barre at 9:45 a.m. speaking about “AI and the Future of Learning: Key Questions for the Academy.” Barre is the executive director of the Center for Teaching and Learning at Wake Forest University. Later in the afternoon, a panel of faculty members also will focus on “Teaching in the Age of AI.” Registration for the Faculty Showcase is open.
In addition, last spring, PETAL brought Bill Hart-Davidson—professor and senior researcher in the Writing in Digital Environments Research Center and associate dean for research and graduate education at Michigan State University—to speak to the faculty about the use of AI. A recording of his lecture, “When Robots Learn to Write, What Happens to Learning?,” is available online.