Artificial intelligence, particularly Generative AI, is increasingly common on university campuses. At USask, a working group is exploring practical applications, emphasizing responsible use and transparency in communication. AI tools like CoPilot promise to streamline workflows but also raise questions about authorship and ethical practices.
The use of artificial intelligence tools, particularly Generative AI (Gen AI), on university campuses is becoming more prevalent. In a recent discussion, Kyla Martin, the digital strategy director at the University of Saskatchewan (USask), compared the current state of Gen AI to the early days of the internet. She likens the moment to the transformative era of the early ’90s, a useful perspective on how early-stage and fast-moving the technology still is.
In late 2024, USask formed a working group of communications and digital strategy experts tasked with exploring Gen AI’s applications in their field. Their goal? To understand industry standards and identify practical uses. The findings were quite varied, ranging from basic tools like CoPilot for crafting emails to more subtle everyday applications like spellcheck or design assistance via Canva.
The blurred line between human creativity and AI-assisted work poses a challenge. Martin articulates this well: “AI is a tool that we can use in our work, just like Microsoft Word can help me organize my thoughts, or even spell check, but at the end, still my thoughts, still my work.”
This raises a significant point about disclosure. In some instances, it’s essential to be transparent about AI’s involvement, especially in important communications. For example, while it might not be necessary to reveal that Gen AI helped draft a casual email, it is critical to disclose its use in messages representing senior leadership. Martin advises caution here, emphasizing the importance of asking whether AI is serving as a collaborative tool or simply doing the work.
The working group aims to facilitate ongoing discussions around Gen AI literacy, helping staff learn how and when to use these tools effectively and responsibly. The pace of Gen AI advancement is another hurdle, and adapting to that rapid change is vital. As Martin points out, its swift development echoes the early rise of the internet, and users feel both enthusiasm and concern as they adjust to it.
Lastly, USask community members are encouraged to use the CoPilot tool available in PAWS, which provides a secure way to manage data while working with Gen AI resources. By working together, the academic community can inspire and support students, ensuring they have what they need to succeed now and in the future.
In summary, the growing presence of Gen AI tools in academic settings like USask marks a significant shift in how communications and digital strategy professionals work. By navigating the ethical considerations carefully and building literacy in these emerging technologies, staff can enhance their productivity while fostering transparency. The promise of Gen AI remains bright, especially as the academic community adapts and grows with it.
Original Source: news.usask.ca