
1. Don’t cheat your future self
Generative AI tools are here to stay and will likely be a part of the way you learn and work at university and beyond. You should see these tools as ways to supplement, deepen and enhance your learning, not as shortcuts to get things done. Learning at university, developing real expertise in a discipline, is challenging and involves time and struggle. While AI tools can support that process, they should never replace the effort it requires. Relying on them as a substitute not only risks potential academic consequences but also limits your opportunity to build the intellectual habits that are essential to your growth as a student. UBC takes a balanced approach: recognizing the potential of these tools to enhance learning, while emphasizing that they must not come at the expense of the human effort and social engagement at the heart of intellectual development.
Resource link:
- This quick start guide to learning with generative AI will help you be better prepared to use GenAI thoughtfully and responsibly during your time at UBC.
2. Build your generative AI literacy
These tools are rapidly evolving and becoming part of the digital platforms you already use. Gaining a basic understanding of how they work helps you use them more thoughtfully and effectively. You don’t need to be an expert, but a solid foundation is essential. As the tools evolve, so will the skills required to use them well. Building critical AI literacy means understanding how these tools function, using them appropriately and ethically across different tasks, and recognizing how they connect to your field of study.
Resource link:
- This summary gives you some of the technical basics, with a minimum of jargon. You can also test-drive a student readiness assessment of gen AI skills that we built in partnership with the University of Sydney.
3. Explore different tools and models, and different ways to use them in learning
It’s tempting to go back to the tool you know best and have used for some time, but many different tools have been built to excel at specific tasks. Not every use will require the most recent, most advanced model: be curious and open to different tools for different purposes. Below are examples recommended by members of UBC’s Student AI Council. These tools can do so much more than just generate text, images, or videos. Try experimenting with different ways to use them, such as solving practice problems, getting explanations from an AI tutor, or asking for feedback to deepen your understanding of academic content. You can also use them to explore new topics through research or to support your academic workflow with tools for time management, language translation, and more.
Resource links:
- ChatGPT / Claude / Gemini: Versatile AI tools that are good at many things, like a ‘Swiss Army Knife’, best used for summarizing readings, generating study questions, brainstorming ideas, explaining concepts, and simulating tutoring. Helpful across subjects for writing, planning, and problem-solving.
- GitHub Copilot: An AI coding assistant that helps you write, debug, and understand code directly in your text editor.
- Connected Papers: A visual tool that helps you explore the research landscape around a topic or article.
- Elicit: An AI research assistant that finds and summarizes academic papers based on your research question.
4. Avoid prompt laziness
Consider these tools as smart (but fallible) assistants, rather than a glorified search engine. The usefulness of their output depends on the clarity and depth of your instructions. A short prompt might produce something passable, but more detailed guidance, along with your feedback and follow-up questions, will lead to better results. Think of it as a collaborative process, where you're shaping and refining a response over time, not just grabbing a quick answer on the go. Prompting is something we each approach in our own way, but there are also well-established techniques (prompt engineering) for prompting these tools effectively.
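The difference between a lazy prompt and a structured one can be sketched in a few lines. This is a minimal illustration, assuming a hypothetical `build_prompt` helper; the field names (role, task, context, constraints) follow common prompt-engineering advice rather than any particular tool's API:

```python
def build_prompt(role, task, context, constraints):
    """Assemble a structured prompt instead of a one-line request.

    The four fields mirror common prompt-engineering advice:
    give the model a role, a concrete task, relevant context,
    and explicit constraints (format, length, sources).
    """
    return "\n".join([
        f"Role: {role}",
        f"Task: {task}",
        f"Context: {context}",
        f"Constraints: {constraints}",
    ])

# A lazy prompt vs. a structured one for the same study task:
lazy = "explain photosynthesis"
structured = build_prompt(
    role="You are a patient biology tutor for a first-year student.",
    task="Explain the light-dependent reactions of photosynthesis.",
    context="I understand basic chemistry but not electron transport chains.",
    constraints="Step-by-step explanation, under 300 words, with sources cited.",
)
print(structured)
```

Whichever tool you use, the structured version gives the model a role, a concrete task, your background, and explicit constraints, and it leaves a record you can refine in follow-up turns.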
Resource link:
- “Anthropic’s Guide to Prompt Engineering” (Business Insider, 2025): This guide emphasizes that AI performs best when given clear roles, structured tasks, and specific context. Techniques like role prompting, step-by-step reasoning, and asking for sources can reduce hallucinations and improve reliability.
5. Be curious and cautious with all outputs
Whatever tools you use, for whatever purpose, remember that they are far from perfect, even though they produce grammatically correct, convincing-sounding sentences or other outputs. The underlying models are trained on data full of real-world human biases and dominant perspectives. They can fabricate information (“hallucinate”), reproduce those biases, overlook alternative perspectives, and be too quick to agree with your suggestions. If you use these tools, you are responsible for fact-checking and validating the accuracy of any output, as well as for considering what perspectives are not represented in the response.
Resource links:
- The UBC Library’s research guide on generative AI and ChatGPT lists several methods for fact-checking generative AI outputs.
- How to Reduce Bias in AI with Prompt Engineering (Latitude Blog, 2025) offers practical, actionable, student-focused strategies for writing prompts that reduce bias, easy to apply across academic tasks.
6. Understand the privacy and data implications, for you and for others
Generative AI tools may process data in jurisdictions outside of Canada, meaning your interactions may be subject to other laws and data practices. This raises important questions around who owns, controls, or uses the data you share, especially when it involves sensitive or culturally significant content. Treat anything you share with generative AI tools as if you were posting in a semi-public forum, not a private diary. Familiarize yourself with the application settings to see how information you submit into any tools you use is shared back with the developers. Avoid sharing personal or private information. If use of a particular tool is required as part of a course at UBC, it will have undergone a Privacy Impact Assessment by the university before it can be used; you should still familiarize yourself with the safe use of these tools. Remember that course materials (lecture notes, problem sets, videos etc.) provided to you remain the intellectual property of the instructor and should not be uploaded into an AI tool without the instructor’s explicit permission.
Resource links:
- UBC has provided guidance on how to mitigate the risks of using GenAI, including data security and privacy issues, copyright issues, and the negative social and environmental impacts of GenAI use.
- UBC’s Privacy Impact Assessments for Generative AI Tools: This resource lists which tools have been reviewed and approved for required course use, along with guidance on safe use, data handling, and recommended syllabus language. Tools like GPT-4o mini and Microsoft Copilot are approved with conditions.
7. Consider how AI intersects with Indigenous rights
AI poses challenges to Indigenous rights of data sovereignty and can contribute to the perpetuation of digital colonization. AI tools are not epistemically neutral; the way people build and use AI is influenced by the value systems and world views they have, including views on knowledge that can conflict with Indigenous Knowledge systems. AI also poses threats as it can be used to commodify Indigenous Knowledge and resources without consent or benefit to Indigenous peoples. There is an ongoing need to consider how our use of AI intersects with Indigenous rights and UBC’s commitments to Truth and Reconciliation.
Resource links:
- UBC’s GenAI Guidelines for Teaching & Learning emphasize that AI use must respect Indigenous data sovereignty and community protocols. Users must not input Indigenous works or knowledge without express permission and should avoid relying on AI-generated representations of Indigenous content.
- UBC’s Indigenous Strategic Plan emphasizes the imperative of free, prior and informed consent and protocols on the ownership, control, access and possession of Indigenous data.
8. Recognize the significant environmental impact of AI use
The impacts of climate change are becoming increasingly severe and are disproportionately affecting vulnerable populations in Canada and around the world. Significant energy and resources are required to support the rapid expansion of AI tools, in particular the construction of large datacenters and the power and cooling resources needed to operate them. While the individual ‘cost’ of a single query is small, users should be aware of it as a component of their overall digital footprint, which also includes other resource-intensive activities (e.g. streaming video), and make deliberate choices about how they use these tools. As a university community we have a responsibility to consider our institution’s digital environmental footprint. As we collectively learn about this new technological and educational landscape, we need to understand and critically evaluate the risks and benefits of these technologies.
Resource link:
- As we provide access to support wider adoption of these tools, here is our developing thinking on their environmental impact.
9. Understand that faculty have different perspectives on and approaches to using AI
Across the different courses you take, instructors have the authority to decide if, how and where these tools can be used for assessments (e.g. assignments and exams). Some of your courses may allow use of AI tools in some assessments but not others; some may permit them freely, and others not at all. This may also vary according to discipline and the types of knowledge and information you are engaging with in a course. The course syllabus will detail what is and is not appropriate in a given course, and you should discuss this with your instructor if it is not clear to you.
The same goes for research. Students engaging in research should familiarize themselves with their supervisor’s views on AI and with the expectations for how these tools can be used responsibly (if at all) in their particular field and projects. Knowing, and following, your course instructor’s policies and/or research supervisor’s expectations on AI is an important part of academic and scholarly integrity.
Resource links:
- There are extensive Academic Integrity resources for students, including self-paced courses on the Academic Integrity website.
- The Graduate School at UBC Vancouver has developed resources and guidance to support faculty and grad students using these tools in their work.
10. Be transparent about what you do
If you are permitted to use these tools for assessments, document how you have interacted with the tools and what you have done with the outputs from them. Some tools allow you to save and share links to conversations; in other cases, it is worth keeping a ‘gen AI lab book’ that details what was done and how it was used in preparation for an assessment. You should transparently acknowledge the use of these tools in cases where they are permitted (such as we have done in the footer of this document). This transparency is an important component of acting with integrity as a scholar.
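One hypothetical way to keep such a ‘gen AI lab book’ is a simple script that appends dated entries to a plain-text file. The function name and fields below are illustrative, not a UBC requirement; record whatever your course or supervisor expects:

```python
import os
import tempfile
from datetime import date

def log_ai_use(path, tool, prompt_summary, how_output_was_used):
    """Append one dated entry to a plain-text 'gen AI lab book'."""
    # Fields are illustrative: note the tool, the date, what you
    # asked, and what you did with the output.
    entry = (
        f"## {date.today().isoformat()} ({tool})\n"
        f"- Prompt: {prompt_summary}\n"
        f"- Output used for: {how_output_was_used}\n\n"
    )
    with open(path, "a", encoding="utf-8") as f:
        f.write(entry)
    return entry

# Demo: write one entry to a throwaway file.
labbook = os.path.join(tempfile.mkdtemp(), "genai_labbook.md")
entry = log_ai_use(
    labbook,
    "Claude",
    "asked for feedback on my essay outline",
    "reordered two sections; the wording remains my own",
)
```

Even a log this simple gives you a dated record to point to if you are asked how AI contributed to your work.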
Resource links:
- The Artificial Intelligence Disclosure (AID) Framework gives good examples of how to describe your use of these tools for education and research.
- UBC Library’s guide on citing generative AI use.