Learning with AI in 2025: The New Academic Skill of AI Literacy

If you’re studying at Barossa Regional University Campus in 2025-2026, you’ve almost certainly crossed paths with artificial intelligence (AI). Maybe you’ve used ChatGPT to help brainstorm an essay, asked an AI tutor to explain a tricky concept, or used a design generator for a presentation. These tools can be incredibly useful—but only when they’re used thoughtfully.

AI is now part of university life, much like the internet was for previous generations. The challenge is no longer whether to use it, but how to use it well—and that means developing a new core academic skill: AI literacy.

What is AI Literacy?

AI literacy is the ability to understand, question, and use artificial intelligence responsibly. It sits alongside digital literacy and academic integrity as a fundamental skill for university learning.

At its heart, AI literacy is about three things:

  1. Understanding how AI works and where its limits are.
  2. Using AI tools to support—not replace—your learning.
  3. Recognising the ethical and environmental costs behind every output.

You don’t need to be a computer scientist to be AI literate. You just need curiosity, critical thinking, and a willingness to slow down and reflect.

Ask Good Questions

Generative AI is only as good as the questions you ask. A vague prompt (“write about communication in healthcare”) will produce a bland answer. A thoughtful prompt (“summarise three communication models used in allied health and suggest how they differ when working with older clients”) will give you something richer.

But asking good questions is about more than efficiency—it’s about learning. The process of crafting a precise question helps you clarify what you actually want to know. Try to:

  • Start with verbs of understanding: compare, evaluate, explain, critique.
  • Give the AI context: your discipline, the level of study, and your goal.
  • Iterate: refine your prompt as you learn more.

Think of prompting as a form of dialogue, not command. You’re not outsourcing thinking—you’re steering a conversation.

Evaluate the Answers

AI is confident, but not always correct. Tools like ChatGPT can mix accurate information with subtle errors or invented citations. They’re trained to predict plausible text from vast datasets, not to verify what’s true.

Before you rely on any AI-generated material:

  • Cross-check facts using textbooks, scholarly databases, or official guidelines.
  • Question bias—whose perspectives might be missing?
  • Watch for surface gloss—AI is good at sounding plausible, less good at being nuanced.

If you can’t explain or verify a claim in your own words, you don’t understand it yet. That’s your cue to dig deeper.

Use AI to Support, Not Substitute

AI can accelerate your process—brainstorming topics, organising ideas, or checking grammar—but it can’t replace the thinking work that makes study transformative.

Try using AI at the edges of your work:

  • Brainstorm examples, but select and refine them yourself.
  • Ask for structure suggestions, then fill them with your own evidence and analysis.
  • Use feedback tools to identify unclear sections, but rewrite them in your own voice.

Remember: at uni, you’re assessed on what you understand, not on what a system can produce. The best learning happens when AI amplifies your effort rather than replacing it.

Respect Data, Authors, and Energy

Every AI tool has a hidden cost. Most are trained on massive datasets scraped from books, articles, and websites—often without the consent of the authors. The systems that generate answers for us also rely on energy-hungry data centres, cooled with water and powered by electricity that’s not always renewable.

You can practise ethical and environmentally aware AI use by:

  • Choosing tools that disclose their data sources or use open, consented datasets.
  • Using AI purposefully, not habitually—don’t generate five drafts when one will do.
  • Citing AI where required, just as you would any other source.
  • Combining AI work with human conversation—your peers, tutors, and community are irreplaceable sources of insight.

AI literacy, in this sense, isn’t just about being savvy. It’s about being a responsible participant in the knowledge ecosystems that sustain learning—and the planet that sustains us all.

Be Transparent About Your Use

Academic integrity policies now expect students to declare when and how AI was used in an assignment. This isn’t a trap—it’s an invitation to reflect. Try keeping a brief AI use log:

  • Which tool did you use?
  • What was your prompt or question?
  • How did it change your understanding or process?
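A notebook or spreadsheet works fine for this. If you’re comfortable with a little code, the log could also be kept as a small script—here’s one possible sketch (the file name, field names, and example entry are purely illustrative):

```python
import csv
from datetime import date
from pathlib import Path

# Hypothetical log file and columns, matching the three questions above.
LOG_FILE = Path("ai_use_log.csv")
FIELDS = ["date", "tool", "prompt", "effect_on_my_work"]

def log_ai_use(tool, prompt, effect, log_file=LOG_FILE):
    """Append one AI-use entry to a CSV log, writing headers on first use."""
    is_new = not log_file.exists()
    with log_file.open("a", newline="", encoding="utf-8") as f:
        writer = csv.DictWriter(f, fieldnames=FIELDS)
        if is_new:
            writer.writeheader()
        writer.writerow({
            "date": date.today().isoformat(),
            "tool": tool,
            "prompt": prompt,
            "effect_on_my_work": effect,
        })

# Example entry (illustrative only):
log_ai_use(
    "ChatGPT",
    "Summarise three communication models used in allied health",
    "Gave me starting points; I checked each against the unit readings.",
)
```

Even a few lines per assignment builds a record you can draw on when a declaration of AI use is required.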

Being transparent protects your integrity and shows that you’re engaging critically with the technology rather than leaning on it.

Learn Slowly, Think Deeply

There’s a movement called slow AI that encourages people to engage with technology more deliberately—to pause, think, and reflect before acting. You can adopt the same attitude as a student. When you use AI, take a moment to ask:

  • What am I trying to learn here?
  • Is AI the best way to do it?
  • What might I lose if I move too fast?

Slowing down doesn’t mean rejecting technology. It means letting your curiosity and ethics lead the process.

AI will keep changing how we learn, but it doesn’t have to change who we are as learners. You have agency, creativity, and moral judgement—qualities no algorithm can replace.

Use AI to expand your understanding, not to outsource it.
Ask questions that matter. Check answers that sound too easy.
And remember that learning, like sustainability, is a long game: it thrives when we stay thoughtful, grounded, and human.