AI tools can be amazing time savers for educators, helping to generate assessments, create lesson plans, develop rubrics, check grammar, draft emails, and so much more. Similarly, for a student, it’s almost 11:59 and that assignment is due! It is so convenient to paste that peer review document, research proposal, or essay into ChatGPT without a second thought…
However, in the excitement of running everything through an AI chatbot, it becomes easy to get careless with the information we share with it.
So, how do you make sure you’re using AI responsibly while protecting both your own work and the work of others? It’s a big question, especially in academic spaces where originality and integrity are key.
How do I safeguard my own work and that of others when using AI?
WHY NOT TO SHARE?
- Avoid uploading other people’s work: That means no pasting full articles, classmates’ essays, or unpublished work into AI systems. Many AI tools store and learn from the data they receive, which could lead to privacy violations or even accidental plagiarism down the line.
- Think before you share: If you’re working on something sensitive, whether it’s your own research, a group project, or even brainstorming notes, consider whether AI is the right place for it. Some AI tools store and analyze the information they process, meaning your ideas or unpublished work could become part of a broader dataset without your knowledge. This is especially important when dealing with research that isn’t yet public, personal reflections, or any collaborative work where others haven’t given consent to share.

A good rule of thumb? If you wouldn’t upload it to a public forum, it’s best to keep it out of AI systems.
WHAT NOT TO SHARE?
- Personally Identifiable Information: Avoid inputting anything that could identify you or someone else, such as student IDs, addresses, etc.
- Educational Records or Sensitive Academic Data: Transcripts, student work shared without consent, peer content
- Private Classroom or Ongoing Research Data: Raw data tables, private classroom recordings, survey responses
- Confidential University Documents: Unpublished research, syllabi, evaluations
HOW CAN I SHARE IT?
ANONYMIZING DATA
This means removing or altering any information that could identify a person or institution, along with any other kind of private information. This includes names, ID numbers, locations, contact details, and any unique characteristics that could be traced back to a specific individual, group, or idea.
What can I do?
- Replace names with generic labels (e.g., Student A, Teacher 1)
- Remove specific dates or locations unless necessary for context
- Obscure identifying references (e.g., “a university in Boston” instead of “Tufts”)
- Aggregate data where possible (e.g., summaries instead of individual responses)
- Create hypothetical situations to describe and provide context instead (a short scripted sketch of these steps follows this list)
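To make this concrete, here is a minimal sketch, in Python using only the standard library, of how you might scrub a passage before pasting it into a chatbot. The names, ID format, and replacement labels are purely illustrative assumptions, not a definitive tool; real anonymization depends on what your own text contains.

```python
import re

# Illustrative only: the names, ID format, and labels below are assumptions.
# Adjust the mapping and patterns to match what actually appears in your text.
KNOWN_NAMES = {
    "Jordan Lee": "Student A",
    "Dr. Rivera": "Teacher 1",
    "Tufts": "a university in Boston",
}

def anonymize(text: str) -> str:
    # Replace known names and institutions with generic labels
    for name, label in KNOWN_NAMES.items():
        text = text.replace(name, label)
    # Mask email addresses
    text = re.sub(r"[\w.+-]+@[\w-]+\.[\w.]+", "[email removed]", text)
    # Mask phone-number-like sequences
    text = re.sub(r"\b\d{3}[-.\s]?\d{3}[-.\s]?\d{4}\b", "[phone removed]", text)
    # Mask ID-like codes (here, a hypothetical 7-digit student ID)
    text = re.sub(r"\b\d{7}\b", "[ID removed]", text)
    # Remove specific dates (e.g., 04/12/2024) unless needed for context
    text = re.sub(r"\b\d{1,2}/\d{1,2}/\d{2,4}\b", "[date removed]", text)
    return text

if __name__ == "__main__":
    sample = (
        "Jordan Lee (ID 1234567, jlee@example.edu) met Dr. Rivera at Tufts "
        "on 04/12/2024 to discuss the survey results."
    )
    print(anonymize(sample))
```

Even with a script like this, treat the result as a first pass: automated patterns miss context-specific identifiers, so a final human read before sharing is still the safest step.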
Remember: Even partial or seemingly harmless details can sometimes be pieced together to reveal someone’s identity or hurt someone’s work. When in doubt, leave it out.
Anonymization isn’t just a technical step; it’s an ethical responsibility. Whether you’re a student, educator, or researcher, always treat data with the same care and respect you’d expect for your own.
CUSTOMIZE PRIVACY SETTINGS & OPT OUT OF TRAINING
It is always helpful to make a proactive effort to keep your data safe. You can start right in the data and privacy settings on your device, or by clearing chat histories regularly so that past conversations, preferences, and interactions are not stored for long.
It is also important to be aware that you can adjust data training and sharing options. What does this mean? Many generative AI platforms let you choose not to allow your data to be used to train future models and versions, ensuring that your inputs remain private and are not used to improve or fine-tune the AI.
Taking the time to review and modify these settings empowers us to protect our digital footprint while still benefiting from AI’s capabilities.