Day 7: What to Keep Private
The Concept
Using AI well includes knowing what not to share with it. Most people using consumer AI tools do not fully realise that their conversations may be used by the companies behind those tools for training, quality improvement, or human review. This is disclosed in the terms of service, which most people have not read. None of this makes AI dangerous. It does mean that some information belongs outside those conversations entirely.
The three categories to protect
Personal identifiers are the first. Your full name combined with date of birth, home address, passport number, national insurance or social security number, or any government-issued ID reference. These are the data points that enable identity theft. No task you are doing with AI requires this level of personal detail. If you need to refer to yourself in an example, use your first name only, or use a placeholder entirely.
Financial and access credentials are the second. Passwords, PINs, bank account numbers, card details, and any credentials that grant access to accounts. There is no legitimate AI prompt scenario that requires these. If a workflow seems to need them, the workflow needs to be redesigned, not the privacy rule.
Confidential professional information is the third. Details about your employer's unreleased products, client names and data covered by confidentiality, internal strategy that has not been made public, legally privileged communications, and anything covered by a non-disclosure agreement. Many organisations are developing explicit AI use policies — if yours has one, it tells you what is and is not permitted. If it does not have one yet, the conservative rule is: would you be comfortable if this information were accessible outside your organisation? If not, keep it out.
What is completely fine to share
The vast majority of AI use involves information that is entirely appropriate: general descriptions of tasks and goals, publicly available text you want summarised or explained, writing you want improved, problems you want to think through, topics you want to learn about. None of these require personal or sensitive data to be useful.
Consumer versions of AI tools — the free tier of ChatGPT, standard Claude.ai, and standard Gemini — have terms of service that permit some use of conversations for model improvement. Enterprise and business versions — ChatGPT Enterprise, Claude for Enterprise, and Gemini for Google Workspace — typically operate under different contractual terms with stronger privacy protections. The distinction matters particularly when you are using AI for professional purposes.
Before you paste anything into AI, ask one question: would I be comfortable if this appeared in a different context? If yes, share it. If you would hesitate, generalise it or leave it out. That question takes three seconds and removes the most significant privacy risk associated with everyday AI use.
Prompt of the day
Copy this into your AI tool and replace any bracketed placeholders.
Prompt
I want to use AI to help me with: [DESCRIBE YOUR TASK]. Here is the context you need: [DESCRIBE THE SITUATION USING GENERAL DESCRIPTIONS — replace any names with 'a colleague' or 'a client', replace any financial figures with approximate ranges, and leave out any passwords, ID numbers, or confidential business details]. Please help me with this.
Your 15-minute task
Think of a task you want to do with AI that involves some personal or professional context. Write the context once with everything included, then write it again with all sensitive information replaced by general descriptions. Notice that the second version gives AI everything it needs without the risk. Use the second version in your actual prompt.
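If you are comfortable with a little code, the redaction pass in the task above can be sketched as a small helper. This is a minimal, illustrative sketch: the patterns and placeholder labels below are hypothetical examples that catch only a few obvious formats, not a complete detector of personal data.

```python
import re

# Hypothetical example patterns -- illustrative only, not a complete PII detector.
PATTERNS = {
    r"\b\d{2}/\d{2}/\d{4}\b": "[DATE]",                     # dates such as 04/07/1990
    r"\b(?:\d[ -]?){13,16}\b": "[CARD OR ACCOUNT NUMBER]",  # long digit runs (cards, accounts)
    r"\b[A-Z]{2}\d{6}[A-D]\b": "[NI NUMBER]",               # UK national insurance format
    r"\b[\w.+-]+@[\w-]+\.[\w.]+\b": "[EMAIL]",              # email addresses
}

def redact(text: str) -> str:
    """Replace matches of each pattern with a general placeholder."""
    for pattern, placeholder in PATTERNS.items():
        text = re.sub(pattern, placeholder, text)
    return text

before = "Email jane.doe@example.com about card 4111 1111 1111 1111."
print(redact(before))
```

A script like this is a backstop, not a substitute for the habit: it cannot recognise a client's name or an unreleased product, so the three-second question still comes first.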
Expected win
A clear personal policy for what you share with AI — and the habit of generalising or redacting sensitive information before it reaches any chat window.
Power user tip
If you use AI for work, spend five minutes finding out whether your organisation has access to an enterprise version of the tool — ChatGPT Enterprise, Claude for Work, or Gemini for Workspace. Enterprise versions typically offer contractual privacy protections that consumer apps do not. One email to your IT team is worth the clarity.