5 Documents You Should Never Upload to ChatGPT
Your most sensitive documents are one upload away from a data policy you never read
ChatGPT is genuinely useful. Summarising a contract, pulling key dates from a lease, explaining insurance jargon in plain language — it handles all of this well. The problem is not what it does with your question. The problem is what happens to the document you attached.
When you upload a file to ChatGPT, the full content is sent to OpenAI’s servers in the United States. By default, that content can be used to train future models. You can opt out, but the opt-out is buried in settings most people never find. And even with training disabled, OpenAI retains your data for up to 30 days for “safety monitoring.”
Some documents are fine to upload. Meeting notes, recipes, random research — low stakes. But five types of family documents carry risks that most people do not think about until it is too late.
1. Your will
A will contains full legal names, addresses, Social Insurance Numbers, asset inventories, beneficiary details, and executor appointments. It is one of the most information-dense documents a family owns — and none of it changes until someone dies.
When you paste your will into ChatGPT to ask whether the executor clause covers a specific scenario, every detail in that document is sent to OpenAI. Your mother’s name, your children’s names, your home address, the account numbers for your RRSP and TFSA, the name of the person you trust to handle everything.
This is not data that expires or becomes irrelevant. A leaked will provides a complete identity theft kit that stays valid for decades.
Upload your will to Archevi instead. The AI reads your document locally, replaces personal details with anonymous surrogates before processing, and never sends the original content to any external service. You can ask the same questions and get the same answers — without your executor’s home address sitting on a server in San Francisco.
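The surrogate idea is worth seeing concretely. The sketch below is purely illustrative and is not Archevi's actual code: the patterns, the `anonymise` helper, and the surrogate token format are all assumptions chosen for the example. It shows the general technique of swapping identifiers for placeholders before any text reaches an external service.

```python
import re

# Illustrative only: a toy surrogate-replacement pass, NOT Archevi's
# real pipeline. Patterns and token names are assumptions for this sketch.
PATTERNS = {
    "SIN": re.compile(r"\b\d{3}[- ]\d{3}[- ]\d{3}\b"),      # e.g. 123-456-789
    "EMAIL": re.compile(r"\b[\w.+-]+@[\w-]+\.[\w.]+\b"),
}

def anonymise(text):
    """Replace matches with numbered surrogates; return the safe text
    plus a local mapping so answers can be re-personalised afterwards."""
    mapping = {}
    counters = {}

    def make_repl(kind):
        def repl(match):
            counters[kind] = counters.get(kind, 0) + 1
            token = f"[{kind}_{counters[kind]}]"
            mapping[token] = match.group(0)  # original value never leaves
            return token
        return repl

    for kind, pattern in PATTERNS.items():
        text = pattern.sub(make_repl(kind), text)
    return text, mapping

safe, mapping = anonymise("SIN 123-456-789, contact jane@example.com")
print(safe)  # SIN [SIN_1], contact [EMAIL_1]
```

The key design point: the mapping from surrogate back to real value stays on the local side, so only placeholder tokens ever appear in anything sent onward.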
2. Insurance policies
An insurance policy contains your full name, date of birth, address, policy number, coverage amounts, beneficiary names, and often your health history or property details. People upload these to ChatGPT to understand coverage limits or check renewal dates.
Insurance policies reveal both your assets and your vulnerabilities. They show what you own, what you are worried about losing, and exactly how much it is worth. Combined with the personal identifiers in the document, this is everything a social engineer needs.
A missed renewal date is particularly dangerous: a lapsed policy can leave your family unprotected at exactly the wrong moment. ChatGPT can tell you when your policy renews, but it will not remind you when that date approaches.
Archevi extracts renewal dates and sends you reminders before they expire. Your insurance details stay in Canada on encrypted servers — never sent to an AI training pipeline.
3. Tax returns
Your tax return is a financial fingerprint. It contains your Social Insurance Number, employer details, income figures, investment accounts, RRSP contributions, medical expenses, charitable donations, and dependants’ information. People upload these to ChatGPT to check deductions or understand CRA notices.
A tax return uploaded to ChatGPT gives OpenAI your SIN, your income, your employer, your investment accounts, and your family structure. Even if training is disabled, this data is retained on US servers for up to 30 days under OpenAI’s safety monitoring policy.
The CRA takes SIN exposure seriously for good reason. A compromised SIN can be used to file fraudulent tax returns, open credit accounts, or redirect government benefits — and the damage takes years to unwind.
The difference, side by side:

| ChatGPT | Archevi |
| --- | --- |
| Your SIN is sent to OpenAI servers in the US | Your SIN is hard-blocked before it ever reaches any AI service |
| Data retained for up to 30 days | Documents stored in Toronto, Canada only |
| No expiry date reminders | Automatic extraction of key dates and deadlines |
| Training opt-out buried in settings | Zero training on user data, ever |
4. Medical records
Prescription lists, lab results, diagnostic imaging reports, specialist referrals, vaccination records. People upload these to understand test results or prepare for appointments.
Medical records are among the most sensitive documents under Canadian privacy law. They reveal conditions you may not have disclosed to your employer, your insurer, or even your family. Once this information leaves your control, you cannot take it back.
Under PIPEDA, health information is considered sensitive personal information requiring a higher standard of protection. Uploading medical records to a US-based service means your health data is subject to US law, including potential government access under the CLOUD Act — regardless of OpenAI’s privacy promises.
Archevi processes your medical records entirely within Canada. Personal health information is anonymised before any AI interaction, and the original documents are stored on encrypted Canadian servers subject to PIPEDA — not the CLOUD Act.
5. Property deeds and mortgage documents
A property deed contains your legal name, your property address, the purchase price, the legal description of the land, and sometimes your mortgage details. Mortgage documents add your income verification, credit information, and banking details.
People upload these to ChatGPT to understand clauses, check encumbrances, or prepare for refinancing conversations. The risk is that these documents reveal both your most valuable asset and the financial details needed to impersonate you in a real estate transaction.
Title fraud — where someone impersonates a homeowner to sell or mortgage their property — is one of the fastest-growing forms of fraud in Canada. A property deed uploaded to ChatGPT provides exactly the information a fraudster needs: the legal description, your name as it appears on title, and the property address.
Upload property documents to Archevi instead. Your deed stays encrypted in Canada, the AI can answer your questions about easements and encumbrances, and your property details never leave the country.
What you can do right now
Check your ChatGPT history. Search for any uploaded files containing personal information. Delete conversations that include sensitive documents.
Turn off training. Go to ChatGPT Settings > Data Controls > Improve the model for everyone. Toggle it off. This does not delete data already collected, but it stops future uploads from being used for training.
Move sensitive documents to a private tool. Archevi gives you the same AI-powered document search without the privacy tradeoff. Your documents stay in Canada, personal details are anonymised before AI processing, and nothing is ever used for training.
ChatGPT is a remarkable tool for many things. Your family’s most private documents are not one of them.
Upload your first document and ask it a question. See how it works with your privacy intact. No credit card required.