Your Data Stays Yours. Period.
Every time you type a prompt into an AI platform and hit “Generate,” something happens that most people never think about. Your words — your ideas, your creative direction, your private thoughts — leave your device and land on someone else’s server. What happens next depends entirely on whose server it is.
Most people assume the transaction is simple: I give you a prompt, you give me a result, and that’s it. But the fine print tells a very different story.
The Real Cost of “Free”
When a platform gives you free access to GPT-4, free image generation, free voice cloning — ask yourself: who is paying for the GPUs? An H100 costs over $30,000. A cluster of them costs millions. Inference isn’t cheap. If you’re not paying, you’re not the customer. You’re the product.
This isn’t paranoia. It’s the business model, written plainly in terms of service that nobody reads.
“You grant us a worldwide, non-exclusive, royalty-free license to use, reproduce, modify, and distribute your content for the purpose of improving our services.”
That sentence — or something functionally identical — appears in more AI platform terms of service than you might expect. “Improving our services” means training. Your prompts, your uploaded images, your voice recordings — they become training data. Your creative work improves their model, which they sell to other customers, who compete with you.
What Actually Happens When You Hit Generate
Let’s trace the journey of a single prompt on a typical “free” AI platform:
1. Your prompt leaves your browser, encrypted in transit.
2. It arrives at the platform’s API gateway.
3. It’s logged — often with your user ID, timestamp, IP address, and session metadata (see the sketch after this list).
4. It’s processed by the model.
5. Both your prompt and the response are stored — sometimes for 30 days, sometimes indefinitely.
6. Your prompt enters a training pipeline, where it’s used to fine-tune the next version of the model.
7. If the platform has a “community” or “explore” feature, your generated images might be publicly visible by default.
Steps 5 through 7 are where the real cost lives. Not in dollars. In ownership.
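To make steps 3 and 5 concrete, here is a minimal sketch of what a gateway-side log record might look like. Every field name is hypothetical (no platform publishes its logging schema), but the shape is typical of request logging:

```python
import json
import logging
from datetime import datetime, timezone

logging.basicConfig(level=logging.INFO)
logger = logging.getLogger("api_gateway")

def log_prompt(user_id: str, session_id: str, client_ip: str, prompt: str) -> None:
    """Hypothetical gateway-side logger (steps 3 and 5). Note how much
    identity metadata rides along with the prompt itself."""
    record = {
        "timestamp": datetime.now(timezone.utc).isoformat(),
        "user_id": user_id,        # ties the prompt to your account
        "session_id": session_id,  # ties it to everything else you did
        "client_ip": client_ip,    # ties it to a rough physical location
        "prompt": prompt,          # the content itself, verbatim
        "retention_days": None,    # None meaning: no expiration set
    }
    logger.info(json.dumps(record))

log_prompt("u_8421", "s_1977", "203.0.113.7", "rebrand concept for my startup...")
```

Once a record like this exists, “delete my account” and “delete my data” become two very different requests.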
We Read the Fine Print
We went through the terms of service and privacy policies of a dozen major AI platforms. Here’s what we found:
• Training on user inputs unless you explicitly opt out (and the opt-out is buried 4 clicks deep)
• Retaining prompts and outputs for “safety research” with no expiration date
• Sharing “anonymized” data with “partners” (the definition of “anonymized” is conveniently vague)
• Default-public galleries where your generated images are visible to everyone
• Analytics trackers from Google, Meta, and data brokers on the same page where you enter prompts
• Broad IP licenses that survive account deletion
The opt-out pattern is particularly insidious. The default is always “yes, use my data.” The opt-out is always buried in settings, behind a confirmation dialog, sometimes requiring you to email support. Some platforms make the opt-out conditional — you can opt out of training, but only if you accept reduced functionality.
The message is clear: your privacy is a premium feature, not a right.
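The mechanics behind this are mundane. In code, the whole pattern can reduce to a single default value; here is a hypothetical sketch of the two philosophies:

```python
from dataclasses import dataclass

@dataclass
class OptOutSettings:
    """The dark pattern: every account that never opens the settings
    page silently inherits 'yes, use my data'."""
    allow_training: bool = True

@dataclass
class PrivateByDefaultSettings:
    """Privacy by default: nothing is used unless the user turns it on."""
    allow_training: bool = False
```

Every account that never opens the settings page inherits the default. The default, not the toggle, is the real policy.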
How Zubnet Is Different
When we built Zubnet, we made a decision on day one that shaped everything that followed: your data is yours, and we don’t want it.
This isn’t a marketing line. It’s an architectural decision. Here’s what it means in practice:
• No training on your inputs. Your prompts are never used to train any model. Ever. Not ours, not anyone else’s.
• No data selling. We don’t sell, share, or “anonymize and partner” your data. There are no data brokers in our stack.
• No surveillance analytics. We use self-hosted Plausible — a privacy-focused, open-source analytics tool that runs on our own servers. No Google Analytics. No Meta Pixel. No third-party trackers. Zero.
• No dark patterns. There’s no opt-out to find because the default is already privacy. There’s no “share to community” checked by default. There’s no gallery of your work visible to strangers.
• Your files, your control. When you delete something, it’s deleted. Not “marked for deletion in 90 days.” Deleted.
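That last point is worth making concrete. On many platforms, “delete” is a soft delete: a flag flips and the data lives on. Here is a minimal sketch of the difference, with hypothetical table and storage names:

```python
import sqlite3

def soft_delete(db: sqlite3.Connection, file_id: str) -> None:
    # What "marked for deletion in 90 days" looks like: the row,
    # and the file behind it, still exist.
    db.execute(
        "UPDATE files SET deleted_at = datetime('now') WHERE id = ?",
        (file_id,),
    )
    db.commit()

def hard_delete(db: sqlite3.Connection, storage, file_id: str) -> None:
    # Deleted means deleted: remove the stored bytes, then the record.
    row = db.execute(
        "SELECT storage_key FROM files WHERE id = ?", (file_id,)
    ).fetchone()
    if row is not None:
        storage.delete(row[0])  # 'storage' is a hypothetical object-store client
        db.execute("DELETE FROM files WHERE id = ?", (file_id,))
        db.commit()
```

Soft deletes have legitimate uses (undo, abuse review), but when the promise is deletion, only the second function keeps it.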
The Infrastructure Behind the Promise
Promises are easy. Infrastructure is hard. Here’s what backs ours:
Our application server runs in Canada, on infrastructure powered by Québec’s hydroelectric grid — one of the cleanest energy sources on the planet. Our GPU inference partner in Finland runs on wind energy. We chose these locations deliberately. Not because “green” is trendy, but because the energy you consume for AI should come from sources that don’t accelerate the climate crisis.
We run our own analytics. Plausible is self-hosted on our infrastructure. The data never leaves our servers. We can see page views and referrers — enough to improve the product — without knowing who you are, what you searched for, or what you clicked on before you got here.
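For the curious, here is roughly what self-hosted analytics looks like in practice: a sketch that records a pageview by POSTing to Plausible’s documented events endpoint on your own instance. The hostnames and domain below are placeholders:

```python
import json
import urllib.request

PLAUSIBLE_HOST = "https://analytics.example.com"  # placeholder: your own instance

def record_pageview(page_url: str, user_agent: str) -> None:
    """Send a pageview to a self-hosted Plausible instance. The event
    carries no user identifier: no cookie, no fingerprint, no third party."""
    payload = json.dumps({
        "domain": "example.com",  # placeholder site domain
        "name": "pageview",
        "url": page_url,
    }).encode("utf-8")
    req = urllib.request.Request(
        f"{PLAUSIBLE_HOST}/api/event",
        data=payload,
        headers={
            "Content-Type": "application/json",
            "User-Agent": user_agent,  # used by Plausible for device stats only
        },
        method="POST",
    )
    urllib.request.urlopen(req)
```

Because the endpoint lives on infrastructure you control, the request never touches a third party. That is the entire point of self-hosting.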
We don’t run ads. We don’t have an ad-tech stack. There’s no retargeting pixel following you around the internet after you visit Zubnet. The business model is straightforward: you pay for the service, and we deliver it. That’s the entire transaction.
Why This Matters More Than You Think
AI is becoming infrastructure. It’s moving from “fun toy” to “how I do my job.” Designers use it for concepts. Writers use it for drafts. Developers use it for code. Lawyers use it for research. Therapists are starting to use it for session notes.
The prompts people write to AI systems are among the most intimate data they produce. More intimate than search queries. More revealing than purchase history. A prompt to an AI is a window into how someone thinks — their unfiltered ideas, their half-formed questions, their creative instincts before they’re polished for the world.
That data deserves protection, not monetization.
A Values Decision, Not a Marketing One
We could make more money if we tracked everything. We could improve our models if we trained on user data. We could reduce costs if we used free analytics from Google. Every privacy decision we’ve made has a financial cost.
We made them anyway.
Not because we think it’ll win us customers (though it should). Not because regulations require it (though they’re catching up). Because it’s the right thing to do. Because when someone trusts us with their creative work, their business ideas, their private thoughts — the only acceptable response is to honor that trust completely.
This isn’t a feature announcement. It’s a statement of values. The way AI platforms handle data will define the next decade of digital trust. We’ve made our choice. Now it’s your turn.
Zubnet — 361 models, 61 providers, zero data harvesting. See for yourself.