Let’s talk about the ancient art of the thumbs-up and thumbs-down — once the sacred, clumsy handshake between user and algorithm.
In the early days of the web and AI, we thought feedback would save us: “Tell us what you like,” they said. “Rate your experience,” they begged. “Give us a thumbs-up if we’re doing a good job,” they smiled, desperate for validation.
But here’s the controversial reality no one wants to admit: explicit feedback is a dinosaur. It’s slow, biased, and, frankly, a lie.
The Illusion of Choice
Think about the last time you gave something a thumbs-up. Was it a genuine reflection of your emotions? Or a knee-jerk reaction, a social habit, a nudge to the algorithm gods so they’d stop bugging you?
User feedback today is riddled with cognitive distortions: recency bias, fatigue, apathy. People click thumbs-up because it’s easier than dealing with a pop-up. They smash thumbs-down because they’re having a bad day — not because the recommendation was bad.
We are not the rational agents UX designers dream we are. We’re messy. We’re tired. We’re distracted.
And AI knows it.
From Consent to Covert Surveillance
Here’s where it gets uncomfortable: the future of feedback is invisible. AI won’t ask for your opinion. It will take it — by studying you.
Instead of pestering you for a thumbs-up, it will watch what you actually do:
- How long you linger on a video.
- The micro-hesitations in your scrolling.
- The slight velocity change in your mouse movements.
- The way your pupils dilate when you see something you love (hello, camera data).
In short, your devices will quietly record your behaviors and reactions — the ones you don’t even realize you’re having.
Feedback will be stolen from your subconscious, not politely requested.
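To make the idea concrete, here is a minimal sketch of how signals like the ones above could be blended into a single implicit engagement score. Everything in it — the signal names, the weights, the caps — is an illustrative assumption, not any platform's actual model:

```python
from dataclasses import dataclass


@dataclass
class BehaviorSignals:
    """Implicit signals a client might log. All names are hypothetical."""
    dwell_seconds: float   # how long the user lingered on the item
    scroll_pauses: int     # micro-hesitations while scrolling past it
    revisit_count: int     # times the user came back to the item
    expected_dwell: float  # baseline dwell time for items of this length


def implicit_score(s: BehaviorSignals) -> float:
    """Blend raw behaviors into a 0..1 engagement estimate.

    The weights (0.6 / 0.2 / 0.2) and caps are illustrative;
    a production system would learn them from data.
    """
    # Dwell relative to baseline, capped at 2x and scaled into 0..1.
    dwell_term = min(s.dwell_seconds / max(s.expected_dwell, 1e-6), 2.0) / 2.0
    # Cap pause and revisit counts so outliers don't dominate.
    pause_term = min(s.scroll_pauses, 5) / 5.0
    revisit_term = min(s.revisit_count, 3) / 3.0
    return 0.6 * dwell_term + 0.2 * pause_term + 0.2 * revisit_term
```

The point of the sketch: no prompt, no pop-up — just logged behavior reduced to a number the ranking model can consume.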
Why This Is Better (And Worse)
On one hand, this is what good design has always aimed for: a system that understands you without you needing to tell it. A recommendation engine that feels magical because it gets you.
On the other hand, this is surveillance capitalism on steroids. If you thought cookies were invasive, just wait until your browser tracks your micro-expressions.
Ethical AI? Forget it. Once companies realize that studying your unconscious behaviors gives them 100x better data than a thumbs-up, the floodgates will open. Consent will become ceremonial. Terms of service will mention it, but no one will read them — and even if they did, what choice will they have?
Try finding a platform that doesn’t do it in five years. You won’t.
The End of the Feedback Era
In the old days, we thought of AI as a student. We had to teach it — show it examples, correct its mistakes. Thumbs-up and thumbs-down were the clicker training of early AI.
But modern AI, fueled by deep learning and oceans of behavioral data, is graduating. It doesn’t need our breadcrumbs anymore.
The question is no longer “What do you think?” but “What are you doing when you think no one is watching?”
And guess what? Someone always is.
Designers, Wake Up
If you’re a UX designer, product manager, or data scientist still building interfaces around asking for user feedback, you’re designing for a past that no longer exists.
In five years:
- Ratings will be obsolete.
- Feedback pop-ups will vanish.
- Your job will be to interpret behavioral data — not survey responses.
The UX of invisibility is the next frontier: how to collect everything without appearing to collect anything. How to feel user-centered while actually being AI-centered.
It’s a paradox — and it’s where the future is headed.
Final Provocation
We used to think choice was the foundation of good design.
But AI’s new creed is different:
Don’t ask the user what they want. Just know.