
The Illusion of Control: Designing for the Mind, Not the Algorithm

  • Writer: Alex Shilman
  • Jul 13
  • 5 min read
The late Steve Jobs, illustration

A Satisfied User or a Correct User?


One of the exciting features Apple included in its new iPod when it launched in late October 2001 was the shuffle function—a completely random song generator. No more endless looping playlists, but a true randomizer choosing what you’ll hear next.


But within days, angry customer complaints started rolling in. The randomizer, users claimed, was a hoax. Songs repeated in suspicious patterns; sometimes the same artist even played twice in a row. Apple engineers tried to explain—unsuccessfully—that the mechanism was truly random and that this is exactly what randomness looks like. But customers wanted “real” randomness.


Eventually, Apple was forced to change the mechanism to make it less random so it would feel more random. A frustrated Steve Jobs put it succinctly: “We made it less random to make it feel more random.” Apple had learned the hard way what politicians and salespeople have known for centuries: sometimes it’s easier to trick the brain than convince it to accept reality.


When it comes to the intersection of technology and human psychology, fate doesn’t discriminate between tech giants and fledgling startups. The exact same scenario unfolded when Spotify expanded beyond its British-Scandinavian early-adopter crowd to broader audiences.


As the platform gained traction, users (this was already 2014, when “customers” had fallen out of fashion and everyone said “users”) hated hearing songs repeat too closely or spotting what felt like patterns in their playlists. Wild conspiracy theories spread—Spotify must be favoring artists with better royalty deals. Forums (this was 2011–2014; forums still mattered online) were flooded with suspicions of algorithmic manipulation.


In hindsight, after the Jay-Z / TIDAL scandal, in which streaming numbers were allegedly rigged to promote Beyoncé and friends, those theories seem far less outrageous. But that scandal was still four years away. Meanwhile, Swedish engineers at Spotify were pulling their hair out, trying to prove their algorithm was bias-free.


Eventually, Spotify gave in. Rather than explaining how elegant and efficient their shuffle algorithm was, they tweaked it to better please users. Songs were still selected randomly, but now positioned in the playlist to maximize spacing between tracks from the same artist.
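Spotify has never published the exact code, but the spacing idea is easy to sketch. In this hypothetical Python version (the `spaced_shuffle` function and the track format are my own invention, not Spotify's API), each artist's songs are shuffled internally, assigned evenly spaced positions with a little random jitter, and the merged list is sorted by position:

```python
import random
from collections import defaultdict

def spaced_shuffle(tracks):
    """Shuffle tracks while spreading same-artist songs across the playlist.

    Hypothetical sketch: each artist's songs get evenly spaced float
    positions (with random jitter), then the whole list is sorted.
    """
    by_artist = defaultdict(list)
    for track in tracks:
        by_artist[track["artist"]].append(track)

    n = len(tracks)
    positioned = []
    for songs in by_artist.values():
        random.shuffle(songs)              # random order within one artist
        gap = n / len(songs)               # ideal spacing for this artist
        offset = random.uniform(0, gap)    # random starting point per artist
        for i, song in enumerate(songs):
            jitter = random.uniform(-0.1, 0.1) * gap
            positioned.append((offset + i * gap + jitter, song))

    positioned.sort(key=lambda pair: pair[0])
    return [song for _, song in positioned]
```

Each artist still gets a random internal order and a random starting offset, so the result feels unpredictable, yet two songs by the same artist rarely land back to back.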




From Roulette Wheel to Playlist



So what really happened? Apple, Spotify, and countless others (many unknowingly) ran into a classic cognitive bias—the gambler’s fallacy. In short, it’s the belief that true randomness doesn’t produce patterns, especially not consecutive repetitions. So if a roulette wheel lands on red six times in a row, the next spin must come up black, right? (Wrong—the wheel doesn’t “remember” past outcomes, and the odds of red remain just under 50% on every spin.)
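A quick simulation makes the point: genuinely random sequences are full of streaks. This sketch (the sequence length and streak threshold are chosen arbitrarily for illustration) counts how many 20-spin red/black sequences contain a run of five or more identical outcomes:

```python
import random

def longest_run(outcomes):
    """Length of the longest streak of identical consecutive outcomes."""
    best = cur = 1
    for prev, nxt in zip(outcomes, outcomes[1:]):
        cur = cur + 1 if nxt == prev else 1
        best = max(best, cur)
    return best

random.seed(42)
trials = 10_000
# Among 10,000 sequences of 20 fair red/black spins, count how many
# contain at least one run of 5 or more identical results.
hits = sum(
    longest_run([random.choice("RB") for _ in range(20)]) >= 5
    for _ in range(trials)
)
print(f"{hits / trials:.0%} of sequences had a streak of 5+")
```

Run it and you will typically find that close to half of the sequences contain such a streak. A streak is what randomness looks like, not evidence that the randomness is broken.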


Think of a basketball player who makes three shots in a row (a player who doesn’t normally do that—forget Steph Curry for a second). It’s almost impossible to resist the feeling that something special is happening. Commentators talk about a “hot hand” (there isn’t one—statistically, players tend to shoot slightly worse after a successful streak). Teammates will pass to him more. Coaches will hesitate to substitute him out.


The gambler’s fallacy stems from a cognitive process called the representativeness heuristic. To simplify a complex world, the brain relies on mental shortcuts to categorize stimuli. These shortcuts are universal and likely evolved over millions of years.


One such shortcut assumes that every phenomenon has a typical “prototype.” The more something resembles that prototype, the more likely our brain is to classify it under that category. The mental prototype of “randomness” is “lack of order.” This works well if you’re trying to spot a hidden stream on a savannah by detecting repeating reed patterns—or if you’re scanning for a tiger in the underbrush. It works less well with roulette wheels or shuffled playlists.




Promises Must Be Kept



So who should we listen to—the satisfied user or the technically correct one? In many cases, human psychology means that an “imperfect” product—say, one that’s not truly random—actually serves users better and receives more positive feedback. The decision isn’t always obvious. It must rely on a deep understanding of user psychology, especially how satisfaction or dissatisfaction forms.


It turns out satisfaction depends almost entirely on expectation—which, in turn, is typically set by the product’s promise (or offering). Failing to meet those expectations can come at a steep business cost.


A vivid example of this disconnect between promise and strategy comes from two well-known consumer brands: Golan Telecom and Cofix. Both based their brand on a simple message: “We’re disrupting the market!”—or, in Michael Golan’s catchy and culturally resonant phrasing, “Don’t be a sucker.”


As long as they kept that promise, customers were forgiving. They put up with patchy service, long lines, poor coverage, and crowded stores. But the moment either brand took a single step that felt out of line with their scrappy, consumer-champion image, the backlash was intense.


Golan had to pause negotiations to sell the company to one of Israel’s telecom monopolies, and ultimately wiped hundreds of millions off its valuation. Cofix suffered heavy losses after raising prices by just one shekel. Though still much cheaper than competitors, they had to launch a national damage-control campaign.




When UI Feels Like a Betrayal



In digital spaces, user expectations also shape how major players like Facebook and Google roll out interface changes. One of our most basic expectations is waking up to a familiar world. Major UI shifts cause discomfort at best—and outrage at worst.


A classic industry anecdote involves eBay. One day, they swapped their iconic yellow background for a cleaner white one. The user outcry was so massive that eBay had to revert to yellow—and then gradually fade it to white, little by little, each day.
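The mechanics of such a fade are trivial to sketch. In this hypothetical snippet (the hex values and the 30-day schedule are made up for illustration, not taken from eBay), the background color served each day is a linear interpolation between the old yellow and plain white:

```python
def fade_color(start_hex, end_hex, day, total_days):
    """Linearly interpolate between two '#rrggbb' colors.

    Sketch of a gradual fade: serve a slightly whiter background
    each day until nobody notices the yellow is gone.
    """
    start = [int(start_hex[i:i + 2], 16) for i in (1, 3, 5)]
    end = [int(end_hex[i:i + 2], 16) for i in (1, 3, 5)]
    t = day / total_days
    mixed = [round(s + (e - s) * t) for s, e in zip(start, end)]
    return "#{:02x}{:02x}{:02x}".format(*mixed)

# A pale yellow (approximate, invented) fading to white over 30 days
for day in (0, 15, 30):
    print(day, fade_color("#ffffcc", "#ffffff", day, 30))
```

Each daily step is far below the threshold of conscious notice, which is exactly why the trick works where an overnight switch failed.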


Google, too, learned the hard way not to “drop” new interfaces on users. When they released Inbox—an alternative email platform—it was done quietly, almost in secret. I recently exchanged emails with a Google exec who was fumbling to navigate the Inbox app, surprised anyone still used it.


Even Snapchat, the disruptor darling with a very young user base, faced a major backlash after a significant UI update in February. Over 800,000 users signed a petition to “bring back the old Snapchat.”


That’s why Facebook changes its interface extremely slowly. I was a very late adopter—no Facebook account until 2014. By then, even our grandmothers had learned to “like” posts. When I finally joined, I was shocked. The design felt ancient, stuck in the late 2000s. Worst of all—there were buttons everywhere! Dozens of them!


At first I thought my connection was slow (I signed up at London’s Luton Airport), and that I was seeing a “light” version of the site. But no—that was the real thing. My late entry to the world’s biggest social network made it painfully clear just how cautiously Facebook touches its interface. You really don’t want to move the cheese too quickly for 2 billion people…




In Conclusion



The promise (offering) of a product creates user expectations that don’t always match the founders’ perceptions. It’s hard to overstate the importance of calibrating the product to actual user expectations. And if that’s not possible—as often happens with early-stage startups when initial promises turn out to be inaccurate or unrealistic—it’s critical to pivot and reset expectations with users.


On the flip side, a mature product with an established user base builds its own expectation of smooth continuity. Any update that disrupts usage patterns must deliver clear and meaningful benefits to gain acceptance.


And sometimes—like with the iPod—founders deliver exactly what they promised, and users still believe they’ve been duped.

So what can we promise users when even they don’t quite know what they want?

That’s a topic for another time.



This post originally appeared on Geektime.co.il