AI guides us based on our past, but is that always a good idea?
The Comfort of Familiar Choices
It feels good when a system “gets” us. Click on a few recipes for spicy ramen, and suddenly your feed is full of noodle shops, chili paste brands, and chefs breaking down the science of perfect broth. That kind of personalization can feel like a friend whispering, “I know what you’ll love next.” In food, it means we can easily go deeper into our favorite cuisines.
But there’s a hidden cost: the more we stay in one lane, the less likely we are to discover what’s happening outside of it. If algorithms decide that you’re “the Korean food guy,” you may never stumble across Texas barbecue, vegan Ethiopian dishes, or a rustic French cassoulet. What starts as helpful guidance can quietly narrow our horizons.
From Food to Politics: The Walls We Build
The same mechanism that brings you more noodle shops also shapes your political news, sports commentary, and even the jokes you see online. Algorithms are built to feed us more of what we already click on. The side effect? A loop that reinforces our views while shielding us from alternatives.
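To see how little it takes to build such a loop, here is a toy sketch in Python. Everything in it is an assumption made for illustration: the topics, the click model, and the weighting rule are invented, not taken from any real platform.

```python
import random
from collections import Counter

# Toy model of an engagement-driven feedback loop. The topics, the click
# model, and the "1 + past clicks" weighting are invented for illustration;
# this is not any real platform's ranking code.

TOPICS = ["korean", "bbq", "ethiopian", "french", "local news", "politics"]

def recommend(click_history, n=10):
    """Sample n items, weighting each topic by 1 + the user's past clicks on it."""
    counts = Counter(click_history)
    weights = [1 + counts[topic] for topic in TOPICS]
    return random.choices(TOPICS, weights=weights, k=n)

def simulate(favorite="korean", rounds=50):
    clicks = []
    for _ in range(rounds):
        feed = recommend(clicks)
        # The user clicks their favorite topic whenever it appears.
        clicks.extend(item for item in feed if item == favorite)
    # Measure what the feed looks like after the loop has run.
    return Counter(recommend(clicks, n=1000))

print(simulate())
# Typical output: the favorite topic fills 900+ of 1,000 slots. A handful of
# honest clicks, compounded round after round, crowds everything else out.
```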
In politics, this reinforcement can become dangerous. People can be pushed into echo chambers where they see only one side of an issue, while the content that provokes the strongest reactions keeps resurfacing because it earns the most engagement. Some observers have even speculated that recent political violence might have roots in this algorithm-driven division.
The research on this isn’t one-sided. A systematic review of 129 studies found mixed evidence about whether echo chambers and filter bubbles truly dominate online life. Some studies show strong reinforcement loops; others show weaker or more context-dependent effects. But even small levels of isolation, repeated daily, can magnify divisions.
How Algorithms Amplify—and How People Play Along
Several studies show that algorithms can amplify polarization. For example, the Center for Media Engagement found that platforms like Facebook and Instagram strongly shape what political news people see, often along ideological lines. Similarly, Brookings reports that engagement-driven designs tend to reward emotionally charged, divisive content.
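That design choice is easy to demonstrate in a few lines. The sketch below ranks posts by a hypothetical engagement score; the field names and the weight on emotional charge are assumptions for illustration, not a documented platform formula.

```python
from dataclasses import dataclass

# A sketch of engagement-optimized ranking. The fields and the weight on
# emotional charge are assumptions, not any platform's actual scoring rule.

@dataclass
class Post:
    title: str
    relevance: float          # match with the user's interests, 0..1
    emotional_charge: float   # predicted outrage/arousal, 0..1

def engagement_score(post: Post) -> float:
    # Charged content tends to draw more clicks, comments, and shares, so a
    # purely engagement-driven objective systematically rewards it.
    return post.relevance * (1.0 + 2.0 * post.emotional_charge)

feed = [
    Post("Calm explainer on the new policy", relevance=0.9, emotional_charge=0.1),
    Post("Outrage bait about the same policy", relevance=0.7, emotional_charge=0.9),
]
feed.sort(key=engagement_score, reverse=True)
print([post.title for post in feed])
# ['Outrage bait about the same policy', 'Calm explainer on the new policy']
# The charged post ranks first despite being less relevant.
```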
At the same time, researchers emphasize that user behavior matters. People often prefer to click on and share content that confirms their beliefs, even when exposed to opposing views. In other words, algorithms amplify—but we often play along.
Short-Form Video, Influencers, and the Speed of Division
Newer platforms add a twist. Research on short-form video platforms (like TikTok and Douyin) shows stronger clustering around narrow topics and viewpoints, with rapid sharing accelerating the spread of polarizing or misleading content. And increasingly, many people, especially younger audiences, get their news from social media influencers rather than traditional outlets. A Knight Foundation study found that these influencers often lean politically in one direction, shaping large swaths of public opinion without the editorial checks of traditional journalism.
The Rise of “Pink Slime” News
Another contributor is the growth of websites that look like local news outlets but are actually funded by partisan groups. These “pink slime” sites are multiplying, particularly as traditional local newspapers shrink. The Financial Times has documented how these outlets exploit people’s trust in local branding to deliver politically loaded content disguised as community news.
When algorithms boost such outlets, the result can be a feedback loop of distrust and misinformation.
Can We Nudge AI Toward Curiosity?
The good news is that it doesn’t have to be this way. Research suggests that small algorithmic changes can make a difference. One study showed that nudging recommendation systems can increase exposure to political diversity and reduce ideological bias. Other researchers suggest using signals like “audience diversity” (how many different political groups read a site) to help surface more balanced content.
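The audience-diversity idea can be made concrete with a small re-ranking sketch. The entropy-based measure and the blend weight below are illustrative assumptions, not the exact method from the research mentioned above.

```python
import math

# Sketch of a diversity nudge: blend the usual engagement score with an
# "audience diversity" signal. The entropy measure and the blend weight
# alpha are illustrative assumptions, not a published method.

def audience_diversity(readers_by_group):
    """Normalized entropy of a source's readership across political groups:
    0.0 = read by a single group, 1.0 = read evenly by all groups."""
    total = sum(readers_by_group.values())
    probs = [n / total for n in readers_by_group.values() if n > 0]
    if len(probs) < 2:
        return 0.0
    entropy = -sum(p * math.log(p) for p in probs)
    return entropy / math.log(len(readers_by_group))

def nudged_score(engagement, diversity, alpha=0.3):
    # alpha controls how hard the system pushes toward cross-cutting sources.
    return (1 - alpha) * engagement + alpha * diversity

partisan_outlet = {"left": 950, "right": 50}
broad_outlet = {"left": 480, "right": 520}

print(nudged_score(0.80, audience_diversity(partisan_outlet)))  # ~0.65
print(nudged_score(0.70, audience_diversity(broad_outlet)))     # ~0.79
# The outlet read across the aisle now outranks the slightly "stickier"
# partisan one.
```

A small alpha preserves most of the existing ranking and only tips the scales when candidates are close, so the nudge stays a nudge rather than an override.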
Even without algorithmic fixes, individuals can broaden their own “media diet” just like they would a food diet: mix comfort food with something new. Try sources you wouldn’t normally read, follow people outside your usual circle, and consciously seek balance.
Takeaway: Curiosity Over Comfort
AI personalization is powerful, but unchecked, it narrows the world. By broadening the inputs—whether food, politics, or sports—AI could shift from reinforcing divisions to encouraging exploration. Food may be the easiest place to start: invite someone to try your favorite meal, ask for theirs in return, and let curiosity, not algorithms, set the menu.