Once you start noticing, you can see it everywhere: the 'sameness' of style, music, art and even food.
Kyle Chayka, New Yorker staff writer and author of the new book Filterworld: How Algorithms Flattened Culture, blamed algorithms that direct our attention to whatever works best for digital platforms.
You see it in Google searches, Facebook feeds and the ads that follow us online. The end result is that we have stopped deciding what we like for ourselves.
Chayka told Afternoons he wanted people to better understand how these forces shape our taste and stop letting a computer-generated formula dictate our experiences and choices.
The phrase "filterworld" describes the immersive environment of algorithms that are adjusted to what we might like to consume.
"An algorithm is just an equation ... and what they judge by is really engagement. So it's what's already popular, what are other people looking at, what are other people clicking on and I think that tends to reinforce this sort of homogeneity.
"It funnels us towards the same styles or aesthetics or sounds that work for the biggest number of people."
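The ranking logic Chayka describes can be sketched in a few lines. This is an illustrative toy, not any platform's actual formula: items are ordered purely by past engagement counts, so whatever is already popular surfaces first for everyone, reinforcing the homogeneity he describes.

```python
# Toy engagement-based feed ranking. The field names and numbers are
# hypothetical; real platforms weight many more signals.

def rank_feed(items):
    """Sort items by a naive engagement score (clicks + likes + shares)."""
    def engagement(item):
        return item["clicks"] + item["likes"] + item["shares"]
    return sorted(items, key=engagement, reverse=True)

feed = [
    {"title": "niche essay",  "clicks": 40,   "likes": 5,    "shares": 1},
    {"title": "viral video",  "clicks": 9000, "likes": 1200, "shares": 300},
    {"title": "local news",   "clicks": 300,  "likes": 20,   "shares": 4},
]

for item in rank_feed(feed):
    print(item["title"])
```

Because the score only rewards what is already being clicked, every user who opens this feed sees the same "viral video" at the top, which is the funnelling effect Chayka is pointing at.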
While there are the obvious culprits like YouTube, Spotify or Netflix, Chayka said even physical world places like cafes and restaurants are being affected.
Many of our consumption decisions have moved online, he said: checking out a restaurant on Instagram, for example, or reading its menu on Google Maps first.
"I think physical spaces have started to adapt to the aesthetics that are popular online."
He recalled travelling around the world as a freelance journalist and looking for coffee shops.
Most of them, no matter where he went, had the same aesthetic.
"I could just open Yelp or ... Google Maps and just find this particular style of generic coffee shop and we all know what it is.
"[It's] reclaimed wood furniture, often avocado toast on the menu, latte art, you could just find that wherever you went."
The new gatekeeper
Gatekeepers of information have always existed.
Traditionally, that looked like television stations, magazines and newspapers. Chayka acknowledged that system had its own problems, but said algorithms as the new gatekeepers bring issues of their own.
"It only judges based on data ... it has no feelings, it has no creativity, it has no human soul."
While the marketing message was that recommendations would be personalised, the reality was they would be what was most convenient for the platform, he said.
He pointed to Netflix as an example.
"[It's] supposed to be so personal, it shows me what I want to watch. But actually Netflix algorithmically adjusts the thumbnails of the shows and movies to make them appeal to you more.
"Rather than catering to you, it's manipulating you into thinking that you like what's already there, which I think is really sad."
So did exploring algorithms make Chayka question his own taste?
He admitted that his television preferences probably were not the best.
"I don't know if I have the most amazing taste in television and so I'm just sort of watching what's recommended ... and I can't tell if it's truly moving me or not."