Ep 67: How Pinterest Built One of Silicon Valley’s Most Successful Algorithms

Pinterest was built on assumptions and biases

Unlike most social networks, Pinterest admits it.

  • From the start, you tell the company how to profile you:
    • The service asks two personal questions when you register — your age and gender — and how you answer them shapes everything that happens next.
    • Based on your responses, along with your language, region, and bits of your browsing history, Pinterest chooses an array of topic categories it thinks you might be interested in and asks you to pick at least five.
    • Tell Pinterest you’re a woman in your thirties, and your suggested interests will include “Makeup,” “Hair Tutorials,” “Workout Plans,” and “Dinner Recipes.” Tell it you’re a man in your thirties, and you’ll get some very different choices: “Woodworking,” “Funny Pictures,” “Survival Skills,” and “Gaming.”
  • Once you’ve made your picks, Pinterest’s machine learning software crafts a home feed full of images, or “pins,” that it predicts will appeal to you. This is a crucial moment: Pinterest says its internal data shows that
    • if people see pins they like right away, there’s a good chance they’ll become active users, returning to the site regularly for fresh content related to their interests, viewing ads tailored to those interests, and curating their own “boards” of related pins.
    • If people fail to find anything that interests them at first glance, they may never come back.
  • For the 50 million new users who join Pinterest each year, the sign-up process is the first taste of one of Silicon Valley’s most successful yet least scrutinized algorithms. The code that powers Pinterest’s home feed, search results, and notifications, determining what images and ideas users see at every turn, is similar in kind to Facebook’s, YouTube’s, or TikTok’s. It is the core product of a $15 billion company that went public successfully this year, one of the few in this year’s crop of tech unicorns to do so.
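To make the onboarding flow above concrete, here is a minimal sketch, in Python, of how a cold-start recommender of this general kind could work: demographic signals select candidate topic categories, the user picks at least five, and those picks seed the first home feed. The function names, topic lists, and scoring rule are illustrative assumptions, not Pinterest’s actual code or data.

```python
# Minimal sketch of demographic-seeded onboarding; all names and data
# below are illustrative assumptions, not Pinterest's real system.

# Hypothetical mapping from coarse demographic buckets to suggested topics.
TOPIC_SUGGESTIONS = {
    ("female", "30s"): ["Makeup", "Hair Tutorials", "Workout Plans", "Dinner Recipes"],
    ("male", "30s"): ["Woodworking", "Funny Pictures", "Survival Skills", "Gaming"],
}
DEFAULT_TOPICS = ["Travel", "Photography", "Home Decor", "Recipes", "DIY", "Quotes"]

MIN_PICKS = 5  # the sign-up flow asks the user to pick at least five interests


def suggest_topics(gender: str, age_bucket: str) -> list[str]:
    """Return an ordered list of topic categories to show a new user."""
    suggested = TOPIC_SUGGESTIONS.get((gender, age_bucket), [])
    # Pad with generic topics so there is always enough to choose from.
    return suggested + [t for t in DEFAULT_TOPICS if t not in suggested]


def seed_home_feed(picked: list[str], candidate_pins: list[dict], k: int = 25) -> list[dict]:
    """Rank candidate pins by overlap with the topics the user picked."""
    if len(picked) < MIN_PICKS:
        raise ValueError(f"expected at least {MIN_PICKS} topic picks, got {len(picked)}")
    picked_set = set(picked)

    def score(pin: dict) -> float:
        overlap = len(picked_set & set(pin.get("topics", [])))
        return overlap + 0.01 * pin.get("popularity", 0)  # popularity as a tie-breaker

    return sorted(candidate_pins, key=score, reverse=True)[:k]
```

Under this kind of model, the first feed is driven almost entirely by the demographic lookup and the topic picks, which is why a bad guess at that moment can color everything a new user sees.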

Behind the scenes

  • Pinterest’s engineers and executives are grappling with the same kinds of tensions that have caused trouble elsewhere. In its first year as a public company, it faces a pivotal challenge: how to grow beyond a user base that has historically skewed toward white, suburban women without alienating loyalists, stereotyping newcomers, or opening the door to misinformation and radicalization.
  • The company is rolling out a feature designed to address perhaps its algorithm’s most visible flaw: its tendency to draw the wrong conclusions from users’ past behavior, and pollute their feeds with stuff they don’t want to see anymore — like wedding dresses for a user who broke off her engagement, or nursery decor for a user who suffered a miscarriage.
  • The feature, which Pinterest is calling the Home Feed Tuner, will let users tell the algorithm what to remember and what to forget. It’s expected to reduce complaints and raise satisfaction among a small subset of power users. But it will do little to help the site expand, and could even reduce engagement for those who use it by limiting the information available to the algorithm. It’s the kind of trade-off the company says it’s willing to make, especially since early tests showed no significant drop-off in user activity.

Pinterest plans to announce a new feature titled “Tune Your Home Feed” that lets you tell its algorithm which of your interests and behaviors you want used for future recommendations, and which ones you don’t. Credit: Pinterest
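The caption above describes the control at a high level; in practice it amounts to letting users mask individual interests and past actions before they reach the recommender. The Python sketch below is a rough illustration under that assumption; the class, field names, and function are hypothetical, not Pinterest’s implementation.

```python
# Illustrative "tune your feed" control: the user marks which inferred
# interests and past actions may inform future recommendations.
# Names and fields here are assumptions for illustration only.

from dataclasses import dataclass, field


@dataclass
class FeedTuning:
    muted_interests: set[str] = field(default_factory=set)  # e.g. {"Wedding Dresses"}
    muted_pins: set[str] = field(default_factory=set)       # past saves/clicks to forget


def filter_signals(tuning: FeedTuning, interests: list[str], activity: list[dict]):
    """Drop muted interests and actions before they reach the recommender.

    Whatever the user mutes becomes invisible to downstream ranking, which is
    also why heavy use of the control can shrink the signal available for
    personalization and, potentially, engagement.
    """
    kept_interests = [i for i in interests if i not in tuning.muted_interests]
    kept_activity = [a for a in activity if a.get("pin_id") not in tuning.muted_pins]
    return kept_interests, kept_activity
```

In this picture, a user who broke off an engagement could mute “Wedding Dresses” once, and the feed would stop reinforcing it without the user having to retrain the algorithm by changing their browsing behavior.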
  • Other trade-offs are proving trickier, however, like how to understand users deeply enough to keep them coming back for more, without boring them, boxing them in, or creeping them out. “Users don’t want to be pigeonholed,” says Candice Morgan, the company’s head of inclusion and diversity. “They don’t want us to guess what they’re going to like based on their demography.”
  • And yet, Pinterest does guess what they’re going to like based on their demography, at least in their first moments after sign-up. If it didn’t, some portion of users would decide Pinterest isn’t for them.
  • Then there are troubles that have plagued higher-profile social networks: viral misinformation, radicalization, offensive images and memes, spam, and shady sites trying to game the algorithm for profit. Here the company has taken a different approach than rival platforms: embrace bias, limit virality, and become something of an anti-social network.

Early users shaped the site’s trajectory

  • The company’s engineers followed the social media template, developing personalization algorithms that learned from users’ behavior. But relying too heavily on the data generated by those early users led to some problems. For example, you might stumble across a board full of wedding dresses in which the models are all white.
  • Initially, the home feed showed an assortment of the most popular pins from all users, based on the boards they followed, which was perfect for attracting like-minded newcomers, but not for diversifying the site’s appeal.
  • Over the years, Pinterest had to redesign its systems and retrain its algorithms to better identify and target different types of users and map their interests. Hence the question about gender when you sign up, the topic picker that gives the algorithm an initial sense of what you’re into, and the perhaps slightly intrusive (though industry standard) use of browser data that can tell Pinterest whether you’ve visited the site before and how you arrived there.
  • The question about language and region, for example, has helped Pinterest reach audiences outside the US, who had previously complained that the platform “felt foreign to them from the moment they signed up.” Well over half of Pinterest’s users now come from outside the United States, which is in line with other social networks of its size. In some ways, those users are helping to point the way to a more inclusive Pinterest: In Japan, for instance, the company reports that men are as likely to become active users as women after visiting the site for the first time.
  • But dicing users into ever finer subgroups carries its own risks, especially for groups that have historically been underrepresented on the site. Internal data might tell you that welcoming male users with a bunch of macho images boosts activation rates. What it might not tell you is that some subset of male users is turned off, or even offended, by the implicit assumption that they’re into “man caves” or pictures of “beautiful celebrities” who are all women.
  • Pinterest has never attracted as much media scrutiny as the likes of Twitter and Facebook, but that doesn’t mean it’s immune to the problems that have caused scandals elsewhere. One of its notable critics is Mike Caulfield, a media literacy and online communications expert at Washington State University Vancouver. In 2017, he went looking for political culture on Pinterest, and what he found was just about as ugly as what you’d expect on any other social platform. There were boards full of fake news, ethnic stereotypes, and QAnon conspiracy theories. Caulfield argued that Pinterest’s aggressive recommendation algorithm, coupled with its reliance on user-created “boards” of related images, can turn a user’s feed into a hate-filled cesspool within minutes. “After just 14 minutes of browsing, a new user with some questions about vaccines could move from pins on ‘How to Make the Perfect Egg’ to something out of the Infowarverse,” Caulfield wrote.
  • Pinterest’s reaction surprised him: They thanked him for highlighting the problem and invited him to meet with company executives and share ideas for how to solve it. And then, at least on the anti-vax issue, they followed through.
  • In August, Pinterest changed how its search engine treats queries about vaccines. Rather than surfacing the most popular vaccine-related pins, Pinterest said it would now show only pins from major health organizations, such as the World Health Organization and the CDC. Caulfield applauded the move, which amounted to a more decisive stand than most other platforms have taken. It showed that the company was willing to override its own software to address systemic problems that the algorithm alone can’t solve.
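In search terms, the vaccine change amounts to an override layer sitting in front of the normal ranking: queries classified as sensitive health topics return only results from an allowlist of authoritative publishers. The sketch below is a rough illustration of that pattern; the term list, domain allowlist, and function names are assumptions, not Pinterest’s actual rules.

```python
# Sketch of a search override for sensitive health queries: bypass
# popularity ranking and return only pins from trusted publishers.
# The specific terms and domains below are illustrative assumptions.

SENSITIVE_HEALTH_TERMS = {"vaccine", "vaccination", "immunization", "measles"}
AUTHORITATIVE_DOMAINS = {"who.int", "cdc.gov"}  # e.g. the WHO and the CDC


def is_sensitive_health_query(query: str) -> bool:
    """Very naive topical check; a real system would use a proper classifier."""
    q = query.lower()
    return any(term in q for term in SENSITIVE_HEALTH_TERMS)


def search_results(query: str, ranked_pins: list[dict]) -> list[dict]:
    """Restrict sensitive health queries to allowlisted sources; otherwise pass through."""
    if is_sensitive_health_query(query):
        return [p for p in ranked_pins if p.get("source_domain") in AUTHORITATIVE_DOMAINS]
    return ranked_pins
```

The design choice is blunt but deliberate: for these queries, recall and engagement are sacrificed entirely in favor of source quality.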

But what if optimizing engagement isn’t your ultimate goal?

  • That’s a question some other social networks, such as Facebook and Twitter, have recently begun to ask, as they toy with more qualitative goals such as “time well spent” and “healthy conversations,” respectively. One of Pinterest users’ top complaints for years has been a lack of control over what its algorithm shows them. 
  • Eventually, Seyal says, he decided that weighing user control against engagement was the wrong question altogether. Instead, he told the engineers tasked with addressing the user-control problem that they didn’t have to worry about the effects on engagement. Their only job was to find a fix that would reduce the number of user complaints about the feed overcorrecting in response to their behavior.
  • The result of that project was “Tune Your Home Feed,” which Pinterest has already made available to some users. By allowing users to tweak how the algorithm responds to each of their actions, Pinterest will offer a level of customization that relatively few will care to employ.
  • Pinterest is now giving users more control, but like any social network built on algorithmically driven recommendations, it ultimately runs on a kind of bias. Unlike its peers, Pinterest welcomes that bias, as long as it’s the right kind.
  • What the company can do to mitigate these problems, he says, is to look carefully at the types of content its system tends to amplify, and adjust the algorithm’s parameters to prioritize some over others. For instance, Pinterest’s algorithm treats “saves” of a given pin as a much stronger positive signal than clicks (a rough sketch of this weighting follows the list). “People don’t really save an inflammatory article about the president, but they do save an outfit they want to buy in the future. So we’re biasing toward those types of interactions, and biasing away from interactions with your friends.”
  • Biasing away from interactions between friends might seem like an odd approach for a social media site. But Pinterest says it’s part of how the company has mitigated problems like harassment and viral propaganda. “Ultimately, we don’t see disinformation campaigns like other platforms do because the algorithm just doesn’t reward it.”
  • But then he pauses and backtracks. The key for Pinterest as it grows, he says, is to remember its own limitations. “I think we want to only be good at what we can be good at. If you want to have every user spend every moment in your product, there’s kind of a lack of humility there. The reality is, tech companies can’t do everything on Earth.”
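The signal weighting described above, with saves counting for much more than clicks and friend-to-friend interactions counting for little, can be pictured as a simple weighted sum over engagement events. The numeric weights and event names in this sketch are invented for illustration; only their relative ordering reflects what the article describes.

```python
# Illustrative engagement scoring: saves outweigh clicks, and social
# ("friend") interactions are deliberately down-weighted. The numeric
# weights are made up; only their ordering mirrors the description above.

SIGNAL_WEIGHTS = {
    "save": 5.0,          # strongest positive signal
    "click": 1.0,
    "close_up": 0.5,
    "friend_repin": 0.2,  # interactions between friends count for little
}


def engagement_score(events: list[dict]) -> float:
    """Sum weighted engagement events for a candidate pin."""
    return sum(SIGNAL_WEIGHTS.get(e.get("type"), 0.0) for e in events)
```

Under a scheme like this, a pin that is widely saved outranks one that merely attracts clicks or friend reshares, which is one mechanism by which inflammatory but rarely saved content gets little amplification.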
Content source: Oremus, W. (2019) How Pinterest Built One of Silicon Valley’s Most Successful Algorithms. Medium. Available from: https://onezero.medium.com/how-pinterest-built-one-of-silicon-valleys-most-successful-algorithms-9101afdfd0dd [Accessed 17 October 2019]