Recommender system failure as a business model: Repellent ad? Pay for premium!
While writing my last post, I spent a lot of time worrying about whether we really understand the forces at play with so much of our information world driven by business models based on clicks. My underlying assumption was that these forces all perpetuate the dependence of today's production and flow of information on advertising. Today, I was reminded of the importance of thinking outside the box, and of never assuming anything: there might be exceptions.
Here's what's happening: YouTube incentivized me to subscribe to YouTube Red by showing me an ad that made the hair on the back of my neck stand up, and then giving me a pop-up window asking "Want to remove ads?" (screenshot above).
Specifically what happened: the pre-roll ad for my video was from Urban Carry Holsters:
and while my video played, Urban Carry Holsters videos were suggested at the upper right-hand side of the page:
After watching this for a while in horrified fascination, YouTube opened a pop-up:
The "try it" button might as well have been labeled: "Get me out of here!"
Pretty brilliant, really. What I am assuming is happening (i.e., what may well be happening) is that the recommender system algorithm is optimized to increase not only the number of ad clicks, but also the number of YouTube Red subscriptions.
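To make the speculation concrete, here is a minimal sketch of what such a two-objective scoring rule could look like. Everything here is an illustrative assumption on my part (the function name, the weights, the idea of a simple weighted sum); it is emphatically not YouTube's actual system.

```python
# Hypothetical sketch: an ad-ranking score that rewards not only predicted
# clicks but also the predicted chance the ad nudges the user toward an
# ad-free subscription. All names and weights are illustrative assumptions.

def score_ad(p_click: float, p_subscribe: float,
             w_click: float = 1.0, w_subscribe: float = 5.0) -> float:
    """Combine two predicted outcomes into a single ranking score.

    p_click: predicted probability that the user clicks the ad.
    p_subscribe: predicted probability that seeing the ad pushes the
        user to subscribe to the ad-free tier.
    """
    return w_click * p_click + w_subscribe * p_subscribe

# A repellent-but-relevant ad can outrank a pleasant one even with far
# fewer expected clicks, if it is predicted to drive subscriptions.
annoying_ad = score_ad(p_click=0.01, p_subscribe=0.05)  # 0.01 + 0.25 = 0.26
pleasant_ad = score_ad(p_click=0.04, p_subscribe=0.00)  # 0.04
print(annoying_ad > pleasant_ad)
```

Under this (assumed) objective, showing me an ad I found repellent is not a bug at all: it is exactly what a subscription-weighted ranker would do.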
Of course, I am a proponent of recommender systems that are not designed to optimize a single target. The target could be ill-chosen, and the world is also just not that simple.
However, I am of two minds about what YouTube just did to me as a user. First, when we talk about gun violence in the US, we talk about deaths and casualties. The discussion of the psychological wear and tear often stays in the shadows. If my heartbeat rises at an ad like this one, I can't even imagine what parents go through who send their kids out the door to school every morning with the constant worry of stray bullets and guns in irresponsible hands. Ads like these contribute to the second-order harm that our lack of a real solution to gun violence inflicts on society. YouTube's recommender should know enough about me to protect me from that psychological wear and tear (which also results in wasted time).
Second, maybe YouTube should not be protecting me, but exposing me to more. (Yes, I am of two minds, and the second is completely opposite.) If recommender systems recommend advertisements that are personalized to be repellent to users, that could be a force that drives subscriptions at scale. If enough other people react the way I did, we will soon be on the road to funding the production and distribution of information on the basis of quality and trust, paid for by subscriptions, rather than by clicks.
There is a chance that this ad is not a complete recommender system misfire. The Urban Carry Holsters ad was not actually an utter mismatch for my tastes: the company explains that the holster was designed on the basis of a "user study", and I have certainly purchased a number of high-quality real leather handbags in my day. It's the "detail" of putting a gun inside that freaked me out.
So maybe it is a recommender system failure, or maybe it is the most important thing that recommenders have done for our online information ecosystem in years. Whichever of these points of view ends up winning, it is something worth thinking about.
My only concern is the manipulation aspect: for YouTube not to destroy my trust, I would appreciate knowing that the ads are optimized to increase YouTube Red subscriptions, and that I am indeed being nudged.
I divide my time between Radboud University Nijmegen and Delft University of Technology in the Netherlands. My research focuses on multimedia retrieval techniques that exploit speech and language and focus on human interpretations of meaning. I am particularly interested in internet video, in networked communities, and crowdsourcing techniques. Lately, I've been noticing how difficult it is to imagine life without search.