This post builds somewhat off my last one, where I introduced the idea of coherence generating mechanisms: tools meant to synthesize and surface some underlying pattern in social phenomena. In the realm of music, think of Pandora. Via a feedback loop, it seeks to hone in on one's preferences: the more R&B songs I like, the more the system learns that I have a preference for them. The brain also fits this definition. It's one big pattern recognizer, trying its best to turn thought-intensive responses into reflexive impulses. With the rise of "big data," these systems are becoming ever more ubiquitous, whether in Google Ads or Netflix's recommendation system.
I think the best way to define something is to define what it is not. Coherence generating mechanisms, as I conceive them, are not simple single-lag systems: they do not assume that the future will exactly mirror the past. The system therefore has to "learn" in some sense. It must create categories of knowledge that it can call upon in the future. Netflix's recommendation system fits this description, since it draws on both viewing habits and ratings to pinpoint some locus of movie preference. Other examples are discussed at length below.
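To make the contrast concrete, here is a minimal sketch in Python. It is an invented toy, not the actual algorithm behind Pandora or Netflix: a single-lag predictor just echoes the most recent observation, while the learner accumulates feedback into category weights it can draw on later.

```python
# Toy illustration (hypothetical, not any real service's algorithm) contrasting
# a single-lag system with one that learns category-level preferences.

from collections import defaultdict


def single_lag_predict(history):
    """Single-lag system: assume the future exactly mirrors the most recent past."""
    return history[-1] if history else None


class PreferenceLearner:
    """Accumulate feedback into category weights, then recommend by weight."""

    def __init__(self):
        self.weights = defaultdict(float)

    def feedback(self, category, liked):
        # A thumbs-up strengthens a category; a thumbs-down weakens it.
        self.weights[category] += 1.0 if liked else -1.0

    def recommend(self):
        # Pick the category with the strongest accumulated preference.
        return max(self.weights, key=self.weights.get) if self.weights else None


listener = PreferenceLearner()
for genre, liked in [("R&B", True), ("metal", False), ("R&B", True)]:
    listener.feedback(genre, liked)

print(single_lag_predict(["metal", "R&B"]))  # "R&B": merely echoes the last item
print(listener.recommend())                  # "R&B": inferred from all feedback
```

The two outputs happen to agree here, but for different reasons: the learner would still recommend R&B after a one-off metal listen, while the single-lag system would flip.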
From the standpoint of marketers, the utility of these mechanisms is clear: strip away all the excess of human activity and drill down to some set of innate dispositions and preferences. While it is clear that people have preferences, be it liking Indian food relative to Sichuanese cuisine, we need to acknowledge the limitations and tradeoffs of these approaches. Think about listening to Pandora while you are running. The perfect song comes on, with a beat that keeps you moving. (FYI, beats above 145 BPM seem to yield little added effect.) So you like that song on Pandora. Three hours later, while sitting in your bedroom contemplating why cars park in driveways but drive on parkways, that same song comes on. In your contemplative state, the song comes off as too much; the beat pulses through your brain, disrupting your ruminative state of mind. The situation could easily be resolved by conditioning preferences on particular states of mind. But Pandora does not know what you are currently in the mood for; it only knows the feedback you have generated for it. While these systems feel intelligent, they are simply classic stimulus-response mechanisms.
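The fix of conditioning on state of mind can be sketched in a few lines. The moods, songs, and ratings below are invented for illustration; the point is only that keying feedback on a (mood, song) pair lets the same feedback history yield different recommendations in different contexts.

```python
# Toy contrast between a context-blind stimulus-response recommender and one
# that conditions on the listener's state of mind. All data here is invented.

from collections import defaultdict


class ContextFreeRecommender:
    """Classic stimulus-response: one weight per song, no notion of context."""

    def __init__(self):
        self.score = defaultdict(float)

    def feedback(self, song, liked):
        self.score[song] += 1 if liked else -1

    def pick(self, _mood=None):
        # The mood argument is ignored; only aggregate feedback matters.
        return max(self.score, key=self.score.get)


class ContextAwareRecommender:
    """Condition each piece of feedback on the listener's state of mind."""

    def __init__(self):
        self.score = defaultdict(float)

    def feedback(self, song, liked, mood):
        self.score[(mood, song)] += 1 if liked else -1

    def pick(self, mood):
        # Only consider songs rated while in the current mood.
        candidates = {s: w for (m, s), w in self.score.items() if m == mood}
        return max(candidates, key=candidates.get)


free, aware = ContextFreeRecommender(), ContextAwareRecommender()
# You liked the fast song while running, the slow one while contemplating.
free.feedback("fast_beat", True); aware.feedback("fast_beat", True, "running")
free.feedback("slow_jam", True);  aware.feedback("slow_jam", True, "contemplative")
free.feedback("fast_beat", True); aware.feedback("fast_beat", True, "running")

print(free.pick("contemplative"))   # "fast_beat": still pushes the workout track
print(aware.pick("contemplative"))  # "slow_jam": matches the state of mind
```

Real systems would need to infer the mood rather than be told it, which is exactly the information Pandora lacks in the scenario above.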
This feeling that something deeper is going on in these systems raises an even more important issue: how does the user feel about the system? What sort of confidence does the user place in it? These systems are built to let individuals outsource the impossible task of introspection. Rotten Tomatoes creates a single, unified metric for a movie's quality. But under the guise of arithmetic precision hide all the underlying biases of the culture industry. Only movies that can garner reviews end up on the site. Furthermore, bandwagoning and other group-level effects, such as gender and racial bias, can skirt by under the false sense of security that numbers provide. Further studies on how users come to appreciate choices derived from these metrics are important, as are studies of how users fold these informational channels into subsequent search behavior.
That the underlying metrics may be flawed does not detract from these systems' potential to open users up to domains they have yet to experience. Google Maps gives me the confidence to walk to places I have never walked before. These systems open up the landscape for exploration. From the perspective of learning, however, do they allow the user to internalize the new environment being explored? Research into the impact of GPS systems on mental visualization suggests not. GPS technology trades off against specialized mental tasks that underlie important parts of cognitive development. Like an unexercised muscle, the less users employ their mental mapping, the harder it is to get it back in gear. (Here is a short article detailing ways around this problem.)
These systems can trade off not just against our internal resources but against resources in the external world as well. Social communication was once the main route for recommendations: word of mouth was the primary tool for resolving informational asymmetries in the market for experiential goods. While studies have found that critics' reviews seem to drive revenue generation for films, the question of what these systems do to social ties lies beyond simple market analysis. Do we feel less need to consult one another because we have Rotten Tomatoes at our fingertips? Do we view a friend's recommendations with more skepticism when they contradict the "popular" metrics we consult online? If your friend constantly recommends CDs that Metacritic deems terrible, does your friendship suffer?
These are just a small sample of the possible negative implications of outsourcing our own internal pattern-seeking, or coherence generating, mechanisms. The ultimate question, from an economic perspective, is whether the benefits outweigh the costs. While I believe that for most of these systems they do, the general lack of attention to their negative ramifications should give you pause when running your own cost-benefit analysis.