Might the age of asymmetric information – for better or worse – be over? Market institutions are rapidly evolving to a situation where very often the buyer and the seller have roughly equal knowledge. Technological developments are giving everyone who wants it access to the very best information when it comes to product quality, worker performance, matches to friends and partners, and the nature of financial transactions, among many other areas.
These developments will have implications for how markets work, how much consumers benefit, and also economic policy and the law. As we will see, there may be some problematic sides to these new arrangements, specifically when it comes to privacy. Still, a large amount of economic regulation seems directed at a set of problems which, in large part, no longer exist.
Used Cars
Let’s start with a simple and classic illustration from the economics literature, namely George Akerlof’s pioneering paper from 1970 about asymmetric information and the market for used cars. In the core version of this model, sellers have better information than buyers: sellers know the value of their car but buyers know only the value of used cars on average. Since buyers don’t know the quality of a seller’s car they will be willing to pay only the average value. But if buyers are only willing to pay for average quality, why would anyone want to sell a car that is of above average quality, a plum? When the plums exit the market, the average value of the used cars for sale falls even further and buyers are willing to pay even less. Following the logic, we end up with a situation where only a few lemons are bought and sold, thus the moniker “the market for lemons.”
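To see the unraveling logic in its starkest form, consider a minimal numerical version of the argument, with illustrative functional forms of our own choosing rather than Akerlof’s exact specification. Suppose quality $q$ is distributed uniformly on $[0,1]$, a seller values a car of quality $q$ at $q$, and a buyer values it at $\tfrac{3}{2}q$, so every car is worth more to a buyer than to its seller and all cars should trade. If buyers offer a price $p$, only owners with $q \le p$ are willing to sell, so the expected quality of cars on the market is

$$E[q \mid q \le p] = \frac{p}{2}, \qquad \text{so buyers are willing to pay only } \frac{3}{2}\cdot\frac{p}{2} = \frac{3p}{4} < p.$$

No positive price can sustain itself: every offer attracts a pool of cars worth less to buyers than the offer itself, so the price falls, more plums exit, and trade collapses toward the bottom of the quality distribution.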
The market for used cars, however, has been one of the earlier examples where market institutions largely (albeit not completely) solved the problem of asymmetric information. Even in 1970, the market for used cars was extensive, and some institutions existed to make information more symmetric. Perhaps the most important of these was the odometer. First used by Alexander the Great to measure distances between cities, modern odometers were standard on almost all cars by 1925. The odometer reading is the single most important piece of information about a specific car in determining its value, which is why used car prices are adjusted for mileage. The law contributes to this solution by making odometer tampering illegal, and successive state and federal laws have increased the penalties and enforcement over time. In 1972, for example, the Federal Odometer Act made tampering a federal felony. As with other crimes, punishment doesn’t eliminate tampering, but it does reduce its quantity, thus making odometer readings more trustworthy and quality information more symmetric. Even more importantly, the Truth in Mileage Act of 1986 requires that sellers disclose and record the odometer reading on the title at every transfer of title. The 1986 Act greatly reduced the benefits of tampering because the odometer could not be rolled back below the reading recorded at a previous sale.
Since the 1986 Act, odometer readings and recordings have become more frequent. Odometer readings, for example, are now made at emission inspections and safety inspections. In many states such readings are made once per year. Services such as CarFax collect and report odometer readings from title transfers and inspections, making the information easily available for a small fee. In the future, states or the private sector could provide this information online for free. In addition to odometer readings, services such as CarFax collect information from service stations and insurance companies about repairs and accidents. The information collected is incomplete but it can be very useful in the important cases, such as when a car has been flooded or totaled.[1]
Perhaps the most telling fact is that the market for used cars is already some three times larger than the market for new cars (as measured by unit sales; see Bureau of Transportation Statistics). In 2012, for example, there were 40.5 million used car sales compared to 14.5 million new car sales (NIADA 2013). On average, used cars sell for about a third the price of new cars, so the two markets are similar in total size, each around $330 billion in sales. There simply aren’t enough lemons to sustain such a high transaction volume. In fact, both high-quality and low-quality used cars are available in fairly liquid, fairly transparent markets.
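A back-of-envelope check, using only the figures just cited (the implied per-car averages are our own arithmetic rather than separately sourced data):

$$\frac{\$330 \text{ billion}}{14.5 \text{ million}} \approx \$22{,}800 \text{ per new car}, \qquad \frac{\$330 \text{ billion}}{40.5 \text{ million}} \approx \$8{,}150 \text{ per used car},$$

and $\$8{,}150 / \$22{,}800 \approx 1/3$: roughly three times the unit volume at roughly one third the price implies two markets of about the same dollar size.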
Information symmetry about the quality of automobiles is very likely to increase. Almost all vehicles today have “event data recorders,” aka “black boxes,” similar to those found in airplanes. Event data recorders capture not only vehicle performance and diagnostic data but also speed, braking, seatbelt use, and other information relevant to safety and car crashes. Some car companies, most notably Tesla, can collect such information remotely or stream it in real time. Tesla, for example, collects information on a vehicle’s odometer, service history, speed, location, battery use, charging time, braking, starting and stopping times, air bag deployment—even radio and horn use.[2] When a vehicle is sold, the data transfers with the vehicle. It is now possible to prove that a used car really was driven by a grandma just on Sundays.[3]
Asymmetric information is no longer a plausible description of the used car market and, as a result, we should not be surprised that these markets are thriving, whether in terms of volume, diversity of product, or their ability to deliver a reliable purchase at a reasonable price.
What about the adverse selection argument as it applies to health insurance? There too it seems that we are seeing a rapidly increasing symmetry of information.
Black boxes for people are not yet standard, but wearable sensors can monitor movement, heart rate and rhythm, blood pressure, blood-oxygen levels, glucose, and other health-related statistics. Such information can be recorded and reported through smartphone apps, watches, and other wearable devices. Even without wearables, life insurance firms already have enough information from actuarial tables to make a good guess about an individual’s health. In the data we see that life insurance rates decline with the purchase of larger policies, which is the opposite of the prediction of the adverse selection model, namely that rates should increase with purchases (Cawley and Philipson 1999).
The actual problems with health insurance markets have less to do with information asymmetry and adverse selection than with too much information, which can make some people’s insurance very expensive at actuarially sound rates. For instance, if you have a cancer of a given kind, this is verifiable to the outside world, and if the treatment costs are $200,000, the cost of an insurance policy will in turn be about $200,000. Buying the policy won’t be cheaper than buying the treatments, and in that sense the market for insurance is not always present. That is a very real public policy problem, but it is not well understood by invoking standard theories of asymmetric information.
The cheap sequencing of the genome may accelerate and intensify these issues. Science still cannot infer very much from a sequenced genome, but to the extent that changes, medical information, and indeed information about the person more generally, will become more public. Even if privacy legislation is in place, the market equilibrium may induce a lot of disclosure, because employers and others (potential dating partners?) will infer negative information from a lack of disclosure (Tabarrok 1994). In the limiting case, we can imagine that each person will carry indicators of genetic information and also of environmental upbringing, allowing other parties to evaluate that individual much more accurately than before.
Moral Hazard
Moral hazard is another kind of asymmetric information problem which very often can be tolerably overcome with cheap, ubiquitous information. By moral hazard we mean the tendency of a better informed party to exploit its information advantage in an undesirable or dishonest way; for instance it is moral hazard when a worker shirks on the job or when a business enterprise takes too much risk at the possible expense of its bondholders.
Let’s again consider the example of cars. A typical moral hazard problem might be that consumers insure their cars but then drive recklessly, knowing that the insurance company will pick up the bill for any resulting damage. Deductibles have helped with this problem, but these days there are better remedies yet. For instance, consumers can use the type of data collected by Tesla and other manufacturers not only when selling their cars, but also to share with insurance companies in return for lower rates. A consumer who is a good and responsible driver can prove it to the outside world, or at least give a good indication that this is likely the case.
Since 1996 all cars manufactured and sold in the United States must have a standardized On-Board Diagnostics (OBD) port. A similar requirement exists in Europe and China and for heavy-duty vehicles (HDOBD). Today, Progressive Insurance offers “Snapshot,” a simple device that connects to the OBD port. The Snapshot device continuously streams vehicle data such as speed, time, VIN, and g-force to Progressive. Consumers who drive fewer miles, drive during daylight or early evening, and brake less sharply typically receive lower rates. Snapshot in part allows Progressive to better select safer drivers, but it also reduces moral hazard. One user, for example, reported:
After my six month use of Snapshot, I’ve concluded that it’s most effective at helping drivers become more aware of their vehicle, driving conditions and slowing gracefully to a stop. It took me roughly a couple of months to retrain my driving behavior.
An interesting feature of Snapshot is that the user doesn’t have to be a current Progressive customer. Anyone can install Snapshot for 30 days and at the end of the 30 days receive an insurance quote. Allstate offers a similar system, as does GMAC insurance for customers using GM’s OnStar system. It is now possible in some states to buy insurance by the mile, a logical extension of these systems. That will reward drivers who put their vehicles less at risk, and of course those same drivers are putting other vehicles and pedestrians less at risk too.
The extensive information available through the OBD port can be used to control not just drivers’ behavior but also that of repair shops. Keep in mind that most people today carry the power of a circa-1990 Cray-2 supercomputer in their pocket. With a Bluetooth connector to the OBD port, a smartphone app can report fault codes, coolant temperature, fuel pressure, and many other performance characteristics in addition to speed, distance, location, and so forth. This information can be used to analyze and diagnose problems exactly as a mechanic would. Once again, the information doesn’t have to be perfect or perfectly understood to alleviate the most serious moral hazard problems. If the mechanic says the car needs a new Johnson rod and the smartphone reports no problems, the consumer knows that it’s at least time to seek a second opinion.
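As a concrete illustration, here is a minimal sketch of the kind of consumer-side script described above. It assumes the open-source python-OBD library and an ELM327-style Bluetooth adapter plugged into the OBD port; the particular readings queried are our illustrative choices, not a description of any specific app.

```python
# Minimal sketch: query an OBD-II port for fault codes and a few live readings.
# Assumes the open-source python-OBD library (pip install obd) and an
# ELM327-style Bluetooth or USB adapter connected to the car's OBD port.
import obd

connection = obd.OBD()  # auto-detects the adapter and protocol

# Diagnostic trouble codes: the same fault codes a mechanic's scan tool reads.
dtc_response = connection.query(obd.commands.GET_DTC)
fault_codes = dtc_response.value or []
if not fault_codes:
    print("No stored fault codes -- time for a second opinion on that Johnson rod.")
else:
    for code, description in fault_codes:
        print(f"Fault {code}: {description}")

# A few live performance readings of the kind mentioned above.
for name, cmd in [("Coolant temperature", obd.commands.COOLANT_TEMP),
                  ("Fuel pressure", obd.commands.FUEL_PRESSURE),
                  ("Speed", obd.commands.SPEED)]:
    response = connection.query(cmd)
    if not response.is_null():
        print(f"{name}: {response.value}")
```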
Reputation Mechanisms
Reputation is one very general way to think about solutions to moral hazard problems. A mechanic with a reputation for honest dealing can earn more business at a higher price. Cheating becomes less valuable when the price is a loss of reputation.
In recent times, information technology has made it easier to observe a seller’s reputation and to contribute to the formation of a seller’s reputation at low cost. Yelp, Angie’s List, and Amazon Reviews all make it easy for past buyers to report their observations on seller quality and for future buyers to observe a seller’s accumulated reputation. And of course it is not just sellers who are rated; workers, too, are evaluated in a variety of ways. Many employers, for instance, check a worker’s credit rating or online history before making a hire. We may be creating some privacy problems with these techniques, but the old-school issues of asymmetric information are drying up rapidly.
Early reputation mechanisms were one-way: buyers would generate reputations for sellers. Now the ratings often go both ways. Many of the exchanges in the sharing economy, including Uber (transportation), Airbnb (accommodations), and Feastly (cooks), use two-way reputational systems. That is, the customer rates the Uber driver, but in turn the Uber driver rates the customer. Dual reputation systems can support a remarkable amount of exchange even in the absence of law or regulation. The Silk Road marketplace for illegal goods, for example, supported millions of dollars of exchange through a dual reputation system. On the Silk Road it was possible to pay for goods in advance of delivery or to buy goods which were delivered before payment was made. In each case, honesty was maintained through reputation even without legal recourse for contract breach.[4] Thus, in these cases reputation maintained quality even where theories of information asymmetry would have predicted that exchange would be too problematic to occur at all.
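The core mechanism is simple enough to sketch in a few lines of code. This is a toy illustration under obvious simplifying assumptions (every rating is a one-to-five star score and reputation is a plain average), not a description of how Uber or the Silk Road actually computed scores.

```python
from collections import defaultdict

class TwoWayReputation:
    """Toy two-sided reputation ledger: buyers rate sellers and sellers rate buyers."""

    def __init__(self):
        self.ratings = defaultdict(list)  # user id -> list of 1-5 star scores received

    def rate(self, rater, ratee, stars):
        """Record a rating given by one side of a transaction to the other."""
        if not 1 <= stars <= 5:
            raise ValueError("stars must be between 1 and 5")
        self.ratings[ratee].append(stars)

    def reputation(self, user):
        """Average score received so far, or None if the user has no history yet."""
        scores = self.ratings[user]
        return sum(scores) / len(scores) if scores else None

# After a ride, each side rates the other.
ledger = TwoWayReputation()
ledger.rate("passenger_42", "driver_7", 5)   # passenger rates driver
ledger.rate("driver_7", "passenger_42", 4)   # driver rates passenger
print(ledger.reputation("driver_7"), ledger.reputation("passenger_42"))
```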
From these examples, however, we can see one major problem of the new information economy, namely a lack of privacy. For instance, in the case of cars, it is easy to track where a driver goes, how long he or she stays there, and perhaps what his or her mood was along the way. Our cellphones track our personal movements, as does Facebook. More generally, online information often gives a fairly complete portrait of who we are, who our friends are, what we do, and many of our views and personal inclinations, including our previous infractions and perhaps also our personality defects. By no means is this all bad, because in large part we are able to project an image out into the world and match better with friends, partners, and jobs. Still, by no means does everyone feel privacy limits are set in the right place, and ours may become a world where second chances are harder to come by.
If you have a criminal record, or have behaved badly in the past, it is hard to create a new social identity when so much information is available online. Conceivably this could mean too few chances to recover from mistakes. In theory, rational actors can always discount low-quality information, but most people aren’t Bayesian. It’s telling that the legal system often prevents juries from hearing relevant information about both criminal defendants and victims. In a Bayesian world this makes no sense, but if people overreact to certain types of information, less information may result in better decisions.
Private and social returns to information may also differ. Each employer may find it cheap to discriminate against potential employees with an arrest record, for example, but when all employers discriminate in this way, a large class of people may find it difficult to find employment, and that in turn may increase recidivism.
Even for people who have done nothing wrong there will be greater risk aversion ex ante. We will all work too hard to avoid the perception of having done something wrong. If there are no second chances then second thoughts become ever more important. Perhaps we shouldn’t send that tweet, write that blog post, or contribute to Cato Unbound! Even if there are no repercussions today, we don’t know how the future will judge our past.
Even when a privacy “opt out” is allowed, a problem remains. Those individuals who wish to keep more privacy often have to forsake the benefits of modern technology and return to the earlier, more costly world of asymmetric information. We can all think the trade-offs are worth it, “all things considered,” and still see problems with how parts of those trade-offs are working out.
It’s possible that advances in cryptology may create reputation mechanisms that are compatible with the demand for privacy. One of the most remarkable discoveries in computer science and cryptological research, for example, has been that reputation is compatible with anonymity and can be leveraged across markets with different pseudonyms (Camenisch & Lysyanskaya 2001, Androulaki et al. 2008). You can buy and sell on eBay, for instance, without having a publicly known name, and yet still reap the benefits of the modern reputation economy. One frontier set of issues is what the demand for such systems might be like and how far such mechanisms might be extended. At the moment this remains an open question.
Principal-Agent Problems
These problems have a common structure, namely that principals hire agents to produce output. Output is a function of agent actions and also noise. Agents know their actions but principals do not, and so principals must infer agent actions from output and the structure of noise. Incentive design theory provides lessons on how principals can optimally infer agent actions and how rewards should be structured to best navigate the tradeoffs between putting too much or too little risk on the agents (Prendergast 1999).
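The trade-off can be stated compactly in the standard linear-contract version of this model; the functional forms below are the usual textbook simplification rather than anything specific to Prendergast (1999). Output is

$$q = e + \varepsilon, \qquad \varepsilon \sim N(0,\sigma^{2}),$$

where $e$ is the agent’s unobserved effort, exerted at cost $ce^{2}/2$, and the principal pays a linear wage $w = \alpha + \beta q$ to an agent with constant absolute risk aversion $r$. The agent chooses $e = \beta/c$, and the piece rate that best balances incentives against risk is

$$\beta^{*} = \frac{1}{1 + r c \sigma^{2}}.$$

The noisier the measure of effort (the larger $\sigma^{2}$), the weaker the incentives the principal can safely offer; as cheap monitoring drives $\sigma^{2}$ toward zero, $\beta^{*}$ approaches one and pay can track actions almost directly.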
One simple solution to principal-agent problems is to reduce the information asymmetry so that principals better observe agent actions. Consider the parcel delivery firm UPS. UPS monitors the mechanical performance of all of its trucks as well as their location, speed, and braking behavior. UPS also knows every time a truck starts or stops, when a door is opened and closed, and whether a driver is wearing his or her seatbelt, among other pieces of information. UPS uses this information to optimize delivery routes. The number of stops and starts required to deliver packages is minimized. Routes are matched to the vehicles that get better mileage at the speeds the routes require, and so forth. Driver actions on the same routes are also compared to ensure that all drivers are behaving with maximum efficiency.[5] UPS estimates that a savings of one minute per day per driver increases profit by $14.5 million over the course of a year.[6] As a result of its technology, UPS, the principal, knows more about the actions of its agents than the agents themselves do. That is a reversal of the usual informational assumption of principal-agent theory, and that state of affairs is characteristic of the new information revolution.
Even simple information can be used to overcome many principal-agent problems, even in lower-technology settings. A random sample of teachers in India found that about a quarter are absent on any given day (Kremer et al. 2005). In a field experiment, Duflo, Hanna, and Ryan (2012) showed that requiring teachers to take a picture at the start and end of each day showing themselves and their students reduced absence rates by over 50%, with resulting significant improvements in child learning and achievement. Systems like this are so cheap they are being moved from field experiment to practice. In 2014, India introduced a system that logs the entry and exit times of government workers. The system uses cheap fingerprint scanners to avoid cheating, and all of the information is publicly available in real time at http://attendance.gov.in/. Currently, over 80,000 government workers in New Delhi are logged and another 35,000 are logged by a similar system in the state of Jharkhand (http://attendance.jharkhand.gov.in/). The system only logs entry and exit times, but failure to show up for work is likely the most important principal-agent problem in this setting.
Government workers in the United States are also coming under greater monitoring, most notably police officers. Inconsistencies between police reports and later discovered cell phone video recordings indicate that often the police behave worse than they let on. Many localities are now debating whether to require police to wear body cameras. A randomized controlled trial found that after body cameras were put into place, reports of use of force fell by more than half as did the number of complaints against police officers (Ariel, Farrar, Sutherland 2014).
Many “public choice” problems are really problems of asymmetric information. In William Niskanen’s (1974) model of bureaucracy, government workers usually benefit from larger bureaus, and they are able to expand their bureaus to inefficient size because they are the primary providers of information to politicians. Some bureaus, such as the NSA and the CIA, may still be able to use secrecy to benefit from information asymmetry. For instance they can claim to politicians that they need more resources to deter or prevent threats, and it is hard for the politicians to have well-informed responses on the other side of the argument. Timely, rich information about most other bureaucracies, however, is easily available to politicians and increasingly to the public as well. As information becomes more symmetric, Niskanen’s (1974) model becomes less applicable, and this may help check the growth of unneeded bureaucracy.
Cheap sensors are greatly extending how much information can be economically gathered and analyzed. It’s not uncommon for office workers to have every keystroke logged. Who, when calling customer service, has not been told “this call may be monitored for quality control purposes”? Service-call workers have their location tracked through cell phones. Even information that once was thought to be purely subjective can now be collected and analyzed, often with the aid of smart software or artificial intelligence. One firm, for example, uses badges equipped with microphones, accelerometers, and location sensors to measure tone of voice, posture, and body language, as well as who spoke to whom and for how long (Lohr 2014). The purpose is not only to monitor workers but to deduce when, where, and why workers are the most productive. We are again seeing trade-offs which bring greater productivity and limit asymmetric information, albeit at the expense of some privacy.
As information becomes more prevalent and symmetric, earlier solutions to asymmetric information problems will become less necessary. When employers cannot easily observe workers, for example, they may pay workers unusually high wages, generating a rent. Workers will then work at high levels despite infrequent employer observation, in order to maintain their future rents (Shapiro and Stiglitz 1984). But those higher wages involve a cost: fewer workers are hired, and the hires that are made are often directed to people already known to the firm. Better monitoring of workers will mean that employers will hire more people, and they may be more willing to take chances on risky outsiders rather than on applicants who come with an impeccable pedigree. If the outsider does not work out and produce at an acceptable level, it is easy enough to figure this out and fire them later on.
Escrow Systems, Artificial Agents and Malleable Memory
When two parties do not trust one another, they may be reluctant to trade if information about the quality of the merchandise or payment cannot be quickly observed. An escrow system increases trade by introducing a trusted third party. The traders convey the merchandise and payment to a third party who makes the transfer if and only if the merchandise and resources satisfy the conditions of the traders.
Trusted third parties may themselves be difficult to find or expensive. It’s possible, however, to create trusted third parties in software. Using a blockchain-based system of decentralized money, for example, contracts can be created which execute only when certain conditions are met, say, that the S&P 500 rises by 4% or more on a day when the Nikkei falls by 5% or more. Moreover, such trustworthy contracts can be created with neither party knowing the identity of its counterparty.
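Here is a sketch of that contract logic in ordinary code rather than an actual blockchain scripting language; the market-data inputs, threshold values, and payout routines are all hypothetical placeholders of our own.

```python
# Toy sketch of a conditional escrow: funds are released only if a publicly
# observable condition holds. A real smart contract would run on-chain and read
# prices from an oracle; here the price feed and payout routines are stubs.

def condition_met(sp500_daily_return, nikkei_daily_return):
    """The example condition from the text: S&P 500 up at least 4% on a day the Nikkei falls at least 5%."""
    return sp500_daily_return >= 0.04 and nikkei_daily_return <= -0.05

def settle_escrow(escrowed_amount, sp500_daily_return, nikkei_daily_return,
                  pay_party_a, refund_party_b):
    """Release escrowed funds to party A if the condition holds, otherwise refund party B."""
    if condition_met(sp500_daily_return, nikkei_daily_return):
        pay_party_a(escrowed_amount)
    else:
        refund_party_b(escrowed_amount)

# Example: with these made-up daily returns the condition holds and A is paid.
settle_escrow(
    escrowed_amount=1000,
    sp500_daily_return=0.043,
    nikkei_daily_return=-0.052,
    pay_party_a=lambda amt: print(f"pay A {amt}"),
    refund_party_b=lambda amt: print(f"refund B {amt}"),
)
```

Neither party needs to trust the other, or even know who the other is; they only need to trust that the code will run as written on the publicly observable inputs.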
Traders are sometimes reluctant to trade because even a bid or ask can reveal information the trader doesn’t want revealed. Escrow systems using software can solve many of these problems. A simple but telling example is the expression of interest in dating another person. Revelation of interest can be uncomfortable, especially as it may not be reciprocated. Phone apps like Tinder allow users to express interest in other users, but they cannot contact one another unless both express an interest. In this case, the double coincidence of wants is not a problem but a feature; a toy version of the logic is sketched below. The usefulness of the Tinder app in overcoming the information asymmetry problem is indicated by its user base of 50 million, who together make millions of connections a year (Bilton 2014).
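The double opt-in is itself a small escrow of information: each expression of interest is held back until it is reciprocated. The following is our own toy illustration, not Tinder’s actual implementation.

```python
class MutualMatch:
    """Reveal interest only when it is reciprocated (toy double opt-in)."""

    def __init__(self):
        self.likes = set()  # set of (who_liked, whom) pairs, kept private

    def like(self, user, other):
        """Record interest; report a match only if the other side already said yes."""
        self.likes.add((user, other))
        if (other, user) in self.likes:
            return f"Match: {user} and {other} can now contact each other."
        return None  # the unreciprocated interest stays hidden

app = MutualMatch()
print(app.like("alice", "bob"))   # None -- bob never learns alice expressed interest
print(app.like("bob", "alice"))   # both now learn of the mutual interest
```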
The evolution of artificial intelligence suggests yet further ways of overcoming information asymmetries. Arrow (1963) argues that one problem with the market for information is that it is difficult for the buyer to know the value of the information without knowing the information itself. Once the information is known, however, the potential buyer need no longer purchase it.[7]
A third party escrow system can solve this problem by making the exchange if and only if the third party judges that the buyer would value the information at more than its price if only the buyer knew the information. Two difficulties with the third party escrow system are, first, that the third party must be trusted by both parties not to use the information that it has acquired for its own purposes, and second, that the third party must be trusted to accurately represent the buyer when judging whether the buyer would value the information at its price. Both of these problems can be solved by making the third party an artificial intelligence.
An artificial intelligence can be trained to evaluate information on behalf of the buyer (or seller). Some such systems have already been noted, such as a blockchain-based contract that makes a financial exchange only when certain conditions have been met. In that case, the artificial intelligence need only consult publicly observable information. Much more is possible, however, as A.I. systems grow in power. In a potential buyout, for example, a buyer’s A.I. system might be given access to a corporation’s internal financial reports. It would then report back to the buyer whether the corporation was a good buy at the proposed price, and if necessary the memory of the A.I. could be wiped. In this way, the A.I. would report back only a yes or no, easing the transaction but keeping the core information confidential.
Conclusion
A lot of economic theories about asymmetric information, while logically correct, have been rendered empirically obsolete. We are not suggesting that this new world is perfect in every way, and indeed privacy is one of the major concerns. Still, the passing of many information asymmetries will lead to easier trade, higher productivity, and better matches of people to jobs and to each other.
These changes also cast new light on the costs of a political system that produces many new regulations but repeals very few old ones. The American regulatory apparatus is increasingly out of date. It is geared to problems that peaked in the previous generation or even earlier. We should revisit the topic of regulatory reform, with an eye toward making more regulations temporary, or having automatic sunset provisions, unless they are consciously and intentionally renewed for reasons of their continuing usefulness.
[1] In addition, state lemon laws generally require that a consumer is entitled to a refund or replacement of a “lemon,” defined as a car that has not been repaired within a reasonable number of attempts. Cars that have been termed lemons under the law are recorded.
[2] Tesla website, “Privacy, Patents, Legal, and Information Security,” http://www.teslamotors.com/legal, accessed November 10, 2014.
[3] The information that Tesla and other car companies collect is currently easily available to Tesla and authorized mechanics. A proposed bill in California, the Consumer Car Information and Choice Act (http://leginfo.legislature.ca.gov/faces/billNavClient.xhtml?bill_id=201320140SB994), would give consumers the right to easily access all data collected by car manufacturers in a form that could also be transmitted to third parties.
[4] The original Silk Road was shut down by the FBI but quickly resurfaced as Silk Road II which also has been subsequently shut down. Silk Road 3.0 will probably be operational soon.
[6] Goldstein, J. 2014. To Increase Productivity, UPS Monitors Drivers’ Every Move. NPR.org. Retrieved November 11, 2014, from http://www.npr.org/blogs/money/2014/04/17/303770907/to-increase-product…
[7] Patents are one attempt to overcome this problem. If a piece of information is patented, not everyone who knows the information may use it as they might like. Non-disclosure agreements serve a similar purpose for non-patented information. Not all information can be patented, however, and non-disclosure agreements leak and are often difficult to enforce.
References
Androulaki, E., Choi, S. G., Bellovin, S. M., & Malkin, T. 2008. Reputation Systems for Anonymous Networks. In N. Borisov & I. Goldberg (Eds.), Privacy Enhancing Technologies, Lecture Notes in Computer Science (pp. 202–218). Springer Berlin Heidelberg. Retrieved from http://link.springer.com/chapter/10.1007/978-3-540-70630-4_13
Ariel, B., Farrar, W. A., & Sutherland, A. 2014. The Effect of Police Body-Worn Cameras on Use of Force and Citizens’ Complaints Against the Police: A Randomized Controlled Trial. Journal of Quantitative Criminology, 1–27.
Bilton, N. 2014, October 29. Tinder, the Fast-Growing Dating App, Taps an Age-Old Truth. The New York Times. Retrieved January 16, 2015, from http://www.nytimes.com/2014/10/30/fashion/tinder-the-fast-growing-datin…
Camenisch, J., & Lysyanskaya, A. 2001. An Efficient System for Non-transferable Anonymous Credentials with Optional Anonymity Revocation. In B. Pfitzmann (Ed.), Advances in Cryptology — EUROCRYPT 2001, Lecture Notes in Computer Science (pp. 93–118). Springer Berlin Heidelberg. Retrieved from http://link.springer.com/chapter/10.1007/3-540-44987-6_7
Cawley, J., & Philipson, T. 1999. An Empirical Examination of Information Barriers to Trade in Insurance. The American Economic Review, 89(4): 827–846. Retrieved from http://www.jstor.org/stable/117161
Duflo, E., Hanna, R., & Ryan, S. P. 2012. Incentives Work: Getting Teachers to Come to School. American Economic Review, 102(4): 1241–78.
Goldstein, J. 2014. To Increase Productivity, UPS Monitors Drivers’ Every Move. NPR.org. Retrieved November 11, 2014, from http://www.npr.org/blogs/money/2014/04/17/303770907/to-increase-product…
Kremer, M., Chaudhury, N., Rogers, F. H., Muralidharan, K., & Hammer, J. 2005. Teacher Absence in India: A Snapshot. Journal of the European Economic Association, 3(2-3): 658–667.
Lifsher, M. 2014, March 18. Bill would give auto owners more control over vehicle data. Los Angeles Times. Retrieved from http://www.latimes.com/business/la-fi-connected-cars-bill-20140319-stor…
Lohr, S. 2014, June 21. Workplace Surveillance Sees Good and Bad. The New York Times. Retrieved from http://www.nytimes.com/2014/06/22/technology/workplace-surveillance-see…
Manna, J. 2014, June 21. What Every Driver Needs to Know about Progressive Snapshot. Joe Manna. Retrieved January 10, 2015, from https://blog.joemanna.com/progressive-snapshot-review/
Niskanen, W. A. 1974. Bureaucracy and Representative Government. Transaction Publishers.
Prendergast, C. 1999. The Provision of Incentives in Firms. Journal of Economic Literature, 37(1): 7–63. Retrieved from http://www.jstor.org/stable/2564725
Shapiro, C., & Stiglitz, J. E. 1984. Equilibrium Unemployment as a Worker Discipline Device. The American Economic Review, 74(3): 433–444. Retrieved from http://www.jstor.org/stable/1804018
Tabarrok, A. 1994. Genetic Testing: An Economic and Contractarian Analysis. Journal of Health Economics 13:75-91. http://www.sciencedirect.com/science/article/pii/0167629694900051