From Seashells to Electrons

 

 

“The use of coin, which has been handed down to us from remote antiquity, has powerfully aided the progress of commercial organization, as the art of making glass helped many discoveries in astronomy and physics; but commercial organization is not essentially bound to the use of the monetary metals. All means are good which tend to facilitate exchange, to fix value in exchange; and there is reason to believe in the further development of this organization the monetary metals will play a part of gradually diminishing importance.”

—Antoine-Augustin Cournot, Researches into the Mathematical Principles of the Theory of Wealth (1838)

 

The manner in which people pay each other has seen few revolutions as fundamental as the spread of electronic money. E-money has altered how we pay for things: most of us now use plastic cards and their digits for many of our daily purchases. And it has altered when we pay for things: about 70 percent of us defer some payments until our monthly credit card bills arrive, and around 46 percent of American households finance some purchases over time on their cards. History teaches us how significant both of these transformations have been.

I. The Evolution of Money

The most fundamental development in the history of money is the birth of money itself. Seemingly from nowhere, the first payment system, one based on coins, emerged and transformed trade and, with it, the well-being of our ancestors.

Cash, Check, or an Ox?

For many millennia, people bartered—twenty of my arrows for two of your bushels of wheat; six of your vases for two of my loincloths. As civilization developed, people discovered the convenience of a unit of account, launching the first major transformation in the development of money as a means of exchange and a store of value. Over the centuries since, and still today, money has continued to be transformed, always with a focus on improving its convenience and usability as a unit of account, a means of exchange, and a store of value.

The Iliad and The Odyssey, which appear to reflect customs around 850-800 BCE, refer to exchanges measured in oxen. A large tripod (back then, this three-legged object served as an altar or a sacrificial basin) was worth twelve oxen and a skillful female slave four. That didn’t mean you had to pay twelve oxen for the large tripod. You could pay three skillful women or perhaps gold or silver worth twelve oxen.

An ox wasn’t money because, although a fine unit of account, it wasn’t a great means of exchange or store of value. Around two centuries later, the Lydians, building on a millennia-old tradition of using precious metal for exchanges, stamped out coins of fixed weights. Convenient to exchange, easily stored, and readily counted, these were the first money in the Western world. It took industrialized societies another 2,500 years to accept that money was valuable even if not made of precious metal as long as people accepted it for exchange and used the same unit of account. Trust replaced metal.

 

 

Today, people exchange units of account—such as the dollar or the euro—which they convert into goods and services. Cash, checks, electronic transfers of funds among accounts, and payment card systems are ways of exchanging these units. Each is a medium for facilitating the exchange of value among buyers and sellers. And each is a variety of money.

Money, unlike air, is not a free good for society. It requires resources: producing physical money, processing checks, or keeping the books for electronic money. One way or another, people have always paid for the money they use. That is obvious for private payment systems such as checks and cards. It is less obvious, but nonetheless true, for government payment systems based on coin or paper. For instance, mints are usually impressive buildings with large staffs. And the “inflation tax” is one of the oldest levies. This tax is imposed when the government prints more money to buy things, the price level increases as a consequence, and the real value of money held by consumers and businesses therefore declines. It is the price of having the government run this payment system.
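
A rough way to see the size of this levy (a stylized sketch with hypothetical numbers, not figures from the text): the inflation tax collected from money holders is approximately the inflation rate applied to the real money balances the public holds,

\[
\text{inflation tax} \;\approx\; \pi \times \frac{M}{P},
\]

where \(\pi\) is the inflation rate and \(M/P\) is the real value of money held by consumers and businesses. If printing money pushes prices up 5 percent, for example, a household holding $1,000 in cash loses roughly $50 of purchasing power; that $50 is, in effect, what it pays for using the government-run payment system.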

 

“Payment systems are two-sided markets. For a basic introduction, read our book Catalyst Code, or, for a more technical treatment with some applications to antitrust, read Markets with Two-Sided Platforms.”

 

History shows what it takes to have a successful medium of exchange. A large number of buyers and sellers must agree to use the same medium. There must be a critical mass to ignite a payment system, as we will see in more detail in chapter 6: no payer wants a medium that few payees take and no payee can insist on a medium that few payers have. To become a standard, the medium must be an efficient form of exchange—seashells worked better than oxen. Reliability is key as well—there may be episodes where, as Sir Thomas Gresham described, bad money drives good money out of circulation and into people’s rainy-day stashes, but in the long run only reliable media of exchange survive. (Gresham, an adviser to Elizabeth I, was an early observer of this phenomenon, now known as Gresham’s Law.)

Lydia, Florence, Boston, and Manhattan

Four major innovations have marked the history of money: (1) the birth of money in the form of metallic coins, (2) the creation of checks that promised payment in money, (3) the creation of paper money, and (4) the emergence of electronic money through payment cards and other methods.

According to Herodotus, perhaps not the most reliable of historians, the Lydians “are the first people on record who coined gold and silver into money, and traded in retail.” For at least a millennium before the Lydians struck the proverbial first coin in the latter half of the seventh century BCE, people had exchanged precious metal in the form of irregular lumps. But the Lydian coins made it easier to quote a price that everyone could relate to the costs of other goods—they provided a unit of account. They also facilitated transactions among buyers and sellers—they provided a means of exchange. And buyers and sellers alike could hold coins as part of their assets—coins provided a liquid store of value. This first money spread east into the Persian Empire and west to Italy. (There is some dispute over whether the Lydian coin was indeed the first used for exchange. Other civilizations from India to China can lay claim to similar innovations; here we focus on the West.)

A Lydian coin, known as a trite, with a lion on the front.

Governments became the main sponsors of currencies. While this served a noble purpose by making money a standard, it also generated profit: the difference between the value of a coin in exchange and the value of its metallic content is known as seigniorage. Many Greek cities debased the alloys in their coins to earn more profit from this inflation tax. Athens, though, was known for the integrity of its money. For four centuries after Solon took office in 594 BCE, the silver content of the drachma remained roughly constant. Athenian coins became widely used in trade not only in Greece but also in a good portion of Asia and Europe, and this remained the case even after Greece was absorbed into the Roman Empire.

It took about two thousand years to have another event of such lasting importance as to merit being called a revolution: the development of checks. Among westerners, the Italians appear to get the credit for this one. The original invention was the “bill of exchange,” developed in northern Italy around the twelfth century. The bill was a piece of paper issued by a bank that promised the recipient money—at that time, metal coins—from the bank. With this bill, a trader avoided having to carry heavy coins on a long-distance shopping expedition and risk their theft. Bills of exchange proved handy during the Crusades. They were the precursors of paper money—they were valuable only because the possessors of the bills trusted that they would get their funds.

Bills of exchange were also the basis for an innovation in the provision of bank credit. For example, say that a Florentine trader wanted to buy wool in London, to then sell to a weaver in Venice. The cash-strapped trader would go to a bank, which would give the trader a loan to buy the wool. But rather than giving the trader coins from its vault, the bank would provide a bill of exchange. The bill would then go to the trader’s London supplier, which in turn collected from the bank’s branch or agent in London (possibly a certain number of days after the purchase, depending on the type of bill). By this time, the trader had returned from Venice and repaid the loan. The bank got a fee for the loan even though none of its coins went anywhere (although periodically the bank would have to settle up its transactions with its London branch). Bills of exchange and their descendant, the check, enabled banks to loan out more money than they had in their vaults. Timing was everything here. It still is, as we were reminded in 2008. If too many people cash their checks and withdraw their money at the same time, a bank quickly collapses.

 

 

Credit for another crucial revolution, paper money, goes to the Massachusetts Bay Colony. In 1690, the colonial government needed to figure out how to pay for an unsuccessful attempt to capture the French fortress in Quebec City. The colony issued pieces of paper that promised redemption, later, in gold or silver coins. These pieces of paper circulated side by side with gold and silver coins, and were treated as if they were worth the gold or silver they promised. Issuing paper money and postponing its redemption became a regular way for the colonial government to pay for things. (Prices of goods, including gold and silver, rose so that both the purchasing power and the redemption rates of the notes collapsed—an early example of the inflation tax.) Other colonies caught on. So did the rest of the British Empire and later the world. As with Athenian coins, successful paper currencies have been based on the trust that the issuer will not debase their value. The world has less confidence in the US dollar than it once did. Trust, of course, is relative.

Far lighter and easier to transport than gold or silver coins, paper money provided a convenient means of exchange and, although the ultimate unit was still gold or silver, a good unit of account. Gradually, over the next several centuries, paper money, usually but not always backed by precious metal, became a popular medium of exchange. Most industrialized economies relied on paper money by the beginning of the twentieth century.

At that time, the central banks of most developed nations guaranteed that people could convert their paper money into gold or silver. Of course, as long as people believed they could exchange their paper money for goods and services, and that their paper money would hold its value in terms of goods and services, they had little incentive to trade it in for metal. And by the last quarter of the twentieth century, all industrialized countries had effectively abandoned the pretense that their currency was tied to gold, silver, or anything else of intrinsic value. The successful introduction of the euro on January 1, 1999 reinforced the point that “faith” is enough. You cannot convert the euro to gold or silver, and you have to trust a still loosely connected group of often bickering countries (as the 2010 efforts to deal with the collapsing Greek fiscal system demonstrated) to maintain its value. Nevertheless, having a standard currency in Europe has reduced transactions costs, increased the transparency of prices across markets, and helped integrate European markets.

Between the time when paper money spread and the time when its knot with precious metals was definitively cut, there was another key innovation, the consequences of which are the focus of this book. In 1950, Frank McNamara’s charge card for restaurants in Manhattan was originally based on paper: the card was cardboard, and its use resulted in a paper trail from the merchant to Diners Club to the cardholder. Not for long, though; the computer revolution transformed cards into an electronic medium of exchange.

Today, money consists of a unit of account that is exchanged through several different payment systems. None of these systems is free. Each requires that society—through the government or private parties—spend resources to operate and maintain it. And each provides benefits to society by reducing the costs of exchanging value among buyers and sellers.

 

 

There isn’t enough evidence to say that one of these systems is better than another, or that society should promote one over another. Some observers leap to the conclusion that one system is superior without considering all of the costs or the benefits. Cash is sometimes said to be the best because it does not require card or check fees. Yet many of the costs of cash are hidden in the government’s budget. And some of the advantages of checks or cards over cash are easily ignored. Buyers don’t want to carry around large wads of cash, for instance, and sellers worry about security. Likewise, electronic exchange—via cards or other transfers—is sometimes said to be the cheapest. It may be in the long run. But past investments in cash and checks, whether wise or foolish in retrospect, may make those seemingly antiquated media appear efficient at the moment.

 

“Robert Hahn, Anne Layne-Farrar, and Daniel Garcia-Swartz explain the costs and benefits of payment systems in their paper on moving to a cashless society.”

 

From the standpoint of individual buyers and sellers, each of these media has costs and benefits. The fact that most buyers and sellers use all of the available media suggests that different ones have advantages in particular circumstances. Coins don’t work well with eBay, nor checks with vending machines. More important, the success of any particular medium depends on its value to both buyers and sellers. Buyers can’t pay cash if sellers won’t take it. Sellers can’t insist on checks if people don’t want to carry them. Beyonce’s American Express Black Card may be exclusive, but it won’t get her a ham sandwich at Joe’s Diner. Exchange media are multisided platforms: to be viable, they must get all sides on board. For that to happen, a platform has to make sure that each side gets benefits that exceed the costs, with enough left over to cover the cost of running the system.

Simple, efficient cash remains a major payment platform almost three thousand years after the Lydians started stamping coins. It accounted for 21 percent of the value of consumer expenditures that involved payment (thus excluding items such as the “rent” that you pay yourself for living in your own home) in the United States in 2008. That is almost the same as what it was in 2000. Another survey found that a typical American consumer paid with cash for about 23 percent of the transactions they made during the month (including online and bill payments). Cash is used much more outside the highly developed nations. It is hard to beat for its convenience in countries, like much of Africa, that aren’t extensively wired. But even in those that are, buyers and sellers find cash attractive for many transactions. Entrepreneurs who think that consumers and merchants will flock to shiny new payment methods like beaming from your phone or swiping with your finger should consider why so many people still prefer the Methuselah of tender types.

 

 

Checks have fallen out of favor in many countries. Weaning Americans from paper checks has been particularly hard, though. The breakthrough came in the last decade. People in the United States paid by check for 26 percent of transactions for personal consumption (excluding housing) in 2000 but only 14 percent in 2008. Consumers use checks for larger payments than they make with cash and cards, so in the end they paid for about 43 percent of the dollar value of personal consumption (excluding housing) with checks in 2000 but only 21 percent eight years later. That understates the decline in the use of paper checks. Some merchants have installed readers that scan the paper check and have the consumer sign electronically. The check’s loss has been the card’s gain.

Sixty years after its introduction, plastic accounts for 53 percent of monthly transactions and 46 percent of the value of consumer expenditures in the United States. (These shares include some cards, such as store cards, that do not fit in the general-purpose payment card category that is the focus of this book.) Like previous important monetary innovations, the electronic medium of exchange has spread slowly but surely. Even in the United States—the country that gave birth to plastic cards and one that has had one of the most conducive environments for the growth of electronic money given the nation’s relatively inexpensive telecommunications system and heavy use of computers—paper methods of payment still accounted for 48 percent of transactions and 43 percent of consumer expenditures as of 2008. But as we go into the second decade of the twenty-first century, cards in the United States are on the cusp of becoming the leading method of payment. They face competition, though, from new methods of e-payment that use neither cards nor the networks behind them.

In the Land of the Dollar

 

To tell the story of cash and checks in the United States—and ultimately the story of payment cards—we must first explain why, until quite recently, the country has been populated by a large number of small banks.

America, Land of the Small Local Bank

The evolution of the banking system in the United States guided the evolution of the electronic payment card industry. Colonial America was primarily agricultural. Particularly away from the Atlantic coast, long-distance communications and trade were difficult and risky. Understandably, small business and property owners required the services of conveniently located banks. The local banks that developed tended to be small, just like the towns they served. Larger companies involved in European import and export trade often dealt with the big merchant banks in Great Britain and continental Europe; they had little need for such institutions in the British colonies. (We use the term “banks” in this book to refer to commercial banks, savings banks, savings and loan associations, and credit unions.)

Federal and state legislators prevented banks from operating in multiple states—and sometimes limited branching within a single state—for much of the first two centuries of the United States. There were brief spasms of exceptions toward the end of this period. In the early 1950s, banks evaded interstate banking restrictions for a while by forming holding companies that controlled several banks, each operating within a single state. Citicorp was thereby able to establish a national network of banks. Congress prohibited such multistate holding companies in 1956 unless states specifically allowed them to operate, although it allowed the few nationwide banks that had been established previously to continue to operate.

In 1966, a watershed year for the payment card industry, there were 13,821 commercial banks in the United States—70 per million people—and the 10 largest held 24 percent of all assets held by commercial banks. (Data going back to 1966 on savings institutions and credit unions are not available; however, if the ratio of total banks to commercial banks was as high as in 1990, there would have been about 160 banks per million people.) As we will see in the next chapter, this fragmented local banking industry of the 1960s shaped the evolution of the payment card industry in the United States.

 

 

Barriers to geographic expansion started falling in 1978 when Maine began permitting out-of-state bank holding companies to operate within its borders. Most states followed Maine’s lead over the next fifteen years, and by 1992, only Hawaii prevented out-of-state bank holding companies from operating. Consolidations have taken place as banks have merged with one another. The number of commercial banks in the United States fell by 43 percent between 1990 and 2008, from 12.3 thousand to 7.1 thousand; the number of total banks, including savings institutions and credit unions, fell by 41 percent, from 28.0 thousand to 16.4 thousand. Nevertheless, even with these consolidations, in 2008 there were approximately 23 commercial banks and 54 total banks for every million people in the United States. That is higher than in other industrialized countries. The closest rival is Germany with 24 total banks per million. The United Kingdom has 6 per million and Canada has just 2 per million. How this will change as a result of the near implosion of the banking system in 2009 remains to be seen. The smaller banks avoided many of the excesses of the larger ones in the 2000s and emerged from the financial crisis healthier than their gigantic rivals.

While there are an astounding number of banks in the United States, the statistics above understate the increasingly important role of large banks. They were front and center in the financial debacle that ended the last decade and earned the tagline “too big to fail.” The banking industry has become increasingly concentrated, based on assets and deposits, over the last two decades. The financial crisis accelerated that trend in 2008 and 2009 as a result of the near collapse and forced marriage of some of the largest institutions. The share of the top 10 financial institutions based on deposits in the United States increased from [XX] percent in 1990 to 28 percent in 2000 to 45 percent in 2009. Later we will see that many of the small banks have gotten out of the credit-card business, leaving this largely to the biggest banks.

From Wampum to the Greenback

 

The first known money on the North American continent was the “wampum” used by Native Americans. Wampum consisted of black-and-white seashells strung together like necklaces and was redeemed for things like beaver pelts. As colonial settlement progressed, the beavers scurried away, so the story goes, and wampum became hard to redeem. Shells were on the way out well before the seventeenth century ended; they ceased to be legal tender in the New England colonies in 1661. Various commodities were used as media of exchange, including grain, rice, cattle, whiskey, and brandy. Tobacco was one of the most important and durable media. It lasted as legal tender in Virginia for 200 years and in Maryland for 150, until the U.S. Constitution stopped the states from having their own currencies. Massachusetts used paper money at times, as we noted above, as did some of the other colonies—all redeemable, at various rates depending on the time and the place, for silver or gold coins.

The Constitution gave Congress the power to coin money and regulate its value. Congress followed the recommendation of Alexander Hamilton and passed the Mint Act in April 1792. The act identified the dollar as the basic monetary unit of the United States and defined the currency system on a decimal basis (cents, dimes, quarters, and dollars). The act also placed the United States on a bimetallic standard, with fifteen ounces of silver equivalent to one ounce of gold.

 

 

Paper money was the subject of considerable controversy and chaos until the Civil War. Several attempts at creating national banks fell afoul of politicians, and no national paper money took root. Many state-chartered banks issued notes that circulated like money, but there was no national standard. According to one account, “Thousands of notes were being printed and issued by hundreds of state banks. It was hard to know anymore what a dollar ought to look like, and the farther a dollar traveled from the bank that had issued it, the more gingerly it was treated by people who could not be expected to recognize the note or even the bank, let alone assess its reputation.” One source estimates that by 1860, more than 1,500 state banks were in operation and there were 10,000 different kinds of paper money in circulation.

The Civil War brought about important changes in the U.S. monetary system. For one, the ability to convert paper money into gold or silver was suspended in the early 1860s. Second, the National Banking Act passed by Congress in 1863 provided for the creation of nationally chartered banks and helped put the state banks out of the currency business. The national banks were authorized to issue notes, the so-called national bank notes, backed by government bonds. Third, the federal government issued “greenbacks” (or U.S. notes) to help finance the Civil War. By 1867 national bank notes and greenbacks were the two main types of cash.

An 1858 bank note that was used as currency.

In 1879 the United States resumed converting paper money into gold. The country went back and forth on the gold standard, and finally abandoned it during the Great Depression of the 1930s, although a modified gold standard (the so-called gold-dollar standard) survived until 1971 when President Richard Nixon famously “closed the gold window.”

The monetary system that prevailed in the United States during the national banking era (that is, between the Civil War and World War I) was successful in terms of creating a uniform currency. But it was also unstable. Banking panics occurred in 1873, 1884, 1890, 1893, and 1907 (reminding us that banking crises happen without big banks). Although since the late nineteenth century there has been a debate as to what caused the panics, the interpretation prevailing before the creation of the Federal Reserve suggested that none of the various forms of currency in circulation could adjust rapidly enough to temporary shocks in the demand for money, like those that occurred at harvest time. In this view, the underlying problem was the absence of a central bank, an institution that would provide funds at times when the demand for currency was unusually high.

The Federal Reserve System, created by Congress in 1913, was set up to address this issue. Congress created a network of twelve regional Reserve Banks supervised by a Board of Governors located in Washington, D.C. All nationally chartered commercial banks had to become members of the Federal Reserve System (“member banks”), while state-chartered banks could decide whether or not to join. The Fed, as it is now called, assumed responsibility for regulating the overall money supply. In addition, the Fed and the U.S. Treasury maintain the currency system together. The Treasury makes the paper and coins, and the Fed allows banks to deposit used currency with it and get new currency at par. The Fed also took charge of checks, which became a formidable payment system.

 

 

Checks Plummet from Great Heights

 

Check Use in Industrialized Countries

As we noted above, Americans have not given up their use of checks easily. Most people in developed countries keep their money available in demand deposit accounts (DDAs). Slightly more than 90 percent of American consumers had one of these in 2008. But in many other countries the banking system and the government figured out ways to transfer money between these accounts without writing paper checks. For regular payments such as utility bills, consumers in other countries could just have the bank automatically take the money out of their accounts. When other countries adopted card systems, the banks focused on issuing DDA holders debit cards that took payments at the merchant right from the checking account (of course, these could also take money from cash machines). Debit cards were popular in other countries decades before they were in the United States.

The striking difference in the use of checks across countries is seen in Table 2.1, which shows the use of checks in 13 developed countries between 2004 and 2008. In 2004 checks accounted for 41.3 percent of transactions in the United States. That is more than a third higher than in France, more than double the level in Canada, Italy, or the United Kingdom, and more than 50 times higher than in Germany. Part of the explanation for the differences comes from the use of direct debit, which pulls money out of the DDA without a check. Direct debit accounted for 41 percent of transactions in Germany, 19.7 percent in the United Kingdom, 18.2 percent in France, and only 6.8 percent in the U.S. in 2004.

Table 2.1

There was a considerable drop in the use of checks in these countries over the five years from 2004 to 2008. The check share of transactions for Americans plummeted to 26 percent, putting it within striking distance of France at 22.1 percent. But most other countries also saw check use fall dramatically, even from lower levels. Italy fell from 15.7 percent to 11.1 percent, and even Germany halved, from 0.8 percent to 0.4 percent. Thus, even with their reduced use of checks, Americans remain outliers among citizens of developed countries.

 

 

The Early History of Checks in America

 

Checks are such a pervasive—and in the international context, odd—part of the U.S. monetary landscape that it is worth examining how the largest economy in the world got to this point. This history provides a preview of some of the economic issues that any new payment system (and that is what checks were in early nineteenth-century America) must grapple with. It will also get us into the role that the government plays in guiding the development of payment systems.

By the mid-nineteenth century, U.S. banks issued and accepted three media of exchange: bank notes, drafts, and checks. Checks, our focus here, gradually displaced the less flexible notes and drafts. Like all payment media, checks have two sets of customers, both of which must be on board for the medium to work: those who are willing to pay by check, and those who are willing to accept checks for payment. A bank in an area can operate a check platform all by itself if its depositors write checks and local merchants accept them. To do so, it must have a way for merchants to get reimbursed. This is, of course, simple when merchants and consumers have accounts at the same bank. Checks are more useful if they are more widely accepted, however. So banks can offer more valuable checking services to their customers if their checks are accepted at more merchants. As a practical matter, that requires the cooperation of other local banks as well as banks far away. The manner in which banks cooperated on checks provides a preview of what was to come for payment cards.

There is one major difference. Banks faced a constraint on how they could price checks to people who wrote checks and people who cashed checks—for convenience, call them check payers and check cashers. According to common law, which the United States carried over from England, cashers could get the amount of money on the face of the check if they presented it in person at the payer’s bank. But if they presented it in any other way—say, by the Pony Express fast mail service—the payer’s bank could hand over less than the face value and usually did.

That odd legal distinction—whose reasoning seems lost to history—created the incentives that shaped checking from the mid-nineteenth century through the early twentieth. The economics of two-sided markets (see chapter 6) suggests that this constraint on pricing prevented banks from coming up with the right price structure—the one that best balanced the demand by payers and cashers—and may have led to pricing that was less than optimal from the standpoint of both society and the banking industry.

 

 

For now, though, we want to look mainly at how this distinction influenced practices and institutions for honoring checks. Banks close to one another tended to reach agreements to exchange each other’s checks at par (that is, for face value). When it would be easy enough for check holders (consumers and banks) to present in person, the other banks couldn’t collect more than minimal exchange charges. Also, if banks charged for cashing each other’s checks, the fees would tend to offset each other. It made sense for all banks in a locale to agree to the simplest and thus least costly mechanism for exchanging checks. That resulted in the emergence of clearinghouses: banks sent clerks to a central location at a specific time where they were able to quickly and efficiently exchange checks drawn on each other’s banks. The bank card associations that emerged a century later followed a similar model, only with better technology.

These clearinghouses were far more efficient than having banks send messengers to every other bank in town. For example, there were fifty-two banks clearing checks with each other at par in New York City in the early 1850s. Without any agreements, each bank would have had to send fifty-one messengers to arrange settlement, and each would have had fifty-one messengers coming to its door. Instead, they agreed to deal with a single clearinghouse. This is an example of what Brandenburger and Nalebuff have called “co-opetition”: firms that compete with each other benefit from cooperation, and so do consumers.
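
The arithmetic behind this example (a back-of-the-envelope sketch using the figures above): with n banks settling bilaterally, each must exchange checks with the other n − 1, so the system as a whole requires on the order of n(n − 1) messenger trips, whereas a single clearinghouse requires only n.

\[
\underbrace{n(n-1)}_{\text{bilateral settlement}} = 52 \times 51 = 2{,}652 \text{ trips}
\qquad \text{versus} \qquad
\underbrace{n}_{\text{one clearinghouse}} = 52 \text{ trips}.
\]

The saving grows roughly with the square of the number of banks in a city.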

The Emergence of a National Checking Network

 

Without railroads, that would have been the end of the story. But the country became increasingly integrated over the last half of the nineteenth century. Interregional commerce increased. Local banks received an increasing volume of out-of-town checks from merchants. The problem wasn’t just how to present that check—in person or by mail. Checks drawn on distant banks were more likely to bounce: it was harder to know whether any particular check was legitimate, more difficult to collect from travelers, and harder for the law to catch a bad-check artist on the move. Hence, an out-of-town check wasn’t worth as much as one drawn on a local bank.

Although at that time banks could not address the issue of the trustworthiness of the payer (the technology for check verification and guarantee was far in the future), they did attend to the clearing-and-collection concern by setting up the respondent-correspondent system. The respondent bank, often a smaller country bank, would clear its interregional checks through the correspondent bank, typically a city bank. Frequently, the city bank would not charge explicit fees for its services (and would in some cases agree to pay exchange fees for checks cleared on the respondent), but would instead make its money on deposits that the country bank was required to hold with it.

Correspondent relationships tended to be organized by region. New England banks selected banks in Boston and New York as their correspondent banks, whereas Midwestern banks selected banks in Cincinnati, Cleveland, and most commonly, Chicago. Most banks had at least one correspondent in New York. Furthermore, most banks also had relationships with between three and a dozen banks located in neighboring counties.

 

 

The overall result was a complex network. Every respondent-correspondent relationship was bilateral. Each correspondent, however, had many respondent relationships and was thus at the hub of a mini-payments platform. The respondent banks at the spokes were also connected to a few other hubs. And the correspondent banks generally had relationships with each other to facilitate check clearing across the country. For example, a Cleveland bank and a Chicago bank might have an agreement to collect checks for each other at par within specified geographic limits. The correspondent banks were the intermediaries that helped solve the redemption problem for interregional checks. Any bank could sign up with a correspondent and thereby clear its interregional checks. These networks were precursors of the card systems that emerged in the mid-1960s. They resulted, of course, from the fragmented banking system: clearing checks would have been much easier if the country had but a handful of national banks.

The late nineteenth-century interregional clearing systems have been subject to criticism, although the evidence indicates that they worked well given the challenges they faced. For example, according to lore, there was one check that was drawn on a bank in Sag Harbor, New York, and was deposited in a bank in Hoboken, New Jersey, about one hundred miles away. It passed through eleven banks, traveled around 1,500 miles, and was in transit about eleven days en route to Sag Harbor. It turns out that the tale of this single check was repeated so often by successive popular writers and historians that this exception came to be thought of as the rule. The bank networks generally had the incentives to route checks efficiently, and it appears that’s what they did. Another criticism was that some banks charged check fees that were greater than their costs. We discuss this below.

The Fed got into the business of check clearing starting around 1915. That was partly a response to these concerns about non-par checking. (For simplicity, we leave out some of the distinctions between actions taken by the Federal Reserve Board and those taken by the regional Federal Reserve Banks.) The Fed set up a national system for the clearing and collection of checks. The key difference from the existing private arrangements was that the Fed imposed check collection at par. In the first phase, in March 1915, the Fed announced a voluntary plan in which banks that chose to join would clear checks on each other at par. This plan proved unsuccessful; at its peak in October 1915, only one-quarter of the Fed member banks had joined, and more banks were leaving than were joining. This suggests that par exchange was not an attractive option for most of these banks.

The Fed changed its strategy in April 1916. It required Fed member banks to remit at par for each check that a Reserve Bank presented to them whether in person or not. Then the Fed turned its attention to nonmember banks. The Fed kept a “par list” of nonmember banks that were willing to clear at par and began an aggressive campaign to expand the list. The Fed accumulated the checks of non-par banks to present over the counter for payment at par, thereby preventing them from collecting exchange fees. In some cases, the Fed even put non-par banks on the par list without their consent. The number of non-Fed members on the par list grew from about 10,000 in December 1918 to over 19,000 at its peak in November 1920, with only about 1,700 nonpar banks left.

 

 

Following litigation that led the Supreme Court to impose limits on some of the Fed’s actions to coerce banks to join the par list, many nonmember banks withdrew from the Fed’s system, and the number of non-par banks reached four thousand by 1928. The number of non-par banks later declined, especially in the early 1970s. As the economy expanded, one-bank towns became two-bank towns. The second bank could clear checks at par over the counter on behalf of out-of-town banks. By 1980, there were no non-par banks left in the United States.

The Fed’s role in the development of check clearing in the United States is controversial. Did the Fed help solve a market failure resulting from non-par checks, or did it create one by forcing banks to clear at par? Most firms that operate two-sided platforms vary prices to the two sides to balance the two demands and to make sure they get both sides on board. The par rule in the context of the checking platform prevents firms from using prices to do that. One is tempted to say that it is only fair that a merchant who gets a check for $100 should be reimbursed $100. It is not that simple, though: someone has to bear the costs of the checking system, and the merchant is receiving a service just as the payer is. If you doubt this, ask yourself why it is fair that you receive Adobe Acrobat Reader for free to read PDF files, but that those who generate these files have to pay for the necessary software. In fact, the par rule is much like the government telling Adobe that it can’t give away its readers. The economics of two-sided markets we discuss in chapter 6 suggests that imposing par pricing for checks could just as well have caused a market failure rather than curing one.
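
A stylized way to see the pricing point (a sketch of standard two-sided-market logic, not a model drawn from this chapter): suppose clearing a check costs the system c, and suppose banks could charge the check payer \(p_P\) and the check casher \(p_C\). The platform is viable only if the prices cover the cost, and check volume depends on how willing each side is to participate at the price it faces,

\[
p_P + p_C \;\ge\; c, \qquad \text{volume} \;=\; D_P(p_P)\, D_C(p_C),
\]

where \(D_P\) and \(D_C\) are the (hypothetical) demands of payers and cashers. The best split of c between the two sides depends on which side is more price sensitive. The par rule pins \(p_C\) at zero (the casher must receive the full face value), so the entire cost has to be recovered from the payer’s side whether or not that split maximizes the use or the value of the checking system.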

After check-processing fees were eliminated in 1918, the Fed shouldered the burden of the clearing and collection process, underwriting the system and increasing the incentive to use checks. It did not charge any explicit fees for check clearing until 1980. By then the taxpayers were, indirectly, shouldering about $500 million a year for the check system. The U.S. check system is as widespread and efficient as it is in part because of the government subsidies it received from 1918 to 1980.

Many other countries developed different systems for paying bills that didn’t rely on checks, and not surprisingly, people in those countries now use checks less as we saw in Table 2.1. Some of these are based on a “Giro,” a consolidated bill payment scheme that is operated either by the post office, the bank system, or both. It originated in Europe in the 1800s, and made the shift from paper to electronic in the late 1960s and early 1970s. Although one might claim that the absence of these alternative systems in the United States resulted in heavier check use, it is also possible that the early development of America’s efficient check processing system discouraged investment in these alternatives. U.S. banks initially had greater incentives to develop an efficient checking system because banks were smaller and more spread out than banks in other countries. Our fragmented banking system might be the root cause of our reliance on checks.

 

 

Check Killers

 

Almost a century after the Federal Reserve System started taking over the checking system, this twelfth-century Italian invention is finally on the ropes. The fraction of transactions made with a check fell at a compound annual rate of almost 9 percent a year between 2004 and 2008. The statistics confirm what many of us have seen in the checkout lines over the last half decade. It is becoming less common to find oneself behind someone going through the seemingly eternal process of pulling out a checkbook, writing out a check, presenting identification, and seeking approval from the harried clerk. Two innovations—one card, one not—have made it increasingly likely that the paper check will eventually go the way of the typewriter.

Many Europeans were paying at the point of sale with one of the cards that took money directly out of their checking accounts and therefore substituted almost perfectly for writing a check. Only 2 percent of Americans had one of these in 1995, almost thirty years after the birth of the bank card association. But by 2010 most Americans who have a DDA get a debit card that allows them to take cash from an ATM and pay with the card at the point of sale. (As of 2008, 91 percent of U.S. consumers had a DDA and 88 percent of those account holders had a debit card.) In 2008, 24 percent of transactions with merchants were made with debit cards (more than half of all card transactions) and 17 percent of the dollar payments at merchants were on debit cards (more than a third of the dollar payments made on cards). When Visa made its big push into debit cards in the mid-1990s, it branded them the Visa Check card so that consumers would understand that they were just like paying with a check except for the paper and the bother.

 

The rise of ACH was one of the greatest developments of the last decade, as one of us noted recently. Evans interviewed the head of NACHA, which sets the rules for these transactions, about the future of this type of payment.

 

Checks also declined because of the growth of an electronic payment system known as the Automated Clearing House (ACH). Banks can use the ACH system to send funds to each other electronically rather than relying on checks. This network supports a number of services that are increasingly displacing checks. By 2007 about 80 percent of American households received direct deposits, most commonly their paychecks, through the ACH system. More than half used online banking, which relies on ACH, to pay bills that same year. PayPal tries to persuade users to provide their checking account information. As a result, it can take money directly from consumers’ DDAs and avoid card fees. New payment methods are emerging that rely on ACH. One innovation whose prospects are unclear at the beginning of 2010 is the “decoupled” debit card, which separates the issuer of your debit card from the bank where you have your DDA. When you use a decoupled debit card to pay, the money comes out of your account over the ACH system. ACH, debit cards, and checks are all different ways of taking money from a consumer’s DDA. There is a difference, though: debit cards generally take money out on the same day the charge is made; checks depend on the vagaries of how quickly they make their way to the bank and on policies for crediting accounts; and ACH transfers usually pull money from the account about a day after the charge.

 

 

Buy Now, Pay Later

 

The rise of payment cards changed not only how people pay for things but also when. That is why the most important transformation for the payment card was its integration with two ancient activities: lending and borrowing. This didn’t happen overnight. Shortly after the launch of the Diners Club card in 1950, some banks tried to create a card that integrated payment services with lending. Most failed, and even the successes grew slowly. But over time, the products were improved and the problems were solved. The credit card picked up steam in the 1970s. A revolution in bank lending, consumer borrowing, and merchant financing followed. It was one waged amid the historical remnants of age-old efforts to regulate lending and borrowing by religious and governmental authorities.

People have borrowed from each other for at least as long as they have traded with each other—though like the first trade, the first loan is lost in time. One imagines a farmer in prehistoric agricultural times loaning his neighbor some seeds in return for a share of the future harvest. We do know that buying on credit predates primitive money. There is evidence of ancient lending in communities that had no medium of exchange or standard of value.

Credit and Sin

 

From the earliest recorded history, lenders have sought a return—interest—on their capital. Others have complained that interest is immoral. For long periods of time, in various parts of the world, civil or religious laws forbade interest or limited how much lenders could charge. Aristotle argued that making money from money is unnatural. As is always the case when laws prohibit willing buyers from entering into deals with willing sellers, people have made efforts to evade the laws. The loan shark may not be the oldest profession but it is likely up there.

The earliest-known formal code of laws, set forth by Hammurabi in 1800 BCE, permitted lending but capped interest rates: 33.33 percent on loans of grain and 20 percent for loans of silver. There were other restrictions. Debt could be secured with the debtor, his wife, concubine, children, or slaves—one of them would become a slave to the lender for up to three years if the debtor defaulted. He could also put up land or other capital as security. The Greek Laws of Solon (about 600 BCE) eliminated previous limits on interest. They also did away with slavery on default. The Twelve Tables, a codification of Roman laws around 443 BCE, set the maximum interest rate at 8.33 percent. The creditor could seize the debtor, but was also required to feed him. Interest was briefly banned in Rome around 342 BCE, but was then returned to its old maximum; the top rate was raised to 12 percent in 88 BCE and another half point in the fourth century CE.

 

 

The Bible was hard on lenders, and its language is still used to describe interest limits. Lenders were called “usurers,” and lending was called “usury.” The Old and the New Testament advocate zero-percent financing. In the Old Testament, Ezekiel 18:8 tells us, “He that hath not given forth upon usury, neither hath taken any increase….He is just, he shall surely live, saith the Lord God.” In the New Testament, Luke 6:35 reads, “Lend freely, hoping nothing thereby.” St. Bernardine in the early fifteenth century frowned on interest as well: “Accordingly all the saints and all the angels of paradise cry then against [the usurer], saying, ‘To hell, to hell, to hell.’ Also the heavens with their stars cry out, saying, ‘To the fire, to the fire, to the fire.’ The planets also clamor, ‘To the depths, to the depths, to the depths’” (De Contractibus, sermon 45, 3:3). St. Thomas Aquinas said, “To take usury from any man is simply evil.” In the Muslim world, the Koran prohibited usury even more strictly than the Bible. It precluded people from profiting on the exchange of silver or gold, and it also prohibited the use of bills of exchange.

Not surprisingly, given these harsh views, usury was a sin in Europe during much of the Dark Ages and the Middle Ages, and with the church so much a part of the state, there were laws against lending as well as the possibility of excommunication from the church for usury. The Capitularies of Charlemagne (800 CE) forbade usury “where more is asked than is given.” In the twelfth century, the Second Lateran Council prohibited usury. Sidney Homer and Richard Sylla’s survey of interest rates summarizes the period: “For long centuries the ordinary consumer loan, and, for that matter, the ordinary commercial loan, was opposed by effective popular and clerical censure and often by civil law.”

It was never possible to eliminate an economic activity as basic and necessary as lending and borrowing. Thus, laws were often enforced only in egregious cases. Semantic and theological distinctions were made to permit some lending. Pawnshops existed openly. Usury laws mainly discouraged the development of lending institutions and capped interest rates, and as a result, held back the development of banking.

Hostility toward usury waned through the Renaissance with the growth of trade. With the Reformation, lending became more accepted in Protestant countries. Calvin and others rejected the extreme view of Aristotle and Thomas Aquinas. Catholic countries came around more slowly. Finally, around the middle of the eighteenth century, commercial lending, which is essential for modern capitalism, emerged victorious throughout much of Europe. Most European countries repealed their usury laws between 1854 and 1867.

The United States stands out in two contradictory respects. It has maintained usury laws much longer than its European cousins. These laws date back to the country’s beginning. Most of the colonies followed Maryland’s lead in 1692 in capping rates at 6 percent. Nearly two hundred years later, in 1881, only fourteen of the forty-seven states had repealed their usury laws. Many of these states eventually reinstated them. The typical ceiling remained at 6 percent. More than a hundred years later, in 1987, thirty-three states had legislated maximum interest rates for consumer loans. Today, more than half of the states have either fixed ceilings or ceilings that are tied to market interest rates. Given how few other prices are regulated by law today, this is a remarkable historical legacy. Indeed, until well into the twentieth century—and we still see echoes today—borrowing money, especially by households, was considered bad form.

 

 

With this legacy, it is a paradox that, along with its comparatively heavy use of checks, the second way the United States stands out is that it gave birth to the credit card, which for many years was used far more in the United States than in any European country. In fact, the card builds on a tradition of buying on credit that dates back to the early days of the republic. One chapter in a late nineteenth-century memoir suggests that not much really changes: “Buy Now, Pay Later—Mama Discovers an American Custom.”

Credit: Financing the American Dream?

 

The chest of drawers for the early nineteenth-century parlor, the sewing machine for the late nineteenth-century home, the radios and phonographs for the early twentieth-century household, televisions for the mid-twentieth-century family room, and an automobile for the garage—all were essential in their day to the American Dream. Americans have been able to pay later for household durables that they buy now since at least the early years of the United States. As Lendol Calder details in his study of U.S. lending, Financing the American Dream, Cowperthwaite and Sons in New York sold furniture on installment terms by 1812. Other furniture dealers did the same during the nineteenth century. I. M. Singer popularized the installment plan in the last half of that century. It allowed suitable sewing machine buyers to put $134 down and pay $79 to $134 a month plus interest (as always, in 2010 dollars). By the 1890s, Singer settled on $23 down and $23 a week for its sewing machine. By the turn of the twentieth century, at least in Boston, retailers were selling clothing as well as consumer durables on credit. According to one historian studying immigrants in the late nineteenth century, “The practice of installment buying initiated newcomers into the possibilities of immediate acquisition and familiarized them with the impatient optimism that characterized the American consumer.”

Installment credit at department stores took off in the early part of the twentieth century. Up to then, a company like Sears could brag that it refused to sell on installment. Its 1889 catalogue, for example, pointed out: “We sell for cash, having no bad debts … no expense for collections, we can sell at a far lower margin of profit than any other dealer and when you buy from us you are not helping to pay for all such useless expense.” In the very early years of the twentieth century, the company’s only payment terms were “cash in full with the order,” and several of its catalogues at the time touted the cash-only policy as being in the public’s best interest.

This picture started changing around 1910. By then, installment sales were common throughout the country. Consumers had gotten used to buying on credit. In fact, around 1912, Sears discovered that some individuals and agencies were purchasing things from it, mainly durable goods, for cash and reselling them on credit. The following year Sears announced in its catalogue that it would beat the terms offered by these lenders. By 1913 Sears used special catalogues and circulars to advertise the possibility of paying in installments for a reasonably wide variety of items, including pianos, farm implements, cream separators, gas engines, and vehicles. Other department stores waited a bit longer. Montgomery Ward, for example, started openly advertising installment plans around 1921-22.

 

 

The annual volume of retail installment credit rose from about a half-billion dollars in 1910 to about $7 billion in 1929. In 1928-30, sales on installment credit accounted for about 9.2 percent of total retail sales. By 1930 most durable goods were bought on installment plans—roughly 60-75 percent of cars, 80-90 percent of furniture, 75 percent of washing machines, 65 percent of vacuum cleaners, 75 percent of radio sets, and 80 percent of phonographs. Risks for lenders in this period were quite low—in 1927 bad debt losses accounted for only about 1.2 percent of total credit sales. Of course, most households by this time were buying cars on credit. By 1920, consumers used credit for about two-thirds of car purchases. Dealers sold cars under two prices: the cash price and the “time price,” which was generally 15 to 22 percent higher than the cash price, and included finance charges, dealer reserves, loss reserves, and insurance premiums. Installment contracts for cars were typically twelve to eighteen months long. In the 1920s, American households relied on installment credit to shift their spending patterns toward expensive durable goods.

Single-merchant charge cards developed in this early period in parallel with the development of store-based installment credit. Western Union, the nation’s leading telegraph company at the time, is often credited with introducing the first charge card in 1914. The metal plate bore the name of the customer and gave her the convenience of sending many cables and then paying for all of them at the end of the month. (Today, Western Union is the world’s leading money transfer and check cashing system.) In 1924, General Petroleum of California issued the first gasoline card to its employees and certain select customers.

From the late 1920s onward the so-called “Charga-Plate” system became popular. A 1960 history of the department store explained that “the service provides a simple and efficient means of recording credit purchases in a large store, or from a group of stores in a large city.” A department store customer who was approved for credit had her name, address, and account number embossed on a small metal plate; the customer’s signature appeared on the reverse of the plate for identification. When the customer made a purchase, a clerk used a small imprinting machine, available in each department, together with the customer’s “Charga-Plate” to make a printed impression on the receipt. Apparently, in some cases several stores in a city approved a customer for credit and allowed her to use the same “Charga-Plate” in all of them. More than a quarter of a million imprinters were in use in the United States, and there were over 30 million plate holders. An important limitation of the system, however, was that it constrained the customer to pay over time only at the store, or set of stores, where she had been approved for credit. Moreover, it was costly to have each consumer’s creditworthiness evaluated by multiple stores, and only fairly large merchants could afford to operate installment credit systems.

 

 

Charga-Plate

By the time banks started introducing credit cards in the mid-1950s, Americans were accustomed to buying now and paying later. Many merchants financed purchases. And households had accumulated significant debt. The average household had [DEBT IN 1950]. Since the 1950s, consumer debt has continued to grow. Figure 2.1 reveals some interesting facts about this process.

  • The total of consumer credit outstanding has grown consistently since the middle of the century—real growth rates were 144 percent in the 1950s, 91 percent in the 1960s, 41 percent in the 1970s, 46 percent in the 1980s, 53 percent in the 1990s, and 33 percent in the 2000s.

  • Consumer debt has grown at a slower pace since the spread of credit cards, which didn’t become significant until the 1980s and expanded most rapidly in the 1990s.

  • Revolving credit (accounted for mainly by outstanding balances on credit cards) has largely replaced other forms of consumer debt—the share of revolving credit in total consumer credit rose from zero in 1950 to about 3 percent in 1970, 16 percent in 1980, 28 percent in 1990, and 40 percent in 2000, before edging down to 36 percent in 2009. Revolving debt as a fraction of total debt has not increased in the last decade or more.

Figure 2.1 Consumer credit outstanding, 1947-2007 (not seasonally adjusted). Note: Revolving credit includes balances outstanding on credit cards and other unsecured revolving lines of credit; nonrevolving credit includes secured and unsecured credit for automobiles, mobile homes, trailers, durable goods, vacations, and other purposes. Figures are in 2010 dollars. Source: Federal Reserve Board, Statistics: Releases and Historical Data.

Credit: Dealing with the Devil?

 

Throughout its two-century life in the United States, the “buy now, pay later” concept has faced two major objections and one obstacle. The objections are that (1) it is simply wrong for consumers to consume on credit, and (2) consumers are irrational to consume on credit on the terms that lenders actually offer. The obstacle is the long-standing view that lenders should not charge interest rates that are “too high.”

Americans have never had a problem with borrowing to produce things. Capital markets helped finance the industrialization of the United States during the nineteenth and twentieth centuries. Borrowing to consume things was another matter. Benjamin Franklin’s Poor Richard warned people about borrowing: “He who goes a borrowing, goes a sorrowing”; and “Be frugal and free.” A standard piece of financial advice given to households during the nineteenth and early twentieth centuries was “live within your means.” One nineteenth-century adviser said the “trinity of evil” was “debt, dirt, and the devil.” An early twentieth-century adviser noted that there “was no excuse for going in debt for the ordinary necessities of life.” The Victorian ethos waned after the turn of the century, perhaps driven in part by inflation that eroded the value of savings. Even today, there is no shortage of opinion leaders and financial advisers who believe that buying on credit is bad for households and the economy.

 

 

For many decades, some commentators have argued that even if it isn’t wrong in principle to consume on credit, Americans are foolish to do so on the terms actually offered by lenders. Already in the late nineteenth century, women in charge of managing the household budget were singled out. An 1884 editorial in Scientific American discussed “the curious processes of reasoning” that women used in deciding to buy a sewing machine on an installment plan. The author discovered the “psychological fact, possibly new,” that women “will rather pay $50 for a machine in monthly installments of five dollars rather than $25 outright, although able to do so.” (The quotation uses original, not 2010, dollars.) Both sexes engage in this sort of reasoning now: many people pay interest on credit card purchases that is far higher than the interest they earn on their savings. A literature known as “behavioral law and economics” argues that there is a scientific basis for the view that consumers borrow too much money because they are shortsighted and bad with math. We examine this case against credit in more detail in chapter 4.

 

The modern case against credit is made by Elizabeth Warren and Oren Bar-Gill in Making Credit Safer, which provides the intellectual foundations for the Consumer Financial Protection Agency. Other modern books echo the nineteenth-century concerns over people being lured into taking on too much debt. A good example is The Fragile Middle Class: Americans in Debt.

 

Banks, Merchants, and Consumer Financing

 

Whatever the views of its critics and proponents, consumer credit has become as American as apple pie, first through merchants’ financing programs and later through credit cards, which made borrowing and lending small amounts of money even more efficient. After credit cards became widely available, many merchants happily discontinued their lending programs, while a few with highly profitable programs continued to encourage their customers to use their store cards. Why did merchants offer financing in the first place? One possibility is that merchants had better access to capital than did their customers. Merchants could obtain capital—possibly from banks, but more likely from their own retained earnings—much more easily for financing sales than their customers could for buying, say, some furniture. Banks were not in the consumer lending business throughout most of U.S. history. Personal finance companies appeared in the late nineteenth century, but they were where consumers went to consolidate their merchant installment debt, not where they went for a small loan for a phonograph. Thus, financing was a service that merchants could provide more cheaply for their customers than customers could obtain on their own. It made sense to offer this complementary service, particularly if one could do it better than the competition.

Some merchants were extremely good at lending; they were large enough to pool the risks of default and skilled at spotting and monitoring credit risks. Sears excelled. Others did not. We know from the credit card experience that lending is a complex business. We’ll see later that even American Express initially blundered when it tried to provide credit cards to its charge card customers. And there are scale economies in lending, so smaller businesses found it costly—though necessary, given that consumers wanted to pay in installments and their competitors offered this option.

Credit cards provided a platform that made the borrowing and lending of small amounts of money more efficient. With a single application for a card, a cardholder received a modest credit line that could be used at many merchants. Cardholders didn’t have to fill out applications at many merchants and keep track of the bills, and they didn’t have to make a trip to a bank or personal finance company to get a loan for a specific item. Merchants benefited so long as the credit card issuers could provide loans to their customers more efficiently than the merchants could themselves. Many merchants applauded credit card programs and dropped their own financing programs once customers had another alternative.

 

 

For example, Joseph Nocera tells us that when Kenneth Larkin, a Bank of America executive, visited a drugstore in a small town hoping to persuade its owner to accept BankAmericard, the owner greeted him as the savior of his business. As Nocera explains, “A store owner who accepted the credit card was, in effect, handing his back office headaches over to the Bank of America.” Banks could lend money without having a personal loan officer interview every consumer who approached them for a loan.

The convenience of payment cards for banks, merchants, and consumers is the product of a delicate balance that emerged after much trial and error. As we will discuss in chapter 3, getting the business model for lending money on the card platform right was tough work; it took the credit card industry many years to figure out how to do it profitably.