First Come, Last Served

Line-Standers
The best free show in Washington, D.C. is the Supreme Court. The courtroom is both ornate and intimate. You sit only steps from the justices of the highest court in the land and listen to America’s top advocates. This is democracy at its best, open and accessible to all. If you want to witness the fate of abortion, gun control, or religious freedom, you can. But you need to get there early—there are on average fewer than one hundred seats available for the public, and admission is first come, first served.
For high-profile cases, people arrive a day or more ahead of time, armed with camping chairs, sleeping bags, ponchos, and extra batteries for their smartphones. Folks in line tend to look out for one another—Supreme Court police officers refuse to monitor the line. If you have to go to the bathroom, those around you will hold your place. And they will also be on guard for people cutting in or adding friends. If someone does that, they are harangued with cries of “no cutting” and “back of the line.”
As the time to enter the Court approaches, though, a strange thing happens. Many of the disheveled people nearest the front of the line exchange their spots with gray-suited men and women. A little later the well-dressed enter the courtroom and take the best seats while those farther back in the line are not even admitted. What is going on?
Welcome to the line-standing business. Companies are paying line-standers, sometimes homeless people, to arrive days ahead, secure a spot at the front, and then wait and wait and wait. At the last minute, by the Court entrance etched with the words equal justice under law, the line-standers give way to paying clients who have the money to get in first but not the time or patience to wait. Small start-ups like Linestanding.com, Skip the Line, and Washington Express charge clients up to $6,000 for a “free” seat, while paying minimum wage to the hired line-standers who wait in the rain and cold.
Line-standing companies have transformed how seats become mine, not just for Supreme Court arguments but also for open congressional hearings where the nation's laws are debated. Hearing rooms used to be free to anyone willing to wait to see their elected representatives in action. Now those hearings are often packed with lawyers and lobbyists, all of whom paid, none of whom waited. The same transformation is happening in lines for new passports at the local federal building or building permits at City Hall.
Paid line-standers are a booming business in the private sector as well. If you're willing to pay, you can get to the front for new iPhones at Apple stores, hot skatewear apparel at Supreme, rush Broadway show tickets, or even prime spots on New York City streets to watch the Macy's Thanksgiving Day Parade. One line-stander employed by SOLD (Same Ole Line Dudes), a line-standing start-up, waited forty-three hours holding a spot so a client could be sure to get an audition for Shark Tank, the hit reality TV show for start-ups. Odds are that Robert Samuel, the entrepreneurial founder of SOLD, would have done better on the show than the guy from Colorado who paid Samuel for his place in line.
The same transition is happening online. The musical Hamilton was continuously sold out on Broadway for years after it opened. The producers of the show made most tickets available on a first-in-time basis on their website. The problem was that tech-savvy scalpers created computer programs, bots, that bought up all the tickets the microsecond they became available. As a result, the artists and producers earned only the tickets' face value while fans paid scalpers' premiums, often a multiple of the original price, on sites such as StubHub. Many weeks ticket scalpers earned more from Hamilton than did the producers and artists who put on the show. What good is the first-in-time rule if a bot will always jump the line faster than a mortal with a mouse? When Hamilton tried to outsmart the scalpers by making some tickets available only at the theater box office, companies like SOLD hired line-standers to snag them.
Bruce Springsteen tried another approach when he played his sold-out run on Broadway. He paired with TicketMaster as it debuted Verified Fan, an online system that aimed to circumvent the bots and line-standers and get at least some tickets directly to prescreened real fans. But even those tickets often ended up on the resale market—you have to be quite committed to the Boss to turn down a $10,000 offer for an $850 ticket.
How should we think about this rapid rise of paying to get to the head of the line?
For many, this transformation seems deeply unfair and undemocratic. One disappointed woman stood for days in line at the Supreme Court and still did not get to hear the 2015 case establishing the right to same-sex marriage. The real system, she said, is “Let’s pay the poor Black guys to hold the line for rich white people.” On the other hand, maybe line-standing should be viewed as a good thing—capitalism at its best, creating new jobs where none existed before, both for programmers scripting their bots and for the poor and homeless waiting in lines.
We never used to ask these questions. But today we must, because first-in-time is being dismantled from within.

Who's on First?
For most of human history, for most resources, the rule for establishing original ownership followed a maxim expressed in ancient Roman law as “Whoever is earlier in time is stronger in right.” In other words, First come, first served.
This has long been the practice in families. Think back to your childhood Bible lessons. Why did Jacob put an animal skin on his arm to trick his blind father, Isaac, into thinking he was blessing Esau, Jacob’s rough-skinned brother? Esau was born first and by right should have received his father’s gifts. Being first got you not only paternal blessings but also earthly treasure. Jacob’s trickery let him jump the line.
The practice of primogeniture, inheritance by the firstborn son, has long decided the succession of royal families around the world. It still does today, with an egalitarian twist in countries such as Sweden and the Netherlands, which now pass the crown to the monarch's firstborn child rather than just the first son.
First-in-time governed colonial exploration as well. Colonies in the New World were carved up among the European powers based on which nation's explorer was first to plant his sovereign's flag. This may hold some intuitive appeal for uninhabited lands, but what about places with people already living there? If being first is what counts, surely Native Americans had the stronger claim for owning America. Not so, said the international law of the time, as written by the European powers. When Europeans came to America, they defined first to mean "the first Christian discoverer."
And here lies a key to understanding this ancient maxim for making things mine. Even something as factual-sounding as "who's first" is not self-defining. The right question is "Who decides who's first?" In American law, the answer is "The conqueror prescribes its limits," according to Chief Justice John Marshall in Johnson v. M'Intosh, an 1823 Supreme Court decision that most lawyers read during their early days in law school. Being the first Christian European was what justified, as a matter of law, the claims of Spain to the Caribbean, Texas, Mexico, and California; of France to New Orleans, Canada, and much of middle America; and of England to New England and Virginia.
But if that’s the case, why did the world not rise in protest when Neil Armstrong planted the American flag on the moon in July 1969? That should have made the moon just as much a U.S. territory as early America was a European one. The answer is that by the 1960s, countries had renounced discovery and conquest as the basis for deciding who was first. In 1967 the United States, along with the Soviet Union and dozens of other countries, signed the U.N. Outer Space Treaty explicitly rejecting first-in-time for extraterrestrial resources.
So when Armstrong became the first human on the moon, he was not asserting American ownership there. Indeed, to make America’s intentions clear, in 1969 Congress felt compelled to pass a law stating that when a U.S. astronaut places a flag on the moon, it is “intended as a symbolic gesture of national pride in achievement and is not to be construed as a declaration of national appropriation by claim of sovereignty.”
Countries continue to play the “who’s on first” game, though, with contested results. In 2007 the Russian Navy tweaked the international community by placing a small titanium Russian flag on the bottom of the Arctic Ocean. Russia was symbolically staking a claim to the mineral-rich seabed beneath the North Pole and the trade shipping routes that cross the pole—all newly accessible because of climate change and melting ice. Though an international furor erupted over the idea that Russia might win these resources simply by flagging them first, the strategy is time-tested. As we shall see in Chapter 4, China is now implementing a version of this strategy by building and claiming islands in the South China Sea.
Territorial claims and family inheritances are not the only venues for first-in-time. Being first is also the default rule for how ordinary people claim all sorts of unowned things. It was how miners staked claims during the 1848 Gold Rush in California. In 1889, Native lands in Oklahoma were opened for pioneer settlement through "land runs" that began with a pistol shot on the state line. (Sooner was the derogatory term for those who jumped the gun.) Today well-funded start-ups are aiming to mine the moon and harpoon asteroids for water, platinum, and gold, all in tension with internationally recognized ownership rules. This is also the origin story of Uber, Airbnb, YouTube, and many other Internet businesses that raced ahead of the law to create and then capture markets. Ambiguity about ownership favors the bold, the heedless, the outlaws, those who race ahead first.
But not always.
The law looks not only to who is making the claim but also to what they are doing with it. In the 1800s, homesteaders not only had to arrive first at their 160-acre parcel of earth but also had to show that they cut, burned, fenced, planted, and wrenched sustenance from it continuously for a period of years. This was another reason courts of the day ruled that Native Americans did not own their ancestral lands. Europeans imagined Native peoples treading lightly through the forest while hunting fish and game and did not view that labor as sufficiently productive to sustain an original claim of ownership. They defined first to mean first to labor according to the agricultural and commercial ethos of the settlers.
What is first turns out to be a slippery concept, never just an empirical fact, always a legal construct. In the classic children's book The Little Prince, we encounter a businessman counting stars. The little prince asks why and hears, "I own the stars, because nobody else before me ever thought of owning them." But being the first to think of owning stars does not necessarily make them owned. By and large, courts and governments define and redefine what's first to guide people invisibly and inexorably toward particular, socially approved forms of interaction with scarce resources.
Copyright © 2022 by Michael Heller and James Salzman. All rights reserved. No part of this excerpt may be reproduced or reprinted without permission in writing from the publisher.