Algorithmic zoning could be the answer to cheaper housing and more equitable cities

Zoning codes are a century old and the lifeblood of all major U.S. cities (except, arguably, Houston), determining what can be built where and what activities can take place in a neighborhood. Yet as their complexity has grown, academics are increasingly exploring whether these rule-based systems for rationalizing urban space could be replaced with dynamic systems built on blockchains, machine learning algorithms, and spatial data, potentially revolutionizing urban planning and development for the next hundred years.

These visions of the future were inspired by my recent chats with Kent Larson and John Clippinger, a dynamic urban thinking duo who have made improving cities and urban governance their current career focus. Larson is a principal research scientist at the MIT Media Lab, where he directs the City Science Group, and Clippinger was formerly a research scientist in the Human Dynamics Group at the MIT Media Lab and is now a cofounder of Swytch.io, which is developing a utility token called Swytch.

One of the toughest challenges facing major U.S. cities is the price of housing, which has skyrocketed over the past few decades, placing incredible strain on the budgets of young and old, singles and families alike. The average one-bedroom apartment rents for $3,400 in San Francisco and $3,350 in New York City, making these meccas of innovation increasingly out of reach for even well-funded startup founders, let alone artists or educators.

Housing alone is not enough to satiate the modern knowledge-economy worker, though. There is an expectation that any neighborhood will offer a laundry list of amenities, from good, cheap restaurants, open spaces, and cultural institutions to critical human services like grocery stores, dry cleaners, and hair salons.

Today, a zoning board would simply try to demand that various developments include the necessary amenities as part of the permitting process, an approach that has led to food deserts and the curious soullessness of some urban neighborhoods. In Larson and Clippinger’s world, though, rules-based models would be thrown out for “dynamic, self-regulating systems” built around what might agnostically be called tokens.

Every neighborhood is made up of different types of people with different life goals. As Larson explained, “We can model these different scenarios of who we want working here, and what kind of amenities we want, then that can be delineated mathematically as algorithms, and the incentives can be dynamic based on real-time data feeds.”

The idea is to first take datasets like mobility times, unit economics, amenity scores, and health outcomes, among many others, and feed them into a machine learning model that tries to maximize local residents’ happiness. Tokens would then act as a currency that signals to the market what should be added to or removed from the community to improve that happiness.
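To make the mechanics concrete, here is a minimal sketch of what such a pipeline could look like. The metric names, weights, and tokens-per-point rate below are purely illustrative assumptions, not anything drawn from Larson and Clippinger’s actual models; the point is only to show how data feeds might be folded into a happiness proxy and how a development’s effect on that proxy could be priced in tokens.

```python
# Hypothetical sketch: the metrics, weights, and rates here are invented for
# illustration and are not the researchers' actual model.

from dataclasses import dataclass


@dataclass
class NeighborhoodMetrics:
    mobility_minutes: float   # average travel time to key destinations
    rent_to_income: float     # median rent as a share of median income
    amenity_score: float      # 0..1 coverage of groceries, open space, culture
    health_index: float       # 0..1 composite of local health outcomes


# Illustrative weights; a real system would learn these from resident feedback.
WEIGHTS = {
    "mobility": -0.02,       # every extra minute of travel lowers the score
    "affordability": -1.0,   # a higher rent burden lowers the score
    "amenities": 0.8,
    "health": 0.6,
}


def happiness_score(m: NeighborhoodMetrics) -> float:
    """Combine real-time data feeds into a single resident-happiness proxy."""
    return (
        WEIGHTS["mobility"] * m.mobility_minutes
        + WEIGHTS["affordability"] * m.rent_to_income
        + WEIGHTS["amenities"] * m.amenity_score
        + WEIGHTS["health"] * m.health_index
    )


def token_incentive(before: NeighborhoodMetrics, after: NeighborhoodMetrics,
                    tokens_per_point: float = 1000.0) -> float:
    """Price a proposed development by how much it moves the happiness proxy.

    Positive -> the developer earns tokens; negative -> they pay into the pool.
    """
    return (happiness_score(after) - happiness_score(before)) * tokens_per_point


if __name__ == "__main__":
    current = NeighborhoodMetrics(32.0, 0.45, 0.55, 0.60)
    with_grocery = NeighborhoodMetrics(30.0, 0.45, 0.70, 0.62)
    print(f"tokens for adding a grocery store: {token_incentive(current, with_grocery):+.0f}")
```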

A luxury apartment developer might have to pay tokens, particularly if the building didn’t offer any critical amenities, while another developer who converts their property to open space might be completely subsidized by tokens that had been previously paid into the system. “You don’t have to collapse the signals into a single price mechanism,” Clippinger said. Instead, with “feedback loops, you know that there are dynamic ranges you are trying to keep.”
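Clippinger’s point about not collapsing everything into a single price can also be sketched: in the toy example below, each metric gets its own target band and its own token signal, so open space and grocery access are rewarded independently rather than being rolled into one number. The metric names, bands, and rates are assumptions made for illustration, not anything the researchers have published.

```python
# Hypothetical sketch of the "dynamic ranges" idea: one token signal per metric
# instead of a single collapsed price. All names and values are illustrative.

# Target band for each metric: (low, high). Bands could themselves shift with
# real-time data, which is what would make the system self-regulating.
TARGET_RANGES = {
    "open_space_sqft_per_resident": (80.0, 150.0),
    "grocery_stores_per_10k": (2.0, 5.0),
    "median_rent_to_income": (0.25, 0.35),
}


def token_signals(observed: dict[str, float],
                  tokens_per_unit: float = 50.0) -> dict[str, float]:
    """Return one signal per metric: positive subsidizes projects that raise it,
    negative charges projects that push it further out of range."""
    signals = {}
    for metric, (low, high) in TARGET_RANGES.items():
        value = observed[metric]
        if value < low:
            signals[metric] = (low - value) * tokens_per_unit   # reward adding more
        elif value > high:
            signals[metric] = (high - value) * tokens_per_unit  # charge for excess
        else:
            signals[metric] = 0.0                               # inside the band
    return signals


if __name__ == "__main__":
    snapshot = {
        "open_space_sqft_per_resident": 60.0,  # below band: open space subsidized
        "grocery_stores_per_10k": 1.5,         # below band: groceries subsidized
        "median_rent_to_income": 0.42,         # above band: rent pressure penalized
    }
    for metric, signal in token_signals(snapshot).items():
        print(f"{metric}: {signal:+.0f} tokens")
```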

Compare that systems-based approach to the complexity we have today. As architectural and urban planning tastes have changed and developers have discovered loopholes, city councils have updated the codes, and then updated the updates. New York City’s official zoning book is now 4,257 pages long (warning: 83MB PDF file), the point of which is to rationalize what a beautiful, functional city should look like. That complexity has bred a massive influence and lobbying industry, as well as startups like Envelope that try to make sense of it all.

A systems-based approach would throw out the rules while still seeking positive end results. Larson and Clippinger want to go a step further, though, and integrate tokens into everything in a local neighborhood economy, including the acquisition of an apartment itself. In such a model, “you have a participation right,” Clippinger said. A local public school teacher or a popular baker, for instance, might have access to an apartment in a neighborhood without paying the same amount as a banker who doesn’t engage as much with neighbors.

“Wouldn’t it be great to create an alternative where instead of optimizing for financial benefits, we could optimize for social benefits, and cultural benefits, and environmental benefits?” Larson said. Pro-social behavior could be rewarded through the token system, ensuring that the people who made a neighborhood vibrant could remain part of it, while also offering newcomers a chance to get involved. Those tokens could also potentially be fungible across cities, so a participation-right token for New York City might also grant access to neighborhoods in Europe or Asia.
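How a participation right might actually be represented is an open question; the sketch below is purely illustrative, with invented field names, a made-up contribution scale, and a naive cross-city recognition rule, just to show how engagement rather than price alone could gate access.

```python
# Hypothetical sketch of a "participation right" record. Nothing here reflects
# an actual Swytch or MIT design; every field and rule is an assumption.

from dataclasses import dataclass, field


@dataclass
class ParticipationRight:
    holder: str
    home_city: str
    contribution_score: float = 0.0              # accrued through teaching, local business, etc.
    recognized_cities: set[str] = field(default_factory=set)

    def record_contribution(self, points: float) -> None:
        """Pro-social activity raises the holder's score."""
        self.contribution_score += points

    def discounted_rent(self, market_rent: float, max_discount: float = 0.5) -> float:
        """The more a holder contributes locally, the less they pay vs. market rate."""
        discount = min(max_discount, self.contribution_score / 1000.0)
        return market_rent * (1.0 - discount)

    def grants_access(self, city: str) -> bool:
        """Fungibility across cities: other networks could honor the same right."""
        return city == self.home_city or city in self.recognized_cities


if __name__ == "__main__":
    teacher = ParticipationRight(holder="local teacher", home_city="New York")
    teacher.record_contribution(400.0)            # e.g., a school year of teaching
    teacher.recognized_cities.add("Barcelona")
    print(teacher.discounted_rent(3350.0))        # pays less than the listed $3,350
    print(teacher.grants_access("Barcelona"))     # True
```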

Implementation of these sorts of systems is certainly not going to be easy. A few years ago on TechCrunch, Kim-Mai Cutler wrote a deeply researched analysis of the complexity of these issues, including the permitting process, environmental reviews, community support and opposition, and the basic economics of construction that make housing and development one of the most intractable policy problems for municipal leaders.

That said, at least some cities have been excited to trial parts of these algorithmic models for urban planning, including Barcelona and several Korean cities, according to the two researchers. At the heart of all of these experiments, though, is a belief that the old models are no longer sufficient for the needs of today’s citizens. “This is a radically different vision … it’s post-smart cities,” Clippinger said.

Update: John Clippinger’s affiliation was changed to better reflect his current activities at Swytch.