Manhattan Institute for Policy Research.
Civic
Report

No. 87 May 2014


DIGITAL TRANSFORMATION: Wiring the Responsive City


Stephen Goldsmith


Introduction

Table of Contents:
Introduction
About the Author
Chapter 1. Information: The Key to Effective Government
Chapter 2. Making Data Matter in Administrative Systems
Chapter 3. Infrastructure Management
Chapter 4. Predictive Tools for Public Safety
Chapter 5. Regulatory Reform with Data and Technology
Conclusion
Endnotes

American cities are on the brink of the most significant change in local and state governance since the reforms that dismantled Tammany Hall 125 years ago. Across a wide range of political convictions and practical expertise, innovators—Republicans and Democrats, technocrats and citizen activists, businesses and nonprofit organizations, street-level bureaucrats and their clients—together are revolutionizing local government.

This revolution cannot come soon enough. Americans across the board are profoundly distrustful of business-as-usual government. Last October, for example, a survey by the Pew Research Center found that only 19 percent of Americans trust government all or most of the time, a near-historic low.[1] This distrust is driving wedges between groups, creating tension that undermines even good-faith efforts at reform. With this sort of trust gap, many well-intentioned policies will be doomed from the start, as important constituencies refuse to assent or, worse, portray policies as a redistribution that benefits “them” at “our” expense.

Fortunately, some of the forces that overwhelm traditional government—demographic shifts, complex problems that cross traditional lines of responsibility, and mountains of data that undermine old-style procedures—have a silver lining. Even as they reveal the inadequacy of government as we know it, these developments are also providing the incentives and tools for change. They are, in fact, driving a new era of progress that will make government more cost-effective, engaged with citizens, and trusted.

Still, business-as-usual municipal government often persists, with many insisting that nothing can be done, until pressure (old ways not working) and opportunity (new technology, new political dispensations) combine to create sudden change. When change does come, it can sweep away in a few short years decades of supposedly intractable governance problems. This is the kind of change that local government is experiencing now.

Today, a new generation of elected and appointed leaders, motivated by extreme challenges—economic dislocation, population migrations, environmental crises, job losses, unsustainable budgetary policies—is creating genuinely new approaches to government, made possible by technologies that could not even be imagined a few years ago.

Already, at the dawn of this transformation, “big data” and its analytic tools are bringing quickness and transparency to all aspects of modern government. For example, in many workforces, GPS-enabled smartphones and tablets tell exactly where employees perform their work and how long each discrete task requires. Real-time wireless signals send notices to supervisors of outliers—those employees who operate at the extremes of superior and poor performance. Combined with sharp-eyed analysis and old-fashioned teamwork, this richly detailed data enables city agencies to figure out how to do a job better, more cheaply, and with more consideration for citizens.

Taken alone, such changes may sound like incremental technical improvements. Collectively, they add up to a reimagining of what government is and how it works.

The Responsive City

Governments in the United States face citizen demands that exceed their resources. This is not a temporary aberration but a permanent fact of life that requires government to operate more effectively. Specifically, governments must provide more personal attention and interact with communities to understand and anticipate their needs; and public resources must be better calibrated to provide the outcomes that constituents demand. We can no longer afford to make massive investments in hopes of finding a solution. The return on investment for each public dollar spent needs to improve dramatically.

In the following chapters, I describe how public officials are making these improvements a reality. These officials use tools of the digital revolution, combined with changes in the structures of government, to bring about reform. For the last year, we at the Harvard Kennedy School Innovations in Government Program, with support from Bloomberg Philanthropies and the John D. and Catherine T. MacArthur Foundation, have worked—through our Data-Smart City Solutions project—to identify the most creative new applications of data that produce better public services. I have spoken with leading mayors about their use of data and the best of their initiatives, and these conversations furnish some of the material in these chapters. I continue to see major breakthroughs in how government engages its own workforce, regulates the marketplace, involves the private sector, and incorporates the wisdom of its citizens. The chapters that follow show not only how reform happens but also how it increases productivity and responsiveness.

As the following overview shows, I consider many ways that government can transform how we govern ourselves, from garbage pickup to city planning.

Chapter 1. Information: The Key to Effective Government

Urban governments in Boston, Chicago, New York, and elsewhere in the United States have broken free of business-as-usual bureaucracy by using information tools, thus liberating imaginative civil servants. The lesson from the cases that I examine is simple: governments can, and do, work in the information age with the same efficiency and creativity that citizens expect from private enterprise. To do so, government today must gather, analyze, and act on data far more effectively than did the bureaucratic structures of the twentieth century. A theme that recurs in my research on this topic is that government structure is key: we are witnessing the most important breakthroughs when mayors and governors appoint a chief innovation officer, and/or establish an office in the executive suite, responsible for organizing information and for initiatives built on these reform concepts.

Chapter 2. Making Data Matter in Administrative Systems

Government cannot afford all the layers of bureaucracy that now exist. It is not just the cost of the layers but the fact that they frequently subtract from, rather than add to, the effectiveness of the functions for which they are responsible, limiting employee discretion, skill, rewards, and innovation. Management too often means directing subordinates’ activities rather than producing solutions. Analytics now allow us to know which employees are producing desirable outcomes and solving problems, not just following rules. Reform will give those workers more discretion but, at the same time, require them to be more accountable.

I discuss the places where digital tools have already begun to make a difference, often beyond the goals for which they were first employed. For example, transparency is changing the budgeting process, making it much more open to outside groups and allowing better management inside government. Web-based procurement, coupled with transparency, will drive down costs, as well as corruption, in municipal purchasing.

Chapter 3. Infrastructure Management

This chapter deals with the hard assets that constitute urban infrastructure and the way new digital tools are engaging citizens in public works solutions. We look at how to make the most of our road and utility infrastructure and the energy that it uses and supports. City utility departments, as well as mayors with energy and sustainability offices, can now harvest much more real-time feedback from a smart grid and, in turn, provide consumers better information upon which to make consumption decisions. This same information helps service staff anticipate and manage service disruptions.

In transportation, mobile devices and sensors help increase efficiency of traffic and transit systems. Digital systems allow for new means of moving traffic, synchronizing movements and facilitating information flows. That allows for dynamic pricing of services, such as parking, and facilitates other enhancements—for example, systems that let passengers know when their school bus or transit vehicle will arrive.

Chapter 4. Predictive Tools for Public Safety

The public safety chapter focuses on police officials in many communities who engage in predictive policing and disaster response. Predictive analytics can deter crime through strategic officer deployment, reduce risk through better assessment of parolees, and improve planning for disaster. Fire departments use data to better target high-risk structures for corrective action. This incorporation of data into public safety policy helps refine the use of resources and increase efficacy. And when disaster does strike, emergency responders now use a broad array of digital tools and social network analysis to organize the response.

Chapter 5. Regulatory Reform with Data and Technology

This chapter looks at emerging breakthroughs across the country in how regulators use digital tools to transform once-burdensome regulation and inspection regimes into systems that provide better enforcement and service to new businesses. Reform efforts using digital technology drive down the time and cost of securing a license or permit. This is partly because data analytics can help distinguish between good and bad actors, reducing red tape for the majority while efficiently targeting regulation and inspection toward the troubled subset of the population.

Conclusion

The emergence of more effective, data-driven government does not do away entirely with what came before. Before the Progressive reforms, government was a collection of freelancers—party machines, private companies, charities—which the reforms shut down and replaced with an era characterized by bureaucracy. But as data-driven, responsive government arrives, the freelance mentality—which sets individuals in motion to look for opportunities—is back in city hall. This time, though, data connects those free agents to other government players. And the public enjoys access to the data, forcing these players, unlike in the past, to think strategically and act preemptively. Knowledgeable members of the public with access to data tools can hold them responsible.

The new paradigm sets clear standards for everyday administration. (Then, too, the bureaucratic era enshrined the values of specialization and well-defined tasks.) This time, though, those standards don’t lock the players into narrow routines. Professionals are now enabled across government to share information, join together to make policy, and work side by side to execute policy.

In sum, responsive government places a high value on flexibility and an ad hoc approach to problem solving. The new paradigm uses clear goals and standards—not to mention professional civil servants—to carry out government policy.

This, then, is a revolution that conserves the valuable lessons of the past even as it defines a very different future. In these chapters, we explore that revolution in detail.


About the Author

Stephen Goldsmith is professor of government and director of the Innovations in American Government Program at Harvard’s Kennedy School of Government. He currently directs Data-Smart City Solutions, a project that highlights local government efforts to use new technologies combining data analytics with community input, in order to reshape the relationship between government and citizens. Goldsmith’s work at Harvard brings together representatives from major U.S. cities in support of quality-of-life innovations. He previously served as mayor of Indianapolis, where he earned a reputation as one of the country’s leaders in government efficiency, public-private partnerships, and neighborhood revitalization. Goldsmith has also served as the deputy mayor of operations in New York, chief domestic policy advisor to the George W. Bush presidential campaign, chairman of the Corporation for National and Community Service, chairman of the Center for Civic Innovation at the Manhattan Institute, and prosecutor for Marion County, Indiana. He is the author of The Power of Social Innovation: How Civic Entrepreneurs Ignite Community Networks for Good; Governing by Network: The New Shape of the Public Sector; and The Twenty-First Century City: Resurrecting Urban America. His newest book, The Responsive City: Engaging Communities Through Data-Smart Governance, will appear in September 2014.


Chapter 1. Information: The Key to Effective Government

Government can perform most tasks reasonably well—picking up garbage, responding to emergencies, funding programs—without high technology. In New York City, for example, the police perform superbly, sanitation workers rarely miss a pickup, and firefighters are on hand for every emergency, always well staffed and brilliantly prepared. Workers deliver these good results despite layers of supervision, lawsuits, suspicions, media aversion, labor contracts, job classifications, and other constraints (constraints that help explain why the city’s workforce—350,000 people, counting teachers—is so massive). But vast armies of able government workers can do only so much. If not managed well, these armies duplicate work, interfere with one another, and create extra rules and paperwork that sap government energy and focus. The key to effective management is the efficient treatment of information.

All organizations depend on three basic processes. They need to get information (data), they need to make sense of that information (analysis), and they need to act on that information (action). The nature of reform becomes clear once one sees how responsive government handles each of these three essential aspects of information management—all quite differently, in fact, from the traditional bureaucratic approach.

The Battle of Simplicity and Flexibility

In the past generation, two paradigms of information handling have battled for the soul of government. One paradigm is the structure of bureaucracy invented in the late nineteenth century and enshrined in that era’s ideal of progressive government. Developed to bring cities out of the stagnation of corruption and nepotism, this bureaucracy runs on a hierarchical structure with a detailed division of labor, guided by formal rules and regulations. Its aim is to make each job simple to perform and to evaluate.

Progressive reformers of the late nineteenth century saw that accountability required a reduction in discretion, and they certainly succeeded in that goal. The method they used, though, was the elimination of discretion altogether. In the standard bureaucracy of the last century, oversight systems imposed by law—civil service and union contract work rules, job classifications compounded by inspector-general investigations—reinforced the importance of the narrowly defined routine. In this first paradigm, cities “manufacture” government with more brute force than finesse. And when they need more government, they simply purchase more units of input: more labor, more inventories, and more trucks, for example.

When a police officer’s job is to respond to a dispatcher, a teacher’s to conduct a class, and a public nurse’s to perform checkups, classic bureaucracy—with its clear assignment of responsibility to individuals to perform discrete tasks—works well. But we now live in a digital world where work does not stay within such neat and discrete divisions. It is now quite possible for employees in separate locations and agencies to work on the same file concurrently, for supervisors to see “red flags” and outliers in real time, and for wireless devices and data analytics to create new forms of supervision. And when an employee’s work requires sharing information and making decisions not specified in job descriptions, the simplicity of old-style bureaucracy becomes a liability. It is then that we see the importance of the second governmental paradigm: the need for flexibility in the face of complex problems.

In traditional bureaucracy, a dilapidated building is a matter for people whose job it is to deal with construction and maintenance: the inspectors who ensure that work is done in compliance with city codes. In today’s city governance, the building is correctly seen as a concern for workers in public health, police, environmental affairs, economic development, and other areas beyond building-code enforcement. It is simply more efficient and cost-effective to grapple with all the different causes of a building’s broken-down state. Understanding those factors could yield the best strategy for repairing the building. Such a complete analysis can reveal that fixing a problem requires a multipronged approach; other times, it shows how focusing on one key element will tip all the others.

To respond with this kind of flexibility, a post-progressive bureaucracy must allow professionals the discretion to explore the complexity of policy challenges and, working together, to design collaborative responses to those problems. This is the ideal of today’s reformers.

Innovation Infrastructure

Ironically, the data-driven revolution of today’s reformers has its origin in pushpins on paper maps. An officer in New York City’s Transit Police Crime Analysis Unit used the pins to designate where crimes occurred. The unit could then visualize clusters and patterns, enabling officers to focus their efforts precisely on trouble spots. In the mid-1990s, the pushpins gave way to CompStat, a more sophisticated data collection, analysis, and mapping system adopted to manage the New York Police Department. CompStat, which quickly fostered a 60 percent reduction in crime, inspired Baltimore’s CitiStat, a performance-management system that extended the concept of data awareness and flexibility beyond policing.

Through the first seven years of the program, CitiStat helped Baltimore save about $350 million by improving the efficiency of its departments.[2] The system was later adapted to work on a statewide level after Baltimore’s mayor, Martin O’Malley, became governor of Maryland in 2006. In today’s Maryland, data and analytics combine to make government responsive in ways that scarcely could have been imagined 20 years ago. For example, algorithms predict the risk that a certain parolee or probationer will reoffend. That person will then receive increased supervision.
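The report does not describe how Maryland’s recidivism algorithm actually works, but the basic logic of flagging a parolee or probationer for increased supervision can be sketched as a weighted risk score compared against a threshold. Everything below, including the factor names, weights, and threshold, is hypothetical and purely illustrative:

```python
# Illustrative sketch of risk-based supervision triage.
# Factor names, weights, and the threshold are invented for this example;
# they do not reflect Maryland's actual model.

def risk_score(record):
    """Combine weighted risk factors into a score clamped to [0, 1]."""
    weights = {
        "prior_offenses": 0.05,        # each prior offense adds risk
        "months_since_release": -0.01, # stability over time lowers risk
        "missed_checkins": 0.15,       # missed supervision contacts add risk
    }
    base = 0.2
    score = base + sum(weights[k] * record.get(k, 0) for k in weights)
    return min(max(score, 0.0), 1.0)

def needs_increased_supervision(record, threshold=0.5):
    """Flag a case for extra attention when its score crosses the threshold."""
    return risk_score(record) >= threshold

caseload = [
    {"id": "A", "prior_offenses": 1, "months_since_release": 24, "missed_checkins": 0},
    {"id": "B", "prior_offenses": 4, "months_since_release": 2, "missed_checkins": 2},
]
flagged = [r["id"] for r in caseload if needs_increased_supervision(r)]
print(flagged)  # → ['B']
```

The point of the sketch is the triage pattern itself: scarce supervision hours go first to the cases the model scores highest, which is how the state concentrates resources where they will have the most impact.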

Such techniques allow the state to utilize scarce resources where they will have the most impact. This management approach is now spreading to other municipalities, where it is usually associated with improved services and lower costs. For instance, in Louisville, Kentucky, under Mayor Greg Fischer, the city’s asthma-monitoring project uses GPS data from inhalers to identify and mitigate the environmental conditions that cause chronic asthma.

For all their differences, these and other responsive government efforts share several key traits. They all involve returning discretion to municipal employees, informing that discretion with vast amounts of data, and analyzing data in ways that help civil servants solve problems. At the same time, these new approaches give citizens a clear window into the operations of their government. Importantly, too, all these efforts have involved new management structures and have had the support of key executives determined to change “business as usual.” A look at three major cities shows how these themes play out in different ways in different municipalities.

Chicago: How Centralized Data Analytics Change the Essence of Government

When people mingle socially in a single place, they organize and synthesize information. Hence data analytics centers, where previously separated streams of data are collected in one place, are now among the most powerful drivers of innovation in government. Such centers do much to solve the biggest problem in government today: isolation of knowledge and resources from problems. These efforts drive change by mining data that power predictive analytics, which can then be delivered to the field through decision-support tools.

A leading example of the power of data centers is the work of Chicago’s Department of Innovation and Technology (DoIT), which is transforming that government’s data from an after-the-fact archive to a resource that guides proactive decisions. DoIT has the sort of control over data collection and analysis that used to belong to individual departments. The DoIT approach lets ordinary citizens access information that they could never see before: Chicago now has one of the largest and most robust open-data portals in the United States, with 592 data sets, ranging from the salary of every municipal employee to every reported incident of crime from 2001 to the present.[3] Chicago’s SmartData platform soon will be able to pull information from across the government, facilitate predictive analytics, and display results on an interface that lets users see data geographically in real time. That geographic interface—called WindyGrid—allows users to view geographic and temporal data mapped across the city.

WindyGrid was developed in early 2012 by then–chief data officer (CDO) Brett Goldstein, who envisioned a program that would be able to tell the story of a given location at a given time. Thus, by querying a specific location in the city—such as the intersection of State Street and Madison Street—a user would be able to see a history of 311 calls, 911 calls, crime reports, and other occurrences at State and Madison, as well as any activity occurring there in real time.

SmartData’s predictive analytics capability will enable city officials to observe relations among disparate sets of data, providing estimates of the probability of one type of event occurring after another. For example, the system revealed that concentrated rat populations tended to spike seven days after garbage-related complaints—thus allowing the city to mobilize forces accordingly when garbage-related complaints occur. These analytics would then be displayed on the WindyGrid interface, providing users with a powerful tool for coordinating incident management and workflow.
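The garbage-to-rats relationship described above amounts to a lagged comparison of two 311 data streams. A minimal sketch of that pairing follows, with invented daily counts and the seven-day lag from the text; a real analysis would compute a correlation over months of data rather than eyeball two days:

```python
from collections import Counter
from datetime import date, timedelta

# Hypothetical daily 311 counts; the dates and numbers are invented.
garbage = Counter({date(2014, 5, 1): 12, date(2014, 5, 2): 3})
rats = Counter({date(2014, 5, 8): 9, date(2014, 5, 9): 2})

LAG = timedelta(days=7)  # the seven-day lag the city observed

def paired_counts(garbage, rats, lag=LAG):
    """Pair each day's garbage-complaint count with the rat-sighting
    count `lag` days later; a Counter returns 0 for missing days."""
    return [(garbage[d], rats[d + lag]) for d in sorted(garbage)]

print(paired_counts(garbage, rats))  # → [(12, 9), (3, 2)]
```

High garbage-complaint days paired with high rat counts a week later are what let the city schedule baiting crews in advance instead of reacting after the infestation appears.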

Chicago’s innovations have succeeded in no small part because they had the strong support of Mayor Rahm Emanuel and the concentrated resources required to succeed. For instance, Chicago developed its own, in-house “center of excellence” around data analytics, hiring the first-ever CDO in Brett Goldstein, whose mission was to coordinate the city’s open-data policy and lead new data analytics initiatives. Throughout his tenure, Goldstein oversaw the expansion of Chicago’s data portal and devised innovative methods for putting the city’s data reserves to use. For example, he developed WindyGrid using open-source tools at a small fraction of the cost of traditional software procurement.

Under the leadership of CIO Brenna Berman, DoIT has been developing its internal capacity for innovation and product development. The department currently administers the growing WindyGrid application, is in the process of developing the SmartData Platform, and has plans to design other resource tools as well. By making Chicago’s data transparent, DoIT has been able to leverage the talents of the city’s growing community of “civic hackers,” techies who use their programming skills to develop meaningful apps in the public interest.

Furthermore, Chicago’s open-data policy enables academic participation in problem solving. DoIT has formed essential partnerships with the University of Chicago and with Carnegie Mellon University, both of whose considerable capabilities are a component in Chicago’s development as a center for data analytics.

Thus, through strong executive leadership, the cultivation of talent for in-house development, and key strategic partnerships, Chicago and DoIT have been able to set the stage for creating a new, data-driven policy framework that was unthinkable only several years ago.

Boston: Structures That Change the Citizen’s Relationship to Government

In the standard bureaucratic model of twentieth-century government, citizen activists often assume that they need to overwhelm local officials with information about how common problems are. When I served as New York’s deputy mayor, one community board leader told me: “I always assumed that the volume of complaints determined whether city officials would respond.”

But the availability of easily accessible 311 data, she went on, had increased her effectiveness as an advocate. Instead of looking for ways to make noise, she now looks for patterns in information. Once she finds such a pattern, she can ask the government to craft an appropriate solution. The availability of online data helps people figure out the causes of problems rather than assuming the answers without substantiation. As an example, the board leader told me about a series of accidents involving elderly pedestrians at a traffic intersection. Publicly available data about those mishaps led her to realize that they arose from insufficient time to cross the street to a pharmacy—meaning that timing changes in walk/wait signs could address the problem.

This new model of engagement cuts across agencies. Safety problems might result from policing inadequacies; but they might also result from the design of streets or parks, activities of nearby schools, schedules of delivery at nearby stores, or any other factor. So data liberate public officials to do what they do best: identify problems and develop solutions—not just in their own department, but by working with people across government.

Boston’s responsive government effort stresses this kind of engagement with the public. The city led the country on this front in 2010, when it created a specific mayoral office dedicated to service delivery and citizen engagement: the Mayor’s Office of New Urban Mechanics (MONUM). The office lies within the mayor’s suite (another example of strong support from the top), and its cochairs come from backgrounds mixing private-sector experience with public-sector interests. The MONUM incubates innovative projects and facilitates collaboration within city hall and with outside entities. “Civic innovation is civic engagement,” says Mitch Weiss, the former mayor’s chief of staff.[4] The unit’s name is a nod to the fact that former mayor Thomas Menino has long been called—both admiringly and critically—an “urban mechanic.” Yet no one knew the city and its neighborhoods better. The MONUM is thus an amalgam of new technology and old-fashioned attention to the details of urban life.

The MONUM illustrates how responsive government needs a “safe space” within city hall for experimentation, and it allows the city to be ambidextrous: to have city agencies focus on the traditional work of service delivery while allowing “citizen mechanics” to experiment on new initiatives. For example, Citizens Connect, a smartphone app, allows citizens to report problems throughout the city. The reports, usually accompanied by a photo taken with a smartphone, are tied to specific locations in the city. Citizens report issues such as illegally parked cars stopping traffic, rat infestations, building elements or trees that may fall onto or impede pedestrians on the sidewalk, graffiti, or trash left out; all these reports are then automatically mapped and sent to the government to sort out. Such complaints deal with issues touching the full range of departments—police, sanitation, traffic, housing, neighborhood services, and parks. When collected and organized in the appropriate databases, these reports can serve two purposes. First, they can serve as work orders for city workers. Second, they can help identify patterns: “hot-spots” where broader initiatives and policies need to be initiated. This has been so successful that the state funded a rollout of Commonwealth Connect, which has brought this functionality to more than 40 communities in Massachusetts.
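The hot-spot step described above can be sketched as a simple spatial aggregation: bucket each citizen report into a coarse grid cell and flag cells that accumulate enough reports to suggest a systemic problem rather than a one-off. The coordinates, issue labels, and thresholds below are invented for illustration and do not come from Citizens Connect itself:

```python
from collections import Counter

# Hypothetical Citizens Connect-style reports: (latitude, longitude, issue).
reports = [
    (42.3601, -71.0589, "pothole"),
    (42.3603, -71.0587, "pothole"),
    (42.3606, -71.0592, "graffiti"),
    (42.3702, -71.0302, "rodent"),
]

def hot_spots(reports, precision=2, min_reports=3):
    """Bucket reports into grid cells by rounding coordinates
    (precision=2 is roughly a 1-km cell), then return cells whose
    report count suggests a pattern rather than an isolated incident."""
    cells = Counter((round(lat, precision), round(lon, precision))
                    for lat, lon, _ in reports)
    return [cell for cell, n in cells.items() if n >= min_reports]

print(hot_spots(reports))  # → [(42.36, -71.06)]
```

Individual reports become work orders either way; the aggregation is what turns the same stream into the second product, a map of places where a broader policy response is warranted.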

Digital platforms allow government to quickly capture community sentiment and translate that information into action. Citizens generate data when they converse with their governments via social media, when they participate in online “ideation” forums, or when they use city apps to report problems or rate services that they receive from government departments. Government can curate and use these new streams of information to operate more effectively and bring in innovative ideas from the community.

Boston’s success with citizen engagement reflects a broader trend evident in many other municipalities. In 2012, for instance, Philadelphia formed its own Mayor’s Office of New Urban Mechanics through an executive order issued by Mayor Michael Nutter. Meanwhile, Mayor Vincent Gray of Washington, D.C., pursues innovation in a more direct, retail way: a platform for collecting and visualizing customer grades of city services and employees. Grade.DC.Gov solicits, mines, measures, and displays resident satisfaction and reviews, placing emphasis on the quality of government services as perceived by the citizens being served. Consumers of city services can use the Grade.DC.Gov website itself or provide input by texting, posting on blogs, or using social media services (such as Facebook, Yelp, and Twitter).[5] A daily digest helps city officials start their workdays with a list of comments from the previous 24 hours, allowing for quick remediation of problems. Since its launch, Grade.DC.Gov has seen departments improve their grades consistently—from “barely passing” to As and Bs. These innovations illustrate the power of new technology to improve responsiveness and communication between citizens and their government.

New York: Innovation Delivery Units Meet the Digital Scientist

All the themes of responsive government—data freed from separate silos and made available to all, powerful analytics, executive commitment, and the active quest for useful innovation—can be found in New York City’s recent efforts. You would expect a government led by former mayor Michael Bloomberg to care about data. Before going into politics, after all, Bloomberg made his fortune by selling terminals that displayed comprehensive, minute-by-minute data to stock-market traders. In fact, Bloomberg’s New York measured just about everything, from 311 call trends to tree plantings to detailed, agency-specific indicators.[6] Mayor Bloomberg’s famed “bullpen” in City Hall—designed to emulate the trading pits on Wall Street, where information and energy flow freely—contained a large screen that displayed performance data all day. The information flow encouraged officials to think not just about specific problems, but also about the connections among city departments.

Hundreds of indicators helped the mayor and his team focus on areas needing attention. Of course, streams of real-time data have their limitations: they tend to measure activity more than outcomes; they are retrospective looks at the past rather than predictions; and they often focus on a single agency. But such data streams are a critical first step toward a new and broader digital effort—one that combines and analyzes information across agencies to find and preempt problems. In New York, an ad hoc data-mining group (a self-described “skunk works” in City Hall) stepped up to fill the gap between raw data and useful knowledge, using analytics to identify answers to important questions.

When I served in New York City government (2010–11), we discussed three models of how to drive digital innovation. One model was to establish a digital analytics center within the mayor’s office, with links to each agency. Such a center would look for discoveries that would transform the way government works. As we conceived it, the new revenues discovered from underreported taxes, waste, fraud, and abuse would fund important new work in social services and other areas. At that time, though, our budget office could not precisely define the benchmarks that would have let us define substandard performance. The city did not create the formal analytics center. Instead, the government used a second, more ad hoc model, empowering a smart group of data scientists who used information to unlock discoveries across agencies.

Our third model came from social-service agencies that had concurrently started their own data exchange, HHS-Connect. Rather than seeking to connect siloed data sets by assembling them into a large data warehouse, this initiative took another approach. It linked disparate information sources via information portals, where once-separated processes were now combined. For example, because of HHS-Connect, a citizen applying for public benefits need not resubmit a birth certificate that had already been furnished to the city for another purpose. This makes dealing with the city easier for citizens, while helping social-services employees improve the quality of casework by giving them access to additional relevant data about customers.

An early example of New York’s move toward responsive government grew out of a fire on April 25, 2011, in a row house in the Bronx. The blaze killed a 12-year-old boy and his parents. More than 100 firefighters worked for an hour to contain the fire. The boy’s family was one of several who lived in illegal conversions in the row house. As early as 2008, neighbors had called the Police Department, the Department of Buildings, and the 311 phone line, complaining that the building housed illegal single-room occupancy units and did not, among other violations, have appropriate exits. So why hadn’t anyone responded?[7]

The building and fire commissioners, in fact, both had information about the building. Yet they also had an overload of data about structures all over the city, and the inspectors whom they sent to investigate complaints were rebuffed by the building’s tenants. None of their data sets were organized in a way that helped either department identify priorities or ways of pursuing those priorities with other city agencies. The firefighters were doing their best just to keep up—they were more likely to respond to crises than to complex, simmering problems.

But what if the fire department’s goal was not just to respond to crises but to predict what problems could lead to crises? What if fire officials could identify the causes of the fires—before the blazes struck? What if the city could predict where a fire is most likely to occur? This information would allow the city to strategically target its resources to inspect and evacuate the most dangerous dwellings. To pursue this goal, the city launched a pilot project guided by Mike Flowers, a lawyer and data analyst, which focused on fire risk and used metrics to identify the most dangerous of the illegally subdivided properties. The Flowers-led effort used the following metrics to identify the properties most at risk for fire:

  • Owners in financial distress (including those with foreclosures or tax liens)
  • Multiple illegal-conversion complaints
  • Multiple-family dwellings built before 1938, when a significant building-code revision took effect
  • Low-income/high-immigrant/low-employment neighborhoods

The special team consulted easily available data, such as agency reports, real-estate filings, and finance and tax information. The analysts found that dwellings with all four risk factors were over 40 times more likely to have a fire. Further screening allowed the team to target true high-risk illegal conversions, attaining a 40 percent vacate-order rate for single-room-occupancy conversion complaints, up from a historical 3 percent.[8] The project showed that, by intelligently shifting its focus from activity to cross-agency data aimed at specific outcomes, a city can deploy its resources much more effectively. And, as with most comprehensive review efforts, the study’s insights led to other discoveries. For example, what about the illegally subdivided buildings that had a low risk of fire? In a place like New York, which has a severe housing shortage, perhaps zoning or other regulatory changes should be made to allow such conversions. We can see from the New York experience that a traditional data-tracking and management tool, which simply evaluates the elapsed time to an inspection, would not produce the same result as identifying and mitigating risk. The next step in the evolution of structures to power data innovation is to turn the New York model into an institutional solution for a city.
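The team’s screening logic amounts to counting risk factors per building and ranking the complaint backlog by that count. A minimal sketch, with purely illustrative field names and data (not the city’s actual schema or model):

```python
def risk_factors(building):
    """Count how many of the four fire-risk indicators a building shows."""
    return sum([
        building.get("owner_in_financial_distress", False),    # foreclosures, tax liens
        building.get("illegal_conversion_complaints", 0) > 1,  # multiple complaints
        building.get("year_built", 9999) < 1938                # pre-code-revision,
            and building.get("multi_family", False),           #   multiple-family dwelling
        building.get("high_risk_neighborhood", False),         # demographic screen
    ])

def prioritize(buildings):
    """Order the complaint backlog so the riskiest buildings are inspected first."""
    return sorted(buildings, key=risk_factors, reverse=True)

# Hypothetical backlog: building A carries all four factors, building B none.
backlog = [
    {"id": "A", "owner_in_financial_distress": True,
     "illegal_conversion_complaints": 3, "year_built": 1922,
     "multi_family": True, "high_risk_neighborhood": True},
    {"id": "B", "illegal_conversion_complaints": 1, "year_built": 1985},
]
queue = prioritize(backlog)
# Building A is inspected before building B.
```

The point of the sketch is the shift it encodes: inspections are ordered by predicted risk rather than by complaint arrival time.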

Conclusion

The examples of responsive government described above have several important, common traits. First, they are all real: practical applications of innovative thinking and new technology, driven by civil servants committed to change. Second, they are all successful: these programs have found solutions to stubborn problems, replaced the heat of conflict with the light of information, and saved millions of dollars by freeing information from its former constraints. Third, they use technologies that consumers increasingly expect from private industry, such as Netflix, Amazon, Apple, and Google.

These traits illustrate some of the means and motivation for responsive government. An easy exchange of ideas and technology with the private sector helps provide government with the technological means. And now that consumers see how well the private sector uses data intelligently to meet their needs, they increasingly expect government to do the same. That is the political motivation: urban residents accustomed to intelligent use of data in the private sector will not long tolerate “data blindness” in city services. The era of responsive government is here to stay.


Chapter 2. Making Data Matter in Administrative Systems

For decades, well-managed cities have used data to enhance performance. Police Commissioner William Bratton launched the New York Police Department’s widely copied CompStat program in 1994. Through CompStat, Bratton and former mayor Rudolph Giuliani proved data’s effectiveness by gathering hot-spot data, which they connected to operations and strategy to dramatically drive down crime. This program inspired former Baltimore mayor Martin O’Malley’s CitiStat (launched in 2001), which, after O’Malley’s election to the governorship, was expanded to create Maryland’s StateStat in 2007. Yet these early examples stood out as exceptional and difficult. Analysts had no access to real-time information and even less insight into data from other agencies. The process of gathering data from multiple points and making sense of it in a paper-based analog world defied imagination, let alone execution.

Today, city officials sit atop loads of data that are expanding exponentially. In the 1990s, and even shortly after the turn of the century, the thought that every machine, vehicle, and clerical worker would incessantly generate important information remained a pipe dream. Even now, some ready sources of data fail to reach the attention of city leaders, a consequence of continual technology breakthroughs in every area of city services. In this chapter, I will discuss how the seemingly boring world of basic administrative systems can be, and is being, transformed to produce breathtaking advancements for municipal leaders and their residents. Nothing better illustrates this transformation than the work of former New York City finance commissioner David Frankel, who used data to produce much more with less.

Even those who oppose higher taxes generally do not object to increases in revenue derived from ensuring better compliance—more people paying what they owe. Incomplete enforcement favors those who abuse the system, shifting more of the burden to those who follow the rules. One of my first conversations with Frankel, when I was deputy mayor in New York, was about how to increase corporate taxpayer compliance. Simply increasing the total number of audits conducted seemed both unrealistic in terms of manpower and unfair to those taxpayers randomly targeted for an expensive audit, since more auditors meant more audits of those companies that had paid their taxes in full. Frankel and his auditing team, led by his deputy commissioner for audit, Michael Hyman, had a better idea: use analytics to increase the productivity of auditors reviewing companies thought to be underpaying their taxes. Using sophisticated data analytics, the commissioner instructed his department to look for patterns—identifying individuals who had businesses similar to others but who stood out as outliers on taxes paid. In so doing, Frankel and his team reduced the portion of audit cases closing without change from 37 percent to 22 percent over three years. This represents a 40 percent increase in productivity for the department, and it entirely spared the thousands of compliant companies that would otherwise have been catapulted into the audit process only to see no change on their returns.

To accomplish this task, the Department of Finance created the Data Intelligence Group (DIG), with the goal of identifying the most audit-worthy candidates. DIG creates models that suggest audit priorities. In their first three years, these models produced estimated assessments of $292 million, including $27 million from non-filers. The analytics encompass all business and individual income and excise taxes. Experts in the group access a broad variety of data, including city, state, and IRS tax information, as well as nontax information from other governmental agencies. These efforts include comparison of reported income and expense data across multiple sources, leading to the identification of under-reporters as well as non-filers. According to Frankel, “DIG’s modeling techniques, sophisticated statistical approaches, such as predictive modeling, and a multistep non-filer protocol are designed to avoid flagging businesses that are not required to file.”
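Peer-comparison targeting of this kind can be sketched in a few lines. The statistic below, a z-score on each firm’s ratio of tax paid to reported revenue within a peer group, is an assumed stand-in for illustration, not DIG’s actual methodology; the firms and figures are invented:

```python
import statistics

def audit_candidates(firms, threshold=-1.5):
    """Flag firms whose tax-to-revenue ratio falls far below their peers'."""
    ratios = [f["tax_paid"] / f["revenue"] for f in firms]
    mean = statistics.mean(ratios)
    stdev = statistics.stdev(ratios)
    return [
        f["id"]
        for f, r in zip(firms, ratios)
        if (r - mean) / stdev < threshold  # pays far less than similar businesses
    ]

# Five hypothetical firms in the same peer group; firm E is the outlier.
firms = [
    {"id": "A", "revenue": 1_000_000, "tax_paid": 50_000},
    {"id": "B", "revenue": 2_000_000, "tax_paid": 100_000},
    {"id": "C", "revenue": 500_000,   "tax_paid": 24_000},
    {"id": "D", "revenue": 750_000,   "tax_paid": 39_000},
    {"id": "E", "revenue": 1_200_000, "tax_paid": 12_000},
]
flagged = audit_candidates(firms)
```

The design choice matters: by auditing only statistical outliers, the department concentrates scarce auditor hours where underpayment is most likely and leaves compliant firms alone.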

In this way, the Finance Department more effectively captures underpaid taxes by focusing its finite audit resources on the targets most likely to have underpaid taxes. The Frankel lesson reveals the path to operational excellence—taking advantage of existing data, mining information from other sources, and assigning a well-trained team that asks the right questions and creatively applies algorithms.

For a century, the ideas of renowned mechanical engineer Frederick Winslow Taylor shaped efforts to improve organizational productivity. Taylor broke down assembly-line operations to identify the most efficient ways to perform a wide range of tasks. Workers received bonuses for reaching certain productivity milestones. Every “record” created a new standard for workers. But modern management requires more than performing a series of simple, specific tasks. The best managers develop detailed understandings of their employees’ unique skills—and they work with those employees to develop those skills over time.

American government has seen various basic human resource regimes. Under old political machines, winners gave out jobs and contracts to supporters. If patronage workers performed well, defenders of the system said, voters would reward them with reelection. If they failed, voters would turn them out of office. To wipe out the abuses of this system, civil services set detailed, legalistic rewards and punishments for government workers and contractors. Progressivism expanded this system by setting rewards and punishments for businesses. The welfare state did the same for recipients of government largesse.

But issues that require collaboration, negotiation, problem solving, and creativity are a different matter. Those issues require a broader skill set—and the autonomy to get the job done. Hiring results-oriented public employees and granting them a level of discretion previously thought to be unwise is now feasible; those employees, however, need tools and processes that allow them to find the best way to achieve their goals and to help others reach their goals as well. Those tools begin with data and new administrative systems, allowing data to be translated into results.

Human Resources

No area better defines the clash between new and old than public-sector human resources departments, where administrators apply outdated approaches to public administration—while beginning to use data to manage employee costs. The standard city HR department concentrates on job classification and testing regimes to attract individuals capable of performing carefully prescribed job descriptions. But this is occurring at a time when workers confront growing complexity and thus need tools that allow them to do more.

Managing Overtime and Scheduling

Many cities have created value by converting paper processes to online ones, including processes such as “onboarding” new employees, automated time and attendance tracking, and data-driven scheduling. The same issues now exist in HR as in other areas—for example, officials are increasingly sitting on huge amounts of underused data. Mining, harvesting, and analyzing this data will help mayors increase productivity by hiring better, promoting smarter, and managing scheduling and overtime more effectively.

The best workforce-management systems start with straightforward applications of automated systems. Houston, for example, uses automation to connect payroll, time, and attendance seamlessly, ensuring accurate records of time worked, medical leave, and unexcused absences. The Mississippi Department of Corrections uses its system to provide automated alerts to supervisors before overtime is worked. These digitally based systems dramatically increase accountability and set the stage for substantial productivity improvements. Accountability results from such newly available applications, including:

  • Staffing decisions that optimize open job or overtime assignments with individuals who have the right skills and training
  • Automated staffing based on business rules and employee preferences
  • Automated pay that uses analytics to connect contractual and overtime rules for pay calculations and to identify exceptions, making it easier for supervisors to pay attention to outliers
  • Simplified management of vacation, sick, FMLA, and flexible-time attendance

Don Pagel, a data-driven former deputy director in Houston, oversaw the implementation of a sophisticated time and attendance system that, coupled with analytics, will not only facilitate the move away from paper-based systems to track attendance, but will also eventually save $10 million annually in compensatory and unauthorized time. According to Pagel, not only did the city save personnel costs with an automated system; it was able to introduce more accountability. He was, as well, able to redeploy people who had previously entered data from paper.[9]

Moving from Accountability to Productivity

The state-administered highway system in Arkansas totals 16,416 miles, which ranks 12th in the country. The State Highway and Transportation Department utilizes about 6,000 pieces of equipment, including a vehicle fleet of about 2,000, and employs more than 3,000 full-time workers. Nearly a decade ago, the department developed and implemented an inventory system to manage its fleet, equipment, and machine parts. Yet time and activities tracking for staff was still a slow, manual process completed by data clerks, which caused even more expense when the results were merged (manually) into the fleet and inventory systems.

When mechanics at the department started to work on a piece of equipment, they used the fleet-management and inventory systems to log their activities online, and an HR manual system to log time on paper, which was then processed by data stewards into the HR system. Once the stewards painstakingly entered the data, others had to merge the data with cost information about parts and supplies utilized (e.g., oil change for a tractor, versus oil change for a dump truck, versus an engine overhaul). The department sat atop data in two areas, work-order management and time and attendance, that produced true value only when analyzed together.

After the existing systems showed promising savings, department managers made a clear case for updating their time and activities system to a technological platform that would more efficiently tap the benefits of analytics. As a result, the department’s leadership looked for a system that would automate the workforce-management process, accurately track employee time, streamline payroll, and give employees access to their regular and leave-time information.

Under the department’s new time and activities system, management can now better plan activities and align budgets with real costs, while also addressing staffing, maintenance, equipment, and other needs in real time, instead of after the fact. Further, Arkansas not only retired clerical roles no longer needed for paper processing but redeployed other workers, who now focus more on managing knowledge than on mind-numbing data processing and time entry. Other benefits include:

  • Improved efficiencies through strategic deployment of resources (decreasing redundant maintenance trips to the same area)
  • Consistency and standardization of data collection to improve accuracy and better analysis of payroll calculations
  • More discipline and better activity tracking for grant management
  • Better tracking of repairs based on actual activities and the ability to use maintenance history to predict future maintenance needs
  • Ability to identify trends and flag problems

As governments continue to invest in more advanced IT tools, they will show savings from effective workforce-management processes. One can anticipate an empowered workforce compensated not for activities, like oil changes or tire and fan-belt replacements, but instead for keeping the fleet operable, with data used to retire obsolete equipment and extend the life of the rest through preventive maintenance.

Employee Empowerment

Data produces results when effectively organized and delivered to the people who do the actual work. As important as it might be for managers to use data to enhance performance and reduce overtime, the true value results when those who do the public’s work avail themselves of this new treasure trove of information. Powered by digital tools, street-level workers now can pierce hierarchical and jurisdictional barriers to find answers to public service requests. In these chapters, we advocate a new approach to public employment that allows public employees to control more of their processes and accept more responsibility for outcomes.

This movement will take time. In the interim, officials can rely more on digital means to extract innovative suggestions from within the bureaucracy. In the mechanically driven work-order systems of the past, government workers would be dispatched to fill a pothole. If, on the way, they passed another pothole on the same block (or worse, if the homeowner pointed out another pothole to the workers), they would decline to change their instructions. After all, following the rules, not following common sense, drove work. And, of course, city officials did not trust their workers to make these decisions—out of fear, perhaps, that some inappropriate factor would influence them. Now, however, city workers in Boston more effectively manage work orders via mobile devices, which allow them to open new cases themselves. The mobile app enables before-and-after pictures of work, which are useful for management, promote transparency, and engage neighborhood residents who can see the work and, in some circumstances, the picture of the person doing the work.

Smarter Management

Analytics allow managers insights into labor productivity that were previously impossible. For example, the U.S. Office of Personnel Management (OPM) is taking steps to use data to change the way it manages hiring, training, and vacancy planning. OPM aspires to use data to facilitate better-informed hiring decisions by looking at mission-critical occupational areas, accession planning, retirement bubbles, and the effect of alternative incentives for early and voluntary retirements or retention, and more.

In HR, as in other areas, data capabilities no longer should, or need to, reside in some remote IT workshop. The management insights from data are most useful when available to a broad array of managers. Business intelligence tools provide such broad access, and OPM has encouraged chief human capital officers to avail themselves of the tools. As emphasized by OPM, a data warehouse accompanied by business intelligence tools allows ad hoc queries “on a number of subject areas, including (but not limited to): age, agency, employee, payroll, performance appraisal, personnel action, position, and retirement.” Knowing how to manage future vacancies will help government perform better. However, discoveries about other matters that affect hiring, promotion, and performance will not take root if trapped inside an archaic system that neither allows true discretion, nor rewards excellence, while also failing to hold employees accountable for poor performance.

Budgeting

In this chapter, I discuss how city leaders can learn from easily available data. Such insights often come from employees discovering lessons from the data. Yet these lessons also come when citizens can see data not previously available. Municipal budgeting has long been an arcane sport—one designed to control departments and impress taxpayers and rating agencies, but with little direct relationship to actual management, and even less to do with transparency and participation. Many municipalities have taken the step of “opening the budget” by putting it online. This transparency effort encourages civic participation, as well as more accountable government, but only when city hall provides usable information and presents it in an easily consumable way.

Mayors express policy through the budget, which, in turn, drives government by defining priorities. Budget processes across the country tend to be fairly complicated, political, and difficult to communicate. Citizens gain confidence and force better results when open government allows them to be innovators as well as watchdogs. In recent years, several city governments increased the level of transparency in their public budget process, opening their checkbooks and building budget portals to share data online.

Palo Alto, utilizing software from a Silicon Valley start-up, OpenGov, launched an easy-to-use portal with visualization tools that provide citizens with more information and flexibility. For example, residents may search and download detailed information about government budgets, contracts, spending, subsidies, and tax expenditures across most, if not all, government functions. These portals offer a range of search-and-sort functions that allow residents to navigate complex expenditure data with a single click of the mouse, gaining access to checkbook-level spending information by agency, recipient, category, or purchasing office.

With only a few more clicks, taxpayers can also view expense and revenue items by department, type of account, and project; digging slightly deeper, users can find data details relating to their neighborhood’s police, library, or other city service. Users can find out how much each function costs, down to budgeted technology, workforce, or “bricks and mortar” expenses. The portal also includes budget history, allowing for historical benchmarks and comparisons of budgeted accounts for the current year against actual expenses from previous years.

In New York, four city council members launched a pilot that allows their constituents to democratically allocate $6 million in discretionary capital funds to community projects. More than 8,000 people participated in these four districts, funding everything from public school bathroom renovations to pedestrian walking-path repairs.[10] The program is now entering its third year, and nine council members have endorsed participatory budgeting in their districts. Between September 2013 and April 2014, residents will directly decide how to spend $12 million in capital funds.[11] The Participatory Budgeting Project, a national advocacy group, predicts even more involvement during the NYC 2014–15 cycle, with as many as 21 of the 51 council members committing their districts, and $25 million in capital funds at stake.[12]

But participatory budgeting has yet to make the digital leap to include online deliberation and voting. The current process is a time-intensive commitment that involves a long series of face-to-face meetings and committee sessions. Project proposals tend to be low-tech, with little presence in online or social media platforms. Votes are counted on paper ballots. Albeit on a much smaller scale, the mayor of Westfield, Indiana, managed the digital transition by using Facebook to invite his community into the nominating and polling process, gathering advice on priority projects for a new capital fund. Easy-to-use open-data budget portals that focus on citizen engagement, transparency, and accountability can save cities money through more efficient government operations, better communication, and citizen engagement programs, as well as more competitive contracting and lower risk of fraud and abuse within agencies.

Managing Performance

The CompStat and CitiStat approaches now serve as the foundation for transformative breakthroughs, when coupled with the examination of cross-agency data. For example, Louisville’s LouieStat uses data analytics to move the stat programs one step further. Led by a chief of performance improvement who combines a major open-data effort with a data-driven approach to identifying the root causes of problems, the program lets Louisville tackle those problems, rather than simply manage the efficiency of reacting after problems arise. Before LouieStat, for example, more than 300 inaccurate inmate fingerprints were returned each month. Agency staff initially thought that the problem stemmed from the hardware and software used in processing the fingerprints. But after looking at the shifts with the highest return rates, the corrections department realized that much of its staff had not received formal training in fingerprinting. Now every shift has at least one trained technician, the city is training more, and the return rate has dropped from more than 300 a month, to fewer than ten.[13]
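The corrections department’s diagnosis is a textbook case of grouping an outcome by an operational dimension before assuming a technical fault. A minimal sketch of that analysis, with invented record fields and toy data:

```python
from collections import Counter

def return_rate_by_shift(records):
    """Compute the fraction of fingerprint submissions rejected, per shift."""
    submitted, rejected = Counter(), Counter()
    for r in records:
        submitted[r["shift"]] += 1
        if r["rejected"]:
            rejected[r["shift"]] += 1
    return {shift: rejected[shift] / submitted[shift] for shift in submitted}

# Toy data: rejections cluster on the night shift, pointing to a training
# gap on that shift rather than a citywide hardware fault.
records = (
    [{"shift": "day", "rejected": False}] * 9
    + [{"shift": "day", "rejected": True}]
    + [{"shift": "night", "rejected": True}] * 4
    + [{"shift": "night", "rejected": False}] * 6
)
rates = return_rate_by_shift(records)
```

Had the rates been uniform across shifts, the hardware hypothesis would have survived; because they were not, the remedy was training, not new equipment.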

The Elements of Success

There for the taking: huge increases in operational excellence through the use of readily available data. We are on the cusp of a breathtaking set of government reforms that will unleash far more productive governance by combining the talents of an empowered public worker with a more engaged citizen. New digital tools flatten bureaucracies while enabling managers to know much more, in real time, about the productivity, effectiveness, and fairness of their employees. The reformers of the last century had to dramatically restrict discretion to produce accountability. Now we are on the verge of a new definition of accountability that furthers responsiveness.

The elements of success will include not just empowerment, but also the use of predictive analytics to answer new and serious questions about the root causes of problems. These breakthroughs involve more than technology: they also involve building a structure that encourages interagency capacity. Applying these fundamentals will result in much success, moving government’s use of data beyond simple open-data transparency and participation, toward integrated analytics.


Chapter 3. Infrastructure Management

When I started writing about city services 15 years ago, infrastructure innovations included better pavement processes, design-build approaches, and improved labor practices and productivity. All these continue, but they have partly been supplanted by a new approach to infrastructure: asset optimization.

Today’s innovative city leaders must maintain and build infrastructure, but such things as moving people and traffic, and processing water, now involve analyzing massive amounts of information to derive the best use for critical public assets. In this chapter, I examine digital breakthroughs in repairing equipment before it breaks, moving people more effectively with analytics, and using smart grids to conserve energy.

Extending Asset Life Through Data

Data pours out from all over a city—from residents, smartphones, repair orders, and assets themselves. Government technicians see data from flow meters embedded in sewer and water pipes, cameras that monitor bridges, and sensors embedded in streets. Administrators now face a new challenge: how to integrate the data to expand the life and capacity of municipal infrastructure. How does one interpret information to know when, say, a bridge will ice up or begin to develop expensive stress fractures, or when a roadbed will deteriorate so badly that it needs total resurfacing?

In 2007, an I-35W bridge in downtown Minneapolis collapsed during rush hour, dropping 111 vehicles into the Mississippi River. Thirteen people were killed, and 145 injured. When Minnesota officials built a replacement bridge, they ensured its ongoing safety by embedding sensors throughout the structure. During the bridge’s construction, concrete maturity meters and thermal sensors allowed the contractor to produce higher-quality concrete. The bridge’s 323 sensors and gauges now track structural health over time and enable comprehensive, ongoing safety management by the Minnesota Department of Transportation, the Federal Highway Administration, and the University of Minnesota. In addition, temperature, humidity, and wind sensors along the bridge trigger the preemptive spraying of deicing chemicals. The bridge has additional sensors to monitor traffic.[14]

Smart Water

Faced with enormously expensive federal mandates and continuing local pressures, officials across the country, forced to raise water rates almost annually, have increasingly turned to sophisticated approaches to maintain and enhance their systems. Often in partnership with leading private providers, these officials use technology to make more precise decisions about how to maintain systems, prevent problems before they occur, and expand capacity.

For example, the very well-run Milwaukee Metropolitan Sewerage District (MMSD) prioritizes maintenance work and reduces the risk to critical assets, through data collection and a ranking system that considers the asset’s criticality. The district uses extensive remote sensor information, including sewer levels, gate positions, water quality, weather (rain and wind), and pump-operation status to better maintain valuable assets. MMSD can now anticipate serious problems, and by integrating the sensor information with a sophisticated maintenance-scheduling system, officials adjust maintenance to better protect MMSD’s collection-treatment assets. The mining and evaluation of data from multiple points moves the department from organizing work orders on problems (after they become serious enough to visually manifest themselves) to instead targeting resources most effectively.

Smaller communities can also avail themselves of such an approach. Gresham, Oregon, used data-driven analysis to recognize that a blower pressurizing gas in a methane power cogeneration plant posed the plant’s greatest maintenance risk, which allowed water officials to develop an optimal maintenance plan for the asset. The more that data can be organized and mined—including bringing information together from different programs—the better the results. In Vancouver, Washington, each work order that the utility generates now has accurate and reliable asset-failure information associated with it. Establishing the framework for connecting these data points enables utility staff to easily and quickly identify assets on the brink of failure; thus, thousands of work orders created during the year form an information-based case for additional investment and asset protection.

Unsafe or overweight vehicles impose a cost on fellow motorists, emergency responders, and taxpayers. Enforcement traditionally relies on two imprecise methods: visual checks by safety officers driving the roadways; and inspections at weigh stations (the latter imposes costs on good operators, too). New Mexico’s Smart Roadside Inspection System identifies high-risk trucks, without impeding the flow of commerce, by integrating roadside imaging systems with multiple data networks. Cameras and character-recognition software capture the vehicle’s license and DOT number. A thermal detector images each passing vehicle to flag unsafe equipment. Officials then compare the information with an array of national and state criminal-justice, tax, registration, and motor-vehicle databases to screen for noncompliant operators. In 2011, more than 3.5 million alerts were generated in the following categories: 54 percent tax, 44 percent safety, 1.5 percent overweight, and 0.5 percent crime-related.[15] By automatically identifying high-risk trucks from the roadside, New Mexico’s approach allows inspectors to focus their resources on trucks that pose the most risk to transportation safety and security.

Using Data to Increase Capacity

Few items cost cities and states more than highway pavement. Today’s breakthrough challenge involves how to use data to drive more efficient use of public goods. For example, NYC collects increasing amounts of traffic data and incorporates it into traffic management. The degree of change that is possible became clear to me in a discussion with former New York transportation commissioner Janette Sadik-Khan, as she showed me her traffic-control room. Inside this room, filled with an impressive array of video and digital monitors, city engineers worked to make dynamic, real-time adjustments to traffic lights, thereby improving traffic flow. Midtown in Motion, an innovation of NYC DOT, monitors systemwide traffic patterns to set stoplights and move traffic. When the program launched in 2011, city officials used more than 100 sensors at 23 intersections to find “congestion choke points,” helping reduce delays. According to NYC DOT, the program reduced traffic times in the areas studied by 10 percent or more (with the city planning to expand the program as a result).[16]

Each year, more than 200 million vehicles clog the New Jersey Turnpike. But the Turnpike Authority has begun to build out the architecture for predicting traffic jams before they occur. Contractors are installing “puck” sensors in the pavement at one- or two-mile intervals on the turnpike and every four miles on the Garden State Parkway. When complete, the sensors will detect traffic volume, lane occupancy, and speed in real time. Algorithms applied to this data will predict congestion; accordingly, traffic managers will route cars to the turnpike’s inner or outer roadway to prevent congestion. This information will also allow transportation officials to avoid secondary accidents by signaling slowdowns to truck drivers. Barry Pelletteri, CIO of the New Jersey Turnpike Authority, emphasized that to successfully manage a high-speed roadway, “you need the detection sensors, you need the strong network, you need the algorithm, and you need the software.” On the New Jersey Turnpike, you have “too many vehicles, too much speed, and too much roadway. We can’t keep up with it all unless we can anticipate, and the only way to do that is with the data.”[17]
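Pelletteri's four ingredients (sensors, network, algorithm, software) can be illustrated with a toy version of the algorithm layer. The class, window size, and thresholds below are invented for illustration; the Turnpike Authority's actual prediction model is not public:

```python
# Illustrative sketch of a congestion-prediction rule fed by roadway
# "puck" sensors (volume, lane occupancy, speed). All thresholds and
# names here are hypothetical, not the Turnpike Authority's model.

from collections import deque

class CongestionPredictor:
    def __init__(self, window=5, speed_floor=45.0, occupancy_ceiling=0.35):
        self.speeds = deque(maxlen=window)       # recent speed readings, mph
        self.occupancies = deque(maxlen=window)  # recent lane-occupancy fractions
        self.speed_floor = speed_floor
        self.occupancy_ceiling = occupancy_ceiling

    def update(self, speed_mph, occupancy):
        """Record one sensor reading."""
        self.speeds.append(speed_mph)
        self.occupancies.append(occupancy)

    def congestion_likely(self):
        """Flag congestion when average speed falls while occupancy climbs."""
        if len(self.speeds) < self.speeds.maxlen:
            return False  # not enough history yet
        avg_speed = sum(self.speeds) / len(self.speeds)
        avg_occ = sum(self.occupancies) / len(self.occupancies)
        return avg_speed < self.speed_floor and avg_occ > self.occupancy_ceiling
```

In a real deployment the prediction would feed the routing decision the text describes: directing cars to the inner or outer roadway, and signaling slowdowns to truck drivers, before the jam forms.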

Adjusting Consumption Behaviors Through Pricing

Harvesting data to better maintain infrastructure, or to help officials manage flow as suggested above, drives great value. Another approach involves organizing data and connecting the information to residents to change user behaviors.

Before dynamic pricing began in Virginia in November 2012, the state’s average commute time was 26.5 minutes, the seventh-highest in the country. Traffic congestion cost residents the equivalent of one full week a year. The state concluded that about 10 percent of all morning rush-hour vehicles are occupied by people on nonessential trips, such as shopping or personal errands. In the afternoon, that figure increased to almost 30 percent. As a result, Virginia began using a variety of demand-based tactics, including tolls on the Downtown-Midtown Tunnel that vary by time of day, as well as High-Occupancy/Toll (HOT) lanes on I-495, outside Washington, D.C., that can be used by high-occupancy vehicles or those that pay a variable toll. Pricing the use of valuable and limited resources such as faster lanes can, in effect, increase the supply of critical resources such as roadways. The system piggybacks off existing EZ-Pass transponders and toll-collection technology—making implementation easier and reducing the costs of setting up monitoring and billing systems.[18]

These initiatives present difficult political obstacles since, by definition, they create winners and apparent losers. The obstacles grow when tolls rise to manage congestion but the revenue benefits a different class of commuters. For example, when former New York mayor Michael Bloomberg undertook his congestion-pricing initiative, his goal was not only to ease congestion but also to raise money for mass transit. To a degree, this decision created diverging interests between those paying the tolls and the Metropolitan Transportation Authority riders who would reap the financial benefits. And even MTA bus and subway riders remained unconvinced that the extra money would enhance services. Thus, a sound proposal to use data to affect consumption ran into a buzz saw of more parochial interests.

Good data needs to be presented inside a framework that produces allies. For example, Virginia’s dynamic pricing program is producing less revenue than anticipated because motorists are driving less, or at different times, instead of paying to avoid congested roadways during high-demand hours. This result could, of course, be characterized as a success if judged from a congestion-reduction, rather than a revenue, perspective. Similarly, London’s decision to include environmental mitigation as a goal in its congestion-pricing efforts may not reduce emissions as efficiently as a more exact, granular system that varies charges based on cars’ true environmental impact; London’s current scheme instead exempts ultralow-emissions vehicles, plug-in hybrids, and electric cars. Combining multiple objectives makes it much harder to use the data to prove or disprove value. London’s congestion-charge revenues, while limited, have been leveraged by Transport for London, the quasi-governmental entity that controls the roads, to issue bonds for road and transit improvement planned to total £3.1 billion. If this bond issue is successful, it may be one of the more significant positive impacts of the congestion charge, providing much-needed cash to develop the city while tax revenues and government expenditures continue to shrink.

Milan’s Area C congestion charge is levied primarily as an environmental initiative, with four levels of payments based on a car’s emissions. Over time, residents purchased more lower-emissions vehicles to capture free entrance to downtown, which again increased congestion, if not pollution. The Area C program has since emphasized congestion control by doing away with the scaled payments for all except hybrid and electric vehicles.[19] The fact that these cities continue to modify charges to induce certain outcomes could be thought of as a success because it demonstrates how pricing and data can be used to achieve public goals.

Conserving Resources Through Better Data

Across the United States, water and electric utilities are replacing older meters that require monthly visits from meter readers with online ones that produce a continuing stream of important data. Dubuque, Iowa’s “smart city” initiative, Smarter Sustainable Dubuque, began in 2009 as a partnership with IBM. The program expanded from Smarter Water to include Smarter Electricity, Smarter Travel, Smarter Discards, and Smarter Health and Wellness. The water program equipped 300 volunteer households with smart water meters and access to an online dashboard, coupled with leak-detection monitoring, community education, and incentive programs. In the first year, participating households used 6.6 percent less water and detected eight times as many leaks. The next effort, Smarter Electricity, involved 1,200 households and used similar incentive programs to promote a reduction in energy use. The pilot program resulted in 4 percent less monthly energy use.[20]
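The leak-detection monitoring mentioned above commonly rests on a simple observation: a household with no leaks reaches zero water flow at some point in a typical day, while a leaky one never does. The rule below is an illustrative sketch of that idea, not Dubuque's actual algorithm:

```python
# Illustrative smart-meter leak check (hypothetical rule, not Dubuque's
# production system): if the lowest hourly flow across a full day never
# reaches zero, something is probably running continuously.

def likely_leak(hourly_gallons, tolerance=0.0):
    """Given 24 hourly readings, return True if water never stops flowing."""
    if len(hourly_gallons) != 24:
        raise ValueError("expected 24 hourly readings")
    return min(hourly_gallons) > tolerance

# A normal household: bursts of use, but several zero-flow hours.
normal = [0.0] * 6 + [12, 3, 1, 0.0, 0.0, 2] + [0.0] * 12
# Same usage plus a constant 0.8 gal/hr trickle from a leaky fixture.
leaky = [g + 0.8 for g in normal]
```

Surfacing this flag on a customer dashboard is one plausible reason pilot households found eight times as many leaks: the meter sees the overnight trickle that the homeowner never would.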

A different form of integrating data and consumption occurs when consumers grant a provider the right to monitor their data and make decisions on their behalf. Some 230,000 Duke Energy households participate in a program that allows the utility to remotely switch off their air conditioners; each receives $25–$35 annually for participating.[21] Duke reduces power only incrementally, and for short periods of time, to save energy without causing such disruption that participants would opt out of the program.

More information allows residents to make better decisions. One key opportunity for public officials to produce value occurs when they find new ways to generate consumer-usable digital information. In 2010 and 2011, the San Francisco Municipal Transportation Agency (SFMTA), with federal and private partners, installed parking sensors in more than 8,000 on-street parking spaces.[22] The innovative pilot program, called SFpark, makes parking easier, public transit faster, biking and walking safer, and commercial areas more vibrant. Through sensors, meters, and a demand-responsive pricing model, the pilot lets drivers know where they can find open parking spaces at meters and in garages in seven neighborhoods, through an app available via the web or smartphone. The SFMTA uses occupancy data from the sensors to adjust parking prices, with the goals of having at least one free parking space available on each block and ensuring that garages never fill. For example, on August 11, 2013, rates decreased by $0.25 or $0.50, depending on the location and occupancy rate of the parking space, during 18 percent of metered hours; rates were unchanged for 62 percent of metered hours; and rates increased by $0.25 during 20 percent of metered hours.[23] Parking rates may vary by block, time of day, or day of the week, and rates increase by either $5 or $7 per hour during special events around the baseball stadium. Although the evaluation of the pilot had not been released by press time, extensive online data is available on the meter-rate adjustments, including the pilot area, street, block, time of day, previous rate, and current rate.
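The demand-responsive adjustment has a simple shape: raise rates on blocks that stay full, lower them on blocks where spaces sit empty. The sketch below captures that general shape with simplified occupancy bands and price limits that are illustrative rather than SFpark's exact published policy:

```python
# Sketch of a demand-responsive meter adjustment in the spirit of
# SFpark. The occupancy bands, step sizes, floor, and ceiling are
# simplified illustrations, not the SFMTA's exact published values.

def adjust_rate(current_rate, occupancy, floor=0.25, ceiling=6.00):
    """Return the new hourly meter rate for a block, given its average occupancy."""
    if occupancy >= 0.80:
        delta = 0.25        # block nearly always full: raise the price
    elif occupancy >= 0.60:
        delta = 0.00        # target range: leave the rate alone
    elif occupancy >= 0.30:
        delta = -0.25       # underused: lower the price
    else:
        delta = -0.50       # mostly empty: lower the price faster
    return round(min(ceiling, max(floor, current_rate + delta)), 2)
```

Running this rule block by block at each adjustment period reproduces the mixed outcome the text reports for August 11, 2013: some blocks rise, most hold steady, and underused blocks get cheaper.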

SFMTA reports that an unprecedented data set is being established, with parking (collected through sensors, meters, and citations), garage, municipal (travel time and transit-vehicle data), parking tax, and survey data. According to Sonali Bose, the SFMTA’s chief financial officer, “After evaluating the SFpark pilot project, the SFMTA will use lessons learned to develop a proposal for expanding the SFpark approach to the rest of San Francisco’s metered parking and city-owned parking garages. We expect that expanding demand-responsive pricing for parking will make it easier for drivers to find parking and improve the quality of life without any loss of parking revenues.”[24] With all policies and data used to make rate changes available online, and with source code for the SFpark app and map published in July 2013, this effort is truly transparent and has transformative potential.

Learning from Data Generated by Residents

Cities can also learn from another set of sensors: the customers of their transit and street systems. The Oyster card is a plastic smartcard used to pay fares on most forms of transportation in London. Customers using Oyster on the London Underground tap their cards upon entry and exit of the system to pay their fares and open the gates. More than 80 percent of public transport trips in the London network use Oyster, providing the public operator of the system, Transport for London (TfL), with millions of data points every day that can be used to understand the size and shape of customer demand on the transport network. Although users may register their Oyster cards to protect them against loss or theft, TfL encrypts card numbers as part of a process to make the travel data anonymous before analysis. Lauren Sager Weinstein, head of analytics for TfL’s Customer Experience Team, described Oyster as a powerful tool: “Before Oyster was introduced, TfL was reliant solely upon surveys asking passengers to report how they traveled. Surveys, by nature, have limited sample sizes and have a limit in terms of how frequently they can be undertaken. Oyster is a more cost-effective and powerful tool for understanding TfL’s customers and responding to their needs.”[25]
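TfL has not published the details of its encryption step, but a standard way to make card numbers analyzable yet anonymous is keyed hashing: the same card always maps to the same pseudonym, so a card's journeys can still be linked for analysis, while the real number cannot be recovered without the secret key. A minimal sketch, in which the key and function names are hypothetical:

```python
# Illustrative pseudonymization of fare-card IDs via keyed hashing
# (HMAC-SHA256). This is a generic technique, not TfL's documented
# process; the key and names below are hypothetical.

import hashlib
import hmac

SECRET_KEY = b"rotate-me-regularly"  # hypothetical key, stored apart from the data

def pseudonymize(card_id: str) -> str:
    """Map a card ID to a stable 16-hex-character pseudonym."""
    return hmac.new(SECRET_KEY, card_id.encode(), hashlib.sha256).hexdigest()[:16]
```

The design point: a plain (unkeyed) hash of a short card number could be reversed by brute force, so the secret key is what actually protects riders. Analysts downstream see only pseudonyms, yet can still count journeys per card and reconstruct origin-destination pairs.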

In addition to providing an efficient way to count the number of customers across the network, Oyster data provides specific journey times for particular origin and destination pairs. “Through Oyster data, TfL can understand the ranges of times that customers take to make particular journeys across the network,” said Sager Weinstein. “This allows TfL to measure reliability of services over time and to measure the impact of the introduction of new services and timetable improvements.” Oyster data also helped TfL develop models to identify stations that would be hot-spots for congestion ahead of the 2012 London Olympics. Based on these findings, TfL rolled out a public messaging campaign encouraging customers to avoid certain stations at peak hours and also supplied extra trains to relieve anticipated congestion at targeted stations and lines. Subsequent analysis by TfL indicated that although average demand at stations serving Olympic venues surged 83.9 percent during the games (when the London Underground carried record-breaking numbers of customers), the transport network was able to accommodate this spike through the combination of increased capacity and targeted messaging about hot-spots. Oyster data also showed a decrease in “background” travel demand at stations typically serving commuters, with card entries and exits down 13 percent during the Olympics.[26]

Boston secures perhaps more infrastructure information from its citizens than any other city. Promoted by a special mayoral office and initiative called the New Urban Mechanics, the city developed a smartphone app called Citizens Connect, which allows citizens to report infrastructure and other public-works needs (potholes, graffiti, etc.). Geo-tagged pictures provide more accurate information to city crews, or even to neighbors interested in common problems. Reporting by app provides a digital foundation for analysis. As of December 2012, Citizens Connect had been used to resolve more than 35,000 issues.[27]

For a look at the evolving possibilities, observers can turn to Santander, Spain, which outfitted its community with more than 15,000 sensors that measure everything from air quality and noise level to light intensity and traffic, with the ultimate goal of improving city services and quality of life. The SmartSantander initiative has transformed the city into an experimental platform to research applications and services for use in “smart cities” of the future. In 2010, Luis Muñoz, a University of Cantabria IT professor, secured, jointly with Telefónica, a European Commission grant of 8 million euros to fund the initiative. The regional government of Cantabria provided an additional 500,000 euros to pay for half the cost of purchasing the sensors (the other half of purchase and deployment costs was funded by the E.C. grant). Four major fixed-sensor initiatives now include environmental monitoring, traffic, energy, and irrigation pilots. The energy pilot started with an intelligent lighting program that cuts lamp voltage by 30 percent when no nearby movement is detected. On a rainy evening, when fewer pedestrians and cyclists use city parks, the intelligent lighting system can reduce electricity use by 30–40 percent. Muñoz predicts that expanding this pilot could reduce city lighting costs by 20–25 percent. The next two pilots, scheduled to start in the coming year, involve water and waste. For the latter, waste bins on the city streets will be equipped with sensors to measure fill level. The city will monitor this data to plan routes for garbage and recycling trucks, reducing trips, traffic, and emissions.[28]

Increasing demands on urban infrastructure require city officials to find new ways both to keep it in good repair and to engage citizens in ways that encourage efficient use. City officials possess enormous amounts of data that they can use to make better decisions about how to maintain and extend the life and use of infrastructure. Data, especially information from inside government combined with data harvested from interested residents, creates an opportunity to use scarce resources far more efficiently. The initial successes highlighted here will soon become mainstream, giving commuters and taxpayers new tools for changing the behavior of consumers and public workers alike.


Chapter 4. Predictive Tools for Public Safety

For decades, criminal-justice officials, guided by the results of good research, have used data to drive performance. As a district attorney 30 years ago in Indiana, I used tools developed by the Department of Justice to identify “career criminals,” assigning them a score that would affect the severity of their charge and sentence. In retrospect, this was quite crude: prosecutors and police going through old paper-based criminal-history records, assigning numerical values to certain events, and ignoring other difficult-to-obtain data sources.

Now, with diminished finances but better technology resources, public safety officials increasingly rely on better-targeted interventions to do more with less. According to a 2011 COPS Office report, between layoffs and attrition, law enforcement departments reduced headcount by 40,000 (including reductions in more than half of America’s police departments) in the preceding year.[29]

In the face of mounting challenges, local officials turn to predictive analytics to enhance public safety efforts intelligently and judiciously. From policing and probation to disaster response and performance measurement, the capacity to track, model, and predict high-risk/high-need areas and constituencies has been an invaluable tool in public-sector management.

Predictive Policing

Traditional hot-spot policing has been around for two decades. Since the advent of CompStat in New York in 1994, police departments have used statistical analysis to predict criminal patterns. Today, public safety agencies are using sophisticated data mining to produce insights and focus on underlying causes. Instead of merely responding to incidents, new technology helps analysts anticipate patterns and act before crime happens. On the front lines of the effort, California public safety officials in Santa Cruz and Los Angeles are applying what they learned from predicting earthquake aftershocks to stopping crime.

In 2010, the Santa Cruz Police Department (SCPD) had 20 percent fewer staff members than in 2000 but received 30 percent more calls.[30] Tight budgets and a higher incidence of property crime led the agency to think creatively about deploying resources effectively. The SCPD’s public information officer turned to George Mohler, an applied mathematician and assistant professor at Santa Clara University, who developed a method for predicting future crime locations based on a mathematical formula used to predict post-earthquake tremors. Mohler theorized that “criminals want to replicate their successes, they go back to similar locations, they repeat their crimes—it’s almost identical to how aftershocks roll out after earthquakes, following predictable fault lines and timetables.” Mohler and several associates formed PredPol (short for “predictive policing”), which provides crime-prediction software to 11 municipalities in the United States and the United Kingdom.[31]

Rather than simply highlighting high-crime areas, PredPol’s system intelligently locates possible future crime locations. In Santa Cruz, the data scientists started with eight years of verified crime incidents. The program analyzes short- and long-term patterns in property crimes and, using the aftershock algorithm, weighs the relative risk of a future crime occurring in the next few hours. So if a burglar breaks into a car along a particular stretch of road, the system evaluates that incident against the system’s history of car break-ins and identifies high-risk areas and time periods for police supervision.
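The aftershock analogy can be made concrete with a stripped-down self-exciting model: each past burglary briefly raises the predicted risk nearby, and that excess risk decays with both elapsed time and distance. The decay constants and function below are illustrative only, not PredPol's production algorithm:

```python
# Toy self-exciting ("aftershock") risk surface for property crime.
# All parameters are illustrative; the production algorithm is far
# more sophisticated and fitted to years of verified incident data.

import math

def risk(x, y, t, past_events, background=0.1,
         time_decay=0.2, dist_scale=150.0):
    """Relative risk at location (x, y) in meters at time t in days.

    `past_events` is a list of (x, y, t) tuples for prior incidents.
    Each earlier event adds risk that fades exponentially in time and
    falls off with a Gaussian kernel in distance.
    """
    total = background
    for ex, ey, et in past_events:
        if et >= t:
            continue  # only earlier events contribute
        d2 = (x - ex) ** 2 + (y - ey) ** 2
        total += math.exp(-time_decay * (t - et)) * math.exp(-d2 / (2 * dist_scale ** 2))
    return total

# Two recent car break-ins along the same stretch of road:
events = [(100, 100, 0.0), (120, 90, 1.0)]
```

Evaluating this surface over a grid and keeping the highest-scoring cells is, in spirit, how the briefing maps of 150-by-150-meter hot-spots are generated: risk near the recent cluster stands well above the background level elsewhere.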

At each shift briefing, officers receive maps of 15 future crime hot-spots, developed using the most recent crime data, for purposes of targeted patrol. The 150x150-meter hot-spots are easy to generate: analysts can simply log on to a user-friendly web application and print maps before roll call. Hot-spots can be tailored to address specific time and area concentrations, allowing officers to generate unique predictions for day and night patrols. Officer buy-in has been a key component of SCPD’s success in implementing predictive analytics. Department officials encouraged, but did not require, patrol officers to consider hot-spot data. Officials wanted the predictions to augment—not replace—officers’ intuition and local knowledge. This bottom-up strategy allowed the city to quickly move from pilot phase to fully integrated operations. In the first six months, this strategy reduced burglaries by 14 percent and motor-vehicle theft by 4 percent.[32]

The Los Angeles Police Department and PredPol decided to test the system in a controlled experiment. Every morning, officers in the Foothill Division (population: 300,000) received hot-spot maps. On some days, PredPol’s system created the maps; on others, the LAPD’s in-house analysts produced them. The results: PredPol’s predictions were twice as accurate at predicting crime incidents as the traditional hot-spot analysis. Property crime in the Foothill Division dropped 12 percent, while L.A. as a whole experienced a 0.4 percent increase during the four-month period after implementation. Los Angeles now uses the PredPol system in three divisions.

Predictive Probation

In 2006, violent re-offenders established Philadelphia as one of the murder capitals of the United States. Philadelphia’s Adult Probation and Parole Department (APPD) oversaw 50,000 individuals, with only 295 probation officers.[33] To manage the escalating crime, the APPD needed a systematic way of identifying the riskiest individuals and dedicating staff resources accordingly. If the APPD could accurately categorize recently paroled individuals as low-, medium-, or high-risk for potential to commit violent crime, the agency could save time and money and reduce the likelihood of violent recidivism.

Enter Richard Berk, a sociologist who for 30 years was a professor of criminology and statistics at the University of California. The Wharton School of Business and the University of Pennsylvania’s criminology department recruited him in part to bring his modeling skills to Philadelphia’s crime problem. In partnership with Ellen Kurtz, APPD’s director of research, and Geoffrey Barnes, a criminology professor at the University of Pennsylvania, Berk began experimenting with “machine learning”[34] to find connections across probationer backgrounds and estimate the likelihood of violent re-offense.

Berk built his predictive engine based on tens of thousands of individual criminal records, with dozens of variables such as age, gender, previous zip code, number of previous crimes, and type of offense. This intelligent, machine-learning model enables the computer to find patterns and relationships across dozens of variables and constantly reassess those relationships as new data are added. Berk created several iterations of the model, but the current one relies on outcomes from 119,998 historical probation cases drawn from nationwide data sets, each with dozens of predictors, resulting in a data set with 8.74 million decision points used to forecast a new probationer’s risk of committing a violent crime in the next two years.[35]

Machine learning not only sorts and categorizes probationers according to risk; it also carefully adjusts the forecast to avoid costly errors. For example, the consequence of assigning low-risk probation supervision to an individual who will likely commit a violent crime is very serious—the economic and social costs far outweigh the cost of the supervision. Barnes and Berk worked with the APPD to decide on an acceptable error rate: As a policy, how many people would the APPD be willing to categorize as high-risk who were actually low- or medium-risk in order to prevent miscategorization of the highest-risk cases? At a ratio of ten over-supervised offenders to one under-supervised offender, the model categorized far more people as high-risk than the APPD could ever hope to handle. The policy challenge was to refine this error rate until capacity to monitor high-risk individuals matched the model predictions. The benefit of this model construction is that prediction errors are not uniform and are tailored to account for the cost of bad forecasts.
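The cost asymmetry described above has a clean decision-theoretic core. If under-supervising a future violent offender is treated as ten times as costly as over-supervising a harmless one, the probability threshold for the "high-risk" label drops well below 0.5. The numbers and function below are an illustrative sketch of that idea, not the APPD/Berk model itself, which is a large machine-learning system:

```python
# Illustrative cost-sensitive labeling rule (not the actual APPD model).
# cost_fn: cost of under-supervising someone who re-offends violently
#          (a false negative); cost_fp: cost of over-supervising someone
#          who would not have (a false positive).

def high_risk(p_violent, cost_fp=1.0, cost_fn=10.0):
    """Label high-risk when the expected cost of under-supervision dominates.

    Label high-risk when  p_violent * cost_fn > (1 - p_violent) * cost_fp,
    which rearranges to   p_violent > cost_fp / (cost_fp + cost_fn).
    """
    threshold = cost_fp / (cost_fp + cost_fn)
    return p_violent > threshold

# With the 10:1 ratio the text describes, the threshold is 1/11 (about
# 0.09), so even a 15% predicted chance of violence earns the label.
```

This also explains the capacity problem the APPD hit: a low threshold labels many more people high-risk than the probation staff can supervise, which is why the error ratio had to be tuned until the predicted high-risk pool matched monitoring capacity.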

To the surprise of many in the APPD, Berk’s model found that the original crime (violent versus nonviolent) had little effect on the riskiness of a probationer committing a violent crime in the future. Compelling conclusions emerged, such as the probationer’s age at the time of his first crime compared with his most recent crime. But the real value of the model is less its research results and more its practical management benefits. The model takes a very complicated decision (the level of supervision needed) and intelligently sorts through all the possible predictors to derive a sensible strategy to segment the probation population by risk.

Barnes spent months after the initial model construction working with the APPD on a front-end-user interface and back-end database system that could tap into court records and police filings, and thus provide instantaneous forecasts. The resulting intake process streamlines and integrates siloed data across agencies in real time. When an individual registers with the APPD for probation, the program immediately pulls the predictor variables from centralized court and police records, runs the case through Berk’s model, and assigns a low-, medium-, or high-risk category, all in less than ten seconds. Probation officers are assigned accordingly, and the APPD, in partnership with Barnes and Berk, continues to perform randomized experiments to pinpoint the proper level and content of supervision for each risk category.

According to Barnes, a key factor of the tool’s success has been the APPD’s unwavering commitment to the model predictions. Rather than circumventing the model when the staff found a particular weakness, all efforts were made to adjust and build a better model with new or different data. This commitment came from the highest level of the APPD, allowing the team to innovate despite obstacles.

While it is too soon to tell whether overall recidivism has decreased because of this innovation, the model helped the probation staff handle a 28 percent increase in overall caseload with a staff 15 percent smaller than before the introduction of forecasting. According to APPD’s chief probation and parole officer, this feat simply would not have been possible without the use of risk forecasting.[36]

Situational Awareness: Smart Policing with Sensors and Social Media

Criminal-justice authorities are using new digital tools and big data to improve operations. New technology allows data to be synthesized and made geographically relevant for a patrol officer in innovative ways. Increasingly, the solutions to urban problems turn on how new technologies allow data to be curated, mined, and delivered to those who can act on the information.

Leading innovators, such as New York, have invested heavily in situational awareness platforms. The New York Police Department partnered with Microsoft to develop the Domain Awareness System (DAS), a solution that aggregates and analyzes public safety data from reports, video feeds, license-plate information, witness reports, and so on, and then provides NYPD investigators and analysts with a comprehensive, real-time view of potential threats and criminal activity. The NYPD/Microsoft solution tailors the information to the specific needs of users.[37]

Similarly, ShotSpotter works with municipalities to provide instantaneous gunfire alerts to police departments across the country. The core of ShotSpotter’s service is a wide-area acoustic surveillance system, supported by software and human ballistics experts, all focused on accurately detecting gunfire. The company mounts waterproof, watermelon-size, acoustic sensors on rooftops across a city. Networked together, an array of sensors can triangulate the incident location accurately in real time. If ten sensors detect a shot, the array can determine the incident location with a two-foot margin of error.
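The triangulation step can be illustrated with a toy time-difference-of-arrival search: find the point on a grid whose predicted arrival-time differences best match what the sensors actually heard. Real systems solve this with closed-form multilateration and many acoustic corrections; the names, grid, and constants below are illustrative only:

```python
# Toy gunshot localization from sensor arrival times (time difference
# of arrival). A simplified grid search, not ShotSpotter's algorithm.

import math

SPEED_OF_SOUND = 343.0  # meters per second, roughly, at ground level

def locate(sensors, arrival_times, extent=500, step=5):
    """Return the (x, y) grid point best matching the observed TDOAs.

    `sensors` and `arrival_times` are parallel lists; positions and the
    search grid are in meters, times in seconds.
    """
    ref_t = arrival_times[0]
    best, best_err = None, float("inf")
    for gx in range(0, extent + 1, step):
        for gy in range(0, extent + 1, step):
            d0 = math.dist((gx, gy), sensors[0])
            err = 0.0
            for (sx, sy), t in zip(sensors[1:], arrival_times[1:]):
                # predicted delay at this sensor relative to sensor 0
                predicted = (math.dist((gx, gy), (sx, sy)) - d0) / SPEED_OF_SOUND
                observed = t - ref_t
                err += (predicted - observed) ** 2
            if err < best_err:
                best, best_err = (gx, gy), err
    return best
```

The geometric intuition matches the text: with more sensors hearing the same shot, the residual surface sharpens, which is why an array of ten sensors can pin a location to within a couple of feet while three or four give a coarser fix.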

In the cacophony of an urban locale, many sounds can be misinterpreted as gunshots to the untrained ear. To solve the misidentification problem, ShotSpotter relies on a centralized qualification center. Once the sensors detect an explosion, ballistics experts at ShotSpotter’s command center in California analyze the noise to weed out false positives, such as car backfires and fireworks. Results are sent back to police dispatchers, providing the police with precise location, number of shots fired, exact time of the incident, and gunfire history for the area. From detection to review, the entire process takes about 40 seconds. ShotSpotter guarantees that it can accurately detect 80 percent of gunfire in coverage areas, although actual detection rates are as high as 95 percent.[38] The technology has been implemented in 75 cities and towns across the United States, including Washington, D.C., and Milwaukee.

While predictive policing is proactive, ShotSpotter is reactive, but that does not limit its efficacy. Washington, D.C., has been using the system since 2005. The police there have detected more than 39,000 gunshots since 2006, using 300 sensors deployed across the city, according to an analysis by the Washington Post. D.C. police no longer need to hear a gunshot or depend on citizen reporting to know about it. The system has helped the D.C. police respond quickly to gunfire, track trends in gun violence, and establish evidence for criminal trials.

In Milwaukee, the police are using ShotSpotter to proactively respond to gunfire incidents, particularly in areas where violence has historically gone unreported because of resident intimidation. The Milwaukee Police Department calculated that, in the areas where ShotSpotter is deployed, only 14 percent of gunfire is reported to 911.[39] Fear of retribution for reporting crime is a serious concern in many communities; ShotSpotter allows the police to circumvent this issue and respond to gun violence, equipped with detailed situational awareness.

ShotSpotter is part of a broader trend to use new technology for situational awareness. Smart policing also increasingly relies on social media, especially Twitter, to keep communication open between police and citizens, increasing situational awareness for both. When the Vancouver (Canada) Canucks made it to the finals of the Stanley Cup playoffs in 2011, the Vancouver Police Department used Twitter to connect with participants and respond directly to questions from the crowd. The public response was overwhelmingly positive. But on June 15, the Canucks lost in the final game of the playoffs, and spectators rioted in the streets of downtown Vancouver. The VPD’s Twitter feed became a critical tool for communicating with spectators and tracking developments. After the riot, VPD used Twitter and Facebook to inform the public about how to submit riot evidence. More than 16,000 people started following the VPD in the next few days, and the department saw a 2,000 percent increase in Facebook followers. Thousands of civilian “journalists” submitted videos, photographs, and tips to the VPD over the following months, providing an unprecedented amount of evidence on the incident.[40]

Acoustic sensors and tools that mine social media to produce patterns from structured and unstructured data dramatically increase the effectiveness of those fieldworkers—police patrol officers and detectives, probation officers, and the like—who must allocate their time and use their discretion to protect the public.

Disasters: Rapid Integration and Dissemination of Data

Team Rubicon, a volunteer organization primarily comprising military veterans, deploys volunteers into disaster areas across the country. Relying on deep expertise in military protocol and logistics, Team Rubicon supports its volunteer responders in some of the toughest disaster situations. But allocating volunteers effectively is always a challenge, especially in unfamiliar contexts.

After Superstorm Sandy, Team Rubicon partnered with Palantir, a software firm, to aid in its recovery operations. During my time implementing data analytics in New York City, the obstacle I most frequently encountered was the conviction, among agencies, that their legacy data simply could not be used in conjunction with any other system. Recent data-mining breakthroughs had outpaced the expertise of many public IT officials, who were focused instead on keeping an older data product functioning. Palantir’s work lies at the heart of bringing big-data innovations to the public sector. Local governments use Palantir’s data-integration platform, called “Gotham”—developed originally to aid the FBI’s and the CIA’s counterterrorism efforts—to unify and analyze isolated data sets, unlocking data traditionally siloed in disparate locations.

In the chaotic aftermath of Superstorm Sandy, Palantir put its system to the test. Within 24 hours of the storm, the company deployed a cadre of engineers to New York City. From a bus in the Rockaways, these engineers supported NGOs in making efficient emergency resource-allocation decisions. On site, Palantir customized Gotham, building a mobile interface that let volunteer first responders from Team Rubicon receive, send, and gather critical information. Forward-deployed engineers customized the system to unify and distribute National Oceanic and Atmospheric Administration data, demographic data, electric-service maps, Federal Emergency Management Agency (FEMA) projections, and damage assessments to partners in the field.

Members of Team Rubicon used smartphones with Palantir software to centralize and distribute needed information. The customized mobile application brought together critical information and allowed analysts to efficiently direct Team Rubicon volunteers to the most crucial tasks. For example, the software collected building-damage assessments through volunteers using their mobile application, prioritized damaged buildings for volunteer intervention based on a vulnerability analysis, estimated the number of volunteers needed to adequately address the building’s issues, and flagged certain structures for asbestos risk.

Direct Relief, another nonprofit involved with emergency response, faced its own challenges after Sandy. As Andrew Schroeder, Direct Relief’s director of research and analysis, puts it: “In a disaster context, we are trying to hone in on need as quickly as possible and disaster areas as quickly as possible so we can understand where we need to plug in.”[41] As an NGO with limited staff, Direct Relief had to quickly and efficiently monitor and distribute supplies to hundreds of federally qualified health providers in the flood zone.

Ten days before Sandy hit New York and New Jersey, Schroeder was presenting at a conference on the theoretical application of Palantir Gotham to public-health emergency responses after a hurricane. Within a week, he was working with engineers to apply the concept in real time, deciding on strategic pre-deployment caches of medical supplies as Sandy approached New York. Direct Relief developed a social vulnerability index from demographic and housing information and correlated those data with the constant stream of risk-assessment models generated by FEMA. Direct Relief could forecast where the medical needs would be, even before the storm made landfall. This data-driven modeling helped Direct Relief overcome the communications challenge in the first 48–72 hours after the storm. Health providers were completely out of contact—cell service and phone lines had gone down. There was no way for Direct Relief to know which providers needed assistance. With limited contact, Direct Relief used proxies, such as electric-grid outage maps and whether local pharmacies were down in a particular area, to predict which groups needed assistance. Direct Relief volunteers were then sent to clinics in these vulnerable areas to confirm on-the-ground needs and coordinate medical-supply delivery.[42]
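Direct Relief’s proxy approach (combining a vulnerability index with outage signals to rank areas for dispatch) can be illustrated with a simple weighted score. The field names, weights, and figures below are invented for illustration; the organization’s actual model is not public.

```python
# Hypothetical clinic-area records: a social vulnerability index (0-1),
# the share of the local electric grid that is down, and the share of
# local pharmacies offline. All values and weights are assumptions.
areas = [
    {"name": "Area A", "svi": 0.82, "grid_out": 0.95, "pharmacies_down": 0.90},
    {"name": "Area B", "svi": 0.35, "grid_out": 0.10, "pharmacies_down": 0.05},
    {"name": "Area C", "svi": 0.67, "grid_out": 0.60, "pharmacies_down": 0.75},
]

def need_score(area, w_svi=0.5, w_grid=0.3, w_pharm=0.2):
    """Weighted proxy score: higher means send volunteers sooner."""
    return (w_svi * area["svi"]
            + w_grid * area["grid_out"]
            + w_pharm * area["pharmacies_down"])

# Rank areas so that the most likely points of unmet need come first.
dispatch_order = sorted(areas, key=need_score, reverse=True)
```

The design point is that each proxy is observable from outside the disaster zone, so the ranking can be computed even while the clinics themselves are unreachable.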

Public Health Analytics for Pest Control

West Nile virus, an ailment once rare and relatively unknown in the United States, is now an annual danger in many suburban communities. In Suffolk County, New York, a large suburban and rural county on Long Island, officials began seeing West Nile cases in the early 2000s. “We realized that we were going to be dealing with West Nile virus every year,” says Dominick Ninivaggi, superintendent of Suffolk County Vector Control.[43] But dealing with West Nile proved more challenging than anticipated. Initial tactics focused on identifying and mitigating mosquito reproduction in areas with high rates of virus-infected mosquitoes. The effort proved ineffective, and, after a detailed investigation by Ilia Rochlin, an entomologist working for Suffolk County, the agency found that high rates of infected mosquitoes did not correlate with high rates of infected humans. Given this information, what was driving infection rates?

Vector Control agencies typically use two tools to mitigate the spread of West Nile virus: killing adult mosquitoes with pesticides; and treating catch basins with pesticides to eliminate mosquito larvae. But Suffolk County, given its size and limited resources, cannot possibly treat all catch basins. When a case occurs, the county has to make a quick decision as to whether to spray the affected area with pesticides—a decision that requires careful weighing of costs, risk of outbreak, and the community impact of intervention.

Rochlin and Ninivaggi developed a model to assess the risk of outbreak using a combination of statistical methods and geographic information systems. Through modeling, they found relationships between human West Nile cases, landscape factors, population demographics, and weather patterns. Initial results showed a complex interaction between these factors and human cases of West Nile virus.

Using this hot-spot analysis, Vector Control now targets larvicide efforts in established hot spots and uses aerial adulticide spray only where quantitative evidence supports the use of pesticides. By being strategic in the use of analytics, the agency has saved time and money, while still providing a high level of public safety.

Chicago Rodent Control

Street rats—common to every city—are a threat to urban infrastructure, food supplies, and public health. Research studies have shown that exposure to rodents can trigger asthma attacks, particularly in young children. Chicago’s Department of Innovation and Technology (DoIT) and Carnegie Mellon University’s Event and Pattern Detection Lab (EPD Lab) partnered to use predictive analytics to solve the rodent problem. Chicago’s goal was not only to generate rodent hot-spot maps, but also to anticipate rodent outbreaks before they happen.

Tracking and identifying rodent nests is a perennial problem for Chicago. Without good data on where rats were located, proactive prevention would be impossible. To solve the information gap, the team turned to Chicago’s rich data set, containing 4 million requests to 311, covering everything from pothole complaints to rodent-control requests. Researchers at EPD Lab correlated these rodent-control requests with other demographic and 311 data to find leading indicators of rodent outbreaks. Their findings: a 311 call relating to garbage triggered a seven-day window where rodent-control requests spiked in the same area. Using this key insight with other indicators, such as water-main breaks, the city could identify the size of the rat population and predict the likelihood that the population would strike in a particular area.
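The leading-indicator finding can be illustrated in a few lines of code: for each garbage complaint, count the rodent-control requests filed in the same area within the following seven days. The records and field layout below are hypothetical; Chicago’s actual analysis used richer statistical methods.

```python
from datetime import date, timedelta

# Hypothetical 311 records: (date, category, census tract).
calls = [
    (date(2013, 5, 1),  "garbage", "T1"),
    (date(2013, 5, 3),  "rodent",  "T1"),
    (date(2013, 5, 6),  "rodent",  "T1"),
    (date(2013, 5, 20), "rodent",  "T1"),  # outside the 7-day window
    (date(2013, 5, 2),  "rodent",  "T2"),  # different tract
]

def rodent_calls_after_garbage(calls, window_days=7):
    """For each garbage complaint, count rodent-control requests filed in
    the same tract within `window_days` afterward (the leading indicator)."""
    counts = {}
    for day, category, tract in calls:
        if category != "garbage":
            continue
        window_end = day + timedelta(days=window_days)
        counts[(day, tract)] = sum(
            1 for d, c, t in calls
            if c == "rodent" and t == tract and day <= d <= window_end
        )
    return counts

counts = rodent_calls_after_garbage(calls)
```

An unusually high count in a tract flags it for proactive baiting before the backlog of 311 requests would have reached it.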

Prior to this predictive innovation, the city’s rodent-control team worked through 311 requests chronologically, responding on a first-come, first-served basis. With this new information, the team revamped practices to focus on a proactive, location-based strategy to clear out rats in high-risk areas first, before moving to the next risky area. As a result, the city reduced its 311 rodent-control requests by 15 percent from 2012 to 2013.[44]

Intelligent Data Analytics for Evaluating Public Safety

You cannot fix what you cannot measure. NYC government realized this fact during a spike in fatal accidents attributed to slow 911 response times in the summer of 2013. Despite the city’s $88 million upgrade of its computer-aided dispatch system, the source of the delay remained elusive. Theories included problems with the new software, operator error, or something else entirely.

The Mayor’s Office of Data Analytics (MODA), in collaboration with the NYPD, FDNY, EMS, and Verizon, developed a method to measure every stage of the emergency-response process—from the second that a resident dials 911, to the exact moment that emergency responders arrive on the scene. The analytical model stitches together data from decades-old agency systems into a comprehensive picture of the city’s emergency response. The city can now track the speed with which 911 operators interface with emergency medical-service responders, isolating these transactions from dispatch and travel time.

Integrating these data sets across legacy platforms presented challenges: each public safety agency has a different unique ID for each incident and records these data in systems that do not communicate with one another. To connect the systems, MODA developed an analytical matching technique based on time, type, and location of call. This software script automates the process of linking together complex data sets into a comprehensible whole. Rather than overhauling each agency’s unique and functional system, the mayor’s team developed a creative work-around to link isolated information. By isolating each facet of the emergency response, city leaders can detect inefficiencies and blockages, respond to inquiries about what exactly caused each individual delay, and make strategic decisions about where to invest future resources.
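A matching technique of the kind described above can be sketched as follows: link records from two systems when the incident type agrees and the timestamps and coordinates fall within tolerance. All record contents, field names, and thresholds here are illustrative assumptions, not the city’s actual schema.

```python
from datetime import datetime, timedelta
import math

# Hypothetical incident records from two agency systems whose IDs
# do not correspond to one another.
police = [
    {"id": "P-1", "type": "medical", "time": datetime(2013, 7, 4, 14, 2),
     "loc": (40.7100, -74.0000)},
    {"id": "P-2", "type": "fire",    "time": datetime(2013, 7, 4, 18, 30),
     "loc": (40.7500, -73.9800)},
]
ems = [
    {"id": "E-9",  "type": "medical", "time": datetime(2013, 7, 4, 14, 3),
     "loc": (40.7101, -74.0002)},
    {"id": "E-10", "type": "medical", "time": datetime(2013, 7, 5, 9, 0),
     "loc": (40.8000, -73.9500)},
]

def is_match(a, b, max_minutes=5, max_degrees=0.001):
    """Link two records when type matches and time/location agree within tolerance."""
    close_in_time = abs(a["time"] - b["time"]) <= timedelta(minutes=max_minutes)
    close_in_space = math.dist(a["loc"], b["loc"]) <= max_degrees
    return a["type"] == b["type"] and close_in_time and close_in_space

links = [(a["id"], b["id"]) for a in police for b in ems if is_match(a, b)]
```

Once linked, a record pair lets an analyst subtract one system’s timestamps from the other’s and isolate each stage of the response.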

Turning to data to drive operations is not just about delivering better service; it also saves agencies time, money, and other resources. These applications of predictive analytics in public safety were all implemented during a fiscal crisis. By making upfront investments in technology, public safety organizations lower costs and increase capacity for targeted interventions.

Implementation is not easy. These cases show us that whether you are using old data in new ways, as with Chicago’s rodent-control program, or creating new data assets through ShotSpotter, successful implementation requires key institutional factors:

  • Substantial top-level support for the initiative
  • Technical partners to aid in implementation, testing, and refining
  • Supportive and enthusiastic staff
  • Coordination across siloed departments and agencies

Police agencies have long capitalized on these factors to implement data-driven policies. It is apparent, once again, that police and other emergency-response agencies are leading local government in using sophisticated data-mining tools to target resources, solve problems, and pre-position in times of crisis. The breakthroughs for these agencies will again set a pattern for government, as predictive analytics power the advance of ever more effective government.


Chapter 5. Regulatory Reform with Data and Technology

The progressives and muckrakers undoubtedly had it right at the turn of the twentieth century: corrupt government and unregulated business foisted high costs and dangerous conditions on urban America. Forced by the muckrakers’ exposés, government adopted rules and procedures that led to tougher enforcement.

But a century later, these well-intentioned regulatory reforms have morphed into rule-bound, job-killing, expensive, and cumbersome processes—or, even worse, systems captured by the few to erect barriers to entry for prospective competitors. Government conducts inspections, issues permits and licenses, and decides the requisite paperwork and private actions that must accompany each. Regulation is not inherently bad. Rules establish a clear set of standards for how business is to be conducted in the city. Rules ensure consistency, a minimum threshold of verified safety, and proper functionality. In cities of all sizes, regulation ensures that we can live in homes that are properly heated and wired, that our children are transported in school buses that are safe, that waste is properly handled, and that our drinking water is safe for consumption. We can dine in restaurants, purchase exotic foods, and socialize in public places without negative effects on our safety or health.

However, sometimes rules and the mechanisms to accomplish a goal become onerous and counterproductive. Each exposé of bad behavior by some business can produce a rash of new enforcement procedures against all, including those entities that operate to the highest standards. For every aspiring entrepreneur hoping to capture an opportunity in a regulated trade, the enforcement procedures can translate into another obstacle. As time goes on, regulation often ceases to serve its original purpose, suffocating citizens and law-abiding businesses without producing the intended benefits. The regulation itself—the need to treat every applicant identically, no matter the complexity of the request or the applicant’s qualifications, and the enforcement of incomprehensible rules—stifles citizens’ freedoms and small-business growth. All too often, regulatory processes elevate a narrow definition of professionalism over common sense. Obsolete processes, the products of bureaucratic insensitivity to others’ time and to the time value of money, unnecessarily increase the cost of government, as well as the costs that it imposes on others.

Rule and Licensing Origination

Regulatory torment reaches a peak at the local level, where federal and state rules cascade downhill for augmentation and enforcement. Bureaucrats in various agencies apply rules without much attention to their underlying purpose or regard for other agencies’ oversight of the same business or activity. With thousands of pages of regulation to process, monitor, and report on, public officials focus on administering the regulation instead of pursuing its objective.

Regulatory rationality—an approach that balances the need to ensure safety with the need to allow economic expansion—would begin with elected officials developing a set of principles that frame their approach and then using a transparent process that invites feedback on how to accomplish the goal most efficiently.

Desired outcomes can be skewed by regulations interacting with one another. This intricate network of competing interests and rules in a complex economy can easily alter outcomes in unintended ways. Agencies with one mission may create unforeseeable consequences in another (e.g., regulations designed to encourage low-income homeownership create risks for financial regulators).

We need to find a better way to balance health and safety with job growth. Cities need small businesses to thrive and flourish. In America, 23 million small businesses account for more than half of all U.S. sales. According to the Small Business Administration Office of Advocacy, 99.7 percent of U.S. firms are small businesses (fewer than 500 employees).[45] Over a four-year period, high-patenting small firms produced 16 times more patents per employee than large patenting firms. Yet only half of these new establishments survive five years or longer.[46] Small businesses still find it difficult to navigate red tape, despite their importance in our local economies. Today, data and technological breakthroughs allow city governments to streamline the regulatory process. Some solutions are inexpensive and culture-oriented; others involve wholesale reform of processes and operations. Cities are addressing antiquated approval processes by developing targeted regulations that allow good actors to fast-track through licensing and permitting, as well as by applying predictive approaches to target inspections. These efforts provide a pathway for small businesses to enter and grow in the local marketplace.

Regulatory review commissions, which look for more efficient procedures, should extend to the initial rationale for red tape. For instance, New Jersey governor Chris Christie created the State of New Jersey Red Tape Review Commission, which has helped lead changes that leverage digital government to produce more efficient processes. Assisted by New Jersey’s Executive Order 2, which directs state agencies to leverage information systems and other technologies to increase efficiency, the commission reported in 2012 that six digital advances had been implemented: migrating occupational licensure online; an electronic system to accept reimbursement claim submissions; establishing an online process, from the initial application to tax-related reporting, for the Urban Enterprise Zone program; an electronic procurement and bidding system; an online system for posting required notices; and new technologies used by the Juvenile Justice and Motor Vehicle Commissions, such as a proposal to permit online road-test appointments and electronic submission of fingerprints.

The Regulatory Study Commission, an Indianapolis effort in the 1990s, started by challenging the very purpose of each rule, then asked whether it accomplished the purpose effectively, or at all. That effort was based on fairly simple principles:

  • Regulations should be used only as a tool to achieve a policy objective as a last resort; the use of regulations indicates the failure of other means to achieve the objective.
  • The cost of regulation should be no greater than the benefit that it creates for the community.
  • Regulations must be simple, fair, and enforceable.
  • Regulations must be written to ensure imposition of the minimum constraints necessary to accomplish the public purposes of safety and accurate information, and to facilitate market transactions.
  • Regulations can exceed existing federal or state standards only when there is a compelling and uniquely local reason to do so.
  • The regulation should clearly benefit the consumer or the public.
  • The regulation should complement other laws and rules.

These principles should be considered during creation because once legislation is fully implemented, special interests, bureaucracy, and legislative procedures make changes much harder to execute, even if the change itself is not particularly controversial. At the local and state levels, streamlining business regulation can produce results; indeed, the regulatory review process in Indianapolis during the 1990s eliminated 40 percent of regulations and an equal portion of fees.[47] (Approximately 426 businesses were thus relieved, in some way, of licensing, which cost the city about $85,000 in licensing fees.[48] Yet because, in some cases, managing and processing a permit or license cost more than the fee it generated, the streamlined operations were an immediate, fiscally positive step forward.)

A threshold question for regulation should be whether a permit or license is necessary at all, or whether stronger enforcement of existing rules would suffice. For example, many cities require dog licenses to prove that dogs have received rabies shots. Why not just require the shots and have dogs wear vet-issued evidence tags instead?

During his final term, Mayor Michael Bloomberg reformed the complex regulatory climate for New York residents and businesses. A review panel identified a first set of specific permits so onerous that they needed immediate attention. These included permits for place of assembly, certificate of occupancy, sidewalk cafés, street-tree work, range hoods, grease interceptors, food-service establishments, and gas authorization. Some of these permits required complicated annual applications consuming great time and expense.

Perhaps even more important, the public now has a dynamic, one-stop platform called NYC Rules, where residents can participate in the rule-making process. New Yorkers who visit NYC Rules can learn how rules are adopted, view a list of proposed rules, and submit text comments or upload documents, such as charts or spreadsheets. Website visitors can sign up to receive a weekly e-newsletter to stay current on rule-making activity and recently proposed and adopted rules. Effective review processes cannot be controlled exclusively by the agency doing the promulgating, as an agency will always be more likely to rationalize its needs over those of the public or an official, such as the mayor, who has offsetting responsibilities (such as also promoting economic development).

Reducing a risk to zero, of course, is never difficult: just eliminate the function. If developers, say, do not construct buildings, no worker or passerby will ever be injured. If a restaurant does not open, it will produce zero danger to a future consumer. The New York City Council recognized the need to look more broadly at trade-offs of proposed rules when it adopted NY Local Law 46 of 2010, which required the Mayor’s Office of Operations to review agency rules prior to their publication. The law required the Mayor’s Office of Operations to analyze whether the rule is understandable, in plain language, and whether it minimizes compliance costs. When I was deputy mayor in New York, the operations group reported to me. They could, it turned out, accomplish their goals better by allowing the “crowd” of stakeholders to collaboratively address a problem nominated by the agency, offering alternative solutions to improve the final rule. A process that requires clarity of purpose, transparency, and broad input by stakeholders—as a rule is being originated—will more likely result in the right regulatory balance. As of March 2012, the Office of Operations had reviewed and certified more than 100 proposed rules before they took effect.[49]

Processing

The decision to license, permit, register, or regulate is only the threshold. Once government decides to act, it should license in the most efficient manner. More than 40 percent of the 7,000 small businesses surveyed in the 2013 United States Small Business Friendliness Survey, conducted by Thumbtack.com, are subject to licensing regulations by multiple jurisdictions, or levels of government.[50] In cities today, documents are still bound in red tape and shuffled from office to office, increasing the administration and compliance costs of regulation.

Although fast-tracking is not as common at the municipal level, other governmental bodies fast-track good actors. For example, the federal Transportation Security Administration’s TSA PreCheck program allows select frequent fliers to receive expedited screening during domestic travel. Here the burden on government is reduced, as are the time and cost burdens on the traveler. In this example, both parties benefit, without a major impact or risk to overall airport safety. City governments that build specialized regulations on this model have the opportunity to create benefits for many groups while avoiding a high-risk negative result. By relying on a data-driven, transparent model, public officials can reduce favoritism and ensure equitable treatment.

Other approaches that increase efficiencies have been worked into existing city portals. For example, the NYC Development Hub, linked to the NYC Business Express portal, was launched in 2011. This online plan-review center accelerates the approval process for construction projects in the city. In its first year, the city approved more than 300 new building and major projects, estimated to generate more than $1 billion in economic activity.[51] This approval process was three times faster than the previous paper-based one. Cities should not be overwhelmed when thinking about reforms related to permitting and licensing. As a private-sector example shows, small steps make a large difference in aggregate. In 2012, Starbucks cut ten seconds from every transaction made by card or cell phone. At the individual level, this does not seem substantial; but over the year, it added up, reducing total time in line by 900,000 hours and making the company’s digital approach more consumer- and employee-focused.[52] Cities can look to such examples for the kind of quick reforms that make a large difference in service quality for the customer and increase efficiency.
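The Starbucks figure can be checked with simple arithmetic: 900,000 hours is 3.24 billion seconds, so saving ten seconds per transaction implies roughly 324 million card and mobile transactions a year. That transaction count is an inference from the stated numbers, not a reported figure.

```python
# Back-of-envelope check of the figures cited above. The implied
# transaction volume is inferred, not reported by the company.
seconds_saved_per_transaction = 10
hours_saved_per_year = 900_000

transactions_per_year = hours_saved_per_year * 3600 // seconds_saved_per_transaction
```

The point of the exercise is scale: small per-interaction savings, multiplied across hundreds of millions of interactions, produce the aggregate gains that make process reform worthwhile.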

Digital Processes for Enforcement and Compliance

A slightly more advanced opportunity for cities to integrate data and technology into the regulatory process can be seen in the use of advanced digital processing for enforcement and compliance. This approach requires reliable data and thrives where municipal governments collect data through social media and civic engagement applications—places such as Boston, New York, and Philadelphia. In addition, cities that share data horizontally across departments, through centralized databases, further reduce bottlenecks in the system, while increasing transparency and the ability to collaborate with others.

Cities should be using available data to the greatest extent—even if those data are housed in different departments. Integrated data analysis can be used for predictive policy implementation, but can also be used in support of inspection and monitoring reforms. As noted in Chapter 1, after illegal apartment conversions and undocumented renovations caused a series of firefighter deaths in New York, the Bloomberg administration began using preexisting data in a predictive manner to proactively address potential fire concerns. Using data collected on buildings with serious past fires, the city deduced that four indicators—unpaid property tax, foreclosed property, building age (specifically, before 1938), and socioeconomic status of the neighborhood—provided enough information to target inspection services. This successful approach, by targeting limited inspection services to areas and buildings where the four indicators are present, resulted in over a fivefold increase in the remediation of risky conditions.
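The four-indicator approach amounts to a simple rule-based score: count how many risk indicators a building exhibits and inspect the highest counts first. The records and field names below are illustrative only; the city’s actual model weighed the indicators with more sophistication.

```python
# Hypothetical building records; field names are assumptions,
# not the city's actual schema.
buildings = [
    {"bin": "1001", "tax_delinquent": True,  "foreclosed": True,
     "built_before_1938": True,  "low_income_area": True},
    {"bin": "1002", "tax_delinquent": False, "foreclosed": False,
     "built_before_1938": True,  "low_income_area": False},
    {"bin": "1003", "tax_delinquent": True,  "foreclosed": False,
     "built_before_1938": False, "low_income_area": True},
]

INDICATORS = ("tax_delinquent", "foreclosed",
              "built_before_1938", "low_income_area")

def risk_score(building):
    """Count of risk indicators present; inspect high scores first."""
    return sum(building[k] for k in INDICATORS)

# Order the inspection queue by descending risk.
inspection_queue = sorted(buildings, key=risk_score, reverse=True)
```

Even this crude prioritization changes the operating model: inspectors work down a ranked queue rather than a chronological complaint log.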

Digitized information allows collaborative agency efforts and can be used to target compliance. In New York, the Business Integrity Commission (BIC) scanned nearly 500,000 pieces of paper and aligned 27 inconsistent databases with more than 5,000 fields to establish a centralized system providing data to its 81-person staff. Consolidating more than 20 years of data supported a systematic way to regulate more than 2,000 businesses, by providing quality service to good actors and a strategic plan to catch bad ones. Partnering with the Mayor’s Office of Data Analytics and the Departments of Health and Environmental Protection, the BIC used the data to conduct a hot-spot analysis that cross-referenced industry data on grease production with restaurant-permit data and sewer-backup data. As a result, the BIC increased its identification of violations by 30 percent, while reducing manpower dedicated to grease enforcement by 60 percent.[53]

In addition to integrating data, cities have an opportunity to use technology for proactive monitoring and compliance. In Baltimore, the Bureau of Environmental Health was awarded an innovation fund loan to shift its more than 11,500 annual inspections of food facilities, day-care facilities, schools, and the like to a portable tablet platform to allow for inspections to be recorded in the field.[54] Using a flexible regulation model, much like that of Starbucks, the city anticipates that this program will pay for itself through reduced overhead while reducing wait times and improving customer service. Applying a model like this allows inspectors to share information with city agencies in real time. At the same time, inspections are tracked by time and location across the city, which allows for accountability, transparency, and informed resource allocation and management.

Today, as digital systems replace traditional paper-based processes, cities have the ability to understand, say, how long it takes for an inspector to complete one inspection; where, on a map, code violations are more likely to occur; and how to allocate resources to ensure the greatest impact. Incorporating these digital approaches throughout the regulatory process—including the origination and review of regulations, licensing and permitting, and compliance and monitoring—has shown higher-quality results, with far lower transaction costs and fewer job-killing delays.


Conclusion

City leaders today possess the technological tools to usher in a new era of reform, based on responsiveness. A dazzling array of tools provides exponential opportunity—wireless devices in the hands of fieldworkers, sensors that constantly transmit critical infrastructure information, big data and predictive analytics, social media, sophisticated GIS systems, and more. However, innovative leaders must first get over, around, or through a century-old bureaucracy dedicated to rule-driven routines. The earlier definition of progressive government treated strictly limited public-employee discretion as the only way to avoid abuses; but in a more complex society, such a system is very costly and alienates citizens with red tape.

Data collection, analysis, and dissemination allow the reintroduction of problem-solving discretion, which facilitates personal and collaborative engagement with communities. In the past, flexibility made for a free-for-all era of government that easily slid into corruption and party politics; today, the benefits of flexibility can be retained and complemented with data-driven accountability and transparency. Officials who want to ride this wave of responsive change must incorporate the following principles:

1. Humility
These sophisticated tools, which can make public officials much smarter, work at full strength in the hands of officials whose humility leads them to use the tools to better capture the knowledge and experience of their employees and citizens. In a digital world, civil servants can benefit from massive amounts of input that can be curated, organized, and “socialized” with citizens to invent new solutions and to approach problems in wholly new ways.

2. Focus on Outcomes
This reform movement will orient itself toward outcomes and away from perfunctory checklists. Data, whether gathered from the community or from enterprise-wide systems, will point the way toward outcomes. For instance, whether an offender returns to prison will matter more than whether he finished drug counseling, and whether a job-training program works for particular individuals will be stressed over how many trainees it graduated.

These chapters have shown how the age of bureaucracy turned professionalism into myopic specialized approaches, where the sheer number of people and affected groups involved made “coproduction” of public goods enormously difficult. Through the generation and sharing of data, across departments and outside city hall, a new definition is emerging, one which stresses real outcomes, and privileges understanding of how to incorporate the perspectives of others.

Working with others carries a number of benefits, including speeding service delivery, as governments’ new responsiveness produces solutions alongside citizens rather than forcing citizens through archaic, serial processes. Officials can be pragmatic, tailoring policy in response to changing citizen needs, instead of forcing citizens to fit their requests into discrete, artificial channels that take time without actually delivering solutions.

3. Preemption and Prediction
Officials using these new digital tools can act preemptively, finding the signal within the noise and targeting resources where and when risk is highest, rather than spreading resources too thinly or, worse, simply guessing. Floods of new data become available every day; a preemptive government knows where, and how, to look in this growing resource to anticipate an issue. Officials have seen that some buildings are more likely to burn, some streets are more likely to have accidents or potholes, some families are more susceptible to ongoing violence, and so on. These improvements in data access and analysis allow government to deliver more effective services.

4. Personalization
Perhaps government cannot, or should not, fully emulate Amazon, but it certainly can go a long way toward personalizing services. Residents who so desire should be allowed to register with their governments to receive notices of events, follow-ups on requests, automatically generated renewal notices, and more. With currently available technology, citizens should demand the ability to do business with government on their terms, not according to the convenience of the bureaucracy.

5. Openness and Transparency
The open-data movement rests on a sound foundation: the more easily the public can access public information, the more likely we are to find answers. This transparency movement relies on more than simply opening up data; it also includes making data understandable, visualizing it for impact, and responding to observations from app developers as well as watchdogs.

Like other cities, Chicago has released hundreds of data sets through its open-data efforts, eliciting the analytic talents of the public and academia. Boston’s Mayor’s Office of New Urban Mechanics is using tools such as the Citizens Connect app to elicit information from citizens on incidents and needs, helping resolve problems more quickly. Not only can this kind of data-sourcing turn citizens into the eyes and ears of government; it can also provide a valuable resource to help make responsive government into preemptive government, one better able to allocate resources and enforce rules to get real results.

In these chapters, I have presented cases and best practices from key areas of governance, now emerging as hotbeds of new data-enabled solutions to problems that exemplify the shift to government’s age of responsiveness. Government administration, infrastructure management, public safety, and regulation have long been mired in heavy layers of bureaucracy, meant to preserve control and top-down authority over large budget expenditures and services that play large roles in citizens’ lives. A new form of accountability, one that relies on analysis of information to identify outliers, will facilitate an empowered citizenry and a smarter set of public employees. Collectively, cities are showing a willingness to break from history and return agency to government employees and citizens themselves, using technology to empower these groups and to monitor and ensure desirable outcomes.
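The outlier analysis at the heart of this kind of accountability can be sketched simply. The example below is illustrative only, with hypothetical district names and made-up response-time figures; it flags any unit whose metric sits unusually far from the group average:

```python
# Illustrative sketch of outlier identification in performance data,
# in the spirit of CitiStat-style accountability: flag any unit more
# than 1.5 standard deviations from the mean. Data is hypothetical.
from statistics import mean, stdev

response_days = {
    "District A": 4.1, "District B": 3.8, "District C": 4.4,
    "District D": 9.7,  # unusually slow: should be flagged
    "District E": 3.9, "District F": 4.2,
}

values = list(response_days.values())
mu, sigma = mean(values), stdev(values)

# Any district far from the citywide mean warrants a closer look
outliers = [
    name for name, days in response_days.items()
    if abs(days - mu) > 1.5 * sigma
]
print("Flagged for review:", outliers)
```

In practice, the same screening could run monthly against each department's service-request data, surfacing outliers for managers before they become chronic.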

Responsive government requires cities to reassess their administrative structures and break down the barriers between departmental silos by sharing information and responsibility. Responsive governments open data, perform analysis in cross-cutting ways that look at outcomes, and take action against root causes. Programs like Baltimore’s CitiStat have paved the way, allowing cities to do more with limited resources, dropping dangerously high crime rates by managing police performance with data, and evaluating this performance based on outcomes.

Just as data analytics have allowed cities to begin using their human resources more wisely, they have also begun to allow for better management of infrastructure. Assets like bridges and sewers are always in need of maintenance (and budgets are too small to do it all). Some governments are using sensors to evaluate weak points and degradation before they get worse, part of a strategic switch from a maintenance protocol that reacts to expensive disasters, to one focusing on less expensive preventive maintenance. Sensor networks can keep high-risk, overweight trucks from destroying pavement, as well as provide the data to better manage congestion. Other sensors can provide data to governments and citizens to better manage collective and individual consumption of resources, such as water, electricity, and even parking. Governments can better understand the behavior and needs of citizens, while citizens can use data on their own activities, for convenience and to save money.
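The shift from reactive to preventive maintenance described above amounts to watching sensor trends and acting before a failure threshold is crossed. A minimal sketch, with hypothetical strain readings and an assumed alert level:

```python
# Illustrative sketch of sensor-driven preventive maintenance: watch a
# rolling average of strain readings and raise a work order before the
# asset fails outright. All sensor values here are hypothetical.
from collections import deque

ALERT_LEVEL = 80.0   # schedule preventive work above this reading
WINDOW = 3           # rolling-average window, smooths sensor noise

def monitor(readings, window=WINDOW, alert_level=ALERT_LEVEL):
    """Return the index of the first reading at which the rolling mean
    crosses the alert level, or None if it never does."""
    recent = deque(maxlen=window)
    for i, value in enumerate(readings):
        recent.append(value)
        if len(recent) == window and sum(recent) / window > alert_level:
            return i
    return None

# Gradual degradation: the alert fires well before outright failure (100)
bridge_strain = [60, 62, 65, 70, 78, 84, 88, 93, 99]
alert_at = monitor(bridge_strain)
print(f"Preventive maintenance triggered at reading #{alert_at}")
```

The rolling average is the key design choice: it keeps a single noisy spike from triggering a work order while still catching a sustained upward trend early.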

In public safety, we have seen that having the right data, and knowing what to do with it, can mean the difference between a crime occurring and being prevented. Hot-spot policing has allowed cities to put a preventive police presence in high-risk areas, better informed by massive data-collection programs like NYC’s CompStat. New analytic approaches, like those in Santa Cruz, use advanced earthquake-aftershock modeling to more accurately predict the locations of likely future crimes. Cities such as Philadelphia have been experimenting with data analytics not only to predict the location of future crimes but also to gauge who may commit them, assigning each probationer a risk of reoffending so that the city can concentrate oversight resources on those most likely to commit new crimes. Predictive policing, of course, won’t prevent every crime. Programs like ShotSpotter help by allowing police to perform better even when forced to work reactively; now in 75 cities, its rooftop sensors can triangulate the location of gunshots far better than the human ear, getting police to the scene of a crime faster.
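The Philadelphia risk-assignment approach can be illustrated in miniature. The real work used random-forest forecasting (see endnote 34); in this sketch a simple weighted score stands in for the model's output so that the triage logic itself is visible. All field names, weights, and records below are hypothetical:

```python
# Illustrative sketch of risk-tiered supervision triage. A simple
# weighted score stands in for the random-forest model noted in
# endnote 34; every name, weight, and record here is hypothetical.

WEIGHTS = {"prior_arrests": 0.6, "age_at_first_offense": -0.3}
BASELINE = 2.0
HIGH_CUTOFF = 5.0  # assumed threshold separating tiers

def risk_score(record):
    """Higher score = higher modeled likelihood of re-offense."""
    return BASELINE + sum(WEIGHTS[k] * record[k] for k in WEIGHTS)

def triage(probationers, high_cutoff=HIGH_CUTOFF):
    """Split a caseload into tiers so oversight resources concentrate
    on the highest-risk group, as in the Philadelphia example."""
    high = [p["id"] for p in probationers if risk_score(p) >= high_cutoff]
    standard = [p["id"] for p in probationers if risk_score(p) < high_cutoff]
    return high, standard

caseload = [
    {"id": "P-101", "prior_arrests": 14, "age_at_first_offense": 16},
    {"id": "P-102", "prior_arrests": 1,  "age_at_first_offense": 30},
    {"id": "P-103", "prior_arrests": 9,  "age_at_first_offense": 19},
]
high, standard = triage(caseload)
print("High-risk tier:", high)
```

Whatever model produces the score, the payoff is the same: scarce probation-officer time flows to the cases where it is most likely to matter.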

Speed is important when responding to gun violence, but even more important when responding to disaster or public health issues. When Superstorm Sandy struck the East Coast, Palantir partnered with Team Rubicon to get needed data to those on the front line, so that they could act quickly and accurately, and create a collective store of knowledge. Cities can use data to deal with chronic problems as well, such as data-driven larvicide applications in Suffolk County, New York, to keep the West Nile virus at bay; or Chicago’s analysis of calls to 311 to track indicators of rodent outbreaks and tackle them before the rodents spread disease.

We saw how responsive government affects the regulatory environment, protecting health and safety in much more efficient ways. New technologies hold the answer to striking a better balance between protecting such needs and encouraging economic growth. Digitization can streamline the permitting of new businesses and restaurants, helping permitting agencies assist entrepreneurs while still protecting the health and safety of the public. Cities and states have understood the need to refocus regulation on achieving its goals with minimal negative impact, and have passed legislation and executive orders to reintroduce that idea. We now have the digital tools to remake regulatory processes to meet this new expectation.

We are poised to achieve historic breakthroughs in responsiveness. The aforementioned tools—when complemented by the right leadership, changes in how we do and reward public work, and the respectful, broad engagement of residents and the tech community alike—will produce a new era of government responsiveness.


Endnotes

  1. “Trust in Government Nears Record Low, but Most Federal Agencies Are Viewed Favorably,” Pew Research Center for the People and the Press, October 18, 2013.
  2. John Wagner, “O’Malley Hopes the Numbers Add Up,” Washington Post, February 16, 2007.
  3. City of Chicago, Open Data Annual Report 2013, http://report.cityofchicago.org/open-data-2013.
  4. Stephen Goldsmith, “Boston’s Pioneering Way of Innovating,” Governing, September 12, 2012, http://www.governing.com/blogs/bfc/col-boston-mayor-office-new-urban-mechanics-mitch-weiss-interview.html.
  5. See http://grade.dc.gov/about.
  6. See Citywide Performance Reporting portal: http://www.nyc.gov/html/ops/html/data/data.shtml. Features include monthly agency-performance reporting; mayor’s management report, a biannual agency-performance report card; capital projects dashboard, which tracks the schedule and cost of the city’s major infrastructure and technology projects; 911 end-to-end response-time reports for police, fire, and EMS calls; 311 reporting on the city’s response to service requests from 311 callers; NYC’s open-data catalog; SCOUT (Street Condition Observations), which maps street conditions such as potholes and catch-basin defects and allows users to track the progress of repairs; street and sidewalk cleanliness ratings for all five boroughs; NYCStat stimulus tracker, which tracks the city’s use of federal recovery funds provided through the American Recovery and Reinvestment Act of 2009 (ARRA); NYC service performance tracker, which reviews the performance of major volunteer and civic engagement initiatives managed by NYC Service; and NYC fleet daily service report.
  7. Karen Zraick, “After Fatal Fire, City Vows Crackdown on Illegal Apartments,” New York Times, April 26, 2011, http://www.nytimes.com/2011/04/27/nyregion/after-fatal-fire-city-vows-crackdown-on-illegal-apartments.html?_r=0.
  8. More information can be found in the city’s press release, PR-193-11, dated June 7, 2011.
  9. Don Pagel, personal interview, September 23, 2013.
  10. See http://www.publicdeliberation.net/cgi/viewcontent.cgi?article=1232&context=jpd.
  11. See http://pbnyc.org/content/about-new-york-city-process.
  12. See http://www.participatorybudgeting.org/blog/new-york-election-results-participatory-budgeting-wins-big.
  13. Personal interview, February 21, 2014.
  14. Jessica Mador, “Cutting-Edge Technology Makes New 35W Bridge a Model for Future,” MPR News, September 16, 2008, http://www.mprnews.org/story/2008/09/11/35w_technology.
  15. Governors Highway Safety Association, http://www.ghsa.org/html/meetings/awards/2012/12nm.html.
  16. New York City DOT, http://www.nyc.gov/html/dot/html/pr2012/pr12_25.shtml.
  17. Barry Pelletteri, personal interview, August 29, 2013.
  18. “About Congestion Pricing,” Virginia Department of Transportation, http://www.virginiadot.org/info/resources/congestion_pricing/about_congestion_pricing.pdf.
  19. Caterina Di Bartolo, “AREA C in Milan: From Pollution Charge to Congestion Charge,” Eltis: The Urban Mobility Portal, November 2012, http://www.eltis.org/index.php?id=13&study_id=3632.
  20. Smarter Sustainable Dubuque, http://www.cityofdubuque.org/index.aspx?NID=1344.
  21. Duke Energy, http://www.duke-energy.com/north-carolina/savings/power-manager.asp.
  22. “How It Works: Sensors,” SFpark, http://sfpark.org/how-it-works/the-sensors.
  23. “SFpark Announces 11th Meter Rate Adjustment,” SFpark, http://sfpark.org/2013/08/23/sfpark-announces-11th-meter-rate-adjustment.
  24. Sonali Bose, personal interview, August 29, 2013.
  25. Lauren Sager Weinstein, personal interview, August 28, 2013.
  26. Transport for London, Travel in London, report 5, https://www.tfl.gov.uk/cdn/static/cms/documents/travel-in-london-report-5.pdf.
  27. “Patrick-Murray Administration and City of Boston Expand Success of Citizens Connect App Across the Commonwealth,” Mass.gov Administration and Finance, December 17, 2012, http://www.mass.gov/anf/press-releases/fy2013/patrick-murray-administration-and-city-of-boston.html.
  28. Luis Muñoz, personal interview, August 28, 2013.
  29. COPS office, http://www.cops.usdoj.gov.
  30. Erica Goode, “Sending the Police Before There’s a Crime,” New York Times, August 15, 2011, http://www.nytimes.com/2011/08/16/us/16police.html.
  31. “Dr. George Mohler: Mathematician and Crime Fighter,” Data-Smart City Solutions, May 8, 2013.
  32. Zach Friend, “Predictive Policing: Using Technology to Reduce Crime,” FBI Law Enforcement Bulletin, April 4, 2013.
  33. Philadelphia Adult Probation and Parole Department, 2006 Annual Report, http://fjd.phila.gov/pdf/report/2006appd.pdf.
  34. More formally, Berk and Barnes used “random forest forecasting.”
  35. Geoffrey C. Barnes and Jordan M. Hyatt, “Classifying Adult Probationers by Forecasting Future Offending: Final Technical Report,” National Criminal Justice Reference Service, March 2012, https://www.ncjrs.gov/pdffiles1/nij/grants/238082.pdf.
  36. Amaris Elliott-Engel, “Shortage of Probation Officers May Imperil Reforms, FJD Leaders Say,” The Legal Intelligencer, April 15, 2011.
  37. Pervaiz Shallwani, “‘Future’ of NYPD: Keeping Tab(let)s on Crime Data,” Wall Street Journal, March 4, 2014, http://online.wsj.com/news/articles/SB10001424052702304815004579419482346315614.
  38. Andras Petho, “ShotSpotter Detection System Documents 39,000 Shooting Incidents in the District,” Washington Post, November 2, 2013, http://www.washingtonpost.com/investigations/shotspotter-detection-system-documents-39000-shooting-incidents-in-the-district/2013/11/02/055f8e9c-2ab1-11e3-8ade-a1f23cda135e_story.html.
  39. Alex Knapp, “ShotSpotter Lets Police Pinpoint Exactly Where a Gun Was Fired,” Forbes, June 28, 2013, http://www.forbes.com/sites/alexknapp/2013/06/28/shotspotter-lets-police-pinpoint-exactly-where-a-gun-was-fired.
  40. “Case Study: Expect the Unexpected: Social Media During the Vancouver Stanley Cup Riot,” IACP Center for Social Media, November 15, 2011, http://www.iacpsocialmedia.org/Resources/CaseStudy.aspx?termid=9&cmsid=5658.
  41. Palantir Government Conference 8, Tysons Corner, Virginia, October 23, 2012.
  42. Direct Relief, “Hurricane Sandy Relief & Recovery,” http://www.directrelief.org/emergency/hurricane-sandy-relief-and-recovery.
  43. Dominick Ninivaggi, personal interview, August 28, 2013.
  44. City of Chicago, “Sanitation Crews Increase Preventive Rodent Baiting by 30 percent over 2012: Resident Requests for Rodent Control Services Down 15 percent,” http://www.cityofchicago.org/city/en/depts/mayor/press_room/press_releases/2013/july_2013/chicago_departmentofstreetsandsanitationcrewsincreasepreventiver.html.
  45. Small Business Administration Office of Advocacy, http://www.sba.gov/sites/default/files/sbfaq.pdf.
  46. SBA Office of Advocacy, “Frequently Asked Questions About Small Business,” September 2012.
  47. Stephen Goldsmith, “Principles for Regulatory Rationality,” e21, May 15, 2013, http://www.economics21.org/commentary/principles-regulatory-rationality.
  48. Editorial, “Small Gesture, Big Implication,” Indianapolis Business Journal, July 25–31, 1994.
  49. Neil Padukone, “Regulating the Regulations: How NYC’s Rule Review Keeps Regulations Just,” Data-Smart City Solutions, November 18, 2013, http://datasmart.ash.harvard.edu/news/article/regulating-the-regulations-how-nycs-rule-review-keeps-regulations-just-348.
  50. “United States Small Business Friendliness,” Thumbtack.com, survey, 2013, http://www.thumbtack.com/survey#2013/states.
  51. NYC Buildings 2012 Annual Report, http://issuu.com/nycbuildingsannualreport2012/docs/3652_nycbuildings_annualreport2012.
  52. Louis Columbus, “MIT Sloan & Capgemini Survey: Real Digital Transformation Happens When Customers Win,” Forbes, October 17, 2013, http://www.forbes.com/sites/louiscolumbus/2013/10/17/mit-sloan-capgemini-survey-real-digital-transformation-happens-when-customers-win.
  53. Shari Hyman, “Enforcement and Data: One New York City Agency’s Vision for a Level Playing Field,” Data-Smart City Solutions, August 13, 2013, http://datasmart.ash.harvard.edu/news/article/enforcement-and-data-280.
  54. Benjamin Weinryb-Grohsgal, “Making Inspections Mobile,” Data-Smart City Solutions, June 5, 2013, http://datasmart.ash.harvard.edu/news/article/making-inspections-mobile-254.
 
 

The Manhattan Institute, a 501(c)(3), is a think tank whose mission is to develop and disseminate new ideas
that foster greater economic choice and individual responsibility.

Copyright © 2014 Manhattan Institute for Policy Research, Inc. All rights reserved.

52 Vanderbilt Avenue, New York, N.Y. 10017
phone (212) 599-7000 / fax (212) 599-3494