The University of Arizona

UA Geographer Works to Enhance Public Input on Public Projects

By Lori Harwood, College of Social and Behavioral Sciences | December 17, 2012

Keiron Bailey travels the world using his own branded methodology, SPI, or Structured Public Involvement, which seeks to improve public satisfaction with public design and management.

Photo: Keiron Bailey
Photo: Participants rate public works projects using keypads.

Did you know that the government is mandated to obtain public input for most “public works” projects?

Public works projects include interstate highway corridors, transit developments, new bridges, routing of electrical power lines and nuclear plant remediations. Keiron Bailey, an associate professor in the University of Arizona School of Geography and Development, has been involved with crafting and implementing the public participation component of all of these projects.

“These decisions cannot be made in cigar-filled rooms without taking into account what people want,” Bailey said.

Bailey travels all over the world using his own branded methodology, SPI, or Structured Public Involvement, co-developed with Ted Grossardt of the University of Kentucky. SPI seeks to improve public satisfaction with public design and management by integrating geospatial and geovisual technologies into broad-based multi-stakeholder processes.

His research has been funded by the Federal Highway Administration, the National Science Foundation and the Federal Transit Administration. In 2008, Bailey and his colleagues won the Greg Herrington Award for Excellence in Visualization, awarded by the National Academies’ Transportation Research Board, for a transportation and land-use planning project.

Historically, the methods used to garner public input have been unstructured. Typically, a public meeting is called; people show up and voice their support for or complaints about the project. Focus groups may be organized. However, there is often no way to effectively gather and use this data. And in an effort to artificially obtain a “consensus” from the public, certain voices may be excluded or discounted.

Bailey and his colleagues don’t try to obtain consensus or have members of the public vote on their favorite option. Bailey explains that when you ask people to choose their favorite option, you actually back the program sponsors into a corner. And even if the sponsors decide to pick the most popular option, the people whose favorite option wasn’t chosen leave the process unhappy.

Instead, Bailey asks the participants to rate the project options in terms of suitability on a scale of one to nine, using visualization techniques to present the plans and options. Each option is tied to certain project parameters, such as, in the case of a bridge project, the height, symmetry and complexity of the bridge.

The public stakeholders evaluate options via keypads, and Bailey and his colleagues collect a range of data points, which allow them to refine the options based on the parameters. More nuanced, preferred options are brought to the next meeting.
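In code, that aggregation step might look something like the minimal Python sketch below. The option names, parameter labels and scores are invented for illustration; the article does not describe SPI's actual data format or software.

```python
from statistics import mean

# Hypothetical keypad data: each option is rated 1-9 for suitability
# by every participant. Names and scores are invented for illustration;
# they are not from an actual SPI session.
ratings = {
    "Option A: tall, symmetric, simple":  [7, 8, 6, 9, 7, 8, 6],
    "Option B: low, asymmetric, simple":  [4, 5, 3, 6, 4, 5, 5],
    "Option C: tall, symmetric, complex": [6, 7, 8, 5, 7, 6, 7],
}

# Mean suitability per option suggests which parameter combinations
# (height, symmetry, complexity) to carry forward and refine
# for the next public meeting.
for option, scores in sorted(ratings.items(),
                             key=lambda kv: mean(kv[1]), reverse=True):
    print(f"{option}: mean {mean(scores):.2f} (n={len(scores)})")
```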

Bailey’s use of geovisualization techniques is often viewed as the sexiest part of the project, but to Bailey, the visuals are just a tool to help people make better decisions. Three-dimensional visualizations of the options give participants a better understanding of what they are evaluating, as well as of the trade-offs that may exist between options. For example, when trying to decide whether a highway should be three or four lanes, the geovisual techniques help people weigh aesthetics against traffic-flow impacts.

At the end of the public participation process, the program sponsors may have a few options that are a statistical tie in terms of suitability. They can then choose to look at the distribution of scores. For example, a bridge design with a mean of six but many high and low scores may be a more polarizing option than a bridge design with a mean of six and scores clustered in the mid-range.
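That distinction is easy to make concrete with a spread statistic such as the standard deviation. The sketch below uses invented scores, not data from any SPI project:

```python
from statistics import mean, stdev

# Two hypothetical designs, both averaging six on the nine-point scale.
polarizing = [1, 2, 9, 9, 1, 9, 9, 2, 9, 9]  # scores split to the extremes
clustered = [5, 6, 6, 7, 6, 5, 7, 6, 6, 6]   # scores bunched mid-range

for name, scores in [("polarizing", polarizing), ("clustered", clustered)]:
    print(f"{name}: mean = {mean(scores):.1f}, stdev = {stdev(scores):.2f}")

# Output:
# polarizing: mean = 6.0, stdev = 3.89
# clustered: mean = 6.0, stdev = 0.67
#
# Identical means, very different spreads: the standard deviation,
# not the average, exposes the polarizing option.
```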

To Bailey, SPI is democracy in its most basic form: one person, one response, anonymous and counted in real time. Bailey and Grossardt based SPI on the core principles of philosopher John Rawls, who posited three aspects of justice: distributional justice, or who gets how much; procedural justice, or how we decide who gets how much; and access to justice, or who should be included in the deliberations.

“We argue that distributional justice, in the form of perfectly equitable distribution of costs and benefits across diverse populations and regions, is not attainable. Designing systems around this goal contributes to their failure. Instead, we argue that a closer approach to distributional justice is possible through an epistemology that maximizes procedural justice and access to justice,” said Bailey. “This means soliciting input anonymously, simultaneously and equitably through a one-person, one-input system and evaluating the process with similar rigor and transparency.”

One of the chief aims of his methodology is to avoid the rampant “gaming of the system” that he has seen proliferate in public participation processes. Bailey relays tales of planners, designers and consultants so heavily invested in specific outcomes that they effectively work to shut down participants who are leaning toward the “wrong” choice. And then there are the special interest groups that hope their volume will drown out the rest of the participants. The result is a public so disenchanted with the process that its members don’t even show up to participate, a problem called “civic disengagement.”

To maximize the value of the process for the public and the project sponsors, the SPI methodology focuses on four metrics: inclusion, clarity, efficiency and quality.

Inclusion deals with the number and diversity of people participating in the process. Although online participation options are usually incorporated, Bailey says there is no consensus that online methods create better inclusion; they do, however, tend to produce lower-quality measures. Bailey and his colleagues focus instead on holding meetings at convenient locations, such as college campuses, when attempting to involve younger citizens.

Clarity is the measure of how useful the public data is to the sponsors in terms of helping them make a legitimate decision that reflects community desires, often when the range of feasible options is constrained by engineering, legal, financial and other considerations. 

Efficiency deals with how much public participation was obtained with the money that was spent. Bailey cites one group that spent $16 million of public money over 15 years on a process that only managed to polarize the public. His group came in and, two years and $200,000 later, had garnered more participants and stronger quality metrics.

Quality deals with the open, real-time citizen stakeholder evaluation of the decision process. To Bailey, this metric is vital, yet it is often missing because it is not mandated. In fact, many program sponsors believe there is a disincentive to collecting this data. Bailey has to fight the presumption that, just because members of the public may not like the options available to them, they will trash the process during review. Extensive SPI data supports the opposite view: People appreciate the chance to offer meaningful input even if their preferred solution is not chosen.

One example is the PGDP (Paducah Gaseous Diffusion Plant) collaborative visioning project. Bailey was part of a team that helped the U.S. Department of Energy and other government agencies create an “end-state vision” of the community’s preference for land use at the PGDP site, one of the world’s largest uranium enrichment facilities, ahead of its scheduled decommissioning.

Not surprisingly, there was a bevy of problems associated with this project. “No one really likes nuclear plants,” said Bailey. “Let’s be realistic. You know you are presenting options where none of them will be universally loved.”

Compounding the problem was that the public had low levels of trust in the program sponsors. People did not feel they could evaluate future options for the site (e.g., fencing it off, turning it into a wildlife park or reusing it as a nuclear or industrial plant) when they did not think the government was being honest about the environmental risks.

“The people asked for independent data from Russian scientists based on the aftermath of Chernobyl,” said Bailey. “Some of the organizers were outraged.”

But Bailey and his colleagues adhered to the SPI methodology, which included presenting the independently sourced information as requested, and the stakeholders ended up giving the process an overall quality rating of eight.

In fact, last year the PGDP project was evaluated for the Reinhard Mohn Prize in Vitalizing Democracy, the nearest equivalent to a Nobel Prize in the field of public participation. Although the PGDP project did not win, the organizers were impressed by the inclusion of the quality metric, and Bailey and Grossardt were invited to the Bertelsmann Foundation headquarters in Berlin to discuss their work.

Bailey foresees a time when the quality metric will be mandated, and he is all in favor of it. Together with other metrics, the quality data is essential, he feels, to holding the government accountable for effective public participation and thereby delivering better governance for all.

Bailey currently has quality metrics from more than 30 projects in six states. Having demonstrated the success of SPI in the U.S., Bailey is now exploring how well it works in other contexts and cultures and is spearheading an effort to create an international matrix of quality data with other practitioners.

“So far, the data supports my thesis,” Bailey said, “that people from all over the world want more public participation in governance.”

Contacts

Keiron Bailey

School of Geography and Development

520-621-1652

kbailey@email.arizona.edu