Do Community Engagement Toolkits Work? Research from California
This is a good example of: how the public actually views the effectiveness of common public engagement tools.
Community and urban planners have developed a range of tools to engage the public, particularly the most affected residents, in development decisions. Despite widespread usage of these tools, however, there is little information about their effectiveness, or lack thereof, when used with neighbors and communities.
UC Berkeley convened focus groups in communities surrounding four Bay Area Rapid Transit (BART) stations: Pittsburg/Bay Point BART station, Lake Merritt BART station (Oakland), San Leandro BART station and Diridon station (San Jose). Researchers then asked participants about the usefulness of several tools. UC Berkeley varied the tools among the focus groups depending on each station area’s particular needs and issues. For example, affordable housing was a top topic in San Leandro, so researchers there primarily tested the suite of affordable housing tools. The tools tested included:
Let’s Talk About Affordable Housing: Who, What, How & Why? by Nonprofit Housing of Northern California (Pittsburg and San Leandro)
Traffic-Lite: Great Communities Have Less Traffic by the Great Communities Collaborative (Diridon)
TOD 101 by Reconnecting America + Plan for Tomorrow by National Multi-Housing Council (the above two presentations were modified and combined into one PowerPoint, shown in the Diridon focus group).
NIMBY to YIMBY by the California Department of Housing and Community Development (San Leandro).
How Dense Are You? by the Great Communities Collaborative Leadership Institute (quiz on affordable housing and density; Pittsburg and Lake Merritt).
Dialing Up and Dialing Down by the Great Communities Collaborative Leadership Institute (a “guess the density” game used in Pittsburg, with an altered version used in Diridon and Lake Merritt).
Findings (grouped by theme)
Though there were many elements in the tools that frustrated focus groups, or caused them to lose trust and be skeptical, there were also clearly identifiable features that participants found persuasive, convincing, or that they just plain liked.
Regular People and Their Needs – “Put a face on people who live in affordable housing”
1) Participants liked and responded to pictures and stories about real people in the tools, and pictures of children and families seemed to be a particular plus.
2) Though there were few personal stories in the tools, those that did exist got a strong reaction.
Attractive Buildings, Design, and Landscaping – “I mean, which one would I rather see a mile from my house?”
1) Images of attractive buildings that happened to be affordable and/or high-density developments helped people to visualize something inoffensive, or even attractive, in their neighborhoods.
2) Focus group members were drawn to buildings with characteristics appropriate to the local context, including design elements and density scale. For example, residents of lower-scale suburban neighborhoods preferred more home-like features, while more urban residents favored a higher level of density and design variation.
3) Focus group members wanted to see interior plans as well as exteriors when asked if they favored a certain design.
Community Benefits – “What’s in it for them?”
1) Focus group members cited parks, open space and public art as desired amenities, but also mentioned retail as a benefit.
2) Generalities such as “Reduced parking ratios can provide the opportunity for greater open space” – though often true – are not as convincing as showing communities where a concrete amenity such as a new park, day care or playground will be part of the package.
3) Tools should tell the story of a better community, both for the wider community and for the immediate neighborhood.
Concrete Examples – “These are practical things that people can relate to”
1) Focus group members found concrete examples useful and relatable—and the more local, or the more relatable, the better.
2) Focus group members related to concrete examples such as pragmatic trade-offs, testimonials, and geographic comparisons.
Transparency, Professionalism and Straightforward Information – “Here, you’re gonna get a little skepticism…”
1) Participants were frustrated by instances where assertions were offered without evidence.
2) Focus group members were likewise very appreciative when they were presented with tools that gave clear citations of their sources, and other similar information.
3) Any tool (brochure, presentation) needs a definitions section for newcomers to planning.
Keeping the Local Context in Mind – “This is car town.”
1) Tools not attuned to the community where they are being used will not reach that audience. This does not mean all examples have to come from the neighborhood, but rather from one with similar attributes.
2) Participants from suburban areas felt less comfortable with “guess the density”; instead, they wanted to know how density could be tailored to suit their area.
3) Statistics, studies and models used in the tools should also be as local as possible. When only national averages or models were presented, focus group members found it easier to discount the information.
4) Focus group participants were frustrated when design tools dominated discussions when their biggest concerns were not design-related (e.g., school crowding).
The Perils of Surprise and Misdirection – “Why don’t you just tell us the truth?”
1) Tools like “guess the density” that seem to have a counter-intuitive “gotcha” type answer can leave participants feeling manipulated. What tool developers find surprising can be regarded as deceptive. However, some of the “guess the density” exercises can be helpful when the discussion, not the tool itself, is the center of learning.
Dialogue with other Focus Group Members – “I think that’s a good point she’s got . . .”
1) There were some interesting and instructive dynamics that came from the focus group dialogue itself, suggesting tool review can be both a diagnostic and community building exercise.