When surveys get used as a blunt instrument

The stakes can be very high

At the top of my street is Grey Lynn Primary School. My daughter went there for six years. That whole time, and until very recently, the one piece of road crossing safety the school could offer us was a daily lollipop sign patrol on the painted zebra crossing. Surrey Crescent is a wide connector road, with substantial traffic volumes each day.

We expect drivers to slow down, but they don’t always. In one week alone earlier this year, 350 drivers went past the school travelling at speeds greater than 60 km/h. Speed is a key determinant of road safety; the faster cars travel, the more crashes occur, and the greater the risk of injury and death.


In the past few months, Auckland Transport has established a safety trial outside the school. It followed a co-design process with the school, and uses speed bumps, planter boxes, kerb buildouts, street art, road narrowing and other cues to deliberately slow driving speeds outside the school. It’s based on well-established human factors research showing that drivers unconsciously slow down when they’re given visual cues to do so. Such measures reduce crashes and save lives, and they’re often more effective than speed limit signs.

These changes haven’t been universally welcomed – to the point that protests halted roadworks there one day, reportedly because the changes were “incredibly dangerous” to vehicles. Our local business association, clearly having heard some of the indignation, decided last week to circulate a survey to members about the changes.

Garbage in, garbage out

It’s a longstanding principle of survey design that garbage in means garbage out – if a survey is poorly designed, the resulting data will be invalid. Sadly, this survey seems to be poking its head out of the dumpster. Given my last post on survey fatigue, I thought a discussion of poor survey design, using this as a case study, might be a useful follow-up.

Now, I like to attribute good intent. I know a good many members of the association and I’m a member myself. We all want our children to get to school safely. But if you’re looking for a good example of poor survey design, this is the perfect place to start. And when you’re dealing with an issue as important as people’s lives, you really, really don’t want to put a badly designed survey out in the field.

Let’s start with the survey introduction. After a brief statement of purpose, and a certain amount of grumbling that they weren’t consulted in the design, the association goes on to say the following (quoted directly from the survey form):

We do accept that we are ALL learning from these safer school project but our preliminary view is that some of the solutions deployed seem fairly primitive in terms of use of more modern technology, create unnecessary street clutter, are visually distracting and potentially create unintended consequences such as encouraging more frequent cross of Surrey Crescent. We are also concerned about early morning and late afternoon sun both blinding motorists and cyclists on occasions depending on time of year etc.

The Association supports safe access, including enhanced safety measures around schools, as a matter of principle but some of the changes are generating significant concern. Before we submit our response we would appreciate your views to ensure we present a balanced and representative view.

So let’s just stop there. Putting aside the poor syntax, and my curiosity about what they expect Auckland Transport can do about the angle of the sun, the association has told the reader in no uncertain terms what it thinks about the designs. This is classic priming of the reader towards a particular view. Any response that comes out of this will be informed by, and potentially swayed by, this perspective. Yes, there’s a hat-tip towards road safety, but it’s drowned out by the long list of problems. It’s no use saying “we would appreciate your views to ensure we present a balanced and representative view” when they’ve already started down the path of presenting a very clear view of their own.

Then we start on the survey itself. We are asked ‘how do you rate the traffic calming measures’ and given a 5-point scale from very poor to excellent. Now, people might rate the visual elements very highly but the speed bumps much lower, yet they’re given no way to respond element by element – just a gut response on the trial overall. And for those who haven’t made up their mind, there’s nowhere to say “I don’t know.”

In the next question, ‘Do you support the traffic calming measurers continuing?’, we’re given a 6-point scale from ‘totally opposed’ to ‘very supportive’. Again, no room for the undecided. And why a 6-point scale here and a 5-point one in the question before?

Things go further off the rails when they ask ‘The speed bumps have generated a lot of comment. Do you support this specific traffic calming measure?’ Again, priming the reader, and this time they offer yes, no or other. Quite what they mean by ‘other’ is completely unclear.

Next up is the question ‘There has been comments that cyclists in particular are having difficult navigating through the narrowed road, particularly in conjunction with the speed bumps. Would you like to comment?’ From whose perspective? As a driver who never rides a bike but can offer a half-baked reckon? As a cycle commuter? Or as a parent who rides to school with their children? All carry equal weight in this question. I do wonder whether the association plans to talk with the school trustees, principal or even the children themselves to hear what they think of the changes, or whether its consultation will be limited to members who might pass through the area.

I won’t go through all the questions, but I just need to draw attention to two more. The association asks ‘There has been a reduction in parking. In your view is this appropriate in the circumstances?’ It’s a well-known rule of city planning that people really don’t like losing carparks; the question invites a negative response. But when I asked our local council representative how many car parks were lost, I was told just one. The question not only waves a red rag at a bull, it gives no sense of the scale of the change – which, in reality, is very minor.

Lastly, for this post at least, they ask ‘How frequently do you use Surrey Crescent?’ Seemingly innocuous until you see the responses: ‘Multiple times a day, Daily, Used to use Surrey Crescent but now avoid, Not at all.’ They’ve invited the reader to give a negative view, with no positive counterpart, in the midst of what should be a straightforward piece of data collection about frequency of street use.

To their credit, they do give people the opportunity to identify which parts they think are working well, which aren’t, and how to improve the trial. But these few well-constructed pieces are left in the rubble of poor design and leading statements and questions.

What would have been a better approach?

In a previous post I cautioned about over-reliance on, and over-use of, surveys, and the consequent dangers of survey fatigue. If we go back to the start of this post, perhaps this exercise could have started with the fundamental question of ‘do we actually need a survey now, or should we build an informed conversation with our members instead?’ The association could have asked the designers to present at one of its regular meetings (or, in these days of COVID lockdowns, via Zoom) to explain the rationale behind the changes, and built an informed discussion. It could have waited for evidence on the impact the trial is having on traffic speeds and parking capacity.

My colleagues in this field would argue that a survey like this is not well positioned for assessing the usability of a street trial – for example, whether a cyclist can fit through a space or not. When a survey is completed remotely from the actual street interventions, responses will invariably reflect a blend of people’s observations and their biases and frustrations. For usability of a street design, it’s better to observe at the intervention site itself how these things are working, ideally with people in the community.

But I design surveys for a living. I understand their attraction, and used well, they can be an important resource.

So if the survey really was needed, the first rule is to engage people who know how to design questions well. The second rule is to review and pilot – send the survey to a few people with an eye for detail who will sense-check it for unintended meanings and for alternative response options that should be offered, and who will iron out any grammatical errors or typos.

It may well be that there are aspects of the trial street designs that will need rectification. A potentially good time to run such a survey would be once the street changes are well bedded in – when people are used to them and can comment constructively on what could be improved, rather than while they’re still in the midst of adjusting to change.

Getting back to where this post began, the back story is simply about children getting to school safely. Yet, as Auckland Transport assesses the value of the trial project, it will in all likelihood be hearing from the business association on the views of its members – when in reality that evidence base draws on nothing more than an open invitation to offer uninformed reckons, solicited through both leading framing and leading questions.

And as input to decisions that affect the safety of primary school children, I find that deeply worrying.

What I’m highlighting here are the dangers of poor survey design, and the risks of treating findings from such research as credible evidence. Surveys should be treated as a finely calibrated tool, not a blunt instrument to wield indiscriminately.

To which we can only conclude, just because you can survey, doesn’t mean you always should.

My thanks to Hamish Mackie for feedback and additional reflections.

Note: All quotes are as they appeared in the survey – another good reason for review and piloting.

Disclaimer: I have undertaken consulting work for Auckland Transport and Waka Kotahi (New Zealand Transport Agency), most recently in 2017 and 2019 respectively. I am also a member of the business association featured in this post.
