For anyone who follows Australian politics and/or social justice issues, January 2017 has been a torrid month. This was the month that discontent and anger at the Centrelink ‘Robo-debt’ disaster, already widespread in the community and simmering through the closing months of 2016, came to a boil, erupted out of the social media forums where it had started festering, and finally caught the attention of the broader community via the mainstream media.
The focus of the anger was a scandal surrounding the government’s use of data matching to identify and chase debts from welfare recipients who, according to the data-matching exercise, had been overpaid. The problem was that the data matching was driven by a very poorly designed algorithm and by inadequate processes and communications protocols, with the result that many of our society’s most vulnerable people were being pressured to pay money they didn’t actually owe. The distress this caused thousands of people was absolutely awful and, even worse, absolutely unnecessary. All kinds of experts and boffins decried the exercise: the economics editor of The Age, Peter Martin, was gloriously scathing in his columns, labelling it a “shakedown” and “inhuman” (Comment, The Age, 4 Jan. 2017). The Australian Lawyers for Human Rights criticised the legal process behind the scheme, saying it was “wrong at so many legal levels that it’s hard to know where to begin”.
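To see how a crude data-matching rule can manufacture debts that don’t exist, here is a minimal sketch in Python of the widely reported flaw: smearing a person’s annual income evenly across all 26 fortnights and recomputing their entitlement from that average. This is a toy model, not the department’s system, and every payment parameter below (payment rate, income-free area, taper) is an invented number for illustration, not an actual Centrelink figure.

```python
# Toy model of the reported Robodebt flaw: averaging a year's income evenly
# across 26 fortnights, then recomputing benefit entitlement from the average.
# All payment parameters here are invented for illustration only.

FORTNIGHTS = 26  # fortnightly reporting periods in a year

def phantom_debt(annual_income: float, declared: list[float],
                 payment_rate: float = 250.0,  # hypothetical full fortnightly payment
                 free_area: float = 150.0,     # hypothetical income-free area
                 taper: float = 0.5) -> float: # hypothetical taper rate
    """Debt raised by comparing what was paid (from the recipient's actual
    fortnightly declarations) against what a flat annual average implies."""
    def entitlement(fortnightly_income: float) -> float:
        # Payment reduced by the taper for each dollar earned over the free area.
        return max(0.0, payment_rate - taper * max(0.0, fortnightly_income - free_area))

    paid = sum(entitlement(d) for d in declared)       # correct, period-by-period
    average = annual_income / FORTNIGHTS               # the flawed smoothing step
    recomputed = entitlement(average) * FORTNIGHTS     # entitlement from the average
    return max(0.0, paid - recomputed)                 # the "debt" demanded

# A casual worker who declared honestly: $13,000 earned in 10 busy fortnights,
# nothing in the other 16. The averaging step invents a debt anyway.
declared = [1300.0] * 10 + [0.0] * 16
print(phantom_debt(sum(declared), declared))  # 2050.0 — a debt that doesn't exist

# A perfectly uniform earner with the same annual income is not flagged.
print(phantom_debt(13000.0, [500.0] * 26))    # 0.0
```

The point the sketch makes is that averaging only misfires for people with uneven earnings — exactly the casual and seasonal workers who make up much of the welfare caseload — while looking perfectly accurate for anyone with a steady income.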
Welfare and community legal organisations, when they could drag themselves away from their phone lines crammed with calls from frightened Centrelink clients, roundly condemned it.
One expert opinion that surfaced during an interview in The Guardian was that of Paul Shetler, the former head of the government’s digital transformation office.
“The man handpicked by Malcolm Turnbull to head the government’s digital transformation has said the error rate in Centrelink’s data-matching process is so unfathomably high that it would send a commercial enterprise out of business.”
Shetler was, unsurprisingly, highly critical of the IT aspects of the scheme, and if you would like to see what he has to say about that, you can read it here.
It’s an interesting interview overall, but what really caught my eye was what Shetler had to say about culture:
“Paul Shetler, the former digital transformation office head, criticised the government’s response to its latest IT crisis, telling Guardian Australia it was symptomatic of a culture of blame aversion within the bureaucracy.”
Part of the speculation that arose on Twitter (where I have been following #notmydebt) was about how such a poorly conceived, designed and implemented project ever managed to get off the drawing board, squirm its way through the internal project management processes, and foist itself onto benighted welfare recipients. Many people were wondering how this whole thing made it past testing and risk assessment processes.
“’It is literally blame aversion, it is not risk aversion,’ Shetler said. ‘They’re trying to avoid the blame, and they’re trying to cast it wide.’
‘The justifications that have been given I think are just another example of the culture of ‘good news’, reporting only good news up through the bureaucracy.’”
For me that rings true: if your workplace culture is not one where people feel entitled to ask questions, express doubt, highlight risk, take chances, and discuss the possibility of failure, then your organisation is at grave risk of being forced to face these lines of questioning and speculation in the public arena, after something has actually gone wrong. Perhaps gravely wrong. And perhaps with damage done to other people on your conscience.
Shooting the Messenger:
In the interview Shetler says:
“I’m sure that the bureaucracy was being told at every single level that everything was OK.
“That’s how it works in the bureaucracy. Bad news is not welcomed, and when bad news comes, they try to shift the blame.”
No one likes to be clobbered for sticking their necks out at work.
But do you want the project managers, product managers, and managers in your organisation to feel they have an ethical duty to highlight potential risk? Good, honest discussion of risk can also be an opportunity to develop countering strategies that make processes more robust, perhaps even more innovative.
But to end up having those robust and innovative discussions, you need the right culture.