So, after slightly more thought about the medical privacy breach committed by Boag and Walker: while their actions are vile, this news about the breach is probably the best we could hope for in another sense.
It’s appalling that Boag - long a shit-stain on the underwear of New Zealand politics, going back to her days in Jim Bolger’s office - decided to make the personal information of sick Kiwis a weapon in the National Party’s increasingly desperate bids to unseat the government at any cost.
However, there is a silver lining.
This patient information does not appear to have leaked because of systemic weaknesses in New Zealand’s healthcare systems. It doesn’t appear to be the result of, for example, hacking or, worst of all, shoddy handling of patient info.
Rather, it appears to have been an insider threat: information which was legitimately shared with the rescue org (which needs to know if people they transport may have COVID-19) which was then stolen and leaked by a person in a privileged position.
I’m not an infosec person, but I’ve been working in banking for a couple of decades. This sort of thing is essentially impossible to prevent with technical controls. That’s because when someone with legit access chooses to abuse their power - whether personally, or by using their position as a C*O to order a subordinate to break the rules (or, indeed, laws) - the most you can usually hope for is to flag and follow up.
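To make the “flag and follow up” idea concrete, here is a minimal sketch (entirely hypothetical - nothing to do with the actual systems involved in this incident) of what such a control looks like: legitimate access is never blocked, but every read of a patient record is logged, and unusual patterns are surfaced for a human to review after the fact.

```python
from collections import Counter
from dataclasses import dataclass, field
from datetime import datetime, timezone


@dataclass
class AccessEvent:
    """One read of a patient record by a staff member."""
    user: str
    role: str
    record_id: str
    reason: str  # free-text justification entered by the user
    timestamp: datetime = field(
        default_factory=lambda: datetime.now(timezone.utc)
    )


class AuditLog:
    """Append-only log of record access.

    Note that nothing here *prevents* access: anyone with legitimate
    credentials can always read records (a rescue crew can't wait for
    approval). The control is detective, not preventive - it flags
    suspicious patterns for follow-up by a human.
    """

    def __init__(self, normal_daily_limit: int = 20):
        self.events: list[AccessEvent] = []
        self.normal_daily_limit = normal_daily_limit

    def record_access(self, event: AccessEvent) -> None:
        self.events.append(event)

    def flag_for_review(self) -> list[str]:
        """Flag users who read far more records than their job needs,
        or who gave no reason for an access."""
        counts = Counter(e.user for e in self.events)
        flagged = {u for u, n in counts.items() if n > self.normal_daily_limit}
        flagged |= {e.user for e in self.events if not e.reason.strip()}
        return sorted(flagged)
```

The thresholds and fields here are made up for illustration; the point is that the flag fires *after* the data has already been seen - which is exactly why this kind of control can’t stop a determined insider, only catch them later.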
If people who have the right to do things (or cause things to be done) are crooks, there is almost no technical control you can implement to stop that.
The few controls that you can implement will be so cumbersome that the damage caused to people trying to do their legitimate jobs will be crippling; in the case of a rescue helicopter service, you could have patients literally die because you’ve made it too hard for the rescue teams to do their jobs.
At some point you need to be confident that you aren’t hiring crooks, and try to mitigate the damage if you have. You need a workplace culture where it’s safe to tell the CEO that they don’t need access to systems with patient data, and where staff feel safe saying no when asked to act illegally.
Do you think Michelle Boag - the person who used a rescue helicopter to shuttle her passport from her home to the airport when she forgot it - is the sort of person who creates that culture? I don’t.
Organisations that deal with deeply personal, important information - banks, hospitals, insurance companies, and so on - are dangerous when the execs drive a culture of fear, a culture where obedience to the individual is more important than following ethics and principles. You can’t build technical controls for sociopaths and psychopaths. You can’t build technical controls that stop someone using the threat of throwing staff out of their jobs to get what they want.
When “just do it or else” is the standard of leadership, this sort of criminality follows. And that’s bad. But that’s not about shoddy IT systems. That’s about asking why Boag and people like her get to be (and remain) directors and execs.
OK, so it’s a slim silver lining. But it likely means that your patient data is safe on a day-to-day basis, because most people aren’t as evil as Boag.