Kiwicon 8 Day 2

Opening

Pyro and metlstorm dancing. Kiwicon delivers on entertaining openings. metl was clearly gleeful at “spending your money on smoke machines and fire.”

Root Causes: Complex System Failures and Security Incident Response

@hypatiaca and @hashocothorpe talking about bridging the ops/security divide at Heroku.

Failures

Why do things fail? Things fail because you fuck with them.

(I only wish this was true. I deal with way too much stuff that fails when no-one has fucked with it enough.)

  • The Seattle floating bridges - they sank one by overloading it while working on it.
    • Pontoon bridge across Lake Washington. So awesome! I knew nothing about them before this talk, and now, like the tunnel under the Baltic, I want to use one.
    • The plan to use the pontoons to store waste water during refurbishment was carefully thought out, including an exhaustive analysis of how much extra water could be safely added to the pontoons…
    • …but it didn’t take a storm into account, which put more water in the pontoons than expected.
    • The extra water opened existing small cracks in the pontoons, which then flooded.
    • Concrete full of water does not float.
    • And then the bridge sank.
    • Complex system failure has many parents.
    • There was much mockery of Wikipedia.
  • DevOps/SRE: There are many (mostly terrible) definitions, but the one they offer is: the care and feeding of computer systems in a changing world.
  • Code is the theory, prod is the practice.
  • Retrospectives: looking at actions in the context at the time. People aren’t (generally) shit, but systems and information are.
    • Assume good faith. Really. No, really.
    • Create a safe environment: don’t hang people out to dry. Let everyone participate. And so on and so forth.
    • Identify and praise error prevention as part of the retro.
    • Obsess over the systems, not the actors (people).
    • Sleep is super important! If your system has underslept people, they will fuck up. This is not their fault. Not sleeping for 72 hours is the equivalent of being really, really shitfaced, for example.
    • You wouldn’t praise someone for being at work drunk, why would you praise them for being at work wasted on tiredness?
  • Ops and security care about networks, just in different ways.
    • Robustness (resistance to failure), resilience (recovery from failure), anti-fragility (lolno) - systems that benefit from the right amount of stress.
    • The idea of creating “anti-fragility” - a term the speakers aren’t fond of, but describes a useful pattern.
    • An example of anti-fragility would be weight training - with the right amount of stress you get stronger.
    • Anti-fragility might include, say, a bug bounty program: stressing your systems with people who will disclose weaknesses, rather than exploit them.

Common Ground

  • Ops and security are places you get to by apprenticeship, not by training courses.
  • There are training courses emerging in security, and they’re (mostly) terrible.
  • Both are schools of hard knocks.
  • In both roles you don’t have a lot of institutional power, while also having many masters.
  • Interviewing people (the ones already on your teams) leads to “smells” about sketchiness in code.
  • “Ops olfactory” is a Heroku tool which generates the same sort of traces, and octo follows up with questions.
  • Types of smell:
    • e.g. resentful about being under-resourced.
    • Will those smells lead to problems?
    • Note the example is focused on the system (people being overworked = problems) not the individual.
  • The requirement to influence positive behaviour, plus the amount of the job that is people work, generally means there’s a lot of emotional labour…

Emotional Labour

  • Theories around the idea of emotional labour began with and were rooted in the study of flight attendants, a job that has mandatory bubbliness.
  • Emotional labour is work - it’s a real, under-estimated thing. People burn out on it, and it’s under-recognised.
    • I would add that this is exacerbated in tech environments by having so many people who are crap at interpersonal stuff, and actively hostile to any discussions around it.
  • When we don’t put the energy in, we fall back on toxic behaviour.
    • “The No-Asshole Rule” is important.
    • Aggression is a common result when things go bad.
  • The “four rules” of creating positive learning environments should apply; it’s not just direct hostility that can create an unpleasant environment: e.g. feigned surprise, well actually, backseat driving, and -isms.
  • “Operations therapy” - talking it through.

This continues some common themes throughout the conference: so much of security is an under-recognised/ill-managed/“too-hard” human factors issue, and you need to make it easy for people to do the right thing.

ThroughGlassXfer - Ian Latter

Jesus what a scary-arse introduction. The talk delivers on the promise of making me crap myself.

  • Screens for data transfer. “1080p is 1.2 Gbps” (1920 × 1080 pixels × 24-bit colour × 24 fps ≈ 1.2 Gbps).
  • Remember VHS backups? Data transfer over the screen.
  • 1994: Microsoft and Timex had a watch programmed via CRT: hold your watch up to a screen, and the programs and data would download to the exposed bit of an EEPROM.
  • 1994 was also when QR codes were released.
    • Consider the QR as an optical packet in the ether of the screen.
    • Animated QR codes are possible.
    • Layer 4 problems: you can’t really tell if you’re losing or corrupting data.
    • So there needs to be a transport protocol!
    • Adapting QR v1 to an orthodox protocol structure, embedding metadata in the early bits (a toy sketch of the framing idea follows this list).
    • TGXf Transport Protocol.
    • High latency, resuming transfers, differing frame rates, ECC.
    • Can download via YouTube in realtime.
    • So you’ve now got a high-ish speed exfiltration mechanism. With no effective counter - because the PoC is QR codes, but it could be anything that goes on the screen.
    • There’s an app for realtime decoding.
    • Everything is fucked forever.
    • ANSI QR codes passing through SSH jump hosts. Rocking it 1989 style with animated BBS graphics.
    • Everything is fucked forever.
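
To make the framing idea concrete: this is not Ian’s TGXf code, just a minimal sketch under assumed parameters (the chunk size and header format are made up), using the third-party Python qrcode library. Split a file into chunks, prefix each with a sequence header so the receiver can spot dropped or duplicated frames, and render each chunk as one QR “frame”:

```python
# Toy sketch of QR-frame exfil framing - NOT the real TGXf protocol.
# Assumes the third-party "qrcode" library (pip install qrcode).
import base64
import qrcode

CHUNK = 64  # hypothetical payload bytes per frame

def frames(path):
    data = open(path, "rb").read()
    chunks = [data[i:i + CHUNK] for i in range(0, len(data), CHUNK)]
    for seq, chunk in enumerate(chunks):
        # The "seq/total:" header lets the receiver detect loss and
        # reorder - the poor man's transport layer the talk describes.
        payload = "%d/%d:%s" % (seq, len(chunks),
                                base64.b64encode(chunk).decode())
        yield qrcode.make(payload)

# Write each frame out; played back on a screen, anything that can film
# the screen and decode QR codes can reassemble the file.
for i, img in enumerate(frames("/etc/passwd")):
    img.save("frame-%04d.png" % i)
```

The real thing adds what’s listed above - resume support, rate adaptation, ECC - but the core trick is no more than this.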

That’s horrifyingly useful, but what about a back channel? How to upload? How to control automatically?

  • The Arduino Leonardo can fake a keyboard and autotype the TGXf program into a host.

  • The implication is that we can now do bidirectional using the HID driver.

    • It’s a much slower rate (limitations in the HID protocol).
    • TKXf, the keyboard stuffer. Serial interface via the Arduino and HID.
    • This shit is Satan’s 1200/75 Modem.
    • Everything is fucked forever.
  • TCXf PPP. Now you can establish a PPP link over keyboard and QR codes, using the destination as a jump point in the remote network.

  • The tools are completely transparent to the attacker’s environment.

  • clientlessTGXf: there’s no client.

    • And any data - pixels, words, letters - can be used. We get to see a demo of using a video file + OCR to ship /etc/passwd to the attacker (the principle is sketched after this list).
    • The components to bootstrap the attack can be boiled down to 300 bytes of shell. But I’m sure a clever attacker could implement them as an Excel macro or anything else you might legitimately have access to in the target network.
    • Hiding transfer media from the “red room” (i.e. getting a USB stick in and out of a secure air gapped room) - “be creative Pulp Fiction style.”
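
The clientless variant needs even less than that. Again, this is my sketch of the principle rather than Ian’s tooling, assuming opencv-python for frame grabbing and pytesseract for OCR, plus a hypothetical screen-recording.mp4 of `cat /etc/passwd` scrolling past:

```python
# Toy sketch of clientless screen exfil: the "client" is just cat(1) on
# the victim's screen; the attacker films it and OCRs the recording.
# Assumes opencv-python and pytesseract (plus the tesseract binary).
import cv2
import pytesseract

def recover_text(video_path, frame_step=30):
    cap = cv2.VideoCapture(video_path)
    seen, lines, n = set(), [], 0
    while True:
        ok, frame = cap.read()
        if not ok:
            break
        if n % frame_step == 0:  # sample roughly one frame per second
            for line in pytesseract.image_to_string(frame).splitlines():
                if line and line not in seen:  # crude de-duplication
                    seen.add(line)
                    lines.append(line)
        n += 1
    cap.release()
    return "\n".join(lines)

print(recover_text("screen-recording.mp4"))
```

OCR is lossy, which is exactly why the fancier variants add framing and error correction - but as a demonstration that pixels on a screen are a data channel, it’s enough.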

What is the implication? We are fucked.

Seriously, when Ian revealed his HID backchannel I threw my hands up in the air and announced, “I give up, we’re all fucked.”

What does this mean:

  • Use and disclosure are the same thing. If people have a screen and keyboard, they can exfiltrate data or get an IP address on your network.
  • Off-shoring/right-sourcing/best-sourcing: this cannot be safe.

Essentially this is pretty much proof against any meaningful technological countermeasures. You could look for specific attacks, but it would be very impractical; you could try large-scale statistical analysis (track all keyboards and video in your environments and see if any of them have odd patterns of behaviour), but that would be phenomenally expensive and time-consuming.

After my initial reaction I found myself tying this into the general theme of many of the presentations: this is a technological attack with no easy countermeasures. The counter is a human one, though. If you see someone sitting at their desk with their cellphone recording a stream of QR codes, you might ask them what they’re up to. If you’ve outsourced that role to someone who subcontracts it to someone else… well, you’ll never ever know. Don’t hire shitweasels. Know your people.

This was hands down the most disturbing “pure” technical presentation of the conference for me.

Peter Gutmann

Boom. Boom boom boom. A really interesting talk on WW II bomb disposal. I didn’t bother taking notes, but I’ll add that if anyone was wondering why the German fuses were so much more advanced than allied ones they should read Len Deighton’s Blood, Tears, and Folly, which will explain to you the perverse incompetence of the British military.

Manipulating Human Minds - Christina Camilleri

  • A lot of the talk relies on deep theory, so keep an open mind.
  • Social engineering - hacking the mind; study people, understand how and why people behave and react to prompts.
  • One key element is to make the victim feel like they will benefit from co-operating with you.
  • Christina likes social engineering because it’s the most challenging element of security.
  • You can spend tens of millions buying every tool at the RSA conference and you’ll be bypassed by human failure. (Like RSA Inc selling out their users for $10 million?)
  • Humans become the path of least resistance, especially as technical defences become more and more sophisticated.
  • You can prey on people’s sense of helpfulness and urgency.
    • Her example was tracking a target who was on holiday. She claimed she needed to come back from holiday urgently to do a last minute report for the boss.
    • But, oh dear, she’d forgotten her email and network logins.
    • The fatal phrase on the helpdesk when she claimed to need password resets to get back to work was “I’m not supposed to do this, but…”

Persuasion

  • One option is the central route: the direct method. Arguing without preparing the victim and crafting a story. Bluffing, intimidation.
  • Peripheral route: coming up with a story, appealing to the victim’s vanity, or other similar techniques.
    • For example, convincing people that it was their idea all along.
    • Milton Erickson’s art of “being artfully vague” is a useful idea Christina cites.
    • A study used three reasons to jump queues at a photocopier:
      • providing an excuse - “this is very urgent” - was the least successful.
      • not providing an excuse (“May I jump the queue?”) - about average.
      • “Can I jump the queue to photocopy?” - reinforcement - was the most successful. You seem to be providing more information, which people react to, but you aren’t.
  • Framing is a key element: Christina recommends thinking about the Cialdini 6:
    • Authority.
    • Liking.
    • Social Proof (that is, encouraging the target to indulge in herd behaviour).
    • Scarcity.
    • Reciprocity.
    • Commitment and consistency.

…but simple attacks (tail-gating, shoulder surfing) still work.

Genders in Social Engineering

  • Whether there’s a gender advantage depends on the target. Women don’t necessarily have the advantage people think they do.
    • For example, women can’t take the authority role easily.
    • Women don’t like taking orders from other women.
    • Moronic stereotypes can be your friend.

I got more than a hint that Christina has suffered from having her success in this area dismissed as being the result of her being a woman, rather than based on her knowing her shit.

NLP

  • The NLP-derived ideas about facial/eye-movement cues are a bit faux-science, but can be useful to understand.
  • Mirroring is a great hint as to the level of engagement.

Story Time!

  • An attack on Boeing as part of the CTF.

    • “Naomi Woolf” - pretty, cute, smart, gamer girl - came into existence with her fake Twitter and Facebook profiles.
    • She started friending people at Boeing, which snowballed into more friend requests.
    • People started wanting to meet face to face.
    • Information was gained.
  • Christina didn’t need to initiate any attack beyond the initial seeding, using Facebook, Twitter, and LinkedIn to create a social media presence for “Naomi”. After she laid the groundwork the marks came to her.

  • An attack on a firm, purporting to be from an accounting firm, mailing USB sticks with a cover letter.

    • The fake firm had web sites and a Companies Office presence.
    • The letter contained references to a fictitious tax change relevant to the targets.
    • It looked and sounded authoritative, and offered the targets the possibility of gain.
    • A significant proportion of the targets plugged in the USB stick as instructed.

Security Through Education

This ties into other speakers’ observations:

  • Do live fire exercises.
  • Never victimise people - understand individual failures in the context of broader systems.
  • Modify systems to deal with failure.

0xkitty’s talk was fast-paced and information-rich.

Lightning Talks

15 minute chunks. Pyro will be activated at 15 minutes. Which seems like an incentive to run over time to me.

Amon-Ra - Fallout from bus hacking.

This was the followup to his Kiwicon 7 presentation on weaknesses of the Christchurch electronic bus tag system, which created quite the flap.

  • Owned the smart cards for the Christchurch buses.
  • Site was taken down two days after Kiwicon, but only after the head of ECAN was interviewed by RadioNZ.
  • But they weren’t interested in fixing the problems until the Kiwicon talk - William had emailed them and then gone into the ECAN offices to try and convince them to take the problem seriously before Kiwicon, and got nowhere.
  • Stuff’s coverage was hostile to AmonRa, but the comments were generally positive.
  • Five months later they put the site back online.
  • They still haven’t replaced the vulnerable cards.
  • Nor have they rolled out the newer generation of cards, which are supposed to be less vulnerable.

Because William (and friends) were curious about what was going on behind the scenes, Chris from Insomnia put in an OIA request to get info about the reaction.

  • ECAN ignored it (sensing a theme here?).
  • Until the Ombudsman got involved.
  • The OIA dump was both interesting and disturbing on a number of levels.
  • The OIA dump showed that they knew of the vulnerabilities, even though they initially told the press that they had been caught short by the lack of disclosure.
  • They refused to engage with local security firms to fix the problems. Instead, they denounced the suggestion as evidence that William was a crook trying to drum up business.
  • They went to the police. The police advised them to blackmail AmonRa into avoiding disclosure by threatening criminal action.
  • “The police and government shouldn’t be doing this.”
  • The vendors were either incompetent or lying to ECAN when ECAN asked questions.
  • They were lying to themselves about whether they’d done anything about it.
  • Lack of monitoring or ability to respond to the threat - they had nothing in place to flag suspicious transactions and card behaviour.
  • They have no anti-fraud mechanisms.

Paul Ash - The National Cyber Security Office

  • Yeah, nah. Reagan was a fucking dick. Maybe political advocacy is not the best foot forward if you want to engage with people.
  • Jeeze. Can’t ditch the jargon.
  • Policy, not practice.
  • Why is this in the Prime Minister’s office?
  • Using the phrase “security state we want for New Zealand” probably wasn’t intended to be Orwellian. But it came across that way.
  • “New Zealand needs to be seen as a safe secure place […] where human rights are respected.” Hmm.
  • “There will be compromises.”

The most disturbing part of this was the attachment to the PM’s office, a political role. Well, and the reflexive reaching for the language of the Stasi state.

Caleb “alhazred” Anderson - I Know What You Did Last Wednesday.

  • “I bet I can hack the apartment VOIP phones.”
  • He did. The video phones in his apartment building can be remotely hacked, and the camera used with no notice to the apartment owner.
  • Much like Matthew’s talk on Blu-ray players, many of the decisions around the hack were claimed to be influenced by drink and laziness.

snare - Voltron Defender of the Something Something

snare likes hugs, apparently. snare spends more of his life inside gdb and lldb than he would care to think about.

  • “Fuck debuggers.”
  • Debuggers have terrible UIs.
  • People have tried to fix this. With UI enhancements written in GDB’s scripting language.
  • Which is apparently about as attractive as metlstorm cupping a pig’s balls.
  • gdb and lldb have added support for embedded Python in *db.
  • So snare has written a bunch of code to support a JSON API exposed over various interfaces, and a nice ncurses front-end (a toy flavour of the embedded-Python hook follows this list).
  • Supports HTTP. gdb-as-a-service. (That doesn’t sound good to me, but I’m not a gdb hacker.)
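
Voltron itself is up on GitHub; as a flavour of the embedded-Python hook it builds on, here’s a minimal, hypothetical gdb command (my toy, not Voltron’s actual code) that dumps register state as JSON - the kind of machine-readable output a front-end or HTTP layer can be bolted onto:

```python
# regs_json.py - hypothetical toy, not Voltron's code: a gdb command
# using the embedded Python API to emit register state as JSON.
# Load it inside gdb with: (gdb) source regs_json.py
import json
import gdb

class RegsJson(gdb.Command):
    """Dump registers as JSON: regs-json"""

    def __init__(self):
        super(RegsJson, self).__init__("regs-json", gdb.COMMAND_USER)

    def invoke(self, arg, from_tty):
        regs = {}
        # "info registers" lines look like: name  hexvalue  decimalvalue
        out = gdb.execute("info registers", to_string=True)
        for line in out.splitlines():
            fields = line.split()
            if len(fields) >= 2:
                regs[fields[0]] = fields[1]
        print(json.dumps(regs))

RegsJson()  # instantiate to register the command with gdb
```

From there it’s a short hop to serving that JSON over a socket, which is roughly the shape of gdb-as-a-service.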

Marissa was the first person to see that being able to make flames shoot out of the stage by refusing to shut up is not a problem, it’s an opportunity.

Her talk was around understanding the law.

  • “Legal” is mostly a continuum. There are things which are socially unacceptable and have clear legal sanction. Do that, you’re in trouble.
  • There are things which are clearly legal and socially acceptable. No worries!
  • Everything in between gives you wiggle room. It’s technically illegal for two 15 year olds to have sex in New Zealand, but it’s vanishingly unlikely the police would bother prosecuting consensual sex, because the social view militates against it.
  • Where social pressure exists for or against an activity this significantly affects the likelihood that you’ll be prosecuted for something. The police have significant discretion in these things.

A good, interesting talk, and more than a little relevant to an audience that need some wiggle room to do their research and present it. And bonus points for glorying in fire.

Breaking AV Software - Joxean “@matalaz” Koret

I had a flat battery for this talk, so my notes are nonexistent. The upshot of the talk was fairly straightforward: your AV software is mostly hilariously insecure, and quite possibly the least secure thing on a modern Windows system. Joxean mocked AV vendors for using “90s technology” to protect their scanning engines from malicious activity.

Closing

So the high point of the closing was winning an award especially created for me: the “most 80s jacket”, after I spent Thursday evening making an iron-on transfer of the Kiwicon logo for my old acid wash denim jacket. It was the most 80s thing I could think to do with the most 80s item of clothing I own. The award made me very (unironically) happy.
