Categories
data free speech programming security

The uncrackable back door: the intersection of mathematics, AI, politics and ethics

The following is a lightly edited conversation I had with a tech-savvy friend who works outside IT. It was originally about the FBI trying to break the encryption on an iPhone so they could access potential evidence of criminal activity, but with the UK government now seeking to add backdoors to all messaging platforms for much the same reason, I thought it was a good time to revisit the arguments.

My friend’s comments are quoted, and the unquoted text is mine.

Imagine a technology existed that let you send things via post, where to you and everyone else it looked like a sealed envelope, but to the NSA it looked like a postcard, and they could read everything.

How does the NSA prove it’s them? How can we trust them? What if the FBI or your local police force can pretend to be the NSA? Couldn’t criminals, or your stalker ex, do it too?

Maths doesn’t deal with legal balance. Either you let everyone in, or you let no one in. That’s the political choice. Is getting access to this phone more important than keeping other governments, such as China or North Korea, out of phones they are interested in?

I don’t know if it’s an all-or-nothing situation though… are we saying that the courts shouldn’t be able to force entry into criminals’ data? Or are we saying that all data should be accessible to all, outside existing privacy laws?

Think of the Enigma code. Once it was broken, Bletchley Park knew most of what the German military was doing. If the Nazis had known it was broken, they’d have stopped using it, and all the work would have been for nought.

Enigma is a great example of why the code needed to be broken in the first place. That’s a chicken-and-egg scenario. But also a really interesting point! What if an iPhone is Enigma, and, say, GCHQ cracked it? Would the evidence be allowed in court?

Is it not a case of Apple granting access to specific phones, rather than handing over the technique to do so?

What I’m worried about is the fact that big companies could hold justice and common law to ransom: that to me is just as worrying as Big Brother, if not more so. We can “elect” governments, and they can pass legislation to create international privacy agreements (as Snowden’s revelations led to). We can’t elect Apple, and I detest how Apple seem to be influencing justice; that is a very, very bad sign of things to come.

Don’t even get me started on how data protection doesn’t exist between companies any more. Log in via Facebook, anyone?

Is it not the case that Apple can access all this data anyway? So does Apple not have an ethical responsibility to disclose evidence for an individual case that has a court request attached to it? Guess not. Is that an appropriate level of power for a company to have? To dictate what can and can’t be shared with courts?

Corporations already have too much power in the world. By not establishing a legal framework for when it is appropriate for a court order to be issued and grant access (e.g. to break and enter), we are basically letting sometimes serious criminals have a get-out-of-jail-free card. And that includes tax dodgers like Apple.

Apple can’t access the data at the moment; that’s the point. It only exists on the phone, encrypted with a key that’s password-protected with a password known only to a dead guy.

Interesting. So none of his data was stored on Apple’s or third-party servers, and it was all encrypted on the phone? What about all his comms traffic?
If I encrypt my (ahem) Google Android phone, does that mean that my emails can’t be viewed by Google?

A lot of this comes down to trust. I don’t trust our govt nor the govt of others, but equally I don’t trust Google or Apple.

He switched off iCloud sync so it was all on his phone. However, as it was government issue, they could have changed that via policy if the FBI hadn’t tried to change the iCloud password, and hence locked the phone out of the government domain.

So they got locked out. That’s hilarious.

What I tend to do these days is try to remove my mind from the broader political implications and think about things at ground level. Then I thought: what if a phone contained information related to the death of my loved one? Then I realised there should be a controlled process in place to retrieve data legally and transparently.

I think the broader implications are important. If they can do it here, where else would it apply?

We have to think of real-world scenarios: a murder in Glasgow, a child missing, that type of thing.

Look at councils using anti-terror legislation to catch petty criminals, or the DSS using it to follow people on benefits.

Imagine an encrypted padlock on a cabinet containing murder weapons.

Who watches the watchmen?

That’s conspiracy speak, Craig. If we don’t trust the courts… then who can we trust?

It’s recorded activity. It’s not conspiracy if it actually happened.

Courts are separate from government. They have been in Scotland since 1748.

I trust the courts. The problem is that many of these powers bypass the courts.

DSS is rarely a court matter.

Yes, but they are doing so illegally and that’s why new laws are coming in

And a backdoor for one is a backdoor for all. If the FBI have a post-it note with the PIN for that murder-weapon safe, it only takes one photo for everyone to have access.

The FBI is not the UK. We cannot control what Israel does, but what we can do is create controls for the UK. So… if my loved one is killed, and there are photos on the phone… then of course the police should have access! It’s a no-brainer.

True, so why would we want a situation that increases the risk of Israel, or North Korea, having the means to access something that sensitive?

What’s sensitive exactly? They don’t care about normal users!

Even if it means journalists at the News of the World can also gain access to those photos?

That’s illegal! As is breaking and entering.

It didn’t stop them last time.

Yes.. and look what’s happened.

They renamed it to the Sun on Sunday, and carried on as normal?

Come on…. I’m saying that only the courts can have access.

Being illegal doesn’t stop things from happening. That’s why we lock our doors and fit burglar alarms.

and besides… they cracked the iPhone anyway!

That’s not how maths works.

Life isn’t maths. Life is ethics. Ethics are not maths.

Yeah, there’s an Israeli company that will break into iPhones for anyone who pays.

What Israel does is up to them.

No, but encryption is maths.

But retrieving data is an ethical issue. It’s not black and white. It’s about appropriate use of powers.

Like knowing when to put someone away for life, or when to release them after 10 years.

It would not be acceptable for police to hack my phone without just cause, but it would be acceptable if they suspect me of plotting a terrorist act.

I agree, but when the data cannot be accessed without compromising everyone’s security, we have to ask where to draw the line.

We draw the line through the law.

CCTV inhibits crime in the areas it covers, but we accept that it’s creepy to allow it in bathrooms.

Exactly… there are laws regarding the use of CCTV.

And many offices do not have CCTV inside because the risk of losing sensitive data is higher than the risk of crime.

You can only film on your own property. That’s the law. But… of course there is a difference between private companies and local government. And that’s where PFI comes in…

Plenty of public CCTV as well

Not here there isn’t

Depends where you are, agreed.

There’s a camera on the bus… I think, and at the primary school, maybe one in the shop… but I don’t think Big Brother is watching when they can’t find muggings taking place at the Broomielaw!

That’s about effectiveness though.

Google is the one to watch

And Facebook

Yeah… but Facebook has countless terrorist pages, funnily enough. So they can’t even monitor effectively, let alone GCHQ.

Depends who has the most effective algorithms. We don’t know what GCHQ is capable of. Just ask Snowden.

You know fine well it’s not about monitoring: it’s about textual analysis, patterns, heuristics. GCHQ is trustworthy. I have no problem with them whatsoever.

That’s cos you’re not Harriet Harman, or a union activist.

I really don’t. Maybe I am naive, but I’m not scared. If I want to disconnect, all I have to do is switch off the router and remove my SIM…
oh, and stop using my bank card…
and then become a missing person…

Not GCHQ, but… the police faced hard questions about covert monitoring of Jeremy Corbyn and other MPs.

Well that’s not surprising. This has nothing to do with encrypted phones.

That security services were monitoring privileged conversations of individuals not suspected of criminal activity?

Does that come as a surprise? They may as well just have attended a meeting.

No. But it shows trusting the courts is naive when it comes to backdoors.

Attending a meeting is enough to put you on a watchlist.

This is not the same as getting access to evidence for a crime that has taken place. If you want secrecy, you can meet in the woods. It’s very simple…

Sorry, but I do trust our system of justice… I don’t necessarily trust the government, and I certainly believe that there should be watertight controls that allow for breaking and entering into criminals’ data. And that includes data from corrupt politicians. It works both ways.

Digital forensics is a thing… with unbreakable encryption, the whole thing falls down.

Now… I like encryption… especially for B2B, but Apple are not gods! And private companies should never be above the law. If we let private companies rise above the law, we will be in a much worse situation than we are now… it’s already bad enough with tax avoidance.

It’s not about being above the law. It’s about a clear standard, and if police don’t have evidence to hand, they have to collect it. Sometimes cameras are broken. Sometimes weapons are lost, and sometimes you can’t get access to encrypted data.

They can only legally collect evidence if they have sufficient knowledge of a criminal activity.

And they have ways to set up intercepts in those cases, without physical access to the phone.
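To make the maths point from the conversation concrete, here’s a minimal sketch in Python (using the third-party cryptography package; the passcode and data are invented for illustration, not Apple’s actual implementation) of why a phone-style design locks out even the manufacturer: the key is derived from the user’s passcode, and the maths has no concept of an authorised reader.

```python
# Minimal sketch of passcode-based device encryption. Assumes the third-party
# "cryptography" package (pip install cryptography); details are illustrative.
import base64
import os

from cryptography.fernet import Fernet
from cryptography.hazmat.primitives import hashes
from cryptography.hazmat.primitives.kdf.pbkdf2 import PBKDF2HMAC


def key_from_passcode(passcode: bytes, salt: bytes) -> bytes:
    """Derive an encryption key from a passcode, roughly as a phone does."""
    kdf = PBKDF2HMAC(algorithm=hashes.SHA256(), length=32, salt=salt,
                     iterations=480_000)
    return base64.urlsafe_b64encode(kdf.derive(passcode))


salt = os.urandom(16)                   # stored on the device; not a secret
key = key_from_passcode(b"0000", salt)  # exists only while the passcode is known

token = Fernet(key).encrypt(b"photos, messages, location history")

# Without the passcode there is no key, and the token is indistinguishable from
# noise: to Apple, to the FBI, and to everyone else. With the passcode,
# decryption works identically for the owner, a court, or a thief.
print(Fernet(key_from_passcode(b"0000", salt)).decrypt(token))
```

An escrowed copy of that key for the courts is, to the maths, identical to a copy for a thief: that’s the backdoor problem in one line.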

Further Reading

Bill Gates says Apple should unlock the iPhone for the FBI

Feds ordered Google’s help unlocking nine Android phones since 2012

Troy Hunt: Everything you need to know about the Apple versus FBI case

Apple’s FBI Battle Is Complicated. Here’s What’s Really Going On

Continuing the Conversation About Encryption and Apple: A New Video From Mozilla

Encryption keeps us safe. It must not be compromised with ‘backdoors’ | Robby Mook

Open Rights Group: who’s checking on your chats in private online spaces?

Categories
ai data development free speech Uncategorized

2022 reflections

2022 seems to have been a strange year for a lot of people. Many of the bloggers I follow saw their output drop this year, myself included. Some of that, I’m sure, is a loss of community: with the changes at Twitter and Facebook, and Google’s AMP as well, there’s been less drive-by traffic and less engagement.

I also think online discourse in many places is following the lines we see in politics, where subtlety and nuance are increasingly punished, and every platform is pushing shorter-form content. We’re not giving ourselves time to digest and reflect.

And we should.

The pandemic is still here, but we’re adjusting. Working from home is a natural state for many of us in tech, although it’s not an arrangement that plays to everyone’s strengths, so let’s make space for different companies with different cultures. There are new ways of working to explore (hello, the UK four-day-week experiment), and people have moved jobs to take advantage of the change and create more family time.

But we can’t escape the world outside tech, and many of us are burning mental cycles on disease, on the massive weather events from climate change, on war, on the continued assaults by the far right, and on watching inflation ticking upwards. It’s not an environment that leads us to our best work. It’s not an environment that helps us be in the moment.

Through 2016-2021 the world stared into the abyss of the rise of the far right and the dismantling of certainties, before we were all thrown into lockdown. We were hoping for a turning point this year, but our leaders were lacklustre in their improvements, pulled us further to the right, or were just plain incompetent. Instead of hope to counter the despair, we got indifference at best. Rather than turning away from the abyss, we collectively chose to build a car park next to it.

The greatest minds of our generation are building pipelines for ads for things we don’t need and can’t afford, whilst the AI engineers are building complex transformations that churn out uncanny valley versions of code, of mansplaining and of other people’s art. But of course the AI is built on a corpus of our own creations, and I don’t think we like the reflection looking back at us.

Ethics in technology isn’t just about accurately reflecting the world as it is, or how the law pretends it is (or seeks to adjust what is); STEM at its most important shows us the world as it could be. An airplane isn’t just a human pretending to be a bird. A car isn’t just a steel horse.

Yes, these advances in AI are cool parlour tricks, and they will lead to great things, but just as drum machines didn’t replace drummers, we need to get past the wave of novelty to see what’s really behind the wizard’s mask.

AI is dangerous. Look at how machine learning projected racial predictions onto zip codes based on historical arrest data. Look at how many corrections Tesla’s “Self-Driving Mode” requires. Look how easily ChatGPT can be manipulated into returning answers it’s been programmed not to. But, with the right oversight, AI encompasses some very useful tools.

Let’s get out of the car park and look away from the abyss. What does the world AI can’t predict look like? After years of despair, what does a world of hope look like? What does the world you want for your children, grandchildren, nieces and nephews look like?

Land on your own moon. What’s your 10 year plan to change your world?

Categories
free speech leadership

No politics at work

No politics at work.

We’ve got these tabulating machines to send to Hitler.

Talking politics doesn’t get us paid.

There are plenty of companies withdrawing into their shells of privilege because the founders are scared of getting uncomfortable.

If you want a more uplifting picture of what happens when you talk politics at work: you save lives.

Whatever the antecedents of the recent wave of decisions to ignore people’s lives by “not talking politics”, the effects are definitely anti-union, anti-women, anti-BLM, anti-trans and pro-Christian-conservatism.

Even at companies that may have introduced these policies as a reaction against actual fascist viewpoints on their internal discussion boards (which I haven’t seen to be the case, but I have heard some justify it that way), banning all politics supports the fascists.

It chills the speech of the oppressed, and gives fascists ammunition that “our free speech is under attack”.

There are plenty of policies that allow you to talk about maternity leave and single-payer healthcare without allowing blood-and-soil nationalism.

Companies that ban all politics are companies that have weak leadership who only want to align with the prevailing winds, in the most conservative way possible.

Companies, especially tech companies, can change the world. But too often they just reinforce the status quo.

Even companies that want to liberate us from the office, or liberate us from fossil fuels, or liberate us from Earth, still reinforce the power structures that led to the problems they claim to want to solve.

To summarise this, and other points:

“we’re uncomfortable with you having a life outside work”


Categories
code data development free speech security

Ethics in technology

This is an extension of a Twitter thread I wrote in response to a tweet and thread about the Cambridge Analytica revelations.

One of the key modern problems is how easy it is to access these tools. You don’t need professional training to string these together.

It’s as dangerous as if someone had invented a weapon that could kill tens or hundreds of people, light enough to carry anywhere, and available in any store without training, with owners expected to police themselves.

People are terrified of AI. We know we don’t need AI to disable hospitals. We don’t need AI to intercept Facebook logins (although Firesheep and the WiFi Pineapple are less effective now). We don’t need AI to send a drone into a crowded market.

Make a website the only place for government applications, such as Medicare or millennials’ railcards, and it becomes easy to remove access for all citizens.

But combine all that with data and you can fuck up someone’s life without trying. You can give two people the same National Insurance number or other ID. You can flag them on the no-fly list.

You can encode prejudice into the algorithm and incarcerate someone because they grew up in a black neighbourhood.

The algorithm is God. The algorithm is infallible. Trust the algorithm.

Even when it tells you someone is more capable than the humans say she is, and punishes her.

(unless you’re under GDPR, where you have the right to question the algorithm)
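The neighbourhood point is worth making concrete. Here’s a toy sketch (Python with NumPy and scikit-learn; every number is invented for illustration) of how training on historical arrest data, rather than on actual offending, bakes the policing pattern into the “risk score”:

```python
# Toy sketch: a model trained on biased arrest records, not on behaviour.
# Requires numpy and scikit-learn; all rates are invented for illustration.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
n = 10_000
neighbourhood = rng.integers(0, 2, n)  # 0 = historically over-policed, 1 = not
offending = rng.random(n) < 0.05       # true offending rate: identical everywhere

# The training label is *arrests*: over-policed areas were searched far more
# often, so the label is biased even though behaviour is not.
caught = rng.random(n) < np.where(neighbourhood == 0, 0.9, 0.2)
arrested = offending & caught

model = LogisticRegression().fit(neighbourhood.reshape(-1, 1), arrested)

# Same behaviour, very different "risk": the model has learned the policing.
print(model.predict_proba([[0], [1]])[:, 1])
```

Race never appears as an input column, but the neighbourhood carries it in anyway; the model is just laundering the bias through maths.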

But tell anyone that people will use data for purposes they hadn’t considered (like using RIPA anti-terror legislation to check whether someone’s in the school catchment area), and you’re paranoid.

Be paranoid. People will always stick crowbars in the seams. Whatever your worst case scenario for your code is, you’re probably not even close.


You can see my original tweet, and the replies, here:

The Guardian has a great interview on AI, existential threats and ethics on their podcast here.

Categories
data development free speech security

Government insecurity agencies

Given the SSL attacks, such as FREAK and Logjam, that can be traced back to classing secure encryption as a weapon subject to export restrictions, it’s clear that government security agencies have a deep conflict of interest, one that has led to significantly reduced security protection for their own citizens.
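As a toy illustration (this is not the actual FREAK or Logjam attack, just the arithmetic underneath them, sketched in Python with sympy): an RSA key is only as safe as the cost of factoring its modulus, and a deliberately capped one falls instantly.

```python
# Toy sketch of why export-grade key-size caps rot over time. Requires sympy.
# Real "export-grade" RSA was 512 bits, factorable for pocket money today;
# this 64-bit toy modulus falls in well under a second on a laptop.
from sympy import factorint, randprime

p = randprime(2**31, 2**32)
q = randprime(2**31, 2**32)
n = p * q  # the public modulus of our deliberately weakened key

# Anyone who can factor n can reconstruct the private key and read everything
# ever encrypted to it.
print(factorint(n))  # recovers p and q: the "secure" key, broken
```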

It’s clear that the ransomware (or ransomware-as-diversion) attacks on UK and US hospitals and many other sites are directly due to the NSA backdoor toolkit that was stolen earlier this year. If the government has a backdoor into a system, or an encryption platform, everyone has a backdoor, even if they don’t have access to it yet.

Which is why it’s great to see the EU outlawing backdoors in order to protect us as patients, service users and data subjects, and I fully expect this will apply, like GDPR, to any system holding EU citizens’ data. So when the UK brings in its “we need a back door” legislation, companies will need to choose: trade with the UK and compromise their security, or trade with the much bigger EU and protect their customers.

Encryption is like a lock, but it isn’t. It’s like a safe door, but it isn’t. Abstractions help to frame the problem, but they can obscure the issues. They make lawmakers think that what applies to banks applies to data.

(Note: bank processes are optimised to replace credit cards, because security works best when you can throw away a channel and start again if it’s compromised. This includes reversing transactions, which is hard to do when it’s the release of your personal data that needs reverted, rather than a row in a ledger that can be corrected by an additional row.)
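Here’s a sketch of that note with an invented toy ledger: the bank’s fix is always another row, and there is no equivalent row for a data leak.

```python
# Toy sketch: ledgers are append-only, so mistakes are fixed by compensating
# rows. Leaked personal data has no compensating operation.
from dataclasses import dataclass


@dataclass(frozen=True)
class Entry:
    description: str
    amount_pence: int


ledger = [
    Entry("card payment", -5_000),
    Entry("fraudulent payment", -20_000),
    Entry("chargeback: fraudulent payment", +20_000),  # the correcting row
]
print(sum(e.amount_pence for e in ledger))  # the balance is made whole again

leaked = {"name", "address", "medical history"}
# leaked.clear() empties *our* copy; the attacker still has theirs.
# There is no chargeback for a data release.
```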

Encryption isn’t the problem. The San Bernardino iPhone had no useful intel. All the recent attackers in the UK were known, reported, and could have been tracked if they had been prioritised. Banning encryption will have about as much impact as banning white vans. Breaking encryption weakens our security and threatens international trade, especially with the EU; and when security holes lead to attacks on our hospitals and other infrastructure, bad security threatens our lives.

But so long as we’re afraid of terrorism, it’s OK for the populace to suffer?

Categories
free speech security

The graveyard of things

[Photo: Dunnet Head stone, end of the road]

In the 1970s, UNIX was big, and so were the machines it ran on. The source code was controlled by those who sold the computers, and if you wanted to modify it so that you could fix things, or improve things, you were stuffed.

The tinkerers weren’t happy, so they created a charter, a licence to share, improve and adapt, so that you could create. Free Software was born: free to be used, changed and distributed. It wasn’t for everyone, but tinkerers loved it, and it changed the world.

Fast forward to today, and one of the most famous users of open source, and part-time supporter, Google, stirs up trouble in its Nest division when it announces not only that it will stop supporting an old device, but also that all existing ones will stop working: Nest’s Hub Shutdown Proves You’re Crazy to Buy Into the Internet of Things https://www.wired.com/2016/04/nests-hub-shutdown-proves-youre-crazy-buy-internet-things/

The tinkerers have been duped. They don’t own the devices. They now have expensive hockey pucks.

So what could Google have done?

How about releasing the server code and allowing anyone to patch their device to talk to a local server? It might be less smart now, but it’s still smarter than a hockey puck.
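As a sketch of how little that would take (Python standard library only; the /readings path and payload shape are my invention, not Nest’s actual API), a patched device could simply POST its data to a box on your own shelf:

```python
# Minimal sketch of a local, owner-controlled endpoint for device data.
# Standard library only; the route and payload are illustrative assumptions.
import json
from http.server import BaseHTTPRequestHandler, HTTPServer


class ReadingsHandler(BaseHTTPRequestHandler):
    def do_POST(self):
        if self.path != "/readings":
            self.send_error(404)
            return
        length = int(self.headers.get("Content-Length", 0))
        reading = json.loads(self.rfile.read(length))
        # Append to a local log the owner controls; no vendor cloud involved.
        with open("readings.jsonl", "a") as log:
            log.write(json.dumps(reading) + "\n")
        self.send_response(204)  # accepted, nothing to return
        self.end_headers()


if __name__ == "__main__":
    HTTPServer(("0.0.0.0", 8080), ReadingsHandler).serve_forever()
```

Less smart than the vendor’s cloud, but the data keeps flowing whatever happens to the manufacturer.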

Indeed, in a world where breaches are getting more common, and devices have more and more access into our lives, why isn’t local access an option? Maybe we need new standards, but most of this data has been accessible via USB for years.

This is your data and you should have the option to secure it to your network, and to keep collecting and using it no matter what changes happen to the original manufacturer.

Embrace tinkering. Reject dead man’s switches.

Categories
amnesty free speech

Amnesty

As a scientist, I value freedom of speech very highly, which is why I’ve given the Amnesty International banner a prominent place above. Once speech gets shut down, thought can quickly follow.

I’m still working on getting my own speech out there. I’ll post details of papers and software soon.