Encryption & anonymity are a responsibility, not a right – In defence of cryptoanarchy

Most of the world’s internet users feel little need to rely on encryption beyond where it is completely obvious and implemented by default (e.g. when performing online banking). But when it comes to personal communications, where traffic interception by the State is highly likely in some jurisdictions and an outright certainty in others, the average user takes the “I have nothing to hide” attitude.

Assume by default, whether you live in a ‘democracy’ or under overt fascism, that the State intercepts everything. Constitutional rights are a facade. They are not enforced by the State; they exist to hold the State to account. The onus for enforcing them lies upon us.

In today’s world, where advanced machine learning technology is freely available to all, and implementable by those with even the most elementary technical expertise, this attitude is naive at best and wilfully negligent at worst. It rests on the outdated notion that all the State has to gain from unencrypted communications, and the identities of those involved, is some obscure and seemingly unimportant piece of information (who cares what I had for breakfast, you say?). That is a completely forgivable, but fundamental, misunderstanding of not just what information can be directly extracted, but what can be inferred from it.

What can be inferred about you indirectly allows things to be inferred about others, and about others still…

We live in an era of incredibly advanced analytical techniques, backed by astronomical computing resources (over which the State has a monopoly), where, based on nth degrees of separation and the extraction of cross-correlations between who people are and what they say, anything and everything we say directly compromises the integrity of the rest of society.

The information that can be exploited includes, but is not in the slightest limited to:

  • Message content itself.
  • When it was sent and received.
  • Who the sender and receiver were.
  • Their respective geographic locations, including real-time GPS coordinates.
  • More generally, everything included under the generic term metadata.
  • And far more things than I dare imagine…

All of these things can subsequently be correlated with all other information held about you, and those you engage with, and similarly for them, extended onwards to arbitrary degree.

The simplest example of how this type of technology works is one we are all familiar with — online recommendation systems (including purchasing suggestions and social media friend recommendations). By correlating the things you have previously expressed interest in (e.g. via online purchases or existing friendship connections) with the interests and connections of others, advertisers can, sometimes with astonishing accuracy, anticipate seemingly unrelated products to put forward as suggestions of potential interest.
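
As a toy illustration of the principle (not any vendor’s actual algorithm; the purchase matrix and item names below are entirely hypothetical), item-based recommendation can be as simple as measuring which items tend to be bought by the same people:

```python
# Minimal sketch of item-based recommendation: "people who bought X also bought Y".
# The purchase matrix and item names are entirely hypothetical.
import numpy as np

items = ["camping stove", "hiking boots", "VPN subscription", "baby formula"]

# Rows = users, columns = items; 1 means the user bought the item.
purchases = np.array([
    [1, 1, 0, 0],
    [1, 1, 1, 0],
    [0, 1, 1, 0],
    [0, 0, 0, 1],
])

# Cosine similarity between item columns: high similarity means the items
# tend to be bought by the same people.
norms = np.linalg.norm(purchases, axis=0)
similarity = (purchases.T @ purchases) / np.outer(norms, norms)
np.fill_diagonal(similarity, 0.0)

# Recommend for a user who has so far only bought hiking boots (item index 1).
user = np.array([0, 1, 0, 0])
scores = similarity @ user
for idx in np.argsort(scores)[::-1]:
    if user[idx] == 0 and scores[idx] > 0:
        print(f"suggest: {items[idx]} (score {scores[idx]:.2f})")
```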

Alternately, I’m sure I’m not the only one who can attest to having received Facebook friend recommendations based on someone I had a conversation with at a bar, but with whom no digital information of any form was exchanged. Except, in fact, it was — via the real-time correlation of our GPS coordinates, it could be inferred that we spent time engaging with one another in a way that could not have been coincidence.
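
A minimal sketch of how such an inference might work, assuming nothing more than two devices’ location pings (the coordinates, timestamps, and thresholds below are hypothetical):

```python
# Sketch: inferring that two people met from nothing but their location pings.
from math import radians, sin, cos, asin, sqrt

def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance between two points, in metres."""
    dlat, dlon = radians(lat2 - lat1), radians(lon2 - lon1)
    a = sin(dlat / 2) ** 2 + cos(radians(lat1)) * cos(radians(lat2)) * sin(dlon / 2) ** 2
    return 2 * 6371000 * asin(sqrt(a))

# (unix_timestamp, latitude, longitude) pings for two devices (hypothetical data).
alice = [(1000, -37.8136, 144.9631), (1600, -37.8140, 144.9629), (2200, -37.8001, 144.9500)]
bob   = [(1050, -37.8137, 144.9630), (1650, -37.8139, 144.9630), (2250, -37.8500, 145.0000)]

def colocated_pings(trace_a, trace_b, max_m=50, max_s=300):
    """Count pings where the two devices were within max_m metres and max_s seconds."""
    hits = 0
    for ta, lat_a, lon_a in trace_a:
        for tb, lat_b, lon_b in trace_b:
            if abs(ta - tb) <= max_s and haversine_m(lat_a, lon_a, lat_b, lon_b) <= max_m:
                hits += 1
    return hits

# Repeated co-location over an evening is very unlikely to be coincidence.
print("co-located pings:", colocated_pings(alice, bob))
```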

But the depth of analysis that can be performed extends far beyond this, to infer almost unimaginably specific information about us.

Behind the scenes, this analysis is incredibly sophisticated, and invisible to us, performing all manner of cross-correlations across multiple degrees of separation, or indeed across society as a whole, based on seemingly obscure information we would never have given much thought to. It includes not just correlations with other things you have taken an interest in (people you have communicated with, or products you have purchased), but also knowledge of which groups (defined arbitrarily — social, demographic, ideological, interest-based, whatever) you belong to, whether overtly specified or not, and the collective behaviour of each such group. Mathematically, group structure can be revealed via membership of densely connected sub-graphs (such as graph cliques) within a social network graph.

Entire fields of mathematics and computer science, notably graph theory and topological data analysis, dedicate themselves to this pursuit. Machine learning techniques are perhaps the most versatile and useful of them all.

These identified sub-graphs can be treated as ‘groups’ (in any real-world or abstract sense), identified in some manner as having something in common, from which potentially entirely distinct characteristics of their members can be inferred via correlations across groups. Membership of multiple groups can quickly narrow down the specifics of the individuals sitting at their intersection.
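
As a hedged sketch of the idea (a toy graph and invented names, not any agency’s actual pipeline), a general-purpose library such as networkx will happily enumerate tightly knit sub-graphs and intersect them:

```python
# Sketch: finding tightly connected groups in a social graph, then narrowing down
# individuals via the intersection of the groups they belong to. Hypothetical data.
import networkx as nx

G = nx.Graph()
G.add_edges_from([
    ("ana", "ben"), ("ben", "carl"), ("ana", "carl"),   # e.g. a chess-club clique
    ("carl", "dee"), ("dee", "eli"), ("carl", "eli"),   # e.g. a cycling-group clique
    ("eli", "fred"),                                    # a loose acquaintance
])

# Maximal cliques: sub-graphs in which every member knows every other member.
groups = [set(c) for c in nx.find_cliques(G) if len(c) >= 3]
print("groups:", groups)

# Anyone sitting in the intersection of several identified groups is pinned down quickly.
candidates = set(G.nodes())
for group in groups:
    candidates &= group
print("member of every identified group:", candidates)   # {'carl'}
```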

This type of analysis is very much the crux of what modern machine learning does by design — advanced higher-order analysis of multivariate correlations, particularly well suited to social network graph analysis, where nodes represent individuals, and edges carry rich data structures characterising all aspects of their relationships, far beyond just ‘who knows whom’.
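
To make the data structure concrete, here is a minimal, hypothetical sketch of such an edge (the attribute names and values are invented purely for illustration):

```python
# Sketch: a social graph whose edges carry far more than 'who knows whom'.
import networkx as nx

G = nx.Graph()
G.add_edge(
    "alice", "bob",
    messages=412,                 # how often they communicate
    first_contact="2017-03-02",   # when the relationship began
    channels={"sms", "email"},    # which media they use
    co_locations=17,              # times their devices were observed together
    late_night_fraction=0.4,      # share of contact between midnight and 5am
)

# Any combination of these attributes can feed a downstream model as an edge
# weight or feature vector.
edge = G["alice"]["bob"]
intensity = edge["messages"] * (1 + edge["late_night_fraction"])
print(f"relationship intensity score: {intensity:.0f}")
```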

For clarity: many people familiar with the term social network graph interpret it in the context of the graph data structures held by social media networks like Facebook. I mean something broader. Nation states hold advanced social network graphs of their own, into which they have the capacity to feed all manner of information they obtain about you. There is very strong evidence that even in the so-called ‘Free World’, the major commercial players actively collaborate with the State, from which the State is able to construct meta-graphs comprising unforeseen amounts of personal information — even if it is ‘just’ metadata.

Much of the truly insightful information to be obtained from advanced graph analysis is not based upon local neighbourhood analysis (i.e. you and the dude you just messaged), but upon global analysis (i.e. analysis based collectively upon the information contained across the entire social graph and its multitude of highly non-trivial interrelationships, most of which are not at all obvious upon inspection, which no human would ever conceive of taking into consideration, but which advanced algorithms systematically will, without discrimination).
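
A small, hypothetical illustration of the local-versus-global distinction: betweenness centrality is computed from shortest paths across the whole graph, so it can single out a node that its own neighbourhood would never flag as interesting:

```python
# Sketch: a property invisible from your own neighbourhood. Node 'x' has only two
# contacts, yet it is the sole bridge between two communities; only a global
# analysis of the graph reveals this. Hypothetical graph.
import networkx as nx

G = nx.Graph()
G.add_edges_from([
    ("a", "b"), ("b", "c"), ("a", "c"),   # community 1
    ("d", "e"), ("e", "f"), ("d", "f"),   # community 2
    ("c", "x"), ("x", "d"),               # 'x' quietly bridges the two communities
])

centrality = nx.betweenness_centrality(G)
for node, score in sorted(centrality.items(), key=lambda kv: -kv[1]):
    print(node, round(score, 3))
# 'x' scores highest despite having only two contacts.
```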

Machine learning is oblivious to laws or expectations against demographic profiling — racial, gender, political, medical, or otherwise. Even if that data isn’t fed directly into the system (e.g. because of legal barriers), it will almost certainly be recovered indirectly via correlation with proxies.

For example, with access to your Facebook connections and currently freely-available software libraries, even an unsophisticated programmer could infer much about your demographics, political orientation, gender, occupation, and much more, with a high degree of accuracy in most cases, even if no information of the sort were provided on your profile directly. This elementary level of analysis can be implemented in five minutes with a few lines of code using modern tools and libraries. Needless to say, national signals intelligence (SIGINT) agencies have somewhat greater resources at their disposal than a summer student hired for half an hour.
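
In the spirit of that claim, here is a deliberately naive, hypothetical sketch (a homophily-based guess from friends’ declared labels; real systems use far richer features and models):

```python
# Sketch: guess a withheld attribute of 'you' purely from friends' public labels.
# All names and labels are hypothetical.
from collections import Counter

# Friendship edges, and the publicly visible political leaning of some users.
friends = [("you", "ana"), ("you", "ben"), ("you", "carl"), ("you", "dee"), ("ana", "ben")]
declared = {"ana": "green", "ben": "green", "carl": "green", "dee": "libertarian"}

def guess(person):
    """Majority vote over the declared labels of a person's friends."""
    neighbours = {b for a, b in friends if a == person} | {a for a, b in friends if b == person}
    votes = Counter(declared[n] for n in neighbours if n in declared)
    label, count = votes.most_common(1)[0]
    return label, count / sum(votes.values())

label, confidence = guess("you")
print(f"inferred leaning: {label} (confidence {confidence:.0%})")   # green (75%)
```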

The Russian influence campaign prior to the Trump election is alleged to have actively exploited exactly this kind of analysis, with a particular focus on demographic profiling (geographic, gender, voting, ideological, group-membership, and racial), to create a highly individually targeted advertising campaign, whereby the political ads targeted at you may be of an entirely different nature to the ones received by the guy next door, each calculated to maximise psychological response.

This technology demonstrably provides direct avenues for population control, in particular via psychological operations (PSYOPs). If it can be exploited against us by adversarial states, it can be exploited against us by our own. I work off the assumption that all states, including our own, are adversarial.

The political chaos this has created in the United States (regardless of whether or not the Russian influence campaign actually changed the election outcome) is testament to the power and efficacy of these computational techniques. There would be little uproar over the incident were it not regarded as entirely plausible.

Similarly, it’s no secret at all that political parties in democratic countries rely on these kinds of analytic techniques for vote optimisation, policy targeting, and gerrymandering. Indeed, commercial companies license specialised software tools to political operatives for exactly this purpose — it’s an entire, highly successful business model. If politicians utilise this before entering office, you can be sure they’ll continue to do so upon entering office — with astronomically enhanced resources at their disposal, backed by the full power of the State and the information it has access to.

It goes without saying that the tools for performing these kinds of analyses are available to everybody, anywhere in the world, as freely downloadable software packages, and that the entire business model of corporations like Google and Facebook is built upon them (and to be clear, that is the business model, into which astronomical resources are invested). What major nation state intelligence apparatuses (especially the joint power of the Five Eyes network, to which my home country of Australia belongs) have at their disposal extends these capabilities into unimaginable territory, given the resources, both computational and in terms of access to information, that they command.

The National Security Agency (NSA) of the United States, the country’s primary signals intelligence agency, is allegedly the world’s largest employer of mathematicians, and also possesses incredible computing infrastructure. Historically, cryptography (a highly specialised mathematical discipline) has been the primary focus of that mathematical firepower. In the era of machine learning, and given what it offers from an intelligence perspective, you can place bets that it now forms a major component of their interest in advanced mathematics.

But these capabilities are not only applicable to catching terrorists (how many terrorist attacks have actually taken place on American soil to justify investments of this magnitude?). There is good reason that China is now a leading investor in AI technology, given its highly integrated social credit scheme, which has very little to do with terrorism and far more to do with population control.

We cannot become a political replica of the People’s Republic of China. But we probably already are. In the same way that the Chinese people are in denial, so are we.

“But this couldn’t happen here! We are a ‘democracy’ after all?”

It’s now publicly well known that the NSA has a history of illegally obtaining and storing information on American citizens within its systems, in direct violation of the United States Constitution. National security secrecy laws make it impossible to know in what capacity that information has been utilised, but the potential for misuse has utterly devastating implications for American citizens and the constitutional rights they believe in.

When applied from the nation state’s point of view (democracy or dictatorship, regardless), where the overriding objective is population control and the centralisation of power, the primary tool at its disposal is, and always has been, to manipulate and subjugate the people. With this kind of advanced analytic power at their disposal, their ability to do so is in fantastically post-Hitlerian territory. If Stalin were alive today, he would not subscribe to pornography magazines; he would subscribe to the Journal of Machine Learning, and spend the evenings his wife was absent wearing nothing but underwear in front of a computer screen, salivating over the implementation of China’s social credit scheme and its highly integrated nationwide information feed, built upon a massive-scale computational backend that almost certainly employs the techniques described above. He could have expanded the gulags tenfold.

But any kind of computational analysis necessarily requires input data upon which to perform the analysis. There are few computations one can perform upon nothing.

We have a responsibility to provide the State with nothing!
Let them eat cake.
Better yet, let them starve.
And may their graphs become disjoint.

When the State obtains information about your interactions, it adds to its social graph. What Google and Facebook have on you, which many express grave concern about, is nothing by comparison. The enhancement of this data structure does not just compromise you, but all of society. It compromises our freedom itself.

Having prosperity and a high quality of life is not freedom — Hitler achieved overwhelming support because the people thought it was. Freedom means that at no point in time, under any circumstance, can they take it all away from us, or even threaten to do so. That is not the case; it never was, and it likely never will be, yet it must be so.

In the interests of the formation of a Free State, and inhibiting the ever-increasing expansion of the police state, the extrapolation of which is complete subservience, we have a collective responsibility to:

  • Understand and inform ourselves about digital security, including encryption and anonymity.
  • Ensure full utilisation of these technologies by default.
  • Be aware of the extent of what modern machine learning techniques can reveal about ourselves, others, and all of society.
  • Be aware of how such collective information can be used against us by the State, assuming the worst case scenario by default.
  • Employ reliable and strong end-to-end encryption technologies, where possible, no matter how obscure our communications.
  • Conceal our identities during communications where possible.
  • Provide the State with nothing beyond what is reasonably necessary.
  • Oppose unnecessary forms of government data acquisition.
  • Oppose the integration of government databases and information systems.
  • Enforce ethical separations in data-sharing between government departments.
  • Legislate criminal offences for government entities misusing or falsely obtaining personal data.
  • Offer full support — financial, via contribution to development, or otherwise — to worthy open-source initiatives seeking to facilitate these objectives.
  • Do not trust ‘closed’ (e.g. commercial or state-backed) security; rely instead on reputable open-source options. Commercial enterprises necessarily comply with the regulations and laws of the State (regardless of which one). Open-source development, on the contrary, is inherently transparent, offering full disclosure and openness.
  • Work off the assumption that any attempt by the State to seek backdoors, prohibit, or in any way compromise the above is a direct attempt to subvert freedom and a step towards the formation of a totalitarian state.
  • Remember that metadata contributes to the State’s social graph. Even in the absence of content, it establishes identities, relationships, timestamps, and geographic locations, all of which contribute enormously to correlation analysis against other information in the graph (see the sketch after this list).
  • Be absolutely clear that any political statements to the effect of “we’re only able to access metadata” are made with full knowledge and absolute understanding of the implications of the above, and are deeply cynical ploys. They would otherwise not seek access to it.
  • Any political statements seeking any kind of justification based on highly emotive words such as terrorism, safety, protection, national security, stopping drugs, or catching pedophiles, should be treated with contempt by default, and assumed to be calculated attempts, via emotional manipulation, to subvert a free society and centralise power.
  • Modern history should be made a mandatory subject throughout primary and secondary education, in which case nothing stated here would be even mildly controversial or require any further substantiation. For this reason, no references are provided in this post.
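
Regarding the point about metadata above, a hypothetical sketch of how content-free records alone (who contacted whom, when, and from where) already populate a graph:

```python
# Sketch: building a relationship graph from metadata only; no message content exists
# anywhere in these records. All records are hypothetical.
import networkx as nx

# (sender, receiver, unix_timestamp, cell_tower_id)
call_records = [
    ("A", "B", 1_600_000_000, "tower_17"),
    ("A", "B", 1_600_003_600, "tower_17"),
    ("B", "C", 1_600_007_200, "tower_03"),
    ("A", "C", 1_600_010_800, "tower_17"),
]

G = nx.MultiDiGraph()
for sender, receiver, timestamp, tower in call_records:
    G.add_edge(sender, receiver, timestamp=timestamp, tower=tower)

# Identities, relationships, timing, and rough location fall straight out,
# ready to be correlated with everything else already held in the graph.
for person in G.nodes():
    contacts = sorted(set(G.successors(person)) | set(G.predecessors(person)))
    print(person, "contacts:", contacts, "events:", G.degree(person))
```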

All technologies can be used for good or for evil. There has not been a technology in history to which this hasn’t applied. The establishment of the internet itself faced enormous political opposition for highly overlapping reasons. Needless to say, the internet has been one of the most positively influential technological advances in human history, and one of the most individually liberating and empowering tools ever invented — an overwhelming force for freedom of information, expression, and unprecedented prosperity and opportunity across the globe.

I genuinely believe that the biggest threat humanity faces — far beyond terrorism, drugs, or pedophiles — is the power of the State.

History overwhelmingly substantiates this belief, and despite acknowledging the downsides, I make no apologies for offering my unwavering support to the crypto movement and what it seeks to achieve. The power asymmetry in the world has never been tilted in favour of terrorists — it always was, and always will be, in favour of the State, historically the greatest terrorist of them all.

One of the deepest and most eternally meaningful statements ever made in the history of political philosophy came from one of its most nefarious actors:

“Voice or no voice, the people can always be brought to the bidding of the leaders. That is easy. All you have to do is tell them they are being attacked, and denounce the pacifists for lack of patriotism and exposing the country to danger. It works the same in any country.”

— Hermann Göring
