Behavioural Patterns are the New Skin Colour

In a system that can't see you, your behavioural patterns become the basis for discrimination. They're all a digital system can really know about you. Physical attributes don't mean anything in a digital world; things like your ethnicity, gender, and age aren't relevant to digital systems the way they are to bigots. But they can be inferred from your behavioural patterns.

As systems increasingly categorize us based on our behavioural patterns, they force us into different digital classes: some users are ideal, some are problematic. If you use a system in a way that doesn't violate any terms but that the operators didn't intend, you face growing algorithmic discrimination as you try to navigate it. These are the systems we use in everyday life: your bank account, your social media, your iCloud and Google Drive. You could be restricted or completely cut off at the whim of an algorithm, stuck without appeal because you won't even know which rule you broke.

And nobody knows what the rules are except for the ones who wrote 'em.