WHEN I was a girl, I liked to watch the moon. I wasn’t much of a sleeper so I would sit with my cheap plastic telescope and my sky atlas. It was comforting to see humans had found a way to reach something so far away. This craggy grey satellite was a sparking flint to human imagination. First, we learned what it was, then how to get to it. Our curiosity, our hunger for knowledge, our desire to solve problems have brought us great things.

Ingenuity is so folded into our lives it’s rendered invisible. The flick of a kettle, a key turning a latch, radio frequencies beaming information into palm-sized computers. All a testament to the mastery of our world and the desire to improve lives, but just like our lonely moon, this brilliance harbours darkness. Our knowledge will cast a shadow over the human project long after we’re gone ... CFCs, pesticides, Agent Orange, the atomic bomb.


Now enter big data, analytics and an opportunity to use both to change the course of history. More of our personal information is available to others than ever before. As we interact with websites, carry out transactions and post on social media, we leave a richly textured digital footprint. We can’t always be sure who’s collecting what and how it might be used.

As technology advances, we can store enormous data sets and analyse them deeply and quickly. Individuals and groups can be profiled, and those profiles used to predict behaviour. This data can be used to improve public health or digital services and to inform research at a scale previously unseen. We can use what we know about the population to better respond to their needs.

Knowledge empowers us, but it can also be used to harm and limit our freedoms. The benefits of massive information capture largely outweigh the risks, but the risks are significant: privacy breaches, data selling and a lack of anonymity. The time you spend online could be adding to a stranger’s picture of you. What they do with that information depends on the stranger in question.

And this is why the company Cambridge Analytica, used by Donald Trump during his presidential campaign, has been criticised over its use of information it took from people’s Facebook profiles. Behaviour change communications are nothing new. Using data sets to inform them is newer, but commonplace and arguably necessary when meeting the needs of a complex world.

The two are combined to influence civic behaviour, from increasing cervical smear attendance and reducing litter to informing public policy and finding better ways to support vulnerable groups. But it’s not all positive. Companies like to sell us things, and data helps them tailor goods to us so they can sell even more. Online, we’re constantly and subtly manipulated by forces larger than ourselves, affecting our reasoning, our habits and our motivations. Regardless, it’s quite incredible the way humans work in symbiosis with tech, how we can lean on the ingenuity of others in the past to solve the problems of the present.

If something isn’t working, isn’t having the desired result or isn’t reaching the right people, data can tell us what to do next. But all of this relies on the human touch, to define the problem and to use technology in service of a solution.

This dance between the digital and corporeal enlarges our imaginations beyond the limits of ourselves, for better and for worse. What we know we are able to do must come with a reflection on both the motivations for it and the potential impacts on the people, structures and institutions that comprise a free society – something we can’t afford to be ignorant of in light of how analytics may have been used to attack democracy.

In fiction, superpowers are generally used for good. They’re created by those with the desire to make a positive impact. Villains point their efforts in the other direction, wreaking havoc for personal gain.

In fiction, there’s little attention given to how the fruits of human excellence might be utilised for mass manipulation. How big ideas might be funded by oligarchs, how great minds might be fertilised by demagogues or put in service to the agenda of the power-hungry. This complexity, this nuance belongs to the real world, where the stakes are geopolitical dominion. Augmenting our reach into the lives of others is a digital superpower, although a nascent one. We’re just learning how to use it, but it’s magical thinking of the Gene Roddenberry variety to imagine a future where we’ve moved past oppressive power structures. Science and technology have given us unimaginable power, so, of course, they will be abused.

More than ever before, we have the ability to shape the human experience. We have designed our built environment to influence our interactions with it. We have created structures and systems that dictate where we can and can’t go, what we can and can’t do, who can go where and who can do what.

Without signs and instruction, the built world can send signals about the freedom to enjoy it. We might feel free until we lose our homes and find spikes in a doorway or divisions on a bench, making the freedom to sleep impossible. Or we might need a bathroom but can’t afford the cup of coffee or the entrance fee to access one.

There’s a growing movement against the environment being engineered to control how people behave. Online, the human hand is less visible, but we are shepherded all the same. Election tampering is not a forged ballot or a media conspiracy. It’s about how those in power can influence the free will of citizens. The information war is being waged against us without our even realising it.

Cambridge Analytica is a modern parable of the dark side of human brilliance. We are folded into all that we make, offline and online. All advancement has the human imprint, for better or for worse. When our imperfect humanity meets technological power, we must all be on our guard. Democracy cannot be the casualty of progress.