We need to distinguish corporate spaces from their personal and public counterparts as proprietary technology becomes more deeply embedded in all aspects of life. Both public spaces and personal private spaces are intruded upon by technologies that cause them to take on qualities of corporate private spaces. This happens, for instance, when your neighbour’s Ring doorbell films you walking down a public sidewalk, when your municipality installs surveillance cameras, or when your smart TV records a conversation in your living room. In these cases, the conversations you have with family members don’t just take place within your four walls but also end up in a datacenter somewhere; you aren’t just walking down the street but also into datasets owned by private companies. Your activity in these personal and public spaces thus becomes part of corporate surveillance networks and is subject to being stored, reviewed, analysed, shared, and used to ‘nudge’ or manipulate you in ways you are not even aware of.
Each of these cases subjects our physical world to rules hidden within proprietary code, or ‘black boxes’ – often without a person’s knowledge or consent, and often without a vote on whether and how corporations may embed themselves in these other types of spaces. This reality erodes our autonomy and privacy as individuals in our homes and cities. It is also a problem for our autonomy as a society, as we increasingly lose control over our shared spaces.
At this point, it might feel reasonable to say ‘technology is bad and we don’t want to use it’. But abandoning technology altogether is not a realistic option. The notion that we must surrender our rights and values if we want to use technology is a false dichotomy. It rests on the assumption that technology is inherently closed and centralised, and inherently accumulates power and wealth. We hold on to these assumptions because most of the technology we interact with is built precisely with the goal of accumulating profit or power. When profit, power, and such assumptions form the foundation of technological development, the resulting technology and design processes are closed, positioning people as surveilled subjects or consumers with reduced autonomy.
We need to reframe our assumptions and instead demand that the technology in our cities and homes be built to adhere to democratic principles like openness, fairness, and inclusivity. It is possible to build technology in line with shared values like these, but doing so requires us to consider how foundational goals and assumptions steer technological design, development, and implementation, all of which shape the position of both individual citizens and society in relation to that technology. This is the public stack approach: founding technology upon shared values and scrutinised assumptions; upholding those values through open, participatory design processes and open-source technology; and positioning people as free and sovereign within democratic societies.
Hollandse Luchten is one example of the public stack in practice. This project in Noord-Holland arose out of a shared concern about air quality and involves local residents who build air quality sensors, install them, and gather local air quality data. People can use the data as they like: for personal insight; for shared community knowledge; to hold authorities accountable; or to advocate for political change. Hollandse Luchten’s technology is open-source and assembled by residents themselves, so they can be sure there is no hidden microphone, video camera, or tracker reporting their activities back to a central holder. Conscious choices were made about how to send data (with LoRa connectivity hosted by TheThingsNetwork, a community-based initiative) and how to host it (at the Province of Noord-Holland, with agreements regarding the data’s open governance and accessibility). Of course, Hollandse Luchten does face challenges in adhering to the public stack – certain components of the sensors were purchased and are proprietary, for example – but its attempt to do so nonetheless results in technology that places citizens in a radically better position than its closed and proprietary counterparts, like the Amazon Smart Air Quality Monitor.
Hollandse Luchten demonstrates that it is possible to make technology for homes and cities that increases the personal and societal benefits we can draw from data, while minimising the detrimental aspects of proprietary surveillance technology. Equipped with the knowledge that change is possible, we need to demand that the technology in our public and personal private spaces is open – that we have enough information to properly judge whether or not it conforms with our shared laws and values.
People often ask what they, as regular individuals, can do to make a difference in this regard. There are indeed steps people can take to protect themselves and push for change – like choosing not to fill their homes and neighbourhoods with ‘smart’ technology, or adopting privacy protection practices online – but the unfortunate reality is that people are often unaware they are interacting with technology, are powerless to avoid it, or simply do not have the time, knowledge, or energy to do so. Given these limits, people can best help protect personal and public spaces collectively: by voting, protesting, and contacting their representatives to demand more transparency and better legislation.
The majority of the burden of change lies with politicians and policymakers, who have a responsibility to protect our fundamental rights. Governments are well positioned to lead the shift towards technology that benefits society rather than erodes it. Governments can lead by example by using open-source technology themselves and by initiating participatory public processes around tech policy, implementation, and regulation. Governments can implement robust legislation that safeguards shared fundamental rights and values (like privacy and control over personal data) and put an end to surveillance capitalism. Governments can fund the development of open, fair, and inclusive technology. By extension, governments also have the responsibility to stop funding and investing in capitalist proprietary surveillance technology and instead collaborate more closely with people to ensure that public values are upheld in tech development.
We are fortunate to be able to choose our leaders in government. In future elections, let’s vote for those who are willing to uphold democratic ideals everywhere.