Mr. Nguyen Anh Tuan speaks at the Riga Conference 2020

Oct 14, 2019 | News

The Boston Global Forum, of which I’m the co-founder and CEO, sees a need for a new social contract, one that is suited to a world of artificial intelligence, big data, and high-speed computation and that will protect the rights and interests of citizens individually and society generally.

Social contract theory has a long history, predating even the writings of Thomas Hobbes and John Locke.

A central question of any social contract is what rules and power arrangements a rational person would be willing to accept without knowing in advance their position in society – whether they would be among the many people subject to the coercive force of power or one of the few exercising that power.

The idea of a social contract informed the constitutions of the world’s democracies. The principle of the rule of law; the idea that the power of government should be divided between branches of government in order to protect against abuses of power; the provision of individual rights, such as freedom of expression and the right to due process if charged with a crime – all of these developments flowed from the idea of a social contract.

Over time, the social contract placed positive obligations on government – actions that government had to take in order to promote the welfare of people and society.

The United States in the early 19th century pioneered universal public education – government’s obligation to provide children with free schooling. Europe later took the lead in providing universal health care. Income security was also part of the social contract in the form of policies such as the minimum wage, unemployment compensation, and social security for retirees.

These aspects of the social contract emerged in response to the demands of the Industrial Age. We now need to update the social contract to fit the conditions – including the threats and opportunities – of the Digital Age.

The Digital Age holds the promise of empowering citizens, but it could also empower governments and private entities in ways harmful to the interests of citizens and society generally. Examples of the threat are already evident: in China, the use of facial recognition software as a means of political control, and, on platforms such as Facebook and Google, the use of personal data to influence our buying habits.

A new social contract is needed to blunt such developments and encourage beneficial ones.

The Boston Global Forum has developed a set of principles and guidelines that could inform the creation of a social contract for the Digital Age. We are calling it the AI World Society Social Contract 2020.

We see five centers of power and responsibility that need to be taken into account in the social contract: government, citizens, firms, civil society organizations, and AI assistants. In the US, where government is divided into the Executive, Legislative, and Judicial branches, we foresee the emergence of seven centers of power.

And we see a need for a social contract that places a priority on transparency, privacy, and the empowerment of citizens as opposed to government.

In terms of government, we believe: Governments should create laws that promote transparency and accountability in data usage, both by government and private parties. Governments should facilitate and require independent audits of automated decision systems to ensure their compliance with laws and standards. Governments should create laws and monitoring systems to ensure that AI systems and assistants are serving the public interest. Governments should create systems that enable citizens and civil society organizations to have greater control over their personal data and its use.

The social contract would also serve to enhance the freedom, dignity, and aspirations of citizens. All citizens should have access to a device that connects them to the digital world. Citizens should be protected in their rights, including the right to privacy, and be protected from cyberattack. Citizens should, to the degree possible, have ownership of their personal data. Citizens should, through digital systems, have a stronger voice in the political decisions that affect them. Citizens should have access to education pertaining to the use and impact of AI.

The social contract would also involve obligations on, and protections for, business firms. They would have rights and responsibilities and would be accountable for their actions through audits and other mechanisms. They would be subject to limits on data ownership, and face penalties for violating those limits.

The Boston Global Forum also sees a need in the social contract to address the role of civil society organizations. They would have rights and responsibilities, and would have a defined role in monitoring government and firms for compliance with their obligations. They also would be responsible in their own work for adhering to norms, standards, and laws, and face penalties for violating them.

Finally, we see a need in the social contract to address the status of robots, automated systems, and other AI assistants. They are a source of power in their own right and need to be regulated. AI assistants should be created in compliance with preset standards, norms, and laws, and in ways that allow a determination of whether those requirements have been met. AI assistants should include ones that are specifically designed to monitor and control the use of AI so that it serves the interests of citizens and of society more broadly.

Over the course of the next year, 2020, the Boston Global Forum will be working to draft the AI World Society Social Contract 2020, consistent with what I’ve just outlined and responsive to the ideas we receive from others.

A fundamental assumption of the social contract is that the five centers of power – government, citizens, business firms, civil society organizations, and AI assistants – are interconnected, and each needs to check and balance the power of the others.

We welcome your participation in this effort. We believe that there is a compelling need for a social contract suited to the conditions of the digital age. We recognize that other organizations are engaged in related efforts, even if they might use a different term – such as an ethical code for the digital age – to describe what they’re doing.

The Digital Age is still in an early phase, and the policies and understandings that are established at this point will influence later choices. If we fail at this time to develop a social contract, it will be much harder to do so later.

Thank you for listening and, again, we invite your ideas on the shape of a social contract suited to the requirements of the Digital Age.