Governance Beyond Law: Soft Power and Norms

Laws set boundaries, but culture and norms shape how technology is lived.

When people talk about AI governance, they often think of formal law: regulations, enforcement, and compliance. But governance is broader than statutes. It also happens through soft power — the informal rules, cultural expectations, and industry practices that shape behaviour long before courts are involved.

Industry Codes of Conduct

Many technology companies publish voluntary guidelines on fairness, transparency, and accountability. These codes can set expectations and foster best practice, but they are also fragile: without external enforcement, they depend on corporate will and public image. They can inspire genuine progress — or serve as a shield against deeper scrutiny.

Civil Society and Collective Pressure

Unions, NGOs, and grassroots movements play a growing role in AI governance. From workers demanding limits on surveillance tools to campaigners exposing discrimination in algorithms, civil society pressure forces accountability where the law lags behind. Boycotts, petitions, and public debate create reputational costs that companies cannot ignore.

Cultural Norms and Trust

Attitudes toward AI differ across societies. In some contexts, people trust automated systems as efficient and impartial. In others, scepticism runs deeper, especially where institutions already struggle with legitimacy. These cultural differences influence not only how AI is adopted, but how strictly it is expected to be governed.

Beyond the Letter of the Law

Governance does not end where the law begins. It is a layered system in which formal regulation, industry standards, and cultural norms interact. The strength of AI governance may depend less on the precision of any single law, and more on whether societies can cultivate a culture of accountability that extends across borders, industries, and communities.

Why It Matters

To focus on law alone is to miss the wider ecology of governance. AI is shaped not only by regulators and legislators, but by designers, workers, activists, and users. In recognising the role of soft power and norms, we acknowledge that governing AI is not just a legal project — it is a cultural one.