In conversation with Kent Walker: President of Global Affairs at Google

AI risks and potential


The ANU School of Cybernetics recently hosted an event featuring Kent Walker, President of Global Affairs at Google. The event brought together the ANU School of Regulation and Global Governance, the ANU Tech Policy Design Centre, the ANU College of Law, and the ANU College of Computing, Engineering and Cybernetics in a collaborative and insightful conversation on AI risks and potential.

Kent Walker’s presentation was followed by a fireside chat hosted by Katherine Daniell, with responses and questions from three distinguished ANU panellists with expertise spanning AI systems, policy, technology, regulation, design, and implementation. The session concluded with a general discussion and Q&A with the audience.

Kent Walker. Photo Credit: Sherice Kazzi

The panel discussion addressed key issues surrounding the governance and regulation of emerging technologies, with a particular focus on artificial intelligence (AI). The panellists, each bringing expertise from tech policy, global governance, and cybernetics, offered a range of insights into managing this complex landscape.

A central theme was the need to achieve the right balance in technology regulation.

Johanna Weaver used historical examples, such as the “red flag law” (which required a person to walk in front of a car carrying a red flag), to illustrate how regulations that might seem sensible at the time can quickly become outdated or ineffective. This serves as a cautionary tale about regulating with humility. Johanna also highlighted Australia’s role in shaping regional technology governance, given that many countries in the Asia-Pacific watch Australia’s regulatory approach closely.

The debate on open source and AI was another major topic. Johanna posed a question to Kent Walker about his views on this issue, noting the contrasting perspectives: one advocating for open access to democratise these tools, and the other concerned about misuse by malicious actors. Walker acknowledged the valid concerns from both sides, explaining Google’s approach of selectively open-sourcing models like AlphaFold while ensuring appropriate safeguards are in place.

Panel discussion. Photo Credit: Sherice Kazzi

Kate Henne explored regulatory challenges beyond AI, pointing to the broader digital infrastructure that supports these technologies. She stressed the need to consider “networked relationships” in governance, involving diverse stakeholders including government, civil society, and the private sector. Henne also highlighted the importance of integrating principles such as procedural justice and equity, which may be overlooked in risk-based regulatory frameworks.

Andrew Meares contributed perspectives from the field of cybernetics, emphasising the role of imagination and infrastructure in shaping the future. He shared insights from his research on Australia’s early digital innovations, like the Overland Telegraph Line, and discussed how these historical examples can inform our approach to emerging technologies. Meares also raised questions about the skills and environmental considerations necessary for supporting sustainable digital futures.

Kent Walker emphasised the need for skills development and AI’s potential to tackle environmental challenges, such as improving energy efficiency in data centres. The conversation highlighted the complexities of AI regulation and the necessity for innovative, sustainable solutions.

Throughout the discussion, the panellists explored the inherent tensions and trade-offs in governing technology within the Australian context. They emphasised the need for a nuanced, collaborative approach that balances efficiency, equity, security, and benefits for both people and the environment. The conversation highlighted the importance of continuous learning, adaptability, and questioning existing assumptions as Australia navigates the rapidly evolving technological landscape.

Audience. Photo Credit: Sherice Kazzi

During the Q&A sessions, audience members posed several questions about how tech leaders are managing the complexities of international alignment and divergent standards. Questions and responses delved into the ethical implications of emerging technologies, including how to balance innovation with privacy concerns, and touched on alternative models of regulation. Additionally, students sought guidance on addressing the digital divide, specifically how to ensure equitable access to technology in under-resourced communities and in relation to gender. Finally, there was significant interest in career advice, with students asking about pathways to contribute effectively to tech policy.

Overall, the conversation offered a compelling examination of the challenges and opportunities in managing transformative technologies like AI in Australia.

Watch the video here

Author’s Note: This blog post was crafted from a live event transcription, with key themes and summaries shaped through automated insights. The author edited the final version to ensure both the accuracy of the content and the atmosphere of the discussion in the room were authentically captured.

