The upcoming year will be pivotal for technology and AI governance, and for the role the open-source ecosystem plays in both. In recent years, ethical concerns surrounding the continued development of AI systems have become mainstream, and the undesired outcomes are now relatively well defined.
The question, however, is what lies ahead: how does open source advance current AI approaches, and what risks does it carry in the absence of adequate safeguards?
Who should take the necessary actions? Do the multitude of frameworks, principles, and standards truly protect us from harm, or are they merely tools for navigating compliance complexity?
Are standards and soft law sufficient, and if not, how do we transition to hard law and enforceable governance? Is global governance necessary, and can effective mechanisms actually be implemented? Where, and what, are our collective thresholds? And who is helping navigate the ethical conundrums and complex risks posed by these often opaque systems?
Lastly, how can we best guide developers and implementers in ethically and transparently testing, validating, verifying, and providing technical oversight for AI capabilities, both human-to-machine and machine-to-machine?
Key Discussion Points:
A: Present Open Source context and introduce speakers
B: The different dimensions of the unfolding landscape that leaders, decision-makers, and users should be aware of
C: How listeners, whether live or via the podcast, can leave this session with a better grasp of both the well-covered and the under-discussed aspects of technology governance, open source, and LLMs
Other potential dimensions:
• Sloganeering and "marketeering": "optimization, cost, and speed"
• Moving beyond compliance-driven governance thinking
• AI, open source, and nefarious uses: impacts on national security and warfare
• How consumer trust and confidence affect adjacent areas
Join our webinar for diverse, in-depth perspectives on the conversations that organizations, companies, and individuals need to be having.