Tread warily on AI
THE Paris AI Action Summit has revealed a deep schism within the international community on the roadmap for nurturing artificial intelligence. A joint statement pledging to “promote AI accessibility to reduce digital divides” and “ensure AI is open, inclusive, transparent, ethical, safe, secure and trustworthy” was signed by around 60 nations, including host France, summit co-chair India and key stakeholder China. However, the US and the UK were conspicuously missing from the list of signatories. Exuding Trump-like belligerence, US Vice President JD Vance warned global leaders and tech industry executives that “excessive regulation” could stifle the rapidly growing AI industry. He left no room for doubt that America was unhappy about Europe’s efforts to reduce AI risks, even as Britain raised concerns over national security and global governance.
The US, led by an impatient President, wants to plunge headlong into the sea of possibilities that AI offers, partly in a bid to steal China’s thunder. Trump’s enthusiasm is shared to some extent by Prime Minister Narendra Modi, who has said that AI is “writing the code for humanity in this century”. However, the PM’s appeal — that AI must be developed for the global good — may not resonate with Trump, who prefers to chart his own selfish path rather than follow an internationally accepted course.
Displaying deft diplomacy, India has provided music to the US President’s ears by saying that the current focus should be on fostering AI innovation, with regulation being a secondary consideration. However, the rampant misuse of AI makes it imperative to establish a global framework aimed at enhancing trust, safety and transparency. The cautious approach adopted by Europe makes sense. Adequate checks and balances should be in place to ensure that the AI whirlwind doesn’t overwhelm the world with deepfakes and disinformation. In due course, a call can be taken on easing regulatory norms. Opening the floodgates at this critical juncture could prove to be a recipe for disaster.