New technological systems do not protect human rights by default.
In July, United States President Donald Trump’s administration held an event titled “Winning the AI Race,” where it unveiled its AI Action Plan. Like the billion-dollar data-center deals announced during Trump’s trip this past May to the Persian Gulf, the plan is meant to enhance American leadership in AI. But since neither the plan nor those earlier announcements mentioned human rights, it is fair to question what it even means for the US to “win” the AI race.
Many in Washington and Silicon Valley simply assume that American technology is inherently – almost by definition – aligned with democratic values. As OpenAI CEO Sam Altman told Congress this past May, “We want to make sure democratic AI wins over authoritarian AI.” This may be a good sentiment, but new technological systems don’t protect human rights by default. Policymakers and companies must take proactive steps to ensure that AI deployment meets certain standards and conditions – as already happens in many other industries.
Recent reports from the United Nations Working Group on Business and Human Rights, the UN Human Rights Council, and the Freedom Online Coalition have called attention to the fact that governments and companies alike bear responsibility for assessing how AI systems will affect people’s rights. Existing international frameworks require all businesses to respect human rights and avoid causing or contributing to human-rights abuses through their activities. But AI companies, for the most part, have failed to acknowledge and reaffirm those responsibilities.
These calls to action reaffirm obligations already shouldered by other industries. Most large companies know that they must conduct human-rights impact assessments before procuring or deploying new systems, integrate human-rights due diligence into product design and business decisions, include contractual safeguards to prevent misuse, and provide meaningful remedies when harms occur.
The challenge with AI is not that the standards are unclear. It is that too many companies and governments are acting as if the standards do not apply. Consider Trump’s AI deals in the Gulf. If finalized, these investments could cement the region’s ambition to become a global AI hub, raising troubling questions about whether the US and its tech leaders are abandoning long-held commitments.
In the United Arab Emirates, the US has approved advanced chip transfers to G42, an Emirati AI firm, as part of a broader plan to build a massive AI campus in Abu Dhabi. In Saudi Arabia, a new state-backed company just unveiled multibillion-dollar agreements with major US firms to acquire chips and build infrastructure. And Elon Musk’s Starlink has also been cleared to operate in the Kingdom. None of these announcements mentioned protections to ensure that the technology won’t be used for surveillance or repression.
The risk is not hypothetical. The UAE is known to have used spyware against journalists and dissidents, and Saudi Arabia has engaged in all manner of transnational repression, while remaining deeply implicated in the humanitarian crisis in Yemen. New AI capabilities significantly increase governments’ power to trample basic rights, such as by synthesizing detailed information about dissidents, conducting real-time surveillance, analyzing people’s social-media posts and communications, and controlling AI models’ output.