Generative A.I. could encourage brainwashing

A.I. can be a weapon for dictatorial governments

The next great gift from the free world to authoritarians is likely to be generative AI. The viral introduction of ChatGPT, a system with uncannily human-like capabilities for writing essays, poetry, and computer code, has made the world’s dictators aware of the transformative power of generative AI to produce original, compelling material at scale.

Generative AI refers to a class of artificial intelligence algorithms that can autonomously create content, such as images, music, text, and video, that is new rather than copied from the data fed into the algorithm. Generative AI systems often use deep learning techniques, such as generative adversarial networks (GANs), to create content that appears authentic and is hard to distinguish from content created by humans. These algorithms can be used in many fields, such as art, music, fashion, design, video games, and even book writing.
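
As a rough illustration of the GAN idea mentioned above (not taken from the article; all layer sizes and the data are placeholders), the PyTorch sketch below shows the two competing networks, a generator and a discriminator, and one adversarial training step for each.

```python
# Minimal GAN sketch (illustrative only): a generator learns to map random
# noise to samples the discriminator cannot tell apart from real data.
# Layer sizes, learning rates, and the synthetic "real" data are placeholders.
import torch
import torch.nn as nn

latent_dim, data_dim = 16, 64  # arbitrary illustrative dimensions

generator = nn.Sequential(
    nn.Linear(latent_dim, 128), nn.ReLU(),
    nn.Linear(128, data_dim),
)
discriminator = nn.Sequential(
    nn.Linear(data_dim, 128), nn.ReLU(),
    nn.Linear(128, 1),  # outputs a "realness" logit
)

opt_g = torch.optim.Adam(generator.parameters(), lr=2e-4)
opt_d = torch.optim.Adam(discriminator.parameters(), lr=2e-4)
loss_fn = nn.BCEWithLogitsLoss()

real_batch = torch.randn(32, data_dim)  # stand-in for real training samples

# Discriminator step: label real samples 1, generated samples 0.
noise = torch.randn(32, latent_dim)
fake_batch = generator(noise).detach()
d_loss = loss_fn(discriminator(real_batch), torch.ones(32, 1)) + \
         loss_fn(discriminator(fake_batch), torch.zeros(32, 1))
opt_d.zero_grad(); d_loss.backward(); opt_d.step()

# Generator step: try to make the discriminator label fakes as real.
noise = torch.randn(32, latent_dim)
g_loss = loss_fn(discriminator(generator(noise)), torch.ones(32, 1))
opt_g.zero_grad(); g_loss.backward(); opt_g.step()
```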

However, the heated debate among Western industry executives about the dangers of disseminating cutting-edge generative AI tools has largely overlooked the autocracies where the consequences are most likely to be harmful.

Until now, the main worries about generative AI and autocrats have centered on how these systems could amplify propaganda. ChatGPT has already shown how generative AI can automate the spread of false information. Generative AI heralds a change in the speed, scope, and perceived legitimacy of dictatorial influence operations, especially when combined with improvements in targeted advertising and other new precision propaganda tactics.

The free world currently holds a lead in developing these systems. As the technology matures, that lead will be increasingly important in giving open societies time to understand, detect, and mitigate potential harms before autocratic states leverage the technologies for their own ends. But the free world risks squandering this advantage if these pioneering tools are easily acquired by authoritarians.

Regrettably, it is difficult to keep sophisticated AI models out of the hands of autocrats. Technically speaking, generative AI models are trivial to steal. Although they require enormous resources to build, once created they can be replicated and modified quickly and at low cost.
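
To make the "cheap to replicate" point concrete, the illustrative sketch below (not from the article; it assumes the Hugging Face transformers library and uses the small, publicly released "gpt2" model purely as a stand-in) shows how few lines it takes to download released weights, nudge them toward new text, and save a modified copy.

```python
# Illustrative sketch: once trained weights are publicly released, copying and
# adapting them takes only a few lines and modest compute. The model name and
# training text are placeholders, not anything referenced by the article.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("gpt2")
model = AutoModelForCausalLM.from_pretrained("gpt2")  # downloads released weights
model.train()

# One cheap fine-tuning step on arbitrary new text.
batch = tokenizer("Any new text the copier wants the model to imitate.",
                  return_tensors="pt")
optimizer = torch.optim.AdamW(model.parameters(), lr=5e-5)
loss = model(**batch, labels=batch["input_ids"]).loss
loss.backward()
optimizer.step()

model.save_pretrained("modified-copy")  # the adapted copy now belongs to the copier
```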

Moreover, the efforts of several companies to keep generative AI open source can easily be exploited by AI researchers in autocratic states.

Instead, companies should approach the development of generative AI with the care and security precautions required for a technology with significant potential to empower dictatorships, and they should refrain from publicly revealing the technical details of their cutting-edge models.

To build on existing policies that restrict the export of surveillance technology, democratic governments should make clear the strategic significance of generative AI and impose immediate export restrictions on cutting-edge models of this kind to unreliable partners. Only reputable organizations with sound security procedures should be eligible for federal research funding for this type of AI.

The alternative is a well-trodden route, in which tech companies support techno-authoritarianism by combining commercial incentives with naivete.

Although the power of AI can be a weapon for dictatorial governments, we should not ignore that it can also turn democratic governments into dictatorial ones. Manipulation exists at every level, from marketing to politics, so a government that declares itself democratic is not necessarily one; sometimes it is simply better at not getting caught. AI should not be a privilege of governments if we want a less authoritarian world. As AI evolves, we therefore need other kinds of AI that everyone can adopt to counter its downsides.
