Japanese Experts Envision AI Turning on Us if We Don’t Act Now

PopTika / shutterstock.com

In a rare case of cooperation, Nippon Telegraph and Telephone (NTT), Japan’s largest telecom company, and the influential newspaper Yomiuri Shimbun put together a proposal on artificial intelligence (AI). Published as a stark warning, it argues that if nothing is done to limit AI and its implications, then “in the worst-case scenario, democracy and social order could collapse, resulting in wars.”

Their solution? Laws keeping elections and national security on systems that are walled off from AI. In support of their position, they argued that AI has already begun to degrade human dignity. NTT and Yomiuri cited research conducted by their executives through a study group led by researchers at Keio University in Tokyo. Per the Wall Street Journal’s coverage, these two companies are among the most influential voices on policy in Japan.

Meanwhile, the European Union has been taking steps to encourage the safe testing of AI systems before deployment. Legislation banning emotion-recognition systems in schools and workplaces is one of its biggest moves toward keeping things safe.

Here in the US, the House of Representatives has taken the unusual step of banning Microsoft’s Copilot AI. While it’s not uncommon for governments to impose such restrictions, they usually target specific products, not sweeping categories, and rarely Microsoft-based applications.

As it is, many are wondering if we have already allowed AI to go too far. Clips of new AI-powered robots that can run, jump, chase, or shoot have been released. While these capabilities have so far been demonstrated individually, it’s not outside the realm of possibility that the US federal government, or another developed nation, is building robots that can do it all.

With ample research showing how rapidly AI is advancing and adapting, a lot of possibilities remain open, and many of them are honestly quite horrific if left unchecked.