New technologies face numerous tests that go beyond whether they work. Will consumers use them for daily or near-daily tasks? Are there pragmatic business models associated with them? How will they scale?
And how will regulators and lawmakers respond?
Artificial intelligence — arguably one of the most important digital technologies going into 2019 — could soon face export regulations set by the U.S. Commerce Department. The idea of imposing such restrictions is nothing new: worries about how China and other countries might use the technology for military purposes, along with ongoing trade disputes and concerns about U.S. technology being stolen, have helped fuel the effort. In November, the federal agency included AI on a list of technologies that could be subject to export controls.
Now, according to the latest reports, the Commerce Department has set Jan. 10 as the deadline for public comment on the proposal. Tech companies are weighing in, according to The New York Times, and are encouraging regulators to take a light hand with any AI export rules. Their argument has three main points: restrictions could harm companies in the United States and help international competitors; they could stifle technological improvement; and they may not make much of a difference.
AI in China
After all, China is emerging as a hotspot for AI development, with experts previously saying that the country is only one step behind the U.S. in that field. What’s more, China has ambitions for using AI to predict crimes, lend money, track people in the country, help with traffic snarls and censor the internet, among other things. The country has already poured billions of dollars into AI research, and is gearing up to launch a multibillion-dollar initiative to bankroll startups and academic work focused on growing artificial intelligence.
Meanwhile, in the payments and commerce world, financial institutions in the United States and elsewhere are making moves toward using AI for fraud prevention and anti-money laundering efforts. As PYMNTS research has demonstrated, the adoption of true artificial intelligence — as opposed to machine learning — has been slow, though interest in the technology and its potential for greater use is promising.
According to the PYMNTS research report “The AI Gap: Perception versus Reality in Payments and Banking Services,” true AI systems are used by only 5.5 percent of the financial institutions interviewed for the report. Far more popular — besides data mining — were less sophisticated technologies, including Business Rules Management Systems (BRMS), which enable companies to easily define, deploy, monitor and maintain new regulations, procedures, policies, market opportunities and workflows.
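To make the BRMS idea concrete, here is a minimal sketch of how such a system differs from AI: rules are explicit, human-defined condition/action pairs that staff can add, change or retire without any model training. All rule names and thresholds below are hypothetical, invented purely for illustration; real BRMS products are far more elaborate.

```python
# Minimal sketch of a business-rules engine in the BRMS style: each rule
# is an explicit, human-authored condition/action pair. Rule names and
# thresholds here are hypothetical, for illustration only.
from dataclasses import dataclass, field
from typing import Callable, Dict, List

@dataclass
class Rule:
    name: str
    condition: Callable[[Dict], bool]  # predicate over a transaction record
    action: str                        # what to do when the condition fires

@dataclass
class RuleEngine:
    rules: List[Rule] = field(default_factory=list)

    def add_rule(self, rule: Rule) -> None:
        # Deploying a new policy is just appending a rule -- no retraining.
        self.rules.append(rule)

    def evaluate(self, transaction: Dict) -> List[str]:
        # Return the action of every rule whose condition matches.
        return [r.action for r in self.rules if r.condition(transaction)]

engine = RuleEngine()
engine.add_rule(Rule("large_amount",
                     lambda t: t["amount"] > 10_000,
                     "flag_for_review"))
engine.add_rule(Rule("high_risk_country",
                     lambda t: t["country"] in {"XX", "YY"},
                     "block"))

actions = engine.evaluate({"amount": 25_000, "country": "US"})
print(actions)  # ['flag_for_review']
```

The appeal for financial institutions is transparency: every decision traces back to a named rule, which is easier to audit than a learned model — and part of why adoption of true AI has lagged.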
AI Dual Use
The potential Commerce Department export rules could reportedly include not only AI, but also computer vision, speech recognition and natural language understanding technologies — and could apply not only to China, but also Russia and Iran. Export controls could range from “new licensing rules for AI exports to outright bans,” according to The New York Times.
The report noted that enforcement could be challenging, given that AI has a dual use — that is, it can be deployed in both commercial and security-military settings. “Trying to draw a line between what is military and what is commercial is exceedingly difficult,” said R. David Edelman, a technology policy researcher at the Massachusetts Institute of Technology. “It may be impossible.”
That would hardly be the only challenge of stricter AI export controls.
Can You Really Bottle up AI?
Despite persistent popular opinion, technological development — like most of modern scientific discovery — is not some “a-ha” moment experienced by a single individual, as tales about the likes of Benjamin Franklin, Thomas Edison, Isaac Newton and Albert Einstein would have people believe. Rather, it is usually a case of small advances building upon other half-steps of progress, with breakthroughs owed to a larger collective of researchers rather than any one person.
That especially holds true for AI, as the Times report noted, given that “research on the technology is often done collaboratively by scientists and engineers all over the world. Companies rarely hold onto the details of their AI work, as if it were a secret recipe. Instead, they share what they learn, in hopes that other researchers can build on it.”
In short, AI would seem to bear little resemblance to the case of nuclear secrets during the Manhattan Project of World War II (to use one of the most dramatic examples of the flow of technology and scientific information). During that time, papers about research related to the development of the Allied atomic bomb were kept out of scientific journals via governmental control (that absence, in fact, alerted the Soviets that the U.K. and the U.S. were indeed trying to make an atomic bomb, but that’s a long story for another day).
When it comes down to it, putting more controls on the export of AI might end up serving as an example of using a paper bag to protect against a rainstorm. “A lot of the computer code for AI is published on sites like Arxiv.org, a repository of academic and corporate research,” the report said. As a result, many policy experts believe that if the United States restricts the export of AI products and services, it will have little effect on the progress of AI in China and other countries.
For now, the march toward true AI continues in payments and commerce (where self-driving cars and AI appear headed for a long-term marriage). Regulators might try to at least slow down the transfer of AI software, but it is difficult to imagine how they will stop the global flow of information, at least the information related to commercial AI deployments. If anything, as reports have noted, more Western AI development might end up taking place in Europe if U.S. controls prove overly strict.