Prepare for Artificial Intelligence to Produce Less Wizardry

Training modern neural networks has become so expensive that some companies are choosing not to use AI methods at all.

A new research paper by Neil Thompson et al. argues that it is, or will soon be, infeasible to keep increasing computational power at the rate needed to advance machine learning. This highlights a looming problem for AI and its users, since much of the field's progress has come from throwing ever more computing resources at its problems.

For example, in 2012, researchers at the University of Toronto created a breakthrough image-recognition algorithm using two GPUs over five days. In 2019, researchers from Google and Carnegie Mellon used about 1,000 special AI chips for six days to develop a more modern image-recognition algorithm. Another example is a translation algorithm from Google that required 12,000 special AI chips running for seven days. The cost? One estimate put the bill for renting that much computational power at roughly $3 million.
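To get a feel for the scale of that trend, here is a minimal back-of-envelope sketch in Python using only the figures quoted above. Chip-days are a crude proxy for compute (it ignores that a 2019 AI accelerator is far faster than a 2012 GPU), and the per-chip-hour rental rate is an assumption implied by the $3 million estimate, not a figure from the article:

```python
# Back-of-envelope arithmetic from the figures quoted above.
# Chip-days are a crude compute proxy; per-chip speedups are ignored.

toronto_2012 = 2 * 5           # 2 GPUs for 5 days = 10 chip-days
google_cmu_2019 = 1_000 * 6    # ~1,000 AI chips for 6 days = 6,000 chip-days
translation_2019 = 12_000 * 7  # 12,000 AI chips for 7 days = 84,000 chip-days

growth = google_cmu_2019 / toronto_2012
print(f"2012 -> 2019 image recognition: {growth:,.0f}x more chip-days")

# Implied rental rate behind the ~$3M estimate (an assumption, not a quote):
chip_hours = translation_2019 * 24
rate = 3_000_000 / chip_hours
print(f"Translation job: {chip_hours:,} chip-hours "
      f"=> implied ~${rate:.2f} per chip-hour")
```

Even this crude measure shows a roughly 600x jump in raw chip-days between the two image-recognition efforts; counting per-chip performance gains would make the real compute gap larger still.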

Source: Prepare for Artificial Intelligence to Produce Less Wizardry | WIRED

Photo by Eric Krull on Unsplash
