[Image: a gridded collage of thousands of photographs. Credit: Andrej Karpathy]
Students from Fast.ai, a small organization that runs free machine-learning courses online, just created an AI algorithm that outperforms code from Google’s researchers, according to an important benchmark.
Fast.ai’s success is important because it sometimes seems as if only those with huge resources can do advanced AI research.
Fast.ai consists of part-time students keen to try their hand at machine learning—and perhaps transition into a career in data science. It rents access to computers in Amazon’s cloud.
But Fast.ai’s team built an algorithm that beats Google’s code, as measured using a benchmark called DAWNBench, from researchers at Stanford. This benchmark uses a common image classification task to track the speed of a deep-learning algorithm per dollar of compute power.
Google’s researchers topped the previous rankings, in a category for training on several machines, using a custom-built collection of the company’s own chips designed specifically for machine learning. The Fast.ai team was able to produce something even faster, on roughly equivalent hardware.
“State-of-the-art results are not the exclusive domain of big companies,” says Jeremy Howard, one of Fast.ai’s founders and a prominent AI entrepreneur. Howard and his cofounder, Rachel Thomas, created Fast.ai to make AI more accessible and less exclusive.
Howard’s team was able to compete with the likes of Google by doing a lot of simple things, which are detailed in a blog post. These include making sure that the images fed to its training algorithm were cropped correctly: “These are the obvious, dumb things that many researchers wouldn’t even think to do,” Howard says.
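As a rough illustration of the kind of preprocessing Howard is describing, the sketch below shows how image cropping is commonly handled for training and validation in PyTorch’s torchvision. It is not Fast.ai’s actual code; the crop sizes and normalization statistics are standard ImageNet defaults assumed for the example.

```python
# Illustrative only: a common cropping setup for ImageNet-style training
# in torchvision. Sizes and statistics are standard defaults, not
# Fast.ai's actual configuration.
from torchvision import transforms

# Training: random resized crops vary the framing as a form of augmentation.
train_tfms = transforms.Compose([
    transforms.RandomResizedCrop(224),
    transforms.RandomHorizontalFlip(),
    transforms.ToTensor(),
    transforms.Normalize(mean=[0.485, 0.456, 0.406],
                         std=[0.229, 0.224, 0.225]),
])

# Validation: resize, then center-crop, so the labeled subject isn't cut off.
val_tfms = transforms.Compose([
    transforms.Resize(256),
    transforms.CenterCrop(224),
    transforms.ToTensor(),
    transforms.Normalize(mean=[0.485, 0.456, 0.406],
                         std=[0.229, 0.224, 0.225]),
])
```

Getting details like this right matters because a careless crop can remove the very object a label refers to, quietly degrading accuracy.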
The code needed to run the learning algorithm on several machines was developed by a collaborator at the Pentagon’s new Defense Innovation Unit, created recently to help the military work with AI and machine learning.
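For context, wiring a training job across several machines in a framework like PyTorch typically looks something like the minimal sketch below. It is a generic example, not the collaborator’s code; the model choice, backend, and launcher are assumptions.

```python
# Minimal, generic sketch of multi-machine training with PyTorch's
# DistributedDataParallel. Not the code described in the article.
import os

import torch
import torch.distributed as dist
import torchvision
from torch.nn.parallel import DistributedDataParallel as DDP


def main():
    # Assumes the standard environment variables (RANK, WORLD_SIZE,
    # MASTER_ADDR, MASTER_PORT, LOCAL_RANK), e.g. as set by `torchrun`
    # on each participating machine.
    dist.init_process_group(backend="nccl")
    local_rank = int(os.environ["LOCAL_RANK"])
    torch.cuda.set_device(local_rank)

    model = torchvision.models.resnet50().cuda(local_rank)
    model = DDP(model, device_ids=[local_rank])

    # ... build a DataLoader with a DistributedSampler and run the usual
    # training loop; DDP averages gradients across machines automatically ...

    dist.destroy_process_group()


if __name__ == "__main__":
    main()
```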
Matei Zaharia, a professor at Stanford University and one of the creators of DAWNBench, says the Fast.ai work is impressive, but notes that for many AI tasks, large amounts of data and significant compute resources are still key.
The Fast.ai algorithm was trained on the ImageNet database in 18 minutes using 16 Amazon Web Services instances, at a total compute cost of around $40. Howard claims this is about 40 percent better than Google’s effort, although he admits the comparison is tricky because the hardware is different.
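A quick back-of-the-envelope check of those figures (the implied hourly rate below is derived from the quoted numbers, not a published Amazon price):

```python
# Sanity-check arithmetic for the quoted figures: 16 instances, 18 minutes,
# roughly $40 of compute. The implied rate is derived, not a list price.
instances = 16
minutes = 18
total_cost_usd = 40.0

instance_hours = instances * minutes / 60        # 4.8 instance-hours
implied_rate = total_cost_usd / instance_hours   # roughly $8.3 per instance-hour

print(f"{instance_hours:.1f} instance-hours, ~${implied_rate:.2f} per instance-hour")
```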
Jack Clark, director of communications and policy at OpenAI, a nonprofit, says Fast.ai has produced valuable work in other areas, such as language understanding. “Things like this benefit everyone because they increase the basic familiarity of people with AI technology,” Clark says.