Last June, Tesla's Vice President of Autopilot Software, Chris Lattner, was replaced by a deep learning expert, Andrej Karpathy. Just before the appointment that shook the electric vehicle world, Karpathy had worked as a research scientist at OpenAI, an artificial intelligence nonprofit backed by Elon Musk. His expertise in computer vision and autonomous driving is the cumulative result of a professional path through convolutional neural networks and artificial intelligence.

Just like Dr. Leibo, Karpathy previously worked at Google DeepMind, focusing on deep learning. Karpathy also created one of the most original and well-respected deep learning courses at Stanford, which focuses on building a neural network that can identify multiple discrete, specific items within an image. The network can also label the image with those items and report the featured subjects back to the user. For example, given an image of a white pair of shoes, it can display "white tennis shoes" to the user. The course also teaches the reverse operation, in which the system takes a description from the user, articulated in natural language, and finds the described object within the image.

Fledgling undergraduate students such as myself, who aspire to develop their own vision or angle on a field just as Andrej Karpathy has, are left wondering how he found his passion. In an interview with Andrew Ng for Ng's deep learning course on Coursera, Karpathy described how he first got into the field as a student:

“I took a class on deep learning when I was an undergraduate at the University of Toronto. One time, there was a discussion about a Restricted Boltzmann Machine being trained on digits. I really liked the way Geoff, the professor, talked about the mind of the network. There was a flavor of something magical happening when it was training on those digits. And that was my first exposure to it. When I was doing my master's degree at the University of British Columbia, I took a class on machine learning, which was another opportunity to delve deeper into this field of deep learning. I had also gotten very interested in artificial intelligence and took a lot of classes on it, but many of the classes were somewhat unsatisfying. I decided that machine learning was the direction I was interested in. It is almost like a new computing paradigm, I would say, because normally humans write the code, but in this case you just have to make the input and output specifications with lots of examples, after which the optimization would write the code. It sometimes writes the code better than we do.”
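Karpathy's remark that "the optimization would write the code" can be made concrete with a toy sketch (my own illustration, not from the interview): instead of hand-coding the rule y = 2x + 1, we hand an optimizer input/output examples and let gradient descent find the parameters.

```python
# Illustrative sketch of "specify examples, let optimization write the code".
# The rule y = 2x + 1 is never written down; gradient descent recovers it.

examples = [(x, 2 * x + 1) for x in range(-5, 6)]  # the input/output "specification"

w, b = 0.0, 0.0   # parameters the optimizer will fill in
lr = 0.01         # learning rate
for _ in range(2000):
    gw = gb = 0.0
    for x, y in examples:
        err = (w * x + b) - y
        gw += 2 * err * x / len(examples)  # gradient of mean squared error w.r.t. w
        gb += 2 * err / len(examples)      # gradient of mean squared error w.r.t. b
    w -= lr * gw
    b -= lr * gb

print(round(w, 2), round(b, 2))  # → 2.0 1.0
```

The recovered slope and intercept match the hidden rule, which is the whole point: the programmer supplied only examples, and the optimization did the rest.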

Drawing on his experience, Karpathy spoke of how computer intelligence has evolved beyond its earlier limits, when it depended on stored repositories of specialized, explicit code. For aspiring students, Karpathy advised not to abstract things away too much. He said that he had implemented his own library in JavaScript, ConvNetJS, just to test his understanding of convolutional neural networks. His philosophy is hinged on the fundamentals, as any academic pursuit should be. He implores us to always start from scratch and understand every bit of the details that make up the work of art.
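The "start from scratch" philosophy can be illustrated with a minimal sketch (my own, in Python rather than JavaScript, and not Karpathy's actual library code): implement a single sigmoid neuron's forward pass and its gradients by hand, then verify the hand-derived gradient against a numerical check, so that every bit of the detail is accounted for.

```python
# Minimal from-scratch sketch: one sigmoid neuron, hand-derived gradients,
# verified with a numerical gradient check. Illustrative only.
import math

def forward(w, b, x):
    return 1.0 / (1.0 + math.exp(-(w * x + b)))  # sigmoid neuron

def analytic_grads(w, b, x, target):
    y = forward(w, b, x)
    dloss_dy = 2 * (y - target)  # derivative of squared error w.r.t. output
    dy_dz = y * (1 - y)          # derivative of the sigmoid
    return dloss_dy * dy_dz * x, dloss_dy * dy_dz  # dL/dw, dL/db

def numeric_grad(f, p, eps=1e-6):
    return (f(p + eps) - f(p - eps)) / (2 * eps)   # central difference

w, b, x, t = 0.5, -0.3, 2.0, 1.0
loss_of_w = lambda w_: (forward(w_, b, x) - t) ** 2
gw, gb = analytic_grads(w, b, x, t)
print(abs(gw - numeric_grad(loss_of_w, w)) < 1e-6)  # → True
```

Writing and checking the derivatives yourself, rather than trusting a framework, is exactly the kind of ground-up understanding the paragraph describes.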

Copyright © The KAIST Herald. Unauthorized reproduction and redistribution prohibited.