Deep learning may need a new programming language that is more flexible and easier to work with than Python, Facebook AI Research director Yann LeCun said today. It is not yet clear whether such a language is necessary, but the possibility runs against deeply entrenched preferences among researchers and engineers, he said.
LeCun has been working with neural networks since the 1980s.
"There are several projects on Google, Facebook and other places to design a compiled language that can be effective for deep learning, but it's not at all clear that the community will follow because users only want to use Python LeCun said in a phone call with VentureBeat.
"The question is now, is this a valid approach?"
According to GitHub's recent Octoverse report, Python is currently the most popular language among developers working on machine learning projects. The language forms the basis of Facebook's PyTorch and Google's TensorFlow frameworks.
LeCun presented a paper exploring the latest trends and making recommendations for makers of next-generation computer chips at the IEEE International Solid-State Circuits Conference (ISSCC), held today in San Francisco.
The first part of the paper is devoted to lessons LeCun took from Bell Labs, including his observation that the imaginations of AI researchers and computer scientists tend to be bound by the hardware and software tools available to them.
Artificial intelligence is more than 50 years old, but its current rise is closely related to the growth of computing power provided by computer chips and other hardware.
A virtuous cycle in which better hardware enables better algorithms and better performance, which in turn draws more people to build better hardware, is only a few years old, said LeCun, who worked at Bell Labs in the 1980s and created convolutional neural networks (ConvNets, or CNNs) used to read ZIP codes on postal envelopes and bank checks.
In the early 2000s, after leaving Bell Labs and joining New York University, LeCun worked with other leading figures in the field, such as Yoshua Bengio and Geoffrey Hinton, on research that revived interest in neural networks and grew the popularity of deep learning.
In recent years, hardware advances such as field-programmable gate arrays (FPGAs), Google's tensor processing units (TPUs), and graphics processing units (GPUs) have played a major role in the industry's growth. Facebook is also reportedly working on its own semiconductor.
"The type of material available has a great influence on the kind of research people do. The direction of AI in the next ten years will therefore be greatly influenced by the type of material available, "he said. "It's very moving for computer scientists, because we like to think, in the abstract, that we are not bound by the limitation of our material, but by us."
LeCun highlighted a number of AI trends that hardware makers should consider in the years ahead and made recommendations about the kinds of architecture needed in the near future, advising chip designers to account for the growing size of deep learning systems.
He also spoke of the need for hardware designed specifically for deep learning, and for hardware that can handle a batch of a single sample, rather than requiring multiple training samples to be batched together to run a neural network efficiently, as is the norm today.
"If you run a single image, you will not be able to exploit all the available calculations in a GPU. Basically, you're going to waste resources, so the lot requires you to think of some ways to train neural networks, "he said.
He also recommended dynamic networks, and hardware that can adjust to use only the neurons needed for the task at hand.
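A dynamic network decides at run time how much of itself to execute. The sketch below shows one common illustration of the idea, an early-exit network; this is a hypothetical example, not a design from LeCun's paper. A cheap layer runs first, and an expensive branch runs only when the cheap one is unsure.

```python
import torch
import torch.nn as nn

class DynamicNet(nn.Module):
    """Toy 'dynamic network': an early-exit classifier that runs its
    expensive branch only when the cheap branch is not confident."""

    def __init__(self, threshold: float = 0.9):
        super().__init__()
        self.cheap = nn.Linear(128, 10)          # always runs
        self.expensive = nn.Sequential(          # runs on demand
            nn.Linear(128, 1024), nn.ReLU(), nn.Linear(1024, 10)
        )
        self.threshold = threshold

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        logits = self.cheap(x)
        confidence = logits.softmax(dim=-1).max()
        if confidence >= self.threshold:  # confident: skip the big branch
            return logits
        return self.expensive(x)          # unsure: spend more compute

net = DynamicNet()
print(net(torch.randn(1, 128)).shape)  # torch.Size([1, 10])
```

Data-dependent control flow like this is awkward on batch-oriented hardware, since different samples in a batch may take different paths, which ties back to LeCun's single-sample point above.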
In the paper, LeCun reiterated his belief that self-supervised learning will play a major role in advancing state-of-the-art AI.
"If self-directed learning allows machines to learn huge amounts of basic knowledge about how the world works through observation, one can hypothesize that some form of common sense might emerge." , writes LeCun in his article.
LeCun believes that future deep learning systems will be trained largely with self-supervised learning, and that new high-performance hardware will be needed to support it.
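The defining property of self-supervised learning is that the training signal comes from the data itself rather than from human labels. As a minimal sketch of one classic pretext task, a denoising autoencoder (my example; the paper does not prescribe a specific objective): the network is shown corrupted inputs and trained to reconstruct the originals, so no annotation is needed.

```python
import torch
import torch.nn as nn

# Denoising autoencoder: the target is the clean input itself,
# so no human labels are required -- the essence of self-supervision.
encoder = nn.Sequential(nn.Linear(784, 64), nn.ReLU())
decoder = nn.Linear(64, 784)
opt = torch.optim.Adam(
    list(encoder.parameters()) + list(decoder.parameters()), lr=1e-3
)
loss_fn = nn.MSELoss()

for step in range(100):
    x = torch.rand(32, 784)                       # stand-in for unlabeled images
    mask = (torch.rand_like(x) > 0.3).float()     # drop ~30% of pixels
    recon = decoder(encoder(x * mask))            # reconstruct from corruption
    loss = loss_fn(recon, x)                      # compare against clean input
    opt.zero_grad()
    loss.backward()
    opt.step()
```

Scaled up to video or other observational data, objectives of this kind are what LeCun argues could let machines accumulate background knowledge about the world.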
Last month, LeCun spoke with VentureBeat about the importance of self-supervised learning as part of a story on AI predictions for 2019. Hardware that can handle self-supervised learning will be important for Facebook, as well as for autonomous driving, robotics, and many other forms of technology.