Today, AI can design machine learning systems called neural networks through a process called Neural Architecture Search (NAS). But this technique demands considerable time, processing power, and money. Even for Google, producing a single convolutional neural network – a type often used for image classification – requires 48,000 GPU hours. Now, MIT researchers have developed a NAS algorithm that automatically learns a convolutional neural network in a fraction of that time: just 200 GPU hours.
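To make the idea concrete, here is a minimal, hypothetical sketch of the simplest form of neural architecture search: random search over a small space of convolutional network configurations. The search space, the `proxy_score` stand-in for validation accuracy, and all names are illustrative assumptions, not the MIT or Google method; a real NAS system would train and evaluate each candidate network, which is where the thousands of GPU hours go.

```python
import random

# Hypothetical search space of convolutional network configurations.
SEARCH_SPACE = {
    "num_layers": [2, 4, 6],
    "filters": [16, 32, 64],
    "kernel_size": [3, 5, 7],
}

def sample_architecture(rng):
    """Draw one candidate architecture from the search space."""
    return {name: rng.choice(options) for name, options in SEARCH_SPACE.items()}

def proxy_score(arch):
    """Toy stand-in for validation accuracy: rewards a moderate capacity.
    A real NAS system would train the candidate network and measure accuracy."""
    capacity = arch["num_layers"] * arch["filters"]
    return -abs(capacity - 128)  # highest (zero) at an assumed "sweet spot"

def random_search(trials=50, seed=0):
    """Sample candidates, keep the best-scoring one."""
    rng = random.Random(seed)
    best_arch, best_score = None, float("-inf")
    for _ in range(trials):
        arch = sample_architecture(rng)
        score = proxy_score(arch)
        if score > best_score:
            best_arch, best_score = arch, score
    return best_arch, best_score

best, score = random_search()
print(best, score)
```

Even this toy loop shows why NAS is expensive: the cost is dominated by evaluating candidates, so research like MIT's focuses on making each evaluation dramatically cheaper.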
Speeding up the process by which AI designs neural networks could allow more people to use and experiment with NAS systems, which could broaden the adoption of AI. While this is certainly not a given, it could be a step toward putting artificial intelligence and machine learning in the hands of more people and companies, rather than leaving these tools in the ivory towers of the tech giants.