Google is using AI to design AI processors much faster than humans can


To one extent or another, artificial intelligence is practically everywhere these days, from games to image upscaling to smartphone "personal assistants." More than ever, researchers are pouring a ton of time, money, and effort into AI development. At Google, AI algorithms are even being used to design AI chips.

This is not a complete silicon design that Google is handing over to AI, but a subset of chip design known as placement optimization, a time-consuming task for humans. As explained by IEEE Spectrum (via LinusTechTips), it involves placing blocks of logic and memory (or clusters of those blocks) in strategic spots to make the most of the available real estate, for both performance and power efficiency. The toy example below gives a feel for why positioning matters.
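One common proxy used in placement work is half-perimeter wirelength (HPWL): the further apart connected blocks end up, the longer the wires between them, which hurts both timing and power. The block names and coordinates below are made up purely for illustration, and Google's actual cost model (described in the paper) is far more involved; this is just a minimal sketch of the idea.

```python
def hpwl(placement, nets):
    """Sum of half-perimeter bounding boxes over all nets.

    placement: dict mapping block name -> (x, y) grid coordinates
    nets: list of tuples of block names that must be wired together
    """
    total = 0
    for net in nets:
        xs = [placement[b][0] for b in net]
        ys = [placement[b][1] for b in net]
        total += (max(xs) - min(xs)) + (max(ys) - min(ys))
    return total

# Two hypothetical placements of the same four blocks on a 10x10 grid.
nets = [("alu", "sram"), ("alu", "io"), ("sram", "ctrl")]
spread_out = {"alu": (0, 0), "sram": (9, 9), "io": (0, 9), "ctrl": (9, 0)}
clustered  = {"alu": (4, 4), "sram": (5, 4), "io": (4, 5), "ctrl": (5, 5)}

print(hpwl(spread_out, nets))  # 36 -- long wires, worse timing and power
print(hpwl(clustered, nets))   # 3  -- much shorter wires
```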

It might take a team of engineers several weeks to map out the ideal placement because it's a complex task with a ton of variables. In stark contrast, Google's neural network can produce a better design for a Tensor Processing Unit (TPU) in less than 24 hours. The TPU is similar in concept to the Tensor cores Nvidia uses in its Turing-based GeForce RTX graphics cards, just with different goals in mind.

That's interesting in and of itself, but equally so is the type of AI Google is using. Rather than relying on a model trained on a large set of labeled examples, Google is using a "reinforcement learning" (RL) system. The short explanation is that RL models learn by doing.

There is a reward system involved, which steers the RL model in the right direction. In this case, the reward is a combination of power reduction, performance improvements, and area reduction. I'm simplifying a bit, but basically, the more placements Google's AI produces, the better it gets at the task at hand (designing AI chips). A rough sketch of what such a reward could look like is below.
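The sketch below only illustrates the reward idea described above: the agent places blocks, and the finished layout gets a single scalar score that trades off power, performance, and area. The weights, proxy metrics, and the `policy.update` call are invented for illustration; the reward in Google's paper is built from more detailed proxies such as wirelength and congestion.

```python
def placement_reward(est_power_mw, est_wirelength_um, used_area_mm2,
                     w_power=1.0, w_perf=1.0, w_area=1.0):
    """Return a scalar reward for a finished placement.

    Lower power, shorter wires (a stand-in for better performance), and
    smaller area all push the reward up. Weights are illustrative only.
    """
    cost = (w_power * est_power_mw
            + w_perf * est_wirelength_um
            + w_area * used_area_mm2)
    return -cost  # RL maximizes reward, so negate the cost

# A training loop would then nudge the policy toward higher-scoring layouts,
# roughly like this (hypothetical API):
# reward = placement_reward(power, wirelength, area)
# policy.update(trajectory, reward)  # e.g. a policy-gradient step
```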

"We believe that it is AI itself that will provide the means to shorten the chip design cycle, creating a symbiotic relationship between hardware and AI, with each fueling advances in the other," Google's researchers explain. If this works out for Google, it seems inevitable AMD, Intel and Nvidia will eventually try the same approach, too.

You can check out the technical details in a paper posted to arXiv.

Thanks, LinusTechTips.
