The 2012 source code for AlexNet, the precursor to modern AI, is now on GitHub thanks to Google and the Computer History Museum


AI is one of the biggest and most all-consuming zeitgeists I've ever seen in technology. I can't even search the internet without being served several ads for potential AI products, including the one that's still begging for permission to run on my devices. AI may be everywhere we look in 2025, but the kind of neural networks now associated with it are a bit older. This kind of AI was being dabbled with as far back as the 1950s, though it wasn't until 2012 that the current generation of machine learning kicked off with AlexNet, an image-recognition network whose code has just been released as open source by Google and the Computer History Museum.

We've seen many different ideas of AI over the years, but generally the term refers to computers or machines with self-learning capabilities. While the concept has been explored by science-fiction writers since the 1800s, it's far from being fully realised. Today most of what we call AI refers to language models and machine learning, as opposed to unique individual thought or reasoning by a machine. These deep learning techniques essentially feed computers large sets of data to train them on specific tasks.

The idea of deep learning isn't new either. In the 1950s, researchers like Frank Rosenblatt at Cornell had already created a simplified machine learning neural network built on similar foundational ideas to those we use today. Unfortunately the hardware hadn't quite caught up to the idea, and the approach was largely rejected. It wasn't until the 1980s that machine learning really came up once again.

In 1986, Geoffrey Hinton, David Rumelhart, and Ronald J. Williams published a paper on backpropagation, an algorithm that adjusts the weights of a neural network based on the error, or cost, of its output. They weren't the first to raise the idea, but rather the first to popularise it. Backpropagation had been proposed by several researchers, including Frank Rosenblatt, as early as the 1960s, but it couldn't really be implemented at the time. Many also describe it as a machine learning application of the chain rule, for which the earliest written attribution goes to Gottfried Wilhelm Leibniz in 1676.
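The chain-rule idea behind backpropagation can be sketched in a few lines of Python. This is a toy illustration, not the 1986 paper's implementation: a single sigmoid neuron learns to map an input toward a target by repeatedly following the gradient of a squared-error cost, with all values chosen arbitrarily for the example.

```python
import math

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

# Toy backpropagation: one sigmoid neuron, squared-error cost.
x, target = 1.5, 0.2   # input and desired output (hypothetical values)
w, b = 0.8, 0.1        # initial weight and bias
lr = 0.5               # learning rate

for step in range(200):
    # Forward pass: compute the neuron's output.
    z = w * x + b
    y = sigmoid(z)
    # Backward pass: chain rule from cost back to the weights.
    error = y - target           # d(cost)/dy for cost = 0.5 * (y - target)^2
    dz = error * y * (1.0 - y)   # d(cost)/dz via the sigmoid's derivative
    w -= lr * dz * x             # d(cost)/dw = dz * x
    b -= lr * dz                 # d(cost)/db = dz

print(round(sigmoid(w * x + b), 3))  # output moves close to the 0.2 target
```

Each update pushes the output a little closer to the target; in a real network the same chain-rule step is repeated layer by layer from the output back to the input, which is where the "back" in backpropagation comes from.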

Despite promising results, the technology wasn't quite up to the speed required to make this kind of deep learning viable. To bring AI up to the level we see today, researchers needed far more data to train networks on, and far more computational power to process it.

In 2006, professor Fei-Fei Li at Stanford University began building ImageNet. Li envisioned a database that held an image for every English noun, so she and her students began collecting and categorising photographs, using WordNet, an established database of words and their relationships, to label the images. The task was so huge it was eventually outsourced to freelancers, and by 2009 ImageNet had become by far the largest dataset of its kind.

Around the same time, Nvidia was working on the CUDA programming platform for its GPUs. This is the same company that just went hard on AI at 2025's GTC, and is even using the tech to help people learn sign language. With CUDA, these powerful compute chips could be far more easily programmed to tackle workloads beyond graphics. That allowed researchers to start implementing neural networks in areas like speech recognition, and actually see success.

In 2011, two of Geoffrey Hinton's students, Ilya Sutskever (who went on to co-found OpenAI) and Alex Krizhevsky, began work on what would become AlexNet. Sutskever saw the potential from their previous work and convinced his peer Krizhevsky to use his mastery of squeezing performance out of GPUs to train the neural network, while Hinton acted as principal investigator. Over the next year Krizhevsky trained, tweaked, and retrained the system on a single computer with two Nvidia GPUs, using his own CUDA code. In 2012 the three released a paper, which Hinton also presented at a computer vision conference in Florence.

Hinton summarised the experience to CHM as “Ilya thought we should do it, Alex made it work, and I got the Nobel Prize.”

It didn't make much noise at the time, but AlexNet completely changed the direction of modern AI. Before AlexNet, neural networks weren't commonplace in these developments. Now they're the framework for almost anything touting the name AI, from robot dogs with nervous systems to miracle-working headsets. As computers get more powerful, we're only set to see even more of it.

Given how huge AlexNet has been for AI, CHM releasing the source code is not only a wonderful nod, but also quite prudent in making sure this information is freely available to all. To ensure it was done fairly, correctly, and above all legally, CHM reached out to AlexNet's namesake, Alex Krizhevsky, who put them in touch with Hinton, by then working with Google after it acquired his startup. Hinton, now considered one of the fathers of machine learning, was able to connect CHM to the right team at Google, which began a five-year negotiation process before release.

This may mean the code, available to all on GitHub, is a somewhat sanitised version of AlexNet, but it's also the correct one. There are several repositories with similar or even the same name around, but they're likely to be homages or interpretations. This upload is described as the "AlexNet source code as it was in 2012", so it should serve as an interesting marker along the pathway to AI, and whatever form it learns to take in the future.


Hope Corrigan
Hardware Writer


