Why GPUs are Essential to AI, and How You can Profit from Contributing

Earn rewards for contributing your computer’s GPU power to decentralized AI

AI Network
7 min read · May 5, 2023
Image made with Lexica.art, edited with Photoshop

GPUs are essential to the function of AI.

Everyone knows what AI is — the popularity of artificial intelligence has skyrocketed since November 2022, when OpenAI released ChatGPT, the free AI chatbot that’s been answering all manner of complex questions and disrupting industries from writing to coding.

Since then, other AI models & services have seen immense rises in popularity — Stable Diffusion for images & ElevenLabs for speech synthesis, as two examples — and AI has been added to a host of existing services (like Notion.so, and GPT-4 to Bing after Microsoft’s multibillion-dollar investment in OpenAI).

ChatGPT racked up more than a million users in just 5 days of release (Instagram had to wait almost 3 months to reach a million and Netflix almost 4 years, just for comparison), and it’s safe to say it’s had a profound effect across multiple markets.

The importance of AI is difficult to ignore, especially as the tech develops at lightning-fast pace and the space grows even faster.

But what of the inner workings of these almost miraculous services? Are they as complex and enigmatic behind the scenes as your average developer may have you believe?

Different AIs run on different models, and these models vary service-to-service, but one physical component underpins a lot of how they practically function — GPUs.

GPU vs CPU Processing Power

For the uninitiated, a GPU is a graphics processing unit, and pretty much every modern laptop, desktop computer and gaming console has them.

They were originally developed to handle the complex computations required for rendering images & videos for gaming & computer-aided design (CAD), and were first created only as recently as the mid-90s.

GPUs process information differently from CPUs. CPUs — central processing units — are the typical units found in your laptop or desktop computer at home, and are designed to run a wide variety of tasks, like operating systems and applications.

Physically speaking, CPUs are chips that house a few high-performance processing cores with a large cache memory, and are optimized for single-threaded performance. This means they excel at executing one task, or a few tasks, at a time.

They’re great for running your computer’s operating system alongside a browser, a music player and Photoshop, but if you open 1,000 tabs in Chrome and simultaneously run the entire Adobe Creative Suite, the CPU will struggle to process all that information in parallel and your computer will slow down.

GPUs, on the other hand, are chips which contain hundreds or thousands of simple processing cores and have smaller cache memory. These cores are designed to work together to execute complex calculations in parallel, allowing GPUs to handle large amounts of data at once and perform multiple tasks simultaneously.
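The difference can be sketched in code. The snippet below is a Python analogy, not actual GPU code: it contrasts stepping through values one at a time, the way a single CPU core works through a task, with applying one operation to an entire array at once — the data-parallel pattern that GPUs, with their thousands of cores, are built for.

```python
import time
import numpy as np

# Hypothetical workload: brighten 2 million "pixels" (values in [0, 1]).
pixels = np.random.rand(2_000_000)

# One-at-a-time loop: roughly how a single core steps through the work.
start = time.perf_counter()
out_serial = [min(p * 1.5, 1.0) for p in pixels]
serial_time = time.perf_counter() - start

# One vectorized operation over the whole array: an analogy for a GPU
# applying the same instruction to many pixels simultaneously.
start = time.perf_counter()
out_parallel = np.minimum(pixels * 1.5, 1.0)
parallel_time = time.perf_counter() - start

print(f"one at a time: {serial_time:.3f}s")
print(f"all at once:   {parallel_time:.3f}s")
```

Both approaches produce identical results; the vectorized version simply finishes far sooner, which is the same trade-off the CPU-vs-GPU chart below illustrates for real pixel workloads.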

Image source

The chart above demonstrates the difference between CPU and GPU processing time (in milliseconds) when tasked with processing pixels for images or video. As can be seen, the CPU takes longer to process the same number of pixels as the GPU, and slows down significantly as the number of tasks to be performed rises, taking up to 6x longer than the GPU.

The Importance of GPU Processing Power in AI

Since their inception, GPUs have found themselves increasingly used in AI and machine learning, and now they’re critical to how various AI models function. Why?

Well, that’s down to how AI practically functions.

Take ChatGPT, for example. It’s a large language model (LLM), meaning it’s been trained on a vast amount of text data to learn the patterns between words, which is what allows it to answer questions in a relevant & coherent manner. It uses a neural network to model the relationships between words & phrases in language, and these networks consist of interconnected nodes that process and transmit information.

In AI, a neural network is like a big machine-learned brain (and indeed the architecture is loosely inspired by how a human brain functions).

The ‘GPT’ part of ChatGPT stands for generative pre-trained transformer, and the transformer architecture processes all the words in your input in parallel rather than one at a time. So when you type a question into ChatGPT and hit enter, it runs thousands of operations simultaneously through the nodes in the neural network to return a relevant answer.
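To make “thousands of operations simultaneously” concrete, here is a minimal sketch of one fully connected neural-network layer in NumPy. The sizes are illustrative assumptions, not ChatGPT’s real dimensions: the point is that a single matrix product evaluates every node in the layer at once, which is exactly the kind of parallel arithmetic GPUs accelerate.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative sizes only -- real LLM layers are far larger.
inputs = rng.standard_normal(512)            # one embedded input token
weights = rng.standard_normal((1024, 512))   # 1024 nodes, 512 weights each
biases = rng.standard_normal(1024)

# A single matrix-vector product evaluates all 1024 nodes in one go;
# on a GPU, those 1024 x 512 multiply-adds are spread across many cores.
activations = np.maximum(weights @ inputs + biases, 0.0)  # ReLU

print(activations.shape)  # prints (1024,)
```

A real model stacks many such layers, and each forward pass repeats this parallel arithmetic millions of times — which is why the hardware that can do it all at once matters so much.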

Physically speaking, the power behind the nodes in the neural network is supplied by GPUs, and GPUs are used because they have the physical capability to process large amounts of data and perform many tasks at the same time.

GPUs are similarly essential to other AI models, like Stable Diffusion — the AI that generates images from text prompts — in the sense that information is processed through a neural network running on GPUs to return a coherent image.

Image source: Lambda

This chart shows the processing speed of various NVIDIA GPUs when generating an image with Stable Diffusion. Recall the CPU vs GPU comparison in the previous section: CPU processing is slower to begin with, and slows down further when overloaded with parallel tasks — which is precisely the kind of workload a neural network produces. Using CPU processing power for AI would therefore be dramatically slower than using GPUs.

In practice, the nodes in a large neural network can only be run efficiently on GPUs, and therefore the GPU is, currently, essential to the function of AI.

Can You Get Involved in AI?

We’ve seen GPU power is critical to the function of various AI models, and indeed many wouldn’t function without it.

The GPU nodes in various AI neural networks are provided by a plethora of operators. Some are supplied by large corporations like NVIDIA, some are supported by smaller companies and some are provided by individuals.

All of these people & entities contribute resources to AI, allowing the field and the tech to function and push forward, and by supplying GPU resources, you could get involved too.

But why would you contribute resources to AI?

Well, apart from becoming involved in arguably the fastest-growing market in the world right now, for the same reason many of us do anything: to receive rewards.

A simple and easy way of doing that is through AI Network (AIN): the AI development ecosystem open to all, built on its own blockchain to enable collaborative AI.

How to Get Involved with AI & Earn Through AI Network

The easiest way to get involved in AI and earn while you do it is to own a Runo NFT.

Runo stands for “Running Node”, and is the Node NFT Character of the AI Network GPU Worker. By adopting a Runo NFT and keeping your node running, you’ll be contributing resources to AI and earning $AIN tokens daily while you do it.

The amount you earn depends on the NFT you own — the larger the NFT worker, the more $AIN you’ll be rewarded with. The 3rd round sale opens next week, and you can purchase a Runo NFT using this link.

Different types of Runo (Running Node) NFTs on AI Network. The larger the NFT, the higher the rewards.

AI Network has partnered with blockchain infrastructure builder DSRV Labs (who support various blockchains, like Solana, Terra & NEAR protocol), and owning a Runo NFT represents ownership in a DSRV GPU.

Owning a Runo NFT directly contributes to supporting AI as a node: not only are you rewarded for running a node on AI Network, you’ll also be contributing to the decentralization and democratization of AI.

We’re currently reaching a turning point. The popularity of AI is unquestionable, as is its potential utility for the future. In the very near future it will be decided whether AI goes the way of centralized control — like the huge tech companies that run Web2 search engines and social media platforms and harvest user data — or the way of decentralized, democratic control, where far more of the population will have a say in which direction AI goes.

Anyone who contributes their GPU power will not only receive rewards, but also contribute to putting control of future AI in the hands of the many, rather than in the hands of the few.

AI Network seeks to become the “Internet for AI” in the Web3 era. A decentralized AI development ecosystem built on its own blockchain, AI Network is open to all by gathering developers, creators, and GPU resources for development.

Join as a Captain to receive $AIN rewards for providing GPU resources. Join as a developer to gain access to GPUs and earn from writing open-source AI code. Join as a Creator to utilize AI Network’s developers, open source code, and GPUs to create your own AINFTs.

We represent the future of ‘collaborative AI’ to democratize AI and make it useful and utilizable for everyone.
