What Is A GPU? Why Is It Also Used In Deep Learning?

GPUs have attracted growing attention in recent years because they power deep learning, the dominant approach in AI today. As they underpin the development of AI technology, GPUs are becoming an increasingly important class of semiconductor product.

In this article, we outline what a GPU is and how it is used, and explain why it has come to be used in deep learning.

What Is A GPU?

A GPU (Graphics Processing Unit) is a type of semiconductor chip. Designed specifically for image processing, it handles the calculations needed to draw 3D graphics on PCs and game consoles.

CPUs are the best-known semiconductor chips, but CPUs and GPUs serve different purposes. The CPU acts like a human brain, performing general-purpose and complex calculations. The GPU, by contrast, is optimized for high-speed image processing and originally did little else.

However, a technology called GPGPU (General-Purpose computing on Graphics Processing Units) now allows GPUs to be used for tasks beyond image processing.

GPUs can be classified into two types: “integrated GPUs” built into the CPU and “discrete GPUs” that work as standalone chips. Intel, which holds the largest share of the CPU market, also leads in integrated GPUs. NVIDIA, famous as a GPU maker, leads the discrete GPU market with roughly an 80% share.

GPU Usage

Generally, the GPU is mounted on a graphics board to perform image processing. A graphics board is a unit that brings together the components needed to display images and video, and it is installed in PCs and game consoles.

In 3D graphics, 2D images are processed to look three-dimensional by adding shadows and depth, and doing this continuously expresses three-dimensional movement. Drawing 3D graphics requires calculating the value of every pixel that makes up the image, in parallel and at high speed, so the number of cores on the semiconductor chip matters.

The number of cores is like the number of workers: the more cores you have, the more processing you can do at the same time. A typical CPU has at most a few tens of cores even at the high end, whereas a GPU has thousands. A GPU can therefore handle the large amount of parallel computing that 3D graphics requires. Keep in mind, however, that each individual GPU core is considerably less powerful than a CPU core.
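The worker analogy can be sketched in code. This is an illustrative example (not tied to any particular GPU API): shading a frame is a per-pixel job, and because every pixel can be computed independently, the same operation can be applied to all pixels at once; the NumPy vectorized form below stands in for what a GPU's thousands of cores do in hardware.

```python
import numpy as np

# A Full HD frame is a 1080 x 1920 grid of pixel values (here, grayscale).
frame = np.linspace(0.0, 1.0, 1080 * 1920).reshape(1080, 1920)

# "One worker": touch pixels one at a time in a loop.
def darken_serial(img, factor=0.5):
    out = np.empty_like(img)
    for i in range(img.shape[0]):
        for j in range(img.shape[1]):
            out[i, j] = img[i, j] * factor
    return out

# "Many workers": apply the same operation to every pixel at once --
# the data-parallel pattern that GPU cores exploit.
def darken_parallel(img, factor=0.5):
    return img * factor

# Both produce identical results; only the execution strategy differs.
assert np.allclose(darken_serial(frame[:8, :8]), darken_parallel(frame[:8, :8]))
```

The serial version is run only on an 8 × 8 corner here, because looping over all two million pixels in pure Python is painfully slow, which is exactly the point of the analogy.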

GPUs are primarily used in applications that require 3D graphics. These include video work such as games, animation, and movies, as well as 3DCG for architecture and industrial products. More recently, GPUs have been adopted for new video technologies such as VR and AR, and, although it is not image processing, for deep learning; their range of applications keeps expanding.

Since GPU performance is directly tied to image-processing performance, it is often expressed in terms of resolution and frame rate. Resolution is simply the pixel density of the image: Full HD (1920 × 1080) is now commonplace, while high-performance systems target 4K (3840 × 2160) and 8K (7680 × 4320).
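The jump in workload between these resolutions is easy to quantify: each step up roughly quadruples the number of pixels the GPU must compute per frame.

```python
# Pixel counts for the resolutions mentioned above.
resolutions = {
    "Full HD": (1920, 1080),
    "4K": (3840, 2160),
    "8K": (7680, 4320),
}

for name, (w, h) in resolutions.items():
    print(f"{name}: {w * h:,} pixels")

# Each step quadruples the per-frame workload:
full_hd = 1920 * 1080              # 2,073,600 pixels
assert 3840 * 2160 == 4 * full_hd  # 4K  = 4x Full HD
assert 7680 * 4320 == 16 * full_hd # 8K  = 16x Full HD
```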

The frame rate (fps) indicates how many images can be displayed per second: the higher the value, the smoother the video. As a rough guide, 60 fps or more looks very smooth, 45–60 fps is average, and anything below that looks choppy.
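Another way to read the frame rate is as a per-frame time budget: at a given fps, the GPU must finish all of a frame's pixel calculations within a fixed number of milliseconds.

```python
# Time available to render one frame at a given frame rate.
def frame_budget_ms(fps):
    return 1000.0 / fps

assert round(frame_budget_ms(60), 1) == 16.7  # smooth: ~16.7 ms per frame
assert round(frame_budget_ms(45), 1) == 22.2  # average
assert round(frame_budget_ms(30), 1) == 33.3  # choppy: twice the slack, half as smooth
```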

Deep Learning And GPU

In recent years, GPUs have come to be used in deep learning, the mainstream of today's AI, and are attracting attention for that reason. GPUs are used instead of CPUs because, as mentioned above, GPUs excel at parallel processing.

Deep learning must learn from a huge amount of data and extract the characteristics of that data, and doing so involves countless calculations that can run in parallel.

For example, in image recognition, features such as colour and shape are read from the image, and the model must repeatedly adjust which features to focus on in order to get closer to the correct answer. Because the number of features and patterns is enormous, the model learns by repeating millions or tens of millions of calculations, gradually converging on the correct answer.
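This "repeat and gradually approach the correct answer" loop can be shown with a toy example. The sketch below is not a real image-recognition model; it fits a single made-up weight by gradient descent, but the structure (compute an error, nudge the parameter, repeat many times) is the same mechanism scaled down.

```python
import numpy as np

# Toy learning problem: recover the hidden weight true_w from examples.
rng = np.random.default_rng(0)
x = rng.normal(size=256)            # input data
true_w = 3.0                        # the "correct answer" to be learned
y = true_w * x                      # target outputs

w = 0.0                             # initial guess
lr = 0.1                            # learning rate (step size)
for _ in range(200):                # many repeated adjustments
    grad = 2 * np.mean((w * x - y) * x)   # gradient of mean squared error
    w -= lr * grad                  # step toward the correct answer

assert abs(w - true_w) < 1e-3       # converged close to the hidden weight
```

Note that even in this tiny example, the gradient averages over all 256 samples, and each sample's term is independent of the others; that per-sample independence is what a GPU parallelizes.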

If a GPU that specializes in parallel computing is used here instead of a CPU, AI training time can be cut substantially. GPU parallel computing is generally said to be 10 times or more faster than a CPU, so training on a GPU can finish in less than a tenth of the time it would take on a CPU.
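Why does deep learning map so well onto this parallel hardware? A neural-network layer's forward pass over a whole batch of samples is one large matrix multiplication, where every output element is an independent multiply-accumulate. The hedged sketch below (with made-up batch and layer sizes) shows that the per-sample loop and the single batched multiply compute the same thing; a GPU executes the batched form across thousands of cores.

```python
import numpy as np

rng = np.random.default_rng(42)
batch = rng.normal(size=(64, 128))     # 64 samples, 128 features each
weights = rng.normal(size=(128, 32))   # hypothetical layer weights

# Per-sample loop: how a chip with few cores would have to proceed.
out_loop = np.stack([sample @ weights for sample in batch])

# One batched multiply: the form a GPU spreads across thousands of cores.
out_batched = batch @ weights

assert out_batched.shape == (64, 32)
assert np.allclose(out_loop, out_batched)  # identical results, different strategy
```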

Until now, GPUs were designed specifically for image processing, but next-generation GPUs are now being developed and optimized explicitly for deep learning. GPUs will only grow in importance in the coming era of ever-increasing demand for AI.

Originally specialized in image processing, the GPU now underpins cutting-edge AI technology. Improvements in GPU performance are thus directly linked to the development of AI as well as of cutting-edge video technologies such as VR, AR, and the Metaverse.

Currently, NVIDIA is the top manufacturer of GPUs for AI, but other semiconductor makers such as Intel and AMD are trying to catch up. The GPU market is worth watching closely.

Vishak
