What is an NPU?
An NPU (Neural Processing Unit) is a microprocessor designed specifically to accelerate neural network workloads for Machine Learning and Artificial Intelligence. You can find NPUs in smartphone SoCs, in PC processors, or as standalone accelerators.
We cannot proceed further without shedding some light on what Machine Learning (ML) and Artificial Intelligence (AI) are.
Machine Learning and Artificial Intelligence
Machine learning is a branch of artificial intelligence that enables computers to learn from data and use it to perform tasks that would normally require human intelligence, such as recognizing faces, understanding speech, or predicting outcomes. Machine learning algorithms automatically find patterns and rules in the data they are exposed to, and they use what they learn to improve their performance. Smartphone cameras are a familiar example of ML at work: features like scene detection, portrait blur, and night mode all rely on models that learned from millions of example photos.
Artificial intelligence, on the other hand, is the broader field of making computers or other machines capable of doing things that would normally require human intelligence. These tasks include understanding language, recognizing images, solving problems, and learning from data.
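To make the idea of "learning from data" concrete, here is a minimal, purely illustrative Python sketch. The day/night task and the numbers are invented for illustration, not taken from any real phone pipeline; it simply shows a "model" deriving a rule from labelled examples and then applying it to new inputs.

```python
# A tiny sketch of "learning from data" (hypothetical numbers, not a real
# camera pipeline): the model learns a brightness threshold from labelled
# examples, then uses that rule to classify new values. More (and better)
# examples generally produce a better rule - the same idea, at a far larger
# scale, behind the ML features a phone's NPU accelerates.

# (brightness 0-255, label) pairs the "model" learns from
training_data = [(30, "night"), (45, "night"), (60, "night"),
                 (170, "day"), (200, "day"), (230, "day")]

def learn_threshold(examples):
    """Find the midpoint between the average 'night' and 'day' brightness."""
    nights = [b for b, label in examples if label == "night"]
    days = [b for b, label in examples if label == "day"]
    return (sum(nights) / len(nights) + sum(days) / len(days)) / 2

def predict(brightness, threshold):
    return "day" if brightness >= threshold else "night"

threshold = learn_threshold(training_data)
print(threshold)                # learned rule, e.g. 122.5
print(predict(90, threshold))   # -> "night"
print(predict(180, threshold))  # -> "day"
```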
NPUs on smartphone SoCs
In simpler terms, an NPU core is a processing unit inside a smartphone's SoC, and it has become an essential part of a modern phone. A smartphone's on-device AI capability is largely determined by how powerful the NPU in its SoC is.
If your phone’s SoC has decent NPU cores, you will be able to handle AI tasks such as:
- facial recognition,
- object detection in photography,
- speech recognition in voice assistants,
- word recognition and translation,
- language recognition and translation,
- and running these features with very little lag.
However, if your phone comes with below-average or weak NPU cores, you may be unable to use speech recognition in voice assistants without your device lagging.
Types of NPUs From Different OEMs
Most NPU cores in smartphone SoCs are designed by each OEM (Original Equipment Manufacturer) to its own specifications, although there are also off-the-shelf NPU designs that OEMs can license and incorporate into their SoCs.
Here is a list of some OEMs and the names of their NPUs:
Qualcomm (Hexagon DSP):
This is one of the most popular and capable NPUs around today. It is an integral part of Qualcomm’s Snapdragon SoCs, which power many Android phones, and more recently you can also find it in PC processors such as the Snapdragon X Elite. It is built to handle artificial intelligence, computer vision, and audio processing.
Samsung (Neural Processing Solution):
It is part of the latest lineup of Samsung Exynos SoCs, which power many Samsung phones. It can handle face recognition, scene detection, and image enhancement.
Apple (Neural Engine):
This NPU is a crucial part of Apple’s custom silicon (the A-series chips in iPhones and iPads and the M-series chips in Macs). It can handle Face ID, Siri, and augmented reality.
Huawei (Da Vinci Architecture):
It is part of Huawei’s own HiSilicon Kirin chipset lineup, which powers some Huawei phones. It can handle natural language processing, image recognition, and video analysis.
Google (Tensor Processing Unit or TPU):
It is an integral part of the Google Tensor chipsets that power the Pixel 6 and Pixel 6 Pro and newer models; in fact, Google’s Tensor SoC lineup is named after the Tensor Processing Unit. It can handle artificial intelligence, computer vision, and language processing, and it is supported by Google’s TensorFlow software. TPUs are also used in Google’s cloud services, such as Google Photos, Google Translate, and Google Assistant.
MediaTek (AI Processing Unit or APU):
The APU made its debut on the Helio P60 and has gone on to feature in most high-end MediaTek SoCs. It is supported by MediaTek’s NeuroPilot software and can handle artificial intelligence, image processing, computer vision, and more.
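Apps usually reach these NPUs through each vendor’s software stack (such as MediaTek’s NeuroPilot or Google’s TensorFlow Lite) rather than programming the hardware directly. As a rough, hedged sketch of what that looks like to a developer, here is a minimal TensorFlow Lite inference example in Python; the "model.tflite" file name is a hypothetical placeholder, and whether the work actually lands on an NPU depends on the device’s drivers and delegates.

```python
# Minimal TensorFlow Lite inference sketch (illustrative only).
# "model.tflite" is a hypothetical on-device model; on a phone, the runtime
# can hand this work to an NPU through a hardware delegate if one is available.
import numpy as np
import tensorflow as tf

interpreter = tf.lite.Interpreter(model_path="model.tflite")
interpreter.allocate_tensors()

input_details = interpreter.get_input_details()
output_details = interpreter.get_output_details()

# Dummy input shaped to whatever the model expects
input_shape = input_details[0]["shape"]
dummy_input = np.random.rand(*input_shape).astype(np.float32)

interpreter.set_tensor(input_details[0]["index"], dummy_input)
interpreter.invoke()

output = interpreter.get_tensor(output_details[0]["index"])
print("output shape:", output.shape)
```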
How NPUs work
A typical NPU is built around a “data-driven parallel computing” architecture, which is particularly good at processing large volumes of multimedia data such as video, images, and audio.
Data-driven parallel computing is a way of using many computers, or many parts of one computer, to work on a problem at the same time. It is useful for problems that involve a lot of data, such as analyzing large datasets, simulating complex systems, or creating realistic graphics. It can make the computation much faster and more efficient than using a single processor.
Understanding Data-Driven Parallel Computing
To understand data-driven parallel computing, let’s use an analogy. Imagine you have a large pile of books that you want to sort by their titles. If you do it by yourself, it will take a long time and a lot of effort. But if you have many friends who can help you, you can divide the pile into smaller piles and give each friend a pile to sort. Then, you can merge the sorted piles and get the final result. This is an example of data-driven parallel computing, where the problem (sorting books) is broken down into smaller sub-problems (sorting piles) that can be solved independently and in parallel by different workers (friends). The results are then combined to get the final solution.
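To make the analogy concrete, here is a small, illustrative Python sketch (the book titles are made up): one big unsorted list is divided into piles, the piles are sorted in parallel by worker processes, and the sorted piles are then merged into the final result.

```python
# The book-sorting analogy with Python's multiprocessing:
# split one big unsorted pile into smaller piles, sort the piles in parallel
# worker processes (the "friends"), then merge the sorted piles into one result.
import heapq
import random
from multiprocessing import Pool

def sort_pile(pile):
    """Each worker sorts its own pile independently."""
    return sorted(pile)

if __name__ == "__main__":
    titles = [f"Book {random.randint(0, 99999):05d}" for _ in range(100_000)]

    # Divide the big pile into 4 smaller piles
    piles = [titles[i::4] for i in range(4)]

    # Hand one pile to each worker and sort them at the same time
    with Pool(processes=4) as pool:
        sorted_piles = pool.map(sort_pile, piles)

    # Merge the sorted piles back into one fully sorted list
    result = list(heapq.merge(*sorted_piles))
    print(result[:3], "...", result[-3:])
```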
On a smartphone, data-driven parallel computing is what lets the NPU work through learning and decision-making tasks quickly. For example, it helps the phone take better photos and videos, recognize voices, or even enhance games.
In practice, this means the problem is split into smaller parts that different NPU cores work on at the same time, and the partial results are then combined to get the final answer. This makes the job quicker to finish, and it can improve the performance, security, and overall user experience of a smartphone, since more of the work is done quickly and locally on the device.
Why NPUs are important
NPU cores are very important because they make smartphones a lot smarter. They do this by:
- accelerating Machine Learning tasks dramatically, often by orders of magnitude compared to running them on general-purpose CPU cores (see the rough sketch after this list), and
- consuming less power and making better use of resources for Machine Learning tasks than GPUs and CPUs.
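The exact speed-up depends on the chip and the workload, but the underlying idea can be illustrated on any computer: the multiply-accumulate math that neural networks are built on runs far faster when many operations are executed in parallel by optimized hardware instead of one at a time. Here is a rough, CPU-only Python sketch of that gap; it is not an NPU benchmark, and the matrix sizes are arbitrary.

```python
# Rough CPU-side illustration (not an NPU benchmark): the same matrix
# multiplication run as a plain Python loop vs. a vectorized NumPy call.
# Dedicated parallel hardware (GPUs, NPUs) pushes this gap much further
# for the multiply-accumulate math that neural networks are made of.
import time
import numpy as np

n = 128
a = np.random.rand(n, n)
b = np.random.rand(n, n)

# Naive triple loop: one multiply-accumulate at a time.
start = time.perf_counter()
c_loop = np.zeros((n, n))
for i in range(n):
    for j in range(n):
        s = 0.0
        for k in range(n):
            s += a[i, k] * b[k, j]
        c_loop[i, j] = s
loop_time = time.perf_counter() - start

# Vectorized call: the same math dispatched to optimized parallel kernels.
start = time.perf_counter()
c_vec = a @ b
vec_time = time.perf_counter() - start

print(f"loop: {loop_time:.3f}s  vectorized: {vec_time:.5f}s")
print("results match:", np.allclose(c_loop, c_vec))
```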
Real-Life Uses of NPUs
Here is a list of some real-life examples/use cases that show the usefulness of NPUs in smartphone SoCs:
Taking better photos:
NPUs can help with enhancing the image quality, adjusting the brightness and contrast, applying filters and effects, and detecting faces and objects in the photos. For example, the Samsung Neural Processing Solution can handle face recognition, scene detection, and image enhancement.
Recognizing voices:
NPUs can help with understanding and responding to voice commands, translating speech to text, and generating natural-sounding speech. For example, the Apple Neural Engine can handle Face ID, Siri, and augmented reality.
Playing games:
NPUs can help with rendering realistic graphics, simulating physics, and creating immersive sound effects. For example, the Qualcomm Hexagon DSP can handle artificial intelligence, computer vision, and audio processing.
Diagnosing diseases:
NPUs can help with analyzing medical images, detecting abnormalities, and suggesting treatments. For example, the Huawei Da Vinci Architecture can handle natural language processing, image recognition, and video analysis.
Transcribing and translating speech:
NPUs can help with converting speech to text, translating speech to different languages, and generating captions and subtitles. For example, the Google TPU enables live transcribing and live translations for the Pixel 6 models and upwards.
Creating and editing videos:
NPUs can help with stabilizing and enhancing the video quality, applying filters and effects, and detecting faces and objects in the videos. For example, the Google TPU can handle artificial intelligence, computer vision, and language processing for the Pixel 6 models and upwards.
Searching and organizing photos:
NPUs can help with finding and grouping photos by date, location, person, or event, and creating albums and slideshows. For example, the Google TPU is used in Google Photos, which can automatically organize and edit photos.
Answering questions and providing information:
NPUs can help with understanding and responding to natural language queries, retrieving relevant information from the web, and providing personalized suggestions and recommendations. For example, the Google TPU is used in Google Assistant, which can answer questions, perform tasks, and control smart devices.
Conclusion
We have seen that NPUs are special chips in smartphone SoCs that can learn and make decisions. They are useful for tasks that require a lot of data processing and learning, such as image recognition, voice recognition, natural language processing, and augmented reality. NPUs can perform complex mathematical operations faster and more efficiently than CPU cores or GPUs. NPUs also reduce power consumption and improve the battery life of a smartphone. This is because they can offload some of the work from the CPU cores and the GPU cores.
Different smartphone SoCs have different NPUs that have different names, designs, and features. Some of the most common NPUs are the Qualcomm Hexagon DSP, the Samsung Neural Processing Solution, the Apple Neural Engine, the Huawei Da Vinci Architecture, and the Google Tensor Processing Unit.
These NPUs can handle various tasks and applications, such as taking better photos, recognizing voices, playing games, diagnosing diseases, transcribing and translating speech, creating and editing videos, searching and organizing photos, and answering questions and providing information.
NPUs are constantly evolving and improving as new technologies and innovations are developed and implemented. They are becoming more powerful, faster, smarter, and more energy-efficient, which improves the user experience. So the next time you are buying a phone, be sure to check these specs: a good phone should have a dedicated NPU for machine learning and artificial intelligence.