Developers, or anyone who wants to learn more about the deep learning accelerator on the NVIDIA Jetson Orin mini PC, will be happy to know that NVIDIA has published a new article on its tech blog providing an overview of the deep learning accelerator (DLA) used with the Jetson system, which combines the CPU and GPU into one module. This gives developers a vast NVIDIA software stack in a small, low-power package that can be deployed at the edge.
Deep Learning Accelerator
“Although DLA does not have as many supported layers as GPUs, it still supports a wide range of layers used in many popular neural network architectures. In many cases, layer support can cover the requirements of your model. For example, the NVIDIA TAO Toolkit includes a wide range of DLA-supported pretrained models, from object detection to action recognition.”
“While it’s important to note that DLA’s bandwidth is typically lower than that of a GPU, it is energy efficient and can offload deep learning workloads, freeing up the GPU for other tasks. Alternatively, depending on your application, you can run the same model on the GPU and DLA at the same time to achieve higher net throughput.”
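Offloading a model to the DLA in practice typically goes through TensorRT. As a minimal sketch (assuming TensorRT is installed and `model.onnx` is a placeholder for your own network), the `trtexec` tool can build and benchmark an engine on a DLA core, falling back to the GPU for unsupported layers:

```shell
# Build and time an engine on DLA core 0; DLA requires INT8 or FP16 precision.
# --allowGPUFallback lets layers the DLA cannot run execute on the GPU instead.
trtexec --onnx=model.onnx --useDLACore=0 --fp16 --allowGPUFallback

# For the concurrent GPU + DLA setup described above, a separate GPU-only
# engine can be built from the same model and run alongside the DLA engine.
trtexec --onnx=model.onnx --fp16
```

On devices with two DLA cores, such as Jetson AGX Orin, `--useDLACore=1` targets the second core, so two DLA engines and a GPU engine can run concurrently.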
“Many NVIDIA Jetson developers are already using DLA to successfully optimize their applications. Postmates optimized their delivery robot application on the Jetson AGX Xavier using DLA along with a GPU. Cainiao ET Lab used DLA to optimize their logistics system. If you want to fully optimize your application, DLA is an important part of Jetson’s repertoire to consider.”
For more information on using the Deep Learning Accelerator with Jetson Orin, please visit the official NVIDIA Blog using the link below.