Cloud/Data Center:
Acceleration of frequently used, highly parallel, highly vectorized workloads
Eg: Machine Learning/Deep Learning:
– Training and inference of all types of AI workloads
– Streaming data processing
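To make "highly parallel, highly vectorized" concrete, the sketch below shows a minimal dense-layer forward pass in NumPy, the kind of kernel such accelerators target. The shapes, values, and the layer itself are illustrative assumptions, not taken from the original material.

```python
import numpy as np

# Illustrative dense-layer forward pass: one matmul replaces on the
# order of 64*128*32 scalar multiply-adds, all independent and hence
# a natural fit for parallel/vector hardware.
rng = np.random.default_rng(0)
x = rng.standard_normal((64, 128))   # a batch of 64 input vectors
W = rng.standard_normal((128, 32))   # layer weights
b = np.zeros(32)                     # layer bias

y = x @ W + b                        # vectorized matrix multiply + add
out = np.maximum(y, 0.0)             # ReLU, applied element-wise in parallel
```

The same structure appears in both training and inference, which is why accelerators optimize the matrix-multiply primitive so heavily.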
Data processing at the edge of the network, the point where the data is originally collected:
Acceleration of AI/ML workloads reduces latency and transmission costs
Eg: Predictive maintenance:
– Preliminary data screening and real-time analysis
– Instant response to requirements
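The predictive-maintenance bullets above can be sketched as a simple edge pre-screening loop: analyze raw sensor windows locally and forward only the suspicious ones, which is what cuts latency and transmission cost. The window size, threshold, and sample values are illustrative assumptions.

```python
import math

def rms(window):
    # Root-mean-square energy of one sensor window.
    return math.sqrt(sum(s * s for s in window) / len(window))

def screen(samples, window_size=4, threshold=1.0):
    """Yield only the (offset, window) pairs worth sending upstream.

    Threshold and window size are hypothetical tuning parameters."""
    for i in range(0, len(samples) - window_size + 1, window_size):
        window = samples[i:i + window_size]
        if rms(window) > threshold:
            yield i, window

# Illustrative vibration trace: quiet baseline, then a spike that
# a predictive-maintenance system would want to act on immediately.
readings = [0.1, 0.2, 0.1, 0.0,
            2.5, 2.4, 2.6, 2.7]
flagged = list(screen(readings))
```

Only the spiking window is forwarded; the normal baseline never leaves the device.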
Handheld / Portable / Embedded Devices:
Acceleration of AI/ML imaging workloads on the device, where low power is of paramount importance
Eg:
– Object detection, classification and identification
– Augmented, Virtual and Mixed Reality (AVMR) devices
– Dialog Systems
– Surveillance systems
Acceleration of DSP Algorithms
Eg:
Baseband processing:
NavIC Baseband Processor – for IRNSS-based global positioning receiver devices
Synthetic Aperture Radar
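The baseband and radar workloads above share one inner loop: the FIR filter (channel filtering, matched filtering, correlation). Below is a minimal direct-form FIR sketch; the 3-tap coefficients are an illustrative smoothing filter, not an actual NavIC or SAR filter design.

```python
def fir(signal, taps):
    """Direct-form FIR: y[n] = sum over k of taps[k] * x[n-k].

    Each output sample is an independent multiply-accumulate chain,
    which is exactly the pattern DSP accelerators pipeline in hardware."""
    out = []
    for n in range(len(signal)):
        acc = 0.0
        for k, h in enumerate(taps):
            if n - k >= 0:           # samples before the signal start are zero
                acc += h * signal[n - k]
        out.append(acc)
    return out

# Illustrative input and taps (a simple 3-tap smoother).
smoothed = fir([0.0, 4.0, 0.0, 4.0, 0.0], [0.25, 0.5, 0.25])
```

In a real receiver the taps would come from the filter design (e.g., a matched filter for the transmitted waveform) and the loop would run on vector or dedicated MAC hardware rather than in Python.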