Augmented and virtual reality (AR/VR) live video streaming requires high-performance video encoding and transcoding. With Codensity video processing units, the video quality and bitrate efficiency of file-based encoding workflows are now achievable in AR and VR live streaming applications.
Live video streaming platforms and services require a video encoding solution that offers both low latency and high visual quality. NETINT’s ASIC-powered encoders are the only commercially available solutions able to deliver end-to-end latency of less than 50 ms at an operational cost 20-40x lower than software encoding on data center servers.
NETINT VPUs are available in U.2 or PCIe form factors, enabling ten or more video transcoders to be installed in a single 1RU server. A typical 1RU configuration can encode up to 160 concurrent real-time 1080p60 streams in H.264, HEVC, or AV1, making these VPUs ideal for latency-sensitive AR and VR applications such as live sports streaming and interactive gaming.
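For a rough sense of what that density means at rack scale, here is a minimal back-of-the-envelope sketch in Python. It assumes the figures above (ten VPUs and 160 concurrent 1080p60 streams per 1RU server) plus an illustrative 40 usable rack units; the numbers are for illustration, not a sizing guarantee.

```python
# Back-of-the-envelope rack density estimate for ASIC-based transcoding.
# Per-server figures come from the text above; the rack-unit count is an assumption.

STREAMS_PER_1RU_SERVER = 160   # concurrent 1080p60 streams per 1RU server (per the text)
VPUS_PER_SERVER = 10           # U.2 / PCIe transcoders installed per server (per the text)
USABLE_RACK_UNITS = 40         # assumed usable 1RU slots in a standard rack

def rack_capacity(servers: int = USABLE_RACK_UNITS) -> dict:
    """Estimate VPU count and concurrent 1080p60 channels for a full rack."""
    return {
        "servers": servers,
        "vpus": servers * VPUS_PER_SERVER,
        "concurrent_1080p60_streams": servers * STREAMS_PER_1RU_SERVER,
    }

if __name__ == "__main__":
    print(rack_capacity())  # {'servers': 40, 'vpus': 400, 'concurrent_1080p60_streams': 6400}
```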
Video technology has demonstrated remarkable versatility, with the potential to transmit real-time, interactive experiences. Video can encode social, entertainment, informational, or software-based experiences and deliver them to ubiquitous devices like PCs, TVs, and smartphones. Achieving more fluid collaborative and interactive experiences requires higher resolutions and framerates at lower latency. For services aspiring to scale, the challenges have included cost, quality of service, visual quality, and motion-to-photon latency.
New cloud architectures promise to resolve these user experience gaps. They distribute compute to the network edge, lowering latency, decreasing backhaul traffic, and enabling new forms of data, sensor, and display processing at the edge. Unfortunately, the cost of deployment and operations increases sharply at the edge of the network. The combination of smaller scale, expensive space and enclosures, remote management and maintenance, multi-site security, dedicated power systems, and other factors makes base stations and their server capacity expensive relative to large data centers.
Making edge computing economically viable requires breakthroughs in computing performance and hardware density. Simply reaching scale with social streaming services is extremely expensive; increasing the compute workload by running remote applications can drive costs to insurmountable levels.
Making edge computing affordable requires realistic network topologies that balance the economics of scale with the need to drive low latency and high visual quality for interactive service delivery.
A new generation of dedicated video transcoders can cut the compute required at a given quality level by 10x and improve performance per watt by 20x, making cloud edge video services far more viable. NETINT, a pioneer in ASIC-based encoding, has introduced a family of dedicated video processors that combine video encoding, solid-state storage, and machine learning.
These processors radically reduce the server footprint required for interactive entertainment like gaming and mixed reality and can scale to economically deliver any native desktop, mobile, or head-mounted display application from the cloud edge.
Our ASIC-based technology, Codensity™, keeps advancing in scope and performance. We currently offer two proprietary solutions, Codensity G4 and G5. Talk to us to find out which is best for your needs.
NETINT Codensity ASIC-based solutions are available in five different configurations. Check out the product pages or talk directly with NETINT experts.
One size does not fit all! That's why our engineers will work directly with you to design the best solution, then deploy, test, refine, and scale it.
We offer the only ASIC-based encoding solution that can be implemented immediately and, once deployed, can take your video services to levels of scale and efficiency previously out of reach.
“We started NETINT with a big dream of combining the benefits of silicon with the quality and flexibility of software for video encoding, using purpose-built ASICs designed by us. In 2021 alone, NETINT customers encoded 200 billion minutes using our innovative video processing units.”
Alex Liu, Co-Founder, NETINT
Higher efficiency compared with SW on CPU.
Using ASICs, you can achieve 20x higher efficiency compared with software running on CPUs, reducing your operational cost by an order of magnitude.
Reduced CO2 compared with SW on CPU.
Using ASICs, you can improve environmental sustainability by cutting CO2 emissions more than 80x compared to software running on x86 CPUs in a data center.
1080p60 live streaming cost per channel.
With ASICs you can have it all: the flexibility and quality of software with enormous power and cost benefits, for as low as $150 per channel for a 1080p60 live stream. The sketch below shows how that figure compares with a software baseline.
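The short Python sketch below contrasts the $150-per-channel ASIC figure quoted above with a software baseline derived from the claimed 20x efficiency advantage. The software figure is an extrapolation for illustration only, not a measured price.

```python
# Illustrative per-channel cost comparison for 1080p60 live encoding.
# The ASIC figure comes from the text above; the software baseline is an
# extrapolation from the claimed 20x efficiency advantage, not a measured price.

ASIC_COST_PER_CHANNEL = 150   # USD per 1080p60 channel (per the text)
EFFICIENCY_ADVANTAGE = 20     # claimed ASIC-vs-software efficiency multiple
SW_COST_PER_CHANNEL = ASIC_COST_PER_CHANNEL * EFFICIENCY_ADVANTAGE  # assumed baseline

def fleet_cost(channels: int) -> tuple[float, float]:
    """Return (asic_cost, software_cost) in USD for a given channel count."""
    return channels * ASIC_COST_PER_CHANNEL, channels * SW_COST_PER_CHANNEL

asic, software = fleet_cost(1000)
print(f"1,000 channels: ASIC ${asic:,.0f} vs software ${software:,.0f} "
      f"({software / asic:.0f}x difference)")
```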
With the growth of interactive streaming video applications, including cloud gaming, hyperscale video platforms face operational pressure to improve both video processing and encoding performance while maximizing power efficiency and minimizing their environmental footprint.
The Integrated Game Streaming Platform, powered by NETINT ASIC encoding technology, gives hyperscale cloud gaming platforms higher levels of performance than CPU-based software encoding systems, while reducing TCO by as much as 40x and carbon emissions by 80x.
The Integrated Game Streaming Platform’s class-leading performance comes from pairing the advanced encoding capabilities of NETINT’s Quadra T2 Video Processing Unit (VPU) with a Graphics Processing Unit (GPU) to create a fully integrated, high-density game streaming solution.
The Quadra T2 is a next-generation low-latency, real-time Video Processing Unit featuring AV1, HEVC, and H.264 video encoding at up to 8K resolution with 10-bit HDR.
The addition of a GPU to the Integrated Game Streaming Platform enables the high-performance rendering of graphics-intensive 3D gaming workloads needed to deliver immersive cloud gaming experiences.
50x increase in game streaming density compared to software encoding.
Supports a wide variety of cloud gaming formats.
Optimized for metaverse video and cloud gaming applications. 8ms latency.
Enables advanced processing, including object detection, classification, segmentation, and region-of-interest (ROI) encoding for image quality improvement and content-adaptive rate control.
Multi-format support for operational flexibility.
High-performance GPU acceleration for immersive cloud gaming experiences.
High-capacity throughput for rapid deployment of additional gaming streams.
Video cropping, padding, and scaling for encoding ladder generation and image composition; video overlay and YUV/RGB conversion (illustrated in the sketch below).
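To make the ladder-generation capability concrete, here is a minimal Python sketch of how an encoding ladder for a 1080p60 source might be described before being handed to the VPU's scaler and encoders. The rung resolutions and bitrates are common industry choices, not NETINT-specified values, and the dictionary format is purely illustrative.

```python
# A minimal, illustrative encoding ladder for a 1080p60 source.
# Rung resolutions and bitrates are common industry choices, not NETINT-prescribed
# values; in a real deployment each rung would be produced by the VPU's on-chip
# scaler and encoded in H.264, HEVC, or AV1 as required.

SOURCE = {"width": 1920, "height": 1080, "fps": 60}

LADDER = [
    {"width": 1920, "height": 1080, "fps": 60, "bitrate_kbps": 6000},
    {"width": 1280, "height": 720,  "fps": 60, "bitrate_kbps": 3500},
    {"width": 854,  "height": 480,  "fps": 30, "bitrate_kbps": 1600},
    {"width": 640,  "height": 360,  "fps": 30, "bitrate_kbps": 800},
]

def describe_ladder(source: dict, ladder: list[dict]) -> None:
    """Print each rung and its downscale factor relative to the source."""
    for rung in ladder:
        scale = source["width"] / rung["width"]
        print(f'{rung["width"]}x{rung["height"]}@{rung["fps"]} '
              f'-> {rung["bitrate_kbps"]} kbps (downscale {scale:.2f}x)')

describe_ladder(SOURCE, LADDER)
```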
Utilizing the Integrated Game Streaming Platform, mobile gaming platforms can reduce their server footprint by 50x compared to CPU-powered software game processing and encoding.
This increase in game rendering and encoding density expands the number of channels that can be encoded without increasing the rack footprint.
Higher density can be achieved with reduced power and without sacrificing video quality or latency.
The advanced architecture of the Integrated Game Streaming Platform leverages a high-performance GPU for real-time rendering of complex game graphics, which are then encoded in real time by the Quadra T2 VPU.
The combined capacity of the architecture enables 200 game streams to be processed and encoded concurrently.
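For a rough sense of what the 200-stream figure implies for fleet sizing, here is a small Python sketch estimating how many integrated servers versus CPU-only software servers a target concurrency would require. It uses the 200 streams per platform and the 50x density claims above; the software baseline is derived from those claims, not independently measured.

```python
# Rough fleet-sizing comparison for a cloud gaming service.
# 200 concurrent streams per integrated GPU + Quadra T2 server comes from the text;
# the software-only baseline is derived from the claimed 50x density advantage,
# so it is an extrapolation, not a measured figure.

import math

STREAMS_PER_INTEGRATED_SERVER = 200  # per the text
DENSITY_ADVANTAGE = 50               # claimed vs CPU-based software encoding
STREAMS_PER_SOFTWARE_SERVER = STREAMS_PER_INTEGRATED_SERVER / DENSITY_ADVANTAGE  # = 4

def servers_needed(concurrent_players: int) -> dict:
    """Estimate server counts for a target number of concurrent game streams."""
    return {
        "integrated": math.ceil(concurrent_players / STREAMS_PER_INTEGRATED_SERVER),
        "software_only": math.ceil(concurrent_players / STREAMS_PER_SOFTWARE_SERVER),
    }

print(servers_needed(100_000))  # {'integrated': 500, 'software_only': 25000}
```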