NVIDIA plans to create a new type of data center chip

Nvidia on Monday laid out a multi-year plan to create a new type of chip for data centers, with the goal of pulling more functions away from its main competitor, Intel.

Nvidia said in a press briefing that models with limited features will debut within months, while full versions are planned within two years. It added that server makers such as Dell and Lenovo plan to integrate the chips into their systems.

Nvidia chips have long been used to improve video game graphics, but in recent years they have also been used to accelerate artificial intelligence tasks such as image recognition. Typically, the chips sit alongside an Intel central processor, offloading some of its computing work.

Nvidia is now seeking to take on more of that functionality with its proposed data processing unit (DPU). The chip will combine the networking technology Nvidia acquired through its $6.9 billion purchase of Mellanox with artificial intelligence capability and ARM-based computing cores. Nvidia last month agreed to buy ARM from SoftBank for $40 billion.

Manuvir Das, Nvidia's head of enterprise computing, said in a press briefing that, using artificial intelligence, the chips can detect intruders trying to break into a data center. The chips will review network traffic for unusual patterns and seek to block threats proactively, tasks that previously required a combination of separate chips.

The ARM acquisition gives the company a greater presence in mobile computing, especially when it comes to bringing its AI technology to platforms such as smartphones, computers, and self-driving cars.

To reinforce its commitment, Nvidia says it will build an AI supercomputer powered by ARM CPUs at ARM's Cambridge headquarters.

The company confirmed that ARM will continue to maintain its open licensing model and its neutrality toward existing customers. This opens the door to ARM designs that also incorporate Nvidia's technology, especially its GPUs.
