Intel Optane DC Persistent Memory Improves Search, Reduces Costs in Baidu’s Feed Stream Services

Intel today announced Baidu is architecting the in-memory database of its Feed Stream services to harness the high-capacity and high-performance capabilities of Intel® Optane DC persistent memory. By building a new memory platform that pairs Intel Optane DC persistent memory with 2nd Gen Intel® Xeon® Scalable processors, Baidu can lower its total cost of ownership (TCO) while delivering more personalized search results to users. Intel and Baidu disclosed details of this deployment and other joint collaborations on Thursday at the 2019 Baidu ABC Summit in Beijing.

“For over 10 years, Intel and Baidu have worked closely together to accelerate Baidu’s core businesses, from search to AI to autonomous driving to cloud services. Our deep collaboration enables us to rapidly deploy the latest Intel technologies and improve the way customers experience Baidu’s services,” said Jason Grebe, Intel corporate vice president and general manager of the Cloud Platforms and Technology Group.

As companies like Baidu manage the explosive growth of data, the need to quickly and efficiently access and store data is imperative. With today’s news, Baidu is advancing its Feed Stream services to deliver more personalized content to its customers.

Baidu uses an advanced in-memory database called Feed-Cube to support data storage and information retrieval in its cloud-based Feed Stream services. Deploying Intel Optane DC persistent memory and 2nd Gen Intel Xeon Scalable processors enables Baidu to ensure high concurrency, large capacity and high performance for Feed-Cube while reducing TCO.

Through close collaboration, Intel and Baidu architected a hybrid memory configuration that combines Intel Optane DC persistent memory with DRAM within Baidu’s Feed Stream services. With this approach, Feed-Cube improved search response times even under the pressure of heavy concurrent access. At the same time, single-server DRAM usage dropped by more than half, reducing costs given Feed-Cube’s petabyte-level storage capacity. Intel and Baidu have published a detailed case study of this work, including examples of other applications using Intel Optane DC persistent memory technology, such as Redis, Spark and function-as-a-service platforms.

“Using Intel Optane DC persistent memory within the Feed-Cube database enables Baidu to cost-effectively scale memory capacity to stay on top of the continuously expanding demands placed on our Feed Stream services,” said Tao Wang, chief architect, Recommendation Technology Architecture at Baidu.

Today’s news comes on the heels of Intel and Baidu recently signing a new memorandum of understanding (MoU) aimed at deepening the collaboration between the two companies in Baidu’s core business areas. Baidu and Intel will continue to work together on new products and technologies that play an increasingly important role in Baidu’s core internet businesses as well as in critical applications and services. This deeper collaboration will help Baidu provide a more diverse and engaging user experience to its customers.
