Omdia: AI boosts server spending but unit sales still plunge

By dutchieetech.com | 5 December 2023

Server unit shipments for calendar year 2023 could drop by as much as 20% compared with last year, according to projections by a market research firm. However, revenues are rising for the server vendors, so while they may be selling fewer servers in terms of units, they are selling pricier, more decked-out hardware.

In its latest market update for cloud and data center, Omdia forecasts server unit shipments to decline by 17% to 20% this year, while revenue is expected to grow by 6% to 8%.
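
As a rough, illustrative check (the input ranges come from the forecast above; the calculation itself is not from Omdia's report), falling unit shipments combined with rising revenue imply a sizeable jump in average revenue per server shipped. The short Python sketch below does that back-of-the-envelope arithmetic.

```python
# Back-of-the-envelope arithmetic using the forecast ranges quoted above
# (units down 17-20%, revenue up 6-8%). The helper and its output are
# illustrative only, not figures from Omdia's report.

def implied_asp_change(unit_change: float, revenue_change: float) -> float:
    """Implied change in average revenue per server shipped."""
    return (1 + revenue_change) / (1 + unit_change) - 1

for units, revenue in [(-0.17, 0.06), (-0.20, 0.08)]:
    change = implied_asp_change(units, revenue)
    print(f"units {units:+.0%}, revenue {revenue:+.0%} -> avg revenue per server {change:+.1%}")

# units -17%, revenue +6% -> avg revenue per server +27.7%
# units -20%, revenue +8% -> avg revenue per server +35.0%
```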

Omdia attributes the shift in spending to the rise of heterogeneous computing. Instead of buying servers with just x86 processors, customers are buying servers with GPUs, DPUs, AI processing and inferencing chips, and other silicon processors.

With so many expensive chips going into the servers, Omdia predicts CPUs and co-processors will account for 30% of data center spending by 2027, compared with less than 20% in the previous decade.

Unsurprisingly, this shift is being driven by AI. Omdia said there has been a dramatic shift in data center investment priorities this year, driven by a rush to build AI capacity, which made forecasting in 2023 extremely difficult.

Nvidia is the most popular supplier of GPUs for AI processing, despite two ready rivals in Intel and AMD. Last quarter, Nvidia sold almost half a million GPUs to data center customers, and those processors go for a rumored $40,000 per card.

And while Nvidia did not complain about supply on its most recent earnings call, it apparently does have a problem with it. Omdia says major server OEMs like Dell, Lenovo, and HPE are not yet able to fulfill GPU server orders due to a lack of GPU supply from Nvidia. OEMs indicated a lead time of 36 to 52 weeks for servers configured with Nvidia H100 GPUs.

That is partially because a few players are taking all of the supply. Omdia noted that both Microsoft and Meta are on track to receive 150,000 of Nvidia’s H100 accelerators by the end of this year – three times as many as Nvidia’s other major customers, Google, Amazon, and Oracle.

These high-powered servers are also driving demand for better power efficiency and management. Data center operators need to get more compute power out of the same power envelope due to constraints on power supply. Omdia said rack power distribution revenue in 1H23 was up 17% over last year, while UPS revenue growth for the first half of 2023 was 7% ahead of last year.

“With a ramp of professional services for generative AI enabling broad enterprise adoption in 2024 and beyond, the only thing that can curb the current rate of AI deployment is power availability,” Omdia said in its report.

AI is also driving demand for liquid cooling, since air cooling is simply no longer efficient for the very hot processors used in AI. Cooling vendors and server OEMs tell Omdia that direct-to-chip liquid cooling is ramping in line with its forecast of 80% revenue growth during the year, and it noted that server vendor Super Micro recently said it expects 20% of the servers it ships in 4Q23 to use direct-to-chip liquid cooling.

Through 2027, Omdia expects continued growth in rack power density, server performance improvement, and server fleet consolidation. There will be a strong focus on computing performance to enable the commercialization of AI. AI models will continue to be a research project requiring a great deal of tuning, even with libraries of pre-trained models.

Because of the pace of advancements, it expects meaningful server refresh cycles to occur, where enterprises will prioritize equipment consolidation and utilization improvement during the refresh cycle. And it expects the trend toward hyper-heterogeneous computing, with servers configured with 1-2 CPUs and up to 20 workload-optimized custom-built co-processors, to enable a significant consolidation in server fleets.
