SK hynix Unveils Industry’s First 16-Hi HBM3E Memory, Up To 48 GB Per Stack, PCIe 6.0 SSDs Also In The Works

SK hynix has unveiled the world's first 16-Hi HBM3E memory solution, which packs up to 48 GB of capacity per stack…

US Policymakers Are Reportedly Open To A Potential Intel “Merger Deal”, As They Explore Options To Pull Team Blue Out Of Danger

Intel's financial troubles have started to worry the US Commerce Department, which is now reportedly exploring options for recovery, including a potential merger…

Personal training is losing its appeal, according to research – and AI apps could be to blame

On October 29, we reported that the American College of Sports Medicine (ACSM) had published its annual survey of predicted top fitness trends for 2025. The research provides “valuable…

AWS CEO estimates large-city-scale power consumption for future AI model training tasks — 'an individual model may require somewhere between one to 5GW of power'

AWS CEO Matt Garman estimates that future LLM training could require up to five gigawatts, and AWS is investing in alternative renewable sources to ensure it has power available…

Samsung Expects HBM3E Integration In NVIDIA’s AI Accelerators By Next Quarter, Refuting Rumors Of An “HBM Business” Fallout

Samsung's ambitions of supplying HBM products to NVIDIA aren't over yet, as the firm announced that its 5th-gen HBM3E memory is slated to be used in NVIDIA's flagship AI accelerators…
