What just happened? Samsung has just announced the mass production of the industry's skinniest DRAM memory. These cutting-edge LPDDR5X chips measure just 0.65mm thick, shaving a decent 0.06mm off the typical package height. The company says this will allow enhanced thermal control suitable for on-device AI mobile applications.

Built on Samsung's 12nm process tech, the new packages will come in 12GB and 16GB sizes. They utilize an innovative 4-stack design made possible by some clever engineering tricks, including optimizing the printed circuit board, epoxy molding compound, and back-lapping process.

Beyond just letting smartphone makers pursue ever-thinner design fantasies, the slimline memory could offer performance benefits too. Samsung claims the reduced thickness can allow for improved airflow inside a device's chassis, providing better thermal management. The company says all this is important for "high-performance applications" such as on-device AI.

The 0.65mm packages are 9% trimmer than standard LPDDR5X modules, while heat resistance has been boosted by over 21% versus previous-generation chips. However, whether this translates to thinner phones remains to be seen – handset makers employ a variety of slimming techniques beyond just using skinnier RAM.

Still, combined with other space-saving design choices for components like displays and batteries, the adoption of Samsung's new DRAM could enable slight reductions in overall device thickness.

Looking ahead, Samsung plans to push the LPDDR5X density even higher with 24GB 6-layer packages and 36GB 8-layer stacks, though we don't know how thin those future memory modules might be. Back in April, the company also unveiled what it claimed was the fastest LPDDR5X DRAM in the industry.

Any thermal headroom doesn't mean much if smartphone makers don't put it to good use, of course. Samsung's been trying to expand the scope of its Galaxy AI – the latest development is an upcoming suite of clever camera tricks like Auto Zoom and Portrait Studio cartoon filters. But it's unclear how many of those AI-powered features will actually run locally on the hardware.

Despite all the on-device AI hype, there aren't really any mind-bendingly complex generative models running on mobile chips yet. There are a few basic text-prediction smarts, but nothing remotely taxing enough to stress a smartphone's thermals.