Billionaire Philippe Laffont Just Bought 2.8 Million Shares of This Unstoppable Nuclear Power Stock


By now, I’m sure you’ve realized that artificial intelligence (AI) is the next megatrend.

What you may not fully grasp, however, is that not all AI opportunities reside in the technology industry. Take the energy sector as an example. So far in 2024, that sector has returned 12.5% — pretty solid, though it pales in comparison to the tech sector's 35% year-to-date return.


While the energy sector as a whole may not be as lucrative as the technology industry, consider that uranium stocks have returned roughly 36% year to date — nearly identical to the tech industry’s result. What gives? Well, believe it or not, nuclear power has some pretty lucrative opportunities related to AI.

During the third quarter, Philippe Laffont’s hedge fund, Coatue Management, amped up its stake in nuclear powerhouse Constellation Energy (NASDAQ: CEG) — scooping up 2.8 million shares and increasing the fund’s position by 57%. Let’s consider how the AI narrative is impacting the nuclear energy space, and whether this would be a good time to follow Coatue’s lead.

The AI ecosystem can be complicated to understand, so I like to think about it in terms of a metaphor. If generative AI is a car, then graphics processing units (GPUs) are the engine powering the vehicle. But beyond the vehicle itself are the factories where cars are assembled and the roads they drive on. In my metaphor, data centers represent these vital pieces of infrastructure.

Companies such as Nvidia and Advanced Micro Devices have benefited greatly over the last couple of years by selling the GPUs that AI requires, but so have companies specializing in data center services. Data centers house the networking infrastructure needed to connect GPUs, CPUs, and storage and memory chips.

As demand for AI processing power rises, so does the need for these data centers. But once a data center is built, keeping it running at full speed around the clock is expensive, particularly when it comes to electricity.

Training and inference workloads are expected to rise considerably as developers build more sophisticated, cutting-edge AI models. As that occurs, the amount of electricity drawn to power those data centers will increase in tandem. Translation: AI companies need to identify more cost-efficient ways to power data centers.


