NEW LAUNCHES
The latest features, products & partnerships in AI
IMPLEMENTATION
Announcements, strategies & case studies
AI MODELS
Deployment, research, training & infrastructure
AI INFRASTRUCTURE
Buildout, financing, policy, energy & hardware
IN OTHER NEWS
Compelling stories beyond the usual categories
AI events
The best way to get AI literate? Go to some awesome events.
As a valued member of CO/AI, you're invited to join us at HumanX 2025, the premier AI conference shaping the future of technology.
Why Attend HumanX?
Connect with Industry Leaders: Network with C-suite executives, innovators, and policymakers.
Learn from AI Experts: Gain insights from top-tier speakers like Kevin Weil, Clara Shih, and Sridhar Ramaswamy.
Discover Real-World Solutions: Explore actionable strategies and solutions to drive business growth.
Don't miss this opportunity to be part of the AI revolution.
What’s happening in AI right now
Tech giants scramble for power, chips, and infrastructure as enterprise implementation scales

JPMorgan Chase's deployment of AI tools to over 200,000 employees marks a watershed moment - not just for the financial giant, but for enterprise AI adoption writ large. The bank's massive AI implementation on AWS highlights both the immense potential and daunting infrastructure challenges as organizations push to scale AI. With nearly 1,000 applications now running on AWS, including core banking services, JPMorgan demonstrates how AI implementation is moving from experimentation to widespread deployment.
Power hungry
The computational requirements are staggering. Global AI processing is projected to consume 40 gigawatts of power by 2026 - roughly the demand of eight cities the size of New York. This isn't just about training large language models anymore; inference workloads from deployed AI systems are becoming a major driver of computing demand.
Meta has just announced a $10 billion data center in Louisiana while simultaneously pursuing nuclear power agreements to fuel its AI ambitions. The company aims to add 1-4 gigawatts of new nuclear capacity in the US by the early 2030s. They're not alone - Amazon, Microsoft, and Google are all scrambling to secure sustainable power sources, recognizing that traditional grid power may not suffice for the compute required.
Infrastructure innovation
The push for AI-ready infrastructure is driving fascinating technical innovations. Schneider Electric and Nvidia have partnered to create data center designs supporting liquid-cooled AI clusters up to 132 kW per rack. The design introduces advancements in power management and cooling efficiency that could set new standards for AI infrastructure.
The ripple effects extend to telecommunications. Verizon achieved 1.6 terabit-per-second transmission speeds on its metro fiber network, covering 118 kilometers with 10 network hops. This breakthrough reduces space and power consumption by 50% per bit while delivering the bandwidth needed for real-time AI applications.
The chip race
Jensen Huang, Nvidia's CEO, frames AI infrastructure as essential for national technological independence, conducting high-level meetings with government officials globally. Yet while Nvidia remains dominant, viable alternatives are emerging. AMD's MI300 chip projects $5 billion in first-year sales, while Amazon advances its Trainium chip. In addition, the development of new AI chips in Austin highlights Texas’s growing importance as a semiconductor innovation hub.
Strategic implications
This infrastructure build-out carries profound strategic implications. Companies must now think beyond just AI models and applications to consider the full stack of infrastructure requirements:
Power strategy: Securing reliable, sustainable power sources becomes a critical competitive advantage
Data center architecture: Next-generation cooling and power management capabilities determine AI deployment capacity
Network infrastructure: The ability to move massive amounts of data efficiently enables real-time AI applications
Chip supply: Access to high-performance AI chips remains a potential bottleneck
We publish daily research, playbooks, and deep industry data breakdowns. Learn More Here
In Bagel’s most recent article, they reveal how large language models are evolving from prediction tools to cognitive agents. From breaking down math problems to understanding everyday logic, this is AI's next frontier.
Key takeaways:
Training-time methods: These approaches, including fine-tuning and curriculum learning, focus on enhancing AI during development by tailoring its abilities to solve complex tasks and building specialized skills for specific problem areas.
Inference-time methods: Techniques like chain of thought and self-consistency optimize AI’s reasoning and decision-making in real time, enabling more accurate and dynamic problem-solving without requiring retraining.
In other words, Bagel’s article explains how AI is evolving from simple prediction tools to smarter systems that can reason through problems and make better decisions in real time.
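To make the inference-time idea concrete, here is a minimal, model-agnostic sketch of self-consistency: sample several independent reasoning paths for the same question, then take a majority vote over their final answers. The sampled strings below are made-up stand-ins for model outputs; in practice each would come from an LLM sampled at nonzero temperature.

```python
from collections import Counter

def self_consistency(answers):
    """Pick the final answer that appears most often across
    independently sampled reasoning paths (majority vote)."""
    counts = Counter(answers)
    best_answer, _ = counts.most_common(1)[0]
    return best_answer

# Hypothetical final answers extracted from five sampled reasoning
# paths for the same math question; three paths converge on "42".
sampled = ["42", "41", "42", "42", "40"]
print(self_consistency(sampled))  # -> 42
```

The key point is that no retraining happens: accuracy improves purely by spending more compute at inference time and aggregating the results.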
📬 Join 30,000+ readers exploring the cutting edge of AI research. Read Bagel’s most recent article to understand how reasoning will define the next leap in AI’s evolution.
AI generated art
A look at the art and projects being created with AI.
