
Game Over, AMD? Nvidia’s AI Dominance & NVLink Openness Shock Computex 2025!

Nvidia’s Masterstroke.

Here is what’s new in the AI world.

AI news: AI Chip War Escalates

What’s new: Game Over, macOS?

Open AI: Open-Source Under Attack

OpenAI: Satya Nadella's Commute Hack

Hot Tea: Elon Musk Chooses Azure

Jensen Huang Unleashes Next-Gen AI Chips and Opens NVLink to Rivals!

At Computex 2025 in Taipei, Nvidia CEO Jensen Huang unveiled a sweeping vision for the company’s AI future, announcing groundbreaking hardware, open collaboration with former competitors, and a roadmap reinforcing its dominance in AI infrastructure.

Against a backdrop of geopolitical tensions and industry evolution, Huang positioned Nvidia as the linchpin of global AI advancement.

Next-Gen AI Hardware: GB300 Systems and Beyond

Nvidia confirmed its GB300 AI systems, successors to the Grace Blackwell lineup, will launch in Q3 2025. Designed to merge CPUs and GPUs into a unified architecture, these systems aim to accelerate both AI training and inference, targeting hyperscalers like AWS and Microsoft. The focus remains on optimizing generative AI workloads, ensuring faster processing and energy efficiency for large-scale deployments.

In a strategic pivot, Nvidia introduced NVLink Fusion, expanding its proprietary high-speed interconnect technology to third-party chipmakers. Previously exclusive to Nvidia GPUs, this update allows external CPUs and accelerators (e.g., from Qualcomm, Fujitsu) to integrate seamlessly with Nvidia’s ecosystem.

Partners like MediaTek and Marvell will leverage NVLink to co-develop custom AI chips, signaling a shift toward interoperability in an often-fragmented industry.

Collaborations and Accessibility

  • Global Chipmaker Partnerships: MediaTek, Marvell, and Alchip are crafting bespoke AI chips optimized for Nvidia’s frameworks, while Qualcomm and Fujitsu design processors compatible with Nvidia accelerators.

  • DGX Spark: A compact AI workstation targeting startups and researchers, offering enterprise-grade computing without data center dependencies. Already in production, it democratizes access to advanced AI tools.

Roadmap to 2028: Blackwell Ultra, Rubin, and Feynman

Nvidia outlined its processor pipeline, including:

  • Blackwell Ultra (2026): Enhanced performance for foundational AI models.

  • Rubin and Feynman (2027–2028): Focused on scalable AI applications across healthcare, finance, and manufacturing.

Strategic Context and Geopolitical Tensions

Huang reflected on Nvidia’s evolution from a gaming GPU leader to the “AI computing company,” now central to global tech infrastructure. The announcements coincided with U.S. threats of semiconductor tariffs under President Trump, highlighting Taiwan’s critical role in chip supply chains. With 1,400+ exhibitors, Computex underscored both innovation and the fragile geopolitics shaping tech alliances.

By pairing collaboration with former rivals with an ambitious hardware roadmap, Nvidia aims to solidify its role as the backbone of AI's next era, balancing competition with cooperation in a rapidly shifting landscape.

Microsoft’s Linux Love Affair Goes Open-Source

Microsoft has announced the open-source release of the Windows Subsystem for Linux (WSL), marking a milestone in its multiyear effort to empower developers. The code, now available on GitHub, enables users to build, modify, and contribute to WSL’s development, fulfilling a long-standing community request dating back to the project’s inception.

WSL Architecture and Components

The open-source release includes:

  • Command-line tools: wsl.exe, wslconfig.exe, and wslg.exe for interacting with WSL.

  • Core services: The WSL service (wslservice.exe) managing virtual machines, distros, and file shares.

  • Linux-side processes: the init system, the networking daemon (gns), and the port-forwarding relay (localhost).

  • File-sharing: Plan9 server implementation for Linux-Windows file integration.

Existing open-source components like WSLg (graphics support) and the WSL2-Linux-Kernel remain available. Notably, some drivers (e.g., Lxcore.sys for WSL1, P9rdr.sys for file redirection) remain proprietary.

Evolution of WSL: A Brief History

  • 2016: WSL debuted, running Linux binaries on Windows through a syscall-translation layer (WSL1).

  • 2019: Shifted to a VM-based approach with WSL2 for better Linux compatibility.

  • 2021: Decoupled from Windows, released as a standalone Store app (v0.47.1).

  • 2022: Achieved stability with v1.0.0, supporting Windows 10/11.

  • 2025: The current version (v2.5.7) adds features such as mirrored networking, DNS tunneling, and firewall support.
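
The networking features named in that last item are opt-in settings in a user's `.wslconfig` file. A minimal sketch, assuming WSL 2.0 or later on a recent Windows 11 build (the option names below match Microsoft's documented `[wsl2]` settings):

```ini
# %UserProfile%\.wslconfig — global settings for all WSL 2 distros
[wsl2]
networkingMode=mirrored  # mirror the host's network interfaces into Linux
dnsTunneling=true        # resolve DNS through Windows instead of a NAT'd resolver
firewall=true            # apply Windows firewall rules to WSL traffic
```

Changes take effect the next time the WSL VM starts, e.g. after a `wsl --shutdown`.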

Why Open Source Now?

Microsoft cites the growing WSL community as pivotal to its decision. Despite limited access to source code, developers have long contributed through feedback and testing. Open-sourcing aims to accelerate innovation, allowing direct code contributions and faster iteration.

Impact and Future

This move democratizes WSL’s development, inviting global collaboration to enhance performance, compatibility, and features. By transitioning users to the standalone WSL package (phasing out built-in versions), Microsoft underscores its commitment to agility and community-driven progress.

With the code now public, developers can shape WSL's future, ensuring it remains a cornerstone of cross-platform development. Explore the repository on GitHub and join the evolution of this transformative tool.

The Gen Matrix Advantage

In a world drowning in data but starved for clarity, the second edition of Gen Matrix cuts through the clutter. We don't just report trends; we analyze them through the lens of actionable intelligence.

Our platform equips you with:

  • Strategic foresight to anticipate market shifts

  • Competitive benchmarks to refine your approach

  • Network-building tools to forge game-changing partnerships

Malware Alert: PyPI Package Steals Code from Developers!

Security researchers at ReversingLabs have uncovered a deceptive Python package, “dbgpkg,” on the PyPI repository, disguised as a debugging tool but designed to deploy a covert backdoor. This discovery highlights escalating threats to open-source ecosystems, with potential ties to Phoenix Hyena, a pro-Ukrainian hacktivist group targeting Russian entities since the 2022 invasion.

Stealthy Attack Mechanism

The malicious package employs function wrappers (code that discreetly alters program behavior) to hijack widely used networking libraries such as requests and socket. By embedding itself within these modules, the malware activates only at runtime, evading static detection. Once triggered, it:

  • Fetches a public key from Pastebin.

  • Installs the Global Socket toolkit to bypass firewalls.

  • Transmits encrypted connection secrets to a private Pastebin.
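
The wrapper technique can be sketched benignly. The snippet below is illustrative only (not the actual dbgpkg code): it wraps a stand-in for a networking call so that every invocation is silently observed while callers see normal behavior, which is why such hooks tend to surface only at runtime.

```python
import functools

def log_calls(func, log):
    """Wrap func so each call is recorded before the original runs."""
    @functools.wraps(func)  # copies __name__/__doc__, hiding the interception
    def wrapper(*args, **kwargs):
        log.append((func.__name__, args))  # side channel: observe every call
        return func(*args, **kwargs)       # caller sees the normal result
    return wrapper

# Stand-in for a networking function such as requests.get
def fetch(url):
    return f"response from {url}"

calls = []
fetch = log_calls(fetch, calls)  # malware instead rebinds a module attribute

result = fetch("https://example.com")  # behaves normally, but was recorded
```

Because `functools.wraps` preserves the original function's metadata, a casual inspection of `fetch.__name__` reveals nothing unusual even though every call is being logged.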

Ties to Hacktivist Operations

While attribution remains cautious, ReversingLabs notes similarities to Phoenix Hyena's prior campaigns, including the 2024 Dr.Web breach. The group, also known as DumpForums, leaks stolen data on platforms like Telegram. Consistent payload patterns and the timing of earlier uploads bolster suspicions of its involvement.

Broader Campaign and Historical Precedents

This tactic mirrors earlier attacks via packages like “discordpydebug” and “requestsdev”, which mimicked legitimate tools, including those associated with Python contributor Cory Benfield. Notably, “discordpydebug” evaded detection for three years, accumulating over 11,000 downloads, underscoring persistent vulnerabilities in open-source repositories.

Implications for Developers

The sophistication of these attacks, which leverage advanced obfuscation and persistence tooling, signals a high skill level among the threat actors. Developers are urged to:

  • Scrutinize packages, even those appearing benign.

  • Verify sources and maintain vigilance against social engineering tactics.

As open-source platforms remain prime targets, this incident reinforces the need for enhanced security protocols and proactive threat monitoring to safeguard software supply chains.
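
One concrete form of that advice is pinning artifact digests so a tampered download is rejected. The sketch below uses a local byte string as a stand-in for a downloaded package file; in real projects the same idea is pip's hash-checking mode (hashes pinned in requirements.txt and installed with `--require-hashes`).

```python
import hashlib

def verify_artifact(data: bytes, expected_sha256: str) -> bool:
    """Return True only if data hashes to the pinned digest."""
    return hashlib.sha256(data).hexdigest() == expected_sha256

# Stand-in for a downloaded wheel; the digest would be pinned at release time.
artifact = b"pretend this is a wheel file"
pinned = hashlib.sha256(artifact).hexdigest()

ok = verify_artifact(artifact, pinned)               # untampered artifact
tampered = verify_artifact(artifact + b"!", pinned)  # any modification fails
```

A single flipped byte changes the SHA-256 digest entirely, so a backdoored re-upload of a pinned package cannot install silently.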

Satya Nadella Listens to 10-Hour Podcasts in Minutes, Using AI!

Microsoft CEO Satya Nadella has shared his unconventional approach to consuming podcasts, relying on AI tools like Microsoft Copilot to streamline his workflow. During his commute to Redmond headquarters, Nadella uploads podcast transcripts into the Copilot app on his iPhone, engaging with the AI assistant to dissect content through voice interactions.

“I’m an email typist,” he quipped, highlighting his reliance on Copilot for tasks such as summarizing Outlook emails and Teams messages. Nadella also utilizes 10+ custom AI agents via Copilot Studio, which he likens to “AI chiefs of staff” managing his daily priorities.

Strategic Embrace of Competitors


Nadella emphasized Microsoft’s collaborative ethos in the AI race, recounting how the company integrated DeepSeek R1, a rival AI model, into its Azure cloud platform upon release.

“Get it out,” he instructed teams, ensuring customers could access R1 alongside offerings from OpenAI and Microsoft itself. This move reflects Microsoft’s strategy to position Azure as an agnostic hub for diverse AI solutions.

AI’s Growing Role in Microsoft’s Ecosystem


The revelations come ahead of Microsoft Build 2025, the company’s flagship developer conference, where Nadella is expected to unveil new AI-driven features. His recent appearance at Meta’s LlamaCon underscored AI’s expanding footprint: 20–30% of Microsoft’s codebase is now AI-generated, accelerating development cycles. However, this shift has coincided with workforce reductions, including 6,000 layoffs in May 2025, the largest since 2023, disproportionately affecting programmers.


While AI boosts efficiency, its integration raises questions about workforce dynamics. Nadella’s blend of personal AI adoption and strategic partnerships illustrates Microsoft’s push to lead the AI era, balancing innovation with pragmatic collaboration, even with competitors—to shape the future of enterprise technology.

Why It Matters

  • For Leaders: Benchmark your AI strategy against the best.

  • For Founders: Find investors aligned with your vision.

  • For Builders: Get inspired by the individuals shaping AI’s future.

  • For Investors: Track high-potential opportunities before they go mainstream.

Musk’s Grok AI Joins Microsoft Cloud, While Suing OpenAI

In a surprising move, Elon Musk virtually appeared at Microsoft’s Build 2025 developer conference, announcing that his AI chatbot Grok, developed by his startup xAI, will now be hosted on Microsoft’s Azure cloud platform.

This partnership places Grok alongside rival AI models such as OpenAI’s ChatGPT, Meta’s Llama, and others from Mistral and DeepSeek, a notable collaboration given Musk’s ongoing lawsuit against Microsoft and OpenAI.


Despite co-founding OpenAI in 2015, Musk has been critical of its shift toward commercialization and its close ties with Microsoft, filing a lawsuit in 2024 alleging abandonment of its nonprofit mission.

However, during a pre-recorded dialogue with Microsoft CEO Satya Nadella, Musk emphasized transparency in AI development, stating, “We aspire to correct mistakes quickly… honesty is the best policy for AI safety.” The event sidestepped recent controversies, including Grok’s unintended focus on sensitive racial topics, which xAI blamed on an employee’s “unauthorized modification.”

OpenAI’s Presence and GitHub’s AI Agent


Earlier at Build 2025, OpenAI CEO Sam Altman joined Nadella to reaffirm Microsoft’s partnership, highlighting integrations across Bing, GitHub, and other services.

Meanwhile, Microsoft-owned GitHub unveiled an advanced AI coding agent designed to autonomously handle routine tasks in established codebases, freeing developers for complex work. The tool targets “low-to-medium complexity” functions in stable environments, expanding on GitHub Copilot’s capabilities.

The announcements followed Microsoft’s recent layoffs of 6,000 employees (3% of its workforce), primarily affecting technical roles. Despite this, the conference focused on innovation, underscoring Azure’s role as a neutral platform for diverse AI tools, even those from competitors, as the industry navigates ethical and competitive challenges.

Musk’s participation signals a pragmatic alliance in the AI race, balancing rivalry with collaboration to advance cloud-based AI ecosystems.

Your opinion matters!

We hope you loved reading this edition of our newsletter as much as we had fun writing it.

Share your experience and feedback with us below, 'cause we take your critique seriously.

What's your review?


Thank you for reading

-Shen & Towards AGI team