Published on February 10, 2026

Robot Digital Twins Now Run on Standard PCs with AMD Graphics Cards

AMD has demonstrated technology for simulating robotic operations locally on a PC using its own GPUs – without relying on cloud servers or expensive computing farms.

Event Source: AMD

Benefits of Using Digital Twins in Robotics Development

When a Virtual Robot is Cheaper Than the Real Deal

Imagine this: you're developing a warehouse robot. Every glitch in the movement algorithm is a risk of crashing into racks, damaging goods, or bringing the entire system to a halt. Testing everything on physical hardware is slow and expensive. That's why the industry has long relied on digital twins – virtual copies of robots that behave exactly like the real thing but exist only within a simulator.

The catch is that such simulations usually require beefy servers or cloud resources. AMD decided to show that this can be done locally – right on a standard computer equipped with their graphics card. The demonstration, published on the ROCm blog, runs Genesis – an open-source robotics simulator and a popular alternative to NVIDIA's well-known Isaac Sim engine – on AMD hardware.

What is a Digital Twin and Why Do You Need One?

A digital twin isn't just a 3D model of a robot. It's a full-fledged virtual environment where the robot interacts with objects, receives sensor data, processes it via neural networks, and makes decisions. Everything happens just like in reality, but without the physical consequences if things go south.

This approach allows you to:

  • test robot control algorithms before any metal is even cast;
  • check how the robot behaves in edge cases – for instance, if an object falls or a sensor sends faulty data;
  • train neural networks on synthetic data generated in the simulator instead of collecting thousands of real-world examples.

Simply put, it's a test bench that's always at your fingertips and requires no physical floor space.
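To make this concrete, here's a minimal sketch of what such a test loop might look like. Everything below – the class names, the fault rate, the controller logic – is our own illustrative example, not code from any real simulator:

```python
import random

random.seed(0)  # deterministic run for the example

class FakeDistanceSensor:
    """Simulated sensor that occasionally returns faulty readings."""
    def __init__(self, true_distance: float, fault_rate: float = 0.1):
        self.true_distance = true_distance
        self.fault_rate = fault_rate

    def read(self) -> float:
        if random.random() < self.fault_rate:
            return -1.0  # injected fault: an impossible negative distance
        return self.true_distance + random.gauss(0.0, 0.01)  # sensor noise

def control_step(distance: float, stop_margin: float = 0.5) -> str:
    """Toy controller: ignore implausible readings, brake near obstacles."""
    if distance < 0:
        return "hold"    # faulty reading: keep the last safe state
    if distance < stop_margin:
        return "brake"
    return "advance"

# One simulated episode: the controller is exercised against noisy
# and faulty sensor data without risking any physical hardware.
sensor = FakeDistanceSensor(true_distance=2.0)
actions = [control_step(sensor.read()) for _ in range(100)]
```

The point of the sketch is the workflow, not the toy logic: the edge case (a bad sensor value) is injected on demand, something that is awkward and risky to stage with a physical robot.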

Advantages of Local Simulation for Robotics Developers

Why This Matters for Developers

Until recently, running such simulations on a local machine was a privilege reserved for owners of flagship NVIDIA GPUs. AMD has stepped up with an alternative: their solution works on Radeon series graphics cards and Ryzen AI processors with ROCm support – an open platform for GPU computing.

The key advantage here is locality. There's no need to connect to the cloud, pay for server rentals, or depend on your internet speed. Everything runs on your own computer, and your data stays under your control.

This is especially relevant for:

  • small teams and startups that don't have the budget for massive cloud infrastructure;
  • research labs working with confidential data;
  • developers who need fast iteration without the lag of syncing with a remote server.
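As a small illustration of the "runs on your own machine" point: ROCm builds of PyTorch expose the GPU through the familiar `torch.cuda` namespace (via HIP), so one rough way to probe for a local accelerator looks like this. The fallback logic is our own sketch, not an AMD-documented recipe:

```python
def detect_local_backend() -> str:
    """Best-effort probe for a local GPU compute backend.

    ROCm builds of PyTorch reuse the torch.cuda namespace (via HIP),
    so the same availability call covers both AMD and NVIDIA GPUs.
    """
    try:
        import torch
    except ImportError:
        return "cpu"  # no framework installed: fall back to CPU
    if torch.cuda.is_available():
        # torch.version.hip is set only in ROCm builds of PyTorch
        return "rocm" if getattr(torch.version, "hip", None) else "cuda"
    return "cpu"

print(detect_local_backend())
```

On a machine with a supported Radeon card and a ROCm build of PyTorch this would report `rocm`; everywhere else it degrades gracefully instead of demanding a cloud connection.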

Implementing AMD ROCm Genesis for Robot Simulations

How It Works in Practice

AMD demonstrated the system using a robotic arm as an example. In the simulator, the robot picks up objects, moves them, and adapts to environmental changes. All the heavy lifting – from visual data processing to neural network decision-making – is performed on a single PC.

This was made possible by optimizing the simulator for AMD's architecture. Developers used ROCm to accelerate machine learning operations and adapted the physics engine to run on AMD GPUs.

A crucial point: we're not talking about a mere demo prototype, but a full-scale tool. The simulator supports complex scenarios – multiple objects, dynamic lighting, and various sensor types. This means professionals can use it for real-world tasks, not just simple experiments.
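A scenario like the demonstrated pick-and-place task is typically scripted as a small state machine that drives the simulated arm from one phase to the next. The sketch below is a generic illustration with made-up state names and transition conditions – not the actual AMD demo code:

```python
# Hypothetical pick-and-place state machine; the states and the
# transition conditions in the comments are illustrative only.
TRANSITIONS = {
    "idle":     "approach",  # a new object appears in the scene
    "approach": "grasp",     # the gripper reaches the object pose
    "grasp":    "move",      # grip force confirms a stable hold
    "move":     "release",   # the arm arrives at the target location
    "release":  "idle",      # object placed, the cycle restarts
}

def run_cycle(start: str = "idle") -> list[str]:
    """Step through one full pick-and-place cycle in the simulator."""
    state, trace = start, [start]
    for _ in range(len(TRANSITIONS)):
        state = TRANSITIONS[state]
        trace.append(state)
    return trace

print(run_cycle())
# idle -> approach -> grasp -> move -> release -> idle
```

In a real simulation each transition would be gated by sensor feedback (pose estimates, grip force) rather than fired unconditionally, which is exactly what a digital twin lets you exercise with dynamic lighting, multiple objects, and different sensor types.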

Challenges of Running Robotics Simulations on AMD Hardware

Limitations and Open Questions

Of course, it's not all smooth sailing. First, simulators of this class grew up around NVIDIA's hardware and software stack, and porting them to AMD solutions requires extra effort. Not all features may work identically, and performance can vary depending on the specific GPU model.

Second, ROCm is still a relatively young platform compared to NVIDIA's CUDA. Tool and library support is growing fast, but not every popular framework runs on it as efficiently just yet.

Finally, there's the question of scalability. One robot in a simulator is one thing, but how will the system hold up if you need to simulate dozens or hundreds of agents simultaneously? For such scenarios, a local solution might fall short, and cloud computing will remain relevant.
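A back-of-the-envelope check makes the scaling concern concrete. The numbers below are our own illustrative placeholders, not AMD benchmarks:

```python
def fits_realtime(num_agents: int, ms_per_agent_step: float,
                  frame_budget_ms: float = 16.7) -> bool:
    """Crude cost model: the simulation stays real-time only while
    the summed per-agent step cost fits in one ~60 FPS frame."""
    return num_agents * ms_per_agent_step <= frame_budget_ms

# Illustrative numbers only: at 0.5 ms per agent step, one PC keeps
# up with a few dozen agents, but not with hundreds.
print(fits_realtime(10, 0.5))   # 5.0 ms per frame: real-time
print(fits_realtime(100, 0.5))  # 50.0 ms per frame: too slow
```

Even in this crude linear model the budget runs out quickly, which is why large multi-agent scenarios are likely to remain cloud territory for now.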

Impact of Accessible Digital Twin Technology on the Robotics Market

What This Means for the Industry

AMD's project shows that the market for robotics and AI tools is gradually becoming more open. For a long time, NVIDIA held a dominant position thanks to its CUDA architecture and the mature ecosystem surrounding it. Now, alternatives are emerging, allowing developers to choose hardware based on their needs and budget, rather than just platform compatibility.

For engineers, this means more flexibility. For the industry, it brings healthy competition, which drives technological progress and lowers costs.

Digital twins have already become the standard in automotive, logistics, and manufacturing. Now, this technology is becoming more accessible to those just starting their journey in robotics or those who lack the deep pockets of a major corporation. And that is perhaps the main achievement of AMD's approach – not just a technical update, but a significant step toward the democratization of professional tools.

Original Title: Digital Twins on AMD: Building Robotic Simulations Using Edge AI PCs – ROCm Blogs
Publication Date: Feb 10, 2026
AMD (www.amd.com) – an international company manufacturing processors and computing accelerators for AI workloads.


From Source to Analysis

How This Text Was Created

This material is not a direct retelling of the original publication. First, the news item itself was selected as an event important for understanding AI development. Then a processing framework was set: what needs clarification, what context to add, and where to place emphasis. This allowed us to turn a single announcement or update into a coherent and meaningful analysis.

Neural Networks Involved in the Process

We openly show which models were used at different stages of processing. Each performed its own role — analyzing the source, rewriting, fact-checking, and visual interpretation. This approach maintains transparency and clearly demonstrates how technologies participated in creating the material.

1. Claude Sonnet 4.5 (Anthropic) – Analyzing the Original Publication and Writing the Text: the neural network studies the original material and generates a coherent text.
2. Gemini 3 Pro (Google DeepMind) – Translation into English.
3. Gemini 3 Flash Preview (Google DeepMind) – Text Review and Editing: correction of errors, inaccuracies, and ambiguous phrasing.
4. DeepSeek-V3.2 (DeepSeek) – Preparing the Illustration Description: generating a textual prompt for the visual model.
5. FLUX.2 Pro (Black Forest Labs) – Creating the Illustration: generating an image based on the prepared prompt.
