Chinese version:
Back to Table of Contents:
To ensure codec and inference performance, make sure your onboard computer meets the following minimum requirements (if you choose remote inference, a Raspberry Pi/Jetson is enough :P):
| Components | Minimum Requirements | Used for | Reasoning |
|---|---|---|---|
| CPU | 4 cores | Video streaming for remote teleoperation and image compression during dataset creation | These operations use software codecs; a slow CPU can cause the encoding frame rate to fall behind the camera frame rate |
| RAM | 8 GB | Dataset creation cache | Camera images are cached in memory for a period of time before being written to disk |
| GPU | NVIDIA RTX 2060 | Inference of VLA models | AhaRobot uses NVIDIA GPUs for inference acceleration |
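As a quick sanity check, the requirements above can be probed from a terminal. This is a hypothetical helper snippet, not part of AhaRobot; it only reads standard Linux interfaces (`nproc`, `/proc/meminfo`, and `nvidia-smi` when the NVIDIA driver is installed):

```shell
#!/usr/bin/env bash
# Sanity-check the onboard computer against the minimum requirements table.

cpu_cores=$(nproc)
ram_gb=$(awk '/MemTotal/ {printf "%d", $2 / 1024 / 1024}' /proc/meminfo)

echo "CPU cores: ${cpu_cores} (minimum: 4)"
echo "RAM:       ${ram_gb} GB (minimum: 8)"

# nvidia-smi only exists when the NVIDIA driver is installed; its absence is
# fine if you plan to run inference remotely (Raspberry Pi/Jetson setup).
if command -v nvidia-smi >/dev/null 2>&1; then
    nvidia-smi --query-gpu=name --format=csv,noheader
else
    echo "No nvidia-smi found: install the NVIDIA driver, or use remote inference"
fi
```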
The main part of AhaRobot runs on ROS 2 Humble and uses Docker to isolate it from the host system. Make sure you have the following software components installed on your onboard computer:
| Components | Installation Instructions |
|---|---|
| Ubuntu 22.04 Desktop | https://ubuntu.com/download/desktop <br> Since 22.04, Ubuntu has built-in RDP-based remote desktop support. <br> ⚠️ If you are using an NVIDIA Blackwell (50-series) graphics card, gnome-remote-desktop may encounter compatibility issues. Use Ubuntu 25.10 or later, or use Sunshine/Moonlight (recommended). <br> Ref: https://www.reddit.com/r/Ubuntu/comments/1mu9mpi/remote_login_rdp_protocol_nvidia_5090/ |
| Sunshine (optional) | https://docs.lizardbyte.dev/projects/sunshine/latest/md_docs_2getting__started.html <br> After every reboot, start the service in a terminal with: `systemctl --user restart sunshine && DISPLAY=:0 xhost local: +` |
| Docker (recommended) | https://docs.docker.com/engine/install/ubuntu/ <br> For isolation from the host system |
| NVIDIA Container Toolkit (recommended) | https://docs.nvidia.com/datacenter/cloud-native/container-toolkit/latest/install-guide.html <br> For VLA inference |
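Once Docker and the NVIDIA Container Toolkit are installed, you can confirm that containers see the GPU before attempting VLA inference. This is a standard verification command, not an AhaRobot-specific step; the CUDA image tag is illustrative, so pick one that matches your installed driver:

```shell
# Run nvidia-smi inside a throwaway container; if the toolkit is set up
# correctly, this prints the same GPU listing as nvidia-smi on the host.
docker run --rm --gpus all nvidia/cuda:12.2.0-base-ubuntu22.04 nvidia-smi
```

If this fails with an error about the `nvidia` runtime, re-run the toolkit's `nvidia-ctk runtime configure` step from the install guide linked above and restart the Docker daemon.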
Continue reading the next section: