Gazebo is an open-source robotics simulator used to model robots, environments, sensors, and control systems before hardware is built or deployed. It combines a 3D simulator, SDF-based world description, plugin interfaces for physics and rendering, and tight ROS 2 integration through the ros_gz packages and bridge tooling.
Gazebo lets teams run physics-based simulations for mobile robots, manipulators, and autonomous systems using configurable worlds and reusable models. It supports common robotics workflows such as sensor simulation, controller validation, collision testing, and continuous integration checks for robot software.
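As an illustration, a minimal SDF world with one box model might look like the sketch below (the world name, model name, and dimensions are placeholders, not taken from this page):

```xml
<?xml version="1.0"?>
<sdf version="1.8">
  <world name="demo_world">
    <!-- Per-world physics tuning: step size and real-time factor -->
    <physics name="default_physics" type="ignored">
      <max_step_size>0.001</max_step_size>
      <real_time_factor>1.0</real_time_factor>
    </physics>
    <!-- A simple box with matching collision and visual geometry -->
    <model name="box">
      <pose>0 0 0.5 0 0 0</pose>
      <link name="link">
        <collision name="collision">
          <geometry><box><size>1 1 1</size></box></geometry>
        </collision>
        <visual name="visual">
          <geometry><box><size>1 1 1</size></box></geometry>
        </visual>
      </link>
    </model>
  </world>
</sdf>
```

Reusable models are typically factored into their own SDF files and pulled into worlds with `<include>` elements rather than written inline.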
Engineering teams use Gazebo to iterate on robot designs and behaviors before touching physical hardware. A common pattern is to build worlds in SDF, run the simulator headless or with the GUI, bridge data to ROS 2, and visualize the resulting state in tools such as RViz while testing planners, navigation stacks, or manipulation logic.
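That workflow can be sketched with the Gazebo and ros_gz command-line tools; the world file and topic names below are placeholders:

```shell
# Run the world headless: -s starts the server only (drop it for the GUI),
# -r starts the simulation running rather than paused.
gz sim -s -r demo_world.sdf

# Bridge a topic between Gazebo and ROS 2 using the
# topic@ROS_type@Gazebo_type convention, then drive or inspect it
# with standard ROS 2 tooling or RViz.
ros2 run ros_gz_bridge parameter_bridge \
  /model/vehicle/cmd_vel@geometry_msgs/msg/Twist@gz.msgs.Twist
```

The `@` separator makes the bridge bidirectional; one-directional bridges are also supported.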
Gazebo is built as a modular simulator with separate libraries for simulation, transport, physics, rendering, and sensors. The platform exposes plugin interfaces for physics engines, rendering backends, GUI extensions, and custom simulation systems, which makes it suitable for research teams and product teams that need to customize environments or runtime behavior.
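Those plugin interfaces surface directly in SDF: systems are attached to a world via `<plugin>` elements. A sketch of the commonly loaded stock systems (library filenames follow recent release naming and may vary by version):

```xml
<world name="demo_world">
  <!-- Core physics stepping -->
  <plugin filename="gz-sim-physics-system"
          name="gz::sim::systems::Physics"/>
  <!-- Publishes scene state so the GUI and tools can visualize it -->
  <plugin filename="gz-sim-scene-broadcaster-system"
          name="gz::sim::systems::SceneBroadcaster"/>
  <!-- Runs sensor updates (cameras, lidar, IMU, ...) -->
  <plugin filename="gz-sim-sensors-system"
          name="gz::sim::systems::Sensors"/>
</world>
```

Custom systems are written against the same interface and loaded the same way, which is how teams inject project-specific runtime behavior.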
Gazebo runs locally and is typically installed through Ubuntu packages, ROS vendor packages, or source builds on Linux and macOS. It splits into separate server and GUI processes, supports headless execution for automation and CI, and ships as versioned releases such as Fortress, Harmonic, Ionic, and Jetty, each with its own support window.
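For CI-style checks, a bounded headless run is a common pattern; the world file here is a placeholder:

```shell
# Server only (-s), start immediately (-r), and exit after a fixed
# number of simulation steps so the job terminates deterministically.
gz sim -s -r --iterations 2000 test_world.sdf
```

A test harness can then assert on logged topics or final robot state after the run completes.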
Gazebo can publish ROS 2 data through ros_gz_bridge so RViz can visualize robot models, transforms, and trajectories from a running simulation.
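Beyond one-off command-line bridges, ros_gz_bridge can load a YAML file describing many bridged topics at once; a hedged sketch, with hypothetical topic names:

```yaml
# Bridge configuration sketch; field names follow ros_gz_bridge's
# YAML conventions, topic names are placeholders.
- ros_topic_name: "/model/my_robot/odometry"
  gz_topic_name: "/model/my_robot/odometry"
  ros_type_name: "nav_msgs/msg/Odometry"
  gz_type_name: "gz.msgs.Odometry"
  direction: GZ_TO_ROS
- ros_topic_name: "/model/my_robot/cmd_vel"
  gz_topic_name: "/model/my_robot/cmd_vel"
  ros_type_name: "geometry_msgs/msg/Twist"
  gz_type_name: "gz.msgs.Twist"
  direction: ROS_TO_GZ
```

With transforms and odometry bridged, RViz can display the robot model and trajectories directly from the running simulation.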
Gazebo is commonly used alongside MoveIt to simulate manipulators, sensors, and collision scenes before deploying motion-planning pipelines to real robots.