
🤖 Robots


In OmniGibson, Robots define agents that can interact with other objects in a given environment. Each robot can interact by deploying joint commands via its set of Controllers, and can perceive its surroundings via its set of Sensors.

OmniGibson supports both navigation and manipulation robots, and allows for modular specification of individual controllers for controlling the different components of a given robot. For example, the Fetch robot is a mobile manipulator composed of a mobile (two-wheeled) base, two head joints, a trunk, seven arm joints, and two gripper finger joints. Fetch owns four controllers: one each for the base, the head, the trunk + arm, and the gripper. There are multiple options for each controller depending on the desired action space. For more information, check out our robot examples.
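As an illustration of this modular decomposition, the sketch below partitions a single flat action vector across per-component controllers. The controller names and command dimensions here are illustrative placeholders, not Fetch's actual action space (which depends on which controllers are configured):

```python
# Illustrative sketch: splitting one flat action vector across controllers.
# The dims below are placeholders; a real robot's dims depend on its
# configured controllers (e.g. IK vs. joint control for the arm).
controller_dims = {
    "base": 2,       # e.g. differential drive: (linear, angular) velocity
    "camera": 2,     # e.g. head pan/tilt commands
    "arm_0": 7,      # e.g. trunk + arm joint commands
    "gripper_0": 1,  # e.g. binary open/close command
}

def split_action(action, dims):
    """Partition a flat action sequence into per-controller commands, in order."""
    assert len(action) == sum(dims.values()), "action length must match total dims"
    commands, idx = {}, 0
    for name, dim in dims.items():
        commands[name] = action[idx:idx + dim]
        idx += dim
    return commands

commands = split_action([0.0] * 12, controller_dims)
```

Each controller then interprets only its own slice of the action, which is what makes swapping one controller type for another (with a different command dimension) possible without touching the rest of the robot.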

It is important to note that robots are full-fledged StatefulObjects, and thus leverage the same APIs as normal scene objects and can be treated as such. Robots can be thought of as StatefulObjects that additionally own controllers (robot.controllers) and sensors (robot.sensors).



Robots can be added to a given Environment instance by specifying them in the config that is passed to the environment constructor via the robots key. This is expected to be a list of dictionaries, where each dictionary specifies the desired configuration for a single robot to be created. For each dict, the type key is required and specifies the desired robot class, and global position and orientation (in (x,y,z,w) quaternion form) can also be specified. Additional keys can be specified and will be passed directly to the specific robot class constructor. An example of a robot configuration is shown below in .yaml form:

```yaml
robots:
  - type: Fetch
    position: [0, 0, 0]
    orientation: [0, 0, 0, 1]
    obs_modalities: [scan, rgb, depth]
    scale: 1.0
    self_collision: false
    action_normalize: true
    action_type: continuous
    grasping_mode: physical
    rigid_trunk: false
    default_trunk_offset: 0.365
    default_arm_pose: diagonal30
    reset_joint_pos: tuck
    sensor_config:
      VisionSensor:
        sensor_kwargs:
          image_height: 128
          image_width: 128
      ScanSensor:
        sensor_kwargs:
          min_range: 0.05
          max_range: 10.0
    controller_config:
      base:
        name: DifferentialDriveController
      arm_0:
        name: InverseKinematicsController
        kv: 2.0
      gripper_0:
        name: MultiFingerGripperController
        mode: binary
      camera:
        name: JointController
        use_delta_commands: False
```
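The same configuration can also be assembled programmatically as a nested Python dict and handed to the environment constructor. The sketch below builds a trimmed-down version of the config above; the construction lines are commented out since they require a working OmniGibson install:

```python
# Build the robot config as a plain dict (mirrors the YAML above).
# Only "type" is required; other keys are forwarded to the robot constructor.
cfg = {
    "robots": [
        {
            "type": "Fetch",
            "position": [0, 0, 0],
            "orientation": [0, 0, 0, 1],  # (x, y, z, w) quaternion
            "obs_modalities": ["scan", "rgb", "depth"],
            "action_type": "continuous",
            "grasping_mode": "physical",
        }
    ],
}

# With OmniGibson installed, the environment would then be created as:
# import omnigibson as og
# env = og.Environment(configs=cfg)
```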


Usually, actions are passed to robots and observations retrieved via the environment's step call, obs, reward, done, info = env.step(action). However, actions can also be deployed directly on the robot, and observations retrieved from it, using the following APIs:

  • Applying actions: robot.apply_action(action), where action is a 1D numpy array. For more information, please see the Controller section!

  • Retrieving observations: obs, info = robot.get_obs(), where obs is a dict mapping observation name to observation data, and info is a dict of relevant metadata about the observations. For more information, please see the Sensor section!
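The sketch below shows both control paths; the simulator-dependent lines are commented out since they need a live OmniGibson environment, and the observation key and shape at the bottom are hypothetical placeholders chosen for illustration, not guaranteed OmniGibson names:

```python
import numpy as np

# With a live OmniGibson environment, the standard loop would be:
# action = env.action_space.sample()              # gym-style sampling (assumed)
# obs, reward, done, info = env.step(action)
#
# Or, deploying actions on the robot directly:
# robot = env.robots[0]
# robot.apply_action(np.zeros(robot.action_dim))  # action is a 1D numpy array
# obs, info = robot.get_obs()

# get_obs() returns (obs, info), where obs maps observation names to data.
# The key and shape below are made-up placeholders for illustration:
obs = {"robot0:eyes:Camera:rgb": np.zeros((128, 128, 3), dtype=np.uint8)}
info = {}

rgb = obs["robot0:eyes:Camera:rgb"]
```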

Controllers and sensors can be accessed directly via the controllers and sensors properties, respectively. And, like all objects in OmniGibson, common information such as joint data and object states can also be directly accessed from the robot class.


OmniGibson currently supports 9 robots, consisting of 4 mobile robots, 2 manipulation robots, 2 mobile manipulation robots, and 1 anthropomorphic "robot" (a bimanual agent proxy used for VR teleoperation). Below, we provide a brief overview of each model:

Mobile Robots

These are navigation-only robots (an instance of LocomotionRobot) that solely consist of a base that can move.


Turtlebot

The two-wheeled Turtlebot 2 model with the Kobuki base.

  • Controllers: Base
  • Sensors: Camera, LIDAR

LoCoBot

The two-wheeled, open-source LoCoBot model. Note that in our model the arm is disabled and fixed to the base.

  • Controllers: Base
  • Sensors: Camera, LIDAR

Husky

The four-wheeled Husky UGV model from Clearpath Robotics.

  • Controllers: Base
  • Sensors: Camera, LIDAR

Freight

The two-wheeled Freight model, which serves as the base for the Fetch robot.

  • Controllers: Base
  • Sensors: Camera, LIDAR

Manipulation Robots

These are manipulation-only robots (an instance of ManipulationRobot) that cannot move and solely consist of an actuated arm with a gripper attached to its end effector.


Franka

The popular 7-DOF Franka Research 3 model equipped with a parallel jaw gripper. Note that OmniGibson also includes two alternative versions of Franka: FrankaAllegro (equipped with an Allegro hand) and FrankaLeap (equipped with a Leap hand).

  • Controllers: Arm, Gripper
  • Sensors: Wrist Camera

ViperX 300

The 6-DOF ViperX 300 model from Trossen Robotics, equipped with a parallel jaw gripper.

  • Controllers: Arm, Gripper
  • Sensors: Wrist Camera

Mobile Manipulation Robots

These are robots that can both navigate and manipulate (and inherit from both LocomotionRobot and ManipulationRobot), and are equipped with both a base that can move as well as one or more gripper-equipped arms that can actuate.


Fetch

The Fetch model, composed of a two-wheeled base, a linear trunk, a 2-DOF head, a 7-DOF arm, and a 2-DOF parallel jaw gripper.

  • Controllers: Base, Head, Arm, Gripper
  • Sensors: Head Camera, LIDAR

Tiago

The bimanual Tiago model from PAL Robotics, composed of a holonomic base (which we model as a 3-DOF (x, y, rz) set of joints), a linear trunk, a 2-DOF head, two 7-DOF arms, and two 2-DOF parallel jaw grippers.

  • Controllers: Base, Head, Left Arm, Right Arm, Left Gripper, Right Gripper
  • Sensors: Head Camera, Rear LIDAR, Front LIDAR

Additional Robots


BehaviorRobot

A hand-designed model intended to be used exclusively for VR teleoperation.

  • Controllers: Base, Head, Left Arm, Right Arm, Left Gripper, Right Gripper
  • Sensors: Head Camera