Intel RealSense D435i

Ask for price
EUR
VAT not included

Compact RGB-D camera with an integrated IMU, ideal for 3D mapping, SLAM, obstacle detection, and visual-inertial odometry in mobile robotics.



The Intel RealSense D435i is a depth camera equipped with an integrated 6-axis IMU, providing both depth perception and motion tracking in a compact form factor. It captures synchronized depth, RGB, and inertial data, enabling advanced robotics applications such as SLAM, obstacle avoidance, 3D reconstruction, and visual-inertial odometry (VIO).

The D435i features a stereo depth system with a global shutter, making it effective for capturing fast-moving scenes without motion blur. The RGB sensor delivers high-resolution color imaging, while the IMU adds motion data for enhanced spatial awareness and sensor fusion.

For mobile robotics, the D435i enables precise navigation and environmental understanding in dynamic or unstructured environments, essential for autonomous operation.

Key Specifications:

  • Depth Technology: Stereo-based depth sensing
  • Depth Field of View: 85.2° × 58°
  • Depth Range: up to ~10 meters (ideal range 0.3–3 meters)
  • Depth Resolution: up to 1280 × 720 at 30 FPS
  • RGB Camera: 1920 × 1080 at 30 FPS
  • Shutter: global shutter for depth; rolling shutter for RGB
  • IMU: 6-axis (gyroscope + accelerometer)
  • Connectivity: USB 3.1 Gen 1 Type-C
  • Dimensions: 90 mm × 25 mm × 25 mm
  • Weight: ~72 g
  • SDK: Compatible with Intel RealSense SDK 2.0, ROS/ROS2 support
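The field-of-view and resolution figures above translate directly into how much of the scene the camera sees and how finely it samples it. The sketch below is a simple sanity-check calculation (not part of the RealSense SDK) that estimates scene coverage and per-pixel footprint at a given distance, assuming the 85.2° × 58° depth FOV and 1280 × 720 depth resolution listed here:

```python
import math

def fov_coverage(distance_m, hfov_deg=85.2, vfov_deg=58.0):
    """Scene extent visible at a given distance.

    Defaults use the D435i depth FOV from the spec list (85.2° × 58°).
    Returns (width_m, height_m) of the plane seen at distance_m.
    """
    width = 2 * distance_m * math.tan(math.radians(hfov_deg) / 2)
    height = 2 * distance_m * math.tan(math.radians(vfov_deg) / 2)
    return width, height

def depth_pixel_footprint(distance_m, h_pixels=1280, v_pixels=720):
    """Approximate size (in mm) of one depth pixel projected at distance_m,
    assuming the maximum 1280 × 720 depth resolution."""
    w, h = fov_coverage(distance_m)
    return (w / h_pixels * 1000, h / v_pixels * 1000)

if __name__ == "__main__":
    w, h = fov_coverage(1.0)
    print(f"At 1 m the depth camera sees roughly {w:.2f} m x {h:.2f} m")
    px_w, px_h = depth_pixel_footprint(1.0)
    print(f"One depth pixel covers about {px_w:.1f} mm x {px_h:.1f} mm")
```

Numbers like these help when planning obstacle-avoidance coverage: at 1 m the camera sees a region just under 2 m wide, so lateral blind spots close quickly as obstacles approach.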

Typical uses in robotics:

  • Real-time SLAM
  • Visual-inertial odometry
  • Obstacle avoidance and navigation
  • 3D scene reconstruction

Check the integration manual here:

https://docs.fictionlab.pl/integrations/cameras/intel-realsense

CAD model

Resources

Estimated shipping time:
- Leo Rovers: 5 working days
- Add-ons: please ask

Available payment options:
- International bank transfer
- PayPal and debit/credit card
- Net30 for Universities


Need help? Contact us - contact@fictionlab.pl