# DepthImage
Pool-backed RAII depth image with zero-copy shared memory transport. DepthImage allocates from a global tensor pool and supports two depth formats: F32 (meters) and U16 (millimeters). All depth access methods work in meters regardless of the underlying format -- U16 values are automatically converted.
## Creating a DepthImage
```rust
use horus::prelude::*;

// F32 depth image -- values stored in meters (default choice)
let mut depth = DepthImage::new(640, 480, TensorDtype::F32)?;

// U16 depth image -- values stored in millimeters (common for Intel RealSense, Azure Kinect)
let mut depth_mm = DepthImage::new(640, 480, TensorDtype::U16)?;
```
## Reading and Writing Depth
```rust
use horus::prelude::*;

let mut depth = DepthImage::new(640, 480, TensorDtype::F32)?;

// Set depth at pixel (x, y) in meters -- returns &mut Self for chaining
depth.set_depth(320, 240, 1.5)?;
depth.set_depth(100, 100, 2.3)?;

// Get depth at pixel -- always returns meters (auto-converts U16 mm to f32 m)
if let Some(d) = depth.get_depth(320, 240) {
    println!("Depth at center: {:.3}m", d);
}

// For U16 images, get raw millimeter value
let depth_u16 = DepthImage::new(640, 480, TensorDtype::U16)?;
if let Some(mm) = depth_u16.get_depth_u16(320, 240) {
    println!("Raw depth: {}mm", mm);
}

// Compute min, max, mean over valid (non-zero) depth values
if let Some((min, max, mean)) = depth.depth_statistics() {
    println!("Range: {:.2}-{:.2}m, mean: {:.2}m", min, max, mean);
}
```
## Metadata and Transport
```rust
use horus::prelude::*;

let mut depth = DepthImage::new(640, 480, TensorDtype::F32)?;

// Set metadata (method chaining)
depth
    .set_frame_id("depth_camera")
    .set_timestamp_ns(1234567890);

// Check format
println!("{}x{}", depth.width(), depth.height());
println!("Meters: {}, Scale: {}", depth.is_meters(), depth.depth_scale());

// Zero-copy raw data access
let raw: &[u8] = depth.data();
println!("{} bytes", depth.nbytes());
```
## RGB-D Pipeline
```rust
use horus::prelude::*;

struct RgbdNode {
    rgb_sub: Topic<Image>,
    depth_sub: Topic<DepthImage>,
    cloud_pub: Topic<PointCloud>,
    fx: f32,
    fy: f32,
    cx: f32,
    cy: f32,
}

impl Node for RgbdNode {
    fn name(&self) -> &str { "RgbdNode" }

    fn tick(&mut self) {
        let depth = match self.depth_sub.recv() {
            Some(d) => d,
            None => return,
        };
        let w = depth.width();
        let h = depth.height();
        if let Ok(mut cloud) = PointCloud::new(w * h, 3, TensorDtype::F32) {
            // Depth-to-pointcloud using camera intrinsics
            for y in 0..h {
                for x in 0..w {
                    if let Some(z) = depth.get_depth(x, y) {
                        if z > 0.0 {
                            let px = (x as f32 - self.cx) * z / self.fx;
                            let py = (y as f32 - self.cy) * z / self.fy;
                            // Write (px, py, z) into cloud data...
                        }
                    }
                }
            }
            cloud.set_frame_id(depth.frame_id());
            self.cloud_pub.send(cloud);
        }
    }
}
```
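The back-projection inside `tick()` is the standard pinhole camera model: a pixel `(u, v)` with depth `z` maps to `X = (u - cx) * z / fx`, `Y = (v - cy) * z / fy`, `Z = z`. The same math can be checked in a self-contained NumPy sketch; the function name and the intrinsic values below are made up for illustration and are not part of the horus API:

```python
import numpy as np

def backproject(depth_m: np.ndarray, fx: float, fy: float,
                cx: float, cy: float) -> np.ndarray:
    """Back-project a depth map (meters) to an (N, 3) point array.

    Zero-depth pixels are dropped, matching the `z > 0.0` check above.
    """
    h, w = depth_m.shape
    u, v = np.meshgrid(np.arange(w), np.arange(h))
    valid = depth_m > 0.0
    x = (u - cx) * depth_m / fx
    y = (v - cy) * depth_m / fy
    return np.stack([x[valid], y[valid], depth_m[valid]], axis=-1)

# One valid pixel exactly at the (hypothetical) principal point projects
# onto the optical axis: X = Y = 0, Z = the measured depth.
depth = np.zeros((480, 640), dtype=np.float32)
depth[240, 320] = 2.0
pts = backproject(depth, fx=525.0, fy=525.0, cx=320.0, cy=240.0)
# pts == [[0.0, 0.0, 2.0]]
```

Vectorizing with `meshgrid` computes the whole cloud at once; the nested Rust loops above do the same work one pixel at a time.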
## Rust API Reference
DepthImage is an RAII type: its fields are private and accessed only through methods, and its pool allocation is released on drop. Mutation methods return `&mut Self` for chaining.
| Method | Returns | Description |
|---|---|---|
| `new(width, height, dtype)` | `HorusResult<Self>` | Create depth image (F32 or U16) from global pool |
| `get_depth(x, y)` | `Option<f32>` | Depth in meters at pixel (auto-converts U16) |
| `set_depth(x, y, value)` | `HorusResult<&mut Self>` | Set depth in meters at pixel |
| `get_depth_u16(x, y)` | `Option<u16>` | Raw U16 millimeters (U16 dtype only) |
| `depth_statistics()` | `Option<(f32, f32, f32)>` | (min, max, mean) in meters over valid depths |
| `width()` | `u32` | Image width in pixels |
| `height()` | `u32` | Image height in pixels |
| `is_meters()` | `bool` | True if dtype is F32 |
| `is_millimeters()` | `bool` | True if dtype is U16 |
| `depth_scale()` | `f32` | Depth unit scale factor |
| `data()` | `&[u8]` | Zero-copy access to raw depth data |
| `data_mut()` | `&mut [u8]` | Mutable access to raw depth data |
| `copy_from(src)` | `&mut Self` | Copy raw bytes into the image |
| `dtype()` | `TensorDtype` | Data type (F32 or U16) |
| `nbytes()` | `u64` | Total size in bytes |
| `is_cpu()` | `bool` | Whether data is on CPU |
| `set_frame_id(id)` | `&mut Self` | Set sensor frame identifier |
| `frame_id()` | `&str` | Get sensor frame identifier |
| `set_timestamp_ns(ts)` | `&mut Self` | Set timestamp in nanoseconds |
| `timestamp_ns()` | `u64` | Get timestamp in nanoseconds |
Depth format summary:
| Dtype | Unit | `is_meters()` | `depth_scale()` | Typical sensor |
|---|---|---|---|---|
| `TensorDtype::F32` | meters | `true` | 1.0 | Processed / simulated |
| `TensorDtype::U16` | millimeters | `false` | 0.001 | RealSense, Kinect, stereo |
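In both formats, depth in meters is `raw_value * depth_scale` (1.0 for F32, 0.001 for U16), which is the conversion `get_depth()` applies for U16 images. A minimal sketch of that rule; the constant and function names are illustrative, not part of the library:

```python
# millimeters -> meters, matching the U16 row of the table above
U16_DEPTH_SCALE = 0.001

def u16_to_meters(raw_mm: int) -> float:
    """Convert a raw U16 millimeter sample to meters."""
    return raw_mm * U16_DEPTH_SCALE

# A RealSense-style reading of 1500 mm corresponds to 1.5 m.
print(u16_to_meters(1500))
```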
## Python API (PyDepthImage)
```python
import horus
import numpy as np

# Create from dtype string
depth = horus.DepthImage(480, 640, dtype="float32")  # meters
depth = horus.DepthImage(480, 640, dtype="uint16")   # millimeters
depth = horus.DepthImage(480, 640, dtype="mm")       # alias for uint16

# Create from numpy array
arr = np.zeros((480, 640), dtype=np.float32)
depth = horus.DepthImage.from_numpy(arr)

# Create from torch tensor
import torch
t = torch.zeros(480, 640, dtype=torch.float32)
depth = horus.DepthImage.from_torch(t)

# Read/write depth (meters)
depth.set_depth(320, 240, 1.5)
d = depth.get_depth(320, 240)  # -> float

# Statistics
stats = depth.depth_statistics()  # -> (min, max, mean) or None

# Convert to numpy/torch/jax
arr = depth.to_numpy()
t = depth.to_torch()
j = depth.to_jax()

# Properties
depth.height       # 480
depth.width        # 640
depth.dtype        # "float32"
depth.nbytes       # total bytes
depth.depth_scale  # 1.0 for F32, 0.001 for U16

# Metadata
depth.set_frame_id("depth_cam")
depth.set_timestamp_ns(123456)
depth.is_meters()       # True
depth.is_millimeters()  # False
```
Valid dtype strings:
| String | Format |
|---|---|
| `"float32"`, `"meters"` | F32, meters |
| `"uint16"`, `"millimeters"`, `"mm"` | U16, millimeters |
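The alias table above amounts to a small lookup from user-supplied strings to the two canonical dtypes. The mapping below is an illustrative sketch of that resolution, not the constructor's actual parsing code:

```python
# Illustrative alias map mirroring the dtype-string table above
DTYPE_ALIASES = {
    "float32": "float32",
    "meters": "float32",
    "uint16": "uint16",
    "millimeters": "uint16",
    "mm": "uint16",
}

def normalize_dtype(s: str) -> str:
    """Resolve a dtype string to its canonical form, or raise ValueError."""
    try:
        return DTYPE_ALIASES[s]
    except KeyError:
        raise ValueError(f"unsupported depth dtype: {s!r}")

# normalize_dtype("mm") -> "uint16"
```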
## See Also
- Perception Messages -- PointCloud, Landmark, TrackedObject
- Vision Messages -- Image, CameraInfo, Detection
- TensorPool API -- Underlying pool management
- Python Memory Types -- Python DepthImage API with NumPy/PyTorch/JAX interop