Sensor Modules
High-level sensor support for DSG-JIT. This package provides:
- Lightweight measurement containers: camera, LiDAR, IMU.
- Streaming abstractions (sensors.streams) for synthetic or file-based data.
- Conversion utilities (sensors.conversion) from raw samples → DSG-JIT measurements.
- Sensor fusion manager (sensors.fusion.SensorFusionManager) that:
  - Polls one or more sensor streams.
  - Converts them into measurements.
  - Optionally forwards them to WorldModel / SceneGraphWorld helpers.
sensors.base
Abstract base classes and shared interfaces for DSG-JIT sensor modules.
This module defines the foundational API that all sensor types in DSG-JIT (IMU, LiDAR, RGB cameras, range sensors, etc.) are expected to implement. By unifying the contract between sensor objects and the rest of the system, DSG-JIT enables plug-and-play multi-sensor integration without requiring special-case logic for each modality.
The goal of this module is to answer a simple question:
“What does it mean to be a sensor in DSG-JIT?”
Core Components
BaseSensor The abstract parent class for all sensors. It typically defines:
- ``initialize()`` — optional setup before streaming begins
- ``read()`` — return a *single typed measurement*
- ``close()`` — release hardware or simulation resources
Subclasses (e.g., ``IMUSensor``, ``LiDARSensor``, ``CameraSensor``)
implement their own measurement-specific logic, but all present
a consistent surface to the rest of DSG-JIT.
BaseMeasurement
A typed container for the output of a sensor.
Every measurement has:
- ``timestamp``
- device-specific payload (e.g., accelerations, images, point clouds)
The purpose of this common type is to make downstream modules—
factor graphs, dynamic scene graphs, training loops, logging tools—
treat all measurements uniformly.
Design Goals
- Unified sensor API: all sensors behave the same from the perspective of DSG-JIT's world model, optimization routines, and streaming helpers.
- Compatibility with synchronous and asynchronous streams: the interfaces defined here are intentionally minimal so that ``streams.py`` can wrap any sensor using either plain Python loops or asyncio-based background tasks.
- Future-proof extensibility: users can implement custom sensors (GPS, UWB, RADAR, event cameras, tactile sensors, etc.) by subclassing ``BaseSensor`` and returning custom ``BaseMeasurement`` types.
- Separation of concerns: this module defines contracts, not implementation logic. Sensor-specific math, calibration, conversion, or projection live in their respective modules (e.g. ``lidar.py``, ``camera.py``).
Summary
The base module provides the essential abstraction layer required for
building robust, modular, multi-sensor dynamic scene graphs. It ensures that
all data entering DSG-JIT—regardless of modality—flows through a consistent,
well-structured interface suitable for high-frequency SLAM, perception,
and future real-time scene generation.
BaseMeasurement(t, source, data, meta=None)
dataclass
Generic typed measurement used by all DSG-JIT sensor backends.
This class represents a single sensor sample emitted by a device or
by a stream wrapper. All concrete sensor measurement types
(e.g., :class:CameraMeasurement, :class:LidarMeasurement,
:class:IMUMeasurement) should inherit from this class.
:param t: Discrete timestamp or frame index associated with the sample.
:type t: int
:param source: Name of the sensor that produced this measurement
(e.g., "front_cam", "lidar_0", "imu_main").
:type source: str
:param data: Raw modality-specific payload. Subclasses may refine this type.
For example, a camera may store an ndarray, LiDAR may store a point cloud,
and IMU may store (accel, gyro) tuples.
:type data: Any
:param meta: Optional metadata, such as exposure time, pose hints, or flags.
:type meta: dict or None
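For illustration, a minimal sketch of constructing a measurement directly from the documented fields; the import path and the (accel, gyro) payload layout are assumptions, since ``data`` is untyped here::

    from dsg_jit.sensors.base import BaseMeasurement

    # A hypothetical IMU-style sample at frame index 12.
    m = BaseMeasurement(
        t=12,
        source="imu_main",
        data=((0.0, 0.0, 9.81), (0.01, 0.0, 0.0)),
        meta={"temperature_c": 31.5},
    )
    assert m.source == "imu_main"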
Sensor(name, agent_id)
Abstract base class for all DSG-JIT sensors.
Concrete subclasses implement :meth:build_factors to turn a
:class:SensorReading into one or more :class:core.types.Factor
instances, which can then be added to a :class:world.model.WorldModel.
Sensors are intended to be stateless with respect to optimization: they describe how to map readings into factors, but do not own any variables themselves.
Typical usage pattern::
sensor = SomeSensor(agent_id="robot0", ...)
reading = SensorReading(t=3, data=raw_measurement)
factors = sensor.build_factors(world_model, dsg, reading)
for f in factors:
world_model.fg.add_factor(f)
:param name: Human-readable name for this sensor instance.
:type name: str
:param agent_id: Identifier of the agent this sensor is mounted on,
e.g. "robot0". Used to resolve pose nodes in the DSG.
:type agent_id: str
build_factors(wm, dsg, reading)
Convert a sensor reading into factor(s) to be added to the world.
Subclasses must implement this to return a list of
:class:core.types.Factor objects whose var_ids and
params are consistent with the residuals registered in the
world's factor graph.
:param wm: World model into which new factors will be added.
:type wm: world.model.WorldModel
:param dsg: Dynamic scene graph providing access to pose node ids
for this sensor's agent at the requested time index.
:type dsg: world.dynamic_scene_graph.DynamicSceneGraph
:param reading: Sensor reading to convert.
:type reading: SensorReading
:return: List of factor objects ready to be added to
:attr:wm.fg.
:rtype: list[core.types.Factor]
SensorReading(t, data)
dataclass
Lightweight container for a single sensor measurement.
:param t: Discrete time index or timestamp at which the reading was taken.
    For now this is typically an integer matching the DSG's time index.
:type t: int
:param data: Raw or minimally processed sensor payload. The exact structure
    depends on the sensor type (e.g. a scalar range, a 3D point, an image array, etc.).
:type data: Any
sensors.camera
Camera sensor abstractions and utilities for DSG-JIT.
This module defines lightweight, JAX-friendly camera interfaces that can be plugged into dynamic scene graph (DSG) pipelines. The goal is to provide a clean separation between:
- Raw image acquisition (e.g., from a hardware driver, simulator, or prerecorded dataset), and
- Downstream SLAM / DSG consumers that only need structured frames (RGB or grayscale) with timestamps and metadata.
The module typically exposes:
- CameraFrame: a simple dataclass-like structure representing a single image frame. It usually stores image data (RGB or grayscale), optional depth, and a timestamp or frame index.
- CameraSensor: a wrapper around an arbitrary user-provided capture function. The capture function may return NumPy arrays or JAX arrays; the wrapper normalizes these and provides a consistent interface for reading frames in synchronous loops or via the generic sensor streams.
- Optional utilities for:
- Converting RGB frames to grayscale.
- Normalizing/typing images for consumption by JAX or downstream vision modules.
- Integrating with the generic sensor streaming API (synchronous or asynchronous).
These camera abstractions are intentionally minimal and do not perform full visual odometry or object detection. Instead, they are designed as building blocks for higher-level modules (e.g., visual SLAM, semantic DSG layers) that can interpret camera data and inject new nodes and factors into the scene graph or factor graph.
The design philosophy is:
- Keep camera handling stateless where possible.
- Make it easy to wrap any existing camera source (OpenCV, ROS, custom simulator, etc.) behind a small capture function.
- Ensure that integration with DSG-JIT remains explicit and composable, so users can swap cameras or add multiple sensors without changing core SLAM logic.
AsyncCamera(intrinsics, aiter_fn, color_space='rgb')
dataclass
Asynchronous camera source.
:param intrinsics: Camera intrinsics.
:param aiter_fn: Callable that returns an async iterator yielding images as numpy arrays.
:param color_space: Color space of the images produced by aiter_fn, default is "rgb".
frames()
async
Asynchronously iterate over frames from the camera.
:returns: An async iterator yielding CameraFrame objects with current timestamp, no frame_id, and the configured color_space.
CameraFrame(image, timestamp, frame_id, color_space)
dataclass
A single camera frame.
:param image: The image array.
:param timestamp: Timestamp of the frame capture.
:param frame_id: Optional frame identifier.
:param color_space: Color space of the image, either "rgb" or "gray".
CameraIntrinsics(width, height, fx, fy, cx, cy)
dataclass
Pinhole camera intrinsics.
:param width: Image width in pixels.
:param height: Image height in pixels.
:param fx: Focal length in x direction.
:param fy: Focal length in y direction.
:param cx: Principal point x coordinate.
:param cy: Principal point y coordinate.
CameraMeasurement(frame, sensor_id='cam0', T_cam_body=None, seq=None, metadata=None)
dataclass
High-level camera measurement suitable for feeding into SLAM / DSG layers.
This wraps a low-level :class:CameraFrame with additional metadata
such as sensor ID, extrinsics, and an optional sequence index. It is
intentionally minimal and can be extended by applications as needed.
:param frame:
The underlying camera frame (image, timestamp, color space, etc.).
:param sensor_id:
Identifier for the camera (e.g., "cam0"). Useful when multiple
cameras are present.
:param T_cam_body:
Optional 4x4 homogeneous transform from body frame to camera frame.
If omitted, downstream consumers may assume an identity transform or
use a configured default.
:param seq:
Optional sequence index (e.g., frame counter) for convenience.
:param metadata:
Optional free-form dictionary for extra information
(exposure, gain, rolling-shutter parameters, etc.).
SyncCamera(intrinsics, read_fn, color_space='rgb')
dataclass
Synchronous camera source.
:param intrinsics: Camera intrinsics.
:param read_fn: Callable that returns an image as a numpy array.
:param color_space: Color space of the images produced by read_fn, default is "rgb".
read()
Read a single frame from the camera.
:returns: A CameraFrame containing the image, current timestamp, no frame_id, and the configured color_space.
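As a sketch of the wrapping pattern described above, the snippet below feeds a synthetic NumPy image through ``SyncCamera``; the import path and keyword names follow the signatures documented here and are assumptions about the actual module layout::

    import numpy as np
    from dsg_jit.sensors.camera import CameraIntrinsics, SyncCamera

    intr = CameraIntrinsics(width=64, height=48, fx=60.0, fy=60.0, cx=32.0, cy=24.0)

    def read_fn():
        # Stand-in for an OpenCV / ROS / simulator capture call.
        return np.zeros((48, 64, 3), dtype=np.uint8)

    cam = SyncCamera(intrinsics=intr, read_fn=read_fn, color_space="rgb")
    frame = cam.read()  # CameraFrame with image, timestamp, and color_space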
is_gray(frame)
Check if the frame is in grayscale color space.
:param frame: The camera frame to check. :returns: True if the frame's color space is one of "gray", "grey", or "grayscale" (case-insensitive), False otherwise.
is_rgb(frame)
Check if the frame is in RGB color space.
:param frame: The camera frame to check. :returns: True if the frame's color space is "rgb" (case-insensitive), False otherwise.
to_grayscale(frame)
Convert an RGB frame to grayscale. If the frame is already grayscale, returns it unchanged.
For RGB frames, assumes image shape is (H, W, 3) and applies standard luminance weights [0.299, 0.587, 0.114] to convert to grayscale with shape (H, W). The returned image is float32 in [0, 1] if the input was uint8.
:param frame: The input camera frame. :returns: A new CameraFrame in grayscale color space with the same timestamp and frame_id.
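A small usage sketch of the conversion helpers, assuming a ``CameraFrame`` built directly from the documented constructor (import path assumed)::

    import numpy as np
    from dsg_jit.sensors.camera import CameraFrame, to_grayscale, is_gray

    rgb = CameraFrame(
        image=np.full((48, 64, 3), 128, dtype=np.uint8),
        timestamp=0.0,
        frame_id="frame_0",
        color_space="rgb",
    )
    gray = to_grayscale(rgb)  # per the docs: (48, 64) float32 in [0, 1]
    assert is_gray(gray)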
to_rgb(frame)
Convert a grayscale frame to RGB by stacking the gray channel three times along the last axis. If the frame is already RGB, returns it unchanged.
:param frame: The input camera frame. :returns: A new CameraFrame in RGB color space with the same timestamp and frame_id.
sensors.conversion
Conversion utilities from sensor measurements to factor-graph factors.
This module implements the "measurement conversion layer" for DSG-JIT: given
typed sensor measurements (IMU, LiDAR, cameras, simple range sensors, etc.),
it produces factor descriptions that can be attached to the core
:class:core.factor_graph.FactorGraph or to higher-level world/scene-graph
abstractions.
The core idea is to keep sensor-facing code and factor-graph-facing code decoupled:
- Sensor modules (sensors.camera, sensors.imu, sensors.lidar, sensors.streams, …) produce strongly-typed measurement objects.
- This module converts those measurements into small :class:MeasurementFactor records describing:
  - factor_type (string key for the residual)
  - var_ids (tuple of variable node ids to connect)
  - params (dictionary passed into the residual function)

Downstream code can then:

- Construct :class:core.types.Factor objects from these records.
- Call :meth:core.factor_graph.FactorGraph.add_factor.
- Or wrap them in higher-level helpers in :mod:world.scene_graph.
This keeps sensor integration "plug-and-play" while preserving a clean, minimal interface for the optimization engine.
MeasurementFactor(factor_type, var_ids, params)
dataclass
Lightweight description of a factor generated from a sensor measurement.
This is intentionally decoupled from :class:core.types.Factor so that
the conversion layer does not depend on internal factor-graph details.
Call :func:measurement_factors_to_graph_factors to turn these into
concrete :class:Factor instances, or use
:func:apply_measurement_factors to add them directly to a
:class:FactorGraph.
:param factor_type: String key for the residual function
(e.g. "range_1d", "pose_landmark_bearing",
"voxel_point_obs", etc.). This must correspond to a type
previously registered via
:meth:core.factor_graph.FactorGraph.register_residual.
:type factor_type: str
:param var_ids: Tuple of variable node ids that this factor connects,
in the order expected by the residual function.
:type var_ids: tuple[int, ...]
:param params: Parameter dictionary passed into the residual function.
Entries are typically NumPy/JAX arrays or scalar floats
(e.g. {"measurement": ..., "sigma": ...}).
:type params: dict[str, Any]
RangeMeasurement(distance, ray_dir)
dataclass
Simple 1D range measurement along a known unit ray.
:param distance: Measured distance along the ray, in meters.
:param ray_dir: Unit direction vector in the sensor frame, shape (3,).
apply_measurement_factors(fg, meas_factors)
Add a sequence of measurement-derived factors to a factor graph.
This is a thin convenience wrapper around
:func:measurement_factors_to_graph_factors and
:meth:core.factor_graph.FactorGraph.add_factor.
:param fg: The factor graph to which the new factors should be added.
:type fg: FactorGraph
:param meas_factors: Iterable of :class:MeasurementFactor objects.
:type meas_factors: Iterable[MeasurementFactor]
:return: This function has no return value; it mutates fg in-place
by adding new factors.
:rtype: None
bearing_to_factor(pose_id, landmark_id, bearing_vec, sigma=0.01, factor_type='pose_landmark_bearing')
Convert a bearing vector to a factor description.
This is suitable for camera-like observations where you have a unit bearing vector from the camera pose to a landmark in either camera or world coordinates, and a residual function that enforces angular consistency between the predicted bearing and the measured one.
:param pose_id: Node id of the camera/pose variable.
:type pose_id: int
:param landmark_id: Node id of the landmark/point variable.
:type landmark_id: int
:param bearing_vec: Measured bearing direction as a vector. This
should typically be normalized (unit length) and have shape
(2,) or (3,), depending on the residual implementation.
:type bearing_vec: jax.numpy.ndarray
:param sigma: Measurement noise standard deviation in angular units
(radians or an equivalent bearing metric).
:type sigma: float
:param factor_type: String key for the residual function (e.g.
"pose_landmark_bearing"). Must match a registered factor
type in the :class:FactorGraph.
:type factor_type: str
:return: A :class:MeasurementFactor describing the bearing
constraint.
:rtype: MeasurementFactor
camera_bearings_to_factors(pose_id, landmark_ids, measurement, sigma=0.01, factor_type='pose_landmark_bearing')
Convert a :class:sensors.camera.CameraMeasurement containing bearing
directions into a list of measurement factors.
This helper assumes that the camera front-end has already extracted
unit bearing vectors from image data (e.g. via feature detection and
calibration), and that these bearings have been associated with a set
of landmark ids. For each landmark_id and corresponding row in
measurement.bearings, we construct a bearing factor using
:func:bearing_to_factor.
:param pose_id: Node id of the camera/pose variable in the factor graph.
:type pose_id: int
:param landmark_ids: Sequence of landmark node ids, one per bearing
vector in measurement.bearings.
:type landmark_ids: Sequence[int]
:param measurement: Camera measurement containing an array of bearing
vectors in measurement.bearings with shape (N, D) where
D is typically 2 or 3.
:type measurement: CameraMeasurement
:param sigma: Bearing noise standard deviation passed through to
:func:bearing_to_factor.
:type sigma: float
:param factor_type: Factor type string for the bearing residual,
usually "pose_landmark_bearing".
:type factor_type: str
:return: List of :class:MeasurementFactor objects, one per
(pose, landmark) bearing observation.
:rtype: list[MeasurementFactor]
:raises ValueError: If the number of landmark ids does not match the
number of bearing vectors stored in the measurement.
camera_depth_to_points_sensor(cam, depth)
Convert a depth image into a 3D point cloud in the camera frame.
The depth image is assumed to have the same width/height as specified
by the camera intrinsics. For each pixel (u, v) with depth d,
we compute a ray via :func:pixel_to_ray_camera and place the point
at d * ray.
:param cam: Camera measurement with intrinsics.
:param depth: Depth map of shape (H, W) in meters.
:returns: Array of points of shape (H*W, 3) in the camera frame.
Invalid or zero depths are skipped.
imu_to_factors_placeholder(pose_ids, imu_meas)
Placeholder conversion from IMU measurement to factor descriptions.
In a full SLAM system, IMU data is typically handled via preintegration over multiple high-rate samples to produce a single inertial factor between two poses. That logic is non-trivial and highly application specific, so this function serves as a placeholder and example.
For now, it returns an empty list and is intended to be replaced with a proper preintegration pipeline in future work.
:param pose_ids: Sequence of pose node ids between which an IMU factor would be created (e.g. previous pose id and current pose id).
:type pose_ids: Sequence[int]
:param imu_meas: A single IMU measurement containing accelerometer and gyroscope readings along with a timestamp.
:type imu_meas: IMUMeasurement
:return: An empty list. Replace with application-specific IMU factor generation as needed.
:rtype: list[MeasurementFactor]
integrate_imu_delta(imu, dt, gravity=None)
Integrate a single IMU sample into a small SE(3) increment (se(3) vector).
This is a very simple, single-step integrator meant as a starting point / placeholder. For real applications, a proper preintegration scheme or filter should be used instead.
The returned 6D vector is in the form::
[dtx, dty, dtz, drx, dry, drz]
where dt* is the approximate translational displacement in the IMU
frame and dr* is a small-angle approximation for the rotation.
:param imu: IMU measurement containing linear acceleration and angular
velocity in the sensor frame.
:param dt: Time step in seconds between this sample and the previous one.
:param gravity: Optional gravity vector (in sensor frame). If provided,
it is subtracted from the measured acceleration before integration.
If None, no gravity compensation is performed.
:returns: A length-6 JAX array representing a small SE(3) increment in
the IMU frame.
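As an illustration of what such a single-step increment can look like, here is a self-contained NumPy sketch. The specific discretization (half-acceleration displacement, small-angle rotation) is an assumption for illustration, not necessarily what integrate_imu_delta does internally::

    import numpy as np

    def naive_se3_increment(accel, gyro, dt, gravity=None):
        """One possible single-step [dtx, dty, dtz, drx, dry, drz] integrator."""
        a = np.asarray(accel, dtype=float)
        if gravity is not None:
            a = a - np.asarray(gravity, dtype=float)  # optional gravity compensation
        d_trans = 0.5 * a * dt * dt                   # assumed displacement model
        d_rot = np.asarray(gyro, dtype=float) * dt    # small-angle rotation
        return np.concatenate([d_trans, d_rot])

    delta = naive_se3_increment([0.1, 0.0, 9.81], [0.0, 0.0, 0.05],
                                dt=0.01, gravity=[0.0, 0.0, 9.81])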
lidar_measurement_to_voxel_factors(measurement, sigma=0.05, factor_type='voxel_point_obs')
Convert a :class:sensors.lidar.LidarMeasurement into voxel-point
observation factors.
This helper assumes that the LiDAR front-end has already projected raw
ranges into 3D world coordinates and, optionally, associated each
point with a voxel id. If measurement.voxel_ids is provided, it is
used directly; otherwise, the caller is expected to supply voxel
associations separately.
Concretely, this is a thin wrapper around
:func:lidar_scan_to_voxel_factors, using
measurement.points_world and measurement.voxel_ids.
:param measurement: LiDAR measurement containing a point cloud in
world coordinates and optional voxel assignments.
:type measurement: LidarMeasurement
:param sigma: Noise level for each point in world units, forwarded to
:func:voxel_point_obs_factor.
:type sigma: float
:param factor_type: Factor type string used for all voxel-point
factors, typically "voxel_point_obs".
:type factor_type: str
:return: List of :class:MeasurementFactor objects describing the
LiDAR point cloud constraints.
:rtype: list[MeasurementFactor]
:raises ValueError: If measurement.voxel_ids is None or its
length does not match the number of points.
lidar_scan_to_points_sensor(scan)
Convert a planar LiDAR scan into 3D points in the sensor frame.
This function assumes a 2D planar LiDAR mounted in the x-y plane,
with all points lying at z=0 in the sensor frame::
x = r * cos(theta)
y = r * sin(theta)
z = 0
:param scan: LiDAR measurement containing per-beam ranges and angles.
:returns: Array of points of shape (N, 3) in the sensor frame.
lidar_scan_to_points_world(scan, T_world_sensor)
Convert a planar LiDAR scan into 3D points in world coordinates.
:param scan: LiDAR measurement in the sensor frame.
:param T_world_sensor: Homogeneous transform from sensor to world,
shape (4, 4).
:returns: Array of points of shape (N, 3) in world coordinates.
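The documented formulas can be reproduced with a few lines of NumPy; the sketch below mirrors the planar-scan convention and the 4x4 transform step independently of the library implementation (the sensor position used is made up)::

    import numpy as np

    ranges = np.array([1.0, 2.0, 1.5])
    angles = np.array([0.0, np.pi / 4, np.pi / 2])

    # Sensor-frame points: x = r*cos(theta), y = r*sin(theta), z = 0.
    pts_sensor = np.stack(
        [ranges * np.cos(angles), ranges * np.sin(angles), np.zeros_like(ranges)],
        axis=-1,
    )

    # World-frame points via a homogeneous sensor-to-world transform.
    T_world_sensor = np.eye(4)
    T_world_sensor[:3, 3] = [1.0, 0.0, 0.5]  # example sensor position
    pts_h = np.hstack([pts_sensor, np.ones((len(ranges), 1))])
    pts_world = (T_world_sensor @ pts_h.T).T[:, :3]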
lidar_scan_to_voxel_factors(voxel_ids, points_world, sigma=0.05, factor_type='voxel_point_obs')
Convert a LiDAR point cloud into per-voxel observation factors.
This helper assumes that a pre-processing step has already:
- Projected the LiDAR ranges into 3D points in world coordinates.
- Associated each point with a voxel id (e.g. via a voxel grid index).
Given a list of voxel node ids and a matching array of 3D points,
this function returns one :class:MeasurementFactor per point.
:param voxel_ids: Iterable of voxel node ids, one per point.
:type voxel_ids: Sequence[int]
:param points_world: Array of 3D points in world coordinates with
shape (N, 3), where N == len(voxel_ids).
:type points_world: jax.numpy.ndarray
:param sigma: Noise level for each point in world units.
:type sigma: float
:param factor_type: Factor type string used for all voxel-point
factors, typically "voxel_point_obs".
:type factor_type: str
:return: A list of :class:MeasurementFactor objects, one for each
point/voxel pair.
:rtype: list[MeasurementFactor]
measurement_factors_to_graph_factors(meas_factors)
Convert a sequence of :class:MeasurementFactor objects to concrete
:class:core.types.Factor instances.
Unique factor ids are generated using a simple running index; if you need stable ids, you can post-process the factors or construct them manually instead.
:param meas_factors: Iterable of :class:MeasurementFactor objects
returned by the conversion helpers in this module.
:type meas_factors: Iterable[MeasurementFactor]
:return: A list of :class:Factor instances suitable for adding to
a :class:FactorGraph.
:rtype: list[Factor]
pixel_to_ray_camera(cam, u, v)
Convert a pixel coordinate into a normalized ray in the camera frame.
The intrinsics are assumed to follow the usual pinhole model::
x = (u - cx) / fx
y = (v - cy) / fy
ray_cam = [x, y, 1]
ray_cam /= ||ray_cam||
:param cam: Camera measurement object providing intrinsics.
:param u: Pixel x-coordinate (column index).
:param v: Pixel y-coordinate (row index).
:returns: A unit 3D vector (shape (3,)) in the camera frame.
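A self-contained sketch of the documented pinhole back-projection; it only re-derives the formula above and does not call into the library (the intrinsics used are made up)::

    import numpy as np

    def pixel_to_ray(u, v, fx, fy, cx, cy):
        ray = np.array([(u - cx) / fx, (v - cy) / fy, 1.0])
        return ray / np.linalg.norm(ray)  # unit ray in the camera frame

    ray = pixel_to_ray(320.0, 240.0, fx=500.0, fy=500.0, cx=320.0, cy=240.0)
    # A point observed at depth d along this pixel would sit at d * ray.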
pixels_to_rays_camera(cam, pixels)
Convert a collection of pixel coordinates into rays in the camera frame.
:param cam: Camera measurement object providing intrinsics.
:param pixels: Iterable of (u, v) pixel coordinates.
:returns: Array of unit 3D vectors of shape (N, 3) in the camera frame.
range_1d_to_factor(pose_id, place_id, distance, sigma=0.05, factor_type='range_1d')
Convert a scalar range measurement (1D) into a factor description.
This is the generic bridge used by simple range sensors (e.g. the 1D range DSG experiment), where you have:
pose_x - place_x ≈ distance
and a residual function of type factor_type that expects a
"measurement" and optionally a "sigma" or "weight" in
its parameter dictionary.
:param pose_id: Node id of the pose variable in the factor graph.
:type pose_id: int
:param place_id: Node id of the place or landmark variable in the
factor graph.
:type place_id: int
:param distance: Measured distance between the pose and the place.
:type distance: float
:param sigma: Measurement noise standard deviation. If your residual
uses weight instead, you can convert this using
1.0 / (sigma ** 2) when creating the factor.
:type sigma: float
:param factor_type: String key for the residual function. This must
match a factor type registered into the :class:FactorGraph,
for example "range_1d".
:type factor_type: str
:return: A :class:MeasurementFactor describing the range constraint.
:rtype: MeasurementFactor
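A hedged sketch of the conversion flow, assuming a factor graph that has already registered a "range_1d" residual; the import path and node ids are assumptions::

    from dsg_jit.sensors.conversion import range_1d_to_factor

    # pose_id / place_id are assumed to be existing variable node ids in the graph.
    mf = range_1d_to_factor(pose_id=0, place_id=1, distance=2.3, sigma=0.05)

    # mf.factor_type == "range_1d", mf.var_ids == (0, 1), and mf.params holds
    # the measurement and noise entries. apply_measurement_factors(fg, [mf])
    # would then add it to a FactorGraph whose "range_1d" residual is registered.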
range_to_point_sensor(m)
Convert a range measurement into a 3D point in the sensor frame.
:param m: Range measurement with a scalar distance and unit ray direction
in the sensor frame.
:returns: A 3D point (shape (3,)) in the sensor frame.
range_to_point_world(m, T_world_sensor)
Convert a range measurement into a 3D point in world coordinates.
This assumes you already have the sensor pose T_world_sensor as a
homogeneous transform matrix of shape (4, 4). The point is first
constructed in the sensor frame and then transformed into the world.
:param m: Range measurement in the sensor frame.
:param T_world_sensor: Homogeneous transform from sensor to world,
shape (4, 4).
:returns: A 3D point (shape (3,)) in world coordinates.
raw_sample_to_camera_measurement(sample, sensor_id='cam0', T_cam_body=None, seq=None)
Convert a raw camera sample dictionary into a CameraMeasurement.
Expected keys in sample:
"t"(optional): float timestamp. If missing, the current time is used."frame_id"(optional): string frame identifier.- ONE of the following image-like entries (optional):
"image": (H, W) or (H, W, 3) array-like.
- Optional directional data (for feature-based SLAM):
"bearings": (N, 3) array-like of unit directions."dirs": (N, 3) array-like."rays": (N, 3) array-like.
Any directional data is stored in the returned measurement's metadata
under the corresponding key (e.g. metadata["bearings"]).
:param sample:
Raw sample dictionary produced by a sensor stream (e.g. ReadingStream
or FunctionStream) in the experiments.
:param sensor_id:
Identifier for this camera (e.g. "cam0").
:param T_cam_body:
Optional 4x4 homogeneous transform from body frame to camera frame.
:param seq:
Optional sequence index (frame counter).
:returns:
A :class:CameraMeasurement containing a :class:CameraFrame plus
any directional data in the metadata field.
raw_sample_to_imu_measurement(sample)
Convert a raw dict from a stream into an IMUMeasurement.
Expected keys in sample:
- "t": float timestamp (seconds).
- "accel": array-like linear acceleration in the IMU frame.
- "gyro": array-like angular velocity in the IMU frame.
- "dt" (optional): time step in seconds since the previous sample. If omitted, a default value is used.
:param sample: Raw sample dictionary produced by a sensor stream.
:return: An :class:IMUMeasurement instance.
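A short sketch of the expected sample layout, using the keys listed above (the values and import path are assumptions)::

    import numpy as np
    from dsg_jit.sensors.conversion import raw_sample_to_imu_measurement

    sample = {
        "t": 12.345,
        "accel": np.array([0.02, -0.01, 9.79]),  # m/s^2, IMU frame
        "gyro": np.array([0.001, 0.0, 0.02]),    # rad/s, IMU frame
        "dt": 0.005,                              # 200 Hz stream
    }
    imu_meas = raw_sample_to_imu_measurement(sample)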
raw_sample_to_lidar_measurement(sample)
Convert a raw LiDAR sample dictionary into a :class:LidarMeasurement.
Expected keys in sample:
"ranges": 1D array-like of LiDAR ranges. (required)"angles": 1D array-like of bearing angles (radians), same length asranges. (optional)"directions": (N, 3) array-like of direction vectors. (optional)"t"(optional): float timestamp."frame_id"(optional): string frame identifier.
This is consistent with the rest of the LiDAR utilities in this module,
which assume a planar LiDAR described by ranges and angles.
:param sample: Raw sample dictionary produced by a sensor stream.
:returns: A :class:LidarMeasurement instance.
:raises KeyError: If required keys are missing from the sample.
voxel_point_obs_factor(voxel_id, point_world, sigma=0.05, factor_type='voxel_point_obs')
Convert a single 3D point observation into a voxel-point factor.
This is intended for mapping-style sensors (LiDAR, depth cameras, RGB-D, stereo) where you receive one or more 3D points in world coordinates and want to attach them to a voxel cell center.
:param voxel_id: Node id of the voxel variable (voxel_cell)
in the factor graph.
:type voxel_id: int
:param point_world: Observed 3D point in world coordinates with shape
(3,).
:type point_world: jax.numpy.ndarray
:param sigma: Noise level for this observation in world units
(meters, etc.).
:type sigma: float
:param factor_type: Factor type string (e.g. "voxel_point_obs")
corresponding to the voxel-point residual used in your
measurement model.
:type factor_type: str
:return: A :class:MeasurementFactor describing the voxel-point
observation.
:rtype: MeasurementFactor
sensors.fusion
Sensor fusion utilities for DSG-JIT.
This module provides a small, opinionated fusion core that sits between raw sensor streams (e.g. LiDAR, cameras, IMUs) and the rest of the DSG-JIT world/scene-graph stack.
The goals of this layer are:
- Provide a common abstraction to register named sensors and pull data from synchronous streams.
- Convert raw samples into strongly-typed measurement objects defined in :mod:sensors.camera, :mod:sensors.lidar, and :mod:sensors.imu.
- Dispatch the resulting measurements to user-provided callbacks that can update a :class:world.model.WorldModel or :class:world.dynamic_scene_graph.DynamicSceneGraph.
This is intentionally lightweight:
- It does not assume any particular factor-graph structure.
- It does not run its own background threads or event loops.
- It is designed so that experiments can start simple (single-threaded polling) while leaving room to grow into a more complex async or multi-rate fusion system later on.
Typical usage from an experiment::
from sensors.streams import ReadingStream
from sensors.camera import CameraMeasurement
from sensors.lidar import LidarMeasurement
from sensors.fusion import SensorFusionManager
fusion = SensorFusionManager()
# 1) Register a camera and a lidar stream
fusion.register_sensor(
name="cam0",
modality="camera",
stream=ReadingStream(camera_read_fn),
)
fusion.register_sensor(
name="lidar0",
modality="lidar",
stream=ReadingStream(lidar_read_fn),
)
# 2) Connect fusion to the world model via a callback
def on_measurement(meas):
if isinstance(meas, CameraMeasurement):
# add a bearing / reprojection factor, etc.
...
elif isinstance(meas, LidarMeasurement):
# add range factors or occupancy updates
...
fusion.register_callback(on_measurement)
# 3) Poll sensors in your main loop
for step in range(100):
fusion.poll_once()
# run optimization, render, etc.
In the future we can add:
- Async helpers that drive :class:sensors.streams.AsyncReadingStream.
- Tighter integration with the dynamic scene graph (e.g. per-agent queues).
- Higher-level policies (e.g. downsampling, time sync, etc.).
FusedPoseEstimate(t, pose_se3, covariance=None, source_counts=dict())
dataclass
Fused SE(3) pose estimate produced by SensorFusionManager.
:param t: Time index (discrete step or timestamp) associated with this
estimate.
:param pose_se3: 6D se(3) pose vector in world coordinates, typically
(tx, ty, tz, rx, ry, rz).
:param covariance: Optional 6x6 covariance matrix for the fused pose.
:param source_counts: Optional dictionary tracking how many measurements
contributed from each sensor type (e.g. {"imu": 10, "lidar": 2}).
RegisteredSensor(name, modality, stream, converter)
dataclass
Bookkeeping structure for a registered sensor.
:param name: Logical name of the sensor (e.g. "lidar0").
:param modality: String describing the modality, e.g. "camera",
"lidar", or "imu". This is used to choose a default
converter when one is not provided.
:param stream: Underlying sensor stream used to pull raw samples.
:param converter: Function that maps a raw sample from stream to
a :class:~sensors.base.BaseMeasurement instance.
SensorFusionManager(default_callbacks=None, world_model=None, auto_register_world_callbacks=True)
Central registry and dispatcher for sensor measurements.
The fusion manager is intentionally minimal: it owns no threads and does not know about factor graphs or optimization. It simply:
- Keeps track of registered sensors.
- Polls synchronous streams on demand.
- Converts raw samples to measurement objects.
- Broadcasts those measurements to user callbacks.
Experiments are free to use this in a tight, single-threaded loop, or to build more advanced async / multi-rate infrastructure on top.
:param default_callbacks: Optional iterable of callbacks to register
at construction time. Each callback will be called as
callback(measurement) whenever a new
:class:~sensors.base.BaseMeasurement is produced.
Create a new fusion manager.
:param default_callbacks: Optional iterable of callbacks to register
immediately. Each callback will be invoked as cb(measurement)
for every measurement produced by :meth:poll_once or
:meth:push_measurement.
:param world_model: Optional :class:world.model.WorldModel instance
to be associated with this fusion manager. If provided and
auto_register_world_callbacks is True, the manager will
automatically register per-modality callbacks that forward
measurements into the world model.
:param auto_register_world_callbacks: If True and world_model
is not None, register default world-model callbacks for
camera, LiDAR, and IMU measurements (when the corresponding
handler methods exist on the world model).
attach_world_model(world_model, register_default_callbacks=True)
Attach a :class:world.model.WorldModel to this fusion manager.
When a world model is attached and register_default_callbacks is
True, the manager will install per-modality callbacks that route
camera, LiDAR, and IMU measurements into the world model using any
available handler methods.
Expected handler names on world_model are, for example:
- apply_camera_measurement or add_camera_measurement
- apply_lidar_measurement or add_lidar_measurement
- apply_imu_measurement or add_imu_measurement
Only handlers that actually exist and are callable will be used.
:param world_model: World model instance to associate.
:param register_default_callbacks: If True, automatically
register default callbacks that forward measurements into the
world model.
get_latest_pose()
Return the most recent fused SE(3) pose estimate, if available.
This is a thin convenience wrapper used by integration layers (e.g. world models or dynamic scene graphs) to pull a single, canonical pose update out of the fusion buffer.
:returns: A :class:FusedPoseEstimate instance if any fused result
has been produced, or None if the manager has not yet
emitted a pose.
poll_once()
Poll all registered synchronous streams exactly once.
For each sensor whose stream is an instance of
:class:~sensors.streams.ReadingStream, this method will:
- Call stream.read() to obtain a raw sample.
- If a non-None sample is returned, convert it to a measurement via the registered converter.
- Dispatch the measurement to all registered callbacks.
Asynchronous streams are not handled here; they can be consumed
separately using their own async for loops.
:return: The total number of measurements produced and dispatched during this call.
push_measurement(measurement)
Inject a pre-constructed measurement into the fusion pipeline.
This is useful when the experiment already builds measurement objects directly (e.g. from a simulator) and only wants to reuse the callback dispatch mechanism.
:param measurement: Measurement instance to dispatch to all registered callbacks.
record_fused_pose(t, pose_se3, covariance=None, source_counts=None)
Append a fused SE(3) pose estimate to the internal history buffer.
This does not perform any fusion by itself; instead it allows an
external estimator (e.g. an EKF or factor-graph solver) to publish
its current best pose into the fusion manager so that callers can
retrieve it via :meth:get_latest_pose.
:param t: Time index or timestamp associated with the estimate.
:param pose_se3: 6D se(3) pose vector in world coordinates.
:param covariance: Optional 6x6 covariance matrix for the estimate.
:param source_counts: Optional dictionary describing how many measurements from each sensor type contributed to this pose.
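A minimal sketch of publishing and retrieving a fused pose, assuming the manager is constructed with defaults (import path assumed)::

    import numpy as np
    from dsg_jit.sensors.fusion import SensorFusionManager

    fusion = SensorFusionManager()

    # An external estimator (EKF, factor-graph solve, ...) publishes its result.
    fusion.record_fused_pose(
        t=7,
        pose_se3=np.zeros(6),  # (tx, ty, tz, rx, ry, rz)
        source_counts={"imu": 10, "lidar": 2},
    )

    latest = fusion.get_latest_pose()  # FusedPoseEstimate or None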
register_callback(callback)
Register a callback to receive all fused measurements.
:param callback: Function that accepts a
:class:~sensors.base.BaseMeasurement instance. It will be
called for every measurement produced by
:meth:poll_once or :meth:push_measurement.
register_sensor(name, modality, stream, converter=None)
Register a new sensor with the fusion manager.
If converter is not provided, a default converter will be
chosen based on modality.
:param name: Logical sensor name, e.g. "cam0" or "lidar_front".
:param modality: Modality string ("camera", "lidar",
"imu", or a custom value). Custom values must provide an
explicit converter.
:param stream: Sensor stream object used to read raw samples.
:param converter: Optional function that converts raw samples from
stream into measurement objects.
:raises ValueError: If a sensor with the same name is already registered, or if no default converter exists for the given modality and no explicit converter is provided.
sensors.imu
Inertial measurement unit (IMU) abstractions and simple integration utilities.
This module provides a small set of IMU-related types and helpers that make it easy to wire IMU data into DSG-JIT experiments and factor graphs. The focus is on structured measurements and simple integration, rather than on a full production-grade inertial navigation system.
The module typically exposes:
- IMUMeasurement: a lightweight container for a single IMU sample, including body-frame linear acceleration, angular velocity, and an associated dt (time delta). A timestamp can also be stored for logging or alignment with other sensors.
- IMUSensor: a thin wrapper around a user-defined sampling function. The sampling function returns IMUMeasurement instances, and the wrapper provides:
  - A read() method for synchronous polling.
  - An iterator interface for use in simple loops.
  - Compatibility with the generic sensor streaming helpers.
- integrate_imu_naive: a toy integrator that demonstrates how to accumulate IMU measurements into a position and velocity estimate. It assumes that:
  - Acceleration is already expressed in the world frame, and
  - Gravity has been compensated externally.
  This function is intended for didactic experiments and not as a replacement for a full IMU preintegration pipeline.
Usage patterns:
- Wrap any IMU source (hardware driver, simulator, dataset) with an IMUSensor so that you always work with IMUMeasurement objects.
- For demonstration or unit tests, use integrate_imu_naive to produce rough pose/velocity estimates and then inject those as odometry factors into a factor graph.
- For more advanced use cases, replace the naive integrator with a preintegration module, but keep the same IMUMeasurement and IMUSensor interfaces so the rest of the system remains unchanged.
The design goal is to make IMU handling:
- Explicit and transparent (no hidden global state).
- Composable with other sensors (cameras, range sensors, etc.).
- Easy to plug into the existing DSG-JIT world model and optimization stack without requiring changes to core solvers.
IMUMeasurement(accel, gyro, dt, timestamp=None)
dataclass
Single IMU sample consisting of specific force and angular velocity.
This is a lightweight container for raw IMU readings that can be produced by hardware drivers, simulators, or log readers and then consumed by the factor-graph/DSG layers.
The convention assumed here is:
- accel is specific force in the IMU body frame, in m/s^2.
- gyro is angular velocity in the IMU body frame, in rad/s.
- dt is the time delta since the previous sample, in seconds.
- timestamp is an optional absolute time in seconds.
:param accel: Linear acceleration in the IMU/body frame, shape (3,).
:param gyro: Angular velocity in the IMU/body frame, shape (3,).
:param dt: Time delta since the previous sample, in seconds.
:param timestamp: Optional absolute timestamp in seconds.
as_numpy()
Return the acceleration and gyro as NumPy arrays.
This is a small convenience helper for users who want to interoperate
with NumPy-based tooling. If NumPy is not available, a :class:RuntimeError
is raised.
:return: Tuple (accel_np, gyro_np) as NumPy arrays.
:raises RuntimeError: If NumPy is not installed or import failed.
IMUSampleFn
Bases: Protocol
Protocol for callables that produce :class:IMUMeasurement samples.
This is primarily used to type-annotate IMU sensor wrappers.
:return: A new :class:IMUMeasurement instance.
IMUSensor(name, sample_fn)
Bases: BaseSensor
Generic IMU sensor wrapper.
This class adapts any callable that returns :class:IMUMeasurement into the
common sensor interface used by DSG-JIT. It is intentionally minimal: it does
not attempt to manage threads, buffering, or synchronization—those concerns
are handled by the sensor stream utilities in :mod:sensors.streams.
:param name: Human-readable sensor name.
:param sample_fn: Callable producing a single :class:IMUMeasurement per call.
__iter__()
Create an iterator over IMU samples.
This is primarily useful when coupling the sensor with an asynchronous or synchronous sensor stream helper that consumes an iterator.
:return: Infinite iterator yielding :class:IMUMeasurement objects.
read()
Read a single IMU measurement.
This will typically perform a blocking read from a hardware device, a
simulator, or a log file, depending on how sample_fn is implemented.
:return: The next :class:IMUMeasurement sample.
integrate_imu_naive(measurements, v0=None, p0=None)
Naively integrate IMU measurements to update position and velocity.
This helper performs a very simple, orientation-agnostic integration of a sequence of IMU samples. It assumes that the acceleration vectors are already expressed in the world frame and that gravity has been compensated for. As such, it is not a replacement for proper IMU preintegration, but it is convenient for toy 1D/3D experiments and testing the data flow from sensors into the optimizer.
The integration scheme is:
.. math::

   v_{k+1} &= v_k + a_k \delta t_k \\
   p_{k+1} &= p_k + v_{k+1} \delta t_k
:param measurements: Iterable of :class:IMUMeasurement objects, in time order.
:param v0: Optional initial velocity, shape (3,). Defaults to zeros.
:param p0: Optional initial position, shape (3,). Defaults to zeros.
:return: Tuple (p, v) with the final position and velocity.
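A toy usage sketch following the assumptions above (world-frame, gravity-compensated acceleration); the import path is an assumption::

    import jax.numpy as jnp
    from dsg_jit.sensors.imu import IMUMeasurement, integrate_imu_naive

    # Constant 0.1 m/s^2 acceleration along x for one second at 100 Hz.
    samples = [
        IMUMeasurement(accel=jnp.array([0.1, 0.0, 0.0]),
                       gyro=jnp.zeros(3),
                       dt=0.01)
        for _ in range(100)
    ]
    p, v = integrate_imu_naive(samples)  # final position and velocity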
sensors.integration
Helpers for wiring sensor fusion results into DSG-JIT world models.
This module provides small, stateless utilities that take fused pose
estimates from :mod:sensors.fusion and apply them to a
:class:world.model.WorldModel or dynamic scene graph.
By keeping this logic in a separate integration layer, we avoid coupling the sensor stack directly to the optimization core, while still making it very easy for users to build real-time or batch pipelines.
apply_fused_pose_to_world(world, fusion, agent_id, t, fused=None)
Apply a fused SE(3) pose estimate to the world model for a given agent.
If a pose for (agent_id, t) already exists in the world, this function
updates its value in-place. Otherwise, it creates a new agent pose
variable via :meth:world.model.WorldModel.add_agent_pose.
:param world: World model whose factor graph should be updated.
:param fusion: Sensor fusion manager providing fused pose estimates.
:param agent_id: String identifier for the agent (e.g. "robot0").
:param t: Discrete timestep index for this update.
:param fused: Optional fused pose estimate. If None, this function
calls :meth:SensorFusionManager.get_latest_pose internally.
:returns: The integer node id (int(NodeId)) corresponding to the
agent's pose variable at time t.
:raises ValueError: If no fused estimate is available.
apply_trajectory_to_world(world, agent_id, trajectory)
Bulk-apply a discrete trajectory to the world model as agent poses.
This is useful for offline pipelines, where you already have a fused trajectory (e.g. from a batch fusion run) and simply want to seed or refresh the world model with those states.
:param world: World model to update.
:param agent_id: String identifier for the agent.
:param trajectory: Mapping from timestep t to 6D se(3) pose vectors
in world coordinates.
:returns: None. All updates are applied in-place.
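A sketch of seeding a world model from a precomputed trajectory. Here ``world`` stands in for an already-constructed :class:world.model.WorldModel, and the import path is assumed::

    import numpy as np
    from dsg_jit.sensors.integration import apply_trajectory_to_world

    # Mapping from timestep t to a 6D se(3) pose (tx, ty, tz, rx, ry, rz).
    trajectory = {
        0: np.zeros(6),
        1: np.array([0.1, 0.0, 0.0, 0.0, 0.0, 0.0]),
        2: np.array([0.2, 0.0, 0.0, 0.0, 0.0, 0.01]),
    }
    # world: an existing WorldModel instance (assumed).
    apply_trajectory_to_world(world, agent_id="robot0", trajectory=trajectory)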
sensors.lidar
LiDAR sensor abstractions and utilities for DSG-JIT.
This module defines a lightweight representation of LiDAR data—either 1D range scans, 2D planar scans, or sparse 3D point samples—and provides a simple, JAX-friendly interface for integrating such scans into the factor graph or dynamic scene graph layers.
The module includes:
- LiDARScan: a minimal container representing a single LiDAR measurement:
  - Ranges (1D, 2D, or 3D depending on sensor type)
  - Optional beam angles or directions
  - Optional timestamp
  The structure is intentionally simple so that downstream algorithms (e.g., geometry-based localization, place association, or occupancy grid inference) can build on top of it without being locked into a particular LiDAR model.
- LiDARSensor: a wrapper around any user-provided LiDAR capture function. This allows plugging in real hardware, ROS topics, simulation engines, or synthetic data generators. The wrapper returns LiDARScan objects and is fully compatible with both synchronous and asynchronous streaming utilities.
- Helper transforms: utilities for:
  - Converting range/angle scans into 2D or 3D point clouds.
  - Projecting LiDAR points into world coordinates given a robot pose.
  - Preparing LiDAR-derived factors for integration into the factor graph (e.g., bearing-range residuals, scan-matching constraints).
Design philosophy:
- Keep the LiDAR model minimal and general.
- Allow users to choose how scans translate into DSG elements (places, objects, layout nodes, occupancy voxels).
- Cleanly interoperate with the broader DSG-JIT world model, enabling:
  - scan → point cloud → place association
  - scan → object detection integration
  - scan → unknown-space discovery
  - scan → range constraints between robot poses and map nodes
This module intentionally avoids tying LiDAR data to a specific SLAM method (e.g., ICP, NDT, LOAM); instead, it provides a consistent base layer for future plugins and extensions.
LidarMeasurement(ranges, directions=None, t=None, frame_id='lidar', metadata=None)
dataclass
Lightweight container for a single LiDAR scan.
This structure is deliberately minimal and JAX-friendly. It can be used for 1D range scans, 2D planar scans, or sparse 3D point samples, and is meant to serve as an intermediate representation between raw sensor data and DSG-JIT factors (e.g., range/bearing residuals, voxel updates).
:param ranges: LiDAR ranges in sensor coordinates.
Typical shapes:
* 1D scan: ``(N,)`` where each entry is a range sample.
* 2D grid: ``(H, W)`` for image-like range sensors.
* Sparse 3D: ``(N,)`` used together with ``directions`` below.
Values are typically in meters.
:type ranges: jax.numpy.ndarray
:param directions: Optional unit vectors giving the direction of each range sample in the sensor frame.
Typical shapes:
* 1D scan: ``(N, 3)`` where each row is a 3D direction vector.
* 2D grid: ``(H, W, 3)`` matching the shape of ``ranges``.
If ``None``, downstream code is expected to infer directions based
on sensor intrinsics (e.g., azimuth/elevation tables or a pinhole
camera model).
:type directions: Optional[jax.numpy.ndarray]
:param t: Optional timestamp (e.g., in seconds). This can be used to align the measurement with the dynamic scene graph time index or other sensors. :type t: Optional[float]
:param frame_id:
Identifier for the sensor frame from which this measurement was
taken (e.g. "lidar_front"). This is useful when a robot carries
multiple LiDAR units or when extrinsic calibration is maintained
per frame.
:type frame_id: str
:param metadata: Optional dictionary for any additional information (e.g., intensity values, per-beam noise estimates, or scan ID). This field is not used by the core library but can be useful for higher-level perception algorithms. :type metadata: Optional[dict]
RangeSensor(name, agent_id, config, sg)
Bases: Sensor
Simple range-only sensor attached to an agent.
Given a :class:SensorReading whose data field is a scalar
range, this sensor produces a single "range" factor connecting
the agent's pose at time t to a 3D target node.
It expects:
- the agent trajectory to be registered in the :class:DynamicSceneGraph under the same agent_id used to construct this sensor, and
- the target node id to be present in the underlying :class:SceneGraphWorld.
The factor uses :func:slam.measurements.range_residual and is
registered under the factor type "range".
build_factors(wm, dsg, reading)
Build a single range factor from a reading.
:param wm: World model whose factor graph already contains the
agent poses and the target node declared in the config.
:type wm: world.model.WorldModel
:param dsg: Dynamic scene graph providing the pose node id for
(agent_id, reading.t).
:type dsg: world.dynamic_scene_graph.DynamicSceneGraph
:param reading: Range measurement; reading.data is assumed
to be a scalar float.
:type reading: sensors.base.SensorReading
:return: List containing a single "range" factor.
:rtype: list[core.types.Factor]
RangeSensorConfig(noise_sigma, target_node)
dataclass
Configuration for a simple range-only sensor.
:param noise_sigma: Standard deviation of the range measurement, in
the same units as the world coordinates (typically meters).
:type noise_sigma: float
:param target_node: Node id of the 3D target in the
:class:SceneGraphWorld (e.g. a room or object center).
:type target_node: int
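A minimal sketch of how the pieces fit together, assuming a SceneGraphWorld sg, a WorldModel wm, a DynamicSceneGraph dsg, and a SensorReading reading have been set up elsewhere; the sensor name, agent id, and numeric values are purely illustrative::

    from dsg_jit.sensors.lidar import RangeSensor, RangeSensorConfig

    def make_range_factors(sg, wm, dsg, reading):
        """Hypothetical helper: build one "range" factor from a scalar range reading."""
        config = RangeSensorConfig(noise_sigma=0.05,  # 5 cm range noise, in world units
                                   target_node=7)     # id of the 3D target node in sg
        sensor = RangeSensor(name="range_front",
                             agent_id="robot_0",      # must match the id registered in dsg
                             config=config,
                             sg=sg)
        # Returns a list containing a single "range" factor linking the agent's
        # pose at reading.t to the target node.
        return sensor.build_factors(wm, dsg, reading)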
sensors.streams
Generic sensor streaming utilities for DSG-JIT.
This module provides a unified abstraction layer for handling live or simulated sensor data streams—synchronous or asynchronous—so that all sensor types (IMU, LiDAR, cameras, range sensors, etc.) can plug into DSG-JIT’s dynamic scene graph pipeline without requiring unique logic for each device.
The key goal is to decouple how data is produced (polling, callbacks, asynchronous feeds) from how DSG-JIT consumes that data (factor graph updates, node creation, temporal linking).
The module typically defines:
- SynchronousStreams
  - A minimal wrapper around a sensor object that exposes a read() method.
  - Designed for simple for-loops or offline dataset playback.
  - Ensures each call returns a typed measurement (e.g., IMUMeasurement, CameraFrame, LiDARScan).
- AsynchronousStreams
  - Provides asyncio-based background tasks that fetch sensor data without blocking the main SLAM/DSG loop.
  - Each sensor type is polled in a separate coroutine, and the latest measurement is stored internally for consumption.
  - Enables future real-time extensions (e.g., multi-sensor fusion, asynchronous factor graph updates).
- StreamHandle
  - An ergonomic wrapper exposing:
    - .latest() – retrieve the most recent measurement.
    - .reset() – clear the buffer.
    - .close() – stop background tasks.
  - Useful for synchronized fusion pipelines where multiple sensors must provide data simultaneously.
Design philosophy:
- Treat all sensors uniformly—same streaming API regardless of modality.
- Allow both synchronous and asynchronous execution with zero change to downstream SLAM/DSG logic.
- Enable future “plug-and-play” sensor integration for real robots, simulators, prerecorded logs, or unit test data.
This module does not itself perform SLAM or scene graph updates—it only defines the mechanism by which sensors deliver data into such pipelines.
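As a rough illustration of this uniform API, the sketch below feeds the same downstream logic from both a synchronous and an asynchronous file-backed stream (both classes are documented below); process_reading is a hypothetical stand-in for factor-graph or scene-graph updates::

    import asyncio

    from dsg_jit.sensors.streams import AsyncFileRangeStream, FileRangeStream

    def process_reading(reading):
        # Placeholder for downstream SLAM / scene-graph updates.
        print(reading.t, reading.data)

    # Synchronous playback: a plain for-loop over a file-backed stream.
    for reading in FileRangeStream("ranges.txt"):
        process_reading(reading)

    # Asynchronous playback: same downstream logic, different iteration style.
    async def replay_async():
        async for reading in AsyncFileRangeStream("ranges.txt", delay=0.1):
            process_reading(reading)

    asyncio.run(replay_async())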
AsyncFileRangeStream(path, delay=None)
Bases: AsyncReadingStream
Asynchronous file-backed range stream.
This behaves like :class:FileRangeStream, but exposes an async iterator
interface so it can be consumed with async for. An optional delay
can be used to simulate a sensor publishing at a fixed rate.
.. note::
File I/O is still performed using the standard blocking open call.
For small logs and simple simulations this is usually fine, but for
large files or strict real-time requirements you may prefer to replace
this with a true async file reader.
:param path: Path to a text file containing one numeric value per line.
:type path: str
:param delay: Optional delay between consecutive readings, in seconds.
:type delay: float or None
Initialize the asynchronous file-backed range stream.
:param path: Path to a text file containing one numeric value per line.
:type path: str
:param delay: Optional delay between consecutive readings, in seconds.
:type delay: float or None
Source code in dsg-jit/dsg_jit/sensors/streams.py
__aiter__()
async
Asynchronously iterate over sensor readings.
If delay is set, the coroutine sleeps for that many seconds
between consecutive readings.
:return: Asynchronous iterator over :class:SensorReading objects.
:rtype: AsyncIterator[SensorReading]
Source code in dsg-jit/dsg_jit/sensors/streams.py
AsyncReadingStream
Bases: BaseSensorStream
Asynchronous stream of :class:SensorReading objects.
Subclasses implement :meth:__aiter__ to provide an async iterator over
sensor readings. This is useful when integrating with an asyncio-based
event loop or real-time sensor drivers.
__aiter__()
Return an asynchronous iterator over sensor readings.
:return: Asynchronous iterator over :class:SensorReading objects.
:rtype: AsyncIterator[SensorReading]
Source code in dsg-jit/dsg_jit/sensors/streams.py
BaseSensorStream
Bases: ABC
Abstract base class for all sensor streams used by DSG-JIT.
This defines the minimal interface that synchronous and asynchronous
sensor streams must support so they can be registered with
:class:sensors.fusion.SensorFusionManager.
Subclasses may implement either sync or async behavior:
- Synchronous streams must implement :meth:read.
- Asynchronous streams must implement :meth:__aiter__.
Implementations may choose to support both, but it is not required.
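A minimal sync-only subclass might look like the sketch below; it assumes SensorReading can be constructed with t and data keyword arguments (its exact constructor is not shown on this page)::

    from dsg_jit.sensors.base import SensorReading
    from dsg_jit.sensors.streams import BaseSensorStream

    class ListRangeStream(BaseSensorStream):
        """Hypothetical sync-only stream replaying a fixed list of ranges."""

        def __init__(self, values):
            self._values = list(values)
            self._t = 0

        def read(self):
            # Return the next reading, or None once the list is exhausted.
            if self._t >= len(self._values):
                return None
            reading = SensorReading(t=self._t, data=self._values[self._t])
            self._t += 1
            return reading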
__aiter__()
Asynchronous iteration interface. Async-only streams MUST implement
this. Sync-only streams may raise NotImplementedError.
Source code in dsg-jit/dsg_jit/sensors/streams.py
close()
Optional cleanup hook.
Streams that hold system resources (files, sockets, hardware handles) should override this to release them.
Source code in dsg-jit/dsg_jit/sensors/streams.py
read()
Return the next sample from the stream, or None if no sample
is currently available.
Sync-only streams MUST implement this. Async-only streams may raise
NotImplementedError.
Source code in dsg-jit/dsg_jit/sensors/streams.py
FileRangeStream(path)
Bases: ReadingStream
Synchronous stream backed by a plain-text file.
Each line in the file is interpreted as a single floating-point range
measurement. The line index is used as the time step t.
:param path: Path to a text file containing one numeric value per line.
:type path: str
Initialize the file-backed range stream.
:param path: Path to a text file containing one numeric value per line.
:type path: str
Source code in dsg-jit/dsg_jit/sensors/streams.py
__iter__()
Yield sensor readings from the underlying file.
Each yielded reading has t equal to the zero-based line index and
data equal to the parsed floating-point value.
:return: Iterator over :class:SensorReading objects.
:rtype: Iterator[SensorReading]
Source code in dsg-jit/dsg_jit/sensors/streams.py
FunctionStream(fn)
Bases: ReadingStream
Synchronous stream that pulls samples from a user-provided callback function instead of a file or iterator.
The callback must return either
- a SensorReading
- None (to indicate end of data)
Source code in dsg-jit/dsg_jit/sensors/streams.py
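A minimal sketch, assuming the callback takes no arguments and that SensorReading accepts t and data keyword arguments; the values are illustrative::

    import itertools

    from dsg_jit.sensors.base import SensorReading
    from dsg_jit.sensors.streams import FunctionStream

    values = iter([1.2, 1.3, 1.1])   # pretend sensor data
    ticks = itertools.count()

    def next_reading():
        try:
            value = next(values)
        except StopIteration:
            return None              # signals end of data to the stream
        return SensorReading(t=next(ticks), data=value)

    stream = FunctionStream(next_reading)
    while (reading := stream.read()) is not None:
        print(reading.t, reading.data)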
ReadingStream()
Bases: BaseSensorStream
Synchronous stream of :class:SensorReading objects.
Concrete subclasses implement :meth:__iter__ to yield readings one by one.
Typical usage::

    stream = FileRangeStream("ranges.txt")
    for reading in stream:
        process(reading)
Initialize a new reading stream.
Subclasses typically only need to override :meth:__iter__. This base
constructor sets up internal state for :meth:read.
Source code in dsg-jit/dsg_jit/sensors/streams.py
__iter__()
Iterate over sensor readings.
:return: Iterator over :class:SensorReading objects.
:rtype: Iterator[SensorReading]
Source code in dsg-jit/dsg_jit/sensors/streams.py
read()
Return the next sensor reading from the stream, or None if the
underlying iterator is exhausted.
This method adapts the iterator protocol (__iter__) to the
:class:BaseSensorStream synchronous read API, so that
:class:~sensors.fusion.SensorFusionManager and other callers can
treat all synchronous streams uniformly.
:return: Next sensor reading, or None if no further data is
available.
:rtype: Optional[SensorReading]
Source code in dsg-jit/dsg_jit/sensors/streams.py
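A minimal polling sketch in this style: read() returns None when a stream has no more data, so a fusion-manager-like caller can poll several synchronous streams uniformly (the file names are illustrative)::

    from dsg_jit.sensors.streams import FileRangeStream

    streams = {
        "range_front": FileRangeStream("ranges_front.txt"),
        "range_rear": FileRangeStream("ranges_rear.txt"),
    }

    active = set(streams)
    while active:
        for name in list(active):
            reading = streams[name].read()
            if reading is None:
                active.discard(name)   # this stream is exhausted
                continue
            print(name, reading.t, reading.data)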