Machine Vision  v1.2.13
Visual-Inertial Simultaneous Localization and Mapping (VISLAM)

Classes

struct  mvVISLAMPose
 Pose information along with a quality indicator for VISLAM. More...
 
struct  mvVISLAMMapPoint
 Map point information from VISLAM. More...
 

Functions

mvVISLAM * mvVISLAM_Initialize (const mvCameraConfiguration *camera, float32_t readoutTime, const float32_t *tbc, const float32_t *ombc, float32_t delta, const float32_t *std0Tbc, const float32_t *std0Ombc, float32_t std0Delta, float32_t accelMeasRange, float32_t gyroMeasRange, float32_t stdAccelMeasNoise, float32_t stdGyroMeasNoise, float32_t stdCamNoise, float32_t minStdPixelNoise, float32_t failHighPixelNoiseScaleFactor, float32_t logDepthBootstrap, bool useLogCameraHeight, float32_t logCameraHeightBootstrap, bool noInitWhenMoving, float32_t limitedIMUbWtrigger, const char *staticMaskFileName, float32_t gpsImuTimeAlignment, const float32_t *tba, bool mapping)
 Initialize VISLAM object. A few parameters can significantly impact the performance of the VISLAM algorithm; in particular, some affect the initial convergence of VISLAM, which determines the overall drift in the estimated pose. Pay particular attention to the following parameters: logDepthBootstrap, useLogCameraHeight, logCameraHeightBootstrap, and limitedIMUbWtrigger. More...
 
void mvVISLAM_Deinitialize (mvVISLAM *pObj)
 De-initialize VISLAM object. More...
 
void mvVISLAM_AddImage (mvVISLAM *pObj, int64_t time, const uint8_t *pxls)
 Add a camera frame to the VISLAM object and trigger processing (a frame update) on the newly added image, using any IMU samples already added, including those with timestamps after the image, to propagate the pose forward to the most recent IMU sample.
NOTE: All other sensor data timestamped before this image must be added before calling this function; otherwise that older data will be dropped at the next call of this function. More...
 
void mvVISLAM_AddAccel (mvVISLAM *pObj, int64_t time, float64_t x, float64_t y, float64_t z)
 Pass Accelerometer data to the VISLAM object. More...
 
void mvVISLAM_AddGyro (mvVISLAM *pObj, int64_t time, float64_t x, float64_t y, float64_t z)
 Pass Gyroscope data to the VISLAM object. More...
 
void mvVISLAM_AddGPSvelocity (mvVISLAM *pObj, int64_t time, float64_t velocityEast, float64_t velocityNorth, float64_t velocityUP, float64_t measCovVelocity[3][3], uint16_t solutionInfo)
 Pass GPS velocity data to the VISLAM object. More...
 
void mvVISLAM_AddGPStimeSync (mvVISLAM *pObj, int64_t time, int64_t bias, int64_t gpsTimeStdDev)
 Pass GPS time bias data to the VISLAM object. More...
 
mvVISLAMPose mvVISLAM_GetPose (mvVISLAM *pObj)
 Grab last computed pose. More...
 
int mvVISLAM_HasUpdatedPointCloud (mvVISLAM *pObj)
 Inquire if VISLAM has new map points. More...
 
int mvVISLAM_GetPointCloud (mvVISLAM *pObj, mvVISLAMMapPoint *pPoints, uint32_t maxPoints)
 Grab point cloud. More...
 
void mvVISLAM_Reset (mvVISLAM *pObj, bool resetPose)
 Resets the EKF from an external source. The EKF will try to reinitialize on the subsequent camera frame. To initialize properly after a reset, the device should not be rotating or moving significantly, and the camera should see 10 or more features. More...
 

Detailed Description

Overview

VISLAM provides 6-DOF localization and pose estimation for various applications. It has been tuned for robot use cases in particular.

In addition to the initialization parameters, there are other things to consider when attempting to get the best possible performance out of VISLAM. A good camera calibration performed specifically for the camera in use, rather than the default calibration provided in the examples, can significantly reduce the overall odometry drift.

Furthermore, rich motion just after VISLAM starts can accelerate state-space convergence and lead to lower drift. For drone applications, rolling/pitching and high linear accelerations are good types of motion for better convergence.

The spatial frame (s) used herein might also be called the world frame or even the initialization frame, since it is defined by where the filter fully initializes and stabilizes. The gravity-aligned spatial frame (s') might also be called the inertial frame if its x-axis were aligned to North. The body frame (b) is centered on the accelerometer, which may be slightly different from the IMU or gyro. The camera frame (c) is centered on the camera.
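
The typical integration pattern is to stream IMU samples continuously and add each camera frame as it arrives, then read back the pose. The sketch below illustrates that flow; the header name, the callback structure, and the variable names are illustrative assumptions, not part of the API (see mvVISLAM_Initialize below for creating the object).

    // Illustrative processing loop (sketch). Assumes "vislam" was returned by
    // mvVISLAM_Initialize; the header name and callback structure are assumptions.
    #include <cstdint>
    #include "mvVISLAM.h"

    void onImuSample(mvVISLAM* vislam, int64_t timeNs,
                     float ax, float ay, float az,   // accelerometer, m/s^2
                     float gx, float gy, float gz)   // gyro, rad/s
    {
        mvVISLAM_AddAccel(vislam, timeNs, ax, ay, az);
        mvVISLAM_AddGyro(vislam, timeNs, gx, gy, gz);
    }

    void onCameraFrame(mvVISLAM* vislam, int64_t centerOfExposureNs, const uint8_t* vgaGray)
    {
        // All IMU samples with earlier timestamps must already have been added.
        mvVISLAM_AddImage(vislam, centerOfExposureNs, vgaGray);
        mvVISLAMPose pose = mvVISLAM_GetPose(vislam);   // pose after this frame update
        // ... use pose ...
    }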

Limitations

The following are some of the known limitations:

Function Documentation

◆ mvVISLAM_AddAccel()

void mvVISLAM_AddAccel ( mvVISLAM * pObj,
int64_t  time,
float64_t  x,
float64_t  y,
float64_t  z 
)

Pass Accelerometer data to the VISLAM object.

Parameters
pObj    Pointer to VISLAM object.
time    Time stamp of data (in nanoseconds) in system time.
x    Accelerometer data for X axis in \(m/s^2\).
y    Accelerometer data for Y axis in \(m/s^2\).
z    Accelerometer data for Z axis in \(m/s^2\).
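
A minimal sketch of forwarding one accelerometer sample, assuming (hypothetically) that the driver reports the sample in units of g with a microsecond timestamp, so both need converting:

    // Hypothetical driver units: g and microseconds; convert to m/s^2 and nanoseconds.
    const double kGravity = 9.80665;        // m/s^2 per g
    int64_t timeNs = sampleTimeUs * 1000;   // microseconds -> nanoseconds
    mvVISLAM_AddAccel(vislam, timeNs,
                      rawX * kGravity, rawY * kGravity, rawZ * kGravity);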

◆ mvVISLAM_AddGPStimeSync()

void mvVISLAM_AddGPStimeSync ( mvVISLAM * pObj,
int64_t  time,
int64_t  bias,
int64_t  gpsTimeStdDev 
)

Pass GPS time bias data to the VISLAM object.

Parameters
pObj    Pointer to VISLAM object.
time    Time stamp of data in system time in nanoseconds.
bias    Time bias/offset (time bias/offset = GPS time - system time) in nanoseconds.
gpsTimeStdDev    GPS time uncertainty (if available, otherwise set to -1).
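
A minimal sketch of reporting the clock offset, assuming gpsTimeNs and systemTimeNs are illustrative timestamps of the same event on the two clocks and that the receiver does not report a time uncertainty:

    // bias = GPS time - system time, as defined above.
    int64_t biasNs = gpsTimeNs - systemTimeNs;
    mvVISLAM_AddGPStimeSync(vislam, systemTimeNs, biasNs,
                            -1 /* GPS time uncertainty not available */);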

◆ mvVISLAM_AddGPSvelocity()

void mvVISLAM_AddGPSvelocity ( mvVISLAM * pObj,
int64_t  time,
float64_t  velocityEast,
float64_t  velocityNorth,
float64_t  velocityUP,
float64_t  measCovVelocity[3][3],
uint16_t  solutionInfo 
)

Pass GPS velocity data to the VISLAM object.

Parameters
pObj    Pointer to VISLAM object.
time    Time stamp of data in GPS time in nanoseconds.
velocityEast    GPS velocity data in East direction in \(m/s\).
velocityNorth    GPS velocity data in North direction in \(m/s\).
velocityUP    GPS velocity data in Up direction in \(m/s\).
measCovVelocity    GPS velocity measurement error covariance, fields in \((m/s)^2\).
solutionInfo    Fix type/quality; the last 3 bits being '100' (binary) indicates a good message (if not available, set to 4).
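
A minimal sketch of passing an ENU velocity fix, assuming (hypothetically) that the receiver reports only per-axis speed accuracies, so the covariance is built as a diagonal matrix from those standard deviations:

    // Diagonal velocity covariance built from per-axis standard deviations (m/s).
    float64_t cov[3][3] = {};
    cov[0][0] = sigmaEast  * sigmaEast;   // (m/s)^2
    cov[1][1] = sigmaNorth * sigmaNorth;
    cov[2][2] = sigmaUp    * sigmaUp;
    mvVISLAM_AddGPSvelocity(vislam, gpsTimeNs, vEast, vNorth, vUp, cov,
                            4 /* fix quality not available: per docs, pass 4 */);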

◆ mvVISLAM_AddGyro()

void mvVISLAM_AddGyro ( mvVISLAM * pObj,
int64_t  time,
float64_t  x,
float64_t  y,
float64_t  z 
)

Pass Gyroscope data to the VISLAM object.

Parameters
pObj    Pointer to VISLAM object.
time    Time stamp of data in nanoseconds in system time.
x    Gyro data for X axis in \(rad/s\).
y    Gyro data for Y axis in \(rad/s\).
z    Gyro data for Z axis in \(rad/s\).
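
A minimal sketch of forwarding one gyro sample, assuming (hypothetically) that the driver reports angular rate in degrees per second:

    // Hypothetical driver units: degrees per second; convert to rad/s.
    const double kDegToRad = 3.14159265358979323846 / 180.0;
    mvVISLAM_AddGyro(vislam, timeNs,
                     rateXDeg * kDegToRad,
                     rateYDeg * kDegToRad,
                     rateZDeg * kDegToRad);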

◆ mvVISLAM_AddImage()

void mvVISLAM_AddImage ( mvVISLAM * pObj,
int64_t  time,
const uint8_t *  pxls 
)

Add a camera frame to the VISLAM object and trigger processing (a frame update) on the newly added image, using any IMU samples already added, including those with timestamps after the image, to propagate the pose forward to the most recent IMU sample.
NOTE: All other sensor data timestamped before this image must be added before calling this function; otherwise that older data will be dropped at the next call of this function.

Parameters
pObj    Pointer to VISLAM object.
time    Time stamp of camera frame in nanoseconds in system time. Time must be center of exposure time, not start of frame or end of frame.
pxls    Pointer to camera frame 8-bit grayscale luminance data (VGA).
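
A minimal sketch of timestamping and adding one frame. The center-of-exposure computation is an illustrative assumption (startOfExposureNs and exposureNs are hypothetical values from the camera driver):

    // Center-of-exposure timestamp for the frame (illustrative computation).
    int64_t centerOfExposureNs = startOfExposureNs + exposureNs / 2;
    mvVISLAM_AddImage(vislam, centerOfExposureNs, frameData);  // frameData: 640x480, 8-bit grayscale
    mvVISLAMPose pose = mvVISLAM_GetPose(vislam);              // pose updated by this frame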

◆ mvVISLAM_Deinitialize()

void mvVISLAM_Deinitialize ( mvVISLAM * pObj)

De-initialize VISLAM object.

Parameters
pObj    Pointer to VISLAM object.

◆ mvVISLAM_GetPointCloud()

int mvVISLAM_GetPointCloud ( mvVISLAM * pObj,
mvVISLAMMapPoint * pPoints,
uint32_t  maxPoints 
)

Grab point cloud.

Parameters
pObj    Pointer to VISLAM object.
pPoints    Preallocated array of mvVISLAMMapPoint structures to be filled in by VISLAM with the current map points.
maxPoints    Maximum number of points requested. Should match the allocated size of pPoints.
Returns
Number of points filled into the pPoints array.
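
A minimal sketch of retrieving the point cloud into a caller-owned buffer, gated on mvVISLAM_HasUpdatedPointCloud (documented below); the capacity of 4096 points is an arbitrary illustrative choice:

    #include <vector>

    const uint32_t kMaxPoints = 4096;                       // caller-chosen capacity
    std::vector<mvVISLAMMapPoint> points(kMaxPoints);
    if (mvVISLAM_HasUpdatedPointCloud(vislam) > 0) {
        int numPoints = mvVISLAM_GetPointCloud(vislam, points.data(), kMaxPoints);
        // points[0 .. numPoints-1] now hold the current map point estimates.
    }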

◆ mvVISLAM_GetPose()

mvVISLAMPose mvVISLAM_GetPose ( mvVISLAM * pObj)

Grab last computed pose.

Parameters
pObj    Pointer to VISLAM object.
Returns
Computed pose from previous frame and IMU data.
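
A minimal sketch of reading the latest pose after a frame update and checking its quality. The poseQuality field and the MV_TRACKING_STATE_LOW_QUALITY value are the ones referenced in the mvVISLAM_Initialize notes on this page; using them as a gate here is otherwise an assumption:

    mvVISLAMPose pose = mvVISLAM_GetPose(vislam);
    if (pose.poseQuality == MV_TRACKING_STATE_LOW_QUALITY) {
        // Tracking has degraded (e.g., heavy vibration); consider improving
        // mechanical dampening or resetting the filter (see mvVISLAM_Reset).
    }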

◆ mvVISLAM_HasUpdatedPointCloud()

int mvVISLAM_HasUpdatedPointCloud ( mvVISLAM * pObj)

Inquire if VISLAM has new map points.

Parameters
pObj    Pointer to VISLAM object.
Returns
Number of map points currently being observed and estimated.

◆ mvVISLAM_Initialize()

mvVISLAM* mvVISLAM_Initialize ( const mvCameraConfiguration * camera,
float32_t  readoutTime,
const float32_t *  tbc,
const float32_t *  ombc,
float32_t  delta,
const float32_t *  std0Tbc,
const float32_t *  std0Ombc,
float32_t  std0Delta,
float32_t  accelMeasRange,
float32_t  gyroMeasRange,
float32_t  stdAccelMeasNoise,
float32_t  stdGyroMeasNoise,
float32_t  stdCamNoise,
float32_t  minStdPixelNoise,
float32_t  failHighPixelNoiseScaleFactor,
float32_t  logDepthBootstrap,
bool  useLogCameraHeight,
float32_t  logCameraHeightBootstrap,
bool  noInitWhenMoving,
float32_t  limitedIMUbWtrigger,
const char *  staticMaskFileName,
float32_t  gpsImuTimeAlignment,
const float32_t *  tba,
bool  mapping 
)

Initialize VISLAM object. A few parameters can significantly impact the performance of the VISLAM algorithm; in particular, some affect the initial convergence of VISLAM, which determines the overall drift in the estimated pose. Pay particular attention to the following parameters: logDepthBootstrap, useLogCameraHeight, logCameraHeightBootstrap, and limitedIMUbWtrigger.

Parameters
camera    Pointer to camera intrinsic calibration parameters.
readoutTime    Frame readout time (seconds), i.e., n (the number of rows) times the row readout time. Set to 0 for a global-shutter camera. For a rolling-shutter camera, the frame readout time should be close to, but smaller than, the camera frame period.
tbc    Pointer to accelerometer-camera translation misalignment vector (meters). \(T_{bc}\) is the translation of the origin of the camera (c) frame relative to that of the body (b) or accelerometer frame, expressed in the body frame. The \(T_{bc}\) setting can be verified by checking the estimated \(T_{bc}\).
ombc    Pointer to accelerometer-camera rotation misalignment vector (radians). \(\Omega_{bc}\) is the corresponding rotation in exponential coordinates. Together with \(T_{bc}\) it can be used to map a vector in the camera frame \(x_c\) to the IMU frame \(x_{imu}\) via \([R|T]\,x_c = x_{imu}\). The \(\Omega_{bc}\) setting can be verified by checking the estimated \(R_{bc}\) mapped to exponential coordinates.
delta    Camera-inertial time stamp misalignment (seconds). Ideally this is within about 1 ms of the true value. Delta can be verified by checking the estimated time alignment.
std0Tbc    Pointer to initial uncertainty in the accelerometer-camera translation vector (meters).
std0Ombc    Pointer to initial uncertainty in the accelerometer-camera orientation vector (radians).
std0Delta    Initial uncertainty in the time misalignment estimate (seconds).
accelMeasRange    Accelerometer sensor measurement range ( \(m/s^2\)).
gyroMeasRange    Gyro sensor measurement range ( \(rad/s\)).
stdAccelMeasNoise    Standard deviation of accelerometer measurement noise ( \(m/s^2\)).
stdGyroMeasNoise    Standard deviation of gyro measurement noise ( \(rad/s\)).
stdCamNoise    Standard deviation of camera noise per pixel.
minStdPixelNoise    Minimum standard deviation of feature measurement noise in pixels.
failHighPixelNoiseScaleFactor    Scales the measurement noise and compares it against the search area (i.e., is the search area large enough to reliably compute the measurement noise covariance matrix?).
logDepthBootstrap    Initial point depth [log(meters)], where log is the natural log. By default, initial depth is set to 1 m. However, if, e.g., a downward-facing camera on a drone is used and it can be assumed that feature depth at initialization is always about 4 cm, this parameter can be set to log(0.04), i.e., about -3.2. This will improve tracking during takeoff, accelerate state-space convergence, and lead to more accurate and robust pose estimates.
useLogCameraHeight    Use logCameraHeightBootstrap instead of logDepthBootstrap.
logCameraHeightBootstrap    Initializes point depth based on known geometry. Assumes (1) the camera points partially at the ground plane and (2) the board/IMU is aligned with gravity at start (i.e., the accelerometer measures roughly [0, 0, -9.8] in units of \(m/s^2\)). The required input is the camera height over ground in log(meters), where log is the natural log. Understanding when to use logDepthBootstrap versus logCameraHeightBootstrap, and how to set these values appropriately, can improve the initialization of VISLAM and has the potential to reduce the amount of odometry drift observed.
noInitWhenMoving    Initialize only when the device is stationary with respect to the surface (e.g., a drone before takeoff). The check is based on the camera, not on the IMU, so it supports a device resting on a moving surface.
limitedIMUbWtrigger    Prevents tracking failure during or right after a (hard) landing: if the sum of 3 consecutive accelerometer samples in any dimension, divided by 4.3, exceeds this threshold, the IMU measurement noise is increased (and resets become more likely).
NOTE: If the platform vibrates heavily during flight, this may trigger mid-flight; if poseQuality in mvVISLAMPose drops to MV_TRACKING_STATE_LOW_QUALITY during flight, improve mechanical dampening (and/or increase the threshold).
RECOMMENDED: \(150\,\mathrm{m/s^2} / 4.3 \approx 35\).
staticMaskFileName    1/4-resolution image (w.r.t. VGA), 160 x 120, PGM format. The part of the camera view for which pixels are set to 255 is blocked from feature detection; useful, e.g., to avoid detecting and tracking points on landing gear reaching into the camera view.
gpsImuTimeAlignment    GPS-inertial time stamp misalignment (seconds); negative if GPS time stamping is delayed relative to IMU time stamping. Ideally this is within about 1 ms of the true value.
tba    Pointer to accelerometer-GPS antenna translation misalignment vector/lever arm (meters). \(T_{ba}\) is the translation of the origin of the GPS antenna (a) frame relative to that of the body (b) or accelerometer frame, expressed in the body frame.
mapping    Flag to enable simple mapping: feature points that leave the camera view are stored; if they move back into the camera view and fall within the search area of the front-end tracker, they will be re-used.
Returns
Pointer to VISLAM object; returns NULL if failed.
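
A sketch of an initialization call is shown below. Every numeric value is illustrative only and must be replaced with values matching the actual camera/IMU setup; the camera configuration is assumed to have been filled in elsewhere, and passing a null static-mask file name to disable the mask is an assumption.

    // Illustrative extrinsics and uncertainties; replace with calibrated values.
    mvCameraConfiguration camConfig;   // assumed to be filled in with the camera calibration
    float32_t tbc[3]      = { 0.005f, 0.015f, 0.0f };    // camera origin in body frame (m)
    float32_t ombc[3]     = { 0.0f, 0.0f, 1.57f };       // body-camera rotation, exp. coords (rad)
    float32_t std0Tbc[3]  = { 0.005f, 0.005f, 0.005f };  // initial uncertainty of tbc (m)
    float32_t std0Ombc[3] = { 0.04f, 0.04f, 0.04f };     // initial uncertainty of ombc (rad)
    float32_t tba[3]      = { 0.0f, 0.0f, 0.0f };        // GPS antenna lever arm (m)

    mvVISLAM* vislam = mvVISLAM_Initialize(
        &camConfig,
        0.0f,        // readoutTime: 0 for a global-shutter camera
        tbc, ombc,
        0.002f,      // delta: camera-inertial time misalignment (s)
        std0Tbc, std0Ombc,
        0.001f,      // std0Delta (s)
        156.0f,      // accelMeasRange (m/s^2)
        34.0f,       // gyroMeasRange (rad/s)
        0.316f,      // stdAccelMeasNoise (m/s^2)
        0.01f,       // stdGyroMeasNoise (rad/s)
        100.0f,      // stdCamNoise
        0.5f,        // minStdPixelNoise
        1.665f,      // failHighPixelNoiseScaleFactor
        0.0f,        // logDepthBootstrap: log(1 m) = 0
        false,       // useLogCameraHeight
        -3.2f,       // logCameraHeightBootstrap (unused when useLogCameraHeight is false)
        false,       // noInitWhenMoving
        35.0f,       // limitedIMUbWtrigger (see recommendation above)
        NULL,        // staticMaskFileName: assumed here that NULL disables the static mask
        0.0f,        // gpsImuTimeAlignment (s)
        tba,
        false);      // mapping disabled
    if (vislam == NULL) {
        // initialization failed
    }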

◆ mvVISLAM_Reset()

void mvVISLAM_Reset ( mvVISLAM * pObj,
bool  resetPose 
)

Resets the EKF from an external source. The EKF will try to reinitialize on the subsequent camera frame. To initialize properly after a reset, the device should not be rotating or moving significantly, and the camera should see 10 or more features.

Parameters
pObj    Pointer to VISLAM object.
resetPose    false: initializes with the last good pose after triggering the reset; true: initializes with a "zero" pose after triggering the reset.
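
A minimal sketch of triggering a reset when tracking quality degrades, keeping the last good pose as the starting point for re-initialization (the poseQuality check reuses the field referenced in the mvVISLAM_Initialize notes above):

    mvVISLAMPose pose = mvVISLAM_GetPose(vislam);
    if (pose.poseQuality == MV_TRACKING_STATE_LOW_QUALITY) {
        mvVISLAM_Reset(vislam, false);   // false: re-initialize from the last good pose
    }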