# 4 x 4 Transform matrix for static laser triangulation sensor to part on rotary

Hi all,

I’m sure this has been answered before but struggling to get a simple answer.

Suppose I have the following setup

• A part that I want to scan in 3D on a rotary stage with an encoder output
• A static laser triangulation sensor with an encoder input

Here is an image of a typical laser sensor from Cognex, but let's assume the sensor used is an off-the-shelf, calibrated unit.

Instead of moving the part linearly under the sensor I want to keep the sensor fixed and move the part to be scanned on a rotary stage which will have an encoder output.

For each point in the point cloud I need to get x, y & z

z - I get from the sensor
x - not sure how the rotary angles map to give me x
y - not sure how the rotary angles map to give me y

Is this just a basic trigonometry problem, or is it a little bit more complicated?

To keep the math simple, let's assume I take 360 line scans, with the rotary moving in 1-degree increments between scans. Each line scan is triggered by the rotary encoder output.

What is the 4x4 transformation matrix I need to apply to each line scan, taking into account the rotary angle for that scan, in order to create a point cloud from this configuration?

yes.

yes.

I’m gonna say no because I don’t understand what you’re asking with the last two lines. this isn’t just “some angles”. you’ll want to work with 4x4 transformation matrices.

your scanner projects a plane into space. it’s giving you points on that plane.

ideally the axis of rotation of your part lies on that plane, i.e. the intersection of the plane and the axis is the axis.

however, practically, nothing is that perfect. that plane needn’t intersect with the axis of rotation of your part. it may be offset but parallel to the axis (intersection is empty). or it may not be parallel, i.e. the intersection is a point.

you need to model these situations and account for them, i.e. come up with ways to measure the situation.

assuming stuff is perfect, you’d take the scanner’s points and the turntable’s rotation, and rotate those points (because rotating the object is identical to rotating the scanner), and insert them into your world model.
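A minimal sketch of that ideal case in numpy, assuming the rotation axis passes through the origin and is aligned with z, and that the scanner reports (lateral x, y = 0, height z) in its own frame (the scan-line values and the sign convention for theta are made up for illustration):

```python
import numpy as np

def rot_z(theta):
    """Rotation about the z axis (assumed here to be the turntable axis)."""
    c, s = np.cos(theta), np.sin(theta)
    return np.array([[c, -s, 0.0],
                     [s,  c, 0.0],
                     [0.0, 0.0, 1.0]])

# one fake scan line in the scanner frame: lateral x, y = 0 (the laser plane), measured height z
x = np.linspace(-5.0, 5.0, 11)
line = np.column_stack([x, np.zeros_like(x), np.full_like(x, 2.0)])

# accumulate the cloud: one scan line per 1-degree turntable step;
# rotating the points by theta stands in for the object having rotated
cloud = np.vstack([line @ rot_z(np.deg2rad(step)).T for step in range(360)])
```

Whether the points get rotated by +theta or -theta depends on your encoder's direction convention; that is exactly the kind of sign you verify against the physical setup.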

if the scanner’s plane is parallel but offset (no intersection), you’d get a cylinder around the rotation axis that you can’t scan. if the intersection is a point, you get a cone around the axis that you can’t scan.

to get an intuition for the situation, mount the scanner at an odd angle in relation to the turntable. that makes all the imperfections obvious, and it makes any code you write, and the bugs, trivial to see.

In my head, this is how I thought it should work for each of these scenarios.

Let the scanner coordinate frame system be the base frame.

Scenario 1

Let's assume the laser sensor is perpendicular to the rotary stage.

In this case, do we just have a rotation around z by the amount the rotary has moved (let's call it theta)? Is this correct?

```cpp
Mat R_z = (Mat_<double>(3, 3) <<
    cos(theta), -sin(theta), 0,
    sin(theta),  cos(theta), 0,
    0,           0,          1);
```

Scenario 2

Let's assume the laser sensor is NOT perpendicular to the rotary stage, and is instead at something like 30 degrees, so that we can see the top and the side of the part in one scan line.

In this case, do we still just have a rotation around z by the amount the rotary has moved (let's call it theta), or are there more rotations because the sensor is at an angle?

don’t bother literally writing rotation matrices here (I will not check if the signs in there are in the right places). you should implement that as a function that takes an axis and an angle, employs cv.Rodrigues, and spits out a matrix, and that’s it.

if you want the scanner to be the world frame, and the rotation axis is at an angle to the scanner’s z-axis, then your object/points of course rotate around that skewed axis instead of just z.
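And if that axis does not pass through the origin, the standard trick is to conjugate the rotation with a translation: move a point on the axis to the origin, rotate, move back. A hedged numpy sketch (the helper name and the use of Rodrigues' formula directly are my choices, not from the thread):

```python
import numpy as np

def rot_about_axis(point, direction, theta):
    """4x4 transform: rotate by theta about the line through `point` along `direction`."""
    d = np.asarray(direction, dtype=float)
    d = d / np.linalg.norm(d)
    # Rodrigues' formula: R = I + sin(t) K + (1 - cos(t)) K^2, with K the cross-product matrix of d
    K = np.array([[0.0, -d[2], d[1]],
                  [d[2], 0.0, -d[0]],
                  [-d[1], d[0], 0.0]])
    R = np.eye(3) + np.sin(theta) * K + (1.0 - np.cos(theta)) * (K @ K)
    T = np.eye(4)
    T[:3, :3] = R
    T[:3, 3] = point - R @ point  # translation part that keeps every point on the axis fixed
    return T
```

Points on the axis stay put; everything else sweeps a circle around it, which is exactly the offset/skewed-axis behavior described above.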

your questions are too low level yet lacking detail to be useful to you. I’m uncomfortable answering those because it takes effort to figure out what you’re asking, and it takes effort to figure out how to formulate an answer that can’t be misunderstood.

ok - let me work on formulating better / more targeted questions as to what I am trying to do

I think it’d help you understand if you got some way (library) to visualize things in 3D. drawing lines (…cylinders…) in 3D. then set up some of those matrices.

matplotlib may be enough to get something started but it may need some doing to get the 3d view to scale its axes correctly (i.e. equally). or look for some support for OpenGL… don’t bother with the “modern” OpenGL, it’s insane. you would want the good old (“inefficient”) immediate style API. I also still mean to evaluate Open3D
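For the matplotlib route, the axis-scaling issue mentioned above can be handled with `set_box_aspect` on a 3D axes. A minimal sketch (the geometry drawn is arbitrary, and the `Agg` backend is only there so it runs headless):

```python
import numpy as np
import matplotlib
matplotlib.use("Agg")  # headless; drop this line to get an interactive window
import matplotlib.pyplot as plt

fig = plt.figure()
ax = fig.add_subplot(projection="3d")

# arbitrary geometry: a vertical "rotation axis" and a ring of points around it
t = np.linspace(0, 2 * np.pi, 60)
ax.plot([0, 0], [0, 0], [0, 10])
ax.plot(5 * np.cos(t), 5 * np.sin(t), np.full_like(t, 3.0))

# scale the three axes equally so circles render as circles, not ellipses
lims = np.array([ax.get_xlim3d(), ax.get_ylim3d(), ax.get_zlim3d()])
ax.set_box_aspect(np.ptp(lims, axis=1))
```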

have a “model” of the scanner, i.e. a few random fixed points in the scanner’s frame.

have a “model” of the turntable, i.e. a few random fixed points in the turntable’s frame. make it spin.

see if you can transform from one frame into another.
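A toy version of that exercise, with made-up numbers for the turntable's pose in the scanner frame (`T_scanner_table` maps turntable-frame coordinates into the scanner frame):

```python
import numpy as np

def hom(R, t):
    """Pack a 3x3 rotation and a translation into a 4x4 homogeneous transform."""
    T = np.eye(4)
    T[:3, :3] = R
    T[:3, 3] = t
    return T

def rot_z(theta):
    c, s = np.cos(theta), np.sin(theta)
    return np.array([[c, -s, 0.0], [s, c, 0.0], [0.0, 0.0, 1.0]])

# a few fixed "model" points, expressed in the turntable frame
pts_table = np.array([[1, 0, 0], [0, 1, 0], [0, 0, 1], [1, 1, 0.5]], dtype=float)

# assumed (invented) fixed pose of the turntable as seen from the scanner
T_scanner_table = hom(rot_z(np.deg2rad(30)), np.array([0.2, -0.1, 0.5]))

# spin the table by 90 degrees, then map the points into the scanner frame
spin = hom(rot_z(np.deg2rad(90)), np.zeros(3))
ph = np.hstack([pts_table, np.ones((len(pts_table), 1))])  # homogeneous coordinates
pts_scanner = (T_scanner_table @ spin @ ph.T).T[:, :3]
```

One easy sanity check while playing with this: rigid transforms preserve pairwise distances, so any distance between two model points must be the same in both frames.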

it’d also probably help to grab some camera and play with aruco markers… you can put them in your physical scene and “measure” things.

I’d recommend a separate world origin/frame, say for the table this is all mounted on. that separates the turning table from the upside-down (and whatnot) positioned scanner.
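With a separate world frame, each device gets its own pose, and moving between frames is just composing (and inverting) 4x4s. A sketch with invented poses (the diagonal rotation stands in for an upside-down scanner mount):

```python
import numpy as np

def hom(R, t):
    T = np.eye(4)
    T[:3, :3] = R
    T[:3, 3] = t
    return T

# invented poses of the scanner and the turntable in a shared world frame
T_world_scanner = hom(np.diag([1.0, -1.0, -1.0]), np.array([0.0, 0.3, 1.2]))
T_world_table = hom(np.eye(3), np.array([0.5, 0.0, 0.0]))

# a point measured in the scanner frame, re-expressed in the turntable frame
T_table_scanner = np.linalg.inv(T_world_table) @ T_world_scanner
p_scanner = np.array([0.1, 0.0, 0.4, 1.0])
p_table = T_table_scanner @ p_scanner
```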

All good feedback - not easy trying to find an easy-to-use package to visualize all this. I'll take a look at some of your suggestions. Thanks

I haven’t found anything neat yet either. all the search results are “how to build your own FUCKING GAME ENGINE” or “we’re very modern and object-oriented, read our autogenerated docubarf” or “we are a game engine, and everything you’ll ever need to do can be modeled as a video game”

I am disgusted by the state of things. maybe 8 years ago I used “vpython”. now they need you to run that stuff in a web browser… because why not do a web server that sends commands to a browser that does the webgl… from your python script. insane.

if I find something that doesn’t make me homicidal, I hope I’ll remember the thread and report as well.