A Three-Dimensional Structured Light Vision System by Using a Combination of Single-Line and Three-Line Lasers

Qiucheng Sun *, Zeming Ren, Jinlong Zhu, Weiyu Dai, Mingze Wang and Mingyu Sun

College of Computer Science and Technology, Changchun Normal University, Changchun 130032, China
* Correspondence: sunqiucheng@mail.ccsfu.edu.cn

Abstract: A multi-line structured light measurement method that combines a single-line and a three-line laser, in which precision sliding rails and displacement measurement equipment are not required, is proposed in this paper. During the measurement, the single-line structured light is projected onto the surface of an object while the three-line structured light remains fixed. The single-line laser is moved so that it intersects the three-line laser, forming three intersection points. The single-line light plane can then be solved from the camera coordinates of these three intersection points, completing the real-time calibration of the scanning light plane. The single-line laser can therefore scan at any angle to capture the complete three-dimensional (3D) shape of the object. Experimental results show that this method overcomes the difficulty of obtaining information at certain angles and locations and can effectively recover the 3D shape of the object. The measurement system's repeatability error is under 0.16 mm, which is sufficient for measuring the 3D shapes of complicated workpieces.

 Keywords: multi-line structured light; three-dimensional measurement; visual measurement

1. Introduction

At present, 3D measurement is widely used in fields such as heritage restoration and machinery production [1]. 3D object surface contour measurement methods are generally divided into two types: contact and non-contact. Contact measurement methods, such as coordinate measuring machines [2], can basically meet the requirements of measurement accuracy, but they suffer from slow measurement speed and easy damage to the contact surface, which limit their practical application. Non-contact measurements comprise projected structured light, interferometry, ultrasonic measurement, and other techniques for applications in the field of quick measurement [3-5]. Structured light projection 3D measurement technology, for example, is easy to use and can provide accurate real-time measurements. Within it, laser projection 3D measurement is a main research direction, and it is gradually displacing contact measuring methods in a variety of industries, including car assembly [6], reverse engineering, and part quality inspection [7-9].

Currently, a line laser is mainly used in laser projection 3D measurements to scan and reconstruct the object under test [10-12]. The line structured light system emits a laser beam that forms a stripe projection on the surface of the object. The camera then captures the laser stripe and extracts its center. Finally, the system converts the light-stripe pixel information into physical information based on the laser plane equation and the camera parameters, and fits these 3D point clouds to obtain the 3D shape of the object [13-16].

To calibrate the structured light planes, Dewar [17] projected the laser onto fine non-coplanar threads mounted in 3D space, and then defined the structured light planes by obtaining images of the bright light points from the camera as control points. Huang [18]

presented a new calibration method for line structured light multi-vision sensors based on combined targets, in which each sensor is calibrated locally with high accuracy. For ball targets, the position of the target has no impact on the profile feature imaging. Building on a single ball target, Zhou [19] proposed a technique for the quick calibration of a line-structured light system. Xie [20] used a planar target and a raising block to complete the field calibration of a linear structured light sensor. The three-dimensional measurement method based on line-structured light can only measure the 3D contour of one cross-section of an object at a time. Therefore, motion devices are required to sweep the laser plane across the surface of the measured object to gather 3D data on the complete surface [21]. Translational and rotational scanning setups, built on slides and rotary tables, are commonly used to create such 3D measurement systems.
To estimate the size of large forgings in the hot state, Zhang [22] used high-precision guides in combination with line structured light. The experimental results revealed that the procedure is effective, with an error of less than 1 mm, which meets the accuracy criteria. For remote inspection of internal delamination in wind turbine blades, Hwang [23] presented a continuous line laser scanning thermography (CLLST) method. The test results showed that a 10 mm diameter internal delamination located 1 mm underneath the blade surface was successfully detected even 10 m from the target blade. These methods usually require the scanning direction to be strictly perpendicular to the structured light plane, a positional relationship that is difficult to achieve in practical measurements. Consequently, a number of improved methods have been proposed. Wu [24] designed and manufactured a calibration device based on perspective invariance and the projective-invariant cross-ratio. Experimental results with different test pieces demonstrated a minimum geometric accuracy of 18 µm and a repeatability of less than ±3 µm. Zeng [25] used a planar target to calibrate the translation vector, which removed the traditional inconvenience of calibrating the precise movement direction with high-precision auxiliary equipment. Lin [26] proposed a binocular stereo vision method based on camera translation direction cosine calibration to convert point clouds from local to global coordinates and thereby obtain the 3D data of the object under test. When the inner surface of an object, or complete information about its outside, must be measured, a series of methods applying rotary tables or rotating guides have been proposed. Cai [27] used a line laser in combination with a motor to measure the 3D shape of potatoes, which can help farmers analyze their phenotypic characteristics and grade them. Guo [28] used a high-precision air-floating rotary table and a line structured light sensor to rapidly obtain the 3D shape of gears. Liu [29] placed a mouse on a metal part on a motor-driven rotating platform and measured its 3D shape by modulating the average stripe width within a favorable range; measurements on different parts show that the method improves the integrity of the reconstructed surface. Wang [30] put a pipe on a turntable with a linear structured light measuring device in the middle of the pipe; the three-dimensional measurement of the inner wall was realized by rotating the pipe. In panoramic 3D shape measurement based
 on a turntable, the point cloud registration accuracy will be influenced to some extent by
 the calibration of the rotation axis. Therefore, some methods of calibrating the rotation
 axis have been proposed. Cai [31] proposed an auxiliary camera-based calibration method.
 The experimental results show that the method improves the calibration accuracy of the
rotation axis vector. Although all of the approaches above provide reasonable measurement precision, they depend heavily on the accuracy of the equipment, both in the experimental setup and during the scanning process. For example, the slide's sliding accuracy, the rotary stage's rotation accuracy, and the stepper motor's displacement accuracy all affect the final measurement results. Furthermore, in these methods the angle of the light-plane scan is fixed and cannot be adjusted during scanning, so the scan is easily affected by noise and surface folds, resulting in missing information in certain areas.

In order to lessen the inaccuracies caused by the equipment, several researchers have proposed handheld laser measurement systems. Winkelbach et al. [32] presented a low-cost system for 3D data acquisition and fast pairwise surface registration: the object is simply placed against a background plate carrying a priori information, and the laser is held in the hand to obtain the complete 3D shape of the object. However, this method often requires a background plate larger than the object being measured, which is not conducive to industrial field measurements.
This paper presents a multi-line structured light 3D measurement system combining single-line and three-line lasers. The procedure can be summarized as follows: first, a structured light system comprising a monocular camera and a three-line structured light is calibrated. Second, the pixel coordinates of the three intersection points generated when the single-line structured light intersects the three-line structured light during scanning are detected. Finally, the single-line light plane is calibrated in real time from the camera coordinates of the three intersection points, which are solved by the three-line structured light system. During the measurement, the single-line structured light can scan at any angle, ensuring that no information is lost at any particular location on the object. The difficulty of obtaining information at certain angles and locations when the laser travels on a fixed track is thus overcome, and the demand for equipment, especially precision mobile equipment, is reduced.
The organization of this paper is as follows. Section 2 describes the principle of the multi-line structured light measurement system. Section 3 introduces the calibration of the structured light system. Section 4 describes the 3D measurement method. Finally, the experimental results and conclusions are presented in Sections 5 and 6, respectively.

 2. The Principle of Multi-Line Structured Light Measurement System
 The measurement system designed in this paper consists of three main components:
 a laser emission module, an image acquisition module and an image processing module,
as shown in Figure 1. The laser emission module consists of a three-line laser and a
 single-line laser. The image acquisition module captures line laser streak images that
 are distorted due to height fluctuations on the surface of the object, mainly through a
 charge-coupled device (CCD) image sensor. The image processing module performs the
 subsequent processing of the image acquired by the CCD camera, for example, by obtaining
the pixel coordinate values of each light stripe center in the image. Given the spatial relationships among the world coordinate system, the camera coordinate system, the image coordinate system, and the pixel coordinate system, the coordinates of the object in the camera coordinate system can be obtained by coordinate transformation, enabling non-contact measurement. The coordinate relationships are shown in Figure 2.
 As shown in Figure 3, the measurement scheme of the system consists of two steps:
 system calibration and 3D measurement. The system calibration includes camera calibration
 and three-line light plane calibration. The camera is calibrated according to the acquired
 calibration plate image to obtain the internal parameters of the camera. Next, the calibration
 of the three-line light plane is completed by a flat target with a black square pattern. Then,
 the equation of the single-line light plane can be derived from the three intersections of
 the three-line light plane and the single-line light plane. During the 3D measurement, the
 centerline of the light-strip image of the object surface can be extracted using the Steger
 algorithm [33]. Then, the extracted pixel coordinates are converted to 3D coordinates in the
 camera coordinate system by combining the camera internal parameters and the equation
 of the single-line light plane to complete the 3D measurement of the object.
Figure 1. The diagram of the measurement system. (1) Computer, (2) Camera, (3) Single-line laser, (4) Three-line laser, (5) Object, (6) Experiment platform.

Figure 2. The geometric structure of the line structured light vision sensor. (1) Camera, (2) Single-line laser, (3) Laser projector, (4) Image plane, (5) Light stripe, (6) Experiment platform.

Figure 3. System measurement flow chart.

3. Calibration of Structured Light Systems

The calibration of the structured light system in this paper is divided into two parts. The first step is the camera calibration, to obtain the camera internal parameters; the second step is the three-line light plane calibration, to obtain the three-line light plane equations.

3.1. Camera Calibration
Figure 4 shows a schematic diagram of a typical coordinate system. $P$ is a random point in 3D space; the world coordinates of the point are $(X_w, Y_w, Z_w)$, the camera coordinates are $(X_c, Y_c, Z_c)$, $P_d$ is the actual projection of $P$ on the image, and $P_u$ is the ideal projection point. The image coordinates are $(x_d, y_d)$ and $(x_u, y_u)$, and $(u, v)$ are the pixel coordinates of $P$. The positional relationship [34] between the world coordinate system and the camera coordinate system can be denoted by

$$Z_c \begin{bmatrix} u \\ v \\ 1 \end{bmatrix} = \begin{bmatrix} \alpha & \gamma & u_0 & 0 \\ 0 & \beta & v_0 & 0 \\ 0 & 0 & 1 & 0 \end{bmatrix} \begin{bmatrix} R & t \\ \mathbf{0}^T & 1 \end{bmatrix} \begin{bmatrix} X_w \\ Y_w \\ Z_w \\ 1 \end{bmatrix} \tag{1}$$
In order to obtain more accurate camera parameters, a non-linear imaging model of the camera is used [35]. Taking into account the effect of radial and tangential aberrations, the actual projection coordinates are given by

$$\begin{cases} \delta_x(x_u, y_u) = x_u\left(k_1 \rho^2 + k_2 \rho^4\right) + 2 p_1 x_u y_u + p_2\left(3 x_u^2 + y_u^2\right) \\ \delta_y(x_u, y_u) = y_u\left(k_1 \rho^2 + k_2 \rho^4\right) + p_1\left(x_u^2 + 3 y_u^2\right) + 2 p_2 x_u y_u \end{cases} \tag{2}$$

where $\rho^2 = x_u^2 + y_u^2$, and $k_1, k_2, p_1, p_2$ are the radial and tangential distortion coefficients, respectively. This produces a nonlinear optimization function, which can be solved precisely by means of the Levenberg–Marquardt (L–M) algorithm. Finally, the intrinsic matrix

$$A = \begin{bmatrix} \alpha & \gamma & u_0 \\ 0 & \beta & v_0 \\ 0 & 0 & 1 \end{bmatrix}$$

and the distortion coefficient $K = (k_1, k_2, p_1, p_2)^T$ are obtained.

 Figure 4. Coordinate system of measurement system.
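For illustration, a calibration of this kind can be reproduced with OpenCV, whose calibrateCamera routine implements Zhang's method [34] and refines the intrinsic matrix A and the distortion coefficients K of Equation (2) with an L–M optimizer internally. The following is a minimal sketch only: the chessboard pattern size and image paths are assumed placeholders, not values from the paper's setup.

```python
# Minimal camera calibration sketch (Zhang's method via OpenCV).
import glob
import cv2
import numpy as np

pattern = (11, 8)   # inner corners per row/column (assumed)
square = 4.0        # checker size in mm (Section 5.1 uses 4 x 4 mm checkers)

# World coordinates of the target corners on the planar target (Z_w = 0).
obj = np.zeros((pattern[0] * pattern[1], 3), np.float32)
obj[:, :2] = np.mgrid[0:pattern[0], 0:pattern[1]].T.reshape(-1, 2) * square

obj_pts, img_pts, size = [], [], None
for path in glob.glob("calib/*.png"):         # hypothetical image folder
    gray = cv2.imread(path, cv2.IMREAD_GRAYSCALE)
    size = gray.shape[::-1]                   # (width, height)
    found, corners = cv2.findChessboardCorners(gray, pattern)
    if found:
        # sub-pixel corner refinement improves the final reprojection error
        corners = cv2.cornerSubPix(
            gray, corners, (5, 5), (-1, -1),
            (cv2.TERM_CRITERIA_EPS + cv2.TERM_CRITERIA_MAX_ITER, 30, 1e-6))
        obj_pts.append(obj)
        img_pts.append(corners)

# calibrateCamera returns the mean reprojection error, A, and K = (k1, k2, p1, p2, ...)
rms, A, K, rvecs, tvecs = cv2.calibrateCamera(obj_pts, img_pts, size, None, None)
print("mean reprojection error (px):", rms)
print("intrinsic matrix A:\n", A)
print("distortion coefficients K:", K.ravel())
```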

3.2. The Principle of Three-Line Light Plane Calibration

The three-line structured light's calibration precision has a direct impact on the measurement accuracy, because it is used to position the scanning light plane in real time.

In this paper, an intersection-style multi-line structured light system is used, in which the laser projects three light planes intersecting on the same line, and the angle between the light planes is a constant α, as shown in Figure 5.

Figure 5. Spatial geometric relationship of the structured light planes.
In order to obtain the light plane equations, this section consists of the following three main steps. The first step determines the external parameters of the flat target with a black square pattern, as shown in Figure 6. The second step uses the Steger method to obtain the coordinates of the center points of the light stripes on the target image; these center-point coordinates are then projected into the camera coordinate system using the internal parameters of the camera and the external parameters of the target, as shown in Figure 7. The third step uses these three sets of points as control points to determine the equations of the structured light planes, with the a priori position relationship of the light planes, namely the constant angle between them (α = 5°), as the constraint, as shown in Figure 8. The light plane equations are as follows:

$$\begin{cases} A_1 X + B_1 Y + C_1 Z + D_1 = 0 \\ A_2 X + B_2 Y + C_2 Z + D_2 = 0 \\ (A_1 + \mu A_2) X + (B_1 + \mu B_2) Y + (C_1 + \mu C_2) Z + (D_1 + \mu D_2) = 0 \end{cases} \tag{3}$$

where $A_i$, $B_i$, $C_i$, $D_i$ are the light plane equation coefficients and $\mu$ is a scale factor.

Figure 6. Three-line light plane calibration. (1) Three-line laser, (2) A flat target with black square pattern, (3) Corner point.

Figure 7. Three sets of 3-D points.
Figure 8. Three light planes in space.
A uniform objective function is established:

$$\Delta = \sum_{i=1}^{n_1} \left(A_1 x_{1i} + B_1 y_{1i} + C_1 z_{1i} + D_1\right)^2 + \sum_{i=1}^{n_2} \left(A_2 x_{2i} + B_2 y_{2i} + C_2 z_{2i} + D_2\right)^2 + \sum_{i=1}^{n_3} \left((A_1 + \mu A_2) x_{3i} + (B_1 + \mu B_2) y_{3i} + (C_1 + \mu C_2) z_{3i} + (D_1 + \mu D_2)\right)^2 \tag{4}$$

under the constraint

$$\begin{cases} \dfrac{(A_1, B_1, C_1)(A_2, B_2, C_2)^T}{\sqrt{A_1^2 + B_1^2 + C_1^2}\,\sqrt{A_2^2 + B_2^2 + C_2^2}} = \cos 2\alpha \\[2ex] \dfrac{(A_1, B_1, C_1)(A_1 + \mu A_2, B_1 + \mu B_2, C_1 + \mu C_2)^T}{\sqrt{A_1^2 + B_1^2 + C_1^2}\,\sqrt{(A_1 + \mu A_2)^2 + (B_1 + \mu B_2)^2 + (C_1 + \mu C_2)^2}} = \cos \alpha \\[2ex] \dfrac{(A_2, B_2, C_2)(A_1 + \mu A_2, B_1 + \mu B_2, C_1 + \mu C_2)^T}{\sqrt{A_2^2 + B_2^2 + C_2^2}\,\sqrt{(A_1 + \mu A_2)^2 + (B_1 + \mu B_2)^2 + (C_1 + \mu C_2)^2}} = \cos \alpha \end{cases} \tag{5}$$

which ensures that the angle between the light planes is constant. Here, $(x_{1i}, y_{1i}, z_{1i}),\ i = 1, \dots, n_1$, $(x_{2i}, y_{2i}, z_{2i}),\ i = 1, \dots, n_2$, and $(x_{3i}, y_{3i}, z_{3i}),\ i = 1, \dots, n_3$ represent the three sets of 3D points corresponding to the three planes. Finally, the plane coefficients in Equation (3) are solved with the L–M algorithm by minimizing Equation (4) under the constraint of Equation (5). In this way, the a priori geometric relation of the planes can be accurately transplanted into the camera coordinate frame.
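The paper does not detail the numerical implementation of this constrained fit. One possible sketch, under the assumption that the constraint of Equation (5) is folded into the residual vector as weighted penalty terms and that SciPy's least_squares serves as the L–M solver, is given below; the synthetic test data are illustrative only.

```python
# Sketch: joint fit of the three planes of Equation (3) by minimizing
# Equation (4) with Equation (5) enforced as weighted penalty residuals.
import numpy as np
from scipy.optimize import least_squares

def residuals(q, pts1, pts2, pts3, alpha, w=1e3):
    # q = (A1, B1, C1, D1, A2, B2, C2, D2, mu); the third plane is the
    # linear combination of the first two, exactly as in Equation (3).
    n1, D1 = q[0:3], q[3]
    n2, D2 = q[4:7], q[7]
    mu = q[8]
    n3, D3 = n1 + mu * n2, D1 + mu * D2
    cos = lambda a, b: a @ b / (np.linalg.norm(a) * np.linalg.norm(b))
    return np.concatenate([
        pts1 @ n1 + D1,                          # Equation (4): point-to-plane
        pts2 @ n2 + D2,
        pts3 @ n3 + D3,
        w * np.array([cos(n1, n2) - np.cos(2 * alpha),   # Equation (5) penalties
                      cos(n1, n3) - np.cos(alpha),
                      cos(n2, n3) - np.cos(alpha)]),
    ])

rng = np.random.default_rng(0)
def on_plane(n, d, k=100):
    # sample random points and project them onto the plane n.x + d = 0
    p = rng.uniform(-50.0, 50.0, (k, 3))
    return p - np.outer((p @ n + d) / (n @ n), n)

alpha = np.deg2rad(5.0)
n1 = np.array([-np.sin(alpha), 0.0, np.cos(alpha)])  # outer plane 1
n2 = np.array([ np.sin(alpha), 0.0, np.cos(alpha)])  # outer plane 2 (2*alpha apart)
pts1, pts2 = on_plane(n1, -700.0), on_plane(n2, -700.0)
pts3 = on_plane(n1 + n2, -1400.0)                    # middle plane (mu = 1)

q0 = np.concatenate([n1, [-700.0], n2, [-700.0], [1.0]])
sol = least_squares(residuals, q0 + rng.normal(0, 0.01, 9), method="lm",
                    args=(pts1, pts2, pts3, alpha))
```

The penalty weight w trades off the data term of Equation (4) against the angle constraint; a hard-constrained solver would be an alternative design choice.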

 4. Method of Three-Dimensional Measurement
 The measurement process in this paper is achieved by single-line structured light
 scanning. Therefore, real-time calibration of the single-line light plane is essential to the
 measurement process. The real-time calibration method is divided into three main steps:
 Step 1: Detect the pixel coordinates of the intersection points
 Since the single-line laser must remain intersected by the three-line laser during the
 measurement, three intersection points are generated. As shown in Figure 9, the centerline
 of the light stripe in the area near the intersection point is detected, and the coordinates of
 the intersection point are calculated by fitting the lines L1 and L2.
Figure 9. Detect the pixel coordinates of the intersection point.
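A minimal sketch of the intersection computation in Step 1 is given below, assuming the stripe center points near the crossing have already been extracted (e.g., by the Steger method [33]); the toy coordinates are illustrative.

```python
# Sketch of Step 1: intersect two stripe centrelines fitted near the crossing.
import numpy as np

def fit_line(pts):
    """Total-least-squares 2D line through pts: returns (n, c) with n . p = c."""
    centroid = pts.mean(axis=0)
    # the smallest singular vector of the centred points is the line normal
    _, _, vt = np.linalg.svd(pts - centroid)
    n = vt[-1]
    return n, n @ centroid

def intersect(line1, line2):
    """Solve the 2x2 system n1 . p = c1, n2 . p = c2 for the crossing pixel."""
    (n1, c1), (n2, c2) = line1, line2
    return np.linalg.solve(np.stack([n1, n2]), np.array([c1, c2]))

pts_a = np.array([[100.0, 200.0], [120.0, 210.0], [140.0, 220.0]])  # toy data
pts_b = np.array([[100.0, 240.0], [120.0, 225.0], [140.0, 210.0]])
p = intersect(fit_line(pts_a), fit_line(pts_b))
print("intersection (u, v):", p)
```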

 Step 2: Calculate the intersection points in camera coordinates
The three intersection points lie both in the single-line structured light plane and in the respective planes of the three-line structured light. Therefore, using the internal parameters of the camera and the three-line structured light plane equations, the three-dimensional coordinates of the three intersection points in the camera coordinate system can be solved.
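This solution amounts to intersecting the viewing ray of each (undistorted) intersection pixel with the corresponding calibrated plane. A minimal sketch, which uses the intrinsic matrix of Section 5.1 and the first plane of Equation (6) only as example inputs, could look as follows:

```python
# Sketch of Step 2: lift an (undistorted) intersection pixel to 3D camera
# coordinates by intersecting its viewing ray with a calibrated light plane.
import numpy as np

def pixel_to_camera(u, v, A_int, plane):
    """plane = (A, B, C, D) with A*X + B*Y + C*Z + D = 0 in camera coordinates."""
    ray = np.linalg.inv(A_int) @ np.array([u, v, 1.0])  # ray direction (Zc = 1)
    n, d = np.asarray(plane[:3]), plane[3]
    zc = -d / (n @ ray)              # scale at which the ray meets the plane
    return zc * ray                  # (Xc, Yc, Zc)

A_int = np.array([[4990.2635, 0.7790, 1929.2399],   # intrinsics (Section 5.1)
                  [0.0, 4993.9024, 1279.1607],
                  [0.0, 0.0, 1.0]])
plane1 = (0.922, -7.397, -1.429, -1000.0)           # first plane of Equation (6)
print(pixel_to_camera(2000.0, 1300.0, A_int, plane1))
```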
Step 3: Fitting the space plane equation
The space plane equation, which is the single-line light plane equation, can be fitted using the three-dimensional coordinates of the three intersection points, as illustrated in Figure 10.
Figure 10. Three points in space determine a plane. (1) Single-line laser, (2) Three-line laser, (3) Object, (4) Experiment platform.
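A minimal sketch of Step 3: the plane through the three intersection points follows from a single cross product. The toy coordinates below are illustrative; in practice the three points should be well separated so that the normal is not near-degenerate.

```python
# Sketch of Step 3: the single-line light plane from its three intersection
# points P1, P2, P3 (camera coordinates, obtained in Step 2).
import numpy as np

def plane_from_points(p1, p2, p3):
    """Return (A, B, C, D) with A*X + B*Y + C*Z + D = 0 through three points."""
    n = np.cross(p2 - p1, p3 - p1)        # plane normal
    return np.append(n, -n @ p1)          # D from any of the three points

p1 = np.array([10.0, 0.0, 700.0])          # toy intersection points
p2 = np.array([0.0, 15.0, 705.0])
p3 = np.array([-10.0, -5.0, 710.0])
print(plane_from_points(p1, p2, p3))
```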
After completing the real-time calibration, the feature coordinates in a specific plane of 3D space are obtained by the single-line structured light for a single frame. Multiple frames of data can then be registered in the same coordinate system to obtain the original point cloud of the object to be reconstructed, with the single-line laser scanning at any angle.
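Putting the three steps together, one frame of the scan could be processed as sketched below. The helper names (detect_intersections, extract_centerline, and the variables video_frames, A_int, planes) are hypothetical stand-ins for the operations described above, not the authors' implementation. Note that every frame is triangulated directly in the camera coordinate system, so the per-frame profiles already share one frame and registration reduces to accumulation.

```python
# Sketch of the real-time measurement loop (hypothetical helper names).
import numpy as np

cloud = []                                       # accumulated object points
for frame in video_frames:                       # assumed image source
    pix = detect_intersections(frame)            # Step 1: three (u, v) crossings
    P = [pixel_to_camera(u, v, A_int, pl)        # Step 2: lift to 3D
         for (u, v), pl in zip(pix, planes)]
    plane = plane_from_points(*P)                # Step 3: current light plane
    for u, v in extract_centerline(frame):       # Steger centre points [33]
        cloud.append(pixel_to_camera(u, v, A_int, plane))
cloud = np.asarray(cloud)    # all frames share the camera coordinate frame
```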

5. Experiment

In this paper, the need for precision sliding rails and other auxiliary equipment in the traditional method is eliminated, and the 3D information of the object can be obtained by simply irradiating two lasers onto the surface of the object to be measured. The measurement system is shown in Figure 11. The configuration used in this study is extremely simple. It comprises a three-line laser (Lasiris; Coherent, Inc., Santa Clara, CA, USA) emitting a Gaussian beam at a wavelength of 660 nm with an output power of 20 mW, a single-line laser emitting a Gaussian beam at a wavelength of 650 nm with an output power of 50 mW (Lei xin Yanxian-LXL65050), and an 8 mm lens camera (MER-1070-14U3M; DaHeng, Beijing, China). The single-line laser can be moved freely during the measurement, and can even be moved by hand. The experiments in this paper consist of camera calibration, three-line structured light calibration, and real-time measurement.

Figure 11. Measurement image of the face model.

5.1. Calibration of the Camera

As shown in Figure 12, a checkerboard target is used, and the size of each checker is 4 × 4 mm. Four sets of calibration experiments were carried out, with nine images acquired in each set. The overall mean pixel error of the calibrated images was selected as the basis for judging the camera calibration results.
Figure 12. Camera calibration.
Finally, a set of camera internal parameters and distortion parameters was obtained:

$$A = \begin{bmatrix} 4990.2635 & 0.77900164 & 1929.2399 \\ 0 & 4993.9024 & 1279.1607 \\ 0 & 0 & 1 \end{bmatrix}, \quad K = \begin{pmatrix} 0.0815 \\ -0.3809 \\ 0.0042 \\ -0.0004 \end{pmatrix}$$
5.2. Calibration of Three-Line Light Plane

Nine patterns of the planar target at different orientations in space are acquired by the camera, and the three-line laser needs to be projected onto the target synchronously, as shown in Figure 13. Using the method proposed in Section 3.2, the three structured light planes can be solved, and the equations of the planes are as follows:

$$\begin{cases} 0.922x - 7.397y - 1.429z - 1000 = 0 \\ 1.321x - 9.319y - 0.992z - 1000 = 0 \\ 1.986x - 12.52y - 0.263z - 1000 = 0 \end{cases} \tag{6}$$
5.3. Object Measurement in Real Time

In this experiment, a face mask model and a standard gauge block were measured separately.
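As a quick numerical sanity check of the Section 5.2 result before turning to the measurements, the angles between the three plane normals of Equation (6) can be recomputed; under the a priori constraint they should come out near α = 5° for adjacent planes and near 2α = 10° for the outer pair:

```python
# Sanity check of Equation (6): recompute the inter-plane angles.
import numpy as np

n = np.array([[0.922, -7.397, -1.429],
              [1.321, -9.319, -0.992],
              [1.986, -12.52, -0.263]])

def angle_deg(a, b):
    return np.degrees(np.arccos(a @ b / (np.linalg.norm(a) * np.linalg.norm(b))))

print(angle_deg(n[0], n[1]))   # ~5 deg  (adjacent planes)
print(angle_deg(n[1], n[2]))   # ~5 deg  (adjacent planes)
print(angle_deg(n[0], n[2]))   # ~10 deg (outer pair, 2*alpha)
```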

 5.3.1. Face Mask Model Measurement in Real Time
 For comparison purposes, the point cloud of the face mask model obtained by tradi-
 tional methods using slide rails is shown in Figure 14. It can be observed that the point
cloud of the face model has relatively few data points within the ear region. This is due to
 the presence of folds in the ear area, causing occlusions or discontinuities in the scanned
 image. Unlike traditional methods, the method proposed in this paper allows free control
 of the scanning direction, thus reducing the influence of the folded area on the image and
increasing the effective data information in this area. The point cloud data obtained using the proposed method are shown in Figure 15. The 3D reconstruction images obtained by the traditional method and by the method proposed in this paper are shown in Figure 16. The results show that the measurements using the method proposed in this paper are effective.


Figure 13. The image of light stripe centerline extraction with black plate.


Figure 14. Traditional method of measurement. (a) Face mask model from camera view. (b) Point cloud image from the traditional method.
Figure 15. Point cloud image from this paper. (a) Point cloud image from the frontal perspective. (b) Point cloud image from the side perspective.

Figure 16. 3D reconstruction image. (a) Traditional method. (b) The method of this paper.

5.3.2. Standard Gauge Block Measurement in Real Time

In order to determine the spatial measurement accuracy of the system, four measuring blocks with known thicknesses of 70 mm, 60 mm, 50 mm and 40 mm, respectively, are used. The accuracy of a standard part is 0.0015 mm. The real-time scan image is shown in Figure 17, and the acquired point cloud image is shown in Figure 18. The point cloud data from the surface of the gauge block and the point cloud scanned at the bottom are used to fit two parallel planes, and the distance between the planes is the thickness of the gauge block. The fitted spatial plane images are shown in Figure 19. The measurement results are shown in Table 1.
Figure 17. Measurement image of the standard gauge block.

Figure 18. Point cloud image of a gauge block.
Figure 19. Fitting spatial plane images. (a) Top view. (b) Side view.

Table 1. Measurement results of the standard gauge blocks (mm).

NO.              Thickness
          70        60        50        40
1       69.920    60.102    50.122    39.902
2       70.101    60.085    50.089    39.893
3       70.063    59.989    49.923    40.113
4       70.042    60.092    49.906    40.136
5       69.929    60.112    50.104    40.104
6       69.918    60.003    50.113    39.915
7       69.892    59.888    49.896    39.911
8       70.102    59.933    50.091    40.163
9       70.097    59.848    50.132    40.072
10      70.114    59.887    49.899    40.085
The RMSE is the root mean square error, and the STD is the standard deviation. Table 2 shows that, over the measurement range of 40 mm to 70 mm, the standard deviation is less than 0.11 mm and the root mean square error is less than 0.11 mm, demonstrating the measurement system's high level of stability. The maximum measurement error for the standard gauge blocks does not exceed 0.16 mm. In conclusion, it can be stated that the system's spatial measurement accuracy is 0.16 mm when considering the maximum measurement variation.
 considering the maximum measurement variation.
Table 2. Comparison of the measured value with the actual value.

                    Thickness (mm)
Real      70.000    60.000    50.000    40.000
Average   70.025    59.994    50.026    40.030
RMSE      0.0850    0.0957    0.1039    0.1083
STD       0.0856    0.1007    0.1056    0.1099
6. Conclusions

This paper proposes a three-dimensional measurement system that combines a single-line and a three-line laser. The measurement system's experimental platform was developed without the equipment typical for this technique, such as slide rails and step motors. This method therefore reduces experimental error caused by sophisticated equipment or
 complicated protocols. Thus, the proposed method is more convenient and straightforward.
In addition, when measuring the object from multiple angles, traditional methods could not deal well with folded areas (such as the human ear region). This issue is solved by
 the method proposed in this paper. Experiments show that the measurement approach
 given in this paper can successfully reconstruct the human face mask model and gauge
 block, as well as highlighting this method’s simplicity and effectiveness. Experiments from
 measuring blocks of known thickness show that the spatial measurement accuracy of the
 system is 0.16 mm when the maximum measurement variation is considered.

 Author Contributions: Conceptualization, Q.S. and Z.R.; methodology, Q.S. and Z.R.; software, Q.S.
 and Z.R.; validation, Q.S. and Z.R.; investigation, Q.S.; resources, Z.R.; data curation, Q.S., Z.R.,
 M.W. and W.D.; writing—original draft preparation, Q.S. and Z.R.; project administration, Q.S. and
 Z.R.; supervision, funding acquisition, Q.S., J.Z. and M.S. All authors have read and agreed to the
 published version of the manuscript.
Funding: The work described in this paper is supported by the Foundation of the Science and Technology Department of Jilin Province under Grant No. YDZJ202201ZYTS531.
 Institutional Review Board Statement: Not applicable.
 Informed Consent Statement: Not applicable.
 Data Availability Statement: The data used to support the findings of this study are available from
 the first author upon request.
 Conflicts of Interest: The authors declare no conflict of interest.

References
1. Li, W.; Luo, Z.; Hou, D.; Mao, X. Multi-line laser projection 3D measurement simulation system technology. Optik 2021, 231,
 166390. [CrossRef]
2. Uddin, M.S.; Mak, C.Y.E.; Callary, S.A. Evaluating hip implant wear measurements by CMM technique. Wear 2016, 364–365, 193–200. [CrossRef]
3. Rodríguez, J.A.M.; Rodríguez-Vera, R.; Asundi, A.; Campos, G.G. Shape detection using light line and Bezier approximation
 network. Imaging Sci. J. 2007, 55, 29–39. [CrossRef]
4. Apolinar, J.; Rodríguez, M. Three-dimensional microscope vision system based on micro laser line scanning and adaptive genetic
 algorithms. Opt. Commun. 2017, 385, 1–8. [CrossRef]
5. Alanis, F.; Munoz, A. Microscope vision system based on micro laser line scanningfor characterizing micro scale topography.
 Appl. Opt. 2020, 59, D189–D200. [CrossRef]
6. Wu, Z.-F.; Fan, Q.-L.; Ming, L.; Yang, W.; Lv, K.-L.; Chang, Q.; Li, W.-Z.; Wang, C.-J.; Pan, Q.-M.; He, L.; et al. A comparative study
 between traditional head measurement and structured light three-dimensional scanning when measuring infant head shape.
 Transl. Pediatr. 2021, 10, 2897–2906. [CrossRef] [PubMed]
7. Tran, T.T.; Ha, C.K. Non-contact Gap and Flush Measurement Using Monocular Structured Multi-line Light Vision for Vehicle
 Assembly. Int. J. Control Autom. Syst. 2018, 16, 2432–2445. [CrossRef]
8. Xue-Jun, Q.U.; Zhang, L. 3D Measurement Method Based on Binocular Vision Technique. Comput. Simul. 2011, 2, 373–377.
9. Wu, F.; Mao, J.; Zhou, Y.F.; Qing, L. Three-line structured light measurement system and its application in ball diameter measurement. Optik 2018, 157, 222–229. [CrossRef]
10. Li, W.; Fang, S.; Duan, S. 3D shape measurement based on structured light projection applying polynomial interpolation technique. Optik 2013, 124, 20–27. [CrossRef]
11. Zhao, H. High-Precision 3D Reconstruction for Small-to-Medium-Sized Objects Utilizing Line-Structured Light Scanning: A
 Review. Remote Sens. 2021, 13, 4457.
12. Zhang, G.; Liu, Z.; Sun, J.; Wei, Z. Novel calibration method for a multi-sensor visual measurement system based on structured
 light. Opt. Eng. 2010, 49, 43602. [CrossRef]
13. Zhang, R.; Liu, W.; Lu, Y.; Zhang, Y.; Ma, J.-W.; Jia, Z. Local high precision 3D measurement based on line laser measuring
 instrument. In Proceedings of the Young Scientists Forum, Shanghai, China, 24–26 November 2017.
14. Bleier, M.; Lucht, J.; Nüchter, A. SCOUT3D—An underwater laser scanning system for mobile mapping. Int. Arch. Photogramm. Remote Sens. Spat. Inf. Sci. 2019, 42, 13–18. [CrossRef]
15. Yao, L.; Liu, H. Design and Analysis of High-Accuracy Telecentric Surface Reconstruction System Based on Line Laser. Appl. Sci.
 2021, 11, 488. [CrossRef]
16. Yang, G.; Wang, Y. High resolution laser fringe pattern projection based on MEMS micro-vibration mirror scanning for 3D
 measurement. Opt. Laser Technol. 2021, 142, 107189. [CrossRef]
Sensors 2023, 23, 13 17 of 17

17. Dewar, R.; Engineers, S.O.M. Self-Generated Targets for Spatial Calibration of Structured-Light Optical Sectioning Sensors with Respect to
 an External Coordinate System; Society of Manufacturing Engineers: Southfield, MI, USA, 1988.
18. Huang, Y.-G.; Li, X.; Chen, P.-F. Calibration method for line-structured light multi-vision sensor based on combined target.
 EURASIP J. Wirel. Commun. Netw. 2013, 2013, 92. [CrossRef]
19. Zhou, F.; Zhang, G. Complete calibration of a structured light stripe vision sensor through planar target of unknown orientations.
 Image Vis. Comput. 2005, 23, 59–67. [CrossRef]
20. Ze-xiao, X.; Weitong, Z.; Zhiwei, Z.; Ming, J. A novel approach for the field calibration of line structured-light sensors. Measurement
 2010, 43, 190–196. [CrossRef]
21. Li, W.; Li, H.; Zhang, H. Light plane calibration and accuracy analysis for multi-line structured light vision measurement system.
 Optik 2020, 207, 163882. [CrossRef]
22. Zhang, Y.-C.; Han, J.-X.; Fu, X.-B.; Lin, H. An online measurement method based on line laser scanning for large forgings. Int. J.
 Adv. Manuf. Technol. 2014, 70, 439–448. [CrossRef]
23. Hwang, S.; An, Y.-K.; Yang, J.; Sohn, H. Remote Inspection of Internal Delamination in Wind Turbine Blades using Continuous
 Line Laser Scanning Thermography. Int. J. Precis. Eng. Manuf. Green Technol. 2020, 7, 699–712. [CrossRef]
24. Wu, X.; Tang, N.C.; Liu, B.; Long, Z. A novel high precise laser 3D profile scanning method with flexible calibration. Opt. Lasers
 Eng. 2020, 132, 105938. [CrossRef]
25. Zeng, X.; Huo, J.; Wu, Q. Calibrate Method for Scanning Direction of 3D Measurement System Based on Linear-Structure Light.
 Zhongguo Jiguang Chin. J. Lasers 2012, 39, 108002-5. [CrossRef]
26. Lin, Z.; Wang, T.; Nan, G.; Zhang, R. Three-dimensional Data Measurement of Engine Cylinder Head Blank Based on Line
 Structured Light. Opto Electron. Eng. 2014, 41, 46–51.
27. Cai, Z.; Jin, C.; Xu, J.; Yang, T. Measurement of Potato Volume with Laser Triangulation and Three-Dimensional Reconstruction.
 IEEE Access 2020, 8, 176565–176574. [CrossRef]
28. Xiaozhong, G.; Shi, Z.; Yu, B.; Zhao, B.; Ke, L.-D.; Yanqiang, S. 3D measurement of gears based on a line structured light sensor.
 Precis. Eng. J. Int. Soc. Precis. Eng. Nanotechnol. 2020, 61, 160–169.
29. Zhou, J.; Pan, L.; Li, Y.; Liu, P.; Liu, L. Real-Time Stripe Width Computation Using Back Propagation Neural Network for Adaptive
 Control of Line Structured Light Sensors. Sensors 2020, 20, 2618. [CrossRef]
30. Wang, Z.; Liu, S.; Hu, J.; Zhang, W.; Huang, H.; Liu, J. Line structured light 3D measurement technology for pipeline microscratches
 based on telecentric lens. Opt. Eng. 2021, 60, 124108. [CrossRef]
31. Cai, X.; Zhong, K.; Fu, Y.; Chen, J.; Liu, Y.; Huang, C. Calibration method for the rotating axis in panoramic 3D shape measurement
 based on a turntable. Meas. Sci. Technol. 2020, 32, 35004. [CrossRef]
32. Winkelbach, S.; Molkenstruck, S.; Wahl, F.M. Low-Cost Laser Range Scanner and Fast Surface Registration Approach. DBLP 2006,
 718–728. [CrossRef]
33. Steger, C. An Unbiased Detector of Curvilinear Structures. IEEE Trans. Pattern Anal. Mach. Intell. 1998, 20, 113–125. [CrossRef]
34. Zhang, Z. A flexible new technique for camera calibration. IEEE Trans. Pattern Anal. Mach. Intell. 2000, 22, 1330–1334. [CrossRef]
35. Ricolfe-Viala, C.; Sánchez-Salmerón, A.-J. Correcting non-linear lens distortion in cameras without using a model. Opt. Laser
 Technol. 2010, 42, 628–639. [CrossRef]

Disclaimer/Publisher’s Note: The statements, opinions and data contained in all publications are solely those of the individual
author(s) and contributor(s) and not of MDPI and/or the editor(s). MDPI and/or the editor(s) disclaim responsibility for any injury to
people or property resulting from any ideas, methods, instructions or products referred to in the content.