Adaptive Image-Based Visual Servoing of 6 DOF Robots Using Switch Approach

Ahmad Ghasemi
Dept. of Mechanical, Industrial & Aerospace Engineering, Concordia University
Montreal, Quebec, Canada
[email protected]
Abstract— In this paper, an adaptive image-based visual servoing (IBVS) controller using a switch approach is proposed for an eye-in-hand 6 DOF robot. The proposed controller is decomposed into three separate stages with different gains. By decoupling the rotational and translational motions, this method can deal with the nonlinearity in the image Jacobian matrix caused by the depth parameter. Adaptive laws are developed to estimate the camera parameters. Based on the estimated camera intrinsic parameters, a switch adaptive IBVS is designed to provide a fast-response system that meets the demands of industrial robotic applications. Simulation results show a significant enhancement of the control performance in terms of response time over other IBVS methods. The designed visual servoing controller also demonstrates the capability to overcome one of the inherent drawbacks of IBVS in tasks that require a 180° rotation of the camera around its center.

Index Terms— Adaptive control, Image-based visual servoing, Industrial robots
I. INTRODUCTION
Vision-based control, or visual servoing, has been used in the robotics industry to improve the performance and intelligence of robots, especially in unknown environments. Image-based visual servoing (IBVS) [1] and position-based visual servoing (PBVS) [2] are the two main types of visual servoing, classified by how the camera image is used to guide the robot. In PBVS, the pose of the end-effector is estimated with respect to the object by using the image data, and this estimated pose is employed to guide the robot to the desired pose. In IBVS, on the other hand, the image features extracted from the image data are directly used to guide the robot to match the desired image features. Both IBVS and PBVS have their own advantages and drawbacks for different tasks [3]. This paper focuses on presenting a new IBVS technique.


Wen-Fang Xie
Dept. of Mechanical, Industrial & Aerospace Engineering, Concordia University
Montreal, Quebec, Canada
[email protected]

This work is supported by an NSERC Discovery Grant.

Various studies have been conducted to address and overcome the shortcomings of IBVS and to enhance its efficiency [4]–[6]. However, the performance of most of the reported IBVS schemes is not good enough for industrial robots. In order to have an effective IBVS that is feasible for practical robotic applications, a fast-response system is needed. One obvious way is to increase the gain values in the control law in order to reduce the response time of the IBVS. However, there is a limitation on these values, because a controller with high gains makes the robotic system tend to be shaky and unstable. Moreover, the stability of the traditional IBVS system is proved only in a region around the desired position [7]. When the initial configuration is far away from the desired configuration, the converging time is long, and possible image singularities may lead to IBVS failure. To address this issue, a switching scheme has been proposed to choose the control signal between two low-level visual servo controllers, i.e., a homography-based controller and an affine-approximation controller [8]. Xie et al. [9] employ the idea of switching control in IBVS, switching between rotational and translational movements of the end-effector and using a laser pointer to estimate the depth of the features. Although switch control has been demonstrated to avoid some inherent drawbacks of traditional IBVS, such as the inability to perform pure rotation about the camera center and being stuck in a local minimum, no stability analysis has been provided for the above-mentioned switching schemes. In addition, the switch control in [9] is based on the assumption that all camera parameters are known and certain, and the switching condition between stages is a predefined norm of feature errors which is not directly related to the decoupled movement. A more intuitive and efficient criterion that can guide the switch between rotational and translational movements is needed to speed up the convergence, and a switch control system with guaranteed stability is in demand in industrial applications.

Another issue in IBVS is that its performance depends on the accuracy of camera calibration. To elaborate, the image Jacobian matrix (Jimg), which relates the image feature velocity to the camera velocity, contains the camera intrinsic parameters and the depth of the features. The camera parameters can be obtained by a calibration process. However, the calibration process is time-consuming, and the camera parameters often have some uncertainties. Some studies have been carried out to deal with uncertain camera parameters [10], [11]. However, in most of these studies, the controller design is kinematics based, i.e., they consider the robot as an accurate positioning device with negligible dynamics. Recently, some researchers have tried to consider nonlinear robot dynamics in designing the visual servoing controller [12]. However, these methods do not consider the uncertainties of the robot dynamics and kinematics caused by changing payload, changing operating environment, wear and tear of components, etc.

To deal with the kinematic and dynamic uncertainties, many studies have been carried out. In [13], [14], adaptive controllers are proposed for trajectory tracking tasks. In [15], an adaptive Jacobian vision-based control is proposed for robots with uncertain depth. In [16], an adaptive visual tracking control is investigated for robotic systems without using image-space velocity measurement. In these control schemes, the uncertainties of the robot dynamics, kinematics, and camera model are considered. It is noted that all of these adaptive controllers [13]–[16] adopt the IBVS scheme, which involves the inverse of the whole image Jacobian matrix. As shown in (4), the term associated with depth, $1/Z(t)$, cannot be extracted from the image Jacobian matrix. Hence, the above-mentioned methods suffer from the inherent drawbacks of IBVS, such as the inability to perform a 180° rotation and image singularities [7].

In this paper, the switching idea is employed to divide the motion of the robot end-effector into three stages: pure rotation, pure translation, and a fine-tuning stage which consists of both rotational and translational movements. This three-stage gain-scheduled switch control considers the nonlinear robot dynamics and uses an adaptive law to deal with the uncertain camera intrinsic parameters. The switch method enables the user to set the gain values for the control law of each stage separately to obtain a fast-response system while keeping the system stable. An intuitive feature is proposed for determining the switch condition. The salient feature of the switch method lies in separating the rotational and translational motions, so that the inverse of a partial Jacobian matrix is used for generating the control command, which avoids image singularities. Since the rotational control law is depth independent, i.e., the depth parameter does not appear in the image Jacobian matrix needed for generating the rotational motion command, one can estimate the depth during the rotational motion using techniques such as in [6] and use it for the translational motion control. To validate the proposed controller, simulations on a 6 DOF Puma 560 robot with a monocular eye-in-hand vision system have been conducted. Simulation results show the improved performance of this method compared with the conventional IBVS method and the switch IBVS [9].

This paper is organized as follows. In Section II, a description of the problem is presented. In Section III, the adaptive switch controller is designed and the update law for the camera intrinsic parameters is proposed. Simulation results are presented in Section IV, and finally, concluding remarks are given in Section V.

II. PROBLEM STATEMENT
In IBVS, $n$ image features are denoted as $s_i(t) = [x_i(t), y_i(t)]^T$, and $s_{di} = [x_{di}, y_{di}]^T$ denotes the desired coordinates of the $i$-th feature in the image space ($i = 1, \ldots, n$). The velocity of the camera is defined as $V_c(t)$. The relation between the camera velocity and the image feature velocity can be expressed as:

$$\dot{s}(t) = J_{img}(t)\, V_c(t), \qquad (1)$$

where

$$s(t) = \begin{bmatrix} s_1(t) \\ \vdots \\ s_n(t) \end{bmatrix}, \quad s_d = \begin{bmatrix} s_{d1} \\ \vdots \\ s_{dn} \end{bmatrix}, \qquad (2)$$

and

$$J_{img}(t) = \begin{bmatrix} J_{img}(s_1, Z_1) \\ \vdots \\ J_{img}(s_n, Z_n) \end{bmatrix} \qquad (3)$$

is called the image Jacobian matrix, where $Z_1, \ldots, Z_n$ are the depths of the features $s_1, \ldots, s_n$. In this study, an eye-in-hand system is assumed and the number of features is $n = 4$. It is also assumed that all the features have the same depth $Z$. Thus, for the $i$-th feature, the image Jacobian matrix is given as [3]:
$$J_{img}(s_i(t)) = \begin{bmatrix} -\dfrac{f}{Z(t)} & 0 & \dfrac{x_i(t)}{Z(t)} & \dfrac{x_i(t)\,y_i(t)}{f} & -\dfrac{f^2 + x_i(t)^2}{f} & y_i(t) \\[2mm] 0 & -\dfrac{f}{Z(t)} & \dfrac{y_i(t)}{Z(t)} & \dfrac{f^2 + y_i(t)^2}{f} & -\dfrac{x_i(t)\,y_i(t)}{f} & -x_i(t) \end{bmatrix}, \qquad (4)$$

where $f$ is the focal length of the camera, and $x_i(t)$ and $y_i(t)$ are the projected feature coordinates in the camera frame, which are directly related to the camera intrinsic parameters as follows:

$$x_i(t) = (u_i(t) - c_u)/(\alpha f), \qquad (5)$$

$$y_i(t) = (v_i(t) - c_v)/(\alpha f), \qquad (6)$$

where $u_i(t)$ and $v_i(t)$ are the pixel coordinates in the image plane, $c_u$ and $c_v$ are the coordinates of the principal point, and $\alpha$ is the scale factor.
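As a concrete aid, the point-feature Jacobian in (4) and the pixel-to-metric mapping in (5)–(6) can be sketched in numpy as follows. The helper names and the sign convention are this sketch's own (they follow the reconstruction above, which matches the standard interaction-matrix form), not the paper's code.

```python
import numpy as np

def pixel_to_metric(u, v, cu, cv, f, alpha):
    """Eqs. (5)-(6): convert pixel coordinates (u, v) to metric
    image-plane coordinates, x = (u - cu)/(alpha*f)."""
    return (u - cu) / (alpha * f), (v - cv) / (alpha * f)

def interaction_matrix(x, y, Z, f):
    """2x6 image Jacobian of one point feature, Eq. (4).
    Columns 1-3 multiply translational velocity, columns 4-6 rotational."""
    return np.array([
        [-f / Z, 0.0,    x / Z, x * y / f,          -(f**2 + x**2) / f,  y],
        [0.0,   -f / Z,  y / Z, (f**2 + y**2) / f,  -x * y / f,         -x],
    ])
```

Stacking the $2\times 6$ blocks of all $n$ features row-wise yields the $2n \times 6$ matrix of (3).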

The camera velocity can be calculated by manipulating (1):

$$V_c(t) = J_{img}^{+}(t)\, \dot{s}(t), \qquad (7)$$

where $J_{img}^{+}(t)$ is the pseudo-inverse of the image Jacobian matrix. The error signal is defined as $\tilde{s}(t) = s(t) - s_d$, and let $\dot{\tilde{s}}(t) = -K \tilde{s}(t)$. Then the conventional IBVS control law can be designed as:

$$V_c(t) = -K J_{img}^{+}(t)\, \tilde{s}(t), \qquad (8)$$

where $K$ is the proportional gain. As seen in (4), the image Jacobian matrix depends on the camera intrinsic parameters, and hence uncertainties in these parameters affect the performance of the controller. Considering that the camera calibration process can be time-consuming, the goal of this work is to propose an adaptive controller that deals with camera parameter uncertainties and reduces the response time of the system to make it feasible for industrial applications.

In order to have a fast-response system, we adopt an adaptive switch control that breaks the movement of the end-effector into three separate movements and applies different control gains $K$ in each of them, while estimating the camera parameters $c_u$, $c_v$, $f$ and $\alpha$.

III. ADAPTIVE SWITCH METHOD
A 6 DOF robot manipulator with the camera installed at the end-effector is considered. The dynamic equation of the robot manipulator is:

$$M(q(t))\ddot{q}(t) + C(q(t), \dot{q}(t))\dot{q}(t) + G(q(t)) = \tau, \qquad (9)$$

where $M(q(t))$ is the inertia matrix, $C(q(t), \dot{q}(t))$ is the Coriolis force, $G(q(t))$ is the gravitational force, and $\tau$ is the joint torque.

$V_c(t) = [V_{ct}(t)^T,\ V_{cr}(t)^T]^T \in \mathbb{R}^{6 \times 1}$ is defined as the velocity screw of the camera, consisting of the translational velocity $V_{ct}(t) \in \mathbb{R}^{3 \times 1}$ and the rotational velocity $V_{cr}(t) \in \mathbb{R}^{3 \times 1}$.
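To make the structure of (9) concrete, the sketch below evaluates the inverse dynamics of a hypothetical single-link pendulum (parameters `m`, `l` are invented for illustration; the paper's Puma 560 model is far richer, with full 6 DOF inertia and Coriolis terms).

```python
import numpy as np

# Illustrative 1-DOF instance of Eq. (9): M(q)*qdd + C(q,qd)*qd + G(q) = tau.
# m, l are hypothetical link mass and length, not from the paper.
m, l, g = 1.0, 0.5, 9.81

def M(q):
    return np.array([[m * l**2]])                 # inertia matrix

def C(q, qd):
    return np.array([[0.0]])                      # no Coriolis term for 1 DOF

def G(q):
    return np.array([m * g * l * np.sin(q[0])])   # gravity torque

def inverse_dynamics(q, qd, qdd):
    """Joint torque tau required to realize the motion (q, qd, qdd)."""
    return M(q) @ qdd + C(q, qd) @ qd + G(q)
```

The same three-term decomposition is what the gravity-compensating control law of Section III exploits: $G(q)$ is cancelled explicitly while the remaining dynamics are dominated by the damping and feedback terms.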

$J_R(t) = [J_{Rt}(t)^T,\ J_{Rr}(t)^T]^T \in \mathbb{R}^{6 \times 6}$ is also defined as the robot Jacobian, which is decomposed into the translational part $J_{Rt}(t) \in \mathbb{R}^{3 \times 6}$ and the rotational part $J_{Rr}(t) \in \mathbb{R}^{3 \times 6}$. Thus, the camera velocity can be expressed as:

$$V_c(t) = \begin{bmatrix} V_{ct}(t) \\ V_{cr}(t) \end{bmatrix} = \begin{bmatrix} J_{Rt}(t)\dot{q}(t) \\ J_{Rr}(t)\dot{q}(t) \end{bmatrix}, \qquad (10)$$

where $\dot{q}(t) \in \mathbb{R}^{6 \times 1}$ is the robot joint velocity.

With the assumption that all features have the same depth $Z$, for the $i$-th feature the image Jacobian matrix in (4) can be decomposed into the translational part $J_t(t)$ and the rotational part $J_r(t)$:

$$J_{img}(t) = \begin{bmatrix} J_t(t) & J_r(t) \end{bmatrix}, \qquad (11)$$

where

$$J_t(t) = \begin{bmatrix} -\dfrac{f}{Z(t)} & 0 & \dfrac{x_i(t)}{Z(t)} \\[2mm] 0 & -\dfrac{f}{Z(t)} & \dfrac{y_i(t)}{Z(t)} \end{bmatrix}, \qquad (12)$$

$$J_r(t) = \begin{bmatrix} \dfrac{x_i(t)\,y_i(t)}{f} & -\dfrac{f^2 + x_i(t)^2}{f} & y_i(t) \\[2mm] \dfrac{f^2 + y_i(t)^2}{f} & -\dfrac{x_i(t)\,y_i(t)}{f} & -x_i(t) \end{bmatrix}, \qquad (13)$$

and the feature coordinates $x_i(t)$ and $y_i(t)$, related to the intrinsic camera parameters, are expressed in (5) and (6). Equation (12) can be represented as:

$$J_t(t) = \frac{1}{Z(t)} J_t'(t), \qquad (14)$$

where

$$J_t'(t) = \begin{bmatrix} -f & 0 & x_i(t) \\ 0 & -f & y_i(t) \end{bmatrix}. \qquad (15)$$

By considering (10), (12) and (13), one can rewrite (1) for the $i$-th feature $s_i(t) \in \mathbb{R}^{2 \times 1}$ as:

$$\dot{s}_i(t) = \begin{bmatrix} J_t(t) & J_r(t) \end{bmatrix} \begin{bmatrix} V_{ct}(t) \\ V_{cr}(t) \end{bmatrix} = J_t(t)V_{ct}(t) + J_r(t)V_{cr}(t) = J_t(t)J_{Rt}(q(t))\dot{q}(t) + J_r(t)J_{Rr}(q(t))\dot{q}(t). \qquad (16)$$

The adaptive controller is designed based on the switch method, which includes three different stages of camera movement. In the first stage, only the rotation command of the camera is turned on. In the second stage, only the translational movement is active. In the third stage, the classic IBVS control is adopted, where both camera rotation and translation are turned on.

In the first stage, the translation is turned off ($V_{ct} = 0$). Therefore, Eq. (16) becomes:

$$\dot{s}_i(t) = J_r(t)J_{Rr}(q(t))\dot{q}(t). \qquad (17)$$

In the second stage, the rotation is turned off ($V_{cr} = 0$), thus Eq. (16) becomes:

$$\dot{s}_i(t) = \frac{1}{Z(t)} J_t'(t)J_{Rt}(q(t))\dot{q}(t). \qquad (18)$$

Finally, in the third stage, both the translational and rotational movements of the camera are switched on. Thus, one has:

$$\dot{s}_i(t) = \frac{1}{Z(t)} J_t'(t)J_{Rt}(q(t))\dot{q}(t) + J_r(t)J_{Rr}(q(t))\dot{q}(t). \qquad (19)$$

The adaptive controller generates the robot joint torques as the control commands, and an adaptive law is developed to estimate the camera parameters so that, after a transient adaptation process, the feature points reach the desired ones in the image space.

In the first stage, the camera is in pure rotation. Consider the $i$-th feature. The robot Jacobian matrix is represented as:

$$J_R(t) = \begin{bmatrix} a_{11}(t) & \cdots & a_{16}(t) \\ \vdots & \ddots & \vdots \\ a_{61}(t) & \cdots & a_{66}(t) \end{bmatrix}. \qquad (20)$$
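The column split of (11) and the three per-stage feature-velocity maps (17)–(19) can be sketched in numpy as follows; the function names are this sketch's own, and `J_img` here is already scaled by the depth term, so the $1/Z$ factor of (14) is folded into the translational columns.

```python
import numpy as np

def split_jacobian(J_img):
    """Split an image Jacobian (2n x 6) into its translational part Jt
    (first three columns) and rotational part Jr (last three), Eq. (11)."""
    return J_img[:, :3], J_img[:, 3:]

def feature_velocity(J_img, J_Rt, J_Rr, qd, stage):
    """Per-stage image feature velocity, Eqs. (17)-(19)."""
    Jt, Jr = split_jacobian(J_img)
    if stage == 1:                                # pure rotation, Vct = 0 -> (17)
        return Jr @ J_Rr @ qd
    if stage == 2:                                # pure translation, Vcr = 0 -> (18)
        return Jt @ J_Rt @ qd
    return Jt @ J_Rt @ qd + Jr @ J_Rr @ qd        # both on -> (19)
```

By linearity, the stage-3 velocity is exactly the sum of the stage-1 and stage-2 contributions, which is why decoupling the motions loses no expressiveness.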
Using the "$\hat{\ }$" notation for the uncertain camera parameters, the first element of the $i$-th feature velocity $\dot{s}_i(t) \in \mathbb{R}^{2 \times 1}$ in (17) is:

$$\begin{aligned} \dot{s}_{i1}(t) = \sum_{k=1}^{6} \Big[ &-\hat{f}\, a_{5k}(t)\dot{q}_k(t) + \frac{1}{\hat{f}^2\hat{\alpha}^2}\big(a_{4k}(t)u_i(t)v_i(t)\dot{q}_k(t) - a_{5k}(t)u_i(t)^2\dot{q}_k(t)\big) \\ &+ \frac{\hat{c}_u}{\hat{f}^2\hat{\alpha}^2}\big({-a_{4k}(t)v_i(t)\dot{q}_k(t)} + 2a_{5k}(t)u_i(t)\dot{q}_k(t)\big) - \frac{\hat{c}_v}{\hat{f}^2\hat{\alpha}^2}a_{4k}(t)u_i(t)\dot{q}_k(t) \\ &+ \frac{\hat{c}_u\hat{c}_v}{\hat{f}^2\hat{\alpha}^2}a_{4k}(t)\dot{q}_k(t) - \frac{\hat{c}_u^2}{\hat{f}^2\hat{\alpha}^2}a_{5k}(t)\dot{q}_k(t) + \frac{1}{\hat{f}\hat{\alpha}}a_{6k}(t)v_i(t)\dot{q}_k(t) - \frac{\hat{c}_v}{\hat{f}\hat{\alpha}}a_{6k}(t)\dot{q}_k(t) \Big]. \end{aligned} \qquad (21)$$

The second element can be expressed in a similar way. The uncertain parameters can be decoupled from the known values in the above equation. Thus, for the $i$-th image feature, Eq. (17) can be represented as the linear combination of a regression matrix and the estimated parameters as follows:

$$\hat{\dot{s}}_i(t) = \hat{J}_r(t)J_{Rr}(q(t))\dot{q}(t) = Y_1(q(t), \dot{q}(t), s(t))\,\hat{\theta}(t) = \begin{bmatrix} Y_{1(1,1)} & \cdots & Y_{1(1,10)} \\ Y_{1(2,1)} & \cdots & Y_{1(2,10)} \end{bmatrix} \begin{bmatrix} \hat{\theta}_1 \\ \vdots \\ \hat{\theta}_{10} \end{bmatrix}, \qquad (22)$$

where $Y_1(q(t), \dot{q}(t), s(t)) \in \mathbb{R}^{2 \times 10}$ is the regression matrix, which is independent of the camera parameters, and $\hat{\theta}(t) \in \mathbb{R}^{10 \times 1}$, including all the estimated parameters, is represented in this form:

$$\hat{\theta}(t) = \Big[ \frac{1}{\hat{f}^2\hat{\alpha}^2},\ \frac{\hat{c}_u}{\hat{f}^2\hat{\alpha}^2},\ \frac{\hat{c}_v}{\hat{f}^2\hat{\alpha}^2},\ \frac{\hat{c}_u\hat{c}_v}{\hat{f}^2\hat{\alpha}^2},\ \frac{\hat{c}_u^2}{\hat{f}^2\hat{\alpha}^2},\ \frac{\hat{c}_v^2}{\hat{f}^2\hat{\alpha}^2},\ \frac{1}{\hat{f}\hat{\alpha}},\ \frac{\hat{c}_u}{\hat{f}\hat{\alpha}},\ \frac{\hat{c}_v}{\hat{f}\hat{\alpha}},\ \hat{f} \Big]^T. \qquad (23)$$

Three main camera parameters can be extracted from the above vector as $\hat{c}_u$, $\hat{c}_v$ and $\hat{f}\hat{\alpha}$.
In this study, it is assumed that the depth $Z$ is constant and known. Thus, by referring to equations (18) and (19), it is noted that a similar formulation can be expressed for the second and third stages as well.

Due to the merit of the switch control, the depth $Z$ does not appear in the Jacobian matrix of the first stage (17). This creates the opportunity to estimate the camera parameters by adaptive estimation and the $Z$ parameter by other techniques, such as in [6], in the first stage, and to use the estimated $Z$ in the subsequent stages. In the second stage, a similar update law continues estimating the camera parameters, but the estimated $Z$ is used for generating the control command. For brevity, the derivation of the equations corresponding to (21) and (22) is omitted here. In the third stage, the controller switches back to the conventional IBVS and uses the estimated camera

Fig. 1. Block diagram of the proposed adaptive switch controller
parameters in the two previous stages to calculate the image Jacobian matrix.

In order to fulfill the switch between stages, a criterion is needed to facilitate the decoupled movement. One criterion, defined as the norm of the feature errors, is used in [9]. In this paper, a more intuitive and effective criterion is proposed: the angle between the desired and actual features. Thus, a new feature $\beta$ is defined as the angle between the desired features and the actual features, as illustrated in Figure 2. This feature is used as the criterion for switching between stages. Once the angle reaches the predefined threshold, the control law is switched to the one of the next stage.
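The paper does not reproduce the exact construction of this angle, so the sketch below assumes one plausible reading: the mean rotation of the centroid-relative feature vectors between the current and desired patterns. The helper name `switch_angle` and the `(n, 2)` array layout are this sketch's own conventions.

```python
import numpy as np

def switch_angle(s, s_d):
    """Assumed switching feature: mean absolute angle (rad) between
    centroid-relative feature vectors of the current pattern s and the
    desired pattern s_d, both given as (n, 2) arrays."""
    v = s - s.mean(axis=0)
    vd = s_d - s_d.mean(axis=0)
    ang = np.arctan2(v[:, 1], v[:, 0]) - np.arctan2(vd[:, 1], vd[:, 0])
    ang = (ang + np.pi) % (2 * np.pi) - np.pi   # wrap to (-pi, pi]
    return np.abs(ang).mean()
```

Under this reading, a pure camera rotation about its optical axis changes the angle while leaving the pattern shape intact, which is exactly the quantity the first (pure-rotation) stage should drive to its threshold.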

Fig. 2. New feature $\beta$: the angle between the desired and actual features
The overall control law is proposed as follows:

$$\tau = \begin{cases} \tau_{s1} = G(q(t)) - K_{v1}\dot{q}(t) - \big(\hat{J}_r(t)J_{Rr}(q(t))\big)^T K_{p1}\tilde{s}(t), & |\beta| \geq \beta_0, \\ \tau_{s2} = G(q(t)) - K_{v2}\dot{q}(t) - \big(\hat{J}_t(t)J_{Rt}(q(t))\big)^T K_{p2}\tilde{s}(t), & \beta_1 \leq |\beta| < \beta_0, \\ \tau_{s3} = G(q(t)) - K_{v3}\dot{q}(t) - \big(\hat{J}_{img}(t)J_R(q(t))\big)^T K_{p3}\tilde{s}(t), & \text{otherwise}, \end{cases} \qquad (24)$$

where $\tilde{s}(t) = s(t) - s_d$ is the position error of the $n$ feature points, and $K_{vi}$ and $K_{pi}$ are symmetric positive definite gain
Fig. 3. Test 1 ($\beta$ = 60°): adaptive switch IBVS performance. (a) Estimated camera parameters. (b) Feature positions.

Fig. 4. Test 1 ($\beta$ = 60°): performance comparison of adaptive switch vs. traditional IBVS and switch method. (a) Norm of feature errors, IBVS vs. adaptive switch. (b) Norm of feature errors, switch vs. adaptive switch.

Fig. 5. Test 2 ($\beta$ = 180°): adaptive switch IBVS performance. (a) Estimated camera parameters. (b) Feature positions.

Fig. 6. Test 2 ($\beta$ = 180°): performance comparison of adaptive switch vs. traditional IBVS and switch method. (a) Norm of feature errors, IBVS vs. adaptive switch. (b) Norm of feature errors, switch vs. adaptive switch.
matrices at each stage, $\tau_{si}$ ($i = 1, 2, 3$) is the calculated torque for the robot joints, and $\beta_0$ and $\beta_1$ are two predefined thresholds at which the control law switches to the next stage.
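A compact sketch of the switched law (24) follows; holding the per-stage gains in dicts keyed by stage number is a convenience of this sketch, and the sign convention matches the reconstruction above (gravity compensation plus joint damping plus transposed-Jacobian image feedback).

```python
import numpy as np

def control_torque(beta, beta0, beta1, qd, s_err, G_q,
                   Jr_hat, Jt_hat, Jimg_hat, J_Rr, J_Rt, J_R, Kv, Kp):
    """Switched control law sketch of Eq. (24).
    Kv, Kp: dicts of per-stage gain matrices keyed by stage 1..3."""
    if abs(beta) >= beta0:                 # stage 1: pure rotation
        Jc, i = Jr_hat @ J_Rr, 1
    elif abs(beta) >= beta1:               # stage 2: pure translation
        Jc, i = Jt_hat @ J_Rt, 2
    else:                                  # stage 3: classic IBVS fine tuning
        Jc, i = Jimg_hat @ J_R, 3
    return G_q - Kv[i] @ qd - Jc.T @ Kp[i] @ s_err
```

Because each stage inverts (transposes) only its partial Jacobian, the rotation stage never touches the depth-dependent translational columns, which is the property the stability and singularity arguments in the text rely on.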

The uncertain camera parameters $\hat{\theta}(t)$ are estimated only in the first and second stages. In the third stage, the camera parameters are not updated and the previous values are used in the control law. Thus, the update law is proposed as follows:

$$\dot{\hat{\theta}}(t) = \begin{cases} K_a^{-1} Y_1^T K_{p1}\tilde{s}(t), & |\beta| \geq \beta_0, \\ K_a^{-1} Y_2^T K_{p2}\tilde{s}(t), & \beta_1 \leq |\beta| < \beta_0, \\ 0, & \text{otherwise}, \end{cases} \qquad (25)$$

where $K_a$ and $K_{pi}$ ($i = 1, 2$) are positive definite diagonal matrices, $Y_1$ is the regression matrix for the first stage as shown in (22), and $Y_2$ is similarly the regression matrix for the second stage. The block diagram of the proposed control method is shown in Figure 1.
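The gated update of (25) can be sketched as a function returning the parameter derivative; the function name is this sketch's own, and the sign follows the reconstruction of (25) above.

```python
import numpy as np

def theta_update(beta, beta0, beta1, Y1, Y2, s_err, Ka_inv, Kp1, Kp2):
    """Parameter update law sketch of Eq. (25): returns d(theta_hat)/dt.
    The estimate evolves in stages 1 and 2 and is frozen in stage 3."""
    if abs(beta) >= beta0:                     # stage 1: update via Y1
        return Ka_inv @ Y1.T @ Kp1 @ s_err
    if abs(beta) >= beta1:                     # stage 2: update via Y2
        return Ka_inv @ Y2.T @ Kp2 @ s_err
    return np.zeros(Ka_inv.shape[0])           # stage 3: parameters frozen
```

Freezing the estimate in stage 3 is what lets the fine-tuning stage run as a plain IBVS law with fixed, already-adapted camera parameters.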

IV. SIMULATION RESULTS
In this section, the simulation results of the proposed control method are presented. In the simulation, a 6 DOF Puma 560 robot with a camera mounted on its end-effector is used. The camera parameters are given in Table I.

To evaluate the efficiency of the adaptive switch method and compare its performance with that of the switch IBVS [9], [17] and the conventional IBVS, two tests are conducted with two initial angles between the desired and actual features ($\beta$ in Figure 2). The threshold angles for switching between control stages ($\beta_0$ and $\beta_1$ in (24)) are set to 10.3° and 8.5°. The objective of these tests is to drive the end-effector so that the actual extracted image features match the desired ones. In all tests, the number of image features is chosen as four.

Test 1: the initial angle between the actual and desired features is 60°. The values of $K_{p1}$, $K_{p2}$ and $K_{p3}$ in (24) are $9 \times 10^{-5} I_8$, $9 \times 10^{-4} I_8$ and $9 \times 10^{-3} I_8$, respectively, while $K_1$, $K_2$ and $K_3$ in the switch method (Eq. (14) of [17]) are 1, 0.1 and 0.05, respectively, and the constant $K$ in (8) for the traditional IBVS is set to 0.05. Figure 3 shows the performance of the adaptive switch method: the estimated camera parameters and the paths of the features over time. Figure 4 compares the performance of the adaptive switch method with that of the IBVS and the switch method.

TABLE I
CAMERA PARAMETERS

Parameter                                      Value
Focal length (m), f                            0.008
x-axis scale factor (pixel/m), α               100000
y-axis scale factor (pixel/m), α               100000
Principal point, x axis (pixel), c_u           512
Principal point, y axis (pixel), c_v           512
Test 2: the initial angle between the actual and desired features is 180°. The values of $K_{p1}$, $K_{p2}$, $K_{p3}$, $K_1$, $K_2$, $K_3$ and $K$ are the same as those in Test 1. Figure 5 shows the performance of the adaptive switch method, and Figure 6 compares the adaptive switch IBVS with the traditional IBVS and the switch IBVS.

The proposed adaptive switch method is shown to be capable of performing the task for all angles, including 180°. Its response time is 64–76% less than that of the switch method. Compared with the conventional IBVS, it achieves around 86% faster response for angles below 90°. When the angle exceeds 90°, the conventional IBVS fails to complete the task, while the adaptive switch method performs it successfully and better than the switch IBVS does.

V. CONCLUSION

This paper proposes an adaptive IBVS using a switch approach for a 6 DOF robot with an eye-in-hand configuration. A three-stage control scheme is proposed to realize decoupled rotational and translational movements. Update laws have been developed for estimating the camera intrinsic parameters. The designed controller can overcome some of the drawbacks of the traditional IBVS and the switch IBVS. The simulation results show that the response time of this method is almost 85% less than that of the traditional IBVS and 64–76% less than that of the switch method. Moreover, in cases where the angle between the initial and desired image features is greater than 90°, IBVS normally cannot perform the task, while the adaptive switch method performs it successfully and faster than the switch IBVS does. The simulation results demonstrate that the adaptive switch method outperforms the classic IBVS and the switch IBVS in terms of control performance. Future work includes experimental tests of the developed controller on an industrial robot.

REFERENCES
[1] J. Wang and H. Cho, "Micropeg and hole alignment using image moments based visual servoing method," IEEE Transactions on Industrial Electronics, vol. 55, no. 3, pp. 1286–1294, 2008.

[2] W. J. Wilson, C. W. Hulls, and G. S. Bell, "Relative end-effector control using cartesian position based visual servoing," IEEE Transactions on Robotics and Automation, vol. 12, no. 5, pp. 684–696, 1996.

[3] S. Hutchinson, G. D. Hager, and P. Corke, "A tutorial on visual servo control," IEEE Transactions on Robotics and Automation, vol. 12, no. 5, pp. 651–670, 1996.

[4] N. R. Gans and S. A. Hutchinson, "Stable visual servoing through hybrid switched-system control," IEEE Transactions on Robotics, vol. 23, no. 3, pp. 530–540, 2007.

[5] S. Li, A. Ghasemi, W.-F. Xie, and Y. Gao, "Sliding mode control (SMC) of image-based visual servoing for a 6DOF manipulator," in Recent Developments in Sliding Mode Control Theory and Applications, InTech, 2017.

[6] M. Keshmiri, W.-F. Xie, and A. Ghasemi, "Visual servoing using an optimized trajectory planning technique for a 4 DOFs robotic manipulator," International Journal of Control, Automation and Systems, pp. 1–12, 2017.

[7] F. Chaumette, "Potential problems of stability and convergence in image-based and position-based visual servoing," in The Confluence of Vision and Control, pp. 66–78, Springer, 1998.

[8] N. R. Gans and S. A. Hutchinson, "A switching approach to visual servo control," in Proceedings of the 2002 IEEE International Symposium on Intelligent Control, pp. 770–776, IEEE, 2002.

[9] W.-F. Xie, Z. Li, X.-W. Tu, and C. Perron, "Switching control of image-based visual servoing with laser pointer in robotic manufacturing systems," IEEE Transactions on Industrial Electronics, vol. 56, no. 2, pp. 520–529, 2009.

[10] Y. Shen, G. Xiang, Y.-H. Liu, and K. Li, "Uncalibrated visual servoing of planar robots," in Proceedings of the 2002 IEEE International Conference on Robotics and Automation (ICRA), vol. 1, pp. 580–585, IEEE, 2002.

[11] J. M. Sebastián, L. Pari, L. Angel, and A. Traslosheros, "Uncalibrated visual servoing using the fundamental matrix," Robotics and Autonomous Systems, vol. 57, no. 1, pp. 1–10, 2009.

[12] H. Wang, M. Jiang, W. Chen, and Y.-H. Liu, "Visual servoing of robots with uncalibrated robot and camera parameters," Mechatronics, vol. 22, no. 6, pp. 661–668, 2012.

[13] C. C. Cheah, C. Liu, and J.-J. E. Slotine, "Task-space adaptive setpoint control for robots with uncertain kinematics and actuator model," IEEE Transactions on Automatic Control, vol. 51, no. 6, pp. 1024–1029, 2006.

[14] C. C. Cheah, S. P. Hou, Y. Zhao, and J.-J. E. Slotine, "Adaptive vision and force tracking control for robots with constraint uncertainty," IEEE/ASME Transactions on Mechatronics, vol. 15, no. 3, pp. 389–399, 2010.

[15] C. C. Cheah, C. Liu, and J.-J. E. Slotine, "Adaptive Jacobian vision based control for robots with uncertain depth information," Automatica, vol. 46, no. 7, pp. 1228–1233, 2010.

[16] H. Wang, "Adaptive visual tracking for robotic systems without image-space velocity measurement," Automatica, vol. 55, pp. 294–301, 2015.

[17] A. Ghasemi and W.-F. Xie, "Decoupled image-based visual servoing for robotic manufacturing systems using gain scheduled switch control," in Proceedings of the 2017 International Conference on Advanced Mechatronic Systems (ICAMechS), pp. 94–99, IEEE, 2017.