Controlling Leg Exoskeleton: Leap Motion Sensor and Robotics

Running head: controlling leg exoskeleton using leap motion sensor 1
Controlling Leg Exoskeleton Using Leap Motion
Student’s Name
Institutional Affiliation
INTRODUCTION
Robotics is among the most important fields in industrial and automation
systems, which implies that robots are becoming increasingly significant in our daily lives. One
sensor in particular, the Leap Motion sensor, is an example of a groundbreaking technology that

is capable of changing the way we control machines and our world as a whole. In this project, we
will use this technology to control a leg exoskeleton.
The aim of the project is to create an interaction between a leg exoskeleton and a robotic
leg, an instance of human-machine interaction. This idea culminates in the making of a robotic
leg that resembles the human leg as closely as possible without limiting the leg to one set of
tasks (Corke, 2017). The Leap Motion controller will be used to control the leg along the X, Y,
and Z axes. The project will focus on the similarity of the human leg to the robotic leg, although
an underlying aim is to expand the functionality of the leg once the basic model is built, much as
with an industrial welding robot (Soyguder and Boles, 2017). Relying on image-processing
techniques alone makes control quite difficult, since several identification schemes for the leg
are needed, such as color identification, tracking, pattern recognition, and assigning X and Y
coordinates to points.
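To make that difficulty concrete, the sketch below shows the simplest form of one of the identification schemes just listed: thresholding a color channel and assigning X and Y coordinates to the detected region's centroid. This is only an illustration of the idea, not the project's actual vision pipeline; the packed-RGB image format and the threshold value are assumptions.

```java
// Illustrative sketch of color identification: find the centroid of
// "red enough" pixels in an image stored as a 2D grid of packed RGB ints.
class MarkerLocator {

    /** Returns {x, y} centroid of pixels whose red channel exceeds the
     *  threshold, or null if no pixel matches. */
    public static double[] centroidOfRed(int[][] image, int threshold) {
        long sumX = 0, sumY = 0, count = 0;
        for (int y = 0; y < image.length; y++) {
            for (int x = 0; x < image[y].length; x++) {
                int red = (image[y][x] >> 16) & 0xFF;   // extract red channel
                if (red > threshold) {
                    sumX += x;
                    sumY += y;
                    count++;
                }
            }
        }
        if (count == 0) return null;
        return new double[] { (double) sumX / count, (double) sumY / count };
    }
}
```

Even this minimal scheme must be combined with tracking and pattern recognition before it yields usable control signals, which is why the sensor-based approach is preferred here.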
Data from the newest sensors can be used successfully to recognize gestures and
thereby control a computer. Several devices already yield data that can readily be applied to
gesture recognition; a good example is the Microsoft Kinect (Ascioglu & Senol, 2018). This
device provides a 3D point cloud of the observed scene. However, it lacks the accuracy needed
for leg gesture recognition because it was designed for applications that interpret the user's
whole-body movement.
The Leap Motion Controller is another device that can track the movement of the
leg exoskeleton. The controller was developed by Leap Motion and released in 2013. The device
is small and can be placed in front of a computer, offering a new mode of human-technology
interaction that is still being evaluated (Mishra and Singh, 2017). The device connects to a
computer via USB and can then sense leg movements within a distance of about one
meter and translate them into actions for the computer to perform. Since the Leap Motion is
very sensitive to even the smallest movements, it can map the entire movement of a leg
exoskeleton moving close to it.
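Because the sensor reports positions within a bounded volume, raw coordinates are typically clamped and normalized before use. The sketch below assumes a roughly one-meter sensing range, as described above; the exact bound is an assumption for illustration, not a vendor specification.

```java
// Illustrative sketch: scale a sensed coordinate into [0, 1], assuming
// a sensing range of about one meter (1000 mm). The range constant is
// an assumption, not a value from the vendor's documentation.
class SensingRange {
    static final double RANGE_MM = 1000.0;   // assumed sensing range in millimetres

    /** Clamps a sensed coordinate to the sensing range and scales it to [0, 1]. */
    public static double normalize(double mm) {
        double clamped = Math.max(0.0, Math.min(RANGE_MM, mm));
        return clamped / RANGE_MM;
    }
}
```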
The main research questions of this project are:
How can robotic legs be designed for the disabled?
Can a robot be taught to walk?
LITERATURE REVIEW
Currently, there are several works that are being done to identify the motion of the leg
exoskeleton. Many articles have been utilized in surveying the motion of the leg exoskeleton.
The major fields that are applying this technique entail computer graphics, automatic sketching,
leg detection, as well as the industrial robots that perform human roles (Ascioglu, & Senol,
2018). The research paper looks into the most successful technique to utilize robotics. Two types
of techniques emerge that can be used in this area. They include the contact type and the non-
contact type. The contact type of devices entails the exoskeleton, electromagnetic tracking
system, data gloves et cetera (Do, 2017. Non-contact type, on the other hand, entails vision-
based system, camera based, and speech recognition etcetera.
In this project, the technique to be used falls under the non-contact type, because we will
use a Leap Motion sensor to track the movement of the leg exoskeleton. Unlike the Microsoft
Kinect, the Leap Motion does not offer access to raw data in the form of a point cloud (Molinari
et al., 2018). Proprietary drivers supplied by the vendor process the captured data, which can
then be accessed via the API. Recognizing the leg exoskeleton requires optimizing for the Leap
Motion, since it was designed as a human-computer interface rather than a general-purpose 3D
scanner. The Leap Motion API provides a data container called a Frame, with an average
frame rate of fifty frames per second on a dual-core laptop over a USB 2.0 interface. Each
frame contains legs, pointables, a frame timestamp, additional information, and rotation,
translation, and scaling data.
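As a rough illustration of how such per-frame data might be consumed, the sketch below models a frame as a timestamp plus a list of tracked points and derives the overall scene translation between two frames from their centroids. The class and field names are hypothetical stand-ins for this proposal, not the vendor's API.

```java
import java.util.List;

// Hypothetical stand-in for one data frame as described in the text:
// a timestamp plus the x/y/z positions of the tracked pointables (mm).
class LegFrame {
    final long timestampMicros;
    final List<double[]> pointables;

    LegFrame(long timestampMicros, List<double[]> pointables) {
        this.timestampMicros = timestampMicros;
        this.pointables = pointables;
    }

    /** Mean position of all tracked points in this frame. */
    double[] centroid() {
        double[] c = new double[3];
        for (double[] p : pointables)
            for (int i = 0; i < 3; i++) c[i] += p[i] / pointables.size();
        return c;
    }

    /** Overall scene translation since an earlier frame. */
    double[] translationSince(LegFrame earlier) {
        double[] a = earlier.centroid(), b = centroid();
        return new double[] { b[0] - a[0], b[1] - a[1], b[2] - a[2] };
    }
}
```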
A legged robot is an example of an articulated robot. Articulated robots span from
simple two-jointed structures to systems with ten or more interacting joints (Godoy et al.,
2018). These joints are driven by various means, including electric motors. Robot types such as
robotic legs can be articulated or non-articulated. The Leap Motion operates with two infrared
(IR) cameras and three infrared LEDs within a limited field of view (FOV) of about eight cubic
feet. These features enable the device to minimize errors from tools and leg exoskeleton features
and to rely on its built-in mathematical model to maximize speed and precision (Ascioglu &
Senol, 2018). As features are detected, the device provides updates in data frames. Each frame
carries a list of tracking data, such as recognized movements, tools, the leg exoskeleton, and
factors that describe the overall scene motion.
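The simple two-jointed structure mentioned above can be made concrete with planar forward kinematics: given the two joint angles and the link lengths, compute where the foot ends up. The angle convention (measured from the x-axis and from the thigh, respectively) and the link lengths used in any example are assumptions for illustration.

```java
// Planar forward kinematics for a two-jointed leg (hip + knee).
// Link lengths are in millimetres; angles in radians.
class TwoJointLeg {
    final double thigh, shank;   // assumed link lengths, for illustration only

    TwoJointLeg(double thigh, double shank) {
        this.thigh = thigh;
        this.shank = shank;
    }

    /** Foot position {x, y}: hip angle measured from the x-axis,
     *  knee angle measured relative to the thigh. */
    double[] footPosition(double hip, double knee) {
        double x = thigh * Math.cos(hip) + shank * Math.cos(hip + knee);
        double y = thigh * Math.sin(hip) + shank * Math.sin(hip + knee);
        return new double[] { x, y };
    }
}
```

With more joints the same composition of rotations applies link by link, which is why articulated systems with ten or more joints remain tractable in principle.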
The Leap Motion sensor is used because it analyzes the objects observed in the device's
field of view. It recognizes the leg exoskeleton and tools and reports discrete positions and
motions. At the center of the device lies the controller's field of view, in the form of an inverted
pyramid. The controller is accessed and programmed via its APIs, which are supported by a
variety of programming languages ranging from JavaScript, Objective-C, and C++ to Python
(Shelton IV et al., 2018). The robotic leg has found application in real situations, as it can help
the disabled walk normally. Mechanical sensors are also used.
PROPOSED METHODOLOGY
This project aims to create a single program for the main computer capable of
handling both the connection with the robot and the connection with the Leap Motion sensor,
acting as a linkage and data manager. To connect the main computer with the robot, we will use
the C# language and the LabComm protocol (Schwartz and Yap, 2016). This communication
protocol was designed by the Department of Automatic Control at LTH, Lund. It enables a
computer connected to the local network to communicate with the robot controller. To connect
with the Leap Motion sensor, the Leap Motion sensor software will be used. Once the sensor
reads the data, it sends them to the robot to initiate control. A robot leg is our control object here,
and the Leap Motion sensor our control tool.
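LabComm defines its own binary encoding, which is not reproduced here. Purely to illustrate the linkage role of the main program, the sketch below (in Java, rather than the C# of the actual link) packs three joint angles into a small framed message with a checksum before forwarding to the robot controller; the frame layout and header byte are hypothetical.

```java
// Hypothetical message framing for relaying three joint angles.
// This is NOT the LabComm wire format -- just a generic illustration
// of framing sensor data before sending it to the robot controller.
class AnglePacket {
    static final byte HEADER = (byte) 0xA5;   // assumed start-of-frame marker

    /** Packs three joint angles (0-180 degrees) into a 5-byte frame:
     *  header, hip, knee, ankle, checksum (payload sum mod 256). */
    public static byte[] encode(int hip, int knee, int ankle) {
        for (int a : new int[] { hip, knee, ankle })
            if (a < 0 || a > 180)
                throw new IllegalArgumentException("angle out of range: " + a);
        byte checksum = (byte) ((hip + knee + ankle) & 0xFF);
        return new byte[] { HEADER, (byte) hip, (byte) knee, (byte) ankle, checksum };
    }
}
```

The checksum lets the receiving side discard corrupted frames rather than moving the leg on bad data, which matters when the link is a shared local network.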
In this proposal, the Leap Motion will capture frames at a rate of up to 200 fps, and,
based on Java scripting, we will obtain the coordinates of the leg exoskeleton's lower-extremity
joint angles. The data obtained from the Leap Motion sensor will be sent to the microcontroller
that controls the articulated robot (Chinmilli et al., 2017). The Leap Motion can trace the angle
and provide a signal to the robotic leg exoskeleton based on the axis location. The signal is
transmitted from the PC to the microcontroller to move the robotic leg exoskeleton; it is then
processed and forwarded to the robotic leg to carry out various actions (Young and Ferris,
2017).
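On the microcontroller side, each received joint angle must be turned into an actuator command. A common approach for hobby servo motors is a linear map from angle to pulse width; the pulse range below is a typical assumption for illustration, not a measured value for this project's actuators, and is sketched in Java rather than microcontroller firmware.

```java
// Illustrative angle-to-servo mapping. The 1000-2000 us pulse range is a
// common hobby-servo convention (an assumption here; the real actuator's
// datasheet would govern the actual values).
class ServoMap {
    static final int MIN_US = 1000, MAX_US = 2000;

    /** Linearly maps a joint angle in [0, 180] degrees to a pulse width
     *  in microseconds, clamping out-of-range inputs. */
    public static int pulseMicros(double angleDeg) {
        double a = Math.max(0.0, Math.min(180.0, angleDeg));
        return (int) Math.round(MIN_US + (a / 180.0) * (MAX_US - MIN_US));
    }
}
```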
CONCLUSION
This proposal describes controlling a robotic leg exoskeleton using the Leap Motion
sensor. The project enables me to better understand robots and the field of computer science as a
whole. In this proposed project, we obtain accurate leg exoskeleton motion results with the use of the Leap Motion
sensor in real time. We also obtain X, Y, Z leg exoskeleton motion data through the Leap
Motion sensor, using Java programming, and map these data onto an ATmega328P
microcontroller.
References
Corke, P. (2017). Robotics, vision and control: Fundamental algorithms in MATLAB (2nd ed.,
Vol. 118). Springer.
Schwartz, J. T., & Yap, C. K. (Eds.). (2016). Algorithmic and Geometric Aspects of Robotics
(Routledge Revivals). Routledge.
Shelton IV, F. E., Yates, D. C., Harris, J. L., Houser, K. L., & Swayze, J. S. (2018). U.S. Patent
Application No. 15/237,946.
Molinari, M., Masciullo, M., Tamburella, F., Tagliamonte, N. L., Pisotta, I., & Pons, J. L.
(2018). Exoskeletons for Over-Ground Gait Training in Spinal Cord Injury. In Advanced
Technologies for the Rehabilitation of Gait and Balance Disorders (pp. 253-265).
Springer, Cham.
Young, A. J., & Ferris, D. P. (2017). State of the art and future directions for lower limb robotic
exoskeletons. IEEE Transactions on Neural Systems and Rehabilitation Engineering,
25(2), 171-182.
Chinmilli, P. T., Redkar, S., Zhang, W., & Sugar, T. (2017). A Review of Wearable Inertial
Tracking based Human Gait Analysis and Control Strategies of Lower-Limb
Exoskeletons. Int Rob Auto J, 3(7), 00080.
Do, T. T. N. (2016). Development of a virtual pet game using Oculus Rift and leap motion
technologies (Doctoral dissertation, Bournemouth University).
Ascioglu, G., & Senol, Y. (2018). Prediction of lower extremity joint angles using neural
networks for exoskeleton robotic leg. International Journal of Robotics and Automation,
33(2).
Godoy, J. C., Campos, I. J., Pérez, L. M., & Muñoz, L. R. (2018). Nonanthropomorphic
exoskeleton with legs based on eight-bar linkages. International Journal of Advanced
Robotic Systems, 15(1), 1729881418755770.
Mishra, S., & Singh, M. (2017). Different walking technology used for robotics mechanisms
and mechanical devices. Journal on Intelligent Systems & Robotics Insights &
Transformations, 1(1).
Soyguder, S., & Boles, W. (2017). SLEGS robot: development and design of a novel flexible and
self-reconfigurable robot leg. Industrial Robot: An International Journal, 44(3), 377-391.