UAV – Unmanned Aerial Vehicle (commonly known as a drone) – is an aircraft or flying object without a human pilot. Basically, a drone or UAV relies on a self-functioning robotics module to identify its way to the desired location or destination.

Drones or UAVs operate in two modes: they can fly themselves without any human help, or they can be operated remotely by a human. A UAV is one component of a UAS (Unmanned Aerial System), which includes both the unmanned aerial vehicle itself and a ground-based controller.

UAS are of two types, which are as follows:

  • UAV – Aerial Robots
  • RPV – Remotely Piloted Vehicle

Types of MAVs (Micro Aerial Vehicles):


In this module, we shall focus on the quadcopter and the hexacopter.

Quadcopters are popular drones that have flooded the global market. Their applications are enormous: rescue missions, surveillance, delivery, and the study of unreachable landscapes.

A quadcopter is a type of drone/UAV and RPV with four rotors. The small size and low inertia of such drones allow a particularly simple flight control system.

Design Principles


Each rotor of a quadcopter produces lift and a torque about its centre of rotation, as well as drag opposite to the vehicle’s direction of flight.

In a quadcopter, the two motors on one diagonal spin in the same direction (say, clockwise), while the two motors on the other diagonal spin the opposite way (anti-clockwise). The opposing reaction torques cancel each other, which prevents the whole drone/UAV from spinning about its own axis while flying.
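This torque cancellation can be sketched in a few lines of Python. The function name, motor numbering, and torque values below are made up for illustration:

```python
# Net reaction (yaw) torque of a quadcopter's four rotors.
# One diagonal pair spins clockwise (+1), the other anti-clockwise (-1),
# so in hover the drag torques cancel and the airframe does not spin.

def net_yaw_torque(rotor_torques, spin_dirs):
    """Sum the reaction torque each rotor exerts on the airframe."""
    return sum(t * d for t, d in zip(rotor_torques, spin_dirs))

torques = [0.5, 0.5, 0.5, 0.5]   # equal rotor drag torques in hover (N*m)
dirs = [+1, -1, +1, -1]          # motors 1 & 3 CW, motors 2 & 4 CCW
print(net_yaw_torque(torques, dirs))   # 0.0: no unwanted rotation
```

If one pair were removed or slowed, the sum would be non-zero and the airframe would yaw, which is exactly how intentional yaw commands work.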

Lift-off and Stability Mechanism

Roll and Pitch

To roll or pitch the drone, the motors on one side rotate faster (higher RPM); the faster they spin, the more the quadcopter tilts in that direction, while the other motors hold a constant RPM.

Translation –

So, imagine you want to move your drone from one point to another, i.e. translate it horizontally. You only have to increase the rotation speed of the motors on one side while keeping the other motors constant. (Always remember: in flight, the drone moves in the direction opposite to the motors whose speed is increased.)
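The roll, pitch, and translation behaviour described above is commonly implemented with a motor mixer. Here is a minimal sketch assuming an X configuration; the function, layout, and values are illustrative, not taken from any real flight stack:

```python
# Hypothetical motor mixer for an X-configuration quadcopter.
# thrust raises all motors equally; roll/pitch speed up one side and slow
# the other; yaw trades speed between the CW pair and the CCW pair.

def mix(thrust, roll, pitch, yaw):
    m1 = thrust + roll + pitch + yaw   # front-left  (CW)
    m2 = thrust - roll + pitch - yaw   # front-right (CCW)
    m3 = thrust - roll - pitch + yaw   # rear-right  (CW)
    m4 = thrust + roll - pitch - yaw   # rear-left   (CCW)
    return m1, m2, m3, m4

# A pure roll command speeds up the left motors and slows the right ones,
# so the vehicle tilts and then translates towards the slower side.
print(mix(0.5, 0.25, 0.0, 0.0))   # (0.75, 0.25, 0.25, 0.75)
```

Note how a pure thrust command (`roll = pitch = yaw = 0`) drives all four motors equally, which is the hover case from the previous section.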


DoF (Degrees of Freedom) –

3 Translational + 3 Rotational = 6
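These six degrees of freedom are easy to see in code as a simple state record (the class and field names are illustrative, not from any particular flight stack):

```python
# The 3 translational + 3 rotational degrees of freedom as one record.
from dataclasses import dataclass, fields

@dataclass
class Pose6DoF:
    x: float = 0.0      # translational DoF (metres)
    y: float = 0.0
    z: float = 0.0
    roll: float = 0.0   # rotational DoF (radians)
    pitch: float = 0.0
    yaw: float = 0.0

hover = Pose6DoF(z=1.5)          # hovering 1.5 m above the origin, level
print(len(fields(hover)))        # 6
```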

Working Components (Defence, Industrial, etc.)

  1. State Estimation – For a UAV, state estimation means estimating the vehicle’s position and velocity (including angular velocity).
  2. Control – Commanding the motors to produce the actions needed to navigate to the desired state.
  3. Mapping – The vehicle/drone must have some basic capability to map its environment.
  4. Planning – The vehicle must compute a trajectory and a safe path to travel from one point to another.
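The Control component in the list above is usually a feedback loop. Below is a minimal sketch that drives a toy one-dimensional altitude model to a setpoint; the gains and the frictionless point-mass model are illustrative, not tuned for real hardware:

```python
# Minimal PID controller commanding vertical acceleration so a toy
# 1-D altitude model settles at a desired height.

class PID:
    def __init__(self, kp, ki, kd):
        self.kp, self.ki, self.kd = kp, ki, kd
        self.integral = 0.0
        self.prev_error = 0.0

    def update(self, setpoint, measurement, dt):
        error = setpoint - measurement
        self.integral += error * dt
        derivative = (error - self.prev_error) / dt
        self.prev_error = error
        return self.kp * error + self.ki * self.integral + self.kd * derivative

pid = PID(kp=4.0, ki=0.0, kd=2.0)   # PD is enough for this frictionless toy
z, vz, dt = 0.0, 0.0, 0.02          # altitude (m), vertical speed, time step
for _ in range(500):                # simulate 10 seconds
    accel = pid.update(2.0, z, dt)  # commanded vertical acceleration
    vz += accel * dt
    z += vz * dt
# z has now settled close to the 2 m setpoint
```

A real flight controller runs several such loops at once (attitude, rate, position), but the estimate-error-correct structure is the same.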

How to navigate without GPS/NAVIC/GLONASS –

Sensors required for mapping/planning in places that satellite navigation systems cannot reach are as follows:

  • Scanning LASER Range Finder (example – Hokuyo UTM-30 LX etc.)
  • Colour and Depth Camera (Kinect Sensor for XBOX 360, Raspberry Pi Camera for depth measurement and colour)
  • USB Camera (mvBlueFox)

Environment Mapping (SLAM)

Now let’s assume the robot has something like an inertial measurement unit that allows it to estimate its own motion as it goes from one position to another. So you have some estimate of Δx. When the robot reaches the new position, its range finder again measures the pillars it had measured previously (d1, d2, d3), except that the new range estimates d1’, d2’, and d3’ differ from the original estimates d1, d2, and d3.

The question is: can the robot concurrently estimate the locations of the pillars and the displacement Δx? Think of it as estimating eight variables: three pairs of (x, y) coordinates for the pillars, plus the two components of Δx. This problem is referred to as Simultaneous Localization and Mapping (SLAM). The idea is that we localize ourselves (estimate Δx) while mapping the pillars (x1, y1), (x2, y2), (x3, y3).
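A heavily simplified version of the pillar problem can even be solved in closed form if we pretend the scanner returns each pillar’s relative (x, y) position rather than just a range. All numbers below are made up for illustration:

```python
# Toy SLAM: jointly estimate the robot displacement dx and three pillar
# positions (8 unknowns) from two scans, assuming each scan reports the
# pillars' (x, y) positions relative to the robot.

def solve_slam_toy(scan0, scan1):
    # Robot at the origin for scan0:  pillar_i = scan0_i
    # Robot at dx for scan1:          pillar_i = dx + scan1_i
    # Least-squares dx: average the per-pillar differences.
    n = len(scan0)
    dx = (sum(a[0] - b[0] for a, b in zip(scan0, scan1)) / n,
          sum(a[1] - b[1] for a, b in zip(scan0, scan1)) / n)
    # Pillar estimates: average the two views expressed in the world frame.
    pillars = [((a[0] + dx[0] + b[0]) / 2, (a[1] + dx[1] + b[1]) / 2)
               for a, b in zip(scan0, scan1)]
    return dx, pillars

pillars_true = [(2.0, 1.0), (3.0, -1.0), (5.0, 0.5)]
scan0 = pillars_true                                       # seen from origin
scan1 = [(px - 1.0, py - 0.5) for px, py in pillars_true]  # after moving (1, 0.5)
dx_est, pillars_est = solve_slam_toy(scan0, scan1)         # recovers both
```

With noisy ranges and bearings instead of exact relative positions, the same idea becomes a nonlinear least-squares problem, which is what the filters in the next section approximate.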

Concurrently Estimate –

  • Locations of the pillars (6 variables)
  • Displacement of the robot (2 variables)

SLAM – Simultaneous Localization and Mapping – is a computational technique for constructing a map of an unknown environment while simultaneously keeping track of the robot’s location within it. Several algorithmic approaches are available for building this internal model of the unknown environment in the robot’s “brain”. The best-known and most used approaches for UAVs are as follows:

  • Kalman Filters and Particle Filters
  • Commercialized SLAM system (Google’s ARCore)
  • Monte Carlo Method
  • Augmented and Virtual Reality techniques to map the environment.
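As a taste of the first item in the list, a one-dimensional Kalman filter fits in a few lines. The noise values and measurements below are illustrative:

```python
# Minimal 1-D Kalman filter: fuse a motion prediction (from odometry or
# an IMU) with a noisy position measurement (e.g. from a range finder).

def kalman_step(x, p, u, z, q=0.1, r=0.5):
    """x: state estimate, p: variance, u: commanded motion, z: measurement,
    q: process noise variance, r: measurement noise variance."""
    # Predict: move by u, and grow the uncertainty.
    x_pred = x + u
    p_pred = p + q
    # Update: blend prediction and measurement by their relative confidence.
    k = p_pred / (p_pred + r)          # Kalman gain
    x_new = x_pred + k * (z - x_pred)
    p_new = (1 - k) * p_pred
    return x_new, p_new

x, p = 0.0, 1.0                        # start uncertain about position
for z in [1.1, 1.9, 3.2, 4.0]:         # noisy readings while moving +1/step
    x, p = kalman_step(x, p, 1.0, z)
# x is now near 4 and the variance p has shrunk well below its initial 1.0
```

Particle filters replace the single Gaussian (x, p) with a cloud of weighted samples, which handles the multi-modal uncertainty that full SLAM produces.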

Mapping (SLAM) –

Topological maps are a method of environment representation that captures the connectivity of the environment rather than building a geometric map. Topological SLAM approaches have been used to enforce global consistency in metric SLAM algorithms for commercial as well as defence-based UAVs and RPVs. The technique is useful for creating a virtual environment in which to plan a UAV’s trajectory through terrain the UAV has not yet seen. In such cases, various computer-vision techniques are used, such as Convolutional Neural Networks (CNNs), pattern recognition, LeNet-5, etc.
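A topological map of the kind described above is just a graph of places and their connectivity, with no geometry at all. A minimal sketch with made-up place names, using breadth-first search to plan a route:

```python
# Topological map: nodes are places, edges are traversable connections.
# No coordinates are stored - only connectivity, as the text describes.
from collections import deque

topo_map = {
    "launch_pad": ["corridor"],
    "corridor": ["launch_pad", "atrium", "stairwell"],
    "atrium": ["corridor", "rooftop"],
    "stairwell": ["corridor"],
    "rooftop": ["atrium"],
}

def route(graph, start, goal):
    """Breadth-first search: returns the fewest-hop path, or None."""
    queue, seen = deque([[start]]), {start}
    while queue:
        path = queue.popleft()
        if path[-1] == goal:
            return path
        for nxt in graph[path[-1]]:
            if nxt not in seen:
                seen.add(nxt)
                queue.append(path + [nxt])
    return None

print(route(topo_map, "launch_pad", "rooftop"))
# ['launch_pad', 'corridor', 'atrium', 'rooftop']
```

A metric SLAM layer would then attach local geometry to each node, while the graph keeps the global structure consistent.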

Abhishek Tyagi

A Mechatronics Engineer and a machine learning and deep learning enthusiast who loves to research various topics.
