
Hybrid methodology based on computational vision and sensor fusion for assisting autonomous UAV on offshore messenger cable transfer operation

Published online by Cambridge University Press:  31 January 2022

Gabryel S. Ramos*
Affiliation:
Federal Center for Technological Education Celso Suckow da Fonseca (CEFET-RJ), Rio de Janeiro, Brazil
Milena F. Pinto
Affiliation:
Federal Center for Technological Education Celso Suckow da Fonseca (CEFET-RJ), Rio de Janeiro, Brazil
Fabricio O. Coelho
Affiliation:
Federal University of Juiz de Fora (UFJF), Juiz de Fora, Brazil
Leonardo M. Honório
Affiliation:
Federal University of Juiz de Fora (UFJF), Juiz de Fora, Brazil
Diego B. Haddad
Affiliation:
Federal Center for Technological Education Celso Suckow da Fonseca (CEFET-RJ), Rio de Janeiro, Brazil
*
*Corresponding author. E-mail: gabryelsr@gmail.com

Abstract

The recent development of new offshore projects in pre-salt deepwater fields has placed offshore loading operations as the main production outflow alternative, increasing operational complexity and risks. Numerous dangerous situations are associated with oil offloading, such as the messenger line transfer during the mooring stage. Nowadays, this critical task is carried out by launching a thin messenger cable using a pneumatic line-throwing apparatus. This is a complex and slow process since the operation usually occurs with the ship positioned against the wind. This work proposes a hybrid flight methodology based on computer vision and sensor fusion techniques for autonomous unmanned aerial vehicles (UAVs). The UAV takes off from an oil rig and precisely reaches a specific point on the shuttle tanker without using expensive positioning devices, while improving the UAV’s orientation (yaw) precision, since the compass can suffer severe interference from the metallic naval structures near the vehicle. The proposed framework was tested in a realistic simulated environment considering several practical operational constraints. The results demonstrated both the robustness and efficiency of the methodology.

Type
Research Article
Copyright
© The Author(s), 2022. Published by Cambridge University Press

1. Introduction

The offshore oil and gas production industry is a complex sector that comprises the exploration, extraction, and transport of natural gas and crude oil from the seabed [Reference Bogue1, Reference Tan, Roberts, Sundararaju, Sintive, Facheris, Vanden Bosch, Lehning and Pegg2]. It is widely known that this sector plays an essential role in the global energy supply. Despite its hazardous and complex nature, the sector has used robots for the inspection and maintenance of underwater structures, through remotely operated vehicles, since the 1970s [Reference Petillot, Antonelli, Casalino and Ferreira3]. However, in recent years, novel robotic technologies have been developed to move the traditional operations toward more autonomous and advanced processes [Reference Durdevic, Ortiz-Arroyo, Li and Yang4], reducing operational costs and human effort and enhancing safety and reliability as well [Reference Yu, Yang, Ren, Luo, Dobie, Gu and Yan5].

As stated in ref. [Reference Stavinoha, Chen, Walker, Zhang and Fuhlbrigge6], the oil and gas sector seeks to minimize the number of both employees and suppliers involved without losing efficiency and quality. In this sense, applying automation and robots in offshore oil rig environments is a prominent solution to further improve the exploration process and provide safe and consistent operations. In this paper, the authors focus on a specific offshore exploration and production environment: the deepwater oil fields. The oil rig types applied to this kind of project are usually the tension leg platform, the Spar, the floating storage and offloading (FSO) unit, and the floating production, storage and offloading (FPSO) unit. The FPSO/FSO is usually built by adapting a shuttle tanker vessel (STV) hull, utilizing its already built tanks and installing the surface processing modules over the deck. The hull may also be constructed specifically for the FPSO/FSO project, but the STV conversion is cheaper. This kind of oil rig has become the mainstream of offshore oil and gas production in deepwater fields due to its capacity to extract the reservoir’s oil, perform primary crude oil treatment on its topside plant, and then store the treated oil in its cargo tanks. This is a desirable characteristic given the distance of the deepwater fields from the continental shore, which makes the construction and operation of oil and gas pipelines impracticable or economically unattractive [Reference Menegaldo, Ferreira, Santos and Guerato7].

In order to transport the crude oil from the FSO or FPSO cargo tanks to oil terminals, refineries, or exportation, it is necessary to transfer the fluids to shuttle tankers, which will carry the petroleum to its respective destination in the logistic supply chain. Transferring the oil from the platform to the vessel is called offshore loading, from now on referred to as offloading. Basically, in this operation, the shuttle tanker approaches the FPSO or FSO, and they are connected by a hose, which is used for pumping the oil from the oil rig to the ship. The shuttle tanker may be a dynamic positioning vessel, which automatically holds its position using its propellers and thrusters, or a conventional one, which demands anchor handling tug supply vessels to tow the ship and keep it in the desired position.

1.1. Offloading operation

The offloading operation is a complex and difficult marine operation, in which the shuttle tanker must be aligned, moored, and connected in tandem with the FPSO or FSO, as shown in Fig. 1. It can be generically broken down into six operational stages:

  i. Approaching: the shuttle tanker approaches the FPSO to a determined distance;

  ii. Mooring: the two vessels are connected by a robust mooring link;

  iii. Connection: the offloading hose is connected;

  iv. Pumping: fluids are pumped from the oil rig’s tanks to the ship;

  v. Disconnection: the hoses and mooring links are disconnected;

  vi. Departure.

Figure 1. Tandem approach scheme.

1.2. Background and related works

The offshore messenger cable transfer is a risky step that occurs in the mooring stage of the offloading operation. Basically, in mooring, this thin rope is launched from the shuttle tanker to the FPSO (or vice versa), where it is attached to the mooring link, and the sender of the messenger line then uses its deck machinery to pull the mooring link across by means of it. In some operations, the messenger line cannot be launched directly from the ship to the FPSO/FSO. Because of that, a small tug called a line handling (LH) vessel has to approach the production facility’s hull and “fish” the mooring link on the water surface by launching the messenger line and pulling it to the deck. The LH then carries it to the destination. A similar process is done to transport the offloading hose between ships, but instead of connecting the messenger line to the mooring link, it is attached to the offloading hose [8, Reference Sphaier, Amaral Netto, Oliveira, Junior, Henrique Sá Correa da Silva and Henaut9].

A failure of this cable directly hampers the correct positioning of the shuttle tanker. Events arising from this problematic offloading step include a collision between the shuttle tanker and the FPSO, in case the tanker loses its positioning, or the messenger cable striking a human operator during the transfer task [Reference Rodriguez, de Souza, Gilberto and Martins10].

Due to their versatility and reliability, it is suggested that aerial robots could be used to transfer the messenger cable from the platform to the shuttle tanker or vice versa. In recent years, the use of unmanned aerial vehicle (UAV) platforms has attracted huge interest in many applications in different industrial sectors [Reference Faiz Ahmed, Kushsairy, Bakar, Hazry and Kamran Joyo11Reference Silva, Ribeiro, Santos, Carmo, Honório, Oliveira and Vidal16]. Recently, they have been applied to the offshore industry [Reference Mullenders17], which can be explained by the aircraft’s ability to carry out various activities with reduced costs, flexibility, and fewer risks to human life. Besides, these robotic platforms should present a high degree of autonomy to safely perform tasks with less human intervention, as shown in refs. [Reference Selecký, Rollo, Losiewicz, Reade and Maida18Reference Coelho, Carvalho, Pinto and Marcato21].

The objective of this work is to design a framework for a UAV to autonomously take off from an oil rig and fly to a point on the shuttle tanker vessel with a precision under 2 m. Note that the mission was divided into three stages.

In the first step, the operators will configure the delivery point coordinates obtained by simple GPS (therefore, these coordinates may present errors of up to 9 m, which is the normal positioning precision of GPS [Reference Guo, Miao and Zhang22]), the generic region of the ship where the point is located (i.e., bow, mid-ship, or stern), the desired distance between the UAV and the ship at the end of this stage, the flight height, and finally, the maximum flight speed. The UAV will take off and fly to the calculated position following a planned path based on a native GPS feedback controller. The vehicle will be positioned perpendicularly to the ship at the distance configured by the operator, as detailed in Section 2.

The next two stages are, respectively, the UAV approaching the shuttle tanker and detecting the final point (or the delivery spot). After reaching the final point of the first stage, the UAV will start its flight toward the final point initially configured by the operator, but without relying on GPS feedback, being guided by computational vision instead. The camera’s image frames from the UAV are the input data to an object detector convolutional neural network (CNN) trained to detect the ship and its generic regions. The system will use the detected region to maintain the UAV’s yaw, flying straight to the configured region by correcting the yaw whenever this object drifts away from the image’s center. The controller will consider the flight height constant (the position control for altitude is very precise, as shown in Section 3), changing only the yaw. The amplitude of the yaw correction (not its speed) will be defined by a generic proportional–integral–derivative (PID) methodology, where the error is the difference between the horizontal coordinate of the detected centroid and the image’s horizontal center.
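As a rough illustration of this yaw correction, the sketch below implements a generic PID acting on the pixel error between the detected centroid and the image center; the class and gain names are illustrative and do not correspond to the authors’ implementation.

```python
class YawPID:
    """Illustrative PID on the horizontal pixel error (not the authors' exact controller)."""

    def __init__(self, kp, ki, kd):
        self.kp, self.ki, self.kd = kp, ki, kd
        self.integral = 0.0
        self.prev_error = 0.0

    def update(self, obj_x, img_center_x, dt):
        # Error: horizontal offset (in pixels) of the detected region's centroid
        error = obj_x - img_center_x
        self.integral += error * dt
        derivative = (error - self.prev_error) / dt if dt > 0 else 0.0
        self.prev_error = error
        # Output: yaw correction amplitude handed to the native attitude controller
        return self.kp * error + self.ki * self.integral + self.kd * derivative
```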

For robustness, the yaw angle is estimated both from the successive CNN detections and from the UAV’s Inertial Measurement Unit (IMU) sensors, which provide the vehicle’s heading. These two yaw measurements are fused by an extended Kalman filter (EKF), alongside the position (x, y) provided by the avionics’ GPS, the velocity, and the angular velocity. In practice, only the yaw angle is corrected by the EKF, which is also used to predict the robot’s heading (and therefore its position, speed, and angular speed), giving more robustness to the framework controller.

The last step is detecting a marker representing the final and precise messenger line delivery spot. In a real operation, this marker could be painted on the shuttle tanker, placed in position during the mooring stage, or held by an operator assigned to receive the cable. Thus, after completing these three steps, the robot will reach the desired point with meter (or even centimeter) precision without using expensive navigation sensors subject to interference in the hostile offshore environment.

The current literature related to the transport of thin lines or light payloads of any nature by UAVs in offshore or maritime applications focuses on the operation concept and viability. Thus, it describes manual radio-controlled (RC) or semi-autonomous solutions that simplify tests and proofs of concept. It was reported in ref. [23] that a tugboat operator was testing the use of UAVs to assist port towing tasks, where usually the tug approaches the ship’s hull to receive a messenger line on its deck, exactly like the one used in offloading. Then, this line is used to gradually pull a robust rope for towing and maneuvering the ship. This operation is very dangerous because the tug must be very close to the hull to receive the messenger line from the ship. The application is described as an RC UAV with a camera and a small pneumatic launcher. This UAV has an embedded computational vision system to detect the target (i.e., a window) and launch the messenger line precisely on this target. The work aims to increase the distance between ships and tugs in port maneuvering operations, thereby reducing collision risks and making the operation more efficient [24].

Also concerning small cargo transport between naval structures in an offshore environment, ref. [Reference Frederiksen and Knudsen25] presented a manually controlled multirotor application to perform this task. The cargo consists of lightweight tools, food, and medicines, transported by a drilling company that also operates in the logistics sector (including offloading shuttle tankers). Beyond the offshore industry, ref. [Reference Kim, Park and Zhang26] proposed a methodology for transporting messenger lines between power transmission towers during the construction of electricity transmission lines using manually controlled UAVs. The work is an initiative of a Korean company to use these vehicles to accelerate construction, therefore improving efficiency. This work addressed wind drag problems, especially because the UAV flies long distances of up to 850 m, and how to carry the line without it getting entangled in the vehicle’s propellers, proposing an intermediary towing body on the line.

Concerning autonomous satellite navigation, the agriculture, mapping, and environmental monitoring sectors are playing an important role in applying precision flight technology to UAVs. The real-time kinematics (RTK) and post-processing kinematics positioning systems applied to UAV flight in remote environments are analyzed in ref. [Reference Tomaštík, Mokroš, Surový, Grznárová and Merganič27], being able to accomplish mapping missions with a maximum deviation of only 10–20 cm from the planned trajectory. Also using RTK and integrating it with ultra-wideband, ref. [Reference Gryte, Hansen, Johansen and Fossen28] proposed a robust solution for possible satellite dropouts during UAV navigation, likewise reaching a flight precision under 1 m of deviation from the planned waypoints. Also, evaluating precision applied to the topographic mapping of sand beaches, ref. [Reference Casella, Drechsel, Winter, Benninghoff and Rovere29] attested to the high precision of the Global Navigation Satellite System-RTK (GNSS-RTK). Finally, ref. [Reference Zimmermann, Eling, Klingbeil and Kuhlmann30] evaluated a methodology to avoid GPS-restrictive areas in flight planning and, if these regions cannot be avoided, a methodology based on the known positions of nearby structures to aid navigation in case of satellite signal loss, similar to the proposed configuration of the approximate messenger line delivery point.

Despite its high precision, RTK-based flight demands very expensive sensors and is vulnerable to satellite reception failures in the proposed application, mainly because the surroundings of oil rigs and ships may present high interference caused by the vessels’ communication systems. Visual-inertial navigation in an environment similar to the proposed application is presented in ref. [Reference Durdevic, Ortiz-Arroyo, Li and Yang31]. The authors addressed a simulation of an autonomous UAV system that detects wind turbines in offshore wind farms using computer vision (CV) and used the detected object to approach and perform visual surveillance of the turbine. The project, however, did not present fail-safe methods in case the object is not detected, nor control strategies to improve precision. Ref. [Reference Qin and Wang32] presents a visual object tracking methodology to determine the tracked target’s position in the image. The authors used an EKF to deal with detection failures, which may be caused by obstacle occlusion of the target during a short period or even by background color interference. This problem may also affect the methodology proposed in the present work. That project was tested in an onshore controlled environment, where the UAV flies directly above the target (a small RC car), and the objective is to correct its inertial position to keep the target at the center of the image. The system worked even when the small tracked object passed under a table, because the EKF could predict the trajectory for a short period while the detection was not performed; it would have failed, however, if the object had changed its trajectory under the table. Concerning the methodology proposed in the present work, the goal is to get close to the ship, ensuring robustness of the yaw control if the object detection fails or the UAV experiences wind drag, very similarly to ref. [Reference Qin and Wang32]. Thus, an abrupt change in the detected object’s position is very unlikely to happen. Finally, ref. [Reference Chuang, He and Namiki33] showed a UAV inertial positioning method based on tracked target coordinates using computational vision. This is very similar to the proposed final marker detection, where the final adjustments of the UAV’s inertial pose will be performed based on the marker detection.

Specifically regarding the object detector, which in this work detects a ship or an FPSO/FSO (which is also a ship), most of the literature is based on satellite imagery, where it is simple to apply digital image processing methods to segment objects on water. Another significant amount of works describes applications with stationary cameras in ports, where more robust computational vision techniques, such as neural network object detectors, are facilitated by the background constancy and the low speed of the ships. Regarding non-stationary platforms, ref. [Reference Zhang, Li and Zang34] presented a ship detection methodology based on the discrete cosine transform to detect the horizon line at sea and then detect objects above this line, or passing through it, which are taken as ships. This methodology proved to be a good ship detector for a non-stationary mounting (the camera may be placed on a motorboat, for example), but it is highly dependent on the camera’s position, since it requires the horizon line to appear in the image; this makes it difficult to use on a flying vehicle, depending on the camera’s angle, and it fails to detect a large object close to the camera, as such an object would occlude the horizon line. Their methodology detected neither the type of the ship nor the generic regions of the vessel. The authors reported that the system might be confused by waves and by marine animals or birds (such as whales), detecting them as ships on the horizon.

It is also worth noting that many of the related methods evaluated are very narrow in scope and do not present all the components required to perform the proposed tasks autonomously, comprising assisted navigation and control for small cargo transport. Compared to the works in refs. [Reference Frederiksen and Knudsen25] and [Reference Kim, Park and Zhang26], the present work goes a step further by presenting the application, which is a flight methodology to precisely reach a point on the destination vessel so that, in the future, the line can be carried to that point, and also by describing the autonomous strategy adopted to complete this task.

This research contributes to the literature by automating the offshore messenger cable transfer operation approach. This work is based on CV and sensor fusion techniques. Compared to refs. [Reference Durdevic, Ortiz-Arroyo, Li and Yang31Reference Chuang, He and Namiki33], this work aims at approximating the correct part of the vessel to deliver the messenger cable, ensuring robustness to the yaw control.

Different from ref. [Reference Zhang, Li and Zang34], in this work, a deep learning approach is used to detect the ship and its regions. A very limited number of works have evaluated the use of CV to segment a ship’s parts. This means that a good ready-to-use dataset is not available to compare performances and methodologies directly. In [Reference Pitsikalis, Do, Lisitsa and Luo35, Reference Zhao, Zhang, Sun and Xue36], the authors performed ship classification and segmentation, respectively, in a similar fashion to the strategy proposed in this work. These works focused explicitly on classification and did not use this information in their applications.

Table I gives a comparison of the proposed methodology among some related works.

Table I. Related works summary.

1.3. Main contributions

As can be seen, the main goal of this research work is to propose a sensory organization and processing scheme for an autonomous UAV, based on CV and sensor fusion approaches, to deliver the messenger cable from the FPSO to the shuttle tanker, or vice versa. The framework is assessed through simulations performed in the robot operating system (ROS) along with the Gazebo platform [Reference Koenig and Howard37], in a realistic environment, to verify the proposed hybrid approach’s functionality as a proof of concept. Beyond the contribution to the application, the controller packages and the environment developed for the simulations are a contribution to the community for future developments. Another contribution of this work is a databank of 580 images of shuttle tankers, FPSOs, FSOs, and digital 3D models that enables training and validation in future works. The objects (ship, bow, mid-ship, and stern) are annotated and organized in Pascal VOC files, ready to train new neural network CV models [Reference Silva Ramos and Faria Pinto38]. Thus, the main contributions can be summarized as follows:

  • A novel automated approach to offshore messenger cable transfer operation, contributing to literature with an application not yet widely explored;

  • A novel mathematical modeling that considers the agents’ relative positioning and the cable’s static characteristics;

  • Development of a hybrid approach based on deep learning and sensor fusion for the path planning of autonomous UAVs;

  • Tests in a realistic simulation environment to verify the proposed hybrid approach’s functionality as a proof of concept; beyond the contribution to the application, the controller packages and the developed simulation environment are made available to the community for future developments.

1.4. Notations

The notations used in this work are presented below, where the generic Greek letter $\Pi$ is used only to represent a matrix or variable to which a notation is being applied:

  • $\Pi^{-1}$: Inverse of a matrix or variable.

  • $\Pi^{(-)}$: Observation of a matrix or variable before update.

  • $\Pi_f$: Final value of a variable.

  • $\Pi_I$ or $\Pi_o$: Initial value of a variable.

  • $\Pi_k$: $k{\rm th}$ discrete time observation of a matrix or variable.

  • $\Pi_{k+n}$: $n{\rm th}$ observation after the $k{\rm th}$ discrete time observation of a matrix or variable, being n a positive integer.

  • $\Pi_{k-n}$: $n{\rm th}$ observation before the $k{\rm th}$ discrete time observation of a matrix or variable, being n a positive integer.

  • $\Pi_m$: Maximum value of a variable.

  • $\Pi_{quantity}$: Matrix or variable related to a physical quantity (e.g., velocity or “$v$,” angular speed or “$\omega$,” etc.).

  • $\Pi_{sensor}$: Matrix or variable related to a sensor reading (e.g., compass, camera, barometer, etc.).

  • $\Pi^{T}$: Transpose of a matrix.

  • $\Pi_x$: Variable related to the horizontal (or x) coordinate of a pixel in an image.

It is noteworthy that the notations may appear combined in the text.

1.5. Organization

The rest of this article is organized as follows. Section 2 describes the overall proposed hybrid strategy, the mathematical foundations and main idea of the autonomous trajectory and navigation, as well as the cable messenger delivery. Section 3 presents the results in the simulation environment and discusses the adopted algorithms. The final concluding remarks and ideas for future works are in Section 4.

2. Proposed methodology

As stated before, the mission of reaching the messenger cable’s delivery point to transfer it from the platform to the FPSO, or vice versa, is divided into three stages. Figure 2 presents the process in a simplified way.

2.1. Autonomous trajectory

In the first step, the operator must configure the UAV’s offboard controller with the delivery point GPS coordinates and the region of the vessel where this point is located. At this point, the user must also inform the location of the antenna that provided the destination ship’s position, the vessel’s length (in meters), the vessel’s heading angle (in degrees), the flight height, and the desired distance between the ship and the vehicle at the end of this mission’s stage, both in meters. The trajectory calculation reads the coordinates of the takeoff point through GPS, making this the origin of a three-dimensional Cartesian system (x-axis oriented toward the east, y-axis oriented toward magnetic north, and z-axis being the altitude). The system receives the information and designs a trajectory in the form of a straight line, as shown by the red line in Fig. 3. The reason for the trajectory shape is to avoid gas escape points and classified areas around the vessels.

Figure 2. Operation flowchart.

Figure 3. Linear trajectory.

The points of this trajectory are the x and y coordinates (in meters) relative to the takeoff point, and the z value is the flight height in meters. Thus, based on ref. [Reference Karney and Deakin39], given the destination coordinates set by the operator and the takeoff coordinates provided by the GPS, it is possible to calculate the distance D between the takeoff point and the arrival point:

(1) \begin{align} D &= R \cdot \arccos (\!\sin (Lat_i) \cdot \sin(Lat_f)\nonumber\\[3pt] &\quad +\cos(Lat_i) \cdot \cos (Lat_f) \cdot \cos(Lon_i-Lon_f)) \end{align}

where Lat and Lon are the GPS coordinates (in radians) and the subscripts “i” and “f” symbolize start (i.e., takeoff location) and destination, respectively. The parameter R symbolizes the Earth’s radius, taken as $6.373 \times 10^6$ m. Equation (2) gives the azimuthal direction $\theta$ (in degrees) of the destination from the takeoff point.

(2) \begin{equation}\theta = \arctan\!\left(\frac{\Delta Lon}{\Delta \phi}\right) , \end{equation}

where

(3) \begin{equation}\Delta \phi = \log\left[\frac{\tan\!\left(\frac{Lat_f}{2}+\frac{\pi}{4}\right)}{\tan\!\left(\frac{Lat_i}{2}+\frac{\pi}{4}\right)}\right] \end{equation}
(4) \begin{equation}\Delta Lon = |Lon_i - Lon_f| \end{equation}

The line end $(x_f, y_f)$ is approximately the destination vessel’s point where the UAV must deliver the messenger cable. This coordinate is informed to the control node and serves as feedback to measure the UAV’s distance to the target in step 2. Thus, the trajectory calculation publishes this route to the controller with the points (x, y), the flight height at each point (z), and the robot’s orientation (yaw) at each point. The main objective is to continuously maintain the destination vessel in the UAV’s field of view (FOV). Thus, when the robot reaches the flight height, the value of $\theta$ is set up into the yaw controller. After this, the UAV starts the trajectory maintaining the destination on the center of the field of view. The yaw angle varies ± $90^\circ$ (depending on the direction of $\theta$ ) along the trajectory. Thus, the UAV orientation will be approximately perpendicular to the destination point at the end of this stage. It will always be aligned with the vehicle’s line of sight during navigation. The coordinates $(x_f, y_f)$ of the destination point are obtained using:

(5) \begin{equation}\begin{split}x_f= D \cdot \sin \theta \\ y_f= D \cdot \cos \theta\end{split} \end{equation}

Thus, the trajectory is described by points (x, y) that belong to the line h, according to Eq. (6).

(6) \begin{equation} h: y = x \tan b \end{equation}

With $x \in [0, h\cos(b)]$ and $y \in [0, h\sin(b)]$ , where

(7) \begin{equation}a + b = \theta \end{equation}
(8) \begin{equation}a = \arctan \left(\frac{d}{h}\right). \end{equation}

and

(9) \begin{equation}h = \sqrt{D^2+d^2}. \end{equation}
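For concreteness, the following minimal Python sketch evaluates Eqs. (1)–(5), producing the distance D, the azimuth $\theta$, and the Cartesian end point $(x_f, y_f)$ of the trajectory line; the function and variable names are illustrative and do not reproduce the authors’ ROS node.

```python
import math

R_EARTH = 6.373e6  # Earth radius in meters, as adopted in Eq. (1)

def destination_offset(lat_i, lon_i, lat_f, lon_f):
    """Sketch of Eqs. (1)-(5). Inputs are in radians; returns the distance D (m),
    the azimuth theta (rad), and the line end (x_f, y_f) relative to takeoff."""
    # Eq. (1): spherical law of cosines
    D = R_EARTH * math.acos(math.sin(lat_i) * math.sin(lat_f)
                            + math.cos(lat_i) * math.cos(lat_f)
                            * math.cos(lon_i - lon_f))
    # Eqs. (3)-(4): Mercator-projected latitude difference and longitude difference
    d_phi = math.log(math.tan(lat_f / 2 + math.pi / 4)
                     / math.tan(lat_i / 2 + math.pi / 4))
    d_lon = abs(lon_i - lon_f)
    # Eq. (2): azimuthal direction (atan2 is used here only for quadrant safety)
    theta = math.atan2(d_lon, d_phi)
    # Eq. (5): line end in the local east/north frame
    x_f = D * math.sin(theta)
    y_f = D * math.cos(theta)
    return D, theta, x_f, y_f
```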

Notice that the authors only developed the flight control methodology for this paper’s proposed operation. For an initial viability analysis, the messenger line’s catenary is considered, since its geometry is related to the tension applied to the line and, therefore, to the payload that the UAV can carry. The project’s concept was designed to avoid the messenger line touching the water, for the safety of the robot operation, as shown in Fig. 4. The purpose of these assumptions is to estimate beforehand whether the flight methodology would be applicable and to which class of UAV.

Figure 4. Messenger line catenary.

Based on ref. [Reference Veritas40], the minimum point $z_m$ of the catenary, in meters, can be described as:

(10) \begin{equation}z_m = \frac{L}{8}\left(\frac{wL}{T_0}\right)\left(1+\frac{T_0}{EA}\right) \end{equation}

The parameter L is the distance between the takeoff point and the UAV, w is the distributed weight (per unit length) of the messenger line, E is the elasticity modulus of the line, A is the cross-sectional area of the line, and $T_0$ is the traction applied to the messenger cable. This way, once the operator configures the other parameters, the system can provide the approximate traction that has to be applied to the messenger line to prevent it from touching the water, which would be a risk. The system can also predict whether the operation is safe, given the payload capacity of the UAV. Notice that the messenger line is a very thin and light polymeric naval rope, being only used to gradually pull heavier rope and chain compositions until only the mooring link remains between the two naval structures.
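As a minimal sketch of how Eq. (10) could be used in this viability check, the function below inverts the sag expression to obtain the traction $T_0$ required to keep the catenary’s lowest point above a given clearance; it assumes w is the weight per unit length and the variable names are illustrative.

```python
def required_tension(L, w, E, A, z_allow):
    """Invert Eq. (10) for T0 so that the mid-span sag z_m stays below z_allow.
    z_m = (L/8)(w*L/T0)(1 + T0/(E*A)) = w*L**2/(8*T0) + w*L**2/(8*E*A)."""
    elastic_term = w * L**2 / (8.0 * E * A)   # sag contribution independent of T0
    if z_allow <= elastic_term:
        raise ValueError("Clearance cannot be met by any finite tension")
    return w * L**2 / (8.0 * (z_allow - elastic_term))
```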

Similarly to ref. [Reference Chang and Padır41], the thin messenger line is modeled in the simulation based on Gazebo’s “fire hose” model. This model represents a fire hose composed of cylinders connected by spherical joints. The model was modified to make the cylinders’ radius smaller and their length larger. The mass and inertial parameters of each cylinder link in the “hose” were also modified so that it behaves more like the thin messenger line in simulation. Figure 5 presents the schematic of the line model used in the simulations.

Figure 5. Messenger line model for simulations.

In practical terms, the main dynamic force issues to be experienced by the UAV are due to the line’s length, whose carried mass increases over time, and to the drag effects caused by the wind acting on the messenger line. Therefore, in the simulation, each cylinder link is subjected to wind drag (D) and weight (W) forces, as presented in Fig. 6. For each link, the resultant (R) is solved, and therefore the simulated UAV is subjected to a resultant force.

Figure 6. Dynamic forces acting on a line model link.

The weight force W, always pointing downwards, will be solved by:

(11) \begin{equation}W = m \cdot g\end{equation}

where m is the link mass and g is the gravity acceleration. The drag force will be solved on each link by:

(12) \begin{equation}D = \frac{\rho\, C A v^2 \sin\theta}{2}\end{equation}

where $\rho$ is the air density, C is the drag coefficient, A is the cross-sectional area over which the wind is striking, v is the relative velocity between the line model link and the wind, and $\theta$ is the angle of incidence of the wind over the cross-sectional area.

With the two forces calculated for each link, it is possible to obtain the resultant force vector acting along the line. As the vehicle moves away from the takeoff point, Eq. (10) is applied to verify the tension force necessary for the line not to touch the water surface. In order to verify the dynamic effects on the UAV, its roll and pitch angles are analyzed, as well as the vibration about these axes. This indicates whether the UAV is struggling to carry the line due to the weight and drag of the cable. Thus, these results can be used to assess the methodology’s feasibility.
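The per-link force computation of Eqs. (11) and (12) can be summarized by the small sketch below, which assumes the drag acts horizontally and the weight vertically (an illustrative simplification, not the full Gazebo solver).

```python
import math

def link_resultant(m, rho, C, A, v_rel, theta, g=9.81):
    """Weight (Eq. 11) and wind drag (Eq. 12) on one cylinder link of the
    simulated messenger line, combined into a resultant magnitude."""
    W = m * g                                           # weight, pointing downwards
    D = 0.5 * rho * C * A * v_rel**2 * math.sin(theta)  # drag over the exposed area
    return math.hypot(D, W)                             # resultant, drag assumed horizontal
```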

2.2. Inertial navigation aided by CV and EKF

This operation stage refers to the approach to the destination. In addition to the flight control, it is necessary to detail the CNN adopted to perform vessel detection and why it was chosen, the network training, and its results. This step’s flight control has a state prediction and sensor fusion stage based on an EKF; therefore, this section also presents the control and filter modeling. The methodology uses computational vision to aid the navigation, since the vehicle has a vast empty space around it and the destination objects are very large. Furthermore, the GPS and avionic sensors have noisy readings that could cause navigation errors and possibly a collision with the ship. Fusing the camera readings with the sensors using the EKF is an approach to avoid that problem.

2.2.1 Convolutional neural network

The CNN used in this work is YOLOv3. Such a choice was motivated by its versatility, ease of training, well-documented evaluation metrics, and acknowledged detection speed, and also because it uses only convolutional layers, making the CNN invariant to the input image’s resolution and size [Reference Hiemann, Kautz, Zottmann and Hlawitschka42]. For more details, the authors refer to ref. [Reference Choi, Chun, Kim and Lee43]. Based on YOLOv3, YOLOff was developed to detect the ship and its regions (bow, mid-ship, and stern), as presented in ref. [Reference Gonzalez, Kacete, Murienne and Marchand44]. A total of 580 images of oil shuttle vessels and FPSO platforms was selected from an open-source database [45].

A novel approach to the training was used, dividing it into two stages. The first is responsible for freezing the first 26 of the 53 layers, changing only the last layers’ weights. The training monitors the “Yolo-loss function” [Reference Redmon, Divvala, Girshick and Farhadi46], so if the loss and validation loss reach the minimum desirable value in this step, there is no need to train the weights of the frozen layers, and the training finishes in this first step.

The second training stage only begins if the first one does not reach the desired precision. All layers are unfrozen in this step. A variable descending learning rate is applied to make the training process more aggressive, modifying more weights if the loss and validation loss functions stabilize on a baseline for three consecutive epochs. The goal is to reduce the loss function output between epochs even further, resulting in a more accurate detection. The purpose of these two steps is to possibly reduce training time and optimize the training process. Figure 7 illustrates the YOLOv3 structure used in this work.
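A hypothetical Keras-style sketch of this two-stage scheme is shown below; the model, loss, data generators, and target loss are placeholders, and the “variable descent learning rate” is interpreted here as a learning-rate schedule triggered when the validation loss plateaus for three consecutive epochs, which is one plausible reading of the description above.

```python
from tensorflow.keras.callbacks import ReduceLROnPlateau

def two_stage_training(model, yolo_loss, train_gen, val_gen, target_val_loss):
    """Illustrative two-stage training for a 53-layer YOLOv3-style Keras model."""
    # Stage 1: freeze the first 26 layers and train only the remaining weights.
    for layer in model.layers[:26]:
        layer.trainable = False
    model.compile(optimizer='adam', loss=yolo_loss)
    hist = model.fit(train_gen, validation_data=val_gen, epochs=50)

    # Stage 2: only if stage 1 did not reach the desired loss. Unfreeze all layers
    # and change the learning rate when the validation loss stabilizes for three epochs.
    if min(hist.history['val_loss']) > target_val_loss:
        for layer in model.layers:
            layer.trainable = True
        model.compile(optimizer='adam', loss=yolo_loss)
        model.fit(train_gen, validation_data=val_gen, epochs=100,
                  callbacks=[ReduceLROnPlateau(monitor='val_loss',
                                               factor=0.5, patience=3)])
    return model
```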

Figure 7. YOLOv3 convolutional neural network structure.

A detector was developed to select the objects of greatest interest (the destination region), in particular those with the highest detection confidence. The chosen object’s centroid position goes through an EKF prediction step to avoid sudden fluctuations. The centroid value of the predicted region is published by a ROS node to the controller so as to keep the UAV always oriented toward the destination region.

The YOLOff network test was based on the mean average precision (mAP) methodology and on the $Precision \times Recall$ (or $P\times R$) curve [Reference Padilla, Netto and da Silva47]. The mAP metric allows inferring how well a detected object is related to its respective ground truth (GT). However, this methodology can lead to misinterpretations because the network may detect the annotated objects along with several non-existing objects within the image. The $P\,{\times}\,R$ curve eliminates this problem, where P is the ratio between true positives (TPs) and the total of detected objects in an image, and R denotes the ratio between TPs and the total of GTs in an image. P is expected to fall as R improves, but a good object detector keeps both P and R high at the end of the process.
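In terms of counts of true positives (TP), false positives (FP), and false negatives (FN), these two ratios reduce to the short sketch below (illustrative only).

```python
def precision_recall(tp, fp, fn):
    """Precision = TP over all detections; Recall = TP over all ground truths."""
    precision = tp / (tp + fp) if (tp + fp) else 0.0
    recall = tp / (tp + fn) if (tp + fn) else 0.0
    return precision, recall
```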

2.2.2 Extended Kalman filter

As described in ref. [Reference Kalman48], the Kalman filter is a recursive estimation technique for linear dynamic systems subject to noise; the EKF extends it to nonlinear systems by linearizing the model. The EKF model used in this work is derived from ref. [Reference Mao, Drake and Anderson49]. The algorithm can be divided into two steps: (i) prediction and (ii) update. In this paper, the prediction estimates the state composed of the UAV’s orientation (yaw), the x and y positions, and the linear (v) and angular ($\omega$) velocities. The update step corrects the state found in the prediction phase. The state equations of the filter modeled for this work are:

(13) \begin{equation}\begin{cases}x_{k+1} = x_k + v_k \Delta t_k \cos (yaw_k) + w_{x_k}\\ \\[-9pt] y_{k+1} = y_k + v_k \Delta t_k \sin (yaw_k) + w_{y_k}\\ \\[-9pt] yaw_{k+1} = yaw_k + \omega_k \Delta t_k + w_{yaw_k}\\ \\[-9pt] \omega_{k+1} = \omega_k + \epsilon_{\omega,k} w_{\omega_k}\\ \\[-9pt] v_{k+1} = v_k + \epsilon_{v,k} w_{v_k} \end{cases}\end{equation}

where $\epsilon_{\omega,k}$ and $\epsilon_{v,k}$ are the noise gain terms in the linear and angular velocities, and the w terms are assumed to be the “extra errors,” or noise, in the process or state equations, imposed by non-deterministic disturbances. The $\Delta t_k$ represents the time step between two sensor readings. The linearized state equations can be represented in state-space matrix form as:

(14) \begin{equation} \mathbf{T}_{k+1} = \mathbf{F}(\mathbf{T}_k) \mathbf{T}_k + \mathbf{\Gamma}(\mathbf{w}_k) \mathbf{w}_k \end{equation}

where

(15) \begin{equation}{\mathbf{T}_k}^T = \begin{bmatrix} x_k & y_k & yaw_k & \omega_k & v_k \end{bmatrix} \end{equation}

is the state vector and

(16) \begin{equation}\mathbf{F}(\mathbf{T}_k) = \begin{bmatrix} 1 &\quad 0 &\quad - \Delta t_k v_k \sin\!(yaw_k) &\quad 0 &\quad \Delta t_k \cos\!(yaw_k) \\ \\[-8pt] 0 &\quad 1 &\quad \Delta t_k \cos\!(yaw_k) &\quad 0 &\quad \Delta t_k \sin \!(yaw_k) \\ \\[-8pt] 0 &\quad 0 &\quad 1 &\quad \Delta t_k &\quad 0 \\ \\[-8pt] 0 &\quad 0 &\quad 0 &\quad 1 &\quad 0 \\ \\[-8pt] 0 &\quad 0 &\quad 0 &\quad 0 &\quad 1 \end{bmatrix} \end{equation}

denotes the Jacobian of the state equations with respect to $ \mathbf{T}_k $. Matrix $ \Gamma(w_k) $ contains the partial derivatives of the state equations with respect to the stochastic terms in the $ w_k $ vector:

(17) \begin{equation}{\mathbf{\Gamma}(\mathbf{w}_k)}^T = \begin{bmatrix} 1 &\quad 1 &\quad 1 &\quad 1 &\quad 0\\ \\[-8pt] 1 &\quad 1 &\quad 1 &\quad 1 &\quad 0\\ \\[-8pt] 1 &\quad 1 &\quad 0 &\quad 0 &\quad 1\\ \\[-8pt] \end{bmatrix}. \end{equation}

Note that $ w_k $ relates to the noise covariance, and that $ yaw_k $ is estimated from two sensor components, the flight controller compass and the camera-estimated orientation.

One of the most interesting features of the EKF is sensor fusion. Sensor fusion is defined as the integration of information from many sensors to infer something about a state component. The gain of doing this is to mathematically reduce the values within the process noise and measurement noise covariance matrices, obtaining a more assertive prediction of the measured state by combining the sensors [Reference Zhang and Liao50]. Suppose an example of a simplified UAV system with the following observation model:

(18) \begin{equation}\mathbf{s}_k=\mathbf{M_k}\mathbf{T}_{k-1}+\mathbf{\nu}\end{equation}

where $\mathbf{s}_k$ is the measurement vector at time k, $M_k$ is the measurement matrix, $\mathbf{T}_{k-1}$ is the state vector at time $k-1$, and $\mathbf{\nu}$ is the measurement noise vector. The example’s sensor measurements can be described as:

(19) \begin{equation}\begin{bmatrix}GPS_k\\ \\[-7pt] barometer_k\\ \\[-8pt] compass_k\end{bmatrix}=\begin{bmatrix}f(position) &\quad 0 &\quad 0\\ \\[-8pt] 0 &\quad f(altitude) &\quad 0\\ \\[-8pt] 0 &\quad 0 &\quad f(heading)\end{bmatrix}\begin{bmatrix}position_{k-1}\\ \\[-8pt] altitude_{k-1}\\ \\[-8pt] heading_{k-1}\end{bmatrix}+\mathbf{\nu}\end{equation}

where f denotes the sensor functions that determine the state from each sensor reading. This way, this UAV model for a simple EKF will use GPS readings to predict the vehicle’s position, a barometer to predict altitude, and a compass to predict heading. Supposing the installation of a sonar sensor to also obtain altitude readings, the new system model would be:

(20) \begin{equation}\begin{bmatrix}GPS_k\\ \\[-8pt] barometer_k\\ \\[-8pt] compass_k\\ \\[-8pt] sonar_k\end{bmatrix}=\begin{bmatrix}f(position) &\quad 0 &\quad 0\\ \\[-8pt] 0 &\quad f(altitude) &\quad 0\\ \\[-8pt] 0 &\quad 0 &\quad f(heading)\\ \\[-8pt] 0 &\quad f(altitude) &\quad 0\\\end{bmatrix}\begin{bmatrix}position_{k-1}\\ \\[-8pt] altitude_{k-1}\\ \\[-8pt] heading_{k-1}\end{bmatrix}+\mathbf{\nu}\end{equation}

In this way, the model will then consider two sensor readings for altitude. The measurement noise vector is also augmented by an element representing the noise of the new sensor. Within the filtering steps, the new sensor improves the altitude estimation by diluting the uncertainty of the altitude state observation. This work uses this technique to fuse the camera heading (or yaw) estimation with the compass readings, improving the positioning estimation and providing robustness to the controller in the CV-assisted inertial navigation. The sensor matrix $\mathbf{s_k}$ representing the measured states in this work is:

(21) \begin{equation}\mathbf{s_k}^T = \begin{bmatrix} x_k & y_k & yaw_{k,compass} & yaw_{k,camera} & \omega_k & v_k \end{bmatrix} \end{equation}

where $yaw_{k,compass}$ is just the orientation from the flight control unit (FCU) compass and $yaw_{k,camera}$ is the orientation deviation based on compass and camera readings, defined by:

(22) \begin{equation}yaw_{k, camera} = yaw_{(k-1), camera} + \frac{|img_x - obj_x|}{img_x}\Delta t_k \end{equation}

where the parameter $img_x$ is the horizontal center of the camera’s image, $obj_x$ is the x-coordinate (corrected by a linear Kalman estimation process) of the object centroid detected by the YOLOff CNN in the camera image, and $\Delta t_k$ is the time step between two successive detections.
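The following short sketch illustrates Eq. (22) together with the compass reset rule described in the next paragraph; the function and threshold names are illustrative.

```python
import math

def camera_yaw_estimate(prev_cam_yaw, compass_yaw, obj_x, img_center_x, dt,
                        reset_threshold=math.radians(10)):
    """Camera-based yaw estimate of Eq. (22), reset to the compass reading when
    the two estimates disagree by more than 10 degrees (failed detection)."""
    yaw_cam = prev_cam_yaw + abs(img_center_x - obj_x) / img_center_x * dt
    if abs(yaw_cam - compass_yaw) > reset_threshold:
        yaw_cam = compass_yaw
    return yaw_cam
```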

The higher the $\Delta t_k$, the higher the deviation of the current yaw estimation compared to the previous yaw estimation from the camera. The initial camera yaw is set to be equal to that measured by the compass, and if the yaw estimation from YOLOff differs from the compass measurement by more than $10^\circ$, $yaw_{(k-1), camera}$ is reset to the last compass measurement, avoiding yaw estimations with a high error if the detection fails. As already mentioned, the model used in this work is based on ref. [Reference Mao, Drake and Anderson49]; the EKF steps are executed as follows (a code sketch using the conventional formulation is given after the list):

  i. State prediction:

    (23) \begin{equation} \mathbf{T}_{k+1}^{(-)}=\mathbf{T}_k \end{equation}
  ii. Calculate prediction covariance matrix $\mathbf{P}$ :

    (24) \begin{equation} \mathbf{P}_{k+1}^{(-)}=\mathbf{F}(\mathbf{T}_k) \mathbf{P}_k \mathbf{F}(\mathbf{T}_k)^{T} + \mathbf{\Gamma}(\mathbf{w}_k) \mathbf{Q}_k \mathbf{\Gamma}(\mathbf{w}_k^{T}) \end{equation}
    where $\mathbf{Q}_k$ is the process noise covariance matrix
  iii. Calculate filter gain $\mathbf{K}$ :

    (25) \begin{equation} \mathbf{K}_{k+1}=\frac{\mathbf{P}_{k+1}^{(-)}\mathbf{H}(\mathbf{T}_k)^{T}}{\mathbf{H}(\mathbf{T}_{k+1})^T+\mathbf{R}_{k+1}} \end{equation}
    where $\mathbf{P}_{k+1}^{(-)}$ is obtained by Eq. (24), $\mathbf{H}(\mathbf{T}_k)$ is the Jacobian of the measurement model with respect to the states, and $\mathbf{R}_k$ is the covariance matrix of the sensor noise.
  iv. Update states:

    (26) \begin{equation} \mathbf{T}_{k+1}=\mathbf{T}_{k+1}^{(-)}+\mathbf{{M_{k+1}}}^{(-)}\mathbf{K}_{k+1}, \end{equation}
    where $\mathbf{M}_k$ is the measurement matrix, obtained as a function of sensor matrix $\mathbf{s_k}$ , the sensor error vector $\boldsymbol{\nu}$ and the states matrix $\mathbf{T_k}$ :
    (27) \begin{equation} \mathbf{M}_k=\mathbf{s_k}\mathbf{T_k}^{-1}-\boldsymbol{\nu} \end{equation}
  v. Finally, the prediction covariance matrix is updated:

    (28) \begin{equation} \mathbf{P}_{k+1}=[\mathbf{I}-\mathbf{K}_{k+1}\mathbf{H}(\mathbf{T}_{k+1})]\mathbf{P}_{k+1}^{(-)} \end{equation}
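For reference, the sketch below implements the predict/update cycle for the five-state model of Eq. (13) and the measurement vector of Eq. (21), using the conventional textbook gain $\mathbf{K} = \mathbf{P}^{(-)}\mathbf{H}^{T}(\mathbf{H}\mathbf{P}^{(-)}\mathbf{H}^{T}+\mathbf{R})^{-1}$ rather than the compact notation of Eqs. (25)–(27); it is an illustrative formulation, not the authors’ code.

```python
import numpy as np

def ekf_step(T, P, z, Q, R, dt):
    """One EKF cycle for the state T = [x, y, yaw, omega, v] of Eq. (13) and the
    measurement z = [x, y, yaw_compass, yaw_camera, omega, v] of Eq. (21)."""
    x, y, yaw, omega, v = T
    # Predict: propagate the state through the motion model of Eq. (13)
    T_pred = np.array([x + v * dt * np.cos(yaw),
                       y + v * dt * np.sin(yaw),
                       yaw + omega * dt,
                       omega,
                       v])
    # Jacobian of the motion model, Eq. (16)
    F = np.array([[1, 0, -dt * v * np.sin(yaw), 0, dt * np.cos(yaw)],
                  [0, 1,  dt * v * np.cos(yaw), 0, dt * np.sin(yaw)],
                  [0, 0, 1, dt, 0],
                  [0, 0, 0, 1, 0],
                  [0, 0, 0, 0, 1]])
    P_pred = F @ P @ F.T + Q
    # Measurement model: compass and camera both observe the same yaw state (fusion)
    H = np.array([[1, 0, 0, 0, 0],
                  [0, 1, 0, 0, 0],
                  [0, 0, 1, 0, 0],   # yaw from the compass
                  [0, 0, 1, 0, 0],   # yaw from the camera
                  [0, 0, 0, 1, 0],
                  [0, 0, 0, 0, 1]])
    # Update with the conventional Kalman gain
    S = H @ P_pred @ H.T + R
    K = P_pred @ H.T @ np.linalg.inv(S)
    T_new = T_pred + K @ (z - H @ T_pred)
    P_new = (np.eye(5) - K @ H) @ P_pred
    return T_new, P_new
```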

Using the camera to maintain the correct orientation and predicting the UAV’s position with the EKF, the vehicle can safely begin to navigate toward the destination ship. In order to correct the yaw deviation error and maintain the direction of the destination, a generic PID controller is used to send the yaw increase or decrease command to the UAV’s yaw controller. The attitude control of the PX4 flight controller unit is based on ref. [Reference Brescianini, Hehn and D’Andrea51]; thus, the stability of the controller with respect to the yaw, pitch, and roll angles is already covered there. The methodology described in this work is used to predict deviations and correct them; still, to execute the corrections, the UAV uses the native control loops implemented on the FCU [52]. Figure 8 presents the closed-loop control diagram that corrects the UAV’s yaw in stage 2 of the present methodology.

Figure 8. Control loop of approaching flight mode strategy adopted in stage 2 (consider $yaw_y$ as the $yaw_{k, camera}$ and $yaw_c$ as the $yaw_{k, compass}$ of the model description).

2.3. Cable delivery location

This final stage of the proposed methodology occurs when the robot reaches the maximum approach distance or recognizes the messenger cable’s marker. The sign can be a fiducial marker (a system comprising a set of valid markers and a CV algorithm responsible for performing the detection [Reference Garrido-Jurado, Muñoz-Salinas, Madrid-Cuevas and Marn-Jiménez53]) or a simple QR code, the latter being implemented in this work. This marker is a symbol strategically positioned at the vessel’s area where the robot must make the final approach and release the cable. Another possible approach is for an operator to carry the sign and show the UAV where to land. Both fiducial markers and QR codes allow the controller to estimate the camera’s positioning data (and therefore the UAV’s position data) based on the position of the found sign and also to calculate the distance to the target more reliably than using the ship’s region detected by the YOLOff camera node. However, in the current work, the UAV simply stops the mission when it detects the marker in simulation. In future works, the marker will be used for the final approximation and the release of the messenger line.

The marker detection was performed using pyzbar, a Python library for barcode and QR code detection in digital images. As a numerical precision metric, the authors used the intersection over union (IOU) [Reference Blanger and Hirata54, Reference Rezatofighi, Tsoi, Gwak, Sadeghian, Reid and Savarese55] to evaluate the marker detection algorithm’s accuracy. The IOU measures how well a detected object in an image relates to the real object (or GT). Mathematically, the metric is the ratio between the intersection of the detected and GT bounding box areas and the union of those same areas, as shown in Fig. 9.

Figure 9. IOU metric definition.

When the camera-kf node detects the marker in simulation, the IOU for each image frame is calculated and saved in a vector. The results are evaluated in terms of average, standard deviation, and variance in the results section.
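A minimal sketch of this detection and scoring step is given below, combining a pyzbar-based QR code detection with an IOU computation against a ground-truth box; the function names and the box format are illustrative.

```python
from pyzbar import pyzbar

def detect_marker(frame):
    """Detect a QR code in a camera frame and return its bounding box as
    (x_min, y_min, x_max, y_max), or None if nothing is decoded."""
    decoded = pyzbar.decode(frame)
    if not decoded:
        return None
    r = decoded[0].rect
    return (r.left, r.top, r.left + r.width, r.top + r.height)

def iou(box_a, box_b):
    """Intersection over union of two (x_min, y_min, x_max, y_max) boxes."""
    xa, ya = max(box_a[0], box_b[0]), max(box_a[1], box_b[1])
    xb, yb = min(box_a[2], box_b[2]), min(box_a[3], box_b[3])
    inter = max(0, xb - xa) * max(0, yb - ya)
    area_a = (box_a[2] - box_a[0]) * (box_a[3] - box_a[1])
    area_b = (box_b[2] - box_b[0]) * (box_b[3] - box_b[1])
    union = area_a + area_b - inter
    return inter / union if union else 0.0
```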

3. Results and discussion

3.1. System description

The robotic system was simulated in the Gazebo environment for this experiment. The robot is a Typhoon H480 using the PixHawk PX4 FCU and the MavLink communication protocol for telemetry and the offboard controller, as shown in Fig. 10. The developed simulation scenario is presented in Fig. 11. ROS Kinetic was used for developing the controller, trajectory calculation, and camera node packages in the Python programming language. MavLink was simulated through the MavROS package. The simulation and YOLOff training were executed on Linux 16.04 OS, with an Intel® Core™ i7-3537U 2.00GHz processor, an NVIDIA® GeForce® 625M graphics board, and 6GB of DDR3 1600MHz RAM.

Figure 10. Typhoon H480 3D model used in simulation.

Figure 11. Offloading world designed for the simulation.

The developed offboard controller (denoted as the /offboard-controller node) can read sensor data from the simulated FCU (such as GPS position, IMU data, compass orientation, and barometer altitude) and parse command inputs to the controller (takeoff, navigate, turn, accelerate, etc.). It can also receive the camera information from the /camera-kf node and the trajectory points from the /gps-reading node. Figure 12 presents the active nodes and topics. A video of the methodology applied in the proposed scenario is available at: https://youtu.be/RHPUf1Ss93Y

Figure 12. Active ROS nodes and topics.

3.2. Simulation results

As described in Section 2, the path configured in simulation has a takeoff location point read by the simulated GPS near $(\!-42.963770^\circ S, -26.236411^\circ W)$. The destination point has coordinates around $(\!-42.9638135^\circ S, -26.2391134^\circ W)$, obtained in Gazebo. The configured distance between the UAV and the ship at the end of the trajectory was d = 50 m, and the trajectory was calculated by Eqs. (1)–(9), yielding D = 301.716 m and $\theta = -90.1011^\circ$. The calculated and performed trajectories projected onto the XY plane are presented in Fig. 13.

Figure 13. Calculated and performed trajectories.

The designed controller allows a difference of up to 2 m between the read location and the desired trajectory point, beyond GPS and conversion noises. Still, it presented itself as a very stable trajectory-guided navigation system and inherits some advantages from MavLink (or MavROS in the simulation) and the PX4 firmware, such as automatic takeoff and landing and the “return to home” failsafe. The flight height relative to the takeoff point was configured as 5 m. The maximum UAV ground speed is set by default to 1 m/s in offboard controller mode. This way, the UAV took around $415.764$ seconds to complete the first stage, but it can be accomplished faster by setting the maximum speed to a higher value. Figure 14 shows the measured altitude during operation step 1.

Figure 14. Flight height in simulation step 1.

For safety reasons, the takeoff height was around 10 m, but the UAV quickly moved to the flight height and maintained it during the execution of step 1. There are some deviations caused by aerodynamic and inertial conditions, but the controller performs reasonably well, holding the flight height setpoint. This also justifies the assumption of a fixed z position in the EKF modeling, given the reliable native PX4 altitude controller [52].

Another important parameter of the trajectory planning for the first step was the drone’s orientation. Each (x, y) point of the calculated trajectory was associated with a z height and a desired yaw (orientation). The initial orientation is the destination’s direction, varying $\pm 90^\circ$ through the trajectory (depending on the destination’s direction), so as to always keep the objective in the FOV. Figure 15 presents the UAV orientation during operation step 1.

Figure 15. Drone orientation in simulation step 1.

When the trajectory begins, the orientation is approximately $-90^\circ$ , and it varies linearly to approximately $-180^\circ$ , satisfying the designed model and proving the efficiency of orientation angle control. At the end of operation step 1, the UAV faces the objective at approximately $43.908$ m from the goal. The next step is the approximation to the destination guided by vision.

3.3. YOLOff CNN training and detecting results

For the first training, a set of 560 real shuttle tanker or FSO/FPSO images was annotated manually using the Computer Vision Annotation Tool (CVAT) [56]. This caused YOLOff not to detect the vessels in simulation, because the models used in the simulation are simplified solids, and using only real ship images led to wrong classifications, since the images used for training were much more detailed (more texture characteristics). The second training used 580 images, where 20 of these ship images were generated from the 3D Gazebo models in the simulated world as a data augmentation strategy. With this, the detection results in simulation improved. There were four classes to detect: the ship and its stern, bow, and mid-ship. Figure 16 presents examples of images used for training, and Fig. 17 presents an example of an annotated image using CVAT. Note that a databank of 580 images of shuttle tankers, FPSOs, FSOs, and digital 3D models was built and is available in ref. [Reference Silva Ramos and Faria Pinto38].

Table II. Detection results.

Figure 16. Example of images used for training: (a) FPSO; (b) FSO; (c) shuttle tanker, and (d) digital model of a shuttle tanker.

Figure 17. Example of an annotated image using CVAT.

The training was not completed within the first step (with frozen layers), taking 2.633 days (227,515 s) in total. The initial loss was $1288.57$, and the initial validation loss was $252.10$. The final loss and validation loss were $24.45$ and $27.22$, respectively. The YOLOff training results and some detection examples on real and simulated images are presented in Figs. 18 and 19, respectively.

Figure 18. YOLOff results for all classes.

Figure 19. YOLOff example of detection.

Table II presents more detailed results for TPs, false positives (FPs), true negatives (TNs), and false negatives (FNs) when YOLOff was applied on the 580 training images. Compared to the total number of annotated GTs, TPs’ results demonstrate that YOLOff is a good object detector for the designed application.

For a comparative result, Table III presents the accuracy metric (ACC) and also the true positive, false positive, true negative, and false negative ratios (TPR, FPR, TNR, and FNR, respectively). This table shows that the YOLOff has good accuracy, mainly detecting the ship, and the error ratios (FPR and FNR) are considerably smaller than the correct detection ratios.

Table III. Detection evaluation metrics.

YOLOff proved to be a good object detector for the described application. In the simulation, it runs on the /camera-kf node. The ship mAP was $93.06\%$, and the bow, mid-ship, and stern mAPs were $80.81\%$, $81.39\%$, and $90.82\%$, respectively. The PxR curves also show appropriate behavior, with the Precision metric above 90% and the Recall metric above 80% after all detections in the 580-image dataset. When detecting the destination region in the image, the node feeds the region’s centroid as a data message to the offboard controller node. YOLOff processed the images at 20 frames per second. Table IV presents the precision and recall of the YOLOff detection compared to ref. [Reference Zhang, Li and Zang34]. For ship detection, the methodology of ref. [Reference Zhang, Li and Zang34] is slightly more precise than YOLOff, but it does not detect the ship’s regions, as already described. It is also based on discrete photography, whereas the YOLOff detection is performed in real time at between 15 and 20 fps.

Table IV. Detection comparison between YOLOff and ref. [Reference Zhang, Li and Zang34].

The controller corrects the drone’s orientation and, after the EKF prediction step, sends a velocity command to MavROS, thereby approaching the desired region until the UAV finds the final marker or reaches the approximation limit. Figure 20 presents the approaching results for operation step 2.

Figure 20. Approaching results. (a) Approaching to the objective on operation step 2. (b) Complete performed trajectory.

In order to test a failsafe scenario, the same simulation was executed, but the QR code marker object was first removed from the 3D digital environment. The UAV executed the route as expected, but it did not find the marker since it was not there. When the safety distance limit was reached, which in this simulation was 5 m from the UAV to the approximate location of the marker, the UAV started to fly around, trying to locate the marker. When the operator canceled the mission, it autonomously returned to the home point and landed. The UAV’s position in this mission is presented in Fig. 21.

Figure 21. UAV position in failsafe test.

3.4. Sensor fusion

The navigation results are presented in Fig. 20, which shows that the controller was able to autonomously perform a triangular-shaped trajectory, as expected from Fig. 3, and approach the objective aided only by CV. The modeled EKF is of crucial importance, as it corrects the orientation prediction of the controller. Also, the position prediction using the EKF is smoother than the measured position points, although this position is not used by the inertial navigation controller. This result is still important for a future positional controller implementation during the inertial navigation approaching step. The orientation points over time during operation step 2 are shown in Fig. 22.

Figure 22. EKF yaw predictions on operation step 2.

The peak at the beginning of the predictions is caused by the large initial process covariance matrix P. As expected, the EKF starts with a high prediction error that quickly dissipates; the subsequent predictions have enhanced precision, yielding a yaw angle more stable than the sensor readings. Another notable result concerns the values predicted when sensor occlusions occur: the EKF predictions fill in for the missing sensor readings, giving robustness to the orientation controller.
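To make the behavior seen in Fig. 22 easier to follow, the sketch below implements a one-state yaw filter with a deliberately large initial covariance and a predict-only branch when the measurement is occluded. The random-walk process model, the scalar covariances, and the use of a single fused camera/compass yaw measurement are simplifying assumptions, not the exact filter modeled in the paper.

```python
class YawFilter:
    """Single-state yaw estimator: predict with the commanded yaw rate,
    update with a fused camera/compass yaw when one is available."""

    def __init__(self, yaw0=0.0, p0=1e3, q=1e-3, r=5e-2):
        self.x = yaw0   # yaw estimate [rad]
        self.P = p0     # large initial covariance -> the early peak seen in Fig. 22
        self.Q = q      # process noise (random-walk assumption)
        self.R = r      # measurement noise of the fused yaw reading

    def predict(self, yaw_rate, dt):
        self.x = self.x + yaw_rate * dt
        self.P = self.P + self.Q
        return self.x

    def update(self, yaw_measured):
        if yaw_measured is None:                 # sensor occlusion: keep the prediction
            return self.x
        K = self.P / (self.P + self.R)           # Kalman gain
        self.x = self.x + K * (yaw_measured - self.x)
        self.P = (1.0 - K) * self.P
        return self.x


# Illustrative usage with a measurement dropout in the middle of the sequence.
kf = YawFilter()
for z in [0.10, 0.12, None, None, 0.15]:
    kf.predict(yaw_rate=0.0, dt=0.05)
    print(round(kf.update(z), 4))
```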

3.5. Marker detection

In the final step, the /camera-kf node detects the objective marker, sending an ending command to the offboard controller node. Figure 23 shows the marker detection within the simulation.

Figure 23. Marker signal detection on operation step 3.

Figure 24 shows the IOU results for the marker detection in the simulated camera frames. The proposed methodology achieves a $92.06\%$ average IOU, meaning the detected marker matches the true marker in position and area with good confidence. This is a good result for marker detection and can be built upon in future vision-aided positioning work. The standard deviation of the IOU is $2.41\%$, indicating that the detections do not vary significantly between frames, which gives robustness to the controller and camera nodes.
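The IOU values in Fig. 24 follow the usual intersection-over-union definition between the detected and ground-truth marker bounding boxes. A minimal sketch of that computation is shown below; the boxes are axis-aligned (x_min, y_min, x_max, y_max), and the sample coordinates are made up.

```python
def iou(box_a, box_b):
    """Intersection over union of two axis-aligned boxes (x_min, y_min, x_max, y_max)."""
    ax1, ay1, ax2, ay2 = box_a
    bx1, by1, bx2, by2 = box_b
    # Overlap rectangle (zero width/height if the boxes do not intersect).
    iw = max(0.0, min(ax2, bx2) - max(ax1, bx1))
    ih = max(0.0, min(ay2, by2) - max(ay1, by1))
    inter = iw * ih
    union = (ax2 - ax1) * (ay2 - ay1) + (bx2 - bx1) * (by2 - by1) - inter
    return inter / union if union > 0 else 0.0


# Hypothetical detection vs. ground truth for one camera frame.
print(iou((100, 100, 160, 160), (105, 98, 162, 158)))  # ~0.83
```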

Finally, Fig. 25 compares the UAV's distance from the takeoff point with the marker's distance from that same point. As stated, the vehicle holds its position after reaching the delivery point. The mission was completed in simulation with a root mean squared error of 1.189 m and a mean absolute error of 1.04 m, demonstrating the proposed methodology's precision in reaching the delivery point without expensive GNSS sensors and with a robust control strategy.

Figure 24. IOU results for marker detection on simulation.

Figure 25. UAV position related to the origin (takeoff point) after marker is found.

3.6. Line transport

As seen before, the final point is located approximately 36 m below the takeoff point. At the end of stage 1, the UAV was 41 m above the destination and 301.6914 m from the takeoff point. Using the messenger line's characteristics described in Table V and Eq. (10), the traction that must be applied to the messenger line to prevent the cable from touching the water surface would be 13.0422 N.

The UAV must carry the messenger line's weight, support the traction applied to the line, and deal with environmental forces. To keep the cable above water at the maximum distance, it would have to handle a payload of approximately 20.519 kg: almost 12 kg for the weight of 301.6914 m of rope, around 5 kg (more than double the calculated value) of line traction to keep it above water, and a surplus of 5 kg representing external environmental forces acting on the UAV's body. It is also important to highlight that this distance is an exaggeration, since the shooting position for launching the messenger line requires the ship and the oil rig to be 50 m apart.

In terms of the required safety improvements, receiving the messenger line at a distance of 100–150 m would already be a considerable gain; hence, 301.6914 m is much more than what would be experienced in a real operation. This distance serves as an estimate of the proposed application's viability, indicating that a heavy-lifting UAV would be necessary, such as the one in ref. [57], a robust, water-resistant UAV that can transport around 25 kg of payload. Notice that the Typhoon H480 used in the simulation would not be able to carry a messenger line along the simulated path; still, the objective of this work is to address the flight methodology for performing this task.

In order to verify the methodology's feasibility, it is proposed to replace the normal messenger line used in operations with a thinner and lighter wire, such as a braided microfiber line [58]. The typical messenger line must serve as a guide for the ship to pull the mooring link composition, so it has a breaking strength on the order of 500 tons; such lines have a linear weight of around 30 g/m and a diameter of 10 mm. The replacement strategy consists of delivering the thinner line to the ship and then using it to pull the typical messenger line, continuing the operation as usual. As the simulated Typhoon H480 UAV could not deliver the normal messenger line, the simulation considered this thinner and lighter line to accomplish the task; it has a breaking tension of $38.6$ kg at a diameter of $0.5$ mm and a linear weight of $0.49$ g/m [58], more than enough to be carried by the simulated UAV and then used to pull the normal messenger line. Figure 26 presents the line in simulation.

Table V. List of messenger line’s parameters.

Figure 26. Messenger line in simulation.

Three hundred and fifty links of 1 m each were used, with a mass of $0.49$ g per cylinder (the sphere joints were configured without weight) and a radius of $0.5$ mm, the same radius used for the spheres. This modeled line is so thin and light that Eq. (10) could not be applied to it, since it is applicable only if the tension in the cable is larger than the cable's weight (the total weight for the $301.716$ m maximum distance is $147.841$ g). The Typhoon successfully delivered this line without the catenary touching the water surface at any point. To evaluate the simulated UAV's behavior and stability in this application, Fig. 27 shows the vehicle's pitch and roll angles and the vibration on the yaw, pitch, and roll axes.

Figure 27. Typhoon H480 pitch and roll angles (a) and vibration (b) transporting the line in simulation.
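As a sanity check of why the replacement line is trivially within the simulated vehicle's capability, the short computation below reproduces the weight figures quoted above from the 0.49 g/m linear weight, the 350 one-metre links, and the 301.716 m maximum span; everything else in it is plain arithmetic.

```python
G = 9.81  # gravitational acceleration [m/s^2]

LINK_LINEAR_WEIGHT_KG_M = 0.49e-3  # 0.49 g/m braided microfiber replacement line [58]
LINK_LENGTH_M = 1.0                # length of each cylinder link modeled in the simulation
NUM_LINKS = 350                    # number of links in the modeled chain
MAX_SPAN_M = 301.716               # maximum distance covered during the mission

# Mass of the deployed portion of the line and of the full modeled chain.
deployed_mass_kg = LINK_LINEAR_WEIGHT_KG_M * MAX_SPAN_M                # ~0.1478 kg
modeled_mass_kg = LINK_LINEAR_WEIGHT_KG_M * LINK_LENGTH_M * NUM_LINKS  # ~0.1715 kg

print(f"deployed line weight: {deployed_mass_kg * 1e3:.1f} g ({deployed_mass_kg * G:.2f} N)")
print(f"full modeled chain:   {modeled_mass_kg * 1e3:.1f} g")
```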

As described in the manual [59], the maximum rotation angle is $\pm 35^\circ$ and the maximum rotation rate is $\pm 85^\circ/{\rm s}$. In the simulation, the vehicle's behavior remained within these stability parameters, experiencing more vibration in the last stages, where the visual-inertial navigation was active. The vehicle presented very stable behavior and operated far from the stability limits imposed by the manufacturer, supporting the feasibility of the methodology under the considered models.
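A simple way to verify that the logged attitude stays inside the manufacturer's envelope is to scan it against the limits quoted from the manual. The sketch below illustrates such a check; the sample log values are made up.

```python
MAX_ANGLE_DEG = 35.0   # maximum rotation angle from the Typhoon H480 manual [59]
MAX_RATE_DEG_S = 85.0  # maximum rotation rate from the manual [59]


def within_envelope(angles_deg, rates_deg_s):
    """Return True if every logged pitch/roll angle and rate stays inside the limits."""
    angle_ok = all(abs(a) <= MAX_ANGLE_DEG for a in angles_deg)
    rate_ok = all(abs(r) <= MAX_RATE_DEG_S for r in rates_deg_s)
    return angle_ok and rate_ok


# Hypothetical excerpt of the simulation log (angles in degrees, rates in deg/s).
print(within_envelope([2.1, -3.4, 5.0, -1.2], [10.0, -14.5, 22.3, -8.7]))  # True
```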

Compared to the literature, this research contributes a sensory organization and processing approach for autonomous UAVs applied to offshore messenger cable transfer operations, an application not yet widely explored. The tests to evaluate the proposed approach were performed in a realistic simulated environment considering practical operational constraints, as a proof of concept. The developed framework performed well, showing feasible results.

Although the errors encountered in the proposed approach are mainly caused by the GPS and IMU sensors present in the sensory fusion, the CV method tries to mitigate them. These sensors have accuracy limitations that cannot be easily removed without complementary filtering strategies or CV techniques such as optical flow. The idea of the EKF in the proposed framework is that the predicted values fill in for absent sensor readings, offering robustness to the orientation controller.

When evaluating the CV results, very few works evaluate ship region segmentation in a way that allows a direct comparison; the results shown in refs. [35] and [36] are the closest ones. The results obtained here are 5–10% better for overall ship detection and similar for ship segmentation. The datasets from those works are also not available, which makes a direct comparison difficult. The authors expect that the dataset published here will allow future works in the field to perform such a comparison.

Note that it is difficult to determine precisely the sources of errors in the segmentation process. However, the authors believe that camera position and database size may be the principal factors at play. A larger database with more camera angles may enable more accurate detection.

4. Conclusions and future work

The methodology proved its value as a reliable and precise navigation framework for the described task, as the UAV was able to accomplish the mission without expensive GNSS systems, which would be susceptible to failure due to electromagnetic interference around the vessels. As shown by the metrics, the YOLOff training went well, as did the EKF modeling and its implementation in the controller node. ROS proved to be a robust platform for implementing the proposed application and produced consistent results.

In terms of evaluation, this paper opens up several future works. For instance, the detected marker still needs to be used as feedback for the approximation controller in future implementations. The aerodynamics, inertial forces, and the messenger line's dynamic forces also need an improved analysis in simulation, as do the ship's movements during the operation. The normal sea messenger line must be coupled to the UAV in simulation, and its effects on the UAV's flight behavior must be analyzed before testing the offboard controller in the real proposed environment.

Note that the implemented methodology is ready to be tested on a physical PixHawk PX4 flight controller (without the messenger line attached) due to its robust simulation results and the ready-to-use MavROS offboard controller package implemented. Even the operation with a replacement line presented solid results and may be tested in a real environment; therefore, an important next step is testing the developed methodology on a real vehicle. The simplified analysis of the messenger line's catenary also shows that industrial-class UAV models available on the market could meet the requirements of a real mission such as the one described.

A few extensions of this research are foreseen. First, an economic and technical feasibility study will be performed, estimating the economic gain of applying the proposed framework as well as the associated costs and risks. Another future work is creating an autonomous feasibility map to handle changes in the ships' position and orientation, since an operator is currently responsible for this decision; the intention is for the UAV to land in a safe place in such cases.

5. Declarations

To ensure objectivity and transparency in research and to confirm that accepted principles of ethical and professional conduct have been followed, all authors provide the declaration statements below.

Ethical approval

Not applicable.

Consent to participate

All authors consent to participate.

Consent to publish

All authors consent to publish.

Authors’ contributions

The authors Gabryel S. Ramos and Milena F. Pinto contributed to the conception and solution of the proposed methodology. Gabryel S. Ramos, Milena F. Pinto, Fabricio O. Coelho, Leonardo M. Honório, and Diego B. Haddad contributed equally to writing the manuscript. Fabricio O. Coelho, Leonardo M. Honório, and Diego B. Haddad reviewed the manuscript.

Funding

Not applicable.

Conflicts of interest

The authors have no conflicts of interest to declare that are relevant to the content of this article.

Availability of data and materials

Not applicable.

References

Bogue, R., “Robots in the offshore oil and gas industries: A review of recent developments,” Ind. Robot Int. J. Rob. Res. Appl. 47(1), 1–6 (2019).
Tan, J. H., Roberts, B., Sundararaju, P., Sintive, C., Facheris, L., Vanden Bosch, J., Lehning, V. and Pegg, M., “Transforming Offshore Oil and Gas Production Platforms into Smart Unmanned Installations,” Offshore Technology Conference Asia (Offshore Technology Conference, 2020).
Petillot, Y. R., Antonelli, G., Casalino, G. and Ferreira, F., “Underwater robots: From remotely operated vehicles to intervention-autonomous underwater vehicles,” IEEE Rob. Autom. Mag. 26(2), 94–101 (2019).
Durdevic, P., Ortiz-Arroyo, D., Li, S. and Yang, Z., “Vision aided navigation of a quad-rotor for autonomous wind-farm inspection,” IFAC-PapersOnLine 52(8), 61–66 (2019a).
Yu, L., Yang, E., Ren, P., Luo, C., Dobie, G., Gu, D. and Yan, X., “Inspection Robots in Oil and Gas Industry: A Review of Current Solutions and Future Trends,” 2019 25th International Conference on Automation and Computing (ICAC) (IEEE, 2019) pp. 1–6.
Stavinoha, S., Chen, H., Walker, M., Zhang, B. and Fuhlbrigge, T., “Challenges of Robotics and Automation in Offshore Oil & Gas Industry,” The 4th Annual IEEE International Conference on Cyber Technology in Automation, Control and Intelligent Systems (IEEE, 2014) pp. 557–562.
Menegaldo, L. L., Ferreira, G. A. N., Santos, M. F. and Guerato, R. S., “Development and navigation of a mobile robot for floating production storage and offloading ship hull inspection,” IEEE Trans. Ind. Electron. 56(9), 3717–3722 (2009). doi: 10.1109/TIE.2009.2025716.
Oil Companies International Marine Forum (OCIMF), Guidelines for Offshore Tanker Operations (Oil Companies International Marine Forum, London, UK, 2018).
Sphaier, S. H., Amaral Netto, J. D. do, Oliveira, L. F., Junior, J. S. S., Henrique Sá Correa da Silva, S. and Henaut, M. P. G., “On the safety of offloading operation in the oil industry,” Int. J. Comput. Appl. Technol. 43(3), 207 (2012). doi: 10.1504/ijcat.2012.046307.
Rodriguez, C. E. P., de Souza, G. F. M., Gilberto, F. M. and Martins, M. R., “Risk-based Analysis of Offloading Operations with FPSO Production Units,” Proceedings of the 20th International Congress of Mechanical Engineering, Gramado, Brazil (2009).
Faiz Ahmed, S., Kushsairy, K., Bakar, M. I. A., Hazry, D. and Kamran Joyo, M., “Attitude Stabilization of Quad-Rotor (UAV) System Using Fuzzy PID Controller (an Experimental Test),” 2015 Second International Conference on Computing Technology and Information Management (ICCTIM) (IEEE, 2015) pp. 99–104.
Pinto, M. F., Honório, L. M., Marcato, A. L. M., Dantas, M. A. R., Melo, A. G., Capretz, M. and Urdiales, C., “Arcog: An aerial robotics cognitive architecture,” Robotica 39(3), 1–20 (2020a).
Pinto, M. F., Honorio, L. M., Melo, A. and Marcato, A. L. M., “A robotic cognitive architecture for slope and dam inspections,” Sensors 20(16), 4579 (2020b).
Rodin, C. D., de Lima, L. N., de Alcantara Andrade, F. A., Haddad, D. B., Johansen, T. A. and Storvold, R., “Object Classification in Thermal Images Using Convolutional Neural Networks for Search and Rescue Missions with Unmanned Aerial Systems,” 2018 International Joint Conference on Neural Networks (IJCNN) (2018) pp. 1–8. doi: 10.1109/IJCNN.2018.8489465.
Andrade, F. A. A., Hovenburg, A., de Lima, L. N., Rodin, C. D., Johansen, T. A., Storvold, R., Correia, C. A. M. and Haddad, D. B., “Autonomous unmanned aerial vehicles in search and rescue missions using real-time cooperative model predictive control,” Sensors 19(19), 4067 (2019). doi: 10.3390/s19194067.
Silva, M. F., Ribeiro, A. C., Santos, M. F., Carmo, M. J., Honório, L. M., Oliveira, E. J. and Vidal, V. F., “Design of Angular PID Controllers for Quadcopters Built with Low Cost Equipment,” 2016 20th International Conference on System Theory, Control and Computing (ICSTCC) (IEEE, 2016) pp. 216–221.
Mullenders, V., “Design of a motion monitoring system for unmanned offshore topside installation based on real-time visual object tracking using drones and fixed cameras on an SSCV” (2019).
Selecký, M., Rollo, M., Losiewicz, P., Reade, J. and Maida, N., “Framework for Incremental Development of Complex Unmanned Aircraft Systems,” 2015 Integrated Communication, Navigation and Surveillance Conference (ICNS) (2015) pp. J3-1–J3-9.
Faria Pinto, M., Melo, A., Marcato, A. and Urdiales, C., “Case-based Reasoning Approach Applied to Surveillance System Using an Autonomous Unmanned Aerial Vehicle” (2017) pp. 1324–1329. doi: 10.1109/ISIE.2017.8001437.
Pinto, M. F., Marcato, A. L. M., Melo, A. G., Honório, L. M. and Urdiales, C., “A framework for analyzing fog-cloud computing cooperation applied to information processing of UAVs,” Wireless Commun. Mobile Comput. 2019, 14 (2019).
Coelho, F. O., Carvalho, J. P., Pinto, M. F. and Marcato, A. L., “EKF and Computer Vision for Mobile Robot Localization,” 2018 13th APCA International Conference on Automatic Control and Soft Computing (CONTROLO) (IEEE, 2018) pp. 148–153.
Guo, Y., Miao, L. and Zhang, X., “Quantitative Research on GPS Positioning in an East-North-Up Coordinate System,” 2019 Chinese Control Conference (CCC) (2019) pp. 3958–3963. doi: 10.23919/ChiCC.2019.8866190.
SPE Society of Petroleum Engineers, “Tugboat company takes its business flying with drones” (2018). https://jpt.spe.org/tugboat-company-takes-its-business-flying-drones. Accessed: 2020-06-15.
Kotug, “Drone technology for safer tug operations” (2018). https://www.kotug.com/newsmedia/drone-technology-safer-tug-operations. Accessed: 2020-06-15.
Frederiksen, M. H. and Knudsen, M. P., Drones for Offshore and Maritime Missions: Opportunities and Barriers (Center for Integrative Innovation Management, SDU, April 2018).
Kim, G., Park, H. and Zhang, X., “A Method on Using Drone for Connecting Messenger Lines in Power Transmission Line Construction,” 2019 International Conference on Information and Communication Technology Convergence (ICTC) (2019) pp. 731–736. doi: 10.1109/ICTC46691.2019.8939933.
Tomaštík, J., Mokroš, M., Surový, P., Grznárová, A. and Merganič, J., “UAV RTK/PPK method—an optimal solution for mapping inaccessible forested areas?,” Remote Sens. 11(6) (2019). doi: 10.3390/rs11060721.
Gryte, K., Hansen, J. M., Johansen, T. and Fossen, T. I., “Robust Navigation of UAV Using Inertial Sensors Aided by UWB and RTK GPS.” doi: 10.2514/6.2017-1035.
Casella, E., Drechsel, J., Winter, C., Benninghoff, M. and Rovere, A., “Accuracy of sand beach topography surveying by drones and photogrammetry,” Geo-Marine Lett. 40(2), 255–268 (2020). doi: 10.1007/s00367-020-00638-8.
Zimmermann, F., Eling, C., Klingbeil, L. and Kuhlmann, H., “Precise positioning of UAVs - dealing with challenging RTK-GPS measurement conditions during automated UAV flights,” ISPRS Ann. Photogramm. Remote Sens. Spatial Inf. Sci. IV-2/W3, 95–102 (2017). doi: 10.5194/isprs-annals-IV-2-W3-95-2017.
Durdevic, P., Ortiz-Arroyo, D., Li, S. and Yang, Z., “Vision aided navigation of a quad-rotor for autonomous wind-farm inspection,” IFAC-PapersOnLine 52(8), 61–66 (2019b). doi: 10.1016/j.ifacol.2019.08.049.
Qin, X. and Wang, T., “Visual-based Tracking and Control Algorithm Design for Quadcopter UAV,” 2019 Chinese Control and Decision Conference (CCDC) (2019) pp. 3564–3569. doi: 10.1109/CCDC.2019.8832545.
Chuang, H.-M., He, D. and Namiki, A., “Autonomous target tracking of UAV using high-speed visual feedback,” Appl. Sci. 9(21) (2019). doi: 10.3390/app9214552.
Zhang, Y., Li, Q.-Z. and Zang, F.-N., “Ship detection for visual maritime surveillance from non-stationary platforms,” Ocean Eng. 141, 53–63 (2017). doi: 10.1016/j.oceaneng.2017.06.022.
Pitsikalis, M., Do, T.-T., Lisitsa, A. and Luo, S., “Logic Rules Meet Deep Learning: A Novel Approach for Ship Type Classification,” International Joint Conference on Rules and Reasoning (Springer, 2021) pp. 203–217.
Zhao, H., Zhang, W., Sun, H. and Xue, B., “Embedded deep learning for ship detection and recognition,” Future Internet 11(2), 53 (2019).
Koenig, N. and Howard, A., “Design and Use Paradigms for Gazebo, an Open-Source Multi-robot Simulator,” 2004 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS) (IEEE Cat. No. 04CH37566), vol. 3 (IEEE, 2004) pp. 2149–2154.
Silva Ramos, G. and Faria Pinto, M., “Shuttle tanker, FPSO and FSO image dataset for ship recognition” (2021). doi: 10.21227/9xzy-fd09.
Karney, C. F. F. and Deakin, R. E., “F. W. Bessel (1825): The calculation of longitude and latitude from geodesic measurements,” Astronomical Notes 331(8), 852–861 (2010).
Det Norske Veritas, Recommended Practice DNV-RP-H103: Modelling and Analysis of Marine Operations (2020). https://home.hvl.no/ansatte/tct/FTP/H2020%20Marinteknisk%20Analyse/Regelverk%20og%20standarder/DnV_documents/RP-H103.pdf.
Chang, P. and Padır, T., “Model-based manipulation of linear flexible objects: Task automation in simulation and real world,” Machines 8(3) (2020). doi: 10.3390/machines8030046.
Hiemann, A., Kautz, T., Zottmann, T. and Hlawitschka, M., “Enhancement of speed and accuracy trade-off for sports ball detection in videos—finding fast moving, small objects in real time,” Sensors 21(9) (2021). doi: 10.3390/s21093214.
Choi, J., Chun, D., Kim, H. and Lee, H.-J., “Gaussian YOLOv3: An Accurate and Fast Object Detector Using Localization Uncertainty for Autonomous Driving,” Proceedings of the IEEE International Conference on Computer Vision (2019) pp. 502–511.
Gonzalez, M., Kacete, A., Murienne, A. and Marchand, E., “YOLOff: You only learn offsets for robust 6DOF object pose estimation,” arXiv preprint arXiv:2002.00911 (2020).
Airbus, “Airbus ship detection challenge.” https://www.kaggle.com/c/airbus-ship-detection/data (2021). Accessed in 08/03/2020.
Redmon, J., Divvala, S., Girshick, R. and Farhadi, A., “You Only Look Once: Unified, Real-Time Object Detection,” 2016 IEEE Conference on Computer Vision and Pattern Recognition (CVPR) (2016) pp. 779–788. doi: 10.1109/CVPR.2016.91.
Padilla, R., Netto, S. L. and da Silva, E. A. B., “A Survey on Performance Metrics for Object-Detection Algorithms,” 2020 International Conference on Systems, Signals and Image Processing (IWSSIP) (2020) pp. 237–242.
Kalman, R. E., “A new approach to linear filtering and prediction problems,” ASME J. Basic Eng. 82(1), 35–45 (1960).
Mao, G., Drake, S. and Anderson, B. D. O., “Design of an Extended Kalman Filter for UAV Localization,” 2007 Information, Decision and Control (2007) pp. 224–229. doi: 10.1109/IDC.2007.374554.
Zhang, T. and Liao, Y., “Attitude measure system based on extended Kalman filter for multi-rotors,” Comput. Electron. Agric. 134, 19–26 (2017). doi: 10.1016/j.compag.2016.12.021.
Brescianini, D., Hehn, M. and D’Andrea, R., Nonlinear Quadrocopter Attitude Control, Technical Report (2013). http://hdl.handle.net/20.500.11850/154099.
PixHawk, “Controller diagrams.” https://docs.px4.io/master/en/flight_stack/controller_diagrams.html (2021). Accessed in 05/26/2021.
Garrido-Jurado, S., Muñoz-Salinas, R., Madrid-Cuevas, F. J. and Marín-Jiménez, M. J., “Automatic generation and detection of highly reliable fiducial markers under occlusion,” Pattern Recognit. 47(6), 2280–2292 (2014).
Blanger, L. and Hirata, N. S. T., “An Evaluation of Deep Learning Techniques for QR Code Detection,” 2019 IEEE International Conference on Image Processing (ICIP) (2019) pp. 1625–1629. doi: 10.1109/ICIP.2019.8803075.
Rezatofighi, H., Tsoi, N., Gwak, J., Sadeghian, A., Reid, I. and Savarese, S., “Generalized Intersection over Union: A Metric and a Loss for Bounding Box Regression,” Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition (CVPR) (2019) pp. 658–666.
CVAT, “Computer vision annotation tool.” https://cvat.org/ (2021). Accessed in 01/09/2020.
VulcanUAV, “Aircraft series.” http://vulcanuav.com/aircraft/ (2021). Accessed in 02/28/2021.
Rikimaru, “RP9 professional PE.” http://www.fishingc.com/productinfo/99929.html (2020). Accessed in 10/16/2020.
Yuneec, “Typhoon H user manual v1.1.” https://static.bhphotovideo.com/lit_files/319649.pdf (2020). Accessed in 10/16/2020.