Automatic landing of an unmanned aerial vehicle using machine vision

Article type: Research paper

Authors

1 Faculty member / Department of Mechatronics Engineering, School of Engineering-Emerging Technologies, University of Tabriz

2 M.Sc. graduate / Department of Electronics Engineering, Faculty of Electrical and Computer Engineering, University of Tabriz

Abstract

One of the problems of unmanned aerial vehicles is the risk of an unsuccessful landing or a collision with the ground. The goal of this paper is the accurate and continuous estimation of the drone's position relative to a landing marker using images from the drone's camera, and ultimately automatic landing on a predefined location. Processing is performed in real time with minimal delay. For a precise landing, and to reduce the effects of the delays present in the drone's motion, an algorithm called the "time slicing method" is presented, which divides the motion near the marker into small "movement" and "waiting" intervals. The duration and speed of each movement are set in proportion to the drone's distance from the target. Experimental results confirm the success of the proposed method: the drone can land on the target with an accuracy below 3 cm in less than 15 seconds.
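As a rough illustration of the monocular geometry behind such a position estimate (a minimal sketch, not the paper's actual estimator; the focal length and marker size used below are assumed example values), the camera-to-marker distance follows from the pinhole camera model:

```python
def marker_distance(focal_px: float, marker_size_m: float, marker_px: float) -> float:
    """Estimate camera-to-marker distance with the pinhole model.

    focal_px:      camera focal length in pixels (obtained from calibration)
    marker_size_m: real side length of the square landing marker, in metres
    marker_px:     observed side length of the marker in the image, in pixels
    """
    if marker_px <= 0:
        raise ValueError("marker must be visible (marker_px > 0)")
    # Pinhole projection: w = f * W / Z  =>  Z = f * W / w
    return focal_px * marker_size_m / marker_px
```

For example, a 20 cm marker that appears 140 px wide to a camera with a 700 px focal length is about 1 m away. A full pose (position and orientation) additionally requires the marker's corner coordinates and a calibrated camera model.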

Keywords


Article Title [English]

Vision-based auto landing of a UAV

Authors [English]

  • Maryam Shoaran 1
  • Mohammad Fattahi Sani 2
1 Department of Mechatronics Engineering, School of Engineering-Emerging Technologies, University of Tabriz, Tabriz, Iran
2 Department of Electronics Engineering, Faculty of Electrical and Computer Engineering, University of Tabriz, Tabriz, Iran
Abstract [English]

Unmanned aerial vehicles (UAVs) have recently become very useful in everyday life. Unsuccessful landings and the danger of collision during landing are among the problems of quadrotor UAVs. The goal of this paper is to present a precise and continuous pose estimation method using monocular machine vision that allows a quadrotor to land automatically on a predefined place. For an accurate landing, and to reduce the effects of existing delays in the drone's motion, we propose an algorithm called the "time slicing method", which divides the drone's moves close to the marker into smaller intervals called "movement" and "waiting". The duration and speed of the movements are proportional to the distance of the drone from the marker. The processing is parallel and has minimal delay. Experimental results verify the success of our method and show that the drone can successfully land on the marker with an error of less than 3 cm in under 15 seconds.
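The movement/waiting idea can be sketched as follows. This is a hypothetical illustration, not the paper's implementation: the gains, the speed cap, and the fixed waiting time are assumptions; only the proportionality of speed and movement duration to the remaining distance comes from the abstract.

```python
def motion_slice(distance_m: float,
                 k_speed: float = 0.5,    # assumed speed gain, (m/s) per metre
                 k_time: float = 0.4,     # assumed duration gain, s per metre
                 max_speed: float = 1.0,  # assumed speed cap, m/s
                 min_move_s: float = 0.1, # assumed shortest useful movement
                 wait_s: float = 0.5):    # assumed fixed settling/waiting time
    """One slice of the approach: command a short 'movement', then 'wait'.

    Both the commanded speed and the movement duration shrink with the
    remaining distance to the marker, so the drone creeps in as it gets close.
    Returns (speed, movement duration, waiting duration).
    """
    speed = min(max_speed, k_speed * distance_m)
    move_s = max(min_move_s, k_time * distance_m)
    return speed, move_s, wait_s
```

A landing loop would re-estimate the pose during each waiting interval and only then issue the next slice, so a stale pose estimate never drives a long move.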

کلیدواژه‌ها [English]

  • unmanned aerial vehicle
  • quadrotor
  • automatic landing
  • pose estimation
  • machine vision