Keynotes

Prof. Davide Scaramuzza
Professor of Robotics and Perception at the University of Zurich
Biography
Davide Scaramuzza is a Professor of Robotics and Perception at the University of Zurich. He did his Ph.D. at ETH Zurich, a postdoc at the University of Pennsylvania, and was a visiting professor at Stanford University. His research focuses on autonomous, agile microdrone navigation using standard and event-based cameras. He pioneered autonomous, vision-based navigation of drones, which inspired the navigation algorithm of the NASA Mars helicopter and many drone companies. He contributed significantly to visual-inertial state estimation, vision-based agile navigation of microdrones, and low-latency, robust perception with event cameras, which were transferred to many products, from drones to automobiles, cameras, AR/VR headsets, and mobile devices. In 2022, his team demonstrated that an AI-powered drone could outperform the world champions of drone racing, a result published in Nature and considered the first time an AI defeated a human in the physical world. He is a consultant for the United Nations on disaster response and disarmament. He has won many awards, including an IEEE Technical Field Award, elevation to IEEE Fellow, the IEEE Robotics and Automation Society Early Career Award, a European Research Council Consolidator Grant, a Google Research Award, two NASA TechBrief Awards, and many paper awards. In 2015, he co-founded Zurich-Eye, today Meta Zurich, which developed the world-leading virtual-reality headset Meta Quest. In 2020, he co-founded SUIND, which builds autonomous drones for precision agriculture. Many aspects of his research have been featured in the media, such as The New York Times, The Economist, and Forbes.
Title: Agile Vision-based Flight
Abstract
Autonomous drones play a crucial role in inspection, agriculture, logistics, and search-and-rescue missions and promise to increase productivity by a factor of 10. However, they still lag behind human pilots in speed, versatility, and robustness. What does it take to fly autonomous drones as agile as or even better than human pilots? Autonomous, agile navigation through unknown, GPS-denied environments poses several challenges for robotics research regarding perception, learning, planning, and control. In this talk, I will show how the combination of model-based and machine-learning methods, united with the power of new, low-latency sensors, such as event cameras, can allow drones to achieve unprecedented speed and robustness by relying solely on onboard computing. This can result in better productivity and safety of future autonomous aircraft.
Prof. Guido De Croon
Full Professor in Bio-inspired Micro Air Vehicles
Department of Aerospace Engineering
Delft University of Technology
Biography
He received his M.Sc. and Ph.D. in the field of Artificial Intelligence (AI) at Maastricht University, the Netherlands. His research interest lies with computationally efficient and often bio-inspired algorithms for robot autonomy, with an emphasis on computer vision. Since 2008 he has worked on algorithms for achieving autonomous flight with small and lightweight flying robots, such as the DelFly flapping-wing MAV. In 2011-2012, he was a research fellow in the Advanced Concepts Team of the European Space Agency, where he studied topics such as optical-flow-based control algorithms for extraterrestrial landing scenarios. Currently, he is full professor at TU Delft and scientific lead of the Micro Air Vehicle lab (MAV-lab) of Delft University of Technology.
Title: Neuromorphic Artificial Intelligence for small, autonomous drones
Abstract
Small, autonomous drones are promising for many applications, such as search-and-rescue, greenhouse monitoring, or keeping track of stock in warehouses. Since they are small, they can fly in narrow areas. Moreover, their light weight makes them very safe for flight around humans. However, making small drones fly completely by themselves is an enormous challenge. Most approaches to Artificial Intelligence for robotics have been designed with self-driving cars or other large robots in mind, and these are able to carry many sensors and ample processing. In my talk, I will argue that a different approach is necessary for achieving autonomous flight with small drones. In particular, I will discuss how we can draw inspiration from flying insects and endow our drones with similar intelligence. Examples include flapping-wing drones that can deal with collisions, and swarms of CrazyFlie quadrotors of 30 grams able to explore unknown environments and find gas leaks. The intelligence of these small drones will be further enhanced by novel neuromorphic sensing and processing technologies. I will discuss recent experiments we performed on neuromorphic AI in our lab, from ego-motion estimation to agile flight. Finally, I will present our research in the context of the European SPEAR project: that it is possible to evolve drone bodies for specific tasks, outcompeting traditional drone designs in tasks such as drone racing.
Prof. Hyunchul Shim
Professor in the Department of Aerospace Engineering at Korea Advanced Institute of Science and Technology
Biography
Dr. David Hyunchul Shim received the B.S. and M.S. degrees in mechanical design and production engineering from Seoul National University, Seoul, Korea, in 1991 and 1993, respectively, and the Ph.D. degree in mechanical engineering from the University of California, Berkeley, USA in 2000. From 1993 to 1994, he was with Hyundai Motor Company, Korea. From 2001 to 2005, he was with Maxtor Corporation, Milpitas, CA, USA as a Staff Engineer. From 2005 to 2007, he was with the University of California, Berkeley as a Principal Engineer, in charge of the Berkeley Aerobot Team. In 2007, he joined the Department of Aerospace Engineering, KAIST, South Korea, as an Assistant Professor. He is now Professor in the School of Electrical Engineering. During his career at KAIST, he has led a number of efforts, serving as Director of the KI Robotics Institute at KAIST from ’19 to ’22, advisor for the RPAS panel at ICAO, and Director of the Korea Civil RPAS Research Center. He is also very active in a number of the world’s premier competitions, such as the DARPA Subterranean Challenge, Lockheed Martin AlphaPilot, and the Indy Autonomous Challenge. He has received a number of major awards from the Korean government and global events. His interests center on the combination of robotics and AI with aerial and ground vehicles for highly intelligent vehicles.
Title: A Quest for Autonomously Exploring Drones
Abstract
Towards the end of the 20th century, a Japanese TV show featured electric helicopters flying through complex mazes, hand-controlled by human pilots for a big prize. Watching it, I wondered: what if UAVs could do this autonomously? At that time, drones were bulky, typically the size of a motorbike, and onboard SLAM or lightweight perception was still far in the future. By the mid-2010s, the landscape had shifted: small multirotors had emerged, and GPUs became compact enough to fly. In 2016, I co-founded what is recognized as the first autonomous drone racing competition, held at IROS in Daejeon. Since then, we have witnessed a surge of autonomous drone challenges, from the DARPA Subterranean Challenge to Lockheed Martin’s AlphaPilot and, most recently, the UAE’s A2RL DCL. In this talk, I will reflect on this trajectory: from the point when quadrotors came to symbolize drones, through the breakthroughs enabled by global competitions, to the lessons learned from my own first-hand experiences. I will also discuss the challenges ahead in building more capable, trustworthy, and socially beneficial drone technologies.
Capt. Rafael Cruz Gómez and Lt. Gustavo Guerrero Clavel
Captain Rafael Alberto Cruz Gómez and Lieutenant Gustavo Guerrero Clavel of the Secretariat of the Navy (SEMAR)
Biography
Captain Gómez graduated as an electrical engineer from the Escuela de Ingenieros of the Mexican Navy. He holds a dual master’s degree, a Master of Science in Electrical Engineering and a Master of Science in Engineering Acoustics, from the United States Naval Postgraduate School. He has held several appointments in the Mexican Navy, such as Embedded Systems designer for the UAVs SPARTAAMv44, SPARTAAMv88, and SPARTAAMv200. LCDR Gómez has been awarded several distinctions, including Medals of Perseverance, the Mexican Navy Research and Development distinctive, and an Honorable Mention for excellence in UAV design. Currently, LCDR Gómez is the head of the Embedded Systems Lab in the SPARTAAM UAS Program at the Mexican Navy.
Lieutenant Guerrero, Electronics and Naval Communications Engineer, graduated from the Naval Engineers School of Mexico (2006–2011) and completed a specialisation in Operational Logistics at CESNAV in 2021. He has received technical training from Microchip, a PCB Design Course for military, aerospace, and extreme environments, and IPC/WHMA-A-620 certification from the Institute of Printed Circuits. He has served in various positions at INIDETAM and UNINDETEC, and is currently Head of the Integration and Technical Support Laboratory for Unmanned Aircraft. His work includes avionics engineering in the design, construction, and testing of avionics subsystems, simulation models, UAV prototype integration, as well as operation and technical support for UAV systems such as SPARTAAM-44, SPARTAAM-88, SPARTAAM-VTOL, and SPARTAAM-Satellite. For his contributions, he was awarded the Technological Research and Development Distinction.
Title: Detection and Classification of Small UAVs
Abstract
Small Unmanned Aerial Vehicles (UAVs), or drones, are generating large amounts of data, and new classes of drones are regularly deployed to help with different tasks. The purpose of this talk is to present a technique for drone detection using acoustic means. Although there are several drone-detection techniques, such as electromagnetic signal detection, radar location, and electro-optic and infrared visual detection, it is also feasible to detect the unique acoustic signature of a drone class using modern signal-processing techniques. Using acoustic sensors, a drone can be detected, and with machine-learning techniques its path can be tracked. Furthermore, deep-learning techniques based on neural networks can extract additional information, such as the class of the drone, which can support decisions about the drone's likely main function. The talk concludes with a small routine used to detect and classify drones by their acoustic signatures.
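To give a flavor of how such a routine can work, here is a hypothetical minimal sketch (not the presenters' actual SPARTAAM code): it finds the dominant spectral peak of an audio frame, treating it as a rotor-tone signature, and classifies the drone by nearest match against per-class reference frequencies. A real system would use richer features (e.g. full spectrograms or MFCCs) and a trained neural network; the class names and frequencies below are invented for illustration.

```python
import numpy as np

def dominant_frequency(signal, sample_rate):
    """Return the frequency (Hz) of the strongest peak in the magnitude spectrum."""
    windowed = signal * np.hanning(len(signal))          # reduce spectral leakage
    spectrum = np.abs(np.fft.rfft(windowed))
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / sample_rate)
    return freqs[np.argmax(spectrum)]

def classify_drone(signal, sample_rate, classes):
    """Nearest-centroid classification on the dominant frequency.

    `classes` maps a label to its expected rotor-tone frequency (Hz),
    e.g. measured beforehand from recordings of each drone class.
    Returns (best_label, measured_frequency).
    """
    f0 = dominant_frequency(signal, sample_rate)
    best = min(classes, key=lambda label: abs(classes[label] - f0))
    return best, f0

# Synthetic example: a 200 Hz rotor tone buried in noise.
sr = 8000
t = np.arange(sr) / sr
tone = np.sin(2 * np.pi * 200 * t)
tone += 0.1 * np.random.default_rng(0).standard_normal(sr)

# Hypothetical reference signatures for two drone classes.
label, f0 = classify_drone(tone, sr, {"small-quadrotor": 200.0,
                                      "large-quadrotor": 120.0})
```

With a one-second frame at 8 kHz the FFT bins fall on whole hertz, so the 200 Hz tone is recovered almost exactly and the frame is assigned to "small-quadrotor".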
Xavier Páez
WG Drones Company
Title: The history of a drone show company

Prof. Tom Richardson
Professor Tom Richardson, Faculty of Engineering, University of Bristol, UK.
Biography
Tom Richardson is Professor of Aerial Robotics at the University of Bristol in the UK and specialises in the application of modern control theory and novel sensors to UAS/Drones. He has been granted permission for BVLOS (Beyond Visual Line of Sight) operations in multiple countries and was the International Drone Safety Lead for the Multinational Deep Carbon Observatory (DCO) funded ABOVE field campaign to Papua New Guinea in 2019. Tom is a founding partner of Perceptual Robotics, has held an NPPL (pilot's licence) for over 15 years, and has worked with a range of industrial partners, including DSTL, BAE Systems, and Thales. Recent projects include MIMRee, on the robotic inspection and maintenance of wind turbines, and Prometheus, on the use of drones for underground inspection.
Title: MACE: Development of a Medium-Altitude Fixed-Wing Drone for Environmental Monitoring
Abstract
Natural events, particularly volcanic eruptions, release tiny particles (aerosols) into the atmosphere and offer invaluable real-world opportunities to study processes relevant to climate science. However, safely and rapidly collecting data from these events is challenging. The MACE project is developing advanced, automated drone technology specifically designed for observing and analysing emissions from active volcanoes. These drones will operate at altitudes of up to 10 km above sea level and will be flown over regularly erupting volcanoes, including Volcán de Fuego (Guatemala), Soufrière Hills (Montserrat), and Lascar (Chile). By analysing these natural volcanic emissions in situ, the research will investigate how tiny cloud droplets form and how natural aerosol layers affect radiation. A key goal is to develop a rapid-response capability using these drones, enabling the scientific community to safely gather crucial data from future significant volcanic eruptions, thereby improving our understanding of natural climate processes.