Advanced manufacturing involves complex autonomous and robotic systems, and monitoring them and the product for possible anomalies is of utmost importance to ensure product quality as well as the accuracy and safety of the system. While some anomalies are delay-tolerant, critical anomalies need real-time detection and mitigation, e.g., irregular motion of a robot or the sudden presence of an obstacle in the robot's pathway; timely detection of these anomalies is essential. Anomaly detection in the products is also required for quality control, and early detection can prevent costly product losses.
Although fixed camera-based machine vision systems can be used for monitoring, they have a limited field of view and are not flexible in terms of adaptively focusing on the area of an anomaly. Further, large machinery or other obstructions can create blind spots for fixed cameras, causing them to miss important sections of the view. Drones, on the other hand, can fly above the target areas, adjusting their position based on detected events or cues. They can also maneuver around obstacles, providing views from multiple angles and eliminating blind spots. Drones can additionally be equipped with other sensors, e.g., thermal or chemical sensors, for non-visual monitoring, and thus can provide a compact, integrated system for inspecting and monitoring advanced manufacturing systems. However, drones face significant challenges operating in indoor industrial environments in terms of indoor localization and safe operation. This project aims to develop a lightweight drone-based inspection and monitoring system for the Industrial IoT by solving those challenges.
Figure 1. AIMSLab's indoor fully netted drone flying facility with an OptiTrack motion capture system and multiple advanced drones of different sizes
Research Objectives: The objective of this project is to develop a drone-based inspection and monitoring system that can carry multiple sensors, e.g., depth-sensing and thermal-imaging cameras, localization sensors, etc., and also has onboard computing capacity for executing anomaly detection algorithms in real time while running its own autonomous navigation and collision-avoidance algorithms. Figure 1 shows the facilities in the AIMSLab at LARRI at UofL which will be used for this project. We will start with a small drone (a Crazyflie with a small visual camera) and the OptiTrack motion capture system for indoor localization. Time permitting, we will try a more capable drone with a thermal camera to extend the system's capabilities.
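As a concrete illustration of the indoor navigation component, the sketch below shows one simple way to turn position fixes from a motion capture system into speed-limited velocity setpoints for a small drone. This is a minimal proportional-control sketch for illustration only; the function names, gains, and speed limits are assumptions, not the project's actual flight code.

```python
import math

# Illustrative assumptions: a conservative indoor speed cap, a proportional
# gain, and an arrival tolerance. These are NOT values from the project.
MAX_SPEED = 0.5   # m/s, assumed indoor speed limit
KP = 1.0          # assumed proportional gain
TOLERANCE = 0.05  # m, assumed distance at which a waypoint counts as reached

def velocity_setpoint(position, waypoint, kp=KP, max_speed=MAX_SPEED):
    """Compute a speed-clamped velocity command from the current
    motion-capture position fix toward the next waypoint.

    Returns (vx, vy, vz) in m/s and a flag indicating arrival."""
    error = [w - p for p, w in zip(position, waypoint)]
    dist = math.sqrt(sum(e * e for e in error))
    if dist < TOLERANCE:
        return (0.0, 0.0, 0.0), True   # waypoint reached: hover in place
    speed = min(kp * dist, max_speed)  # slow down near the goal, cap far away
    return tuple(speed * e / dist for e in error), False

# Example: drone at the origin, waypoint 1 m ahead at 0.5 m altitude.
cmd, reached = velocity_setpoint((0.0, 0.0, 0.0), (1.0, 0.0, 0.5))
```

In a real system this loop would run at the motion-capture update rate, with the command sent to the drone's setpoint interface; the clamping keeps motion safely slow in a netted indoor space.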
Research Plan: (1) Build the integrated drone system and integrate OptiTrack-based indoor localization for autonomous or planned navigation in the indoor environment; (2) use previously developed deep-learning-based anomaly detection algorithms to detect specific anomalies, along with the drone's autonomous navigation algorithm; (3) create a real-time demonstration of the system; (4) subject to the available time, make the system more efficient with lightweight AI models and more capable by using a bigger drone with more sensors.
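To illustrate the real-time anomaly-flagging loop in step (2), the sketch below applies a rolling z-score test to a scalar signal such as a mean inter-frame pixel difference. The project's actual detectors are deep-learning models; this statistical baseline, with an assumed window size and threshold, only shows how per-frame flagging can fit into an onboard real-time loop.

```python
from collections import deque

# Illustrative sketch, not the project's detector: flag a frame as anomalous
# when its feature value deviates strongly from the recent rolling window.
class RollingAnomalyDetector:
    def __init__(self, window=30, z_threshold=3.0):
        # window size and z-score threshold are assumed example values
        self.history = deque(maxlen=window)
        self.z_threshold = z_threshold

    def update(self, value):
        """Ingest one per-frame feature value; return True if anomalous."""
        anomalous = False
        if len(self.history) >= 5:  # wait for a few samples before judging
            mean = sum(self.history) / len(self.history)
            var = sum((x - mean) ** 2 for x in self.history) / len(self.history)
            std = var ** 0.5
            if std > 0 and abs(value - mean) / std > self.z_threshold:
                anomalous = True
        self.history.append(value)
        return anomalous

# Example: a steady signal followed by a sudden jump (e.g., an obstacle
# appearing in view) should trip the flag only on the last frame.
det = RollingAnomalyDetector()
flags = [det.update(v) for v in [1.0, 1.1, 0.9, 1.0, 1.05, 1.0, 8.0]]
```

In the actual pipeline, a learned model would replace the z-score test, but the surrounding loop (score each frame, compare to a threshold, trigger mitigation) stays the same.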
REU Student Outcomes: This project fits a student with coding skills (e.g., in computer science; AI experience, if any, is a plus), along with some hands-on hardware-integration skills. The REU student will gain hands-on experience in drone technology (hardware and software) and learn to use computer vision and other algorithms. The student will be supervised by Dr. Sabur Baidya (Computer Science and Engineering) and will participate in writing conference and/or journal papers.