Autonomous vehicles are part of an expanding industry that spans interdisciplinary fields including, but not limited to, dynamics and control, thermal engineering, sensors, data processing, and artificial intelligence. Autonomous vehicles rely on various sensors, such as optical cameras, RADAR (radio detection and ranging), and LiDAR (light detection and ranging), to navigate the road for self-driving. However, exposure to environmental conditions, in particular the combination of ambient temperature and humidity, poses challenges to sensor performance. For example, a sensor's temperature rises as heat is generated during vehicle operation. In addition, the sensor system can undergo thermal shock from sudden changes in temperature, such as moving from an indoor garage at room temperature into a −10°C environment. Furthermore, sustained exposure to cold weather may cause frosting, which can obstruct the optical sensor's visibility. These issues limit the quality of data obtained from optical cameras and, consequently, the reliability of autonomous driving under extreme environmental conditions.
To review the requirements for sensor performance in autonomous vehicles and to formulate solutions that address potential concerns and improve autonomous driving safety, we simulate real-world camera operating conditions. First, we correlate the common placements of optical sensors, focusing mainly on cameras, in autonomous vehicles with naturally occurring environmental conditions of temperature and humidity. This correlation identifies areas of the vehicle that may be more prone to thermal shock or humidity variations. Second, we examine the mechanism and formation sequence of condensation and frosting on vehicle surfaces (e.g., the windshield and camera lenses), which we then use to determine how much water can accumulate on a lens before the sensor's vision is impeded. Third, we introduce and conceptualize machine learning models that extract features by employing object detection algorithms and perform image restoration to reconstruct deteriorated regions despite the presence of droplets or frost on the camera. With this research, we aim to provide a better understanding of the potential caveats, and of algorithmic solutions that can sustain autonomous driving capability even under extreme environmental conditions.
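The condensation-onset criterion underlying the second step can be illustrated with the Magnus dew-point approximation: droplets (or frost, below 0°C) begin to form once a lens or windshield surface cools to the dew point of the surrounding air. The sketch below uses the standard Magnus constants (17.62 and 243.12°C) as an illustrative approximation; it is not the specific condensation model examined in this work.

```python
import math

# Magnus-formula constants for water vapor over a liquid surface
# (a standard approximation, valid roughly from -45 C to 60 C).
A, B = 17.62, 243.12

def dew_point(air_temp_c: float, rel_humidity_pct: float) -> float:
    """Approximate dew-point temperature (deg C) via the Magnus formula."""
    gamma = math.log(rel_humidity_pct / 100.0) + A * air_temp_c / (B + air_temp_c)
    return B * gamma / (A - gamma)

def condensation_expected(surface_temp_c: float, air_temp_c: float,
                          rel_humidity_pct: float) -> bool:
    """Condensation (frost, if below 0 C) forms once the surface
    cools to or below the dew point of the ambient air."""
    return surface_temp_c <= dew_point(air_temp_c, rel_humidity_pct)

# Example: a lens chilled to -10 C (e.g., after overnight parking)
# meeting humid 20 C air, as in the garage-exit scenario above.
print(dew_point(20.0, 50.0))                      # about 9.3 C
print(condensation_expected(-10.0, 20.0, 50.0))   # cold lens: condensation
```

In this toy scenario, the cold lens sits well below the roughly 9.3°C dew point of 20°C air at 50% relative humidity, so water accumulation on the optics is expected.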
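The restoration idea in the third step can be sketched, in a highly simplified form, as masked inpainting: pixels flagged as droplet- or frost-covered are reconstructed from their unobstructed neighbors. The iterative neighbor-averaging below is a toy stand-in for the learned restoration models described above, not the actual algorithm; the function names and grid representation are illustrative assumptions.

```python
def inpaint(image, mask):
    """Fill masked pixels (mask[i][j] == True) with the mean of their
    already-known 4-neighbors, sweeping until no pixel can be filled.
    A toy stand-in for learned image restoration, not a real model."""
    h, w = len(image), len(image[0])
    img = [row[:] for row in image]
    unknown = {(i, j) for i in range(h) for j in range(w) if mask[i][j]}
    while unknown:
        progress = False
        for i, j in sorted(unknown):
            # Use only neighbors that are inside the image and not masked.
            vals = [img[x][y]
                    for x, y in ((i - 1, j), (i + 1, j), (i, j - 1), (i, j + 1))
                    if 0 <= x < h and 0 <= y < w and (x, y) not in unknown]
            if vals:
                img[i][j] = sum(vals) / len(vals)
                unknown.discard((i, j))
                progress = True
        if not progress:  # region fully occluded, no known border to grow from
            break
    return img

# Example: a single "droplet" pixel surrounded by clear pixels.
frame = [[1.0, 1.0, 1.0], [1.0, 0.0, 1.0], [1.0, 1.0, 1.0]]
droplet_mask = [[False, False, False], [False, True, False], [False, False, False]]
print(inpaint(frame, droplet_mask)[1][1])  # restored to 1.0
```

The same masked-reconstruction structure carries over to the learned setting: a detection stage produces the mask, and a restoration model, rather than neighbor averaging, fills the masked regions.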