If your windshield wipers have ever failed during a rainstorm while you had to keep driving, you know how hard it can be to see out the window. Hydrophobic coatings help, but they can only do so much to keep water and dirt out of your way. And like human eyes, the exterior cameras on today's cars also struggle in low light.

That's a problem for automakers selling autonomy-adjacent driver-assistance systems. Automakers compete to offer the best driver aids at the lowest cost, and most of these systems, from lane keeping to "Full Self-Driving," rely on cameras. Hyundai and Kia have now developed a device that addresses one of the biggest weaknesses of camera-based Advanced Driver Assistance Systems (ADAS): surface debris blocking the camera's view.

Camera-cleaning systems aren't new. The BMW iX, for example, has pop-out sprayer nozzles that wash dirt and dust off its front and rear sensors. But sprayers may not remove everything from the lens, and they can temporarily block the camera's view by coating it with liquid or, in the worst case, failing to dry it off.

Hyundai takes a different approach. Its solution is an enclosed unit with a rotating housing that conceals the camera. A glass cover lens at the front of the housing protects the vulnerable camera behind it, so it is the glass cover, not the camera itself, that takes the brunt of the weather.

That outer lens still needs cleaning when it gets dirty, just like on any other car with an exterior camera, including ones like the iX with built-in washers. But Hyundai's system doesn't simply wet the lens and let it air-dry. Instead, a small wiper blade rests against the surface of the outer glass. When the car's software detects an obstruction, it sprays the lens, then rotates only the outer lens while keeping the wiper stationary. The wiper acts like a squeegee, sweeping away dirt and water to restore a clear view of what's happening around the car.
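The sequence described above (detect an obstruction, spray, then rotate the outer lens past a fixed wiper) can be sketched as a simple control loop. Everything here, including the function names, the occlusion metric, and the threshold, is a hypothetical illustration, not Hyundai's actual software:

```python
# Hypothetical sketch of an occlusion-triggered cleaning cycle.
# Names, thresholds, and the occlusion metric are illustrative assumptions.

def occlusion_score(frame):
    """Fraction of pixels judged blocked. Stubbed with a crude proxy
    (very dark pixels); a real system would analyze the camera image."""
    blocked = sum(1 for px in frame if px < 10)
    return blocked / len(frame)

def cleaning_cycle(frame, spray, rotate_lens, threshold=0.15):
    """Spray, then rotate the outer lens past the stationary wiper."""
    if occlusion_score(frame) > threshold:
        spray()               # wet the outer glass first
        rotate_lens(turns=1)  # wiper stays fixed and squeegees the glass
        return True           # cleaning was triggered
    return False

# Usage with stub actuators:
log = []
dirty_frame = [5] * 40 + [200] * 60  # 40% "blocked" pixels
ran = cleaning_cycle(dirty_frame,
                     spray=lambda: log.append("spray"),
                     rotate_lens=lambda turns: log.append(f"rotate x{turns}"))
```

The key design point from the article is that the spray always precedes the rotation, so the wiper sweeps a wetted surface rather than grinding dry grit across the glass.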

Hyundai's method is a simple fix for a problem that other camera-equipped cars have struggled to solve, and not just for rearview cameras. As vehicle technology advances, companies worldwide are trying to build cars that drive themselves or at least take most of the stress out of driving.

OEMs have spent heavily to reach this goal, since tier-one components like lidar scanners can cost several thousand dollars each. Vision-based systems are much cheaper, but they have their own problems. Measuring distance with only a monocular camera is like trying to judge the position of an object in front of you with one eye closed. For proper depth perception, a car needs multiple cameras. Even so, this approach costs less than fitting a vehicle with lidar and high-definition radar sensors.
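The multi-camera point rests on standard stereo geometry: two cameras a baseline B apart, each with focal length f (in pixels), see the same object shifted by a disparity of d pixels between their images, giving depth Z = f·B/d. A single camera has no disparity, hence no direct depth. A minimal sketch with illustrative numbers (not drawn from any production system):

```python
def stereo_depth(focal_px, baseline_m, disparity_px):
    """Depth from a calibrated stereo pair: Z = f * B / d.
    A monocular view has no disparity, so depth is undefined."""
    if disparity_px <= 0:
        raise ValueError("disparity must be positive; one camera gives none")
    return focal_px * baseline_m / disparity_px

# Illustrative values: 800 px focal length, cameras 0.3 m apart,
# a feature shifted 12 px between the left and right images.
z = stereo_depth(focal_px=800, baseline_m=0.3, disparity_px=12)
# z = 800 * 0.3 / 12 = 20.0 metres
```

The inverse relationship between disparity and depth also explains why distant objects are hard for stereo rigs: their disparity shrinks toward zero, so small pixel errors translate into large depth errors.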

The drawback of vision-only systems is overall accuracy. Tesla, for example, has spent nearly a decade trying to solve self-driving almost entirely with a vision-based sensor suite. CEO Elon Musk has called lidar, a technology widely used in the self-driving industry, a "crutch." Yet water droplets, dust, and other debris routinely obstruct the sensors on Tesla's vision-based cars. This is called camera occlusion, and it stops the cameras from doing their jobs.

It's essential to remember that Hyundai's new technology is still just a concept. The automaker has built working prototypes but has yet to test whether they are stable and durable enough to sell to customers. If the technology pans out, it could be a big step forward for automakers hoping to close the gap between Level 2, Level 3, and Level 4 ADAS using cameras alone.
