
> The camera alone seems to see a lot:

In perfect conditions, on a sunny day.

I'm in Sweden, and the sun shining directly into your eyes from barely above the horizon, while the wet or snow-covered road reflects it back at you, is a regular occurrence during the winter months. I doubt Tesla's camera will be able to see anything.



This is the reason why a single camera is not capable of being the sole source of information for a self-driving system. The technology currently available for camera systems does not capture a high enough dynamic range to see details in darkness when the Sun is in frame. You could use multiple cameras with different sensitivities to light and combine them, but it's going to be very difficult.
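
A minimal sketch of what that kind of merge could look like, assuming linear-response frames whose relative exposures are known (the function, frame names, and 4:1 ratio below are made up for illustration):

    import numpy as np

    def merge_hdr(frames, exposures, sat=0.95):
        """Merge linear frames shot at different exposures into one
        radiance map.  Clipped pixels are down-weighted so the less
        sensitive frame fills in the highlights around the sun."""
        acc = np.zeros_like(frames[0], dtype=np.float64)
        wsum = np.zeros_like(acc)
        for img, t in zip(frames, exposures):
            w = np.where(img < sat, 1.0, 1e-6)  # ignore saturated pixels
            acc += w * (img / t)                # normalize back to radiance
            wsum += w
        return acc / wsum

    # hypothetical frames normalized to [0, 1], one exposed 4x longer:
    # radiance = merge_hdr([long_frame, short_frame], exposures=[4.0, 1.0])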


I really don't see what's difficult. You don't even need multiple cameras; you can simply use very short exposures and combine the short-exposure shots into a longer-exposure one when needed. Multiple cameras are useful for handling glare, though.
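
For what it's worth, the burst idea fits in a few lines, assuming a stack of linear, already-registered short-exposure frames (the frame count and names here are illustrative):

    import numpy as np

    def synthesize_long_exposure(short_frames):
        """Average a burst of short exposures to emulate a longer one:
        every frame clips the sun the same way, but averaging N frames
        cuts noise by roughly sqrt(N), recovering shadow detail."""
        stack = np.stack([f.astype(np.float64) for f in short_frames])
        return stack.mean(axis=0)

    # e.g. sixteen 1 ms frames behave roughly like one 16 ms exposure
    # in the shadows, without the highlights blooming any further:
    # long = synthesize_long_exposure(frames_1ms)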


I think you two just proved my point


Why would it be very difficult? You can split the same light beam after the lens and send it to two cameras with different diaphragms or sensitivities. You'd then synthesize a perfectly aligned HDR picture.
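
A sketch of that synthesis step, assuming the two outputs are linear, pixel-aligned (they share a lens), and differ by a known sensitivity ratio (the names and 8:1 ratio are assumptions):

    import numpy as np

    def fuse_split_beam(sensitive, insensitive, ratio=8.0, sat=0.95):
        """Per-pixel fusion for a beam-splitter rig: keep the sensitive
        sensor's reading where it isn't clipped, otherwise fall back to
        the less sensitive one rescaled by the sensitivity ratio.
        No alignment step is needed since both see the same beam."""
        return np.where(sensitive < sat, sensitive, insensitive * ratio)

    # hdr = fuse_split_beam(cam_a_frame, cam_b_frame)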


I think you two just proved my point



