This week, the Council held its usual meeting in a space co-located with Sensors Expo at San Jose's McEnery Convention Center. While these off-site gatherings always remind us how much we prefer the access, parking, and community spirit our usual meeting hosts provide, the co-location gave attendees the benefit of visiting the larger and very interesting Sensors Expo show floor, with all of its sensor vendors, analytics platforms, connection technologies, and solutions providers.
The bulk of the show floor demonstrated what analysts have been saying: the sensors market is thriving and is largely focused on Industrial IoT and Fourth Industrial Revolution use cases, but there are many active sub-categories such as automotive, smart city, smart home, consumer IoT, consumer electronics, and more.
Our side event narrowed the wider conference down to the topic of automotive sensors, and our format was primarily one of inviting promising auto sensor companies to pitch their solutions. Inside the automotive sector, it's clear that the "hottest" category of sensors is the one that includes LiDAR, RADAR, and any innovative way to produce a 3D point-cloud model of the world around the car, primarily for self-driving. ADAS also figures strongly, followed by other car sensor uses - which are numerous and important, but overshadowed by the self-driving craze.
Some other uses of auto sensors discussed:
- smart doors that open and close but sense obstacles
- close-range point clouds 360 degrees around the car
- driver alertness and health monitoring
- gas/chemical sensors to improve combustion and thermal efficiency
Our sponsor, Osram Opto Semiconductors, discussed the use of light emitters for sensors, as well as very interesting uses of smart lighting, in which oncoming cars or people are identified and the headlight's beam is steered to avoid illuminating their eyes. Dr. Markus Arzberger also shared interesting ideas around sensing pedestrians and then projecting information and guidance back onto the road surface.
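The glare-avoidance idea can be illustrated with a minimal sketch. This is a hypothetical matrix-headlight controller (the segment width, function name, and angle convention are all assumptions for illustration, not Osram's actual design): each LED segment covers a slice of the horizontal field of view, and any segment whose arc contains a detected oncoming vehicle or face is dimmed.

```python
def mask_beam(segments, glare_angles_deg, seg_width_deg=5.0):
    """Hypothetical matrix-headlight masking: each LED segment covers
    seg_width_deg of horizontal arc; dim any segment whose arc contains
    a detected oncoming car or person, leaving the rest lit."""
    masked = []
    for i, lit in enumerate(segments):
        lo = i * seg_width_deg          # segment's starting angle
        hi = lo + seg_width_deg         # segment's ending angle
        glare = any(lo <= a < hi for a in glare_angles_deg)
        masked.append(False if glare else lit)
    return masked

# Eight segments covering 0-40 degrees; oncoming car detected at 12 deg,
# so only the 10-15 degree segment goes dark.
print(mask_beam([True] * 8, [12.0]))
```

The design choice mirrored here is that detection (the sensor) and actuation (the light) form a tight local loop, which is part of why sensing and smart lighting pair naturally in one product line.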
Our panel discussed some big issues around sensors and the contributions they make to AI in the narrow case of vehicle autonomy. Hot topics included whether to put smarts at the edge (for example, in bumpers or suspension) or to centralize intelligence in a console- or trunk-mounted central computer. The panelists seemed to agree that it is better to pre-process collected data at the edge and send up intelligence rather than a raw data dump. This:
- puts less strain on the central computer
- reduces load and lag on the car's network
- allows sensor makers to tailor specialized software instead of relying on generalized central processing, putting value-add into their products
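The edge-versus-central trade-off above can be sketched in a few lines. This is a hypothetical example (the `Detection` record, thresholds, and sample format are illustrative assumptions, not any panelist's actual design): a smart sensor collapses a frame of raw returns into a short list of confident detections, so only intelligence, not a data dump, crosses the car's network.

```python
from dataclasses import dataclass

@dataclass
class Detection:
    """Compact, pre-processed result a smart sensor sends upstream."""
    label: str
    distance_m: float
    confidence: float

def preprocess_at_edge(raw_samples, threshold=0.5):
    """Hypothetical edge filter: keep only confident returns from a raw
    frame, so the central computer receives detections, not samples."""
    return [Detection(label, dist, conf)
            for label, dist, conf in raw_samples
            if conf >= threshold]

# A raw frame of four samples shrinks to two detections on the network.
raw = [("pedestrian", 12.0, 0.9), ("noise", 3.0, 0.1),
       ("vehicle", 30.0, 0.8), ("noise", 5.0, 0.2)]
print(preprocess_at_edge(raw))
```

Even this toy version shows where the value-add lands: the filtering logic lives in the sensor maker's firmware, not in generalized central code.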
The panel also introduced the interesting concept of "ALL of it": there IS a central computer that provides the AI or overall system management, but there are also smart sensors pre-processing data, and the results go not only to the central computer but can also be shared peer-to-peer among sensors - for example, leveraging and fusing one sensor's output to focus and improve another's. If temperature, windshield-wiper, and rain sensors detect snow, the LiDAR could use that information to adjust, and the central processor could use it to de-emphasize the LiDAR's influence on the 3D point cloud it uses to navigate.
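The snow example can be made concrete with a small sketch. This is a hypothetical fusion step (the weight values, the 0.5/1.2 scaling factors, and the function name are assumptions for illustration, not the panel's actual numbers): when ambient sensors report snow, the fusion layer down-weights LiDAR and leans more on RADAR, then renormalizes so the weights still sum to one.

```python
def fuse_weights(snow_detected, base_weights):
    """Hypothetical fusion re-weighting: de-emphasize LiDAR when
    temperature, wiper, and rain sensors jointly indicate snow."""
    weights = dict(base_weights)
    if snow_detected:
        weights["lidar"] *= 0.5   # snow scatters laser returns
        weights["radar"] *= 1.2   # radar is far less affected
    total = sum(weights.values())
    return {k: v / total for k, v in weights.items()}  # renormalize

base = {"lidar": 0.5, "radar": 0.3, "camera": 0.2}
clear_day = fuse_weights(False, base)
snowy_day = fuse_weights(True, base)
print(snowy_day)
```

The peer-to-peer point is that the snow signal could flow directly to the LiDAR (to adjust its own processing) as well as to the central computer (to adjust how much it trusts the LiDAR's point cloud).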
As usual, we wrapped up with some great startup presentations, all of which are available in our Member Library. Thanks to our presenters, Osram, Sensors Expo, and everyone who came.