- Pictured: a high-resolution Quanergy Lidar image; the middle portion shows the street, the blue areas trees.
- Continental's solid-state Lidar module (shown) and similar units are expected to be adopted on most autonomous vehicles (image: Continental).
Many autonomous-driving development plans call for deploying a handful of solid-state Lidar sensors on each vehicle, but the Lidar modules used for today’s prototype vehicles all are mechanical systems with moving parts. That’s prompted huge interest in Lidar, with OEMs and suppliers racing to invest in non-mechanical technologies.
Several small companies have developed solid-state Lidar technologies that aren't yet ready for automotive applications, and some of those firms have been gobbled up by major automotive companies over the past 18 months. Ford made a large investment in Velodyne, while ZF bought a 40% stake in Ibeo. Continental acquired Advanced Scientific Concepts. Analog Devices Inc. (ADI) acquired Vescent Photonics Inc.
The interest stems from Lidar's use of emitted laser light to measure the distance to objects, functioning much like radar. The laser lets the system provide high-resolution imagery at night and in rain or snow.
“High-resolution 'flash' Lidar is a necessary technology for autonomous driving because its capabilities are available in all lighting and weather conditions,” said Dean McConnell, Director of Customer Programs, Advanced Driver Assistance Systems, at Continental North America. “We’re capturing images at 30 Hz, constructing 3D point clusters thirty times per second.”
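To give a feel for what 30 Hz point-cluster capture implies for downstream processing, here is a minimal back-of-the-envelope sketch. The per-frame resolution and bytes-per-point figures are hypothetical assumptions for illustration, not Continental specifications:

```python
# Rough data-rate estimate for a flash Lidar capturing at 30 Hz.
# The detector resolution and point encoding below are hypothetical
# example values, not published Continental specifications.
FRAME_RATE_HZ = 30             # point clusters per second (from the article)
POINTS_PER_FRAME = 320 * 240   # assumed detector resolution
BYTES_PER_POINT = 16           # assumed: x, y, z, intensity as 4-byte floats

points_per_second = FRAME_RATE_HZ * POINTS_PER_FRAME
bytes_per_second = points_per_second * BYTES_PER_POINT

print(f"{points_per_second:,} points/s")         # 2,304,000 points/s
print(f"{bytes_per_second / 1e6:.1f} MB/s raw")  # 36.9 MB/s raw
```

Even at this modest assumed resolution, tens of megabytes per second of raw data accumulate, which is why the data-management concerns discussed later in the article matter.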
The technology also helps safety systems zero-in on objects of interest. That’s important to determine whether an object is a threat to driving.
“Lidar acts more like the human eye: it views a broad scene, doing a quick scan, then if it sees something interesting, it can focus in on that,” said Chris Jacobs, General Manager of Automotive Safety for ADI.
Lidar providers currently are racing to develop compact solid-state modules because the large mechanical pucks now used by autonomous-driving researchers are too bulky and costly to go into production vehicles. Researchers are striving to shrink sizes and come up with a good combination of distance and field of view.
“Our solid-state box measures 9 x 6 x 6 cm, about the size of two decks of cards,” said Louay Eldada, Quanergy's CEO. “Currently, it has a 120-degree field of view, so with three you have 360 degree coverage. There will always be two in the front, on the right and left sides, and one in the back middle or one on each corner.”
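The coverage arithmetic Eldada describes can be sanity-checked with a short sketch: given modules as (mounting yaw, field of view) pairs, sample the circle and confirm every direction is seen by at least one unit. The mounting angles below are illustrative, not Quanergy's actual placement:

```python
# Check whether a set of Lidar modules, each given as
# (mounting yaw in degrees, field of view in degrees), covers 360 degrees.
# Mounting angles here are illustrative assumptions.

def covers_360(modules, step=1.0):
    """Sample the circle in `step`-degree increments and verify every
    direction falls inside at least one module's field of view."""
    def seen(direction, yaw, fov):
        # Smallest angular difference between the direction and module yaw.
        diff = abs((direction - yaw + 180) % 360 - 180)
        return diff <= fov / 2

    d = 0.0
    while d < 360:
        if not any(seen(d, yaw, fov) for yaw, fov in modules):
            return False
        d += step
    return True

# Three 120-degree units: front-left, front-right, rear-center.
three = [(-60, 120), (60, 120), (180, 120)]
print(covers_360(three))                    # True

# Two 120-degree units leave gaps on the sides.
print(covers_360([(0, 120), (180, 120)]))   # False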
The vehicle's distance to objects is a key safety parameter, and detection range can be increased by narrowing the field of view. Developers are trying to achieve the same distance levels as cameras and radar, with a goal of around 200 m (656 ft). Achieving that range involves several tradeoffs. Mounting locations are key parameters that help determine field-of-view coverage; side-facing modules, for example, won't need the same range capability as forward-facing units, so their field of view can be wider.
“We’ve demonstrated 70 meters (230 ft) with a 15-degree field of view, which is clearly not sufficient,” said Aaron Jefferson, Director of Product Planning for ZF’s Active and Passive Safety Division. “It needs to go up to 50 or 60 degrees to start. When the cost gets down, it’s conceivable that they could be integrated into taillights and headlights.”
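One way to see the range/field-of-view tradeoff Jefferson alludes to: with a fixed number of measurement points spread across the horizontal FOV, widening the FOV coarsens angular resolution, so neighboring returns land farther apart at long range. A minimal sketch, where the point count is a hypothetical assumption:

```python
import math

# Illustrative range vs. field-of-view tradeoff: with a fixed number of
# horizontal measurement points, a wider FOV spreads them out, so adjacent
# returns are farther apart at range. The point count is an assumption.
POINTS_ACROSS_FOV = 600

def lateral_spacing_m(fov_deg, range_m, points=POINTS_ACROSS_FOV):
    """Approximate gap between neighboring returns at `range_m`
    (small-angle approximation)."""
    angular_step_rad = math.radians(fov_deg / points)
    return range_m * angular_step_rad

# 15-degree FOV at 70 m (the ZF figure) vs. a 60-degree FOV at the same range:
print(f"{lateral_spacing_m(15, 70) * 100:.1f} cm")  # ~3.1 cm
print(f"{lateral_spacing_m(60, 70) * 100:.1f} cm")  # ~12.2 cm
```

Quadrupling the FOV under this model quadruples the gap between returns, which is one reason wider-FOV modules are acceptable for short-range side views but harder to justify for long-range forward sensing.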
Lidar will complement cameras and radar, providing information that typically will be "fused" with that from other sensors to create a reliable image of vehicle surroundings. All these sensors generate a huge amount of data, making communications and data management an important factor in overall designs.
“3D Lidar sensing will create a significant amount of data, but similar to radar and camera, there are software techniques to help minimize the amount of data, eliminate useless or unimportant data and extract the detail from the data of concern,” Jefferson said. “Furthermore, the techniques used to filter data, group/cluster data, identify objects, etc. also determine the amount of data that needs to be processed, which is the real concern in terms of managing data volume.”
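One widely used data-reduction technique of the kind Jefferson describes is voxel-grid downsampling, which keeps a single representative point per fixed-size cube of space. A minimal pure-Python sketch (not ZF's actual pipeline):

```python
# Voxel-grid downsampling: one common way to cut Lidar point-cloud volume.
# Each point is bucketed into a cube of side `voxel_size`, and every cube
# is replaced by the centroid of the points inside it.

def voxel_downsample(points, voxel_size):
    """points: iterable of (x, y, z) tuples; returns one centroid per voxel."""
    buckets = {}
    for x, y, z in points:
        key = (int(x // voxel_size), int(y // voxel_size), int(z // voxel_size))
        buckets.setdefault(key, []).append((x, y, z))
    return [
        tuple(sum(coord) / len(pts) for coord in zip(*pts))
        for pts in buckets.values()
    ]

cloud = [(0.01, 0.02, 0.0), (0.03, 0.01, 0.0), (5.0, 5.0, 5.0)]
reduced = voxel_downsample(cloud, voxel_size=0.5)
print(len(reduced))  # 2 -- the two nearby points collapse into one
```

Production systems typically use optimized libraries for this step, but the principle is the same: discard redundant detail early so only the data of concern reaches the fusion and classification stages.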
Curiously, no real hurry
Though there’s plenty of development, the market isn’t expected to see much activity for some time. Many engineers say Lidar can develop slowly while waiting for autonomous vehicle designs to solidify. For now, system designers can create prototypes using mechanical components while they wait for next-generation modules.
“Solid-state Lidar will be in production later this year, but for pilots and software development, you don’t need solid-state,” Eldada said. “Though we plan to ship solid state products in Sept., we won’t have automotive-grade parts ready until a year later.”
The rollout of Lidar-equipped vehicles is as murky as the emergence of autonomous cars. Corporate fleet programs like Uber's current autonomous tests in Pittsburgh may expand into market opportunities before mainstream OEMs start ordering Lidar sensors.
“We’re looking at series production in the 2021 timeframe, but it may happen faster in different segments,” McConnell said. “Some fleet-service companies are aggressive about getting vehicles out with automated driving in a geomapped area.”
Once Lidar is in use, many developers don’t expect it to displace many other sensors. A range of technologies is needed to provide the capability and redundancy needed to drive autonomously in all weather conditions.
“We do not see 3D Lidar as a sensor replacement, but rather as an innovation that enables the high-resolution sensing needed to realize SAE Level 4-plus automated driving,” Jefferson said. “3D solid-state Lidar, camera, radar, ultrasonic sensing and other technologies will continue to play a role—a combination of these will be necessary to properly sense the vehicle environment in 360 degrees, in real time.”
That’s not a universal conclusion, however.
“Ultrasonics will go away,” Eldada countered. “Video is needed for color, things like seeing traffic lights. Fusing Lidar and cameras 'colorizes' our data so it’s more valuable. Radar is needed for redundancy; you need another sensor before deciding to steer or hit the brakes.”
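The Lidar/camera "colorizing" fusion Eldada mentions is commonly done by projecting each 3D point through a camera model and attaching the pixel color where it lands. A minimal sketch assuming a simple pinhole camera; the intrinsics and image are illustrative, not Quanergy's calibration:

```python
# Sketch of Lidar/camera fusion by "colorizing" points: project each 3D
# Lidar point through a pinhole camera model and, if it lands inside the
# image, attach that pixel's RGB value. Intrinsics here are illustrative.

def colorize(points, image, fx, fy, cx, cy):
    """points: (x, y, z) in camera coordinates, z forward.
    image: 2D list of (r, g, b) rows. Returns (x, y, z, r, g, b) tuples."""
    h, w = len(image), len(image[0])
    out = []
    for x, y, z in points:
        if z <= 0:
            continue  # point is behind the camera
        u = int(fx * x / z + cx)  # pinhole projection to pixel column
        v = int(fy * y / z + cy)  # pinhole projection to pixel row
        if 0 <= u < w and 0 <= v < h:
            out.append((x, y, z, *image[v][u]))
    return out

# Tiny 2x2 "image" and one point straight ahead of the camera:
img = [[(255, 0, 0), (0, 255, 0)],
       [(0, 0, 255), (255, 255, 0)]]
pts = [(0.0, 0.0, 10.0)]
print(colorize(pts, img, fx=1.0, fy=1.0, cx=1.0, cy=1.0))
# [(0.0, 0.0, 10.0, 255, 255, 0)]
```

In practice the Lidar points must first be transformed into the camera frame using a calibrated extrinsic pose, a step omitted here for brevity.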
Author: Terry Costlow
Source: SAE Automotive Engineering Magazine
- Industry: Automotive
- Topics: Components, Safety, Ergonomics/Human Factors, Electrical, Electronics & Avionics