Three months later, at the Baidu AI Developer Conference, Qi Lu took the stage again and announced the details of Apollo 1.0. The 1.0 release mainly delivered complete trace-following autonomous driving in closed venues, and the capabilities it opened up centered on Data Platform 1.0, 3D obstacle labeling data, Road Hackers data, high-precision map data and more.
Following the cadence of "weekly updates and a new version roughly every two months", Apollo 1.5 was released last September. According to Xinzhijia's report at the time, the system gained 65,000 lines of code and focused on five capabilities: obstacle perception, decision and planning, cloud simulation, high-precision map services and end-to-end deep learning, the first four of which were fully opened up. Apollo 1.5 also supports fixed-lane autonomous driving day and night, and can recognize obstacles at night as well as irregularly shaped obstacles in atypical traffic scenes.
More than three months have passed since the release of Apollo 1.5, and the Apollo program is about to get its 2.0 update. This time, Baidu scheduled the major release for CES 2018. Ahead of it, Baidu invited a number of Chinese media outlets to its new R&D center in Silicon Valley and held a large-scale self-driving test-ride preview.
The R&D center is located in Sunnyvale, in Silicon Valley, adjacent to San Francisco Bay. It was officially unveiled last October and will mainly be used for research and development in autonomous driving and internet security. Together with the R&D centers and offices Baidu previously opened in Cupertino and Seattle, it gives Baidu a three-way presence in the United States.
Baidu's U.S. research operation is understood to have more than 200 employees, with its advanced-technology research focused mainly on artificial intelligence, security and hardware, and autonomous driving.
Baidu also invited many Apollo platform partners to the test-ride event, including GPU giant NVIDIA, self-driving vehicle integrator AutonomouStuff, auto parts supplier ZF, and time-share car rental operator Panda Auto.
The intertwined cooperation among these companies makes up the "ecosystem" Baidu is hoping for.
In January last year, ZF became the first automotive supplier to announce adoption of NVIDIA's Drive PX2 computing platform, with the two companies setting out to build the ZF ProAI autonomous driving control platform;
In July last year, Baidu and NVIDIA announced they would jointly develop autonomous driving technology, forming an alliance. Baidu's current self-driving car, modified from a Lincoln MKZ, was built in cooperation with AutonomouStuff; at almost the same time, AutonomouStuff announced it would supply kits and sensors for NVIDIA's Drive PX autonomous driving platform;
In September last year, ZF and Baidu jointly announced a new strategic partnership, under which they would cooperate on autonomous driving, connected vehicles and mobility services to develop a complete autonomous driving solution for the Chinese market;
Last November, Baidu partnered with Panda Auto, integrating the autonomous driving technology of its Apollo platform into Panda's shared-operation vehicles. Panda will be the first to trial self-driving shared cars in Chongqing's Liangjiang New Area; the cars reach L3 capability and support automated parking and summoning via app. The self-driving vehicle built by Baidu and Panda uses the ZF ProAI control platform.
Of course, this is not the whole of Baidu Apollo's autonomous driving ecosystem. At this year's CES, along with the Apollo 2.0 update, more of Baidu's partners will gather at its "Baidu World" event.
Before that, though, the limelight belonged to the self-driving test rides. This time, Baidu demonstrated, and offered rides in, an L3 self-driving time-share rental model built with ZF and Panda (and Lifan), and an L4 self-driving model equipped with the latest Apollo 2.0 technology.
Apollo 2.0 + Drive PX2 = L4 autonomous driving
This is a self-driving prototype running Baidu's Apollo 2.0 system with L4-level capability.
The car was converted by Baidu and AutonomouStuff from a Lincoln MKZ. The roof carries a 64-beam lidar, two 16-beam lidars and GPS positioning sensors. To the left and right of the lidar, one forward-facing camera each is installed to visually identify targets such as traffic lights. On the front of the vehicle sits a millimeter-wave radar from Continental, whose mounting position has been precisely adjusted.
Of course, this sensor configuration is flexible, and sensor fusion technology plays a key role: as long as the corresponding algorithms are adjusted for the new sensor data, the sensor setup can be changed and customized, as the sketch below illustrates.
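To make the idea of a configurable, fused sensor suite concrete, here is a minimal Python sketch. It is purely illustrative and is not Apollo's actual configuration format or fusion code; every sensor name, field and function below is an assumption based on the suite described above. It declares the sensor layout as data and pairs lidar detections with camera labels in a naive late-fusion step.

```python
# Illustrative sketch only -- not Apollo's configuration format or fusion code.
# Sensor names and fields are assumptions based on the suite described above.

SENSOR_SUITE = {
    "lidar_top_64": {"type": "lidar", "channels": 64},
    "lidar_left_16": {"type": "lidar", "channels": 16},
    "lidar_right_16": {"type": "lidar", "channels": 16},
    "camera_front_left": {"type": "camera"},
    "camera_front_right": {"type": "camera"},
    "radar_front": {"type": "millimeter_wave_radar"},
    "gps": {"type": "gnss"},
}

def fuse_detections(lidar_objects, camera_objects, max_gap_m=1.5):
    """Naive late fusion: pair each lidar object with the nearest camera
    object (by x/y position) if it lies within max_gap_m; otherwise keep
    the lidar object with an 'unknown' label."""
    fused = []
    for lo in lidar_objects:
        best, best_d = None, max_gap_m
        for co in camera_objects:
            d = ((lo["x"] - co["x"]) ** 2 + (lo["y"] - co["y"]) ** 2) ** 0.5
            if d < best_d:
                best, best_d = co, d
        fused.append({**lo, "label": best["label"] if best else "unknown"})
    return fused

# Example: a lidar return near a camera-classified traffic light
print(fuse_detections(
    [{"x": 10.2, "y": 0.5}],
    [{"x": 10.0, "y": 0.3, "label": "traffic_light"}],
))
```

Swapping a sensor in or out would then mean editing the declared suite and the fusion step, which is the flexibility Baidu describes.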
In the trunk sits the part most critical to autonomous driving: the on-board computers that serve as the computing unit of the whole system.
Baidu's self-driving car carries two on-board computers: one is an industrial PC running Apollo 2.0 (a Neousys Nuvo-6108GC, a powerful x86 industrial computer); next to it is NVIDIA's Drive PX2 for autonomous driving. According to Baidu's technical director, although the two computers serve two different systems, Apollo will support both.
In fact, either computer on its own can drive the vehicle; Baidu set things up this way to avoid swapping machines in and out during testing. Asked about the performance difference between the two, the technical director told Xinzhijia that NVIDIA's Drive PX2 performs better at image processing because it has its own GPU and image-processing software. For the ordinary industrial PC, Baidu added an NVIDIA GPU to process image data and the point-cloud data generated by lidar scanning; the industrial PC's general computing power is stronger than that of the Drive PX2. A toy illustration of that workload split follows.
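As a purely hypothetical illustration of that division of labor (the task names and routing rule below are invented, not Apollo's actual scheduler), a trivial sketch might send perception-heavy workloads to GPU hardware and keep latency-sensitive tasks on the CPU of the industrial PC:

```python
# Hypothetical illustration of the workload split described above; Apollo's
# real scheduling is not this simple, and these task names are invented.

GPU_TASKS = {"camera_image", "lidar_point_cloud"}    # data-parallel, GPU-friendly
CPU_TASKS = {"planning", "control", "localization"}  # latency-sensitive, CPU-bound

def assign_device(task_name):
    """Route perception-style workloads to the GPU and the rest to the CPU,
    mirroring the split between the GPU-equipped box and the industrial PC."""
    if task_name in GPU_TASKS:
        return "gpu"
    if task_name in CPU_TASKS:
        return "cpu"
    raise ValueError(f"unknown task: {task_name}")

for task in ["camera_image", "lidar_point_cloud", "planning", "control"]:
    print(task, "->", assign_device(task))
```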
Baidu said that at this stage the car supports driving on simple urban roads and highways. This test ride was also the first time Baidu Apollo 2.0 had been road-tested on a California highway.
The L4 self-driving vehicle equipped with Apollo 2.0 was tested on public roads around Baidu's U.S. R&D office. Road information for the entire driving area had been collected in advance, and Baidu produced a dedicated high-precision map of this section to assist the self-driving vehicles, as the sketch below suggests.
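To give a sense of what such a pre-built map contributes, here is a toy stand-in for an HD-map lookup. It is not Baidu's HD map format or API; the segment IDs, fields and values are invented for illustration.

```python
# Toy stand-in for an HD-map lookup -- not Baidu's HD map format or API.
# Segment IDs, fields and values below are invented for illustration.

HD_MAP = {
    "seg_001": {"speed_limit_kph": 40, "lane_count": 2, "has_traffic_light": True},
    "seg_002": {"speed_limit_kph": 40, "lane_count": 3, "has_traffic_light": False},
}

def lookup_segment(segment_id):
    """Return the surveyed prior knowledge for a road segment, so the planner
    does not have to infer lane count or signal locations from sensors alone."""
    return HD_MAP.get(segment_id)

print(lookup_segment("seg_001"))
# {'speed_limit_kph': 40, 'lane_count': 2, 'has_traffic_light': True}
```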
Throughout the test ride, the route included traffic lights, intersections, other motor vehicles and bicycles, so the vehicle had to brake, start, change lanes and accelerate on its own. It was also raining that day, which added some difficulty for the self-driving car.
Overall, because these were simple urban roads, the ride was relatively smooth, with a top speed of 56 km/h, and the smoothness of the braking and lane changes was quite remarkable. When accelerating, though, the car still felt a little unsteady.
The head of the relevant Baidu technology told Xinzhijia that the updated Apollo 2.0 code has already been uploaded to GitHub, along with the related software and hardware guides. The formal launch of Apollo 2.0 will take place at CES 2018, just a few days away.
Apollo 2.0 + ProAI + Panda = L3 autonomous driving, deployed
In addition to the L4 self-driving car, Panda Auto, a Chongqing-headquartered new-energy-vehicle time-share rental company and Baidu Apollo platform partner, air-freighted the fruit of their cooperation to Baidu's U.S. R&D center in Silicon Valley.
In fact, calling it a two-party effort is not accurate: the car is the result of cooperation among Baidu, ZF, Bosch, Lifan and Panda. ZF's contribution is particularly notable: the ZF ProAI, an autonomous driving control platform co-developed by ZF and NVIDIA, has just gone into production and been adopted by Panda. The whole process, from the partners announcing their cooperation to self-driving cars on the ground, is said to have taken little more than a month.
In terms of sensors, this car is very simple: just two cameras, front and rear, plus 12 ultrasonic radars. Even this configuration gives it L3-level automated driving capability, along with automated parking and autonomous parking-spot finding.
The low speed, simple sensor configuration and the ability to find a parking space and park on its own make this car well suited, to a degree, to time-share rental and other shared services; a toy sketch of such spot-finding logic follows.
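To illustrate how ultrasonic sensors alone can support spot-finding, here is a toy Python sketch. It is not the Baidu/ZF/Panda implementation; the thresholds, step size and readings are all invented for illustration.

```python
# Toy sketch of ultrasonic parking-spot search -- not the actual
# Baidu/ZF/Panda implementation; all numbers below are invented.

SPOT_LENGTH_M = 5.5  # assume roughly this much kerb space is needed

def find_spot(side_distances, step_m=0.5, min_depth_m=1.8):
    """Scan side-facing ultrasonic readings taken every step_m metres of
    travel and return the start index of the first gap that is deep enough
    and long enough to park in, or None if no such gap exists."""
    needed = int(SPOT_LENGTH_M / step_m)
    run_start, run_len = None, 0
    for i, d in enumerate(side_distances):
        if d >= min_depth_m:          # nothing close by: part of a gap
            if run_start is None:
                run_start = i
            run_len += 1
            if run_len >= needed:
                return run_start
        else:                         # obstacle detected: reset the gap
            run_start, run_len = None, 0
    return None

readings = [0.6, 0.7, 2.5, 2.6, 2.4, 2.7, 2.5, 2.6, 2.8, 2.5, 2.6, 2.7, 2.5, 0.5]
print(find_spot(readings))  # -> 2 (a sufficiently long gap starts at the third reading)
```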
Gao Yu, CEO of Panda Auto, told Xinzhijia that Panda has already begun pilot operation of self-driving time-share vehicles in Chongqing's Liangjiang New Area, with dozens of vehicles to go into trial operation first. Whether this will prove a good business, no one yet knows.
In the actual ride, the car moved very slowly, since it cannot yet drive on public roads; its braking and steering were also not as smooth as the L4 self-driving car described above.
Whether L3 or L4, after watching the full videos you should have a sense of Baidu's autonomous driving capabilities. See you at CES 2018 for Apollo 2.0.