Apple, long mocked for running short on innovation, has finally piqued people's curiosity again.
On March 18, an official announcement from Apple stirred up the autonomous driving industry. The news was not that Apple had announced a breakthrough in its autonomous driving project, but that its brand-new iPad Pro is equipped with a lidar.
In recent years, lidar has become almost synonymous with autonomous driving. As a key sensor, it gives driverless cars the ability to perceive their surroundings, but making it miniaturized, high-performance, and mass-producible has long been a hard problem for the industry.
How can a space-hungry consumer electronics product like the iPad Pro accommodate a bulky lidar? Can Apple's breakthrough in this field inspire, or even propel, vehicle-mounted lidar? And does it amount to a slap in Musk's face over his claim that lidar is useless?
Lidar: From Space to Mobile Phone
Lidar, laser plus radar, sounds like a lofty piece of equipment, but its basic working principle is familiar to everyone: distance = speed × time. Simplified to the extreme, a lidar is a ruler that measures with light: the light is emitted by a laser source, reflected after "hitting" an object, received by a sensor, and then processed by a digital circuit to obtain the round-trip time. The speed of light × time ÷ 2 (halved because the light travels out and back) is the length it measures.
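As a minimal sketch of the ranging arithmetic described above (the function name and the 10-nanosecond figure are illustrative, not from the article):

```python
C = 299_792_458.0  # speed of light in a vacuum, m/s

def distance_from_round_trip(t_seconds: float) -> float:
    """Basic laser ranging: the light travels out and back,
    so the one-way distance is (speed x time) / 2."""
    return C * t_seconds / 2

# A 10-nanosecond round trip corresponds to roughly 1.5 m.
print(distance_from_round_trip(10e-9))
```

Everything a lidar does downstream, from single-point rangefinders to 64-line scanners, is built on top of this one formula.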
This ruler was first used in aerospace. When the space shuttle docks with the space station, for example, NASA uses it to measure alignment errors, and it is also used to monitor the atmosphere.
It is also a common sight in surveying and mapping. The laser rangefinder on a construction site is, in a sense, a basic form of lidar. Accurate ranging effectively prevents construction errors and is an indispensable part of engineering work.
But now the most famous field of lidar is driverless cars.
How did lidar come down from the sky to the ground, and how did it move from the construction site into the car? The story goes back a little while.
In 2007, DARPA (the Defense Advanced Research Projects Agency) held an unmanned vehicle challenge: the participating teams had to modify a vehicle themselves to drive dozens of kilometers through a harsh desert environment to reach the finish line. A key technique tested in this competition was obstacle avoidance.
Clever engineers soon saw the connection between lasers and the event: since a laser rangefinder can mark a point at a precise distance, then if more rangefinders are mounted on one device, by the principle that "three points define a plane," could it not sketch the rough outline of an object and thereby avoid obstacles? Stanford University's autonomous driving team thought so. Their vehicle in the DARPA challenge carried a "rangefinder" on its roof.
Unfortunately, this ingenious idea did not carry their vehicle through the whole course; the performance of a single-line rangefinder was still not good enough.
Velodyne, an American audio company with a keen interest in the competition, absorbed these lessons, stepped outside its own field, and developed a 64-line lidar shaped like a big flowerpot. It was equivalent to mounting 64 synchronously rotating surveying instruments on the car; after processing, its output signal could restore the three-dimensional features of objects to a considerable extent.
Velodyne's device is a lidar in the full sense: through the basic principle of laser ranging, it gathers enough depth information to give autonomous driving 3D vision.
It proved effective. In the subsequent unmanned challenge, Stanford's driverless car, fitted with Velodyne's 64-line lidar, finished the race. Unfortunately for Stanford, it came second to Carnegie Mellon University, whose car was equipped with the very same lidar.
Later, thanks to its more accurate environmental perception, lidar gradually became standard equipment for driverless cars. There is just one exception: Tesla CEO Musk thinks using lidar is stupid.
Musk's contrarianism, however, did not stop lidar's progress. Vehicle-mounted lidar developed toward higher precision, smaller size, higher reliability, and lower cost. These traits can be summed up in two words, miniaturization and integration, which are the keywords of this article and also represent the direction lidar is heading.
Years ago, the DARPA contestants probably never imagined that technology would evolve so interestingly that one day lidar would end up in tablets and even phones.
The marketing copy for Apple's new iPad Pro clearly states the function the lidar is meant to support: AR (augmented reality).
Before this, AR mostly relied on the phone's processor running AI algorithms to "cut out" objects from the camera image and overlay content in the right places (like the animated effects common in short videos). This approach is cheap and easy to implement, but the camera sensor itself outputs no "distance" information, so depth is usually an inaccurate estimate. For AR applications that need high-precision depth information and faithful 3D modeling, such as AR fitting rooms, AR home decoration, and richer AR games, the camera's inherent hardware deficiency became a stumbling block.
Apple is undoubtedly a pioneer in pushing AR as a next-generation core technology, and to clear the obstacles to AR's development, it introduced lidar. As for how the once-huge lidar fits into the slim iPad Pro: based on the published data and analyses by people in the lidar industry, Apple likely uses a VCSEL (vertical-cavity surface-emitting laser) as the light source on the transmitting side and a SPAD (single-photon avalanche diode) as the receiver. What the two have in common is that they realize some functions of the big "flowerpot" lidar on tiny pieces of semiconductor.
With lidar now in the iPad, we can expect the trend of miniaturization and integration to bring it to future iPhones and even the legendary Apple AR glasses.
Interestingly, Apple's official description of the iPad Pro's "lidar scanner" has also sparked discussion in the lidar industry. Usually, a necessary condition for being called a lidar is that the laser source can rotate or the beam can change its emission angle, that is, it scans.
Given the iPad Pro's form and function, there is no room for moving parts, and no need to emit laser light outside the camera's field of view. Industry insiders therefore speculate that the "lidar" on the iPad Pro does not actually scan; Apple simply borrowed the more familiar concept of a "scanner" to make it easy to explain.
The History Behind Apple's 3D Reconstruction
Apple's use of lidar on the iPad Pro, though somewhat unexpected, is not without foundation. Given lidar's remarkable role in three-dimensional reconstruction, Apple must have encountered it long ago, even before starting its autonomous driving program. As early as a decade ago, when Apple Maps began building 3D maps, it used lidar to reconstruct urban landscapes.
Looking back at consumer electronics: as 3D vision again became a hot technology thanks to the boom in face recognition, AR, and VR, a technical evolution route leading to lidar may have been paved in advance.
In 2017, when Apple introduced its famous Face ID, the feature was backed by a 3D-vision technology: structured light. At the time, structured light did not just support Face ID; it also gave rise to Animoji, letting Apple users create personalized expressions from selfies. Although Animoji was lukewarm, Apple's intent to lay the groundwork for AR in advance was already showing.
In 2018, Huawei, Samsung, LG, Vivo, and other phone makers also eyed the AR opportunity and began applying ToF (time of flight) technology. Its principle is basically the same as laser ranging, using infrared laser as the probe. Compared with a camera-only phone, a ToF-equipped phone has stronger three-dimensional vision and can better support AR applications.
However, the ToF deployed at this stage was iToF (indirect time of flight), which does not use the flight time of light directly to compute distance; instead, it computes distance indirectly from the phase difference of a continuous light wave modulated at a known frequency. One advantage of iToF is that it meshes well with traditional CMOS/CCD image-sensor technology, which lowers the cost of the solution and makes the module easy to miniaturize and squeeze into consumer electronics.
The iToF scheme has its shortcomings, however. Its indirect phase-difference method essentially compensates for the image sensor's insufficient response time (on the order of nanoseconds; 1 ns is one billionth of a second, and light travels about 0.3 m in 1 ns). Constrained this way, iToF's detection accuracy is not only limited (centimeter level) but also degrades as distance grows.
What Apple uses on the iPad Pro is dToF (direct time of flight), clearly an upgrade over iToF. As the name implies, dToF measures the flight time of light directly to compute distance. Its sensor response time can reach the picosecond level (1 ps is one trillionth of a second), and its detection accuracy can reach millimeter or even sub-millimeter level. The key to this difference lies in dToF's use of a different sensor: the SPAD.
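The nanosecond-versus-picosecond gap maps directly onto range resolution through the same ranging formula; a minimal sketch (the timer values below are illustrative):

```python
C = 299_792_458.0  # speed of light, m/s

def range_resolution(timer_resolution_s: float) -> float:
    """Best-case depth resolution for a timer of a given resolution,
    halved because the measured time covers the round trip."""
    return C * timer_resolution_s / 2

# Nanosecond-class timing (CMOS/CCD territory): steps of roughly 15 cm.
print(range_resolution(1e-9))
# Tens-of-picoseconds timing (SPAD territory): millimeter-level steps.
print(range_resolution(10e-12))
```

This is why iToF falls back on phase measurement while dToF, with picosecond-class SPAD timing, can time the pulse directly and still resolve millimeters.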
The SPAD (single-photon avalanche diode) is a specially designed photodiode. Compared with traditional photoelectric sensors such as CMOS and CCD, it has millions of times the gain, extremely high sensitivity, and the ability to detect a single photon, which in theory raises the upper limit of a lidar's detection range and accuracy. But the SPAD long had one trait that could not be ignored: it was hard to miniaturize.
We know that the key to a typical electronic imaging system, such as a camera, lies in its built-in image sensor, the CMOS/CCD: a large number of imaging units integrated on a small chip and arranged in a dense sensor array, which is what gives cameras tens of millions of pixels of resolution.
Similarly, 3D vision needs high enough resolution so that the reconstructed world is as faithful as possible. So for a SPAD to image, it too must form an array; an array of multiple SPADs is known in the industry as a SiPM (silicon photomultiplier).
(Figure: schematic diagram of a CMOS array)
Although the SPAD/SiPM was born in the 1990s, for reasons of industrial maturity and technology, the size of the SPAD did not come down, so a SiPM could not pack enough SPADs into a small volume, and resolution naturally could not go up.
In the past two years, with growing industry demand and technological progress, SPAD integration has improved greatly, and production has been brought into the fold of traditional semiconductor manufacturing. Panasonic, for example, exhibited a large 1200×900 (about 1.08-million-pixel) SPAD array this year, approaching the resolution of more mature iToF sensors.
Besides Panasonic, Japan's Sony, Europe's STMicroelectronics, and America's ON Semiconductor are all developing and promoting SPAD sensor arrays for consumer electronics or automobiles. Foreign media report that the SPAD array in Apple's iPad Pro may have a resolution of 320×240 or 640×480, that is, at most around 300,000 pixels.
Back in the automotive field we care about, using SPAD arrays in lidar is nothing new. In 2018, American lidar startup Ouster publicly stated that its new lidar used a VCSEL light source and a SPAD sensor array, thereby achieving low cost and high detection accuracy.
So the lidar on Apple's iPad Pro, which has made such a splash, actually rests on the progress of the whole industry, and may even reflect different industries borrowing methods and theory from one another across application scenarios.
Still, people in the lidar industry also note that although the miniaturization and integration of lidar components have made remarkable progress, stuffing the emitting light source, sensor array, and digital processing circuitry into a consumer product is no easy task. Apple would have to customize deeply with its suppliers, possibly using 3D IC technology to stack the lidar's units and save space.
Will Musk Be Slapped in the Face?
For lidar practitioners, Apple's use of the technology on the iPad is undoubtedly exciting: that Apple, with its formidable engineering innovation, chose lidar is an endorsement of the technology's detection ability.
In 2018, Tesla CEO Musk, likewise known for strong engineering innovation, publicly disparaged lidar, calling it a "crutch" for autonomous driving and saying the industry's use of it reflected a technical shortfall. Drawing an analogy with how humans drive, he argued that the purely visual combination of AI plus cameras is enough for driverless tasks.
Apple's own autonomous driving project, however, is firmly in the lidar camp. Now that lidar has landed on the iPad Pro, can its success there be extrapolated to autonomous driving to prove the superiority of this technical route? Can the mass production of the iPad Pro further mature the supply chain and give lidar more chances in vehicles?
Although this is an excellent chance for lidar to "promote" itself, industry insiders remain restrained when analyzing the route dispute. Different trades are worlds apart: lidar's large-scale use in a consumer product like the iPad Pro will not directly benefit automotive lidar for now, nor does it prove that Tesla's pure-vision route is a dead end. The reason is that consumer electronics and automotive autonomous driving differ so much in scenario and requirements that a simple analogy is hard to draw.
Take the iPad Pro as an example: it adopts lidar not only for its measurement accuracy, but also as a "reach on tiptoe" enabled by a maturing supply chain. The chosen scheme balances performance, size, cost, manufacturing difficulty, and reliability, arriving at a "local optimum" for accurate detection within about 5 meters.
Vehicle-mounted lidar faces far stricter requirements on detection range, real-time performance (dynamic detection capability), resolution, anti-interference, and reliability (automotive grade). At least so far, the lidar industry has not found a low-cost, reliably performing solution that the downstream autonomous driving companies can accept.
The lidar industry is, however, trying to "integrate." Arqie, co-founder of North Mitsuko Hoshi (a lidar maker; the name is rendered as in the source), said that since 2019, lidar practitioners are no longer confined to a single technical route. They step outside the divides of mechanical versus solid-state, and flash surface light versus OPA coherent light versus MEMS micro-mirror, mixing underlying technologies that once belonged to separate camps to solve the specific problems of lidar mass production.
One reason Musk's route of achieving autonomous driving through pure vision was not favored by the industry early on was that it consumed too much computing power and the AI algorithms were immature. In recent years, as on-board chips' compute and algorithms have improved, the route is drawing more and more companies' attention.
So by comparison, lidar and pure-vision AI each have their own growth curves. Apple's use of lidar on the iPad Pro neither gives automotive lidar anything to cheer about for now, nor amounts to a slap in Musk's face.
Still, given the engineering design trends Apple has led for years with its "Zen" and minimalist design philosophy, automotive lidar engineers who tear down an iPad Pro may well find ways to make lidar more elegant.
This article comes from a contributor on Autohome and does not represent Autohome's position.