The depth-sensing capabilities of the Apple Car may not be limited to just LiDAR, with Apple suggesting a version using an infrared camera and light pulses could also be used to detect road obstacles.
Of the many challenges of creating a self-driving vehicle, arguably the biggest is getting the car's systems to acquire accurate data about the road itself. A myriad of sensors and systems are available to use, though with considerable variation in accuracy, cost, and physical requirements.
For example, LiDAR is a technology that is useful for depth sensing, but it is still a relatively expensive one to employ due to the parts involved. By taking advantage of cheaper components in a different system, Apple could feasibly get close to the accuracy of LiDAR without as much of the expense.
In a patent granted to Apple by the US Patent and Trademark Office on Tuesday titled “Remote sensing for detection and ranging of objects,” Apple suggests such a system, effectively consisting of a controllable light source, timing circuitry, and a circuit or processor for determining range.
For Apple’s proposal, the light source could be a vertical-cavity surface-emitting laser (VCSEL), but the more interesting version uses a near-infrared light source. A camera is also employed to create a video feed of a monitored area; in the near-infrared case, this would be an infrared camera.
Both versions would depend on timing circuitry to emit pulses of light out into the environment, which are then reflected off nearby objects and returned to the camera. In short, by measuring the time it takes for the pulses of light to get back to the camera, the range determination circuit could work out how far away a detected object is, with the results able to be provided to other computing systems used for autonomous driving.
The range determination would be calculated using a window timing circuit, which generates multiple windows of exposure corresponding to the light pulses. The presence of a light pulse within a given window tells the range determination circuitry how long the pulse took to travel the full distance to the object and back.
For example, a pulse detected in the first window of time would suggest a short distance, a pulse in the following window a larger one, and so on, with each subsequent window corresponding to a longer distance.
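The arithmetic behind this windowed time-of-flight scheme is straightforward: a return detected in window k arrived between k·T and (k+1)·T after emission, and the light covered the distance twice. The following is a minimal sketch of that calculation; the function name and parameters are illustrative assumptions, not anything described in the patent itself.

```python
# Hypothetical sketch of the window-based ranging the patent describes.
# All names and parameter choices here are assumptions for illustration.

C = 299_792_458.0  # speed of light in a vacuum, m/s


def distance_bounds_for_window(window_index: int, window_length_s: float) -> tuple[float, float]:
    """Return the (min, max) object distance implied by a pulse return
    detected in a given exposure window.

    A pulse seen in window k arrived between k*T and (k+1)*T after emission;
    the light travelled to the object and back, so distance = c * t / 2.
    """
    t_min = window_index * window_length_s
    t_max = (window_index + 1) * window_length_s
    return (C * t_min / 2, C * t_max / 2)


# With a 100 ns exposure window, each window quantizes distance into
# bins roughly 15 m deep; a return in window 2 implies an object
# somewhere between about 30 m and 45 m away.
lo, hi = distance_bounds_for_window(2, 100e-9)
```

This also makes clear why the window length directly sets the depth resolution: halving the window halves the size of each distance bin.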
As a camera is being used, the system can cover a wide field of view, allowing a large area to be monitored rather than the individual points LiDAR typically covers. It also improves on the use of standard video cameras, which rely on context, contrast, and color in image processing to determine object placement; the proposed system would only need the detected presence of infrared light pulses to determine distance.
It is proposed that the system could be used alongside an existing LiDAR setup, to create a hybrid system that detects objects at different levels of accuracy. The lower-accuracy light pulse camera setup detailed in the patent could be used to generate areas of interest for the vehicle’s autonomous system, which can then inform of where the LiDAR element should focus its scanning efforts.
In effect, this allows for full scene coverage, but with higher object-detection and range-finding accuracy in select areas where required.
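The hybrid arrangement could be thought of as a simple filtering step: coarse detections from the pulse-camera system nominate pixel regions for the LiDAR to scan in detail. The sketch below is a loose interpretation under assumed names and data structures, not Apple's implementation.

```python
# Hypothetical illustration of the hybrid approach: coarse detections from
# the infrared pulse camera nominate regions of interest for the
# higher-accuracy LiDAR to focus its scanning on. The types and the
# filtering criterion are assumptions made for this example.
from dataclasses import dataclass


@dataclass
class CoarseDetection:
    x: int                # pixel column in the camera frame
    y: int                # pixel row in the camera frame
    window_index: int     # exposure window the pulse return landed in


def regions_of_interest(detections: list[CoarseDetection], max_window: int) -> list[tuple[int, int]]:
    """Keep only detections near enough to matter (early exposure windows)
    and return their pixel locations as candidate LiDAR scan targets."""
    return [(d.x, d.y) for d in detections if d.window_index <= max_window]
```

The design point is that the cheap wide-field sensor prunes the scene, so the expensive narrow-beam sensor spends its time only where something was actually detected.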
The windows of detection can also be used to enhance the accuracy of the proposed system. After a first wave of pulses and exposure windows, the system's processor could decide that a second wave requires finer distance resolution. To accomplish this, the windows of time can be shortened, increasing the number of distinct ranges the distance could fall within.
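This coarse-to-fine refinement can be sketched as a single step: once a return is localized to one coarse window, the second wave subdivides just that interval into shorter windows. The function below is an assumed formulation of that idea for illustration only.

```python
# Hypothetical coarse-to-fine refinement step. A return localized to coarse
# window k (length T) is re-measured on the next wave with shorter windows
# covering only the interval [k*T, (k+1)*T). Names are assumptions.


def refine_window(window_index: int, window_length_s: float, subdivisions: int) -> tuple[float, float]:
    """Return (start_offset_s, new_window_length_s) for the second wave.

    The system would delay its first exposure window by the offset and use
    the shorter window length, so the same number of windows now resolves
    a much narrower band of distances.
    """
    start_offset_s = window_index * window_length_s
    new_window_length_s = window_length_s / subdivisions
    return (start_offset_s, new_window_length_s)


# Refining coarse window 2 (100 ns windows) into four sub-windows yields
# 25 ns windows starting 200 ns after pulse emission.
offset, fine_len = refine_window(2, 100e-9, 4)
```

Each subdivision multiplies the depth resolution within the selected band, at the cost of an extra wave of pulses.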
The patent lists its inventors as Richard E. Bills, Micah P. Kalscheur, Evan Cull, and Ryan A. Gibbs.
Apple files numerous patent applications on a weekly basis, but while the presence of a patent filing indicates areas of interest for the company’s research and development efforts, there is no guarantee that the concepts described will appear in a future product or service.
The “Apple Car” is widely believed to have some form of self-driving technology. The company has publicly been testing its driving systems on special test beds in California for a number of years, with the vehicles largely relying on LiDAR units attached to the car’s exterior.
In 2017, Apple published a research paper explaining how its LiDAR-based 3D object recognition worked for autonomous vehicles.
As with its extensive research into the subject, Apple has also secured many patents and filed more applications over time surrounding the technology it has come up with.
In 2016, it came up with a method for “Collision Avoidance of Arbitrary Polygonal Obstacles,” usable by a self-driving system, as well as a new form of LiDAR mapping the same year. A “Confidence” system has also been proposed to allocate resources to specific data sources, rather than wasting processing on the mountain of data generated by the onboard sensors.
The idea of using the latest proposed system with other sensors has also been raised, with the “Shared sensor data across sensor processing pipelines” patent from June 2020 suggesting how processing systems for different data sources could be combined into a single pipeline.
As for how this plethora of sensors could appear on the final Apple Car, Apple may be able to hide it all from view. Another patent granted in October 2019 offered the idea of putting sensors within the car bodywork, such as its research into making compact and cheap radar systems.