iPhone X's TrueDepth Module Dissected

PARIS — Although experts in the imaging industry are aware of a complex “TrueDepth” module that Apple has devised for its iPhone X, most other details inside the device’s 3D system — chips, components, and all the way down to substrates — remain a deep, dark secret.

EE Times talked to Yole Développement, which this week completed a teardown of Apple’s iPhone X TrueDepth module in collaboration with its partner, System Plus Consulting. They deduced that silicon-on-insulator (SOI) wafers are being used in the module’s near-infrared (NIR) imaging sensors. They noted that SOI has played a key role in improving the sensitivity of the NIR sensors — developed by STMicroelectronics — to meet Apple’s stringent demands.

Pierre Cambou, activity leader for imaging and sensors at Yole Développement, called the SOI-based NIR image sensors “a very interesting milestone for SOI.”

Many companies located in France’s so-called Imaging Valley, near Grenoble, have used SOI wafers, developed by Soitec — initially for backside illumination (BSI) sensors. Meanwhile, research on SOI for NIR sensors dates back to 2005, according to Cambou.

But Apple’s adoption of ST’s NIR sensors marks the debut of SOI in mass production for image sensors, noted Cambou. “Image sensors are characterized by large surface due to the physical size of light. Therefore, this is a great market to be in for a substrate supplier” like Soitec, he added.

Meanwhile, Jean-Christophe Eloy, Yole's president and CEO, told EE Times that, in designing its TrueDepth module, “Apple took the best of both worlds — STMicroelectronics and Ams.” Apple adopted leading-edge NIR imagers from STMicroelectronics, while it deployed dot illuminators from Ams (Premstaetten, Austria). Eloy noted that Ams is “extremely good at its complex optical module.” Earlier this year, Ams acquired Heptagon, known for its Time-of-Flight (ToF) technology stack.

Apple iPhone X — Optical Hub Costing (Source: Yole Développement, System Plus Consulting)

Recap on how it works
Apple put a 3D camera on the front of the iPhone X to identify its owner’s face and unlock the phone.

As Yole previously explained, to make this possible, Apple combined a ToF proximity detector with an infrared “structured light” camera that can use either uniform “flood” or “dot-pattern” illumination.

The way that the 3D system works is very different from a regular CMOS imager taking a photo. First, the iPhone X combines an infrared camera with a flood illuminator that projects uniform infrared light in front of the phone. It then takes images, which, in turn, trigger a face-detection algorithm.

This face-recognition function, however, isn’t meant to run all the time. The ToF proximity sensor, linked to the infrared camera, signals it to take a picture when it detects a face coming into range. The iPhone X then activates its dot-pattern projector and captures a second image.
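As a rough illustration of that capture sequence, the control flow can be sketched in Python as below. Every name here (proximity, flood, dot_projector, ir_camera, detect_face) is a hypothetical stand-in for the hardware and firmware interfaces, not Apple's actual API.

    # Minimal sketch of the TrueDepth capture sequence described above.
    # All interfaces are hypothetical stand-ins, not Apple APIs.
    def capture_for_face_id(proximity, flood, dot_projector, ir_camera, detect_face):
        # The ToF proximity sensor gates the sequence so the face-recognition
        # path does not run continuously.
        if not proximity.subject_in_range():
            return None

        # Uniform flood illumination, then a plain NIR frame that feeds
        # the face-detection algorithm.
        flood.pulse()
        flood_frame = ir_camera.read()
        if not detect_face(flood_frame):
            return None

        # Only once a face is detected is the dot-pattern projector fired
        # and a structured-light frame captured.
        dot_projector.pulse()
        dot_frame = ir_camera.read()

        # Both frames are handed off to the application processor.
        return flood_frame, dot_frame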

Both the regular and dot-pattern images are then sent to the application processing unit (APU), which puts them through a neural network trained to recognize the owner and unlock the phone.
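A rough sketch of that matching step might look like the following: a trained network maps the captured frames to an embedding vector, which is compared against the owner's enrolled template. The embed() hook, the cosine-similarity rule, and the threshold are illustrative assumptions; Apple's actual network and decision logic are not public.

    import numpy as np

    # embed() stands in for the trained neural network running on the APU;
    # it consumes both the flood and dot-pattern frames and returns a vector.
    def unlock_decision(flood_frame, dot_frame, enrolled, embed, threshold=0.6):
        candidate = embed(flood_frame, dot_frame)
        # Cosine similarity between the candidate and the enrolled template.
        similarity = np.dot(candidate, enrolled) / (
            np.linalg.norm(candidate) * np.linalg.norm(enrolled))
        return similarity >= threshold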

Yole’s Cambou noted that no 3D image is computed at this point. The 3D information is contained in the dot-pattern image. “To run 3D applications, the same APU can use another algorithm [that] computes the depth map of the image.” He added, “The iPhone X takes advantage of the massive processing power available in the A11 chip, as structured light approaches are known to be computationally intensive. The use of a neural network is the key technology that made it possible.”
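To show what that separate depth-map algorithm has to do, here is an illustrative structured-light calculation: each projected dot is matched to its position in a calibrated reference pattern, and the resulting pixel disparity is converted to depth by triangulation (z = f * b / d). The focal length, baseline, and example disparities are assumptions for the sketch, not measured iPhone X values.

    import numpy as np

    def depth_from_dots(dot_disparities_px, focal_length_px=1445.0, baseline_m=0.02):
        # dot_disparities_px: per-dot disparities (pixels) between the observed
        # dot-pattern image and the calibrated reference pattern.
        d = np.asarray(dot_disparities_px, dtype=float)
        depth_m = np.full_like(d, np.inf)
        valid = d > 0
        # Triangulation: a larger disparity means the surface is closer.
        depth_m[valid] = focal_length_px * baseline_m / d[valid]
        return depth_m

    # With these assumed parameters, disparities of 10, 20, and 40 pixels
    # map to roughly 2.9 m, 1.4 m, and 0.7 m.
    print(depth_from_dots([10.0, 20.0, 40.0]))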
