X-factors Hobble iPhone X

TAIPEI — The word on the street here is that Apple is having yield problems with its iPhone X. Foxconn, which does all the manufacturing of Apple's highly anticipated forthcoming model, appears unable to make enough of them.

As several media reports indicated, Apple has been forced into a 50-percent cut in shipments of its flagship iPhone X this holiday season, slashing volume from 40 million to 20 million units.

What caused this massive drop in production?

Apple has ended up behind the eight ball for a lot of reasons that range from component shortages to technical complexity. The industry consensus is that, if anything, it was Apple’s own engineering ambition that triggered its mass production issues.

Pierre Cambou, activity leader for imaging and sensors at Yole Développement, told EE Times, "This is a risk Apple took," knowing full well the complexity of the 3D system it was putting together. Cambou cautioned against dramatizing the problem; none of these are issues engineers can't solve, he said.

However, ever since Apple first unveiled the iPhone X, Cambou has been predicting complications that could arise from its 3D sensor system. It consists of several modules, each containing components sourced from multiple suppliers. The toughest challenge, Cambou said, comes in assembly, when all the modules "need to be actively aligned." Both sides, emitters and receivers included, must be activated together to ensure that the whole system functions correctly.

In its recent report, Yole Développement broke down the technical elements of Apple’s 3D sensor system and explained how the system works.
First, several steps are involved for the iPhone to identify faces. The iPhone X combines an infrared camera with a flood illuminator that projects uniform infrared light in front of the phone. It can then take images, and a face-detection algorithm searches for, well, faces, Cambou explained.

(Source: Yole Développement)

To make sure the 3D system isn’t working all the time, the infrared camera is “most probably linked to the infrared ToF proximity sensor, which has a different wavelength bouncing off anyone in front of the phone,” Cambou theorized. “This may be the signal for the camera to take a picture and if it detects a face then the iPhone X activates its dot pattern projector to take an image of the face with this dot pattern on it.” 

It then sends both regular and dot-pattern images to the application processing unit (APU), which puts them through a neural network trained to recognize the owner and therefore unlock the phone, according to the Yole analyst.

“What is special about this approach is that no 3D image is ever computed at this point: the 3D information is contained in the dot pattern image,” Cambou said. In running 3D applications, the same APU can use another algorithm to compute the depth map of the image. The iPhone X takes advantage of the massive processing power available in the A11 chip, “as structured light approaches are known to be computationally intensive,” Cambou explained. “The use of a neural network is the key technology that made it possible,” he said. 
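The unlock flow Cambou describes can be condensed into a minimal Python sketch. Every function name and data structure below is a hypothetical stand-in chosen for illustration; this is not Apple's actual software, only the sequence of steps as the Yole analyst laid them out:

```python
# Hypothetical sketch of the iPhone X unlock sequence as described by Yole.
# All names and thresholds are illustrative stand-ins, not Apple's APIs.

def proximity_triggered(distance_cm, threshold_cm=40):
    """The IR ToF proximity sensor wakes the 3D system only when
    someone is close enough, so it isn't running all the time."""
    return distance_cm < threshold_cm

def detect_face(flood_lit_image):
    """Stand-in face detector run on the flood-illuminated IR image."""
    return flood_lit_image.get("contains_face", False)

def match_owner(ir_image, dot_pattern_image, owner_template):
    """Stand-in for the neural network on the APU that compares both the
    regular IR image and the dot-pattern image against the enrolled owner.
    Note that no 3D depth map is computed at this stage; the depth cue
    lives in the dot-pattern image itself."""
    return (ir_image["face_id"] == owner_template and
            dot_pattern_image["face_id"] == owner_template)

def try_unlock(distance_cm, ir_image, dot_pattern_image,
               owner_template="owner"):
    if not proximity_triggered(distance_cm):
        return "asleep"      # 3D system never activates
    if not detect_face(ir_image):
        return "locked"      # no face in the flood-lit frame
    # A face was found: the dot pattern projector fires and both
    # images go to the neural network for recognition.
    if match_owner(ir_image, dot_pattern_image, owner_template):
        return "unlocked"
    return "locked"
```

A call such as `try_unlock(25, {"contains_face": True, "face_id": "owner"}, {"face_id": "owner"})` walks the full chain and unlocks, while a far-away reading leaves the system asleep, mirroring the power-saving trigger Cambou theorized.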

Given the many parts involved in this process, the question is: which components, supplied by whom, are involved?

Yole listed the infrared camera, proximity ToF detector and flood illuminator. Analysts said that these elements “seem to be treated as a single block unit,” which is supplied by STMicroelectronics, along with Himax for the illuminator subsystem, and Philips Photonics and Finisar for the infrared-light vertical-cavity surface-emitting laser (VCSEL). 

To the right of the speaker, the regular front-facing camera is probably supplied by Cowell, and the sensor chip by Sony, the Yole report said.

On the far right, the "dot pattern projector" is from Heptagon, a subsidiary of Ams. Yole said, "This last component is probably the most exotic one," since the iPhone X is the first smartphone to use it. It combines a VCSEL, probably from Lumentum or Princeton Optronics, a wafer-level lens and a diffractive optical element (DOE) able to project 30,000 dots of infrared light.

Since other companies building time-of-flight cameras are also demanding dot pattern projectors capable of higher density, between "30K and 40K dots," the supply of Heptagon's device is tight, Cambou observed.

Apple, in a sense, is facing a double whammy of component shortage and design complexity. "Those two are never a good combination," said Cambou. He also suspects a potential heat problem with a plastic lens in the module.

But again, Cambou reiterated that Apple knew from the start that it would face potential supply and yield issues.

Although Apple might be getting flak for an iPhone X shortage, “Apple has put its customers ahead of its shareholders,” Cambou said. “Apple wanted the best for consumers and risked the tight supply.”

If Apple sells 20 to 25 million units of iPhone X this year, “I think they’ll do just fine,” he said. Over time, he suspects that Taiwan’s Himax Technologies might get back in play to complement Heptagon’s offerings. 

— Junko Yoshida, Chief International Correspondent, EE Times, with additional reporting by Alan Patterson, based in Taiwan, who covers the semiconductor industry for EE Times.
