We all know the drill. For the last decade, smartphones have gotten thinner and faster and thinner and faster and, well, you get the picture.
But it’s too soon to write off our smartphones as boring. The gadgets are still evolving with new technologies. And for a clue as to what the smartphone of the future might look like, turn your attention to the device’s cameras and the software and sensors that make them tick.
Apple announced on Thursday that it will hold a special event on Sept. 12 at its new Cupertino, Calif., headquarters. The company is expected to introduce a set of new iPhones that includes, among other upgrades such as 4K high-definition video, a premium model that can scan 3-D objects. Even your face.
Samsung, the No. 1 phone maker, also recently introduced the Galaxy Note 8, highlighting its fast dual-lens camera as the signature feature. And rivals will soon work to catch up with Samsung and Apple.
“2018 will be the year where the smartphone camera takes a quantum leap in technology,” said Philip-James Jacobowitz, a product manager for Qualcomm, a chipmaker that provides components to smartphone makers.
You can expect that as soon as you pick up your gadget, it will see you, recognize you as the owner and unlock the screen. Overseas, you will be able to point the camera at a restaurant menu to translate items into your native language. When shopping for furniture, you could point your phone camera at your living room floor, place a virtual rendering of a coffee table there to see how it looks, then walk around it and even peek underneath.
Some of this futurism is already starting to happen.
Mr. Jacobowitz also stressed that emerging camera technologies would be the key to stronger security features and to applications for so-called augmented reality, which overlays digital imagery on the physical world when people look through a smartphone camera.
Here’s a rundown on what this all means for how your next smartphone will work.
For the last few years, we have become accustomed to unlocking our smartphones by scanning our fingerprints or entering a pass code. But when Apple shows its new iPhones, expect infrared facial recognition as a new method for unlocking the device.
How would the new iPhone do that? Apple declined to comment. But Qualcomm’s Spectra, a so-called depth-sensing camera system, is one example of how face scanning works.
The Spectra system includes a module that sprays an object with infrared dots and gathers depth information from the size and distortion of those dots: if the dots are smaller, the object is farther away; if they are bigger, the object is closer. The imaging system can then stitch the patterns into a detailed 3-D image of your face to determine whether you are indeed the owner of your smartphone before unlocking it.
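The inverse relationship between dot size and distance can be sketched in a few lines. This is a simplified illustration of the idea described above, not Qualcomm's actual method; the reference values are invented for the example, and a real sensor would rely on calibrated optics.

```python
# Hypothetical illustration: smaller projected dots mean a farther surface,
# bigger dots mean a closer one. Constants below are invented for the sketch.

REFERENCE_DOT_SIZE_PX = 10.0   # apparent dot size at the calibration distance
REFERENCE_DISTANCE_CM = 30.0   # assumed calibration distance

def estimate_depth_cm(observed_dot_size_px: float) -> float:
    """Depth scales inversely with the dot's apparent size on the sensor."""
    return REFERENCE_DISTANCE_CM * (REFERENCE_DOT_SIZE_PX / observed_dot_size_px)

# A dot half the reference size reads as twice the reference distance:
print(estimate_depth_cm(5.0))   # 60.0 cm (farther)
print(estimate_depth_cm(20.0))  # 15.0 cm (closer)
```

Run over thousands of dots at once, per-dot depths like these are what get stitched into the 3-D face map.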
“You’re seeing the contours of the head — it’s not just the front of the face as you’re typically thinking about,” said Sy Choudhury, a senior director of product security for Qualcomm.
Although there will be limitations, the uniqueness of a person's head shape means the likelihood of the wrong face unlocking the device is about one in a million, he added. That compares with a false-acceptance rate of roughly one in 100 for earlier facial recognition systems, which offered far weaker security.
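The gap between those two error rates compounds quickly over repeated attempts. A back-of-the-envelope calculation, using only the two figures quoted above and assuming independent tries, makes the difference concrete:

```python
# Probability that at least one wrong face gets accepted in `attempts`
# independent tries, given a per-try false-acceptance rate.
def p_false_accept(rate: float, attempts: int) -> float:
    return 1 - (1 - rate) ** attempts

old = p_false_accept(1 / 100, 10)        # earlier systems: 1 in 100 per try
new = p_false_accept(1 / 1_000_000, 10)  # depth-sensing: 1 in a million per try

print(round(old, 4))  # about 0.096 -- nearly a 10% chance after 10 tries
print(new)            # about 0.00001 -- still roughly 1 in 100,000
```

Ten tries against the older systems yield almost a one-in-ten chance of a break-in; against the depth-sensing figure, the risk stays around one in 100,000.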
It remains to be seen how exactly face scanning will work in the next iPhone. But Apple is well acquainted with depth-sensing camera technologies. In 2013, the iPhone maker acquired PrimeSense, a company that developed sensors for Microsoft’s Kinect, a depth-sensing camera system that let Xbox players control games using body movements. Analysts expect some rendition of PrimeSense’s technology to appear in future iPhones.
Depth-sensing cameras may be critical to enhancing augmented reality, a jargony industry term that probably makes your eyes glaze over. But bear with me for one moment: Augmented reality will have major implications for future mobile apps.
It’s no secret that Apple is bullish about augmented reality. In a recent financial earnings call, Timothy D. Cook, Apple’s chief executive, called augmented reality “big and profound,” with major implications for gaming, entertainment and business products. This fall, Apple will release iOS 11, its next mobile operating system, which includes support for applications made with ARKit, a tool kit that lets app developers easily create augmented-reality applications.
ARKit uses a combination of the iPhone’s camera and motion sensors, including the accelerometer and gyroscope, to let people lay digital objects on top of the real world and interact with them with precise movements.
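The core step in laying a digital object over the real world is projecting its 3-D position into the 2-D camera image as the phone moves. Real ARKit code is written in Swift against Apple's APIs; the sketch below is a language-neutral stand-in using a standard pinhole-camera model, with an invented focal length and screen size.

```python
# Minimal sketch of AR placement: map a virtual point in camera coordinates
# (meters, z pointing forward) to pixel coordinates on the screen.
# focal_px, cx and cy are invented example values, not ARKit constants.

def project_point(x: float, y: float, z: float,
                  focal_px: float = 1000.0,
                  cx: float = 640.0, cy: float = 360.0):
    if z <= 0:
        return None  # behind the camera: nothing to draw
    u = cx + focal_px * (x / z)
    v = cy + focal_px * (y / z)
    return (u, v)

# A virtual coffee table 2 m ahead of the camera and 0.5 m to the right:
print(project_point(0.5, 0.0, 2.0))  # (890.0, 360.0)
```

As the accelerometer and gyroscope report the phone's motion, the system updates the camera pose and re-projects every virtual point each frame, which is what makes the coffee table appear pinned to the floor.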
The furniture maker Ikea is using ARKit in its coming app, Ikea Place.
“This is like a real application that real people can use to make real-life decisions,” said Michael Valdsgaard, the head of digital transformation at Ikea. You can get a sense of how, for example, a bed fits in with current furniture in the room.
Vertical planes — the walls, in Ikea’s example — are trickier for ARKit to detect, so more work remains on that front.
All the tech giants are betting big on augmented reality. For years, Microsoft has been developing HoloLens, an augmented-reality headset. In April, Facebook announced Camera Effects Platform, an environment for software developers to build augmented-reality apps for Facebook. This week, Google unveiled ARCore, an augmented-reality tool kit for Android devices, in response to Apple’s ARKit.
Blair MacIntyre, a research scientist who is working on augmented reality for Mozilla, the organization that makes the Firefox web browser, said augmented reality will have huge potential. He envisioned people being able to take a tour of a natural-history museum, pointing their smartphone cameras at a fossil exhibit to bring a dinosaur back to life.
But he said that augmented reality on smartphones was a stopgap to the inevitable: wearing data in front of your face at all times through some kind of headset.
“If you look at science fiction, a lot of it has this characteristic of being always on and serendipitous,” he said. “You get a lot closer to that when you get a head-mounted display.”
Until that happens, smartphones are about to become much smarter.