Apple is exploring methods to improve the usability of iPhones in high-moisture scenarios, such as underwater use or heavy rain. The main goal is to improve the touch sensitivity of the screen in such conditions – or at least to create a system sensitive enough to distinguish a valid finger gesture from accidental ghost touches caused by liquid on the display.
According to an Apple patent application, one of the implementations could include a Force Touch–like system – a feature Apple should bring back – which measures the amount of force applied to the screen. The patent raises the idea of using a force-input detection sensor or load detector to identify the touch point and determine whether it comes from a finger gesture or just a splash of liquid.
The proposed system could come to life through several technical avenues. For example, the ambient light sensor could set things in motion when it detects that ambient light exposure has dropped because moisture is covering the sensor or the device is underwater. In this case, according to the patent, the ambient light sensor serves as the environmental sensor.
However, a pressure sensor that can detect when the device is immersed in water can also serve as an environmental sensor, as can an electromagnetic sensor that emits radiation such as infrared waves to check whether the surrounding space is flooded or covered with water.
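The multi-sensor approach described above can be sketched as a simple voting scheme: each environmental sensor independently votes on whether the device appears to be submerged. This is an illustrative sketch only – the function name, thresholds, and two-vote rule are assumptions for demonstration, not details from the patent.

```python
# Hypothetical sketch of the patent's environmental-sensing idea: several
# sensors each vote on whether the device appears to be submerged.
# All thresholds below are illustrative assumptions, not Apple's values.

def is_submerged(ambient_light_lux, pressure_kpa, ir_reflectance):
    votes = 0
    # Ambient light drops sharply when water covers the sensor.
    if ambient_light_lux < 5.0:
        votes += 1
    # Immersion raises pressure above typical atmospheric (~101.3 kPa).
    if pressure_kpa > 103.0:
        votes += 1
    # Water strongly absorbs infrared, so reflected IR falls off.
    if ir_reflectance < 0.2:
        votes += 1
    # Require agreement from at least two sensors to limit false positives.
    return votes >= 2
```

Requiring two of three sensors to agree is one plausible way to keep a single noisy reading (say, a shadow over the light sensor) from switching the device into underwater mode.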
The camera array can also be used to perform depth analyses of the environment by studying optical features – including the refractive index of the surrounding medium and the light-absorption pattern – to determine whether the device is submerged in liquid. The patent application also raises the idea of using a system of capacitors to distinguish between a “false touch” and a “true touch”.
If an environmental sensor detects moisture above the screen, the capacitance detectors help the processor calculate whether a touch input is coming from the user or from the liquid covering the screen. A threshold for the change in capacitance across the screen would be set, and inputs on either side of that threshold would be classified as true or false touches accordingly.
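The thresholding logic could look something like the following minimal sketch. The threshold value and the assumption that a fingertip produces a larger capacitance change than a water film are illustrative; the patent does not publish concrete numbers.

```python
# Illustrative sketch of the capacitance-threshold idea: once moisture is
# detected, touch events whose capacitance change exceeds a threshold are
# treated as deliberate finger touches; smaller changes are treated as
# liquid. The threshold value is a made-up assumption.

FINGER_THRESHOLD = 0.75  # normalized capacitance change; hypothetical

def classify_touch(capacitance_delta, moisture_detected):
    if not moisture_detected:
        # Normal operation: accept the input as a real touch.
        return "true_touch"
    # A fingertip typically produces a large, localized capacitance change,
    # while a film of water causes a smaller, spread-out shift.
    if capacitance_delta >= FINGER_THRESHOLD:
        return "true_touch"
    return "false_touch"
```

In practice a real controller would likely also look at the spatial shape of the capacitance change (a finger is a compact blob, water a diffuse smear), but the one-dimensional threshold captures the idea the patent describes.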
Aside from improving touch sensitivity, the patent also discusses tweaking certain UI elements to make a device easier to use underwater. For example, the icons of the most-used apps can be enlarged for quicker access, and the interface of apps like the camera can be simplified.
Another proposal involves assigning shortcuts to a few customizable keys that appear on the screen once the device is submerged or when significant moisture (such as rainfall) is detected. The idea is that when regular touch input is prone to fail, these on-screen shortcuts let users quickly perform tasks such as taking a photo, recording a video, or making a call.
As good as the ideas described above sound, keep in mind that this is just a patent application – an exploration of tech ideas that may or may not show up on an iPhone. But after reading a handful of Apple patents exploring wild concepts like a curved-glass MacBook with an attached touch-sensitive keyboard or an iPhone with the Mac Pro's cheese-grater design, this one sounds feasible enough to actually get off the ground.