The Google Pixel’s Squeeze for Assistant was a buttonless button

The Pixel 2 is a nearly five-year-old phone, but it introduced a feature that I’m missing more and more with each passing year. It was called Active Edge and it allowed you to bring up Google Assistant just by squeezing your phone. In a way, it’s an unusual idea. But it effectively gave you something that modern phones are sorely lacking: a way to physically interact with the phone to simply get something done.

If you look at the sides of the Pixel 2 and 2 XL, you won’t see anything that suggests you’re holding something special. Sure, there’s a power button and volume rocker, but otherwise the sides are sparse. Give the phone’s bare edges a firm squeeze, however, and you’ll feel a subtle vibration and see an animation as Google Assistant appears at the bottom of the screen, ready to listen. No need to wake the phone, long-press physical or virtual buttons, or tap the screen. You just squeeze and start talking.

Looking at the Pixel 2’s sides, you’d never guess it’s actually a button.
Photo by Amelia Holowaty Krales / The Verge

We’ll talk about how useful this is in a moment, but I don’t want to undersell how cool it feels. Phones are rigid objects made of metal and plastic, and yet the Pixel recognizes when I apply more pressure than when I’m just holding it. According to an old iFixit teardown, this is made possible by a pair of strain gauges attached to the inside of the phone that can detect the tiny flex in the casing when you squeeze it. For the record, that flex is imperceptible to me; I can’t feel the phone bend at all.
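The teardown doesn’t document the detection logic, but conceptually it’s a thresholding problem: ignore the small strain fluctuations of ordinary handling and fire only on a deliberate, sustained squeeze. Here’s a minimal illustrative sketch of that idea; the function name, threshold, and sample counts are all assumptions for demonstration, not Google’s actual implementation:

```python
# Hypothetical squeeze detector over a stream of strain-gauge readings.
# Threshold and sample-count values are illustrative assumptions only.

def detect_squeeze(samples, threshold=50, min_samples=3):
    """Return True if the strain signal stays at or above `threshold`
    for at least `min_samples` consecutive readings, which filters out
    brief spikes from simply picking up or gripping the phone."""
    run = 0
    for reading in samples:
        if reading >= threshold:
            run += 1
            if run >= min_samples:
                return True  # sustained pressure: treat as a squeeze
        else:
            run = 0  # pressure dropped: reset the streak
    return False

# A sustained press triggers; a jittery grip does not.
print(detect_squeeze([10, 60, 65, 70, 20]))  # True
print(detect_squeeze([10, 60, 20, 60, 20]))  # False
```

Requiring consecutive readings above the threshold is a crude debounce; a real implementation would presumably also calibrate per device and adjust for case pressure, which is likely part of why Active Edge still worked through an OtterBox.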

Whether you found Active Edge useful probably depends on whether you enjoy using Google Assistant, as this Reddit thread demonstrates. Personally, the only time I used a voice assistant on a daily basis was when I had the Pixel 2, because it was literally right at hand. The thing that made it so practical was that squeezing basically always worked. Even if you were in an app that hid the navigation buttons, or your phone’s screen was completely off, Active Edge still did its job.

While that made it extremely useful for looking up fun facts or doing quick calculations and conversions, I’d argue Active Edge would have been far more useful if you’d been able to remap it. I enjoyed having the Assistant on call, but if I could have toggled my flashlight with a squeeze, I would have had instant access to one of my phone’s most important functions.

A version of the feature with that flexibility actually existed. HTC’s U11, which came out a few months before the Pixel 2, had a similar but more customizable feature called Edge Sense. The two companies collaborated on the Pixel and Pixel 2, which explains how the idea ended up on Google’s devices. That same year, Google bought a large chunk of HTC’s mobile division.

Active Edge wasn’t Google’s first attempt to provide an alternative to the touchscreen or physical buttons, either. A few years before the Pixel 2, Motorola let you open the camera by twisting your phone and turn on the flashlight with a karate-chop motion, a bit like shaking a 2008 iPod Nano to shuffle music. The camera gesture arrived during the relatively short period when Google owned Motorola.

Over time, however, phone makers have moved further and further away from letting you reach essential functions with a physical action. Take my daily driver, an iPhone 12 Mini. To launch Siri, I have to press and hold the power button, which has been saddled with extra responsibilities since Apple got rid of the home button. To turn on the flashlight, which I do several times a day, I have to wake the screen and tap and hold the button in the lower-left corner. The camera is a bit handier, accessible with a left swipe on the lock screen, but the screen still has to be on for that to work. And if I’m actually using the phone, the easiest way to reach the flashlight or camera is Control Center, where I swipe down from the top-right corner and try to pick a specific icon out of a grid.

In other words, if I look up from my phone and notice my cat doing something cute, she may very well have stopped by the time I actually get the camera open. It’s not that launching the camera or turning on the flashlight is difficult; it would just be that much more convenient if there were a dedicated button or squeeze gesture. Apple even briefly acknowledged as much when it made an iPhone battery case with a button to launch the camera. Shaving a few seconds here and there adds up over the life of a phone.

Just to prove the point, here’s how quickly the camera launches on my iPhone compared to the Samsung Galaxy S22, where you can double-press the power button to launch the camera:

Gif showing an iPhone’s camera launched with the Control Center shortcut and a Samsung S22’s camera launched with one button press. The S22 launches its camera a second or two faster than the iPhone.

There’s less thinking when you can just press a button to launch the camera.

Neither phone handles being screen-recorded very well, but the S22 opens its camera app before I’ve even tapped the camera icon on the iPhone.

Unfortunately, Google’s phones aren’t immune to the disappearance of physical shortcuts either. Active Edge stopped appearing on Pixels with the 4A and 5 in 2020. Samsung, too, got rid of a button that once existed solely to summon a virtual assistant (which, tragically, was Bixby).

There have been attempts to add virtual buttons you activate by physically interacting with the device. Apple, for example, has an accessibility feature that lets you tap the back of your phone to trigger actions, or even your own applets via Shortcuts, and Google has added a similar feature to Pixels. But to be perfectly honest, I just haven’t found them reliable enough. A virtual button that only works some of the time isn’t a great button. Active Edge worked for me almost every time, even with a hefty OtterBox on my phone.

It’s not like physical controls on phones have completely disappeared. As I’ve alluded to, Apple lets you launch things like Apple Pay and Siri with a series of taps or presses of the power button, and there’s no shortage of Android phones that let you double-press the power button to launch the camera or another app.

However, I’d argue that a shortcut or two assigned to a single button can’t give us easy access to everything we should have easy access to. To be clear, I’m not asking for my phone to be completely covered in buttons, but I do think major manufacturers should take a cue from phones of the past (and yes, smaller phone makers, I see you Sony fans) and bring back at least one or two physical shortcuts. As Google showed, that doesn’t necessarily require adding another physical key that has to be waterproofed. Something as simple as a squeeze can be a button, giving users quick access to the features they, or in the Pixel’s case, Google, deem essential.
