Smartphones aren't just smart; they're also phones. Here at PCMag, we try not to forget both halves of that equation when we're testing voice phones, smartphones, and mobile carriers—which we do to the tune of around 100 handsets each year.
We submit all the phones we rate to a string of tests in our lab. For carriers, we rely on annual drive tests across 30 cities in our Fastest Mobile Networks project, as well as a reader survey that produces the PCMag Readers' Choice awards.
To prevent reviews from running overly long, we don’t always include all of our test results in every review. But rest assured, every phone we review has been thoroughly tested in the following categories:
Reception
Even on the same carrier, reception can differ from handset to handset. We have identified locations throughout our testing lab where each of the major wireless carriers has a very weak signal. In those locations, we attempt to connect and listen to three one-minute calls.
For a limited number of flagship handsets, we compare their performance at multiple locations around New York City on different US networks with Ookla Speedtest software. (Note: Ookla is owned by Ziff Davis, PCMag.com’s parent company.)
Call Quality
Because psychoacoustics play such a large role in call quality, a trained ear is the best guide—and our reviewers have listened to hundreds of cell phones. We make calls to automated voice-recognition systems and landline answering machines from a room where simulated traffic noise is playing, then listen to the resulting messages to gauge sound quality. We pay special attention to the quality of background noise cancellation, both incoming and outgoing.
We also measure maximum speaker volume at six inches with a decibel meter, using a test call to a recording of a person reading a book out loud.
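For context on those speaker readings, a decibel meter reports sound pressure level on a logarithmic scale relative to the 20 µPa threshold of human hearing. A quick illustration of the underlying math (the sample pressures are hypothetical, not actual test results):

```python
import math

def spl_db(pressure_pa, reference_pa=20e-6):
    """Sound pressure level in dB relative to the 20 uPa hearing threshold."""
    return 20 * math.log10(pressure_pa / reference_pa)

# Doubling the sound pressure adds about 6 dB:
quiet = spl_db(0.02)  # 60 dB, roughly conversational speech
loud = spl_db(0.04)   # about 66 dB
```

The logarithmic scale is why a phone speaker that reads a few dB louder than another sounds dramatically louder in practice.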
Screen Quality
We test phone screens with a Klein K-80 colorimeter, a device that precisely measures light and color. The K-80 connects to a Razer Blade Pro laptop running SpectraCal's CalMAN 5 software, which analyzes the colorimeter's measurements and turns them into usable numbers and charts: the luminance data becomes the phone's peak brightness in cd/m² (nits), and the chromaticity data becomes x and y coordinates for positioning on a color chart.
For more, see How We Test Phone Screens.
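As background on those chromaticity coordinates, the x and y values that place a measured color on the chart come from the standard CIE conversion of tristimulus values. A minimal sketch of that conversion (not SpectraCal's actual code):

```python
def xyz_to_xy(X, Y, Z):
    """Convert CIE XYZ tristimulus values to xy chromaticity coordinates.

    Y doubles as luminance, reported in cd/m2 (nits); x and y place
    the measured color on the CIE 1931 chromaticity chart.
    """
    total = X + Y + Z
    if total == 0:
        raise ValueError("tristimulus values sum to zero")
    return X / total, Y / total

# A D65 white point (X=95.047, Y=100.0, Z=108.883) lands near
# x = 0.3127, y = 0.3290 on the chart.
x, y = xyz_to_xy(95.047, 100.0, 108.883)
```

How far a display's measured white point drifts from that D65 target is one of the things this kind of analysis reveals.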
Battery Life
We measure battery life by streaming a widescreen 1080p video that we created from YouTube over a 5GHz Wi-Fi network, with the screen brightness turned all the way up. This simulates a stressful but easily repeatable use case.
Third-Party Application Benchmarking
We run Browsermark, Geekbench, GFXBench, and PCMark (the Work 2.0 and Storage workloads) to test the performance of smartphone hardware. We also launch and play high-end games (currently, Asphalt 9) to check frame rate, control fluidity, and jitter.
Music and Video Playback
We play music and video through Google Play Music and YouTube, using the phone's built-in speaker and, where possible, both wired and Bluetooth stereo headphones. For phones that advertise audio quality, we use high-quality headphones such as the Meze 99 Classic or Bowers & Wilkins P7.
Wi-Fi Performance
Using the Ookla Speedtest.net app, we check the received speed of a 5GHz Wi-Fi network at 25-foot intervals from a Netgear 802.11ax router with a 900Mbps symmetrical connection.
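One simple way to summarize a run of such readings is to take the median at each distance and express it against the near-router baseline. A hypothetical sketch (the sample figures are invented for illustration, not actual test results):

```python
from statistics import median

# Hypothetical Speedtest download readings (Mbps) at each distance (feet)
readings = {
    0: [880, 875, 890],
    25: [610, 640, 625],
    50: [340, 355, 330],
}

baseline = median(readings[0])
summary = {
    dist: (median(vals), 100 * median(vals) / baseline)
    for dist, vals in readings.items()
}
# summary maps distance -> (median Mbps, percent of near-router speed)
```

Using the median rather than the mean keeps a single stalled test run from skewing the result at any one distance.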
Still and Video Cameras
To test camera capabilities, we use an abbreviated set of our tests for digital cameras.
Other Phone Features
We analyze controls, ports, and storage, along with voice commands, ringtone volume, and the strength of the vibrating alert. We test microSD card slots using a 256GB card.
Once all of these tests have been completed, we combine the data with experience from our actual everyday use, compare the results with other similarly priced phones, and assign a rating.