When Should You Use Real-Device Testing, Exactly?


In our present age of virtualized, software-defined everything, it can sometimes feel archaic to do anything directly on real, bare-metal hardware, whether hosting a server or testing software. Virtual environments tend to be more flexible, nimble and scalable than those that run directly on real hardware.

You may think, then, that there is little reason ever to test software on real devices, as opposed to using simulators or emulators. But you’d be wrong. Real-device testing remains critical for certain use cases.

Let’s look at those use cases by examining when real-device testing is more beneficial than testing on simulators or emulators.


Before delving into a discussion of the merits of real-device testing, let me make clear that simulators and emulators both have important roles to play in software testing, too.

An emulator functions as a virtual device that is designed to be a precise digital replica of the actual device in question. Theoretically, emulators make it possible to emulate not just processors and persistent storage, but also I/O events and other low-level features. Simulators are different in that they are designed to simulate the outward behavior of a device, but not virtualize its actual hardware. They are useful for testing how software will behave at a higher level in a given environment—not how it will interact with low-level hardware.
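To make that distinction concrete, here is a toy Python sketch (the class names, register, and conversion factor are invented for illustration): the simulator reproduces only the device's outward behavior, while the emulator models the low-level hardware state that the real driver would read.

```python
class SimulatedThermometer:
    """Simulator: reproduces the device's outward API with canned behavior."""

    def read_celsius(self):
        # Returns a plausible high-level value; no hardware is modeled.
        return 21.5


class EmulatedThermometer:
    """Emulator: models the low-level register the real firmware would touch."""

    def __init__(self):
        # Raw ADC register value, as the real sensor chip might expose it.
        self.adc_register = 0x01AF

    def read_celsius(self):
        # Convert the raw register the way the real driver would
        # (0.05 is a hypothetical scale factor, not a real datasheet value).
        return self.adc_register * 0.05


sim = SimulatedThermometer()
emu = EmulatedThermometer()
print(sim.read_celsius())  # high-level behavior only
print(emu.read_celsius())  # derived from modeled hardware state
```

The simulator is enough to test code that merely consumes temperature readings; only the emulator lets you test the conversion logic itself, and only a real device tells you whether the actual chip behaves as the datasheet promises.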

Emulators and simulators are both helpful for situations where you want to change environmental factors quickly in order to see how software responds to a different setting. They let you change the environment, even something as fundamental as the processor type or the amount of memory, in seconds, without having to modify the physical hardware on which the tests are running.

But emulators and simulators have their drawbacks. As anyone who has ever used a virtual machine knows, the theoretical behavior of a virtualized environment rarely provides a perfect mirror of the actual hardware or software that it is supposed to be virtualizing. There are almost always quirks.


That’s why the ability to run tests on real devices is so important. While real-device testing means losing some of the flexibility of testing with emulators and simulators, the tradeoff is test results that more accurately reflect how an application will behave on a given type of device in the real world.

Plus, real-device testing allows you to test certain types of features that are difficult or impossible to test in an emulated or simulated environment, such as:

  • The temperature of a device while it is running your app.
  • How wireless connectivity impacts your app. (There are ways to emulate wireless network connections, but nothing compares to testing on a real-world connection.)
  • How your app interacts with audio and video hardware on the device. (Again, you can try to emulate sound cards and GPUs, but because the hardware is so specialized, it’s difficult to get reliable testing results when using emulated audio and video hardware.)
  • How well biometric hardware works with your app (such as fingerprint readers).
  • How users interact with your app on a screen they actually hold in their hands, rather than in an emulator or simulator window on a desktop.
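A practical consequence of the list above is that it helps to mark which tests in your suite require real hardware, so the rest can still run in virtual environments. A minimal sketch in Python (the decorator, registry, and test names are invented for illustration; pytest users could achieve the same thing with a custom marker and `-m` selection):

```python
# Registry of tests that must run on physical hardware.
REAL_DEVICE_TESTS = []


def real_device(test_func):
    """Decorator marking a test as requiring a real device."""
    REAL_DEVICE_TESTS.append(test_func.__name__)
    return test_func


@real_device
def test_fingerprint_unlock():
    ...  # would exercise the biometric sensor on a physical phone


def test_login_form_validation():
    ...  # pure UI logic; fine on an emulator or simulator


def select_tests(all_tests, on_real_device):
    """Skip hardware-bound tests unless running on a real device."""
    if on_real_device:
        return all_tests
    return [t for t in all_tests if t not in REAL_DEVICE_TESTS]


suite = ["test_fingerprint_unlock", "test_login_form_validation"]
print(select_tests(suite, on_real_device=False))  # emulator run skips biometrics
```

The point of the sketch is the split itself: hardware-bound tests are explicitly labeled, so the virtual-environment runs stay fast while the real-device runs stay complete.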


Given that real-device testing can be somewhat slower than testing in virtual environments, how can you strike the right balance between test speed and test accuracy for your real-device tests?

The easiest answer is to use real-device testing at the end of the delivery cycle and emulators and simulators during development and system testing. That way, you get the accuracy of real-device test results just before you release into production, but you don’t have to worry about real-device testing slowing down your delivery pipeline in earlier stages. And in most cases, if your app has passed all other tests, any bugs revealed by real-device testing are likely to be minor and quick to address.
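That staging policy can be written down explicitly. A hedged sketch (the stage and suite names are invented; the mapping translates directly into any CI system's configuration):

```python
# Map each delivery stage to the test targets it should run.
# Real devices only enter the picture just before release.
PIPELINE = {
    "commit":      ["unit-emulator"],
    "integration": ["unit-emulator", "system-simulator"],
    "pre-release": ["unit-emulator", "system-simulator", "full-real-device"],
}


def targets_for(stage):
    """Return the test targets configured for a pipeline stage."""
    return PIPELINE[stage]


print(targets_for("commit"))       # fast virtual tests only
print(targets_for("pre-release"))  # includes the real-device pass
```

Early stages stay fast because they run only virtualized suites; the expensive real-device pass gates the release rather than every commit.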

That said, there are situations where you may want to perform real-device testing earlier. If your app is centrally dependent on a device’s hardware, it is likely worth the extra time and effort to perform real-device testing as soon as you can. It’s better to identify show-stopping hardware-related issues early on so that you won’t need to scrap a bunch of code in order to fix them.

So, to put it simply, if your app is centrally dependent on hardware features within a device, perform real-device testing early and often (though there is no reason why you can’t also use simulators and emulators alongside real devices). But if it isn’t, you can probably get away with doing real-device testing at the end.

Cordny Nederkoorn is a software test engineer with over 10 years’ experience in finance, e-commerce and web development. He is also the founder of TestingSaaS, a social network about researching cloud applications with a focus on forensics, software testing and security. Cordny is a regular contributor at Fixate IO.

