In 2013, at a Google-hosted event in New York called the Google Test Automation Conference (GTAC for short), a team of Google engineers tried to sell attendees on the idea that a device lab of real-world devices is not as maintainable or optimal a test bed as a vast server farm running emulators.
That's right.
Google said that in order to avoid the tyranny of dealing with the real world everyone should have a Google-scale server farm for testing their apps (video, slides).
And then 2 months later they bought Appurify.
Those are pretty epic mixed signals.
Example of less paradoxical signaling.
Now don't get me wrong, I would LOVE to have Google's server resources for my automated testing. But like many companies that do not make mobile apps, or even software in general, their primary business, my current employer seems to prefer to invest more, shall we say, "modestly" in their app test automation environment. But don't miss the point: Google, for all their nearly infinite server resources, still finds live-device test automation so important that they bought a company clearly doing it better than they were.
Maybe it is because I like working with my own two hands. Maybe it is because I find a zen-like relaxation when up to my earlobes in cable-wrangling. Or maybe I just like to see all the different screens going blinky blinky at the same time. Beyond their practical value as fully realistic representations of the actual users' experience, I just prefer running automation on live devices. So the cautions in that GTAC 2013 presentation do not fall on deaf ears with me. I am battle-tested. I know their pain all too well. It seems like every week there's some new oddball device issue keeping your tests from running completely smoothly.
The irksome thing bothering me these days is that my tests sometimes begin on devices with a locked screen. Which is to say, not really running at all. Sure, ADB will dutifully install the app and start the uiautomator test suite, but that damned locked screen gets in the way of, well, everything.
My test runs be all like...
Nothing is worse than spending all that time and effort building your device lab, meticulously enabling developer mode's "stay awake while charging" on each device, and getting a scalable CI service up and running with good end-to-end scenarios, only to see a series of test runs fail at 100% and throw off all your metrics. Invariably this happens when neither you nor anyone else is available to simply walk over and unlock the screen. Because life is pain.
And because we noble automators like to approach our task by selecting a variety of devices, the methods for undoing those tricksy locks can vary. But before you interrupt me, I know what you're thinking. You're thinking, "Wait, they're YOUR devices, why not just disable the lock screen and start each test with a little UiDevice.getInstance().wakeUp()?" Well, before you get to feeling too clever, I may have mentioned before that my present employer doesn't exactly invest in mobile device testing labs. The "lab" in question may or may not be on or around my desk, so it would be unsafe to skip basic security like PIN codes and screen timeouts. On top of that, I firmly support enabling all reasonable security on easily removable hardware that is running unreleased code. Seriously, we've been over this before.
In the time since I first wrote suggesting the use of UiAutomator to deal with the locked screen issue, I've learned a lot about how to write tests with UiAutomator, how to write automation in general, and, as I wrote recently, how to handle the variances in device system UIs using the Factory Design Pattern. My first shot at this involved a dummy test case that ran the UiWatcher and unlocked the device. That's cool and all, but with UiAutomator you can't necessarily dictate the order in which tests run. Furthermore, each individual test should be written without any dependency on any other test. This makes debugging tests for logic flaws easier, makes re-running failed tests more useful, and generally keeps the code much cleaner and more maintainable. Seriously, do it. And as I'll briefly describe, it really isn't that hard. It's not, like, rocket surgery or something.
Selfies are like, totes rocket surgery, right?
Before we begin, please review the previous posts on using UiWatchers to unlock your devices and the Factory Design Pattern to simplify your system UI interactions. Go ahead, we'll wait...
...Now that it is fresh in your mind, here's the secret to my lock screen Jeet Kune Do:
- Create a separate WatchersHelper class and define a lock screen watcher there, just as you would if you were including a more specific watcher in your test class.
- For every UI interaction inside that watcher, use the factory design pattern approach: call the create method on your factory to instantiate an interface object, and reference all UI interactions through that interface's methods.
- Remember to use the system fingerprint in your factory to select the appropriate device-specific lock screen helper class that implements the required UI interaction methods.
- Then, in your test classes, just add the lock screen watcher: register it during setUp() and remove it during tearDown(). (A rough sketch of all of this follows the list.)
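To make that concrete, here's a minimal sketch of how the pieces can hang together using the legacy com.android.uiautomator APIs. Names like WatchersHelper, LockScreenHelper, LockScreenHelperFactory, and the fingerprint check are placeholders I'm using for illustration, not code from the earlier posts, and the swipe gestures stand in for whatever your particular devices actually need to get past their keyguards:

```java
import android.os.Build;
import android.os.RemoteException;

import com.android.uiautomator.core.UiDevice;
import com.android.uiautomator.core.UiWatcher;

// Generic lock screen watcher: it only detects a dark screen and hands the
// device-specific unlocking off to whatever helper the factory returns.
public class WatchersHelper {

    public static final String LOCK_SCREEN_WATCHER = "LockScreenWatcher";

    public static UiWatcher lockScreenWatcher(final UiDevice device) {
        return new UiWatcher() {
            @Override
            public boolean checkForCondition() {
                try {
                    if (!device.isScreenOn()) {
                        device.wakeUp();
                        LockScreenHelperFactory.create().unlock(device);
                        return true; // we handled the condition
                    }
                } catch (RemoteException e) {
                    // Couldn't reach the power manager; let the test fail on its own.
                }
                return false;
            }
        };
    }
}

// Every device-specific helper implements the same tiny interface.
interface LockScreenHelper {
    void unlock(UiDevice device);
}

// The factory inspects the system fingerprint and picks the matching helper.
class LockScreenHelperFactory {
    static LockScreenHelper create() {
        if (Build.FINGERPRINT.toLowerCase().contains("samsung")) {
            return new SamsungLockScreenHelper();
        }
        return new StockLockScreenHelper();
    }
}

// Illustrative stock-Android helper: swipe up to dismiss the keyguard.
// A real implementation would also punch in that device's PIN.
class StockLockScreenHelper implements LockScreenHelper {
    @Override
    public void unlock(UiDevice device) {
        int w = device.getDisplayWidth();
        int h = device.getDisplayHeight();
        device.swipe(w / 2, h - 100, w / 2, h / 4, 20);
        // PIN entry for this device's keyguard would go here.
    }
}

// Illustrative Samsung helper; the gesture and PIN pad differ, hence the factory.
class SamsungLockScreenHelper implements LockScreenHelper {
    @Override
    public void unlock(UiDevice device) {
        int w = device.getDisplayWidth();
        int h = device.getDisplayHeight();
        device.swipe(w / 2, h - 100, w / 2, h / 4, 20);
        // Samsung-specific unlock handling would go here.
    }
}
```

And the registration from a test class built on the old UiAutomatorTestCase runner is just a couple of lines in setUp() and tearDown() (the test class name here is likewise made up):

```java
import com.android.uiautomator.core.UiDevice;
import com.android.uiautomator.testrunner.UiAutomatorTestCase;

public class MyAppSmokeTest extends UiAutomatorTestCase {

    @Override
    protected void setUp() throws Exception {
        super.setUp();
        UiDevice device = getUiDevice();
        // Register the generic lock screen watcher; UiAutomator runs watchers
        // whenever a selector fails to find its target.
        device.registerWatcher(WatchersHelper.LOCK_SCREEN_WATCHER,
                WatchersHelper.lockScreenWatcher(device));
        // Run the watchers once up front in case the screen is already locked
        // before the first selector ever fires.
        device.runWatchers();
    }

    @Override
    protected void tearDown() throws Exception {
        getUiDevice().removeWatcher(WatchersHelper.LOCK_SCREEN_WATCHER);
        super.tearDown();
    }

    public void testAppLaunches() throws Exception {
        // ...ordinary test steps; no lock screen handling needed here.
    }
}
```

The point of the split is that neither the watcher nor the test class knows or cares which physical device it's running on; only the factory does.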
So the summary is: the lock screen UiWatcher in your WatchersHelper class is generic. All your tests will reference it without paying specific attention to their environments, because that class merely calls the factory through the appropriate interface. It doesn't take long to set this up once you're comfortable building UiWatchers in the first place, but the first time it saves your bacon on an unexpectedly locked screen, it will have paid for itself in test results you'd otherwise have lost. Each time afterwards is just gravy. And hey, on top of that you've built the foundation for dealing with any lock screen-based tests (since lock screen notifications are now stock in Lollipop, you can anticipate needing to handle that UI). If there's interest, I can post some sample code showing where to organize this in your project. Until then, keep it groovy, Gotham.
Activate the Bat Hand Jive, Robin!