
Boom! Screenshot! Capturing on-device app state when Android UI tests fail.

"A sub-feature of this is the screenshots feature. While most of the fare you will see will be OMG LOOKEY HEREs, with a little skill, you can take quite good images."
~PARDOX460, The Art of Taking Good Halo 3 Screenshots
Remember back a few posts when I dropped a bit of info about maybe, perhaps, capturing screenshots on failed UI tests? Well, it wasn't a lie, and now I'm going to show you how. First, I'll go over the prerequisites. Second, I'll present a code sample that demonstrates how to implement the screenshot tool in your tests. Finally, I'll wrap up with some recommendations on further steps to embed this into your continuous integration environment and practice. Let's get started, shall we?

Phase 1: Prerequisites

The bell I'm forced to ring whenever it comes to Android test automation is that Google has deigned to build it around emulators. This is convenient for Google because they're not beholden to manufacturers to optimize for any particular devices. It is convenient for developers on a shoestring budget because they aren't expected to buy a ton of devices in order to test core functionality. As an old-school manual tester, I just cringe at the thought of trying to validate expected behavior, look and feel, UX, UI, and performance without a real device in hand. So there are some hoops to jump through to get test output from your beloved gaggle of 'droids.


Robotium

If you're taking screenshots of failed tests, it is probably safe to assume it is because you're not physically, actively monitoring your tests in person. If you actually are hovering over each device, painstakingly watching for every aberrant behavior, then you've got deeper problems with the concepts of automation and continuous integration than getting screenshots will solve. There is also a very strong chance that you're familiar with Robotium. One of the new features in 3.4.1 is the ability not only to take screenshots, but to pass a string into the screenshot method specifying the filename of the resulting JPEG. This will be important in Phase 3.


SDCARD write access

As you can read in the utility class itself, you must declare the android.permission.WRITE_EXTERNAL_STORAGE permission for SDCARD access in the app under test. This gives the takeScreenshot method the ability to create the /sdcard/Robotium-Screenshots directory on your device and then dump the all-important jpeggy goodness in there. This is handy especially if your tests involve uninstalling your app, clearing its cache, or any other app-directory file manipulations. Just remember to clean up after yourself later on when you're ready to run a new pass.
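For reference, a minimal sketch of where that permission lives in the app under test's AndroidManifest.xml (the package name here is a placeholder):

```xml
<manifest xmlns:android="http://schemas.android.com/apk/res/android"
    package="com.example.app">
    <!-- Lets Robotium's takeScreenshot() write to /sdcard/Robotium-Screenshots -->
    <uses-permission android:name="android.permission.WRITE_EXTERNAL_STORAGE" />
    ...
</manifest>
```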


@Override runTest()

This is the sneaky trick that connects the ability to take a screenshot in the first place with the context in which it is relevant. The JUnit TestCase framework uses runTest() to intercept thrown assertion errors and report failures. This is the point where we want to step in, capture a screenshot of the on-device state of the app, and then pass the throwable through to be handled normally by JUnit. It is also a good place to add some logging.

Phase 2: Code Sample

Once you've decided how you prefer to handle the write permission, included Robotium in your test project, and written a test case that exercises the UI, insert the following code just after your test case constructor:
@Override
public void runTest() throws Throwable {
    try {
        super.runTest();
    } catch (Throwable t) {
        // Name the screenshot after the fully qualified test that failed,
        // e.g. com.example.test.LoginTest.testBadPassword
        String testCaseName = String.format("%s.%s", getClass().getName(), getName());
        solo.takeScreenshot(testCaseName);
        Log.w("Boom! Screenshot!", String.format("Captured screenshot for failed test: %s", testCaseName));
        // Re-throw so JUnit still records the failure as usual.
        throw t;
    }
}


Phase 3: Recommendations

So now that you can get screenshots, and I'm sure you wrote a test or two that will definitely fail just to prove the code above works, it is time to plug this into your existing test automation system. At my office, we use Jenkins, and test data artifacts are handled by the test runner script on the lab server that hosts the USB connections to all our test devices. This script includes our pre-test setup and post-test cleanup operations for each Android JUnit job.

Juice for your build job

To get the images off the device, use:
adb -s *YOUR DEVICE SERIAL* pull /sdcard/Robotium-Screenshots

Also, don't forget to add a clean-up step with the following:
adb -s *YOUR DEVICE SERIAL* shell rm -R /sdcard/Robotium-Screenshots
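Tying those two commands together, here's a dry-run sketch of how a per-device step in a runner script might look. DEVICE_SERIALS, ARTIFACT_DIR, and the echo default for ADB are illustrative assumptions, not our actual script; set ADB=adb (and your real serials) to execute against devices.

```shell
#!/bin/sh
# Dry-run sketch of a per-device pull-and-clean step.
# ADB defaults to `echo adb` so the script just prints the commands it would
# run; set ADB=adb in your runner to execute them for real.
ADB="${ADB:-echo adb}"
DEVICE_SERIALS="${DEVICE_SERIALS:-emulator-5554}"      # placeholder serial(s)
ARTIFACT_DIR="${ARTIFACT_DIR:-artifacts/screenshots}"

for serial in $DEVICE_SERIALS; do
  mkdir -p "$ARTIFACT_DIR/$serial"
  # Copy the JPEGs off the device...
  $ADB -s "$serial" pull /sdcard/Robotium-Screenshots "$ARTIFACT_DIR/$serial"
  # ...then remove the directory so stale shots don't pollute the next pass.
  $ADB -s "$serial" shell rm -R /sdcard/Robotium-Screenshots
done
```

Keeping one artifact subdirectory per serial means a multi-device pass can't clobber its own screenshots.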


Future development suggestions

We also pull the images, along with any XML and code coverage reports, from the test host server back to the main Jenkins server for reporting. The next thing I'm looking into is integrating the screenshots into the XML reports. As you can see in the code snippet in Phase 2 above, I'm naming the screenshot file after the test that failed and triggered the capture, which aligns well with how the Jenkins JUnit test report plugin arranges the pages it displays. Minimally, I'd like to update that plugin to scrape the results directory for JPG files and embed them in the pages that show stack traces, allowing for richer feedback when debugging failed tests.

From a UX perspective, assuming I succeed with the above extension in the plugin's functionality, I'd like to also see thumbnails generated on the main test results page so users can drill immediately to the results for a given failed UI test visually as well (as opposed to clicking 3 links down the current hierarchy).

Comments

  1. Hi Russell,

    Great post! It helped me a lot with taking screenshots for my test cases. Were you able to manage the integration with JUnit reports?

    Regards,
    Sathish

    Replies
    1. Turns out this was already in the works (although not as cleanly as with Spoon).

      https://wiki.jenkins-ci.org/display/JENKINS/JUnit+Attachments+Plugin