Thinking, Boxes, and Perspectives

A farmer had a plot of land he wanted fenced in and a limited budget for fence materials. He asked his three friends, an engineer, a physicist, and a mathematician, to propose solutions. The engineer, being the practical sort, took the full budget, bought the largest amount of fence he could, and laid it out in a circle. Happy with his practical solution, he said, "Friend, that's the biggest area enclosable with your budget." The physicist scoffed at the engineer's lack of "thinking outside the box" and laid out the fence in a straight line, saying, "I presume the fence to extend infinitely in each direction. There, I've fenced half the world." The mathematician calmly took a 5-foot section of fence and built a circle around himself. Once encircled, he proudly announced, "I declare myself to be on the OUTSIDE."
~ Nerdy joke I heard somewhere.
A local shipping company has a slogan on one of its trucks that made me laugh out loud inside my motorcycle helmet on the ride in to work this morning: "Thinking about boxes is our business, thinking outside of them is our passion." It got me thinking about the trope of a phrase, "think outside the box," as if it were something a person could do all of the time. To me, the logical conclusion is that all you've done is redefine and then reoccupy the same old box; that is, you've created a new mental construct and taken up residence within it.

I was reminded of a recent conference I attended where engineers working in the space where mobile, agile, and QA intersect self-organized a series of 24 panels on topics in that space. Not one panel was on the topic of high-quality manual testing. Most of the topics had to do with automation. I had previously commented on that fact to some colleagues who understood the implications. In the days since the conference, I have frequently returned to the notion that in all of this thinking outside the box, we may actually be neglecting the very valuable work to be done inside the traditional box of manual testing. The elephant in the room was the question of how manual testing fits in mobile, agile QA.

Looking back now, I'd like to have raised several questions within that topic to see if people actually needed to spend perhaps a wee bit more time within their boxes before venturing into the untamed wilds of Automationland.


  1. What is your strategy for truly agile test planning?
  2. How do you identify testable risks versus untestable risks?
  3. For that matter, what is your strategy for triaging risks in an agile manner?
  4. How do you estimate costs for manual testing?
  5. When should you use strict test cases versus more open-ended exploratory tests?
  6. How do you message to clients what has been tested?
  7. What kinds of reports make the most sense for your test scopes?
  8. What is the best way to organize manual testing in the context of 1 week sprints?
  9. How do manual testers best discover what to test in new builds?
  10. How can manual tests establish confidence in product quality over time?
And so on.

Re-examining the box from the inside is not the same thing as "thinking outside the box" unless, like the mathematician in the joke above, you prefer to arbitrarily define the box based on the context of what you're trying to solve. So if you're reading this and you're in the business of doing QA on mobile, consider whether you're placing too much hope in automation to solve the remaining problems found inside the manual testing box. I promise you a little creativity will uncover plenty of room to improve before you simply jump into a whole new box with its own set of problems to solve.
