Biased testing and micro-coaching
scott at tactilemedia.com
Fri Jul 7 23:37:27 CEST 2017
You don’t have to keep your mouth shut. In fact, you should be vocal, but you want your tester to be more vocal.
Generalized suggestions from past experience…
- First, explain to the tester in general terms what your app does. Avoid getting into operating specifics.
- Tell the tester you want them to verbalize their thought process as much as possible when encountering each screen/interaction process. Your goal is to get a sense of what the tester is thinking and why, not just whether or not they exhibit expected behavior (you will have to prompt the tester repeatedly to explain their thinking, without scolding or leading).
- Explain to your tester there are no right or wrong actions/answers while using your app — you are trying to observe real world behavior and initial responses to what they see/experience, and their interaction (or lack of it) in no way reflects on their “intelligence”.
- Give the tester one or more planned tasks to complete. Remind them to describe their thinking as they attempt to complete each task.
- Each time the tester is shown a new screen/process, ask them what they think they need to do at that point. Ask why. Keep all requests/comments neutral, and never correct the tester. If their response doesn’t fit with your intended behavior, ask the tester what they would suggest to improve the interaction/outcome, or to make the process more intelligible. Avoid letting the tester complete too many tasks in a row without describing their thought process.
- If the tester can’t figure out how to proceed to the next step, give them a hint (if possible) and determine whether they are able to understand the interaction. Again, ask for suggestions on what could be improved. Ask why.
- Rinse and repeat.
- Ask the tester at the end of the test what they felt was the biggest issue with the app. Ask the tester to reiterate how they would correct the problem. Review your list of problems encountered by the tester to confirm your understanding of the issues.
- Record/note all responses. Keep written notes to a minimum; use audio and/or video recording to collect more detailed/nuanced responses. In an ideal world, you would record the tester and the screen they interact with concurrently.
Hope this helps.
Tactile Media, UX/UI Design
> On Jul 7, 2017, at 12:49 PM, Jonathan Lynch via use-livecode <use-livecode at lists.runrev.com> wrote:
> From reading these, it looks like my basic steps are these:
> 1. Make changes to the app
> 2. Test for usability myself a dozen times, trying things in different orders and in different ways to make it fail
> 3. Have my testers (really just about 3 family members) test it to make it fail
> No coaching, no hints
> Directly observe their tests very closely
> Make notes on any moments of confusion, even if they are minor
> Interview them, asking what they were thinking at each step
> Adjust the help file and add hints - and test those as well
> 4. Fix as needed and retest
> 5. Publish
> 6. Try to find virgin testers for next time, varying in age and mindset
> Does that sound about right?
> Sent from my iPhone
>> On Jul 7, 2017, at 1:53 PM, jonathandlynch at gmail.com wrote:
>> Thank you, Jacqueline
>> Sent from my iPhone
>>> On Jul 7, 2017, at 1:39 PM, J. Landman Gay via use-livecode <use-livecode at lists.runrev.com> wrote:
>>>> On July 7, 2017 6:59:52 AM Jonathan Lynch via use-livecode <use-livecode at lists.runrev.com> wrote:
>>>> What steps do you guys follow for accurate testing when you don't have a budget for proper official testing procedures?
>>> Jacqueline Landman Gay | jacque at hyperactivesw.com
>>> HyperActive Software | http://www.hyperactivesw.com
>>> use-livecode mailing list
>>> use-livecode at lists.runrev.com
>>> Please visit this url to subscribe, unsubscribe and manage your subscription preferences: