
Let's talk about testing

Can a receptionist do QA?

Recently I was at a company whose QA effort was handled part-time by a receptionist, typically just before the release went out. The senior project manager would have her spend an hour clicking around a pre-production site to see if – in her estimation – it worked well enough to go live to production. Of course the technical team did QA work too, but she was really the person responsible for the UAT effort.

“Sounds like she’s doing exploratory testing then, and not really UAT,” some of you would say.

And you would be right. Any professional tester worth their salt knows that exploratory testing is a valid and very worthwhile technique, but its effectiveness varies from individual to individual. This is why testing evolved from the early days of people doing random things in random ways to written test sets that are repeatable, which in turn gave rise to predictable test coverage that can be expressed as a quantifiable number. Everybody loves statistics on a test effort expressed as total tests run, tests passed, tests failed, and critical errors found, rather than a gut feeling expressed as “we think it’s okay to go live!”

But what’s the real problem here?

Well, the problem that some of the company’s people expressed to me was that more and more bugs were going into production and people were getting unhappy. Rather important people up top, who were expressing their unhappiness by putting heat on the people below. Quite a lot of heat, as it turns out. So I was asked whether there was anything I could do to help their receptionist do QA better.

What they were really asking me was: can a receptionist do QA? They didn’t have the money to take on additional resources, so they couldn’t bring in a dedicated person to do the QA job. The short of it was that the receptionist was going to continue doing QA for the foreseeable future.

My view is that a receptionist is actually an ideal person to do QA, when you really think about it. They perfectly represent the great unwashed masses out there in userland who visit consumer sites every day. They will interpret a UI in the same way, they will perform a sequence of actions in more or less the same way, and most importantly they’ll do the same sort of random things that royally screw up a website.

This is perfect! Because I believe that QA is only a valuable activity when it’s focused on the consumer. Which means client side QA from a browser is the only kind of QA that’s truly useful to a company. Why? Because it’s the consumer that determines whether your web service lives or dies. If they like it, they’ll come back and more than likely bring others with them. If they don’t, they’ll never come back, and more than likely tell the world through conversations, emails, and blogs why your site is rubbish. So is there even a point in doing server side architecture testing? I’m going to really put the cat amongst the pigeons and say: not that much. Web consumer service QA should face the browser and go forward from there. Look for the next post from me to explain why.

So the receptionist is an ideal person to perform QA for your technical team! The downside, however, is that when they try to report the error it’s likely to be a description along the lines of:

The site doesn’t work! I wanted to do something on the groups page (not sure which one, but it was definitely a groups page), and then I clicked a button, and then something happened and it stopped working.

Any developer getting this type of bug assigned to them will probably mark it as RESOLVED – WORKS FOR ME. After all, they know how the groups feature works, and when they go through the set of actions that any “reasonable” user should go through, it works. Or they’ll write a flaming comment designed to flay the skin off the original author for their epic fail in concise, accurate reporting. Either way, they could be overlooking a genuine bug, one you have to be Average Joe to see. (Devs in no way represent Average Joe – the user. They shouldn’t ever be allowed to think they can, should, or do. If they were Average Joe, they wouldn’t be working as high end web/internet developers.)

Okay, so this is where some training can help. Anyone who gets sent to a beginner’s testing course will be taught how to write accurate bug reports. Really though, in a couple of hours you can give someone the process they need for writing bug reports that devs can use in a meaningful way. It’s not rocket science; it’s just a set of principles based around outlining:

1. What you were trying to achieve.

2. Where you started.

3. What you did.

4. Where you ended.

5. What happened in between, as you took each specific action.

Put all this into a bug report without flowery language, and even the most bitterly cynical developer will gruffly acknowledge you did an adequate job describing the problem. Good devs will have enough to go on to track a problem down, without an endless exchange of probing questions designed to extract relevant information out of you, all of which costs time; more and more time for each message in the thread.
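
To make that concrete, here’s a sketch of what a report following those five points might look like; the feature, steps, and error message below are invented purely for illustration:

What I was trying to achieve: add a new member to an existing group.
Where I started: logged in as an ordinary user, on the Groups overview page.
What I did: opened the “Book club” group, clicked “Add member”, typed a name into the search box, and clicked the “Add” button.
Where I ended: on a blank page showing the message “Something went wrong”.
What happened in between: the search suggested the right name, the “Add” button greyed out for a few seconds, and then the page went blank. The same thing happens every time I repeat these steps.

Nothing fancy, but a developer can walk through it step by step and reproduce the problem without a single round of follow-up questions.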

I’m hoping to get the opportunity to work with the receptionist in the near future so I can put her through a program that I believe will improve her overall productivity in a measurable way, and that will take a couple of hours at most. I’ve never seen a completely non-technical user do QA, so I’m looking at it as a challenge; of all the tools available in the professional web tester’s toolbox, just a couple should be usable by someone with very few technical bones in their body. I believe they can be imbued with the mental process to learn the art and science of good QA, and learn for themselves how to improve with each subsequent cycle they perform.

I think I’ll make a case study out of it as something interesting to discuss in the near future.

Andy.


  • I’m about 95% with you, but this bit caught my eye:
    > So is there even a point in doing server side architecture testing? I’m going to
    > really put the cat amongst the pigeons and say: not that much. Web consumer
    > service QA should face the browser and go forward from there.

    I think that’s a bit aggressively phrased. In the end you’re right, though: if “QA facing the browser” finds flaws, there’s no point in yammering about how the server side architecture passed its tests just fine. That part I absolutely agree on.

    Still, I’m a fan of multiple lines of defence. You can catch a ton of errors testing from the inside out, as it were. And if the results of the server side architecture tests disagree with the consumer side tests, that’s most likely a hint that your architecture doesn’t fit the use case. That’s valuable information, and information you don’t easily see with “just” consumer side testing, or “just” architecture side testing.

  • Hi Jens,

    Well, I was being slightly controversial with this. I do believe that for consumer web services the most valuable tests are those run from the browser. However, this isn’t to say that server side testing is irrelevant; it’s not.

    Server side testing is in fact a necessary activity in complex services delivery, because you cannot tell from the page alone why it is displaying the wrong information. You have to understand what transactions occurred further up the chain to isolate the fault. And this is where testing the server components proves its worth: you verify and validate that they work according to some spec, and you find bugs that will never be found through the browser.
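
    To illustrate the kind of check I mean, here’s a minimal sketch of a server side test, assuming a hypothetical JSON endpoint (/api/groups/42) sitting behind a groups page; the URL and field names are invented purely for illustration:

    ```python
    # Minimal sketch of a server side check against a hypothetical groups API.
    # If this passes but the page still shows the wrong members, the fault is
    # in the front end; if it fails, the fault is further up the chain -- the
    # kind of isolation you can't do from the browser alone.
    import requests

    def test_group_members_endpoint():
        response = requests.get("https://example.com/api/groups/42", timeout=10)
        assert response.status_code == 200

        payload = response.json()
        # Expected values would come from the spec or from test data set up beforehand.
        assert payload["name"] == "Book club"
        assert "alice@example.com" in payload["members"]

    if __name__ == "__main__":
        test_group_members_endpoint()
        print("Server side check passed")
    ```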

    Good test planning is then the art of balancing browser testing against server side testing within the time and resources you have available.

    Andy.

