I took a long break from blogging and from the “what software testing is” series, so before I return to it, I would like to share some personal observations on a few interesting facts. They are related to what software testing is, but from an outsider’s point of view. In fact, this is part of the reason I started the series in the first place: opinions about testing are often shaped by people outside of testing.
So, have you ever wondered what others in your company think you do? What is their perception of your job? You don’t know? Well, it’s fairly easy to test: simply read through the job descriptions posted when hiring new employees. Job descriptions are often prepared by HR, team leads, and sometimes development managers, so I consider them a really good measure of what other people think testing is.
I read job postings very often – not because I am constantly looking for a new job, but because I am constantly looking for personal improvement and I want to make sure my skills are still relevant to the market. I believe that’s something every good professional should do, and also something that can give us perspective on the craft. Reading job postings always leaves me with the impression that the people posting job offers for testers are either non-testers, copy-pasting job descriptions without putting any analytical thinking into them, or drunk. Here are some of my favorites, found while researching this blog post (these are all requirements from real job postings):
Analyse automation and other failures and:
- Accurately report and track defects
- Identify areas of improvement
“Automation and other failures…” sounds to me like the company is telling you, pretty straightforwardly, that their automation testing strategy/tool/thing is a total failure and they want you to analyze it and probably fix it. Jokes aside, I bet the author meant something quite different, but didn’t pay close enough attention.
“Keeping track of developer changes and doing the appropriate tests when needed”
“Developer changes”?! Does this mean the tester should pay attention to whether developers switch places, or whether a developer changes his clothes, haircut, or political views? What kind of change do they mean?
Not to mention ridiculous role headings like “We are hiring an automated tester“. Wtf do you mean by “automated tester”? Who are you going to hire, a T-1000? Iron Man? Wall-E? Testing is a human activity, performed by humans who occasionally use automated tools, but that doesn’t turn them into automated testers.
Besides that, job descriptions normally contain a list of activities that serves as a “what you are going to do when you get hired” explanation. Here’s how it looks – raw material that I practically copy-pasted from several offers:
- Analyse business and technical requirements, identify potential software issues
- Design and execute test plans and test strategies
- Execute functional, regression, integration and performance tests
- Create test reports, defect analysis and troubleshooting
- Maintain and regularly update QA related documentation
- Prepare, monitor and maintain test environments and systems
- Execute manual tests
- Create detailed test plans and high fidelity test suites with test cases
- Test suite execution, result analysis and reporting
- Estimate testing complexity and tasks for User Stories
- Coach other team members
- All applicants should be familiar with industry best practices for testing products including:
- Defect classification and issue severity rating
- White box, black box, and gray box testing
- Usability testing
- Code coverage
- Unit, integration, system, and regression testing
- Security and performance testing
- Common automated testing tools
- Continuous integration and continuous delivery
- Agile methodologies e.g. SCRUM
- Team sprint planning tools like JIRA
- Customer use cases
- Test documentation
- Build tools and scripts to reduce the need for repetitive and manual tasks and tests.
- Analyze requirements and product specifications.
- Create, implement, and execute tests to break our software
- Interact with engineers and managers to create good testing processes and test plans for software projects.
- Interact with customers to understand their testing requirements and report issues.
- enforce the acceptance criteria of features;
- Design automated test cases, review existing such and analyze results;
- Executing different types of black-box testing, including functional and non-functional
- Automating test cases using various tools and languages
- Documenting and assuring the quality of software applications across all architectural layers
- Define and execute functional, automation and performance test plans and strategies;
- Prepare test environment, business scenarios and scripts, test scenarios, data and test scripts;
- Execute test cases, file bug reports, and report on product quality metrics;
- Drive testability requirements into the product;
- Follow good engineering practices.
- Develop new and maintain existing automation tests;
- Write automatic integration tests.
- Write test documentation.
- Work on understanding scenarios, and reviewing test cases to ensure that they meet the testing approach.
- Ensure that the business process is respected
- Work in collaboration with the managers responsible for the quality of the deliveries.
I believe every person in IT, and in testing in particular, is very good at spotting patterns. Very often, when I review job offers, I see the following pattern:
Testing is mostly presented as:
- Writing documentation:
“Create detailed test plans and high fidelity test suites with test cases”
“Design automated test cases”
“Write test documentation“
- Performing predefined tests:
“Execute functional, regression, integration and performance tests”
“Executing different types of black-box testing, including functional and non-functional”
“Execute test cases, file bug reports, and report on product quality metrics;”
Somewhere among these, less frequently but still worth mentioning, appear things like “being a good team player”, “exchange knowledge with team members”, “analyse documentation”, “communicate with clients”, etc.
So, the problem is that none of the above descriptions mentions some of the core activities of testing:
- Creative thinking – test design is mentioned very often, but nobody ever mentions how we design those tests; it’s as if the tests write themselves, or “we just figured it out already”.
- Problem solving – every job description treats testing as the execution of steps, not as an active process of problem analysis and resolution.
- Exploration of the product – a few thousand times I have seen “analyse documentation” and “analyse requirements”, and not once have I seen “analyse the product”. We are shipping the product, after all; analysing the requirements is testing the requirements, but the requirements are not the product. And that’s without even considering the case where requirements and docs are missing or outdated.
- Experimental approach – testing is all about learning new information about the product through experimenting with it. If we remove this activity from testing, we turn it into mindless zombie clicking.
In other words: job offers describe testing as the unsexiest activity ever, a simple set of activities that all revolve around documenting, test execution, and reporting.
This is why testing is often falsely viewed as an activity that can be fully automated: the part of testing visible to management and to people outside of testing is the part with components that are easy to automate (execution, documentation, and reporting), while the components vital to expert testing are often omitted or taken for granted, such as:
- Problem solving
- Risk assessment
And guess what: all of them are totally non-automatable!
Which leads to my personal favorite from the list above:
“Build tools and scripts to reduce the need for repetitive and manual tasks and tests.”
I’ve also seen this formulated as: “Perform automation to reduce boring and repetitive tasks.”
Now, let’s be honest about two things:
- You can reduce the “need for repetitive testing” by simply not performing repetitive testing anymore. Captain Obvious taught me this one.
- If you consider testing “boring”, you probably lack the proper mindset to perform it.
Dear testers, testing is an activity based on natural curiosity. If you are curious enough and love what you do, you will find every part of your job interesting. And by “interesting” I don’t mean “amusing”; you don’t work to entertain yourself. By “interesting” I mean challenging. If you have the mindset of looking for a challenge in every task you perform, you will be curious enough to look for a solution. On the other hand, if you lack the proper motivation and a curious mindset, you will find every activity in testing dull and boring.
What can we do about it?
I think I have stated the problem well enough, but a more important question comes to mind: what should our reaction be?
In the first place, I think it’s our fault. For too long we have agreed to be told what we do and what our job consists of, without making any complaints, and now we complain about the result.
What can we do to fix things? Well, it’s fairly simple: tell your testing story in a compelling and scientific way. I know, and you know, that testing is not what we are told it is – execution of test cases and filing of bug reports. So tell the true story about it. Make a presentation about the problem-solving approach you chose to resolve a particular problem, or write a post about how a traditional test-case strategy failed you and you had to “switch gears” to something more effective.
My advice is very simple: tell the true story of testing; don’t let others tell you what it is.
I would love to read your thoughts on the topic. Any shares and retweets are highly appreciated.