The iterative approach to software development emerged around the 1990s. What started as a small, co-located, self-sustaining team approach was widely adopted by many development teams. The agile mindset paved the way for multiple development frameworks, including the well-known Scrum methodology.
With time, processes undergo metamorphoses, and a few unpleasant practices sneak in, causing distractions within teams. This article focuses on a few such software testing distractors within an agile Scrum team.
The agile testing distractors below may not have a significant impact in plain sight, but if left unattended they can cause real harm in the long run.
Muted Refinement sessions
By default, the complexity of a user story is linked to development effort, conveniently ignoring the testing effort involved. Sometimes implementation hardly takes any time, but the impact of the change has a large blast radius that significantly increases the time required for testing.
A user story is complete, and a common understanding is established, only if the story is addressed from all angles. Test implications should be considered during refinement sessions, and test engineers should facilitate this discussion with the product owner and the wider development team.
Absence of testing notes
Refinement sessions provide a good opportunity for the PO (Product Owner) and the development team to scrub through a user story and add detail. The main aims of refinement meetings are breaking a user story down into implementable tasks, clarifying open questions, and validating certain aspects of the implementation. However, most teams miss out on capturing testing notes.
Let us first discuss what a testing note is.
The supplemental details required to execute the tests for a particular user story are called a testing note. It can capture details such as test data, plugin requirements, dependencies on external services, test limitations, and test relevancy.
In real-world projects, not all stories can be tested in all environments.
For example, development or test environments may have limited configuration capacity, so files beyond a certain size cannot be processed. Capturing testing notes brings clarity and a shared understanding of how a particular scenario can be approached. It also gives POs a good opportunity to understand hidden risks.
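As an illustration, a testing note could be captured as a small structured record attached to the user story. The field names and values below are hypothetical, not a prescribed schema:

```python
# A minimal sketch of a testing note as structured data.
# All field names and values here are illustrative assumptions.
testing_note = {
    "story": "PAY-142: Bulk invoice upload",
    "test_data": "Sample invoice files of 1 MB, 50 MB, and 2 GB",
    "dependencies": ["External tax-rate service (sandbox mode)"],
    "limitations": "Dev environment cannot process files larger than 100 MB",
    "relevant_environments": ["staging", "production-like"],
}

def environments_with_gaps(note, all_envs):
    """Return environments where this story cannot be fully tested."""
    return [env for env in all_envs if env not in note["relevant_environments"]]

# Flags 'dev' as an environment where this story cannot be fully tested
print(environments_with_gaps(testing_note, ["dev", "staging", "production-like"]))
```

A record like this makes the hidden testing risk explicit during refinement rather than discovering it mid-sprint.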
Separate QA board
Mainstream scrum boards restrict visibility to the movement of user stories across swimlanes. However, an agile tester performs many more test activities than just new feature testing or regression testing. For example:
1. Supporting small hot-fixes
2. Short enhancements
3. Legacy applications support
4. Usability testing
5. Automating the user stories
6. Maintenance of automated scripts
7. TestOps, etc.
By and large, development teams remain unaware of these activities. Agile team tasks should be as transparent as possible, and the available capacity of every team member should be taken into account before pulling in sprint tasks.
Automation tasks in particular are often separated out onto different scrum boards, which is detrimental in the long run. To stop this from becoming a major distractor, all peripheral test activities should be made visible on the sprint scrum board so they count toward overall delivery within the agile Scrum team.
The idea is not to completely discourage a separate QA scrum board; it can be handy in some scaled agile setups where multiple scrum teams have to coordinate several tasks. However, what should stay off the mainstream scrum board deserves careful thought.
Ignored test efforts
After scrubbing and combing the backlog in the refinement session, the PO prepares a prioritized backlog that is presented in the sprint planning meeting. The development team maps each story to a size that reflects the effort required to implement it. Planning is a crucial Scrum activity, yet most teams fail to take testing effort into account.
This is an anti-pattern in agile methodology. A story is considered complete only after it goes through both development and testing activities. Agile teams should encourage every team member to actively participate in the planning poker session, and test engineers should confidently step forward to share their view of the user stories during planning meetings.
Defects reported as comments
Agile recommends reducing waste and weeding out overhead activities. Development teams that do not use an external bug reporting tool encourage team members to report bugs as comments within user story tickets. In such cases, the test team misses out on the most valuable QA activity: root cause analysis. Why is it important? Defects are a mirror of the health of the application; a good quality report exposes the weak links and helps the development team strengthen the areas that need deeper attention.
Teams that comment defects within user stories cannot aggregate the bugs to feed root cause analysis. The whole idea of iterative development is driven by continuous improvement; when defects are not accessible, teams miss out on valuable intrinsic details. To avoid this becoming an agile testing distractor, educate development teams about the importance of defect analysis and report bugs either in a dedicated tool or in some other way that allows data to be aggregated over time.
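One lightweight way to keep defects analyzable, assuming each bug record carries a component and a root-cause tag (both hypothetical fields, not a standard schema), is to aggregate them periodically and surface the most frequent weak links:

```python
from collections import Counter

# Illustrative sketch: aggregating defect records for root cause analysis.
# The record fields (component, root_cause) are assumed, not a fixed schema.
defects = [
    {"id": "BUG-101", "component": "checkout", "root_cause": "missing validation"},
    {"id": "BUG-102", "component": "checkout", "root_cause": "missing validation"},
    {"id": "BUG-103", "component": "search",   "root_cause": "race condition"},
    {"id": "BUG-104", "component": "checkout", "root_cause": "configuration drift"},
]

def weak_links(records):
    """Count defects per (component, root_cause) pair, most frequent first."""
    counts = Counter((d["component"], d["root_cause"]) for d in records)
    return counts.most_common()

for (component, cause), count in weak_links(defects):
    print(f"{component}: {cause} -> {count}")
```

Even a simple roll-up like this is impossible when defects live only as free-text comments scattered across story tickets, which is exactly why aggregatable reporting matters.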
Piggybacking testers
Early feedback helps development teams adopt changes more easily, and user stories checked early in the development cycle help establish that feedback loop. In the real world, once a user story is developed and tested, POs are expected to do a walk-through and bring forward any required changes.
POs and PMs often create a dependency on testers to demo new features. If the tester and PO fail to catch up in time, tickets simply pile up in the QA lane, increasing cycle time. Catch this bottleneck before it grows into a deeper issue: encourage POs to get hands-on with the application and try out the features themselves. Doing so not only reduces latency but also helps the business understand how the feature fits into the wider application.
Related reads:
https://testsigma.com/regression-testing
https://testsigma.com/blog/common-challenges-in-continuous-testing/
https://testsigma.com/blog/top-test-automation-challenges-to-look-out-for/
https://testsigma.com/blog/understanding-agile-testing-framework/
https://testsigma.com/ai-driven-test-automation