Assessor - Release - Retrospective - ESI

This retrospective board is for the Release- of the Assessor- project.

What worked well?

- Teamwork and support within the Scrum teams.
- Devs working well with the UI team; the UI team's timely support has been crucial to the success of this project.
- Implementation of good Agile practices, such as Devs writing automation code when needed and also doing manual tests.
- Scrum Masters focusing on build-related issues as well, and ensuring that Smoke failures are attended to promptly.
- Good coordination with, and support from, the BAs.
- Scrum implemented better than on other projects, with Continuous Integration.
- Good improvement initiatives in the QA space from the QA Lead.
- Sprint Reviews done collaboratively, with everyone in the Scrum team taking turns to showcase the demo.
- Good technology stack chosen; Scrum team members gained confidence quickly in the initial sprints.
- Scrum Masters, the TA, and senior QA diligently supported the Attinad team to ensure the engagement was successful in every possible way.
- Teams picked up the technology and process quickly, which is a good sign that we can mature further.
- The TA's effort in getting the designs right and doing strong PR reviews, especially on Attinad deliveries.
- Good help from the management team in understanding the crisis situation and supporting the de-scoping of some user stories.
- Great vision initially in creating the epics and stories properly.
- The POs' good knowledge of existing Web Assessor logic helped.
- Moving manual testers into the automation space brought the right fit in terms of tester mentality and automation capability.
- Constant focus on refactoring keeps code quality maintainable.
- New team members who joined mid-project picked up the Assessor technologies well, going straight into the sprint after a day's training.
- Understanding the Agile work style, plus support from the TA, SAs, Scrum Masters, and BAs, was helpful.
- Technical stack selection.

What hasn't worked well?

- AT: Difficult to verify the automation regression test failures every day, since the QA team is occupied with their own sprint tasks.
- Weekend working.
- Project management is still waterfall while development happens in Agile: contradictory and painful.
- Heavy workload for the manual testers in verifying the Definition of Done, retesting, and regression, compared with the automation team.
- The Attinad engagement's restrictions on sharing data have caused a big overhead for some team members in making the engagement work.
- The automation approach needs clarity: what to automate, the right ownership, the overall automation strategy, and early detection of defects to aid quality delivery within a sprint and for a release.
- Certain points of unlearning in the QA space are crucial to the success of this project: https://www.scrumalliance.org/community/articles//january/agiletesting-key-points-for-unlearning
- BAs should analyse all the business scenarios and write the requirements accordingly, instead of QA/Dev discovering them late in execution.
- QA colleagues should speak up and challenge more in Sprint Planning and Retrospectives.
- In most sprints the manual-testing capacity was in the red. No action or substitute?
- Test strategies and automation strategies were finalised late, only after issues came up; more reactive than proactive.
- Prioritisation of user stories, weighing business value against complexity, should have driven the story order in the earlier sprints.
- Not getting timely help from outside for deployment/build issues (space issues in Octopus, FAM deployment, etc.). We need ownership here.
- The team is often tempted to take more stories into a sprint, which restricts QA and Dev from being creative and exploring the functionality in depth.
- Are we implementing retrospective points effectively? We discuss many points during retros, yet similar issues keep persisting.
- Following the process to minimise defect/issue leakage to integration day.
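The daily triage of automation regression failures raised above could be eased with a small script. A minimal sketch, assuming the automation run publishes JUnit-style XML results; the file name and report layout are invented for illustration, not the project's actual tooling:

```shell
#!/bin/sh
# Hypothetical sketch: pull failing test names out of a JUnit-style XML report
# so the nightly automation failures can be triaged at a glance.

list_failures() {
  # Print "classname.testname" for every <testcase> that wraps a <failure>.
  awk '
    /<testcase/ { line = $0 }
    /<failure/  {
      match(line, / name="[^"]*"/);     nm  = substr(line, RSTART + 7,  RLENGTH - 8)
      match(line, /classname="[^"]*"/); cls = substr(line, RSTART + 11, RLENGTH - 12)
      print cls "." nm
    }
  ' "$1"
}

# Tiny self-contained demo report (two tests, one failing).
cat > /tmp/regression-results.xml <<'EOF'
<testsuite>
  <testcase classname="LoginSuite" name="test_login" time="0.4">
    <failure message="element not found"/>
  </testcase>
  <testcase classname="LoginSuite" name="test_logout" time="0.2"/>
</testsuite>
EOF

list_failures /tmp/regression-results.xml   # prints: LoginSuite.test_login
```

Posting such a one-line-per-failure summary to the team each morning would let whoever has spare capacity pick up the triage, instead of it competing with QA's sprint tasks.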
- The General Acceptance Criteria are so large that estimating a story relies only on the acceptance criteria written into the story.
- Automation failures were not kept under control after the failure count had been brought down.
- User stories wait for testing tasks to close while the Devs have spare capacity; Devs could take up those tasks and close the story.
- Losing valuable team members mid-project, and team reshuffles.
- Fewer functionality issues in NLP, but the office test affected confidence in the product. Proper planning with the teams needs to happen beforehand.
- Proper analysis of already-developed functionality should be done while writing new user stories; story misses can ultimately lead to unwanted defects or CRs. Evidenced by (i) response submission from the worklist and from outside the response, and (ii) the impact of the recent functionality for opening responses from messages on the Quality Feedback functionality.
- There is only one Android device available for all three teams; this has been highlighted since the early sprints and is still not resolved.
- Implementation changes after hitting performance issues, e.g. more annotations in IE, hammer events rework.
- Build delivery during a sprint should follow the plan, and the agreed sprint QA tasks should get equal weight.
- Scope of testing: there was quite a lot of scope creep, which played havoc with the estimates and with coverage/quality. The QA team challenged estimates during planning meetings, but the requested estimates were cut down, which hurt the testing effort. More regression issues were then seen moving into the next sprint.
- Proper milestone planning in terms of requirements: requirements should be forecast and their development planned accordingly.
- Defining the Definition of Done with respect to the various combinations that can be tested within a sprint.
- POs are unclear on the agreed UI designs and not in sync with other POs during the various Scrum ceremonies, causing confusion.
- POs should ensure that user stories cover live EE scenarios within the sprint.
- FNT closure delays during the last couple of sprints.
- Exit criteria for user stories are unclear in many cases; most of the time the impacted areas are identified only once development starts.
- Web Assessor, Assessor, and the database are not properly in sync in the March environment, as evidenced by the issues caused during recent SUP testing.
- Device performance needs to be analysed at the sprint level itself; current practice in most cases is just to make sure it works.
- Regression issues arise when different teams work around the same functional area of the application; strong upfront thought is required during planning and architecture design (e.g. Messaging and Exceptions).
- Memory leakage analysis should be done during the sprint; the Chrome and IE developer tools are good places to start.
- A more robust process is required for automation data setup; currently the restore to the functional database happens in a clumsy way.
- Technical upskilling of QA members: knowing how to back up and restore databases, so that they don't get blocked when a Dev is not around.

What could be improved?

- The Dev-QA relationship should work as one team, delivering and closing user stories together, rather than a raise-and-close relationship.
- Finding the best solution, and how we can get through the crisis, should be the way of thinking instead of passing the buck. Team effort is still missing in some areas, especially at the higher levels.
- Management should be more flexible in adding resources if they really feel the need. Feature testing, EE, and Smoke testing require continuous support; stretching current resources or working extra days is not a solution.
- BAs should be more confident in making decisions and in stating the existing behaviours in Web Assessor.
- Completely avoid labelling a Scrum member as 'Dev' or 'QA'; use names like 'testing member'. Avoid separate QA meetings for Scrum members.
- Avoid comparing task times between teams, as these differ on an individual basis.
- ESI management needs to make changes to smooth the transition to Agile, rather than making the team adjust to old reports and processes.
- More devices are required for testing, development, and UI investigation, since the focus is so much on usability.
- Implement BDD for the Smoke and EE scenarios, with frequent review of test cases by PO, BA, and QA together.
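The database upskilling and automation data-setup points above could start from a scripted restore that anyone on the team can run. A minimal sketch assuming SQL Server semantics; the database and path names are invented, and the real environment may use a different engine, so treat this as a shape to adapt rather than the project's actual procedure:

```shell
#!/bin/sh
# Hypothetical sketch: emit the T-SQL for backing up and restoring the
# automation test database. In practice the output would be piped to a
# client such as sqlcmd; here the statements are just printed.

backup_sql() {  # $1 = database name, $2 = backup file path
  echo "BACKUP DATABASE [$1] TO DISK = N'$2' WITH INIT;"
}

restore_sql() {  # $1 = database name, $2 = backup file path
  # SINGLE_USER kicks out stray connections so the restore cannot hang.
  echo "ALTER DATABASE [$1] SET SINGLE_USER WITH ROLLBACK IMMEDIATE;"
  echo "RESTORE DATABASE [$1] FROM DISK = N'$2' WITH REPLACE;"
  echo "ALTER DATABASE [$1] SET MULTI_USER;"
}

# Example: reset the (assumed) functional automation database before a run.
restore_sql AutomationFunctional /backups/AutomationFunctional.bak
```

Checking a script like this into the repo, rather than relying on whichever Dev happens to be around, is the kind of repeatable process the automation data setup currently lacks.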
- Difficult to manage end users' satisfaction.
- A proper estimation model from the QA team, and a process for entry and exit criteria.
- Continuous testing strategies need improvement, with more focus on automation and exploratory testing.
- Remove the pressure on story-point delivery and allow the team the freedom to say what is Done.
- The entire team should be aware of the Vision Document and Test Strategy, so they know the validation scope and can challenge user story coverage during sprint planning. Unless the team is aware of these, the impacts may not be understood, which can increase defects.
- Contingencies in the release plan to accommodate any crisis.
- Running the automation tests one sprint behind, as a regression suite, would be more effective.
- A risk-based approach to the delivery of stories, for both development and testing.
- Rely on dashboards to see trends rather than numeric targets, as far as project management is concerned, and reduce the time spent in meetings.
- More Scrum team members need to develop new skills and become multi-skilled, cross-functional team members (e.g. build/deployment awareness, automation scripting, hosting their own build locally) rather than depending on others.
- Proper Agile training would be good for members who joined late or missed the initial Scrum training, so that we don't fall back into old processes and lose the Agile mindset.
- Defect-fix time and scope additions should also be considered in the release plan.
- The device testing strategy, and the availability of devices for testing by the Dev, QA, and UI teams, needs a relook now that there are four ESI teams.
- Estimation at inception should factor in some amount of uncertainty, so that estimation variances are not huge.
- There needs to be a clear RACI for the various roles, along with an escalation point of contact.
- Impacted areas need to be analysed and covered during the sprint itself, to find regression issues early.
- Write more manageable code when modifying existing functionality.

What will we work on?

- AT: Discuss and agree on the objective, and fulfil the strategy with the plan. Relook at what went wrong in conveying expectations. Consider stopping a sprint when the team's confidence is low.
- The QA plan, RACI, and strategy to be jointly agreed by the Scrum Masters and the QA Lead before passing them to the team.
- More focus on the integration issues.
- Inception, scope, and visibility of the release to happen earlier in the project.
- Automation plans and structuring under discussion.
- Manual reload-test initiatives to be discussed, and a resource plan to be put in place.
- Accessibility ownership to sit with the UI team; discuss the scope of accessibility.