How to close tasks in the bug tracker

I originally wrote this article in our work Confluence in 2013, and at the time of this writing (2019) it was still relevant.



Initially I wrote the checklist down as a reminder, partly for myself. You often have to return to old tasks, and not only to the ones you tested yourself. For example, during a regression run you need to at least roughly re-check the functionality.



So you open the task and scroll to the last comment to see what documentation exists and what was tested, and there... it's empty. Or a modest "Everything is checked, everything is OK." Where is the documentation? I'm not familiar with this task; I want to read more!



Or a customer writes that something isn't working for them, and you want to check whether the situation is covered by automated tests. You open the task, and there is no link to the autotests. Were they never written at all? Or did someone just forget the link? Now I have to find out...









That is how the checklist for closing a task appeared:



  1. Check the task (hello, Captain Obvious). Use ready-made checklists for typical tasks, and avoid the typical mistakes in autotests.
  2. Write documentation for it (minimum: user docs; maximum: user docs plus docs "for colleagues").
  3. In JIRA, leave a comment: "Checked on build core ***, customer ***; looked at this, this, and that; here are the links."
  4. Attach test data to the task if anything was checked manually.


This is somewhat specific to us, but it's actually universal. If testers don't write documentation in your company, just change the second item to "check that all the necessary documentation has been written or updated."



Let's go through each item separately.



Documentation



If there is no documentation for the task (whether it's a bug or an improvement doesn't matter), the task is reopened.



If there is documentation, but the last JIRA comment has no link to it, the task is also reopened. (Harsh times call for harsh measures.)



At a minimum, write or update the user requirements.



If the project's configs changed, check that there is technical documentation for the change. If there isn't, write it.



If migration steps were added, immediately write them into the general instructions for updating the release.



Comment



If you have to return to a task later, it is sometimes useful to know which version the tester tested and what exactly they checked (briefly).



We write down the revision from Mercurial, give a link to the documentation, and add a brief description: "checked manually through the interface / SOAP / the buffer."
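To keep such closing comments uniform, it can help to generate them from a small template. This is only a sketch; the field names and the comment layout below are made up for illustration, not our actual tooling:

```python
def closing_comment(revision, customer, checked, doc_link, data_note):
    """Build a standard 'task closed' comment from its parts.

    revision  -- the Mercurial revision the build was tested on
    customer  -- which customer configuration was used
    checked   -- short list of what was actually verified
    doc_link  -- link to the documentation written/updated for the task
    data_note -- where the attached test data came from
    """
    lines = [
        f"Checked on revision {revision}, customer {customer}.",
        "Verified: " + "; ".join(checked) + ".",
        f"Documentation: {doc_link}",
        f"Test data: {data_note}",
    ]
    return "\n".join(lines)

# Example usage with invented values:
comment = closing_comment(
    revision="a1b2c3d4e5f6",
    customer="DemoBank",
    checked=["manual check via UI", "SOAP request", "buffer import"],
    doc_link="https://confluence.example.com/x/ABC",
    data_note="dbStart.xlsx attached to the task",
)
print(comment)
```

The point is not the code itself but the discipline: every closing comment answers the same four questions (version, what was checked, where the docs are, where the data is).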



Test data



If the task was checked manually, be sure to attach the test data to it (unless it weighs 3 TB).



Yes, this data can be fetched from the shared repository, but by then it may have been changed or even deleted. (We have a shared test-data repository, but all the files are stored on a network disk. We tried keeping it in a version control system; I didn't like it — nobody commits there at all, they just drop files onto the disk somehow.)



Sometimes it all seems like trivia: you just created one counterparty or gathered a selection by a view.



But if six months later a problem pops up at the customer's and the old tasks are dug up to reproduce it, these files — already verified more than once — will help a lot, while the live data in the storage is already out of date.



Remember: taking a ready-made SQL query and running it against the database takes one minute. Sitting down to write that query again may take an hour, if not more. Save your time!
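To illustrate the payoff of attaching a ready-made query: re-running it is a one-liner. Here is a minimal sketch in Python with an in-memory SQLite database standing in for the real one; the table, columns, and query are invented for the example:

```python
import sqlite3

# A "saved" query, e.g. one attached to the JIRA task months ago.
# Table and column names here are made up for illustration.
SAVED_QUERY = """
SELECT name, status
FROM counterparty
WHERE status = 'blocked'
ORDER BY name
"""

def run_saved_query(conn, query):
    """Re-run a query that was attached to a task and return the rows."""
    return conn.execute(query).fetchall()

# Demo: a tiny in-memory database standing in for the test database.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE counterparty (name TEXT, status TEXT)")
conn.executemany(
    "INSERT INTO counterparty VALUES (?, ?)",
    [("Acme", "blocked"), ("Globex", "active"), ("Initech", "blocked")],
)

rows = run_saved_query(conn, SAVED_QUERY)
print(rows)  # -> [('Acme', 'blocked'), ('Initech', 'blocked')]
```

With the query file attached to the task, reproducing the old check is exactly this cheap; without it, you first have to rediscover which tables and conditions mattered.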



Therefore:



  1. Created a database from dbStart? Attach the dbStart file (an Excel file that stores a slice of the database for testing).
  2. Downloaded test data from the storage? Attach the downloaded file.
  3. Downloaded it from somewhere else? Add it to the repository and attach it to the task.


See also:



How to quickly create a template database in Maven? — about how we build dbStart for testing.



The comment about tests, docs, and data should be the final one. Not "I wrote such-and-such tests, found such-and-such problems," followed by twenty comments of back-and-forth with the developer, with your "final" comment lost somewhere in the middle. If it gets lost, duplicate it (the old one can be deleted).



Examples



When we hired new testers, this article was no longer enough. I kept reopening the juniors' tasks and explaining how to fix the final comment so it would be understandable. And they, in turn, asked for "how-to" examples.



So another block appeared in the article: "Examples." Here it is important to link to real tasks in the work JIRA, plus some additional Confluence articles. The examples should vary: from huge comments on tasks with 200 autotests to the small tasks the team faces every day.



I won't give the links here, but the idea, I hope, is clear. The section looks something like this:



Examples of large tasks with lots of docs and tests:





Examples of small tasks: TEST-816 — expanded the consent model.



When testing "add functionality to the Demo," pay special attention to the refactoring: documentation goes to core, all tests go into the Demo, and the rest works through inclusion, etc. Example: TEST-4519 — Add cross-reconciliation of FL-IP to the Demo.



Examples of useful "how I tested this" comments that you come back to again (it's better to move them into a HOWTO, but leave them in the task as well): TEST-812 — Make the index rebuild non-blocking (where to put the breakpoint* to check).



* "Bryak" (Russian slang for breakpoint): a point in the code where execution stops. When you run the code in debug mode, such points help you track parameter values and localize the error.



P.S. See other testing articles in my blog.


