Show simple item record

dc.contributor.author: Ogao, Patrick
dc.contributor.author: Sensalire, Mariam
dc.contributor.author: Telea, Alexandru
dc.date.accessioned: 2022-06-27T12:41:10Z
dc.date.available: 2022-06-27T12:41:10Z
dc.date.issued: 2009
dc.identifier.uri: https://ieeexplore.ieee.org/document/5336431
dc.description.abstract: Many software visualization (SoftVis) tools are continuously being developed by both researchers and software development companies. To determine whether the developed tools are effective in helping their target users, it is desirable that they be exposed to a proper evaluation. Despite this, there is still a lack of general guidelines on how these evaluations should be carried out, and many tool developers perform very limited or no evaluation of their tools. Each person who carries out an evaluation, however, gains experience which, if shared, can guide future evaluators. This paper presents the lessons learned from evaluating over 20 SoftVis tools with over 90 users in five different studies spread over a period of more than two years. The lessons covered include the selection of tools, tasks, and evaluation participants. Other points discussed relate to the duration of the evaluation experiment, its location, the procedure followed when carrying out the experiment, and the motivation of the participants. Finally, an analysis of the lessons learned is presented in the hope that these lessons will be of some assistance to future SoftVis tool evaluators. [en_US]
dc.language.iso: en [en_US]
dc.publisher: IEEE [en_US]
dc.title: Evaluation of Software Visualization Tools [en_US]
dc.type: Article [en_US]


Files in this item

There are no files associated with this item.
