Running head: ITECH7409 SOFTWARE TESTING
ITECH7409 Software Testing
An Investigation of Software Development, Testing Standards, and Test Management Software
Name of the Student
Name of the University
Author Note
Table of Contents
Introduction
Responses to questions for following standards
    Standard 1
    Standard 2
Comparison of the Standards
Analysis of the Test Management Tool Features
Conclusion
References
Introduction
Almost every industry today uses software systems to carry out its operational activities in a better and easier way. Evaluating a developed software system is necessary in order to identify how feasible the proposed system is with regard to organizational growth and development. A standard can be described as a methodology established for identification, implementation, analysis, process validation, and software quality metrics (Henderson-Sellers et al., 2014). The following report discusses two standards, the “IEEE 829 software test documentation standard” and “AS/NZS ISO/IEC 25062:2006 Software engineering.” A comparison of these standards is also presented in this report. Test management tools are also available for evaluating developed software systems, and these are discussed in this paper as well.
Responses to questions for following standards
Standard 1:
1. Standard name
“IEEE 829-2008 Standard for Software and System Test Documentation”
2. Copyright for the standard
© 2008 by the Institute of Electrical and Electronics Engineers, Inc.
3. Involved Universities
Middle East Technical University, the Institute of Electrical and Electronics Engineers, Inc.
4. Scope or intent of the standard
This standard covers software and systems that are being developed, acquired, operated, maintained, and/or reused (Gonzalez-Perez et al., 2016). It identifies the processes and activities to be addressed when determining the correctness of the software and other attributes of the system.
5. Key terms and understandings
Integrity level: the standard defines four integrity levels that help describe the importance of the software and the software-based system to its users.
Minimum testing tasks for every integrity level: the standard defines the testing activities required at each of the four integrity levels. It also provides a table of optional testing tasks.
System viewpoint: the recommended minimum testing tasks take a system-level viewpoint, so that testing responds to the behaviour of the system as a whole.
Test document selection: the content topics and test documentation types should be chosen based on the tasks required for the selected integrity level.
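The pairing of integrity levels with minimum testing tasks can be sketched as a simple lookup. The level-to-task mapping below is purely illustrative and does not reproduce the standard's actual tables:

```python
# Hypothetical sketch: selecting minimum testing tasks by integrity level.
# The task lists below are illustrative, not the standard's actual tables.
INTEGRITY_LEVELS = {
    4: ["test plan", "test design", "test cases", "test procedures",
        "component testing", "integration testing", "system testing",
        "acceptance testing"],
    3: ["test plan", "test design", "test cases",
        "integration testing", "system testing", "acceptance testing"],
    2: ["test plan", "test cases", "system testing"],
    1: ["test plan", "system testing"],
}

def minimum_tasks(level: int) -> list[str]:
    """Return the minimum testing tasks required for a given integrity level."""
    if level not in INTEGRITY_LEVELS:
        raise ValueError(f"integrity level must be 1-4, got {level}")
    return INTEGRITY_LEVELS[level]
```

The intent of such a lookup is that a higher integrity level demands strictly more rigorous testing, so a project first classifies its software and then reads off the tasks it must perform.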
6. Application of the standard
The standard helps determine whether a given activity's development products conform to that activity's requirements, and whether the software and/or system satisfies its intended application and the needs of its users. It includes verification, demonstration, analysis, inspection, and validation of software and software-based system products (Ardis et al., 2015).
7. Specific relevance to software testing
It defines integrity levels ranging from low to high that can be used to determine the importance of the software and the software-based system. Minimum testing tasks are recommended for every integrity level, together with the rigor and intensity appropriate to those tasks. The selection of test documentation can also be tailored by introducing optional tasks.
Standard 2:
1. Standard name
AS/NZS ISO/IEC 25062:2006 Software engineering—Software product Quality Requirements and Evaluation (SQuaRE)—Common Industry Format (CIF) for usability test reports.
2. Copyright for the standard
© Standards Australia/Standards New Zealand
3. Involved Universities
Griffith University, the University of Queensland, the University of Auckland (NZ), the University of South Australia, the University of Technology Sydney, Charles Sturt University, and La Trobe University.
4. Scope or intent of the standard
The scope of this standard is to provide supplier organizations, customer organizations, and their usability professionals with a standard format for reporting software usability testing (Raulamo-Jurvanen, Kakkonen & Mäntylä, 2016). Such testing is a key factor in predicting the successful deployment of software.
5. Key terms and understandings
Informative: the term is used within the standard to indicate an annex that is provided for guidance and information only (O'Connor & Laporte, 2017).
CIF (Common Industry Format): this format is intended to be used by usability professionals to report the results of summative usability testing.
6. Application of the standard
This standard can reduce training time for usability staff, as a single individual can understand the entire reporting process (Matalonga, Rodrigues & Travassos, 2017). Communication between purchasing organizations and vendors can also be enhanced, since anyone reading a CIF-compliant report shares common expectations and a common language.
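The shape of a CIF-style report can be sketched as a simple data structure. The section names below follow the commonly cited CIF outline (a method section, and results broken down into effectiveness, efficiency, and satisfaction), but the authoritative headings and required content are defined by the standard itself:

```python
# Illustrative sketch of a CIF-style summative usability report skeleton.
# Section names follow the commonly cited CIF outline; consult the standard
# for the authoritative structure and required content.
CIF_REPORT_SKELETON = {
    "title_page": ["product name and version", "test dates", "report date"],
    "executive_summary": ["high-level results for non-specialist readers"],
    "introduction": ["product description", "test objectives"],
    "method": ["participants", "context of use", "tasks", "metrics"],
    "results": ["effectiveness", "efficiency", "satisfaction"],
    "appendices": ["questionnaires", "raw data"],
}

def missing_sections(report: dict) -> list[str]:
    """Return the skeleton sections absent from a draft report dict."""
    return [s for s in CIF_REPORT_SKELETON if s not in report]
```

A checklist function like this illustrates why a common format helps: both vendor and purchaser can verify mechanically that a report contains every expected section before it is exchanged.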
7. Specific relevance to software testing
Usability professionals can use this standard to report the summative usability testing of software and thereby analyse its feasibility. The standard also facilitates incorporating usability into the procurement decision-making process for interactive software products.
Comparison of the Standards
1. Besides focusing on testing processes, IEEE 829 also provides a clear documentation format for every task (Eira et al., 2018). AS/NZS ISO/IEC 25062:2006, by contrast, focuses on the documentation produced after usability tests.

2. IEEE 829 emphasizes both black-box and white-box testing techniques, which allows users to understand how the system works with or without in-depth knowledge of its internals. AS/NZS ISO/IEC 25062:2006 addresses the technical aspects of the software in order to determine whether the application meets its functional requirements, and it applies black-box testing techniques for evaluating the system.
Analysis of the Test Management Tool Features
Test Collab: despite being a modern test management tool, it is easy to understand and quick to set up. Many organizations have placed their trust in Test Collab, and over the last seven years it has become very popular because of its QA process support. It offers users state-of-the-art integration with test automation tools and bug trackers (Abrahamsson et al., 2017). Beyond these, it also offers agile methodology support, time tracking, test planning, requirement management, and scheduling. The release launched in April 2018 introduced new intelligence reports that mine all the cases and executions of a project to produce trends and interesting insights for the manager (Ahmed et al., 2016). Along with other reports, it offers a test suite heat map, burn-down charts, and a test failure rate distribution.
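The failure rate distribution report mentioned above can be illustrated with a short sketch. The execution-record format here is hypothetical, not Test Collab's actual data model:

```python
# Illustrative sketch of the kind of "failure rate distribution" a test
# management tool can derive from raw execution records. The record format
# (dicts with "suite" and "status" keys) is hypothetical.
from collections import defaultdict

def failure_rate_by_suite(executions: list[dict]) -> dict[str, float]:
    """Compute the fraction of failed runs per test suite."""
    totals = defaultdict(int)
    failures = defaultdict(int)
    for run in executions:
        totals[run["suite"]] += 1
        if run["status"] == "failed":
            failures[run["suite"]] += 1
    return {suite: failures[suite] / totals[suite] for suite in totals}
```

Aggregations of this kind are what let a manager spot trends (for example, one suite failing far more often than the rest) without reading individual execution logs.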
XQual: it delivers one of the most complete and advanced test management solutions at a reasonable price, allowing users to manage releases, requirements, risks, specifications, tests, campaigns, and bugs. It can drive any kind of test and can be integrated with all the major continuous integration platforms (Petersen, Vakkalanka & Kuzniarz, 2015). It currently provides five different interfaces for manual testing and approximately seventy connectors for building effective and efficient automation frameworks, including Selenium, JMeter, QTP/UFT, Ranorex, JUnit, TestOptimal, TestComplete, TestPartner, NUnit, QF-Test, Sahi, NeoLoad, Sikuli, Robot Framework, TestNG, SoapUI, Squish, and more (Rojas et al., 2018). It also includes a bug-tracking and internal requirement management module, which can be integrated with third-party bug-tracking and requirement-tracking systems such as Mantis, ClearQuest, JIRA, and others.
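The kind of automated test case such connectors drive can be sketched with a minimal unittest example. The function under test (parse_version) is hypothetical and not part of any listed tool's API; a CI platform or test management connector would simply collect the pass/fail results of a suite like this:

```python
# Minimal sketch of an automated test case of the kind a test management
# tool drives through its automation connectors. parse_version is a
# hypothetical function invented for this example.
import unittest

def parse_version(text: str) -> tuple:
    """Parse a 'major.minor.patch' version string into a tuple of ints."""
    parts = text.strip().split(".")
    if len(parts) != 3:
        raise ValueError(f"expected 'major.minor.patch', got {text!r}")
    return tuple(int(p) for p in parts)

class ParseVersionTest(unittest.TestCase):
    def test_valid_version(self):
        self.assertEqual(parse_version("2.10.3"), (2, 10, 3))

    def test_rejects_malformed_input(self):
        with self.assertRaises(ValueError):
            parse_version("2.10")
```

In practice the suite would be run by a test runner (for example `python -m unittest`) under the CI platform, and the management tool would record each case's outcome against its test plan.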
The following differences can help a user select one of the two approaches; they were gathered after logging into both test case management tools:
The Slant community ranked Test Collab in sixth position, whereas XQual is listed third, which makes XQual the more favourable option for users. XQual also offers a free community edition for four users who can perform 400 tests; Test Collab, for its part, has a free plan for its users.

Regarding the benefits of these tools, XQual provides fifty or more interface drivers covering virtually every automated tool available in the market (Fenton & Bieman, 2014). It also provides an SDK that allows users to develop their own drivers, for instance when a usability professional has a proprietary framework. Test Collab, on the other side, has bug tracker integration that gives users bi-directional integration for tracking bugs; it also provides test automation support that makes evaluation faster and easier. Bi-directional integration is not available in comparable software evaluation systems, which makes Test Collab favourable for individuals who need bug-free results and output. XQual has an innovative GUI (Graphical User Interface) that allows users to work with the complete product in an easier and more effective way (Amalfitano et al., 2015); incorporating so many processes can make a system messy, yet XQual's GUI keeps access efficient and easy. Test Collab can be either self-hosted or hosted through SaaS, which makes it flexible by letting users choose their hosting method: they can opt for either a cloud server service or a self-hosted plan to evaluate the software under test (Piet et al., 2017). Test Collab supports agile methodologies, whereas XQual does not support them with high precision. Test Collab also allows reusability and versioning of test cases.

Regarding the disadvantages, XQual offers integration with external systems in only one direction: users can submit bugs to be tracked by the external system, but changes made in the external system itself are not fed back into XQual (Navarro, Yvard & Van Dillen, 2018). Test Collab's disadvantage, on the other hand, is that it does not allow users to develop custom reports.
The common ground between the two test case management tools is that both allow users to evaluate software easily and effectively and to obtain precise, accurate results, which can then be used to enhance the efficiency and effectiveness of the software installed within an organization's existing system.
Conclusion
The above report explained two standards that must be considered while performing software testing in software engineering, so as to ensure that the testing performed yields feasible results. The properties of the two standards were explained by answering a few questions, and a comparison describing the differences and similarities between them was also presented. The above analysis of the standards can be adhered to while performing software testing, to make sure that the testing covers all the required areas and is performed in an efficient and effective way. The second part analysed the test case management tools available in the present market, which allow users to test software easily and effectively and to produce sophisticated outputs. XQual can be recommended as the best test management tool for evaluating software.
References
Abrahamsson, P., Salo, O., Ronkainen, J., & Warsta, J. (2017). Agile software development
methods: Review and analysis. arXiv preprint arXiv:1709.08439.
Ahmed, T. M., Bezemer, C. P., Chen, T. H., Hassan, A. E., & Shang, W. (2016, May). Studying
the effectiveness of application performance management (APM) tools for detecting
performance regressions for web applications: an experience report. In Proceedings of
the 13th International Conference on Mining Software Repositories (pp. 1-12). ACM.
Amalfitano, D., Fasolino, A. R., Tramontana, P., Ta, B. D., & Memon, A. M. (2015).
MobiGUITAR: Automated model-based testing of mobile apps. IEEE software, 32(5),
53-59.
Ardis, M. A., Budgen, D., Hislop, G. W., Offutt, J., Sebern, M. J., & Visser, W. (2015). SE
2014: Curriculum Guidelines for Undergraduate Degree Programs in Software
Engineering. IEEE Computer, 48(11), 106-109.
Eira, P., Guimarães, P., Melo, M., Brito, M. A., Silva, A., & Machado, R. J. (2018, April).
Tailoring ISO/IEC/IEEE 29119-3 Standard for Small and Medium-Sized Enterprises.
In 2018 IEEE International Conference on Software Testing, Verification and Validation
Workshops (ICSTW). IEEE.
Fenton, N., & Bieman, J. (2014). Software metrics: a rigorous and practical approach. CRC
press.
Gonzalez-Perez, C., Henderson-Sellers, B., McBride, T., Low, G. C., & Larrucea, X. (2016). An
Ontology for ISO software engineering standards: 2) Proof of concept and
application. Computer Standards & Interfaces, 48, 112-123.
Navarro, G., Yvard, P., & Van Dillen, E. (2018). A dedicated Quality approach to manage the
“small” software tools in control centers. In 2018 SpaceOps Conference (p. 2326).
O'Connor, R. V., & Laporte, C. Y. (2017). The evolution of the ISO/IEC 29110 set of standards
and guides. International Journal of Information Technologies and Systems Approach
(IJITSA), 10(1), 1-21.
Petersen, K., Vakkalanka, S., & Kuzniarz, L. (2015). Guidelines for conducting systematic
mapping studies in software engineering: An update. Information and Software
Technology, 64, 1-18.
Piet, G., Soma, K., Bonanomi, S., Laffargue, P., Nielsen, J. R., Notti, E., ... & Rijnsdorp, A.
(2017). Report Management Strategy Evaluation and performance test of the decision-
support tool (s).
Raulamo-Jurvanen, P., Kakkonen, K., & Mäntylä, M. (2016, November). Using Surveys and
Web-Scraping to Select Tools for Software Testing Consultancy. In International
Conference on Product-Focused Software Process Improvement (pp. 285-300). Springer,
Cham.
Rojas, E., Doriguzzi-Corin, R., Tamurejo, S., Beato, A., Schwabe, A., Phemius, K., & Guerrero,
C. (2018). Are we ready to drive software-defined networks? A comprehensive survey on
management tools and techniques. ACM Computing Surveys (CSUR), 51(2), 27.
Sánchez-Gordón, M. L., Colomo-Palacios, R., de Amescua Seco, A., & O'Connor, R. V. (2016).
The route to software process improvement in small-and medium-sized enterprises.
In Managing Software Process Evolution (pp. 109-136). Springer, Cham.