Running head: SOFTWARE TESTING AND DOCUMENTATION
Software Testing and Documentation
Name of the Student
Name of the University
Author Note
Question 1: Real software failure
In 2014, bar exam takers across the United States found themselves unable to upload their
exam files. This generated wide-scale panic over the possibility of missed uploading
deadlines. The incident originated from a failure in ExamSoft's Examplify application.
On the very first day of the bar exams, exam takers suffered the ordeal of large-scale
outages of the Examplify application portals built by ExamSoft, which caused exam delays in
multiple regions (Casey, Casey and Griffin 2018). Countless bar candidates were left
wide-eyed with worry as they tried to submit the essays they had written while Examplify
kept failing, preventing them from submitting their work at all. All of this began when
ExamSoft released a series of updates for the application in the days leading up to the
exams.
Hence, to prevent such software failures from paralyzing academic work, the conduct of
exams and the operations of associated businesses, it is of prime importance that software
testing be conducted on applications before any changes are rolled out live through
automatic updates. Accordingly, the application MTEST, which is to automate the grading of
student answers, needs to be tested as per the requirements identified.
Question 2a: Identification of Test Cases
Test case design for the marking and grading application MTEST has been conducted with the
help of multiple testing techniques (Mahmoud et al. 2019). These techniques are equivalence
class partitioning, boundary value analysis, error guessing and negative testing, each
applied with respect to the requirements of the application (Fögen and Lichter 2018). The
identified test cases, several of which are illustrated in the sketch after this list, are:
1. Verify error message for empty data in the title input record
2. Verify acceptance of valid title records
3. Verify error message for negative integers present in No. of questions column
4. Verify error message for value '0' in No. of questions column
5. Verify values within the range 1-999 are getting accepted
6. Verify error message for numbers more than 999 in columns 1-3
7. Verify error message if first record set is empty
8. Verify acceptance if one question present
9. Verify acceptance of second record for 51-100 questions
10. Verify acceptance of third record for 101-150 questions
11. Verify error message for columns 1-9 if empty
12. Verify acceptance of student data in columns 1-9
13. Verify error message if student answers exceed number of questions
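A minimal sketch of how a few of these equivalence class and negative test cases could be
automated in Python is given below. The validate_title_record() and
validate_question_count() functions are hypothetical stand-ins, since MTEST's actual
interface is not specified here; only the partitions themselves (empty versus non-empty
titles, and the valid range 1-999 with its invalid neighbours) come from the test list
above.

    import unittest


    def validate_title_record(title: str) -> bool:
        # Hypothetical validator: a title record must contain non-blank data.
        return len(title.strip()) > 0


    def validate_question_count(n: int) -> bool:
        # Hypothetical validator: MTEST accepts 1 to 999 questions per exam.
        return 1 <= n <= 999


    class EquivalenceClassTests(unittest.TestCase):
        def test_empty_title_rejected(self):          # test case 1
            self.assertFalse(validate_title_record(""))

        def test_valid_title_accepted(self):          # test case 2
            self.assertTrue(validate_title_record("Midterm Exam"))

        def test_negative_count_rejected(self):       # test case 3
            self.assertFalse(validate_question_count(-5))

        def test_counts_in_range_accepted(self):      # test case 5
            for n in (1, 500, 999):
                self.assertTrue(validate_question_count(n))

        def test_count_above_999_rejected(self):      # test case 6
            self.assertFalse(validate_question_count(1000))


    if __name__ == '__main__':
        unittest.main()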
Question 2b: Boundary Value Test Cases
Boundary value analysis refers to a software testing technique in which test cases are
designed for testing the boundary values of specified ranges (Abrahamsson et al. 2017). The
concept derives from the idea of boundaries: the values available for testing on the
extreme sides of a range, just inside and just outside its limits, represent the boundary
values.
Boundary value test cases for the grading application MTEST can include empty input
files, missing title records, one-question exams, 50-question exams, 51-question exams,
999-question exams and 0-question exams. The boundary values here are 0, 1, 50, 51, 100,
101 and so on in the same pattern up to 999 (Ahmed, Abdulsamad and Potrus 2015). This
follows from MTEST being designed to categorize questions into records of 50 each while
supporting the grading of exams with up to 999 questions.
The boundary value test cases in the test design include the following (a sketch of these
cases appears after this list):
Verify error message for value '0' in No. of questions column
Verify error message if first record set is empty
Verify acceptance if one question present
Verify error message for columns 1-9 if empty
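A minimal sketch of these boundary value cases as an automated check follows, reusing the
hypothetical validate_question_count() stand-in from the earlier sketch; the boundary pairs
(0, 1), (50, 51), (100, 101) and (999, 1000) follow from the 50-questions-per-record design
and the 999-question maximum stated above.

    import unittest


    def validate_question_count(n: int) -> bool:
        # Hypothetical validator: MTEST accepts 1 to 999 questions per exam.
        return 1 <= n <= 999


    class BoundaryValueTests(unittest.TestCase):
        def test_boundaries_of_question_count(self):
            # (input, expected acceptance) at each side of the boundaries.
            cases = [
                (0, False),     # zero questions: just below the valid range
                (1, True),      # one question: lowest valid value
                (50, True),     # exactly one full record of questions
                (51, True),     # first count needing a second record
                (100, True),    # exactly two full records
                (101, True),    # first count needing a third record
                (999, True),    # highest valid value
                (1000, False),  # just above the valid range
            ]
            for n, expected in cases:
                self.assertEqual(validate_question_count(n), expected,
                                 msg='question count %d' % n)


    if __name__ == '__main__':
        unittest.main()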
Question 2c: Error Guessing Test Cases
Error guessing refers to a software testing technique in which the tester anticipates
errors that might occur during execution of the code (Hooda and Chhillar 2015). In this
technique, the possible error-prone situations are enumerated, and the tester then designs
test cases to expose these errors. This is performed based on the experience, skill and
technique of the test analysts, who use their tacit understanding and prior work to
identify problematic parts of the application being tested. It requires a thorough
understanding of the system under test, along with evaluation of historical data and
related test results.
For the given marking and grading application MTEST, the error guessing test case to
perform on the application is the following (a sketch appears below):
Verify error message if student answers exceed number of questions
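A minimal sketch of this error guessing case in Python is given below. The
grade_student_record() function is a hypothetical stand-in for MTEST's grading routine,
stubbed to count answers only; the guessed error condition, a student record carrying more
answers than the exam has questions, comes from the test case above.

    import unittest


    def grade_student_record(answers: str, num_questions: int) -> int:
        # Hypothetical grader stub: rejects records with excess answers.
        if len(answers) > num_questions:
            raise ValueError('student record has more answers than questions')
        # Real grading would compare answers against the key; stubbed out here.
        return len(answers)


    class ErrorGuessingTests(unittest.TestCase):
        def test_excess_answers_rejected(self):
            # A 5-question exam, but the student record holds 6 answers.
            with self.assertRaises(ValueError):
                grade_student_record('ABCDAB', num_questions=5)


    if __name__ == '__main__':
        unittest.main()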
References
Abrahamsson, P., Salo, O., Ronkainen, J. and Warsta, J., 2017. Agile software development
methods: Review and analysis. arXiv preprint arXiv:1709.08439.
Ahmed, B.S., Abdulsamad, T.S. and Potrus, M.Y., 2015. Achievement of minimized
combinatorial test suite for configuration-aware software functional testing using the
cuckoo search algorithm. Information and Software Technology, 66, pp.13-29.
Casey, K., Casey, M. and Griffin, K., 2018. Academic integrity in the online environment:
Teaching strategies and software that encourage ethical behavior. Institute for Global
Business Research, Nashville, TN, USA, p.58.
Fögen, K. and Lichter, H., 2018, April. Combinatorial testing with constraints for negative
test cases. In 2018 IEEE International Conference on Software Testing, Verification and
Validation Workshops (ICSTW) (pp. 328-331). IEEE.
Hooda, I. and Chhillar, R.S., 2015. Software test process, testing types and techniques.
International Journal of Computer Applications, 111(13).
Mahmoud, A., Venkatagiri, R., Ahmed, K., Misailovic, S., Marinov, D., Fletcher, C.W. and
Adve, S.V., 2019. Minotaur: Adapting software testing techniques for hardware errors.