
Viva and Raw Data Verification

   

2017-2018
Faculty of Science, Engineering and Computing
Assessment Form

Module: CI 6310 User Experience
Title of Assignment: Usability Test Report
Submission details: see Hand-in of Reports (p1)

Module Learning Outcomes assessed in this piece of coursework

The learning outcomes for this piece of coursework are:
- Research user needs and the implications of technology for work practice
- Analyse users and their activities, and carry forward lessons learned
- Design input modalities, output media and interactive content to appeal to an audience
- Evaluate the quality of users' experience
- Reflect upon design practice and discuss the strengths and weaknesses of alternative techniques

The coursework is also an opportunity for you to develop as individuals, and for employment (though this is not assessed):
- Communicate and collaborate in a professional manner
- Work in an ethical, social and security-conscious manner

Assignment Brief and assessment criteria (these will be discussed within a formally timetabled class)

This piece of coursework is to conduct and report a usability test of an existing system. The usability test should be based on the CIF standard method and reporting format described in the lectures. Typically, students ask their friends and family to participate in a usability test, or play 'participant' for each other – no need to approach strangers! A handful of participants is usually sufficient to cover a range of user personas and identify a range of usability issues. I know you do not have enough time to test a representative sample adequate for statistical analysis, but go about your work and analyse the results as if you were going to test many participants eventually.

Example Structure: Target Max Word Length 3,500 words

2 Aims
The aims of this evaluation will be different to the aims of the whole project, and may have evolved since the project definition. The overall aim may be broken down into a number of evaluation objectives (the 'big questions' for your study to answer – see Problem Statements in workshop), and explained.

3 Study Method (include rationale for decisions made)
3.1 Experimental Design (e.g. one-shot, comparison, or repeated measures (for learnability))
3.2 Participants
3.3 Tasks
3.4 Metrics
3.5 Materials. Identify and give reasons for form. Include blank copy in Appendix.
3.6 Procedure
3.7 Expected Results

4 Evaluation Results
4.1 Written summary of overall findings
4.2 Performance Data (including Tables of Quantitative Data – mean times, ratings etc; see the sketch after this outline)
4.3 Usability Issues (including Table)
4.4 Redesign Recommendations. Identify and outline roughly the changes that will resolve the issues identified. May be included as a column in the Table used for 4.3.

5 Evaluation Discussion
(Including consideration of, for example, unrepresentative samples of users/tasks/contexts, confounding variables and biases, inaccurate or unreliable indicators.) Further, and related, evaluation studies.

Appendices
Materials (Invitation to Participate, Participant Information Sheet, Screener, Task Instruction Sheets, Observation sheets, Post-Test Questionnaire, any custom questions, protocols as required by any adaptation of CIF). An Optimal path, Screening script, Moderator-Participant protocol and List of Possible Errors are not requested, as they just 'sensitise' your observation – they are typically not included in research reports.
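The summary statistics asked for in section 4.2 (mean times, ratings, completion rates) can be tabulated by hand or scripted. The following is a minimal, purely illustrative Python sketch using hypothetical participant data; the participant labels, times, and task outcomes are invented for the example and do not come from the brief.

```python
from statistics import mean, stdev

# Hypothetical raw observations for a single task: time on task in seconds
# and whether the participant completed the task unassisted.
observations = [
    {"participant": "P1", "time_s": 74, "completed": True},
    {"participant": "P2", "time_s": 102, "completed": True},
    {"participant": "P3", "time_s": 88, "completed": False},
    {"participant": "P4", "time_s": 61, "completed": True},
]

times = [o["time_s"] for o in observations]
completion_rate = sum(o["completed"] for o in observations) / len(observations)

# These figures would populate the quantitative table in section 4.2.
print(f"Mean time on task: {mean(times):.1f} s (SD {stdev(times):.1f} s)")
print(f"Completion rate: {completion_rate:.0%}")
```

Structuring the raw data this way also supports the brief's advice to analyse results as if many more participants were to be tested later: the same script works unchanged on a larger sample.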