IMD11112 Design and Dialogue Report: Campus Garden Application
Summary
This report details the design and development of an interactive mobile application for the Napier Lion’s Gate project at the Merchiston campus. The application aims to enhance the visitor experience in the campus garden by integrating augmented reality (AR), the Internet of Things (IoT), and user experience (UX) design principles. The report explores the design approach, including the conceptual design, design language, icons and typography. It also includes online research on AR and VR technologies, along with the application's architecture, which is built on the Vuforia SDK. Furthermore, the report covers the AR experience, how IoT works, and the application's features, such as an interactive campus map, visitor scheduling, and plant information delivered through camera functionality and proximity sensors. Evaluation and testing methods, including participant-based evaluation and usability testing, are also discussed. The report concludes with potential future enhancements and provides references for further study.

Design and Dialogue
[Name of Student]
[Institution Affiliation]

Table of Contents
1 Introduction
2 Design Approaches
2.1 Conceptual Design
2.2 Design Language
2.2.1 Color
2.3 Icons
2.3.1 Typography
3 Understanding
3.1 Online Research
3.1.1 Marker-based system
3.1.2 Markerless system
3.2 The AR Experience
3.3 How IoT Works
3.4 How the Application Works
3.4.1 Requirements
3.4.2 Sensors
4 Environment
4.1 Story Boarding
5 Evaluation and Testing
5.1 Participant-Based Evaluation
5.2 List of Requirements and Problems
5.3 Usability Testing of the App
6 Future Enhancements
7 Conclusion
8 References

List of Tables
Table 1 Key requirements
Table 2 Sensors

List of Figures
Figure 1 Overall Design
Figure 2 Conceptual Design
Figure 3 Design Language
Figure 4 Color choice
Figure 5 Icons to Be Used
Figure 6 Design Typography
Figure 7 AR Experience
Figure 8 IoT In Action
Figure 9 App environment
Figure 10 AR and 3-D of the tree area

1 INTRODUCTION
The newly proposed project for the Napier Lion’s Gate includes a more user-friendly, more interactive mobile phone application geared towards giving campus visitors an engaging way to get to know the campus garden. To make this a reality, the smartphone application has been given a new set of requirements that include advanced technologies such as augmented reality (AR), virtual reality (VR) and the emerging field of the Internet of Things (IoT). Key features of the app include an interactive campus map, a facility for visitors to schedule their visits to the campus, and a 360-degree view of the campus buildings, infrastructure and garden [1]. Through the application, a user can display detailed information about plants in the vicinity and their corresponding species. This is made possible by the camera functionality built into the app and the inclusion of proximity sensors to correctly identify the plants [2].
The underlying architecture of the app is built on the Vuforia SDK, which is commonly used to build augmented reality applications for mobile devices. The app is secured by an interactive login screen where users such as students and campus staff can log in securely; once authorized, they are presented with a simplified version of the campus Moodle. In this Moodle view, users can see the various courses offered by the campus and the status of the various modules, and students can choose different options for taking modules on a trimester basis. Since AR and, above all, IoT form the bulk of the application, students and staff are each presented with a view customized to their needs [3].
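A minimal sketch of the role-based customization described above, assuming hypothetical names (UserRole, HomeView, homeViewFor) purely for illustration; the real app would derive the role from the Vuforia-backed login flow, which is not shown here.

```kotlin
// Hypothetical sketch: selecting a customized home view per user role after login.
// The type and function names are illustrative, not the app's actual API.
enum class UserRole { STUDENT, STAFF, GUEST }

data class HomeView(val title: String, val features: List<String>)

fun homeViewFor(role: UserRole): HomeView = when (role) {
    UserRole.STUDENT -> HomeView(
        title = "Student Moodle",
        features = listOf("Courses", "Module status", "Trimester module choices", "Garden AR map")
    )
    UserRole.STAFF -> HomeView(
        title = "Staff Dashboard",
        features = listOf("Module administration", "Visitor schedule", "Garden AR map")
    )
    UserRole.GUEST -> HomeView(
        title = "Visitor Guide",
        features = listOf("Interactive campus map", "Visit scheduling", "360-degree garden view")
    )
}

fun main() {
    // Prints the student-specific view definition as an example.
    println(homeViewFor(UserRole.STUDENT))
}
```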

Augmented reality has been a growing field in the technology industry. It comprises a computer-generated image superimposed on a view of a real-world object; the general effect is a composite view of the object [4]. This is made possible by processing data generated by the system, such as audio-visual, graphical and GIS data. Considerable research has been done in this field, and the overall effect is the presentation of a blend of the real object and computer-generated imagery that gives the user a better experience of real-life objects [5]. As a result, virtual and real objects exist in the same space within the application.
The IoT, on the other hand, is made up of devices and other information-gathering objects which are able to communicate with each other seamlessly and share data. This has extended the potential of IoT devices to numerous use cases [6].
The combination of the two technologies gives the app the ability to identify the various trees and tree species in the campus garden; a brief summary and the potential benefits of each plant are included in the augmented view. The information is presented to the user in a three-dimensional space to increase interactivity [7].
2 DESIGN APPROACHES
The design principle used in this app is to make AR part of the design from the ground up, so the software focuses mainly on AR. The additional communication features are implemented using IoT in the design [8]. The overall design of the app is as shown below.
Figure 1 Overall Design
2.1 CONCEPTUAL DESIGN
The conceptual design provides a broader outline of the app in terms of functionality and gives a more abstract, high-level view of the app [9]. The following diagram presents the conceptual design of the app, refined to its core concepts.
Figure 2 Conceptual Design
2.2 DESIGN LANGUAGE
The design language provides the overarching schema used to guide the design of the app. The key factors considered in the language are the ability to make the app more interactive in terms of usage and experience. To reflect this in the design, all the images and their accompanying text will be rendered in well-chosen HDR color. To ensure users can still use the app comfortably when outside the garden, the colors are designed to make the app appear distinct [10].
The basic principles of AR and IoT design languages are included in the design to make the app easier to navigate, improving user experience and usability. The following diagram shows the core of the design language [11].
Figure 3 Design Language
2.2.1 Color
The color of the app is important for enhancing usability. The app takes the approach of using the brand colors of the campus so that users get a feel of the campus when using the app [12].
A solid black color shall be used primarily for text in the app. The icons and emphasized words shall be designed in the color theme, and the colors chosen for AR graphics shall be those that display well on a smartphone screen. The overall effect is a real-life impression of the various objects in the garden, making users notice them while viewing the garden [13]. The gallery of images of the objects shall be in color rather than solid black to improve their look and feel. The color choice is as shown in the diagram below, with a short palette sketch after it.
Figure 4 Color choice
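As a rough illustration of how such a palette might be captured in code, the sketch below defines hypothetical named constants for the roles described above (brand color, body text, AR overlays); the hex values are placeholders, not the actual campus brand colors.

```kotlin
// Illustrative palette sketch; the hex values are placeholders, not the real campus brand colors.
object GardenPalette {
    val BRAND_PRIMARY: Int = 0xFFB31B1B.toInt()  // campus brand color (placeholder)
    val TEXT_PRIMARY: Int = 0xFF000000.toInt()   // solid black for body text
    val AR_HIGHLIGHT: Int = 0xFF2ECC71.toInt()   // high-visibility color for AR overlays
}

fun main() {
    // Print the palette as ARGB hex strings, e.g. for a style-guide dump.
    println("brand=%08X text=%08X ar=%08X".format(
        GardenPalette.BRAND_PRIMARY, GardenPalette.TEXT_PRIMARY, GardenPalette.AR_HIGHLIGHT))
}
```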
2.3 ICONS
Icons play a major role in any app development, since research has shown that users are more motivated by the messages icons convey. Icons let users rely on recognition rather than recall, which makes them ubiquitous in interface design. This has been included in the design language of the app [14].
The various icons will be strategically positioned on the app screen depending on the screen the user is currently viewing. The main menu shall include three well-positioned icons that are simple to recognize and robust enough even for guest accounts. The design language shall present different icons based on which user is logged in [15], with separate screens for students and general staff [16]. The following represents some sample icons to be used in the design language.
Figure 5 Icons to Be Used
2.3.1 Typography
Typography is key to styling the app and making it more appealing to users in terms of look and feel. It is therefore important to include it in the design language to foster more interaction with the app. The styled typefaces used in the app will reinforce the augmented reality objective by helping the objects appear life-like [17]. Sample typefaces are shown below.

Figure 6 Design Typography
3 UNDERSTANDING
The motivation for including AR and VR in the design language is that recent studies show people are increasingly interested in interaction between real life and virtual life. The research firm IDC has estimated that VR and AR shipments will potentially grow to around 70 million units by 2022, a compound annual growth rate of about 53% [18].
3.1 ONLINE RESEARCH
According to IDC research, demand for AR- and VR-based technologies will rise sharply this year compared with 2018. The firm forecasts that worldwide spending on AR and VR technologies, products and services will reach roughly $30 billion by 2019 [19], an increase of about 90% over 2018 spending. At this rate, annual AR and VR growth is expected to approach 70 percent, compounded annually, over a span of five years [20]. In the contemporary world, sound and vision have been the primary senses addressed by today's technologies; however, progress is being made to include more senses in VR and AR to heighten the sense of reality [21].
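To make the compounding claim concrete, here is a back-of-the-envelope sketch using only the forecast figures quoted above (roughly $30 billion in 2019, growing at about 70% per year); the projected years and rounding are illustrative arithmetic, not additional data from the report.

```kotlin
import kotlin.math.pow

// Back-of-the-envelope: what a ~70% compound annual growth rate implies for AR/VR spending,
// starting from the ~$30 billion 2019 forecast quoted in the text. Purely illustrative.
fun main() {
    val base2019 = 30.0  // billions of USD (from the text)
    val cagr = 0.70      // ~70% compound annual growth rate (from the text)
    for (yearsAhead in 0..4) {
        val projected = base2019 * (1 + cagr).pow(yearsAhead)
        println("Year ${2019 + yearsAhead}: roughly ${"%.0f".format(projected)} billion USD")
    }
}
```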
AR-based technologies generally fall into two categories: marker-based and markerless AR systems. The campus garden app is based on the markerless system. The two approaches are explained in the subsections below.
3.1.1 Marker-based system
A marker-based system uses symbols placed on real-life objects to act as the reference guide onto which the computer-generated images are composited [22].
3.1.2 Markerless system
A markerless system aggregates data from devices such as the accelerometer, GPS and compass which, used together, can identify the current position of a real-life object. The key data obtained includes the pointing direction, the axes and the vectors [23]. This location data is stored and mined to work out where the device is looking, so the system can search its database of graphics and select the one that matches the described location, allowing the corresponding computer-generated graphics to be rendered [24].
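To make the lookup step concrete, here is a minimal, hedged sketch of how a markerless app might match the device's GPS fix against a small database of garden plants using great-circle distance; the plant records, coordinates and the nearestPlant helper are illustrative assumptions, not the app's actual implementation.

```kotlin
import kotlin.math.*

// Hypothetical plant record with a geographic position (illustrative data, not the real garden database).
data class Plant(val name: String, val lat: Double, val lon: Double)

// Haversine great-circle distance in metres between two lat/lon points.
fun distanceMetres(lat1: Double, lon1: Double, lat2: Double, lon2: Double): Double {
    val r = 6_371_000.0
    val dLat = Math.toRadians(lat2 - lat1)
    val dLon = Math.toRadians(lon2 - lon1)
    val a = sin(dLat / 2).pow(2) +
            cos(Math.toRadians(lat1)) * cos(Math.toRadians(lat2)) * sin(dLon / 2).pow(2)
    return 2 * r * asin(sqrt(a))
}

// Pick the plant closest to the device's current GPS fix, if any lies within maxMetres.
fun nearestPlant(deviceLat: Double, deviceLon: Double, plants: List<Plant>, maxMetres: Double = 10.0): Plant? =
    plants.minByOrNull { distanceMetres(deviceLat, deviceLon, it.lat, it.lon) }
        ?.takeIf { distanceMetres(deviceLat, deviceLon, it.lat, it.lon) <= maxMetres }

fun main() {
    val garden = listOf(
        Plant("Scots Pine", 55.9335, -3.2135),
        Plant("Silver Birch", 55.9337, -3.2130)
    )
    // A GPS fix a few metres from the first record should resolve to "Scots Pine".
    println(nearestPlant(55.93351, -3.21352, garden)?.name)
}
```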
3.2 THE AR EXPERIENCE
The best way to illustrate the experience of dealing with AR is shown below.
Figure 7 AR Experience
3.3 HOW IOT WORKS
The IoT is a greenfield area that focuses on technologies connecting various devices into an information value-chain loop. The building blocks include, but are not limited to, networks, sensors, standards, augmented behaviors and augmented intelligence [25].
Figure 8 IoT In Action
More business models have emerged since the inception of the IoT idea, as physical and digital objects have converged to create better products. The underlying principle behind IoT is the amalgamation of hardware and software components into real-world objects. Contemporary real-world things gain additional use cases as IT is embedded in them; thanks to IoT technology, the services of real-world things can be accessed not only locally but also on a global scale [26].
From a technology point of view, integrating IT into things requires a multi-layered approach, stacking numerous hardware and software components into the thing. These advances come with their own business and technology risks, which have slowed the adoption of IoT devices. Particular concerns have been raised about the security threats posed by IoT devices, making this one of the most hotly discussed topics in the IT industry [27].
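As a hedged sketch of the device-to-app data sharing described here, the snippet below shows a garden sensor node publishing a reading over MQTT using the Eclipse Paho Java client (assumed to be on the classpath); the broker address, topic name and payload format are illustrative assumptions, not part of the report's specified architecture.

```kotlin
import org.eclipse.paho.client.mqttv3.MqttClient
import org.eclipse.paho.client.mqttv3.MqttMessage

// Illustrative only: a garden-side sensor node publishing a proximity/soil reading so that the
// mobile app (or a backend) can subscribe to it. Broker URL, topic and payload are placeholders.
fun main() {
    val client = MqttClient("tcp://broker.example.org:1883", MqttClient.generateClientId())
    client.connect()

    val payload = """{"plantId":"tree-07","proximityCm":42,"soilMoisture":0.31}"""
    client.publish("campus/garden/sensors", MqttMessage(payload.toByteArray()))

    client.disconnect()
}
```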
3.4 HOW THE APPLICATION WORKS
The application shall make good use of the sensor technologies available in modern smartphones to provide the AR experience.
3.4.1 Requirements
The key requirements for the app to function normally are shown in the table below.
Table 1 Key requirements
3.4.2 Sensors
Sensors will be key to providing the AR capabilities, as the smartphone app must be able to identify objects in the real world and relate them to virtual-world objects. The list of sensors required by the app is shown in the table below; a short sketch of how such sensors are typically accessed follows the table.
Table 2 Sensors
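A minimal sketch of how such sensors are typically accessed on Android, assuming a standard Activity or Service context; the specific sensor pair (rotation vector plus proximity) reflects the AR and nearby-plant needs described above rather than the exact contents of Table 2.

```kotlin
import android.content.Context
import android.hardware.Sensor
import android.hardware.SensorEvent
import android.hardware.SensorEventListener
import android.hardware.SensorManager

// Sketch: registering the rotation-vector and proximity sensors commonly used for
// markerless AR pose estimation and nearby-object detection. Runs inside an Activity/Service.
class GardenSensorReader(context: Context) : SensorEventListener {
    private val sensorManager = context.getSystemService(Context.SENSOR_SERVICE) as SensorManager

    fun start() {
        sensorManager.getDefaultSensor(Sensor.TYPE_ROTATION_VECTOR)?.let {
            sensorManager.registerListener(this, it, SensorManager.SENSOR_DELAY_GAME)
        }
        sensorManager.getDefaultSensor(Sensor.TYPE_PROXIMITY)?.let {
            sensorManager.registerListener(this, it, SensorManager.SENSOR_DELAY_NORMAL)
        }
    }

    fun stop() = sensorManager.unregisterListener(this)

    override fun onSensorChanged(event: SensorEvent) {
        when (event.sensor.type) {
            Sensor.TYPE_ROTATION_VECTOR -> { /* feed device orientation to the AR view */ }
            Sensor.TYPE_PROXIMITY -> { /* trigger a nearby-plant lookup */ }
        }
    }

    override fun onAccuracyChanged(sensor: Sensor, accuracy: Int) { /* not needed for this sketch */ }
}
```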
4 ENVIRONMENT
To better envision the design, it was important to understand the environment which the app will try to mimic. This environment is the campus garden, shown in the figure below.
Figure 9 App environment
4.1 STORY BOARDING
Storyboards are key to elaborating the features of the app and the various scenarios in which it will be used. The storyboard enhances the design by focusing on how the actual app will be developed, and the end result allows the design to be taken to the next phase of staging [28].
The following image shows a 3-D mock-up created in Windows 10 Paint. The aim of this sample is to give the design a feel for the intended AR and 3-D multimedia graphics. Using Microsoft Paint, the design produced an AR-style scan of a tree area of the garden. The figure below shows the AR and 3-D view of the tree area.
Figure 10 AR and 3-D of the tree area
5 EVALUATION AND TESTING
5.1 PARTICIPANT-BASED EVALUATION
The evaluation was done with two members of the university staff, who were given a brief introduction to the features and functionality of the application. The two participants were then asked conceptual questions about the app [29]. The objective of this participant-based evaluation was to gauge the level of interest the staff had in using the app. From the evaluation matrix, both participants were positive about the AR and VR technological advancements in the app and were eager to see it through to completion [30].
Through participant-based prototype evaluation, the developers were able to explore the app concept in terms of usability. The end result was the incorporation of user acceptance testing into the final stages of the app [31]. The evaluation helped identify problems that the design had overlooked during the earlier expert-based evaluation stage; this was made possible by considering the different design options proposed by the participants [32].
To better understand the issues raised by the participants in terms of user requirements, interviews were organized to collect data. This involved interviewing five students from Napier University and another two from Heriot-Watt. The interviews involved explaining the basic concept of the app and its background, and the students were shown the conceptual framework and design of the app to give them a basic understanding of the scenarios in which the app would be used [33].
5.2 LIST OF REQUIREMENTS AND PROBLEMS
i. User exploration of the garden should increase their interest in the garden.
ii. The app should have a minimalist design to improve usability and increase the time users spend on the app.
iii. The metadata about plants and trees, in terms of history and benefits, should be accurate.
iv. The user experience should be interesting, and users should show keen interest in using the app in the future.
v. The map of the campus should be accurate and preloaded into the app to reduce the chances of the user getting lost in the app.
vi. There should be standby IT support to help users with technical glitches in the app; the response time for logged issues should be no more than 10 hours.
vii. The support staff should include at least two technicians to give technical assistance.
viii. Security should be included in the design to mitigate the security challenges raised by the IoT components of the app and to ensure privacy is not violated while using the app.
5.3 USABILITY TESTING OF THE APP
The following usability testing parameters will be used to ensure the app is usable (a simple scoring sketch follows the list):
i. Understandability: the degree to which users can understand whether the software product suits their use case, how it can be used for that use case, and its terms and conditions of use.
ii. Learnability: the capability of the application to enable users to learn its use cases.
iii. Operability: the ability of the app to be controlled by the user while using it.
iv. Attractiveness: the degree to which the software product is aesthetically appealing.
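A minimal sketch of how these four parameters might be scored in a usability test session, assuming a 1-5 rating per parameter; the parameter names come from the list above, while the rating scheme and the sample values are illustrative assumptions, not collected results.

```kotlin
// Illustrative usability scoring: each participant rates the four parameters from the list above
// on a 1-5 scale; the session score is the mean across parameters and participants.
enum class UsabilityParameter { UNDERSTANDABILITY, LEARNABILITY, OPERABILITY, ATTRACTIVENESS }

data class Rating(val participant: String, val scores: Map<UsabilityParameter, Int>)

fun sessionScore(ratings: List<Rating>): Double =
    ratings.flatMap { it.scores.values }.average()

fun main() {
    val ratings = listOf(
        Rating("P1", mapOf(
            UsabilityParameter.UNDERSTANDABILITY to 4,
            UsabilityParameter.LEARNABILITY to 5,
            UsabilityParameter.OPERABILITY to 4,
            UsabilityParameter.ATTRACTIVENESS to 3)),
        Rating("P2", mapOf(
            UsabilityParameter.UNDERSTANDABILITY to 5,
            UsabilityParameter.LEARNABILITY to 4,
            UsabilityParameter.OPERABILITY to 4,
            UsabilityParameter.ATTRACTIVENESS to 4))
    )
    println("Overall usability score: ${"%.2f".format(sessionScore(ratings))} / 5")
}
```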
6 FUTURE ENHANCEMENTS
To advance the current design, a prototype of the application as currently designed needs to be developed. The application also has the potential to be fine-tuned for the new and upcoming field of XR (extended reality) [34]. The convergence of the different immersive technologies, when merged together, can help deploy a merged physical and virtual environment. The various realities, including VR, MR, AR and the now emerging XR, are continually blending into humans' day-to-day lives. XR encompasses all of the above realities; it normally means many things converged together [35], including 360-degree video media, AR, MR and VR. XR has been hitting the mainstream, and including it in our app could improve its usability considerably, since future realities would be easily integrated [36].
Advances in XR user experiences have shifted how storytelling is done and how marketers engage with potential customers. Although XR provides a virtual environment, it nevertheless provides real interactivity: it gives the user the ability to choose the path they wish to explore [37].
7 CONCLUSION
AR is becoming a more promising technology day by day as more users find uses for it in various scenarios. Imagine being able, through an app, to see more than normal users could see and hear more than ordinary people could hear, all from one tap in an app. Through the application, the university garden can be viewed from different angles, and more can be learned about the garden, increasing its exploration by students and other staff members. The campus map, together with the virtual voice assistant, is very useful compared with traditional plain maps, which are non-interactive and therefore never reach maximum usage. The preloaded 360-degree camera views enable the person using the app to better explore where students sit in the garden and the best spots to have a chat, all pre-built into the app.
Although the app promises real interactivity with virtual objects, there are still challenges and shortcomings that hinder it from achieving 100% usability. Working with 3-D images presents challenges in the AR world, mainly due to the hardware limitations of smartphones. This can, however, be worked around in the future by including XR in the future enhancements.
8 REFERENCES
[1] S. K. Ong and A. Y. C. Nee, Virtual and augmented reality applications in manufacturing.
Springer Science & Business Media, 2013.
[2] M. Billinghurst, A. Clark, and G. Lee, “A survey of augmented reality,” Foundations and
Trends® in Human–Computer Interaction, vol. 8, no. 2–3, pp. 73–272, 2015.
[3] D. W. F. Van Krevelen and R. Poelman, “A survey of augmented reality technologies,
applications and limitations,” International journal of virtual reality, vol. 9, no. 2, p. 1,
2010.
[4] D. Schmalstieg et al., “The studierstube augmented reality project,” Presence:
Teleoperators & Virtual Environments, vol. 11, no. 1, pp. 33–54, 2002.
[5] B. Thomas, V. Demczuk, W. Piekarski, D. Hepworth, and B. Gunther, “A wearable
computer system with augmented reality to support terrestrial navigation,” in Wearable
Computers, 1998. Digest of Papers. Second International Symposium on, 1998, pp. 168–
171.
[6] M. Dunleavy, C. Dede, and R. Mitchell, “Affordances and limitations of immersive
participatory augmented reality simulations for teaching and learning,” Journal of science
Education and Technology, vol. 18, no. 1, pp. 7–22, 2009.
[7] K.-H. Cheng and C.-C. Tsai, “Affordances of augmented reality in science learning:
Suggestions for future research,” Journal of Science Education and Technology, vol. 22,
no. 4, pp. 449–462, 2013.
[8] V. Vlahakis et al., “Archeoguide: an augmented reality guide for archaeological sites,”
IEEE Computer Graphics and Applications, vol. 22, no. 5, pp. 52–60, 2002.
[9] B. Thomas et al., “ARQuake: An outdoor/indoor augmented reality first person
application,” in Wearable computers, the fourth international symposium on, 2000, pp.
139–146.
[10] B. B. Bederson, “Audio augmented reality: a prototype automated tour guide,” in
Conference companion on Human factors in computing systems, 1995, pp. 210–211.
[11] E. FitzGerald, R. Ferguson, A. Adams, M. Gaved, Y. Mor, and R. Thomas, “Augmented
reality and mobile learning: the state of the art,” International Journal of Mobile and
Blended Learning (IJMBL), vol. 5, no. 4, pp. 43–58, 2013.
[12] M. Billinghurst, H. Kato, and I. Poupyrev, “The magicbook-moving seamlessly between
reality and virtuality,” IEEE Computer Graphics and applications, vol. 21, no. 3, pp. 6–8,
2001.
[13] A. Y. Nee, S. K. Ong, G. Chryssolouris, and D. Mourtzis, “Augmented reality applications
in design and manufacturing,” CIRP Annals-manufacturing technology, vol. 61, no. 2, pp.
657–679, 2012.
[14] L.-M. Su, B. P. Vagvolgyi, R. Agarwal, C. E. Reiley, R. H. Taylor, and G. D. Hager,
“Augmented reality during robot-assisted laparoscopic partial nephrectomy: toward real-
time 3D-CT to stereoscopic video registration,” Urology, vol. 73, no. 4, pp. 896–900, 2009.
[15] B. Bilbrey, N. V. King, and A. Pance, “Synchronized, interactive augmented reality
displays for multifunction devices,” Mar-2013.
[16] K. Lee, “Augmented reality in education and training,” TechTrends, vol. 56, no. 2, pp. 13–
21, 2012.
[17] M. Kesim and Y. Ozarslan, “Augmented reality in education: current technologies and the
potential for education,” Procedia-Social and Behavioral Sciences, vol. 47, pp. 297–302,
2012.
[18] S. Nicolau, L. Soler, D. Mutter, and J. Marescaux, “Augmented reality in laparoscopic
surgical oncology,” Surgical oncology, vol. 20, no. 3, pp. 189–201, 2011.
[19] A. I. Comport, E. Marchand, M. Pressigout, and F. Chaumette, “Real-time markerless
tracking for augmented reality: the virtual visual servoing framework,” IEEE Transactions
on visualization and computer graphics, vol. 12, no. 4, pp. 615–628, 2006.
[20] M. Billinghurst and A. Duenser, “Augmented reality in the classroom,” Computer, vol. 45,
no. 7, pp. 56–63, 2012.
[21] M. Graham, M. Zook, and A. Boulton, “Augmented reality in urban places: contested
content and the duplicity of code,” Transactions of the Institute of British Geographers, vol.
38, no. 3, pp. 464–479, 2013.
[22] M. Dunleavy and C. Dede, “Augmented reality teaching and learning,” in Handbook of
research on educational communications and technology, Springer, 2014, pp. 735–745.
[23] D. Wagner, G. Reitmayr, A. Mulloni, T. Drummond, and D. Schmalstieg, “Real-time
detection and tracking for augmented reality on mobile phones,” IEEE transactions on
visualization and computer graphics, vol. 16, no. 3, pp. 355–368, 2010.
[24] J. Carmigniani, B. Furht, M. Anisetti, P. Ceravolo, E. Damiani, and M. Ivkovic,
“Augmented reality technologies, systems and applications,” Multimedia tools and
applications, vol. 51, no. 1, pp. 341–377, 2011.
[25] J. Bacca, S. Baldiris, R. Fabregat, and S. Graf, “Augmented reality trends in education: a
systematic review of research and applications,” 2014.
[26] S. C.-Y. Yuen, G. Yaoyuneyong, and E. Johnson, “Augmented reality: An overview and
five directions for AR in education,” Journal of Educational Technology Development and
Exchange (JETDE), vol. 4, no. 1, p. 11, 2011.
[27] A. Tang, C. Owen, F. Biocca, and W. Mou, “Comparative effectiveness of augmented
reality in object assembly,” in Proceedings of the SIGCHI conference on Human factors in
computing systems, 2003, pp. 73–80.
[28] H.-K. Wu, S. W.-Y. Lee, H.-Y. Chang, and J.-C. Liang, “Current status, opportunities and
challenges of augmented reality in education,” Computers & education, vol. 62, pp. 41–49,
2013.
[29] M. Golparvar-Fard, F. Peña-Mora, and S. Savarese, “D4AR–a 4-dimensional augmented
reality model for automating construction progress monitoring data collection, processing
and communication,” Journal of information technology in construction, vol. 14, no. 13,
pp. 129–153, 2009.
[30] A. M. Kamarainen et al., “EcoMOBILE: Integrating augmented reality and probeware with
environmental education field trips,” Computers & Education, vol. 68, pp. 545–556, 2013.
[31] D. S. Mallinson and R. L. Marks, “Portable augmented reality device and method,” Oct-
2013.
[32] S. J. Henderson and S. Feiner, “Evaluating the benefits of augmented reality for task
localization in maintenance of an armored personnel carrier turret,” 2009.
[33] W. Barfield, Fundamentals of wearable computers and augmented reality. CRC Press,
2015.
[34] B. Furht, Handbook of augmented reality. Springer Science & Business Media, 2011.
[35] G. Schall et al., “Handheld augmented reality for underground infrastructure visualization,”
Personal and ubiquitous computing, vol. 13, no. 4, pp. 281–291, 2009.
[36] Á. Di Serio, M. B. Ibáñez, and C. D. Kloos, “Impact of an augmented reality system on
students’ motivation for a visual art course,” Computers & Education, vol. 68, pp. 586–596,
2013.
[37] E. Kruijff, J. E. Swan, and S. Feiner, “Perceptual issues in augmented reality revisited,” in
Mixed and Augmented Reality (ISMAR), 2010 9th IEEE International Symposium on, 2010,
pp. 3–12.