Dissertation: Watermarking Techniques for Cloud Data Security Analysis

AI Summary
This dissertation delves into the application and effectiveness of watermarking techniques in cloud computing to enhance data security. It begins with an introduction to cloud computing, outlining its framework, benefits, and the significance of data security. The literature review explores cloud computing frameworks, watermarking techniques and their advantages for businesses, and the security and privacy issues associated with cloud computing. The research methodology section details the approach used to investigate the effectiveness of watermarking. The results, synthesis, and findings chapters present an analysis of the research, discussing the benefits of watermarking, the effectiveness of various techniques, and the implications for cloud data security. The study concludes with recommendations for future research and practical applications of watermarking in cloud environments, emphasizing the importance of protecting data in the cloud. The study aims to contribute to the understanding of cloud computing security.
DISSERTATION
(Application and effectiveness of watermarking technique in cloud computing to
enhance cloud data security)
TABLE OF CONTENTS
CHAPTER 1: INTRODUCTION ...................................................................................................1
Background..................................................................................................................................1
Aim and Objectives......................................................................................................................2
Rationale......................................................................................................................................3
Significance..................................................................................................................................3
Research specification.................................................................................................................4
Gantt Chart...................................................................................................................................4
CHAPTER 2: LITERATURE REVIEW .......................................................................................5
2.1 Cloud computing framework ................................................................................................5
2.2 Watermarking techniques and their benefits for business ....................................................8
2.3 Security and data privacy issues related to cloud computing .............................................12
CHAPTER 3: RESEARCH METHODOLOGY...........................................................................16
CHAPTER 4: RESULTS AND SYNTHESIS..................................................................................26
2.1 Cloud computing framework...............................................................................................26
2.2 Watermarking techniques and their benefits for business...................................................29
2.3 Security and data privacy issues related to cloud computing..............................................31
FINDINGS AND RESULT...........................................................................................................33
CONCLUSION..............................................................................................................................36
RECOMMENDATIONS...................................................................................................................37
REFERENCES..............................................................................................................................38
CHAPTER 1: INTRODUCTION
Background
Cloud computing refers to the delivery of computing services over the internet. It includes servers, storage, databases, networking, software, analytics and intelligence, and it supports faster innovation, flexible resources that can be scaled up or down at short notice, and a pay-for-what-you-use model. The cloud is built to provide services that minimise the negative effects of internet use and to support innovation: because users pay only for the services they actually consume, only the services required are offered and exact usage can be measured.
Cloud computing combines the hardware and software used to provide services over the internet. It allows data to be stored on, and accessed from, the internet using any internet-enabled device, and it offers a safe and secure way of storing data (A SURVEY ON WATERMARKING METHODS FOR SECURITY OF CLOUD DATA, 2016). Familiar examples include services such as Google's Gmail and Facebook, where anyone with valid login credentials can access their data. Cloud computing provides greater data storage, improved performance, low maintenance, increased capacity, stronger data security and straightforward backup and recovery of data.
A newer way to store data and files on the cloud even more securely is the watermarking technique, which is effective and easy to apply. Data security is a central concern because data can be copied, altered or transmitted easily, so applying a watermark when data is stored helps protect it. Digital watermarking protects digital data such as text, images, audio and video, and it developed out of steganography (To Propose A Novel Technique for Watermarking In Cloud Computing, 2015). It involves embedding information, the digital watermark, into a file; the raw file is stored together with the watermark, producing a digital watermarked file. Digital watermarking offers several advantages for improving data security in cloud computing. It provides copyright protection: copyrighted information can be embedded as a watermark in the product, and if a copyright dispute arises the watermark serves as evidence on the owner's behalf (A Cloud-Based Watermarking Method for Health Data and Image Security, 2014).
Watermarking also supports authentication. When an image or other content is stored and an attacker attempts to tamper with it, the attempt fails because the embedded digital signature protects the content from modification; access demands authentication and the content remains highly secure. The technique further provides imperceptibility, meaning invisibility: an image may carry a watermark that a viewer cannot identify, so it looks like the original image and an unauthorised person cannot make use of it (An Efficient Approach for Security of Cloud Using Watermarking Technique, 2013). This is very secure from a data-protection point of view because the watermark is known only to the user; nobody else can identify another user's watermark or retrieve their files from the cloud.
Aim and Objectives
Aim
“To describe application and effectiveness of watermarking technique in cloud
computing to enhance cloud data security”
Objectives
To understand conceptual framework of cloud computing
To describe concept of watermarking technique and its benefits for business
To investigate issues related to cloud computing and data security
Problem statement:
This study needs to evaluate how effective watermarking techniques are at securing data in cloud computing. Watermarks are not completely secure: there is a risk of data theft, since data can be recovered and then modified with relatively simple steps, and such unauthorised modification is a major problem for cloud computing. There have also been many reported cases of outsourced data being passed on in return for money, and data placed in the cloud can be used by the host company itself, which is a threat to users. The research therefore asks:
How can the conceptual framework of cloud computing be understood?
What is the concept of watermarking techniques and what are their benefits for business?
How can issues related to cloud computing and data security be evaluated?
Statement of Purpose:
'This study aims to describe the applications and effectiveness of the watermarking technique in cloud computing to enhance cloud data security.'
The study reflects on these applications and their effectiveness, using the evidence gathered to assess how the security and services of cloud computing can be improved. The ways in which the work is evaluated are matched to this need, so the study sets out the effectiveness of watermarking, its benefits and its use in cloud computing.
Rationale
The main reason for selecting the topic of the application and effectiveness of the watermarking technique in cloud computing to enhance cloud data security is the researcher's interest in it. The researcher has studied computer science, where this is an essential and very broad topic, which is why it was chosen for the research. In addition, the scholar has previously worked with elements of cloud computing and therefore has sound knowledge of the area, and is well aware of data security issues, which is why they are considered in the present investigation (Kumar and Gupta, 2018).
Significance
This is an important topic because, by gathering information about the effectiveness of the watermarking technique, companies will be better able to control data security issues. The investigation will also benefit other scholars, who will gain study material that can be used in future research (Rhazlane et al., 2017). Businesses have limited resources and aim to lower their running costs, and the study reflects on the use of cloud computing and its benefits with that purpose in mind. It focuses on the security methods used to protect cloud data and on fundamental ways of evaluating those methods. Various watermarking methods for cloud computing are examined, and digital watermarking is considered as a means of improving the security and safety of cloud data. Watermarking is evaluated as a way to strengthen data security services on the cloud. The work will help researchers assess the security services offered in cloud computing and the protection that watermarking provides, supporting further research; it can be used by organisations that work with cloud data, and it will be particularly useful in making small businesses aware of data security and how to develop it.
Research specification
The scholar will use the interpretivism philosophy to conduct the investigation into the application and effectiveness of the watermarking technique in cloud computing to enhance cloud data security. The researcher will gather all details related to the watermarking technique from secondary sources and will analyse data from books, journals and other relevant articles on watermarking. In addition, the individual will take care of ethical aspects and will conduct the entire investigation by following ethical guidelines properly (Uma and Sumathi, 2017). The researcher will ensure that the details used in the investigation are handled appropriately, so relevant theories will be taken into consideration. An inductive approach will be followed to answer the research questions about the effectiveness of the watermarking technique for data security.
Gantt Chart
The research activities are planned across eight weeks (W1 to W8) and cover the following tasks:
Selection of the research topic
Reviewing previous literature
Selection of the appropriate research method
Data collection from secondary sources such as books and journals
Analysis of data to answer the research question
Drawing conclusions and giving recommendations
Submission of the research report
Editing and final submission
CHAPTER 2: LITERATURE REVIEW
2.1 Cloud computing framework
According to Rittinghouse and Ransome (2017), the cloud is defined as the integration of various hardware and software components that work together to provide computing services to end users online. These services are delivered via the internet and allow users to share and access applications from remote devices. Given the wide operational reach of business organisations, it is essential that service providers can access and share data across different locations or servers. With conventional computing, data stored on local hard drives cannot be accessed from remote locations, which affects the continuity and efficiency of business operations. Cloud-based data, by contrast, is stored on virtual or physical servers hosted by a third party or another service provider, so it can be accessed over any network or internet connection.
In the same context, Botta et al. (2016) stated that cloud services also offer different degrees of accessibility by providing separate frameworks for public and private clouds. A public cloud follows the standard cloud framework, in which data, files, storage, services and applications are accessible to the public via the internet. In a private cloud, by contrast, these applications and data are protected and operated within the corporate firewall, so control over access lies entirely with a fixed authority or the organisation's information technology department. The cost efficiency of cloud computing services is making them highly popular and in demand for business applications. Cloud-based hardware services allow an organisation to use the service provider's equipment, such as hardware devices, networking and storage components and servers, eliminating the need for large investments in equipment. At the same time, cloud software services are hosted by the service provider and accessed via the network, so business organisations gain substantial benefits in terms of maintenance and deployment costs.
In the view of Almorsy, Grundy and Müller (2016), cloud services are safer and more accessible for storing and sharing data than other computing methods. As businesses expand across the world, organisations also require multiple copies of their data so that interrelated global operations can be carried out without difficulty. Cloud services allow an organisation to store its data on servers situated in different geographical locations, protected by independent power supplies. Cloud services are characterised by on-demand self-service, ubiquitous network access, rapid elasticity and pay-per-use pricing. The storage and data-processing needs of organisations are also served by a common, shared infrastructure in which no particular user is assigned specific resources.
According to Gai et al. (2016), cloud computing frameworks consist of front-end platforms such as mobile devices and clients, cloud-based delivery, networks such as an intranet, the internet or an inter-cloud, and back-end platforms such as storage and servers. Several service frameworks are available, including SaaS, Data as a Service (DaaS), Development as a Service, PaaS and IaaS.
Software as a Service (SaaS): In this framework the software is installed and maintained by the service provider in the cloud, while users run it over the internet or an intranet. Ali, Khan and Vasilakos (2015) stated that with SaaS, client machines do not require application-specific software to be installed because the applications are executed in the cloud. The framework is scalable, so business organisations can easily manage their ever-increasing data by loading the application onto additional servers. It usually involves annual or monthly usage charges.
Data as a Service (DaaS): In DaaS, cloud data can be accessed through well-defined API layers. Data built on this framework is provided to users on demand, irrespective of the organisational separation between consumer and service provider or of geographical distance. Without DaaS, the combination of customer data packages, software and EAI middleware increases the software burden on the organisation for managing each particular type of data, together with regular upgrade and maintenance costs and software updates whenever the data format changes. DaaS operates on the premise that data quality is managed in a central place, which brings several advantages such as cost effectiveness, agility and data quality. An important feature of this framework is that it allows users to move quickly, since service users can access the data without needing in-depth knowledge of the underlying data types. It also permits necessary changes to the location or structure of the data according to users' needs. Because there is a single point of update, data access is managed and controlled through data services, which in turn provide high-quality data.
Development as a Service (DaaS): According to Messerli et al. (2017), this cloud framework is web-based and can be referred to as community-shared development tooling. It is a stand-alone model in which integrated development tools are provided along with the runtime environment for developing applications.
Infrastructure as a Service (IaaS): In this framework the physical hardware is owned and run by the provider, while servers, storage, system management and networking are delivered to the user virtually. The annual or monthly subscription paid by service users for running these virtual components removes the need for a data centre, cooling and other hardware maintenance at the local level.
Platform as a Service (PaaS): This framework allows business organisations or users to consume a database and application platform as a service. In a similar context, Xia et al. (2016) explained that PaaS provides a complete platform, including application and interface development, storage and database development, delivered to users through a remotely hosted service platform. It therefore gives organisations the ability to develop enterprise-class services for local use and for on-demand delivery, free of charge or at considerably lower prices.
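To make the storage-oriented service models above more concrete, the following minimal sketch shows how an application might store and retrieve an object in a cloud object store through a provider's API. It assumes the AWS SDK for Python (boto3) and a hypothetical bucket name; these are illustrative choices, not part of the dissertation's own material.

```python
# Minimal sketch: storing and retrieving data through a cloud storage API.
# Assumes boto3 (the AWS SDK for Python) is installed and credentials are
# configured; the bucket name is a hypothetical placeholder.
import boto3

s3 = boto3.client("s3")
BUCKET = "example-research-bucket"  # hypothetical bucket name

# Upload a small document to the cloud; the provider handles the replication,
# durability and remote availability described in the service models above.
s3.put_object(Bucket=BUCKET, Key="reports/q1.txt", Body=b"quarterly report data")

# Download the same object from any internet-connected client.
response = s3.get_object(Bucket=BUCKET, Key="reports/q1.txt")
print(response["Body"].read())
```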
According to Yang et al. (2017), cloud computing technology consists of two key elements: cloud virtualisation and service-oriented architecture (SOA).
Cloud virtualisation: This is one of the most important features of a cloud system, enabling efficient delivery of cloud services. When virtual computing resources are deployed in the cloud they mimic the functions of physical computing resources. Virtualisation provides, for instance, flexible load-balancing management, allowing computational services to be adjusted according to demand. It also allows businesses to achieve a high degree of scalability, reliability and availability in their access to cloud or information technology resources, and in failover support for disaster recovery it serves as a critical element of cloud services.
SOA architecture: In the view of Stergiou et al. (2018), SOA is vital because it permits organisations to access computing solutions that can be improved on demand or as the business environment changes. SOA encourages independent web-based services that interact with each other over the internet, offering real-time communication and the flexibility that makes cloud-based systems and services easy and quick to reconfigure. This architecture also places development, maintenance and deployment costs on the web service providers, which results in permissive, easy access to web services at low cost. Its support for centralised distribution and reuse of components makes SOA one of the most powerful elements of cloud computing, and this characteristic also drives down the cost of software delivery and development.
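As a small illustration of the service-oriented style described above, the sketch below shows one service consuming another over HTTP using only the Python standard library. The endpoint URL is an assumed placeholder, not a real service referenced by this study.

```python
# Minimal sketch: one component consuming an independent web service, as in SOA.
# The endpoint is a hypothetical placeholder.
import json
import urllib.request

ENDPOINT = "https://api.example.com/storage/status"  # assumed URL

with urllib.request.urlopen(ENDPOINT) as resp:
    payload = json.loads(resp.read().decode("utf-8"))

# The consumer reacts to the response without knowing how the provider
# implements the capability, which is what keeps SOA loosely coupled.
print(payload)
```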
Contrary to the above discussion, Marinescu (2017) stated that the suitability of cloud computing techniques is also critically affected by several challenges. Apart from data security and privacy concerns, the performance of cloud computing technology and frameworks is influenced by bandwidth cost and performance. Organisations can reduce costs by saving on the acquisition of new systems, database management and other maintenance tasks, and although the bandwidth issue is not significant in small applications, it greatly affects performance in data-intensive ones. Sufficient bandwidth is therefore needed so that application timeouts and latency can be kept under control and data-intensive services can be delivered and received. Another challenge in adopting a cloud system is the integration of existing infrastructure with the cloud: the added benefits disappear when discrete cloud services within an organisation cannot achieve the outcomes expected of a well-integrated environment.
2.2 Watermarking techniques and their benefits for business
In the view of Minerva M. Yeung (2016), watermarking is a technique used to hide digital data within a signal in order to establish ownership or copyright over that signal. Watermarking hides information in a carrier signal, but the hidden information does not need to have any relation to the carrier signal itself. Digital watermarking is used to verify the authenticity or integrity of the host signal and to identify its owner, and it can be used to confirm the authentication of any digital system employed in cloud computing. Like traditional watermarking, digital watermarking establishes the authenticity of data hosted in the digital domain. It operates under defined conditions: an algorithm determines how the watermark is embedded and how the authenticity of the data is later verified. If the watermark distorts the signal to the point where it becomes easily perceivable, it may be judged effective or ineffective depending on the purpose. Traditional watermarking was applied to media such as images and video, but with the evolution of technology digital watermarking has come to cover a wider range of objects and techniques for digitisation and data protection. Metadata is added to the digital signal while the size of the embedded information remains fixed.
How digital watermarking varies depends on the cases in which it is applied. Watermarks must be robust and flexible enough to survive processing of the carrier signal and to adapt to changes as easily and quickly as possible. Watermarking is the outcome of extensive research and can be used as both a technical and a commercial tool, helping to reduce publicity disputes and controversy. The internet supports the growth of social and electronic publishing and advertising, and real-time information use, product ordering and many other digital processes need to be channelled and made secure. Watermarking helps these tools operate safely and reliably, ensuring that data remains under control and is used in a secure manner.
Contrary to the above discussion, Marwan, Kartit and Ouahmane (2018) note that the abundant opportunities created by watermarking have also given rise to challenges. As watermarking has developed, it has helped resolve various issues, including those related to socio-economic policy and the management of personal data. With watermarking, authentication and legal rights are given only to the authority permitted to access the data, so this approach is safer and supports the development of sound socio-economic policies.
In accordance with Nasrin M. Makbol and Bee Ee Kho (2018), singular value decomposition (SVD) is an important mathematical tool that is useful in many operations. It can help businesses engaged in online marketing because it preserves the structure of the data while a watermark is embedded, and it also makes it possible to address the false positive problem (FPP) that affects the robustness of SVD-based schemes. Satisfying the robustness and imperceptibility requirements while preventing FPPs is a crucial part of SVD-based watermarking, which can then be used for copyright protection and for authenticating personal data so that it cannot be accessed without the administrator's permission. SVD-based algorithms that are prone to the false positive problem can be analysed to detect the flaws in the data that give rise to it.
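A minimal sketch of the SVD-based embedding idea discussed above is given below. It perturbs the singular values of a host image block with a scaled version of the watermark's singular values, using NumPy; this is only an illustration of the general approach, not the specific algorithm of the cited authors.

```python
# Minimal sketch of SVD-based watermark embedding (illustrative only).
import numpy as np

def embed_svd_watermark(host: np.ndarray, watermark: np.ndarray, alpha: float = 0.05) -> np.ndarray:
    """Embed a watermark by perturbing the singular values of the host matrix."""
    U, S, Vt = np.linalg.svd(host.astype(float), full_matrices=False)
    # Add a scaled version of the watermark's singular values to the host's.
    Sw = np.linalg.svd(watermark.astype(float), compute_uv=False)
    S_marked = S + alpha * Sw[: S.size]
    return U @ np.diag(S_marked) @ Vt

host = np.random.randint(0, 256, (64, 64))
mark = np.random.randint(0, 2, (64, 64))
watermarked = embed_svd_watermark(host, mark)
# Imperceptibility check: the watermarked block stays close to the host.
print(np.abs(watermarked - host).mean())
```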
In the view of Upasana Yadav (2017), digital watermarking can be classified on the following bases.
Robustness:
Fragile watermarking: used for integrity protection; it is sensitive to change and signals an alert whenever the data or the signal is altered. Fragile watermarking makes it easy to establish whether the original data has been changed.
Semi-fragile watermarking: tolerates some degree of change to the watermarked image, such as the addition of noise.
Robust watermarking: designed to withstand noise, geometric and non-geometric attacks without the watermark being altered, so that the watermark is not destroyed and remains secure.
According to perceptibility:
Visible watermarking: the watermark is visible in the digital data, for example the channel logos shown in the corner of a television screen.
Invisible watermarking: secret information is inserted into digital content without being visible in the images or content; it must be extracted through dedicated processes to recover the information.
According to the attached digital signal:
Image watermarking: the watermark is embedded in a digital image.
Audio watermarking: the watermark is embedded in an audio signal, using the relevant audio systems.
Video watermarking: the watermark is embedded in video data and can support real-time extraction.
According to the task performed:
Data and integrity watermarking: keeps the content exactly as it was in its initial state and so rules out lossy compression.
Copyright protection watermarking: a watermark identifying the owner is added to the image and remains present even after attacks, providing proof of authenticity for the original data.
Anti-counterfeiting watermarking: the mark can be seen in the image after the watermark is added and still exists even if the content is attacked, allowing counterfeit copies to be detected.
According to the domain type:
Spatial watermarking: this domain modifies a randomly chosen subset of the image, embedding the watermark directly in the pixel values. Algorithms used in this domain include LSB (least significant bit) and SSM modulation-based techniques.
Transform watermarking: also called the frequency domain; frequency coefficients are changed relative to their initial values. DCT, DWT and DFT are the commonly used transforms, and the computational complexity of this approach is higher.
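To illustrate the spatial-domain (LSB) approach listed above, the following sketch hides a short bit string in the least significant bits of an image array using NumPy. It is a simplified, assumed example rather than a production watermarking scheme.

```python
# Minimal sketch of LSB (least significant bit) spatial-domain watermarking.
import numpy as np

def embed_lsb(image: np.ndarray, bits: str) -> np.ndarray:
    """Hide a bit string in the least significant bits of the first pixels."""
    flat = image.flatten()
    for i, b in enumerate(bits):
        flat[i] = (flat[i] & 0xFE) | int(b)  # clear the LSB, then set it to the watermark bit
    return flat.reshape(image.shape)

def extract_lsb(image: np.ndarray, length: int) -> str:
    """Read the watermark bits back out of the least significant bits."""
    flat = image.flatten()
    return "".join(str(flat[i] & 1) for i in range(length))

cover = np.random.randint(0, 256, (32, 32), dtype=np.uint8)
watermark_bits = "10110010"
stego = embed_lsb(cover, watermark_bits)
assert extract_lsb(stego, len(watermark_bits)) == watermark_bits
# The change is imperceptible: each marked pixel differs from the cover by at most 1.
```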
These methods are used to identify and protect the copyright of the owner. Digital content can be embedded with watermarks by evaluating the metadata carried along with the data, and the results can be used for content archiving. Watermarking supports the creation of online content that is authenticated and secure, helping to preserve the data used in digital content.
Watermarking is useful to banks for protecting data and information: virtual images are created that support the data-analysis techniques used for data safety and protection. It is used on many online platforms, including social networking sites and official sites where the available data is restricted to a certain extent; these sites use watermarking to support content authentication. Watermarking is also used in medical applications, for printing patient names on X-ray and MRI reports. Fragile watermarking can be used for tamper detection: it reveals tampering in digital data, indicating that the data can no longer be trusted, and so is used for content authentication. Watermarking is likewise applied in digital forensics where digital images are used, and in digital fingerprinting, where the embedded data serves as a unique identifier of each individual. More generally, watermarking keeps digital content safe, prevents its misuse and allows violations of data to be acted upon.
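The tamper-detection use of fragile watermarking mentioned above can be sketched as follows: a digest of the image content (with the least significant bits cleared) is embedded into those bits, so any later modification of the content breaks the match. This is an illustrative construction under assumed choices (SHA-256, 64 embedded bits), not a specific published scheme.

```python
# Minimal sketch of a fragile watermark for tamper detection (illustrative only).
import hashlib
import numpy as np

def content_digest_bits(image: np.ndarray, n_bits: int) -> str:
    """Hash the image with its LSBs cleared, so the digest ignores the watermark itself."""
    digest = hashlib.sha256((image & 0xFE).tobytes()).digest()
    return "".join(f"{byte:08b}" for byte in digest)[:n_bits]

def embed_fragile(image: np.ndarray, n_bits: int = 64) -> np.ndarray:
    marked = image.copy()
    flat = marked.reshape(-1)
    for i, b in enumerate(content_digest_bits(marked, n_bits)):
        flat[i] = (flat[i] & 0xFE) | int(b)
    return marked

def is_tampered(image: np.ndarray, n_bits: int = 64) -> bool:
    flat = image.reshape(-1)
    stored = "".join(str(flat[i] & 1) for i in range(n_bits))
    return stored != content_digest_bits(image, n_bits)

img = np.random.randint(0, 256, (32, 32), dtype=np.uint8)
protected = embed_fragile(img)
print(is_tampered(protected))   # False: content matches the embedded digest
protected[10, 10] ^= 0b1000     # simulate tampering with one pixel
print(is_tampered(protected))   # True: the fragile watermark no longer verifies
```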
2.3 Security and data privacy issues related to cloud computing
In the view of A. Mehmood (2018), security is a major issue that can lead to data theft and financial losses, because there is no clearly defined location for the data a user stores in the cloud. Risks arise during the implementation of the data and the management of the cloud, and include data breaches, compromised credentials and broken authentication, hacked interfaces and APIs, and account hijacking, none of which helps to alleviate concerns. All of this makes entrusting sensitive and proprietary data to a third party, where authentication is outside the user's control, a real challenge for cloud computing. Although the platform is still maturing, cloud computing is developing at a steady pace and its efficiency is increasing day by day; even so, security remains a major issue that can lead to data theft and create chaos for users. Additional measures need to be put in place to verify that a SaaS provider is secure, through identity management, the various forms of authentication provided and access control mechanisms.
In the view of Shazia Tabassam (2017), several mechanisms have been developed in recent years to ensure big data privacy, based on the stages of the big data life cycle: data generation, data storage and data processing. In these phases, access restriction and data falsification techniques are used to protect privacy. Access restrictions limit which users can access the data, and the data is prepared before it is presented and before encryption takes place. Encryption-based techniques are divided into attribute-based encryption (ABE), identity-based encryption (IBE) and storage-based encryption.
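As a simple illustration of the storage-based encryption category mentioned above, the sketch below encrypts data before it is uploaded to the cloud and decrypts it after retrieval. It assumes the third-party `cryptography` package and stands in for, rather than implements, ABE or IBE.

```python
# Minimal sketch of storage-based (symmetric) encryption for cloud data at rest.
# Assumes the 'cryptography' package; ABE/IBE schemes would replace this key
# handling with attribute- or identity-derived keys.
from cryptography.fernet import Fernet

key = Fernet.generate_key()      # kept by the data owner, never by the cloud provider
cipher = Fernet(key)

plaintext = b"patient record: confidential"
ciphertext = cipher.encrypt(plaintext)   # this is what gets stored in the cloud

# Only a client holding the key can recover the original data.
assert cipher.decrypt(ciphertext) == plaintext
```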
Cloud computing is a widely used IT technology, but major problems still have to be solved before it can be used further. A significant barrier to adoption is data security, accompanied by issues of compliance, privacy, trust and legal matters, which together form one of the biggest factors affecting cloud computing and data security. Data security becomes a practical issue in operating a cloud computing environment because there is no single platform and no specification of where the data is stored, and the data can be accessed from any device that can reach the cloud. Data security in cloud computing is therefore more complicated than in conventional settings. Cloud computing is offered as an on-demand service that can be accessed at any moment, and because it combines two functions, computing and data storage, users can easily reach the data; in this situation there is a clear risk of data theft.
Network security issue: This is one of the big issues in cloud computing. Data passed to SaaS is stored by the SaaS provider and could be accessed by it, so the data must be secured and protected by strong encryption of network traffic.
Data locality: Because SaaS provides the platform being worked on, users are often unaware that the activities they carry out are recorded by the SaaS provider, which can monitor the data, information and work stored on its platform. It is therefore important for the SaaS platform to handle data in accordance with the laws and policies of the countries that govern it.
Data access: Data in the cloud is accessible from anywhere that the SaaS information can be reached, which means that poor client verification and weak passwords open the way to unauthorised access. With such access there is a strong chance of the data being hacked, leading to theft and fraud, and the cloud has been a major target for hackers seeking to access data and commit fraud. Administration of the frameworks must ensure that encrypted data cannot be discovered on the web without authorisation. This requires strong passwords that are changed regularly, and the different methods of client identification must also be refreshed over time so that data and information are preserved.
DoS attack: This refers to denial-of-service attacks. It is impossible to prevent them entirely, but their effect can be mitigated. These assaults can overwhelm the resources of a cloud so that legitimate clients cannot get information from applications. Attackers are equally prone to launching such assaults for malicious goals, including extortion; the cloud supplier may reverse the charges, but the attack still costs extra time and causes disruption.
System vulnerabilities: These are bugs in the operating system that attackers deliberately exploit to take control of or bypass the protections of computer systems. Good information technology hygiene goes a long way towards shielding users from this sort of serious assault, and because the systems are managed by the cloud provider there is also scope for regular overhauls and security fixes.
Account hijacking: In many cases access to data is gained through hijacking, which allows the attacker to access and misuse the data illegally. Moreover, the user often has no idea that data theft or forgery has occurred, which can become a major issue.
Malicious insiders: Many incidents stem from human error or abuse of access. A malicious insider who already has legitimate access to the system can work towards data theft, or act as a double agent who grants access to outside attackers.
APT attacks: Advanced persistent threat (APT) campaigns are long-term cyber-attacks against a particular target, designed to gain access for data theft and fraud. They can be delivered through USB devices and other unreliable, unmonitored channels to reach their targets, and once the intrusion is established on the system the data theft can be carried out easily. Mindful clients, and awareness among people more generally, help to safeguard against this kind of misconduct.
Permanent data loss is a further risk, and it exposes information to the same kind of danger as an attack on authentic data: a permanent loss can cost the user capital and information of great importance. There are also many forms of abuse, such as DoS assaults, email spam, automated click fraud and pirated content, which form the basis of data theft, harm users' information and lead to losses.
Digital Signature
A digital signature can be understood as the marking of digital data, in the form of encryption, to keep the data safe and authenticated for the user as well as the provider. It helps the data creator prevent misuse and unauthorised data handling and transfer, and it also makes it possible to establish who last modified a particular database.
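A brief sketch of the signing-and-verification flow described above is shown below. It uses RSA with PSS padding from the `cryptography` package as an assumed, illustrative choice of algorithm rather than a method prescribed by this study.

```python
# Minimal sketch: signing cloud data and verifying it has not been modified.
# Assumes the 'cryptography' package; algorithm choices are illustrative.
from cryptography.exceptions import InvalidSignature
from cryptography.hazmat.primitives import hashes
from cryptography.hazmat.primitives.asymmetric import padding, rsa

private_key = rsa.generate_private_key(public_exponent=65537, key_size=2048)
public_key = private_key.public_key()

document = b"cloud-hosted record"
pss = padding.PSS(mgf=padding.MGF1(hashes.SHA256()), salt_length=padding.PSS.MAX_LENGTH)

# The data owner signs before uploading; the signature travels with the data.
signature = private_key.sign(document, pss, hashes.SHA256())

# Any user (or the provider) can verify integrity and authenticity with the public key.
try:
    public_key.verify(signature, document, pss, hashes.SHA256())
    print("signature valid: data is authentic and unmodified")
except InvalidSignature:
    print("signature invalid: data was altered")
```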
Digital Authentication
Data authentication is a process used to establish confidence in the user identities present on an information system. It can be considered a kind of authentication process used to certify the identity and actions of a person.
Digital Authorization
Digital authorisation is also known as a digital certificate: it contains attributes associated with the person who holds the information, data or system. It is generally issued by the creator and is used mainly for better security and data confidentiality.
Illustration: Digital Authorization (Source: Authentication Vs Authorization, 2019)
CHAPTER 3: RESEARCH METHODOLOGY
Research type: The research type describes the kind of study carried out. This research is descriptive: it relies on surveys and studies designed to identify facts, and in descriptive research there is no control over the variables.
Research of this kind falls into two types, qualitative and quantitative.
Quantitative: This refers to inferring and resolving problems using numbers. Numerical data is collected and summarised, and the summary is used to evaluate the results, the key concerns and the approaches that need to be assessed. It is widely used where field research is not carried out and the data comes from another source. Its advantages are that data collection is rapid, the samples are randomised, and it offers reliable and repeatable information whose findings can be generalised.
Qualitative: This refers to research into feelings, words, emotions and other non-numeric data, which cannot be analysed by means of mathematical calculation. It is concerned with evaluating material of this kind, including events that do not happen frequently. Its main advantage is that it can address the emotional dimension of the incidents and happenings being studied, and it allows specific data to be examined in depth through the chosen research approach rather than reduced to generic figures.
For this research, qualitative research is used, because the study draws on cases that can be analysed to obtain the information that forms the basis of the research. There are various sources through which information on cloud computing can be gathered, and cases through which data can easily be collected; many people face the same problems, which makes data collection easier. The surveys and evaluations that underpin the research mean that it is based on actual data generated through the researcher's own evaluation.
Research approach: The research approach describes how the study is conducted and marks the main distinction between different ways of working with data and hypotheses. Three types of approach are commonly used.
Deductive research approach: This is based on collecting data to test theories and methods that already exist. The evaluation is carried out to refine those approaches and to generate better results, and it is well suited to data that can be assessed against an established framework.
Inductive research approach: Here the researcher starts from the evidence and works towards the result. Based on observations gathered, often at first hand, the researcher develops new theories and generalisations; the evaluation rests on the researcher's own analysis of primary material.
Abductive research approach: In abductive research, premises are used to generate the most plausible conclusions from the tests carried out.
For this research the inductive approach is used, because it allows the data to be evaluated in depth in a way that supports the development of the research and produces well-grounded results, drawing on the various approaches examined.
Research philosophy: Research philosophy concerns the source and nature of knowledge and the way it is developed; it determines how data about the research problem should be collected, analysed and used. On the basis of the chosen philosophy the data is collected and analysed and the results and assumptions are formulated, and the philosophical differences between studies determine how far they focus on facts and numbers. Positivism takes a realistic approach grounded in observable facts, whereas interpretivism focuses on how the data is interpreted. For this study interpretivist research is used: the researcher interprets the elements of the study, integrating human interests into it, on the assumption that reality is accessible only through social construction.
Interpretivism groups together a diverse set of approaches, including social constructivism, phenomenology and hermeneutics, all of which reject the objectivist view that meaning resides within the real world independently of the observer. Important aspects of interpretivism are:
Hermeneutics, the study or philosophy of interpretation and understanding, traditionally focused on biblical texts and literature.
Phenomenology, a philosophical tradition that seeks to understand the world through direct experience.
Symbolic interactionism, which treats culturally diverse symbols as social objects with shared meanings.
In this evaluation of watermarking techniques for cloud computing, a range of studies is examined to assess the approaches used. Interpretivism is applied because it allows the data to be evaluated on the basis of the facts presented in those sources and interpreted for the purposes of this research. The interpreted material supports the further development of the study, and the diverse approaches it brings together help to identify the key approaches and to generate the strategies used in the subsequent evaluation of data.
Research design:
Research design is the methodological framework used to combine the various components or elements of research in a logical way so as to address the research question. It is of three types: exploratory, descriptive and causal.
Exploratory design: In this design the emphasis is on obtaining new insights and ideas. It provides a better and more detailed understanding of the research situation, but it does not deliver a decision or conclusive statement. Exploratory research is essential for formulating business or marketing strategy because it emphasises the discovery or exploration of new ideas rather than the gathering of statistical data (McCusker and Gunaydin, 2015). This type of design is preferred for defining growth and priority areas, common issues and alternative actions.
Descriptive design: In contrast to the exploratory design, the descriptive design focuses on the frequency of covariance between two research variables. Descriptive research describes the characteristics of the research population; it is conclusive in nature and quantitative in type. It is structured and pre-planned so that statistical information can be inferred for the sample population. Descriptive design is primarily used for defining the behaviour, attitudes or opinions of a group, and its selection demands a high level of rigidity so that bias can be avoided and the reliability of the research increased.
Compared with exploratory research, this design is more structured and has a pre-planned design for the analysis. It has limited flexibility and uses the knowledge gained from exploratory research to determine priority issues and respondents, and it follows standardised methods for gathering and analysing the research data. Because it is based on theories, it provides in-depth information about the research. One advantage of the exploratory design, by comparison, is that it puts the entire focus on a specific research question, so that data collection and analysis are compelled to gather data addressing that question; in the descriptive approach, by contrast, data is analysed mainly in quantitative form. In many studies the descriptive design is also used with the goal of finding correlations between research variables, and it can provide accurate results by uncovering new and relevant data that may contradict previously researched findings.
Causal research design: This design is suitable for evaluating cause-and-effect relationships. Like the descriptive design it is quantitative in nature, but it is employed to work out the causal relationship between research variables (McCusker and Gunaydin, 2015). Instead of adopting an observational style, it determines whether a relationship is causal or not, helping researchers understand the causal variables and their predicted effects. When it comes to causality it can also identify covariation between different research variables. Like the descriptive design, the causal design is highly structured, and it uses control procedures within experimental designs linked to tests of causal relationships.
The causal design is therefore used when the key concern is to know and understand the impact of an independent variable on a dependent variable. Causal studies play an important role in identifying the rationale or reasons behind various processes and the impact of existing norms or changes (Leung, 2015). Their prime advantages are replicability and a higher level of internal validity resulting from the systematic selection of subjects. One disadvantage is that it can be challenging to reach sound conclusions on the basis of causal findings, because causality may be inferred but cannot be proved with a high degree of certainty; for some studies, such as this one, it can also be difficult to identify the cause and effect variables.
In this study the exploratory design is used because it helps to investigate a problem in search of clearer and more effective solutions. The flexibility of the exploratory design allows the researcher to consider and analyse all aspects of the research problem. Another reason for choosing it is that this design relies mainly on qualitative data, which makes it easier to conduct a study of the qualitative type (Sutton and Austin, 2015). A further benefit is that the exploratory design adapts well to change, and it builds the groundwork for future supporting studies in the research area. It is also cost- and time-effective, because it gives a clear classification of the research type at the initial stages.
Data collection
Research data is defined as the collection of all the facts and information that can be analysed to develop conclusions. The validity and quality of a research study depend on the accuracy and efficiency of the data collection methods, so it is very important to choose them appropriately. Research data can be collected by primary or secondary methods.
Primary data collection:
Primary data collection is the method in which data is gathered directly from original sources or the sample population rather than from previously conducted studies. Primary data can be collected in a number of ways, such as interviews, surveys or questionnaires, or through experimentation or laboratory testing in the case of scientific research (Palinkas et al., 2015). One of the most commonly cited advantages of primary research is that it gives the researcher the opportunity to collect original data that is current and highly specific to the research requirements. However, the process can be time-consuming and costly, and may take several months to complete. Primary data collection is usually preferred when the research issue is sufficiently unique or important to justify the additional effort and expenditure needed to collect relevant first-hand data (McCusker and Gunaydin, 2015). Because this type of data is original, it gives a more realistic analysis of the research background.
In addition, the reliability of primary data is high because it is collected from the relevant and concerned sources. Alongside these advantages, primary data collection methods have several disadvantages, such as their time-consuming and costly nature. It is also very challenging for the researcher to design a data collection tool or survey that encourages the desired response from participants, and with primary research it is possible that some participants or respondents will not give a proper or timely response (Harrison et al., 2017). Responses may also be influenced by individuals' personal opinions and can therefore be biased. Primary data collection requires trained research teams so that an adequate amount and quality of data can be collected, and with this type of collection researchers have very limited control over the data. For these reasons, secondary data collection is chosen for this study.
Secondary data collection:
In contrast to primary data, secondary data is collected from previous research, and so involves gathering, synthesising and analysing primary data that already exists. Secondary data can be obtained from a variety of sources, such as scholarly journals, government documents, scientific reports, textbooks and research databases. Because secondary research is based on existing data, it can be completed quickly and at low cost (Wildemuth, 2016). However, this collection method makes it harder to obtain information that is specific to the requirements of the research. Along with its time and cost effectiveness, secondary data also allows the researcher to draw on the work of the best-known scholars from different corners of the world, instead of limiting the research data to a specific geographical or population segment.
Secondary data is available to researchers in various forms, such as documents, electronic data and reports, and it is used to gain an initial insight into the research problem and its possible solutions (Boone et al., 2016). It can be classified as internal or external. Internal, or in-house, secondary data is collected by the researcher within the organisation or research setting in which the study is being conducted, whereas external secondary data is obtained outside that setting, for example from other scholars or other organisations. This type of data is easily available, and with it the researcher can readily identify the existing research gaps not filled by previous studies.
This research will use secondary data collection methods because of their time effectiveness. Another benefit, and a further reason for using secondary methods in this project, is that they provide a framework or guidance to the researcher about the direction in which the research should go (Johnston, 2017). By exploring the work of a wide range of scholars, the reliability and effectiveness of the research can be increased, and the use of secondary data also assists in exploring the relevant information on cloud computing evaluated by various scholars.
Data Analysis
In a research project, different types of data and information are gathered during the data collection process. Data analysis is the approach researchers use to analyse and evaluate the data collected from various sources (Mohajan, 2018), and it is an important part of the research process: the results of the research work depend on it. In data analysis, different approaches are used to evaluate and examine the data against the standards selected, in order to generate effective and systematic results and draw a proper conclusion for the research. The choice of data analysis approach depends on the type of data or information to be analysed. Many data analysis techniques are available for research projects, and there is no single fixed approach (Ahmed, Opoku and Aziz, 2016). The techniques most commonly used by individual researchers to analyse data and information are text analysis, statistical analysis, diagnostic analysis, predictive analysis, prescriptive analysis and thematic analysis.
Text analysis is based on data mining, in which the data gathered during collection is cross-related to find patterns and relationships. Statistical analysis deals with numerical data and is based on calculations performed on the collected data (Comer and Bry, 2018); tools such as MS Excel, MS Project and SPSS are used to prepare this type of analysis, and researchers mostly adopt it when they want to analyse numerical data to obtain specific and precise results. Diagnostic analysis is a conditional analysis in which the whole body of collected data is evaluated to generate a proper result. Predictive analysis is used to evaluate future possibilities and predict future conditions. Prescriptive analysis is used when the researcher needs to analyse the different processes employed in the data collection or the research work.