Access Control Challenges and Approaches for Securing Big Data


Table of Contents
Introduction
Main Body
Conclusion
References

Introduction
Big data is the term used to describe collections of data that are large in volume and growing rapidly. It covers diverse sets of data that grow at ever-increasing rates (Akhuseyinoglu and Joshi, 2020), and is characterised by the speed or velocity at which data is created, the volume of data, and the scope over which it is created. As a field, big data is concerned with systematically extracting information from data sets that are too complex to be handled by traditional data processing applications.
Main Body
Big data describes the huge amounts of structured and unstructured data that inundate businesses every day. It helps organisations create new growth opportunities, and it has given rise to a category of firms whose business is combining and analysing industry data. Such firms hold ample information about products and services, buyers and suppliers, and customer preferences, all of which can be captured and analysed. Structured data is information that an organisation already manages in databases and spreadsheets; it is usually numeric in nature. Unstructured data is unorganised information that does not fall into a predetermined format or model (Al-Abassi et al., 2020). It includes data gathered from social media sources, which helps institutions learn about customer needs. Big data is collected from publicly shared comments on social networks and other websites, product purchases, apps, questionnaires, and electronic check-ins.
Big data analytics is the process of extracting meaningful insights such as unknown correlations, hidden patterns, customer preferences, and market trends. It supports better decision making and helps prevent fraudulent activity. For instance, the Philippine banking firm Banco de Oro uses big data analytics to identify discrepancies, and Starbucks uses it to make strategic decisions such as whether a particular location would be suitable for a new outlet (Awaysheh et al., 2020). Here, different parameters are analysed, including accessibility of the location, demographics,
population, and many more. The big data analytics lifecycle comprises the following stages:
i. Business case evaluation: defines the reasons for and goals of carrying out the analysis.
ii. Identification of data: a broad range of data sources is examined to determine which information is likely to create an impact for the organisation.
iii. Data filtering: the data identified in the previous stage is filtered so that corrupted data can be eliminated.
iv. Data extraction: data that is not compatible with the analysis tools is extracted and converted into a compatible format.
v. Data aggregation: data sharing the same fields across distinct datasets is integrated.
vi. Data analysis: the data is analysed using analytical and statistical tools to discover useful information (Jain, 2020).
vii. Data visualisation: tools such as Power BI, Tableau, and QlikView are used to produce graphical visualisations of the analysis.
viii. Final analysis results: the last step of the lifecycle, in which the results are made available to business stakeholders so that action can be taken.
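To make these stages concrete, the sketch below walks a handful of hypothetical in-memory records through the filtering, extraction, aggregation, and analysis stages; a real deployment would run equivalent steps on a distributed framework rather than plain Python.

```python
# A toy walk-through of the filtering, extraction, aggregation, and
# analysis stages on a handful of in-memory records (hypothetical data).
from collections import defaultdict
from statistics import mean

raw_records = [
    {"store": "A", "sales": "120.5"},
    {"store": "A", "sales": "98.0"},
    {"store": "B", "sales": None},        # corrupted record
    {"store": "B", "sales": "210.0"},
]

# Data filtering: drop corrupted records.
filtered = [r for r in raw_records if r["sales"] is not None]

# Data extraction: convert values into a format the analysis tools accept.
extracted = [{"store": r["store"], "sales": float(r["sales"])} for r in filtered]

# Data aggregation: integrate records that share the same field.
by_store = defaultdict(list)
for r in extracted:
    by_store[r["store"]].append(r["sales"])

# Data analysis: a simple statistic per store; visualisation and reporting
# to stakeholders would follow from here.
for store, sales in by_store.items():
    print(store, round(mean(sales), 2))
```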
Big data security is a collective term for the measures and tools used to guard data and analytics processes against theft, attack, or any other malicious activity that would harm them. These challenges are not restricted to particular platforms (Kapil et al., 2020). The main challenges of securing big data are outlined below.

Distributed data: data processing is spread across many systems to ensure fast analysis. Hadoop, for example, is a widely used open-source framework, but it was originally designed without security in mind, so a compromised MapReduce mapper can be made to emit an
incorrect list of key-value pairs.

Endpoint vulnerabilities: cybercriminals can manipulate endpoint devices and transmit false data into data lakes. Security solutions therefore need to analyse logs from the endpoints and use them to validate endpoint authenticity. For instance, hackers who gain access to manufacturing systems that use sensors to detect process malfunctions (Kayes et al., 2020) can then make the sensors report fake results.
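As a rough illustration of endpoint validation, the sketch below checks that a reading comes from a known device and falls within a physically plausible range; the device list and range are hypothetical, and a production system would also verify device certificates and analyse logs over time.

```python
# Minimal sketch: validating sensor readings from an endpoint before they
# enter the data lake. The plausible range and endpoint allow-list are
# hypothetical; production systems would also verify device certificates.
TRUSTED_ENDPOINTS = {"sensor-017", "sensor-042"}
PLAUSIBLE_RANGE = (0.0, 150.0)   # e.g. degrees Celsius for this process

def accept_reading(endpoint_id: str, value: float) -> bool:
    if endpoint_id not in TRUSTED_ENDPOINTS:
        return False                      # unknown device: reject
    low, high = PLAUSIBLE_RANGE
    return low <= value <= high           # physically implausible: reject

print(accept_reading("sensor-017", 72.4))   # True
print(accept_reading("sensor-999", 72.4))   # False: unvalidated endpoint
print(accept_reading("sensor-042", 900.0))  # False: fake-looking value
```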
Access controls: firms tend to limit access to sensitive information, such as medical records containing personal details about individuals. Sometimes, however, people such as data researchers or medical researchers legitimately need to use this data even though they cannot be granted full access to it. Such situations call for granular access, but big data technologies often lack this concept, which creates security concerns.

Non-relational databases: traditional databases use a tabular scheme of rows and columns and therefore cope poorly with big data, whose structure is diverse and which demands high scalability. NoSQL databases were formulated to overcome these limitations and optimise their storage models according to the data types they hold. This makes them more scalable and flexible, but flexibility has been treated as more important than security. They are also comparatively new technologies under active development, which makes it complex for security processes and software to protect such toolsets.
Apart from this, mature security toolsets protect data storage and ingress, but they may not have the same effect on the output of analytics tools spread across multiple locations. A further critical aspect is that big data administrators may decide to mine data without notification or permission; the intent is not necessarily curiosity and can be criminal profit, so security tools have to monitor activity and raise alerts whenever suspicious actions occur. Big data installations are also very large, usually terabytes and growing to petabytes, which makes routine audits difficult to schedule (Kumari, 2020). In addition, big data platforms are cluster-based, which introduces vulnerabilities across many servers and nodes, and if big data owners do not update their security measures regularly, they face a higher probability of data loss and increased exposure.

The challenges faced while working with big data, outlined above, make it clear that solutions are needed to address these issues effectively. The main approaches are illustrated in the following sections.

Safeguarding distributed programming frameworks: Hadoop makes up a large part of distributed processing, but it is at high risk of data leakage, and with it comes the problem of untrusted mappers producing error-ridden results. It is therefore recommended to establish trust using methods such as Kerberos authentication and to enforce conformance with predefined security policies. Personally identifiable information (PII) should be decoupled from the data to ensure that personal privacy is not compromised (Leander, 2020), and mandatory access control (MAC) should be used to ensure that untrusted code cannot leak information through system resources. The mappers and worker nodes should be checked regularly by the IT department to identify modified or fake nodes.
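A minimal sketch of the PII-decoupling idea follows, assuming a hypothetical record schema: direct identifiers are replaced with keyed pseudonyms before records reach untrusted mappers, so analyses can still join on the same person without revealing identities.

```python
# Minimal sketch of decoupling PII before records reach untrusted mappers:
# direct identifiers are replaced with a keyed pseudonym so analysis can
# still group by the same person without seeing who they are. Field names
# are hypothetical.
import hmac, hashlib

PSEUDONYM_KEY = b"keep-this-in-a-key-management-service"

def pseudonymise(record: dict) -> dict:
    out = dict(record)
    for field in ("name", "email"):       # the PII fields in this schema
        if field in out:
            token = hmac.new(PSEUDONYM_KEY, out[field].encode(),
                             hashlib.sha256).hexdigest()[:16]
            out[field] = token
    return out

safe = pseudonymise({"name": "Alice", "email": "a@x.com", "amount": 42})
print(safe)   # identifiers replaced, analytic fields untouched
```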
Securing non-relational data: NoSQL databases are vulnerable to attacks such as NoSQL injection, and the CSA has listed countermeasures for protecting against them. These start with hashing or encrypting passwords and ensuring end-to-end confidentiality by encrypting data at rest with algorithms such as AES, SHA-2, and RSA, and data in transit with SSL/TLS encryption. They also include object-level security and data tagging, pluggable authentication modules (PAM) as a flexible method for authenticating users, and transaction logging in line with NIST guidance. Fuzzing methods can then be used to surface cross-site scripting and injection vulnerabilities between the HTTP and NoSQL layers, using automated data input at the data-node, protocol, and application layers of the distribution.
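The sketch below illustrates two of these countermeasures, salted password hashing and authenticated encryption at rest; it uses the standard library's PBKDF2 and AES-GCM from the third-party cryptography package, which stand in for whichever algorithms a given deployment mandates.

```python
# Minimal sketch of two countermeasures from the list above: salted
# password hashing (PBKDF2, standard library) and at-rest encryption
# with AES-GCM (third-party `cryptography` package).
import os, hashlib
from cryptography.hazmat.primitives.ciphers.aead import AESGCM

# 1. Never store passwords in the clear: persist (salt, pw_hash) instead.
salt = os.urandom(16)
pw_hash = hashlib.pbkdf2_hmac("sha256", b"s3cret-password", salt, 600_000)

# 2. Encrypt a document before writing it to the NoSQL store.
key = AESGCM.generate_key(bit_length=256)
aesgcm = AESGCM(key)
nonce = os.urandom(12)
ciphertext = aesgcm.encrypt(nonce, b'{"patient": "record"}', None)

# Decryption fails loudly if the ciphertext was tampered with.
assert aesgcm.decrypt(nonce, ciphertext, None) == b'{"patient": "record"}'
```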
Secure data storage and transaction logs: the CSA recommends using signed message digests to provide a digital identifier for every document or digital file, together with techniques such as SUNDR (secure untrusted data repository) to detect unauthorised modifications made by malicious server agents.
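A minimal sketch of the signed-digest idea follows, using a keyed HMAC as the digital identifier; SUNDR itself is a full protocol with much stronger guarantees, so this shows only the basic per-document tamper check.

```python
# Minimal sketch of the signed-digest idea: each file gets a keyed digest,
# and any unauthorised modification is detected on verification.
import hmac, hashlib

SIGNING_KEY = b"held-by-the-data-owner-not-the-server"

def sign(document: bytes) -> str:
    return hmac.new(SIGNING_KEY, document, hashlib.sha256).hexdigest()

doc = b"quarterly transaction log"
digest = sign(doc)   # stored alongside the document

tampered = b"quarterly transaction log (edited by server agent)"
print(hmac.compare_digest(digest, sign(doc)))       # True: intact
print(hmac.compare_digest(digest, sign(tampered)))  # False: modified
```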
Endpoint filtering and validation: it is paramount for organisations to use trusted certificates, carry out resource testing, and connect only trusted devices to the network through a mobile device management (MDM) solution (Surbakti et al., 2020). On top of this, statistical and outlier detection techniques can filter out malicious inputs while guarding against ID-spoofing and Sybil attacks.
Real-time compliance and security monitoring: the continuous deluge of data makes compliance a headache for firms, so real-time analytics and security are needed at every level of the stack. The CSA advises firms to apply big data analytics to the problem itself, using technologies such as Secure Shell (SSH), Internet Protocol Security (IPsec), and Kerberos to handle real-time data. This includes mining logging events, deploying front-end security systems such as application-level firewalls and routers, and implementing security controls throughout the stack at the cloud and cluster levels.
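As one toy example of real-time monitoring, the sketch below raises an alert when an account accumulates too many failed logins inside a sliding window; the event format and thresholds are hypothetical.

```python
# Minimal sketch of real-time security monitoring: alert when an account
# exceeds a failed-login threshold inside a sliding time window.
from collections import defaultdict, deque

WINDOW_SECONDS = 60
THRESHOLD = 5
failures = defaultdict(deque)   # user -> timestamps of recent failures

def observe(event: dict) -> None:
    if event["type"] != "login_failure":
        return
    q = failures[event["user"]]
    q.append(event["ts"])
    while q and event["ts"] - q[0] > WINDOW_SECONDS:
        q.popleft()                 # drop events outside the window
    if len(q) >= THRESHOLD:
        print(f"ALERT: possible brute force against {event['user']}")

for t in range(5):
    observe({"type": "login_failure", "user": "admin", "ts": 100 + t})
```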
Preserving data privacy: the key is composability and scalability, achieved through techniques such as differential privacy, which maximises query accuracy while minimising the chance of identifying individual records, and homomorphic encryption, which allows information to be stored and processed in the cloud in encrypted form. Emphasis should also be placed on employee awareness, with training focused on current privacy regulations, and on ensuring that the software infrastructure is maintained behind proper authorisation mechanisms (Xu, Chen and Blasch, 2020). As a best practice, privacy-preserving data composition should be implemented to review and monitor the infrastructure that links databases together.
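The sketch below shows the Laplace mechanism that underlies many differential-privacy deployments: noise scaled to the query's sensitivity and the privacy budget epsilon is added to a count before it is released.

```python
# Minimal sketch of the Laplace mechanism used in differential privacy:
# noise scaled to sensitivity/epsilon is added to a count query so that
# any single individual's presence barely changes the released answer.
import random

def dp_count(true_count: int, epsilon: float, sensitivity: float = 1.0) -> float:
    # The difference of two independent Exponential(rate=1/b) draws is
    # distributed as Laplace(0, b).
    b = sensitivity / epsilon
    noise = random.expovariate(1 / b) - random.expovariate(1 / b)
    return true_count + noise

# Smaller epsilon = stronger privacy = noisier answers.
print(dp_count(1000, epsilon=1.0))   # e.g. 1000.7
print(dp_count(1000, epsilon=0.1))   # noisier, e.g. 1013.2
```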
Big data cryptography: systems can be built to search and filter encrypted data, such as SSE (searchable symmetric encryption) protocols that can execute Boolean queries over encrypted information. Beyond this, a wide range of cryptographic techniques can be used. Relational encryption techniques allow encrypted data to be compared without sharing encryption keys, by matching identifiers and attribute values. IBE (identity-based encryption) eases key management in public-key settings by allowing plaintext to be encrypted for a particular identity. ABE (attribute-based encryption) integrates access controls into the encryption scheme itself, and such schemes can further enable cloud providers to eliminate duplicate information.
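A toy sketch of the searchable-encryption idea follows: the client indexes documents under keyed keyword tokens so the server can answer searches without seeing the keywords. Real SSE schemes hide far more (access patterns, result frequencies) than this illustration does.

```python
# Toy sketch of searchable encryption: the index maps a keyed keyword
# token to document ids, so the server can answer searches without
# learning the query word itself.
import hmac, hashlib
from collections import defaultdict

INDEX_KEY = b"client-side-search-key"

def token(word: str) -> str:
    return hmac.new(INDEX_KEY, word.encode(), hashlib.sha256).hexdigest()

# Client builds the encrypted index before uploading it.
docs = {1: "patient admitted with fever", 2: "routine audit completed"}
index = defaultdict(set)
for doc_id, text in docs.items():
    for word in text.split():
        index[token(word)].add(doc_id)

# Server matches opaque tokens, never plaintext keywords.
print(index.get(token("fever"), set()))   # {1}
```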
Granular access control: this involves two core aspects, restricting user access and granting user access, and the trick is to develop and enforce policies that choose the right access for a person in a particular scenario (Jain, 2020). Granular access control involves normalising mutable elements and denormalising immutable ones, tracking secrecy requirements and making sure they are enforced, maintaining access labels, administrative data, and single sign-on (SSO), and opting for labelling schemes that maintain proper information federation.
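A minimal sketch of such a granular, attribute-based decision is given below, with hypothetical roles and fields: rather than all-or-nothing access, each role sees only the fields its policy grants, so a researcher can use medical data without seeing direct identifiers.

```python
# Minimal sketch of a granular, attribute-based access decision: policies
# grant access per field based on user roles, rather than all-or-nothing
# table access. Roles and fields are hypothetical.
RECORD = {"patient_id": "P-1", "diagnosis": "...", "zip_code": "90210"}

POLICIES = [
    # (required role, fields that role may read)
    ("clinician",  {"patient_id", "diagnosis", "zip_code"}),
    ("researcher", {"diagnosis", "zip_code"}),   # no direct identifiers
]

def visible_fields(user_roles: set) -> dict:
    allowed = set()
    for role, fields in POLICIES:
        if role in user_roles:
            allowed |= fields
    return {k: v for k, v in RECORD.items() if k in allowed}

print(visible_fields({"researcher"}))  # diagnosis and zip code only
print(visible_fields({"clinician"}))   # full record
```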
Granular auditing, in turn, is necessary in case an attack does occur on the system. A cohesive audit view makes it possible to follow the attack, and a complete audit trail ensures the relevant information can be accessed easily, cutting incident response time. Confidentiality and integrity are critical for audit information, which must be stored separately, protected with granular user access controls, and monitored. Keeping audit data separate from the big data itself makes such logging and auditing feasible (Kapil et al., 2020); a query orchestrator or an open-source audit layer tool such as ElasticSearch can be used for this.
Data provenance: provenance can mean different things depending on context; here it concerns the metadata generated by big data applications, which needs substantial protection. This calls for infrastructure authentication protocols that control access, status updates, and continual verification of data integrity through mechanisms such as checksums. Furthermore, the distributed architecture of big data attracts intrusion attempts. An intrusion prevention system (IPS) enables the security team to protect big data platforms from vulnerability exploits by examining network traffic, isolating and blocking an intrusion before any actual damage is done. Centralised key management, likewise, offers efficient management of encryption keys.
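A minimal sketch of checksum-based integrity verification is shown below: a SHA-256 digest recorded when provenance metadata is written is re-computed and compared before the metadata is trusted.

```python
# Minimal sketch of checksum-based integrity verification for provenance
# metadata: a SHA-256 digest recorded at write time is re-computed and
# compared on every read.
import hashlib

def checksum(payload: bytes) -> str:
    return hashlib.sha256(payload).hexdigest()

metadata = b'{"source": "sensor-017", "ingested": "2020-11-02T10:15:00Z"}'
recorded = checksum(metadata)          # stored alongside the metadata

# Later, before the metadata is trusted:
def verify(payload: bytes, expected: str) -> bool:
    return checksum(payload) == expected

print(verify(metadata, recorded))        # True: intact
print(verify(metadata + b" ", recorded)) # False: tampered
```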
Growing firms tend to use big data analytics tools to improve their business strategies, and this gives cybercriminals greater opportunities to attack big data architectures. At the same time, smart big data analytics tools help develop new security strategies as information is gathered: security intelligence tools, for instance, reach conclusions by correlating security information across distinct systems (Kumari, 2020). Intrusion detection and prevention likewise help ensure that the information being processed and stored is safe, as an IPS allows security administrators to protect the platform before an intrusion becomes an attack. Finally, physical security denies data-centre access to strangers, and to employees who have no business in sensitive areas; security logs and video surveillance enforce this.
Conclusion
Based on the above discussion, it can be inferred that big data denotes large and diverse sets of information, characterised in part by the velocity at which the data arrives. Data mining techniques are used across multiple formats. Organisations collect data from a variety of
sources, including business transactions, social media, and information from sensor or machine-to-machine data. There are certain challenges associated with big data that have to be analysed, and it becomes necessary to furnish relevant solutions to the problems identified; this involves encrypting information and tracking how the information is used.
References
Books & Journals
Akhuseyinoglu, N.B. and Joshi, J., 2020. Access Control Approaches for Smart Cities. In IoT Technologies in Smart-Cities: From Sensors to Big Data, Security and Trust, p.1.
Al-Abassi, A. et al., 2020. Industrial big data analytics: challenges and opportunities. In Handbook of Big Data Privacy (pp. 37-61). Springer, Cham.
Awaysheh, F. et al., 2020. Next-generation big data federation access control: A reference model. Future Generation Computer Systems.
Jain, N., 2020. Secured Cloud Computing for Data Management Using Big Data for Small and Medium Educational Institutions. International Journal of Computer Engineering and Technology, 11(2).
Kapil, G. et al., 2020. Attribute based honey encryption algorithm for securing big data: Hadoop distributed file system perspective. PeerJ Computer Science, 6, p.e259.
Kayes, A.S.M. et al., 2020. A Survey of Context-Aware Access Control Mechanisms for Cloud and Fog Networks: Taxonomy and Open Research Issues. Sensors, 20(9), p.2464.
Kumari, P.L.S., 2020. Big Data: Challenges and Solutions. In Security, Privacy, and Forensics Issues in Big Data (pp. 24-65). IGI Global.
Leander, B., 2020. Access Control for Secure Industry 4.0 Industrial Automation and Control Systems (Doctoral dissertation, Mälardalen University).
Surbakti, F.P.S. et al., 2020. Factors influencing effective use of big data: A research framework. Information & Management, 57(1), p.103146.
Xu, R., Chen, Y. and Blasch, E., 2020. Decentralized Access Control for IoT Based on Blockchain and Smart Contract. Modeling and Design of Secure Internet of Things, pp. 505-528.