Database Fundamentals: Normalization Process and Data Accuracy

Running head: DATABASE FUNDAMENTALS
DATABASE FUNDAMENTALS
Name of the Student
Name of the University
Author's Note

Normalization Process
In database design, the normalization process helps reduce redundant data across tables. Normalization is part of the logical database design process for OLTP (online transaction processing) databases.
There are four main normal forms into which database tables can be decomposed in order to achieve data integrity: 1NF, 2NF, 3NF, and BCNF (Kumar and Azad 2017).
In the first normal form (1NF), the rule states that no attribute or column of a table may hold multiple values (Al-Othman 2018). It enforces the atomicity rule: every attribute should contain only atomic values. This makes it possible to reduce or eliminate repeating groups of values by storing them in a separate table and connecting the tables with a one-to-many relationship, as sketched below.
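As a minimal sketch of this decomposition (the SQLite schema and all table and column names are illustrative assumptions, not part of the original text), a column holding several phone numbers violates atomicity and is split out into a child table:

```python
import sqlite3

conn = sqlite3.connect(":memory:")

# Violates 1NF: the phones column stores a repeating group of
# values as one comma-separated string.
conn.execute("CREATE TABLE student_raw (student_id INTEGER PRIMARY KEY, "
             "name TEXT, phones TEXT)")
conn.execute("INSERT INTO student_raw VALUES (1, 'Alice', '555-0100, 555-0101')")

# 1NF: each phone number becomes one atomic value in a separate
# table, linked back through a one-to-many relationship.
conn.execute("CREATE TABLE student (student_id INTEGER PRIMARY KEY, name TEXT)")
conn.execute("CREATE TABLE student_phone ("
             "student_id INTEGER REFERENCES student(student_id), phone TEXT)")
conn.execute("INSERT INTO student VALUES (1, 'Alice')")
conn.executemany("INSERT INTO student_phone VALUES (?, ?)",
                 [(1, "555-0100"), (1, "555-0101")])
```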
In the second normal form (2NF), the main objective is to remove or avoid duplicated data across tables. Attributes that depend on only part of the table's primary key are identified and moved into another table. A table or entity is in 2NF if it meets the requirements of 1NF and every non-key attribute is fully functionally dependent on the entire primary key rather than on only part of a composite key (Stefanidis and Koloniari 2016; Al-Othman 2018; Kumar and Azad 2017). In 2NF, the main concern is therefore the elimination of functional dependencies on partial keys, as in the sketch below.
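A minimal sketch of such a split, assuming a hypothetical order-line schema (the names are illustrative): product_name depends only on product_id, one part of the composite key, so it moves into its own table:

```python
import sqlite3

conn = sqlite3.connect(":memory:")

# Not in 2NF: product_name is determined by product_id alone,
# a partial dependency on the composite key (order_id, product_id).
conn.execute("""CREATE TABLE order_item_raw (
    order_id     INTEGER,
    product_id   INTEGER,
    product_name TEXT,
    quantity     INTEGER,
    PRIMARY KEY (order_id, product_id))""")

# 2NF: the partially dependent attribute is stored once, keyed by
# the part of the key that actually determines it.
conn.execute("CREATE TABLE product (product_id INTEGER PRIMARY KEY, "
             "product_name TEXT)")
conn.execute("""CREATE TABLE order_item (
    order_id   INTEGER,
    product_id INTEGER REFERENCES product(product_id),
    quantity   INTEGER,
    PRIMARY KEY (order_id, product_id))""")
```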
In the third normal form (3NF), the primary objective is to remove attributes that do not depend directly on the primary key of the table. An entity or table is in 3NF if it meets all the requirements of 1NF and 2NF and there is no transitive functional dependency among the attributes of the table, that is, no non-key attribute is determined by another non-key attribute.
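A minimal sketch of removing a transitive dependency, again with assumed, illustrative names: dept_name is determined by dept_id, a non-key attribute, so it depends on emp_id only transitively and is moved into a department table:

```python
import sqlite3

conn = sqlite3.connect(":memory:")

# Not in 3NF: emp_id -> dept_id -> dept_name is a transitive
# functional dependency through a non-key attribute.
conn.execute("""CREATE TABLE employee_raw (
    emp_id    INTEGER PRIMARY KEY,
    name      TEXT,
    dept_id   INTEGER,
    dept_name TEXT)""")

# 3NF: dept_name now depends directly on the key of its own table.
conn.execute("CREATE TABLE department (dept_id INTEGER PRIMARY KEY, "
             "dept_name TEXT)")
conn.execute("""CREATE TABLE employee (
    emp_id  INTEGER PRIMARY KEY,
    name    TEXT,
    dept_id INTEGER REFERENCES department(dept_id))""")
```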
A table or entity is in Boyce-Codd normal form (BCNF) if it is in 3NF and, for every functional dependency X → Y that holds in the table, X is a superkey. BCNF is the strictest of these normal forms.
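The BCNF condition can be checked mechanically by computing attribute closures. The sketch below uses the standard textbook closure algorithm, not anything described above, and the course-scheduling relation and its dependencies are assumed for illustration:

```python
def closure(attrs, fds):
    """Set of attributes functionally determined by attrs under fds."""
    result = set(attrs)
    changed = True
    while changed:
        changed = False
        for lhs, rhs in fds:
            if lhs <= result and not rhs <= result:
                result |= rhs
                changed = True
    return result

def is_bcnf(attributes, fds):
    """BCNF: for every non-trivial FD X -> Y, X must be a superkey,
    i.e. the closure of X must cover the whole relation."""
    return all(rhs <= lhs or closure(lhs, fds) == attributes
               for lhs, rhs in fds)

# Assumed example: each instructor always teaches in one room, so
# instructor -> room holds, but {instructor} is not a superkey.
attrs = {"course", "instructor", "room"}
fds = [({"course", "instructor"}, {"room"}),
       ({"instructor"}, {"room"})]
print(is_bcnf(attrs, fds))  # False: the relation violates BCNF
```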
Importance in Data Integrity and Accuracy
The normalization process is important because it eliminates or reduces redundant data. During normalization, large tables are divided into two or more smaller tables with fewer columns, which also encourages data consistency between tables as well as data accuracy (Stefanidis and Koloniari 2016). While carrying out normalization, it is important to define primary keys and other constraints on the database tables; these restrict the anomalies related to the tables. In this way the database tables remain consistent and the integrity of the stored data is maintained through their relationships. Eliminating these anomalies improves the processes of inserting, deleting, and modifying rows of data in the tables.
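As a minimal sketch of how such constraints restrict anomalies (using SQLite; the schema is an assumption for illustration), a duplicate primary key and a reference to a missing parent row are both rejected by the database:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("PRAGMA foreign_keys = ON")  # SQLite leaves FK checks off by default

conn.execute("CREATE TABLE department (dept_id INTEGER PRIMARY KEY, "
             "dept_name TEXT NOT NULL)")
conn.execute("""CREATE TABLE employee (
    emp_id  INTEGER PRIMARY KEY,
    name    TEXT NOT NULL,
    dept_id INTEGER NOT NULL REFERENCES department(dept_id))""")

conn.execute("INSERT INTO department VALUES (10, 'Sales')")
conn.execute("INSERT INTO employee VALUES (1, 'Alice', 10)")

# Each statement below violates a constraint, so the database
# rejects it instead of letting an anomaly into the tables.
for bad in ("INSERT INTO department VALUES (10, 'Sales')",  # duplicate key
            "INSERT INTO employee VALUES (2, 'Bob', 99)"):  # orphan dept_id
    try:
        conn.execute(bad)
    except sqlite3.IntegrityError as err:
        print("rejected:", err)
```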
References

Al-Othman, Abdulwahab. "Database Systems." (2018).

Eessaar, Erki. "The database normalization theory and the theory of normalized systems: finding a common ground." Baltic Journal of Modern Computing 4, no. 1 (2016): 5.

Kumar, Kunal, and S. K. Azad. "Database normalization design pattern." In 2017 4th IEEE Uttar Pradesh Section International Conference on Electrical, Computer and Electronics (UPCON), pp. 318-322. IEEE, 2017.

Stefanidis, Christos, and Georgia Koloniari. "An interactive tool for teaching and learning database normalization." In Proceedings of the 20th Pan-Hellenic Conference on Informatics, pp. 1-4. 2016.