Data Management: Abstraction, Normalization, Reconciliation Challenges

How are data abstracted from clinical records?
Data abstraction is the process of extracting details about patient treatment and medication from clinical records and transcribing them into electronic health records, which store the information in digitized form. The data are first collected manually in the form of charts and then converted to digital form. Clinical records provide detailed information about the patient and support decision-making by helping to avoid risks to patient safety. The information gathered from clinical records mainly includes patient-related details; medications, patient history, allergies, and immunizations are considered structured data and are abstracted for further study (Gupta & Rani, 2019).
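As a minimal sketch of this abstraction step (with invented field names and values, not tied to any particular EHR system), the Python snippet below pulls the structured fields named above out of a raw chart entry while leaving free-text narrative behind:

```python
from dataclasses import dataclass, field

@dataclass
class PatientRecord:
    """Structured (abstracted) form of a clinical chart entry."""
    patient_id: str
    medications: list[str] = field(default_factory=list)
    allergies: list[str] = field(default_factory=list)
    immunizations: list[str] = field(default_factory=list)
    history: list[str] = field(default_factory=list)

def abstract_chart(raw_chart: dict) -> PatientRecord:
    """Extract only the structured fields from a raw chart dict.
    Unrecognized free-text fields are ignored in this sketch."""
    return PatientRecord(
        patient_id=raw_chart["patient_id"],
        medications=raw_chart.get("medications", []),
        allergies=raw_chart.get("allergies", []),
        immunizations=raw_chart.get("immunizations", []),
        history=raw_chart.get("history", []),
    )

# Example: a manually charted entry converted to digital, structured form.
chart = {
    "patient_id": "P-001",
    "medications": ["metformin"],
    "allergies": ["penicillin"],
    "notes": "free-text narrative that is not abstracted",
}
print(abstract_chart(chart))
```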
Describe the process of normalizing data.
Data normalization is aimed mainly at reducing data redundancy and is generally applied during database design and redesign. Its objective is to reduce duplication of data to a minimum; the resulting structure is referred to as a normal form. Normalization improves the consistency of the data, offers flexibility in designing the database, and enhances security (Osinska & Bala, 2015). In practice it consists of a set of guidelines followed while designing a database so that duplication of data is reduced.
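A minimal sketch of the idea, using an invented prescriptions table: the denormalized rows below repeat the patient's name with every medication, and decomposing them into two tables keyed by patient_id removes that duplication, which is roughly the move from unnormalized data toward a normal form.

```python
# Denormalized rows: the patient's name is repeated for every medication,
# so a name change would have to be applied to several rows at once.
rows = [
    {"patient_id": "P-001", "name": "Ann Lee", "medication": "metformin"},
    {"patient_id": "P-001", "name": "Ann Lee", "medication": "lisinopril"},
    {"patient_id": "P-002", "name": "Bob Roy", "medication": "aspirin"},
]

# Decompose into two tables so each fact is stored exactly once.
patients = {}       # patient_id -> name
prescriptions = []  # (patient_id, medication)
for row in rows:
    patients[row["patient_id"]] = row["name"]
    prescriptions.append((row["patient_id"], row["medication"]))

print(patients)       # {'P-001': 'Ann Lee', 'P-002': 'Bob Roy'}
print(prescriptions)  # [('P-001', 'metformin'), ('P-001', 'lisinopril'),
                      #  ('P-002', 'aspirin')]
```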
Describe the process of reconciling data.
During data migration the transferred data must be verified, and this verification procedure is referred to as data reconciliation. The available data sources are compared with the processed data at the target in order to confirm that the data were transferred correctly by the migration. Mathematical models are used to reconcile and validate the data so that the processed information is reliable.
During migration there is a risk of errors in the transformation logic and in the mapping of data. These can cause transaction failures and network breakdowns that result in data corruption, which in turn creates problems such as duplicated data, missing records and values, and unstructured value formats. Reconciliation enables controlled enterprise integration and provides insight into customer-service-related problems (Kim et al., 2017). It is also used to extract reliable and accurate organizational data from raw data sources. It is aimed at rectifying errors and managing data processing efficiently with zero errors.
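The source-to-target comparison can be sketched as below; this is an illustrative check, not a complete reconciliation framework, and it assumes each record carries a unique id field. Note that in the example the record counts match even though one record was lost and another duplicated, which is exactly why reconciliation compares record identities rather than counts alone.

```python
def reconcile(source: list[dict], target: list[dict]) -> dict:
    """Compare migrated records against the source by their 'id' key,
    reporting missing records, unexpected extras, and duplicates."""
    src_ids = [r["id"] for r in source]
    tgt_ids = [r["id"] for r in target]
    return {
        "missing_in_target": sorted(set(src_ids) - set(tgt_ids)),
        "extra_in_target": sorted(set(tgt_ids) - set(src_ids)),
        "duplicated_in_target": sorted(
            {i for i in tgt_ids if tgt_ids.count(i) > 1}
        ),
        # A bare row-count check would pass even when records are wrong.
        "counts_match": len(src_ids) == len(tgt_ids),
    }

source = [{"id": 1}, {"id": 2}, {"id": 3}]
target = [{"id": 1}, {"id": 2}, {"id": 2}]  # record 3 lost, record 2 duplicated
print(reconcile(source, target))
# {'missing_in_target': [3], 'extra_in_target': [],
#  'duplicated_in_target': [2], 'counts_match': True}
```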
What are the challenges associated with using data from different sources?
Several challenges arise when extracting data from multiple sources, mainly during the analysis, curation, sharing, capture, visualization, search, storage, and privacy of data. They include a lack of suitable talent or expertise to extract the data, uncertainty, difficulties with the syncing process, data volume, and data validity. It is critical to obtain good-quality data comprising accurate and reliable information for clients, and the main challenges lie in processing such large datasets within a specific period of time and in the costs associated with doing so. Collecting the data and transforming it to the desired location with good quality and efficiency is a complex process; data quality can be further improved if the extraction and transformation of the data are considered from the source onward. Data inconsistency is another challenge: it produces inaccurate information, which in turn leads to biased decisions (Pattuelli, 2010). Different solutions are currently available for extracting reliable and accurate data from multiple sources, and one such example is cloud computing.
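One concrete instance of the inconsistency problem is the same field arriving in different formats from different sources. The sketch below, with invented source formats, harmonizes dates before the records are merged:

```python
from datetime import date, datetime

# The same admission date arrives in different formats from two sources.
source_a = {"patient_id": "P-001", "admitted": "2019-03-05"}  # ISO format
source_b = {"patient_id": "P-001", "admitted": "05/03/2019"}  # day/month/year

def parse_admitted(value: str) -> date:
    """Try each known source format in turn; fail loudly on anything else."""
    for fmt in ("%Y-%m-%d", "%d/%m/%Y"):
        try:
            return datetime.strptime(value, fmt).date()
        except ValueError:
            continue
    raise ValueError(f"unrecognized date format: {value!r}")

# After normalization the two sources agree and can be merged safely.
assert parse_admitted(source_a["admitted"]) == parse_admitted(source_b["admitted"])
print(parse_admitted(source_a["admitted"]))  # 2019-03-05
```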
References:
Gupta, D., & Rani, R. (2019). A study of big data evolution and research challenges. Journal
of Information Science, 45(3), 322–340.
Osinska, V., & Bala, P. (2015). Study of dynamics of structured knowledge: Qualitative
analysis of different mapping approaches. Journal of Information Science, 41(2),
197–208.
Kim, S., Lee, J.-G., & Yi, M. Y. (2017). Developing information quality assessment
framework of presentation slides. Journal of Information Science, 43(6), 742–768.
Pattuelli, M. C. (2010). Knowledge organization landscape: A content analysis of
introductory courses. Journal of Information Science, 36(6), 812–822.