Comparative Summary: Intelligent Systems and Natural Language Processing
Added on 2023/06/01
Running head: COMPARATIVE SUMMARY

Intelligent Systems and Natural Language Processing

Name of the Student
Name of the University
Author Note
Natural Language Processing (NLP) has brought a fundamental change to Artificial Intelligence and to human language recognition and processing systems. In "A bibliometric analysis of natural language processing in medical research", Chen et al. present a bibliometric analysis of the global output of medical studies and research on NLP. In "Ask Me Anything: Dynamic Memory Networks for Natural Language Processing", Kumar et al. examine the efficiency of Dynamic Memory Networks (DMN) for NLP question answering. Gupta, Jain and Joshi present the implications of fuzzy logic in NLP for web searching, assessed through User Inputs Processing Efficiency (UIPE). Socher et al. describe the use of NLP techniques in computer vision, specifically for image recognition and processing.

The study of Chen et al. covers only NLP research in the medical field, aiming to identify how NLP has been adopted in medical use. It therefore has little implementable significance for the developmental study of NLP itself. In contrast, the Dynamic Memory Networks proposed by Kumar et al. have changed how user inputs are decoded and processed. The model shows a significant improvement in the operational efficiency of the episodic memory module and the attention mechanism, measured by the time required to process a question, where G is defined as the scoring function that produces the attention gate. Gupta, Jain and Joshi focus only on assessing the efficiency of existing fuzzy logic and its future implications. Socher et al. present an algorithm that is effective for image recognition and processing through an adjacency matrix; it improved the efficiency of image-segmentation processing by 4%.
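To make the role of the scoring function G concrete, the following is a minimal sketch of the DMN attention gate described by Kumar et al.: a small two-layer network scores each input fact against the question and the previous memory. The feature construction follows the spirit of the paper (element-wise products and absolute differences), but the dimensions, weights, and variable names here are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

rng = np.random.default_rng(0)
d = 8  # illustrative embedding size (an assumption, not from the paper)

def z(f, m, q):
    # Simplified interaction features between a fact f, the previous
    # memory m, and the question q (a subset of the paper's feature set).
    return np.concatenate([f, m, q, f * q, f * m, np.abs(f - q), np.abs(f - m)])

# G: a two-layer feed-forward network producing a scalar gate in (0, 1).
W1 = rng.normal(size=(16, 7 * d)) * 0.1
b1 = np.zeros(16)
w2 = rng.normal(size=16) * 0.1

def G(f, m, q):
    h = np.tanh(W1 @ z(f, m, q) + b1)
    return 1.0 / (1.0 + np.exp(-(w2 @ h)))  # sigmoid attention gate

facts = [rng.normal(size=d) for _ in range(3)]  # toy fact vectors
q = rng.normal(size=d)                          # toy question vector
m = q.copy()                                    # memory initialised with the question

gates = [G(f, m, q) for f in facts]  # one attention weight per fact
```

The episodic memory module would then use these gates to weight facts when updating the memory over several passes; only the gating step is sketched here.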
The structure-prediction system built on a Recursive Neural Network constructs the correct tree over the data in order to recognise and process both image-based and linguistic information.
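The tree-building idea can be sketched as a greedy recursive composition: adjacent segment (or word) vectors are merged by a shared network, and at each step the highest-scoring pair is combined until a single root remains. This is a minimal illustration of the recursive-neural-network parsing strategy of Socher et al.; the greedy search, dimensions, and random weights are simplifying assumptions (the paper trains the scorer and uses a more careful search).

```python
import numpy as np

rng = np.random.default_rng(1)
d = 4  # illustrative vector size

W = rng.normal(size=(d, 2 * d)) * 0.1  # shared composition matrix
b = np.zeros(d)
w_score = rng.normal(size=d)           # scores how plausible a merge is

def compose(left, right):
    # Parent representation p = tanh(W [left; right] + b) and its merge score.
    p = np.tanh(W @ np.concatenate([left, right]) + b)
    return p, float(w_score @ p)

def greedy_parse(leaves):
    # Repeatedly merge the adjacent pair with the highest score.
    nodes = [(vec, str(i)) for i, vec in enumerate(leaves)]
    while len(nodes) > 1:
        candidates = []
        for i in range(len(nodes) - 1):
            p, s = compose(nodes[i][0], nodes[i + 1][0])
            candidates.append((s, i, p))
        s, i, p = max(candidates, key=lambda c: c[0])
        nodes[i:i + 2] = [(p, (nodes[i][1], nodes[i + 1][1]))]
    return nodes[0]  # (root vector, nested-tuple tree)

segments = [rng.normal(size=d) for _ in range(4)]  # toy image segments or words
root_vec, tree = greedy_parse(segments)
```

The same composition function applies whether the leaves are image segments or word embeddings, which is what lets the approach span vision and language.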
Based on the critical comparison of the presented NLP approaches, the paper on NLP research in medical studies is the least significant of the four. The structure-prediction approach with a recursive neural network shows an efficient image encoding/decoding procedure. However, the study presented by Kumar et al. has the most tangible and implementable outputs, which can be used directly for advanced data processing and AI remodelling through the G scoring mechanism.
References:

Chen, X., Xie, H., Wang, F. L., Liu, Z., Xu, J., & Hao, T. (2018). A bibliometric analysis of natural language processing in medical research. BMC Medical Informatics and Decision Making, 18(1), 14.

Gupta, C., Jain, A., & Joshi, N. (2018). Fuzzy logic in natural language processing – a closer view. Procedia Computer Science, 132, 1375-1384.

Kumar, A., Irsoy, O., Ondruska, P., Iyyer, M., Bradbury, J., Gulrajani, I., ... & Socher, R. (2016, June). Ask me anything: Dynamic memory networks for natural language processing. In International Conference on Machine Learning (pp. 1378-1387).

Socher, R., Lin, C. C., Manning, C., & Ng, A. Y. (2011). Parsing natural scenes and natural language with recursive neural networks. In Proceedings of the 28th International Conference on Machine Learning (ICML-11) (pp. 129-136).