Talking Machine: A Survey of Chatbot Foundation, Use Case, and Challenges
DOI:
https://doi.org/10.25195/ijci.v52i1.668

Keywords:
Artificial Intelligence, machine learning, chatbot, Natural Language Processing (NLP), neural network

Abstract
Chatbot systems have recently evolved from basic rule-based systems to more advanced, natural, and context-aware systems built on sophisticated neural network techniques. This paper provides a comprehensive review of chatbot evolution and taxonomy and analyzes the application of chatbots in fields such as healthcare, banking, education, mental health, customer service, and image recognition, revealing their contemporary strengths and uses. In addition to surveying the most important assessment metrics and highlighting recent studies, the review notes that promising performance has been reported in the literature, with accuracy/success rates above 90% in some studies and user satisfaction reaching 95% in certain health-related scenarios; however, these results remain specific to their settings rather than generalizable to all chat applications. The paper concludes that contemporary AI-based chatbots are strong in response generation and context understanding but remain weak when handling ambiguous questions and multi-turn dialogues. Key challenges were identified, including making systems scalable and personalized and achieving human-like conversational quality across different contexts. This in-depth review summarizes recent developments and challenges in chatbot technology and emphasizes the need for innovation in AI and NLP to overcome these challenges and improve chatbot performance and user experience.
License
Copyright (c) 2026 Iraqi Journal for Computers and Informatics

This work is licensed under a Creative Commons Attribution-NonCommercial-NoDerivatives 4.0 International License.
IJCI applies the Creative Commons Attribution (CC BY) license to articles. Authors of papers published by IJCI retain copyright under the CC BY license. Under this Open Access license, the author agrees that anyone may reuse the article, in whole or in part, for any purpose, including commercial purposes. Anyone may copy, distribute, or reuse the content as long as the author and source are properly cited. This facilitates reuse and ensures that journal content remains available for research needs.
If the manuscript contains photos, images, figures, tables, audio files, videos, etc., that the author or co-authors do not own, IJCI will require the author to provide proof that the owner of that content has given the author written permission to use it, and that the owner has approved the CC BY license being applied to the content. IJCI provides a form that the author can use to request permission from the owner. If the author does not have the owner's permission, IJCI will ask the author to remove that content and/or replace it with content that the author owns or has permission to use.
Many authors assume that if they previously published a paper through another publisher, they have the right to reuse that content in their IJCI paper, but that is not necessarily the case; it depends on the license that covers the other paper. The author must ascertain what rights a specific license grants (i.e., whether it enables reuse of the content) and must obtain written permission from the publisher to use the content in the IJCI paper. The author should not include any content in an IJCI paper without having the right to use it, and should always give proper attribution.
Any accompanying submitted data should be accompanied by a statement of its licensing policy, and that policy should not be more restrictive than CC BY.
IJCI reserves the right to remove photos, images, figures, tables, illustrations, audio files, and video files from a paper, before or after publication, if such content was included in the author's paper without permission from the content's owner.