New technologies are introduced every day, from a tiny music player such as the iPod nano to a humanoid robot such as ASIMO, which can run at 6 kilometers per hour. These technologies entertain us, assist us, and make daily life easier. It is no longer arguable that people rely on such smart systems to carry out their everyday activities: the smarter a system is, the more people are inclined to use it. A major part of this smartness depends on how well the system can interact with its user. It is no longer a dream that a system will interact with a human in the same way that one human interacts with another, but for that to happen the system must be intelligent enough to understand the human. For example, a robot that is to hold a casual conversation with a person must recognize and understand the spoken words in order to reply, and its reply should reflect the person's current mood and behavior. In such a scenario, a human listener receives inputs through the senses: the speaker's voice through hearing, and the speaker's behavior, body movements, and facial expressions through sight. It is now possible to capture comparable inputs for a system and store them as data, which can later be analyzed with various algorithms and used to teach the system through machine learning.

We briefly discuss issues related to the relevance and possible impact of research in Artificial Intelligence, with special attention to Computer Vision and Pattern Recognition, Natural Language Processing, Human-Computer Interaction, and Data Warehousing and Data Mining, which are used to identify and analyze data such as physiological signals, voice, conversation, geolocation, and local weather. In our research, we have used heart rate, a physiological signal that has proven effective for detecting human mood, together with smartphone usage data to train the