In daily life, people rely on their hands for most activities. Many applications build on the position, orientation, and joints of the hand, including gesture recognition, gesture prediction, and robotics. This paper proposes a gesture prediction system that uses hand-joint coordinate features captured by the Leap Motion controller to predict dynamic hand gestures; the model is deployed on the NAO robot to verify the effectiveness of the proposed method. First, to reduce the jitter and jumps that arise during data acquisition with the Leap Motion, a Kalman filter is applied to the raw data. New feature descriptors are then introduced: length, angle, and angular-velocity features are extracted from the filtered data and fed, in different combinations, into a long short-term memory recurrent neural network (LSTM-RNN). Experimental results show that the combination of coordinate, length, and angle features achieves the highest accuracy of 99.31% and that the system runs in real time. Finally, the trained model is applied to the NAO robot to play the finger-guessing game; based on the predicted gesture, the NAO robot can respond in advance.
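To make the described pipeline concrete, the sketch below illustrates the kind of processing summarized above: a simple one-dimensional Kalman filter for smoothing per-coordinate joint tracks, length and angle feature extraction from joint positions, and a small LSTM classifier. This is a minimal sketch under stated assumptions, not the authors' implementation: the framework (TensorFlow/Keras), the noise parameters `q` and `r`, the joint count, the feature dimension, and the network size are all illustrative choices.

```python
import numpy as np
import tensorflow as tf


def kalman_smooth(z, q=1e-3, r=1e-2):
    """1-D Kalman filter (random-walk model) over one coordinate track.

    q and r are assumed process/measurement noise variances, not values
    taken from the paper.
    """
    x, p = float(z[0]), 1.0        # state estimate and its variance
    out = np.empty(len(z), dtype=float)
    out[0] = x
    for k in range(1, len(z)):
        p = p + q                  # predict: variance grows by process noise
        g = p / (p + r)            # Kalman gain
        x = x + g * (z[k] - x)     # correct with the new measurement
        p = (1.0 - g) * p
        out[k] = x
    return out


def bone_length(a, b):
    """Length feature: Euclidean distance between two joint positions."""
    return np.linalg.norm(a - b)


def joint_angle(a, b, c):
    """Angle feature: angle (radians) at joint b formed by joints a and c."""
    v1, v2 = a - b, c - b
    cos = np.dot(v1, v2) / (np.linalg.norm(v1) * np.linalg.norm(v2) + 1e-8)
    return np.arccos(np.clip(cos, -1.0, 1.0))


def build_model(seq_len, feat_dim, num_classes):
    """Small LSTM classifier mapping a feature sequence to a gesture label."""
    model = tf.keras.Sequential([
        tf.keras.Input(shape=(seq_len, feat_dim)),
        tf.keras.layers.LSTM(64),
        tf.keras.layers.Dense(num_classes, activation="softmax"),
    ])
    model.compile(optimizer="adam",
                  loss="sparse_categorical_crossentropy",
                  metrics=["accuracy"])
    return model


if __name__ == "__main__":
    # Toy data standing in for Leap Motion output: 32 frames, 20 joints, xyz.
    rng = np.random.default_rng(0)
    joints = rng.normal(size=(32, 20, 3))

    # Smooth each coordinate track over time before extracting features.
    smoothed = np.stack([
        np.stack([kalman_smooth(joints[:, j, d]) for d in range(3)], axis=-1)
        for j in range(20)
    ], axis=1)

    # Example length and angle features from three joints of the first frame.
    print(bone_length(smoothed[0, 0], smoothed[0, 1]))
    print(joint_angle(smoothed[0, 0], smoothed[0, 1], smoothed[0, 2]))

    # Hypothetical feature dimension: 20 joints x 3 coordinates per frame.
    model = build_model(seq_len=32, feat_dim=60, num_classes=3)
    model.summary()
```

In practice the per-frame feature vectors (coordinates, lengths, angles, and angular velocities in whatever combination is chosen) would be concatenated into the sequence fed to the LSTM; the toy shapes above only indicate how such a pipeline fits together.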