Existing server models are commonly analysed with the M/G/1 queuing model, which is in many respects predictable and provides a baseline against which other server queuing models can be compared. Detailed mathematical analysis of the M/G/1 model is readily available in the literature. This paper presents a mathematical analysis aimed at reducing the mean service time of a multiple-server model. The distribution of the mean service time is derived using Little's Law, and C++ simulation code is provided for a test run, so that the QoS of a multi-server system can be improved by reducing the mean service time.
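As a minimal illustrative sketch of the kind of simulation the abstract refers to (not the paper's actual code), the C++ fragment below simulates a single-server queue with Poisson arrivals and gamma-distributed service times, estimates the mean time in system W, and applies Little's Law (L = lambda * W). The arrival rate, service-time parameters, and sample size are assumptions chosen only for illustration.

```cpp
// Minimal M/G/1-style simulation sketch: Poisson arrivals, gamma service,
// single FIFO server. Estimates mean time in system and applies Little's Law.
#include <algorithm>
#include <iostream>
#include <random>

int main() {
    const double lambda = 0.8;      // arrival rate (assumed for illustration)
    const long   n      = 1000000;  // number of customers to simulate (assumed)

    std::mt19937_64 rng(42);
    std::exponential_distribution<double> interarrival(lambda);
    std::gamma_distribution<double> service(2.0, 0.5);  // mean service time = 1.0 (assumed)

    double arrival = 0.0;             // arrival time of the current customer
    double departure = 0.0;           // departure time of the previous customer
    double totalTimeInSystem = 0.0;

    for (long i = 0; i < n; ++i) {
        arrival += interarrival(rng);
        // Service starts when both the customer has arrived and the server is free.
        double start = std::max(arrival, departure);
        departure = start + service(rng);
        totalTimeInSystem += departure - arrival;
    }

    double W = totalTimeInSystem / n;  // estimated mean time in system
    double L = lambda * W;             // Little's Law: mean number in system
    std::cout << "Mean time in system W = " << W << "\n";
    std::cout << "Little's Law L = lambda * W = " << L << "\n";
    return 0;
}
```

Reducing the mean service time (here, the mean of the gamma distribution) lowers W and, by Little's Law, the mean number of requests held in the system, which is the QoS effect the paper targets.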