Abstract
The mean queueing time in a G/GI/m queue is shown to be a nonincreasing and convex function of the number of servers, m. This means that the marginal decrease in mean queueing time brought about by the addition of two extra servers is always less than twice the decrease brought about by the addition of one extra server. As a consequence, a method of marginal analysis is optimal for allocating a number of servers amongst several service facilities so as to minimize the sum of the mean queueing times at the facilities.
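Stated symbolically (a restatement for clarity; the notation $W(m)$ for the mean queueing time with $m$ servers is introduced here, not in the abstract), the convexity claim is equivalent to the two-server comparison above:

```latex
% W(m) denotes the mean queueing time with m servers (notation assumed here).
\[
  W(m+1) \;\le\; \tfrac{1}{2}\bigl(W(m) + W(m+2)\bigr)
  \quad\Longleftrightarrow\quad
  W(m) - W(m+2) \;\le\; 2\bigl(W(m) - W(m+1)\bigr),
\]
% together with the monotonicity statement W(m+1) <= W(m).
```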
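The marginal-analysis allocation the abstract refers to can be sketched as a greedy procedure: repeatedly give the next server to the facility whose mean queueing time would drop the most. The sketch below is illustrative only; the function names, the `mean_queueing_time(facility, m)` interface, and the choice of Python are assumptions, not details taken from the paper.

```python
def allocate_servers(num_servers, facilities, mean_queueing_time):
    """Greedy (marginal-analysis) allocation of num_servers across facilities.

    mean_queueing_time(facility, m) is assumed to return the mean queueing
    time at `facility` when it runs with m servers.  Because this quantity is
    nonincreasing and convex in m, the greedy choice below is optimal for
    minimizing the sum of mean queueing times.
    """
    # Start with one server per facility, then place the rest one at a time.
    allocation = {f: 1 for f in facilities}
    remaining = num_servers - len(facilities)

    for _ in range(remaining):
        # Pick the facility with the largest marginal decrease in queueing time.
        best = max(
            facilities,
            key=lambda f: mean_queueing_time(f, allocation[f])
                          - mean_queueing_time(f, allocation[f] + 1),
        )
        allocation[best] += 1

    return allocation
```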