Abstract
A linear recursive program consists of a set of procedures in which each procedure can make at most one recursive call. The conventional stack implementation of recursion requires time and space both proportional to n, the depth of recursion. It is shown that in order to implement linear recursion so as to execute in time n, one does not need space proportional to n: space n^ε for arbitrarily small ε will do. It is also known that with constant space one can implement linear recursion in time n². We show that one can do much better: time n^(1+ε) for arbitrarily small ε. We also describe an algorithm that lies between these two: it takes time n·log(n) and space log(n). In this context one can demonstrate a speed-up theorem for linear recursion: given any constant-space program implementing linear recursion, one can effectively find another constant-space program that runs faster almost everywhere.
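As a rough illustration of the trade-offs the abstract describes (a sketch of ours, not the paper's construction): take a linear recursion f(x) = h(x, f(p(x))) with base case b, where frames can be recomputed forward via p but not inverted. The Python below contrasts the conventional stack implementation (time and space proportional to n), a constant-space recomputation scheme (time n²), and a divide-and-conquer checkpointing scheme running in time n·log(n) with log(n) space. All names (p, h, b, advance, and the toy driver) are illustrative assumptions.

```python
# Sketch of three implementation strategies for a linear recursion
# f(x) = h(x, f(p(x))), with f(x) = b(x) at the base of the recursion.
# Not the paper's construction; names and the toy instance are assumed.

def advance(x, steps, p):
    """Apply the frame-successor p repeatedly: we can go 'down' the
    recursion cheaply, but there is no inverse of p to climb back up."""
    for _ in range(steps):
        x = p(x)
    return x

def run_with_stack(x, n, p, h, b):
    """Conventional implementation: time n, space n.
    The explicit list plays the role of the call stack."""
    frames = []
    for _ in range(n):
        frames.append(x)
        x = p(x)
    result = b(x)
    for frame in reversed(frames):
        result = h(frame, result)
    return result

def run_constant_space(x, n, p, h, b):
    """Constant space, time n²: recompute the k-th frame from scratch
    on the way back up instead of remembering it."""
    result = b(advance(x, n, p))
    for k in range(n - 1, -1, -1):
        result = h(advance(x, k, p), result)
    return result

def run_log_space(x, n, p, h, b):
    """Space log(n), time n·log(n): divide-and-conquer recomputation.
    Handle the deeper half first, then fold the shallower half onto its
    result, keeping only one checkpoint frame per recursion level."""
    def fold_onto(frame, count, acc):
        # Fold h right-to-left over `count` frames starting at `frame`.
        if count == 0:
            return acc
        if count == 1:
            return h(frame, acc)
        m = count // 2
        acc = fold_onto(advance(frame, m, p), count - m, acc)
        return fold_onto(frame, m, acc)
    return fold_onto(x, n, p, h, b(advance(x, n, p)))

if __name__ == "__main__":
    # Toy instance: frames count down to 0; h sums the frame values.
    p = lambda x: x - 1
    h = lambda frame, acc: frame + acc
    b = lambda x: x
    n = 1000
    expected = sum(range(n + 1))
    assert run_with_stack(n, n, p, h, b) == expected
    assert run_constant_space(n, n, p, h, b) == expected
    assert run_log_space(n, n, p, h, b) == expected
    print("all three strategies agree:", expected)
```

The sketch shows only the simplest points on the time-space curve; the abstract's sharper bounds (space n^ε at time n, and time n^(1+ε) at constant space) require finer checkpoint placement than the halving scheme above.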
