Mathematical Techniques for Efficient Record Segmentation in Large Shared Databases
- 1 October 1976
- journal article
- Published by Association for Computing Machinery (ACM) in Journal of the ACM
- Vol. 23 (4), 619-635
- https://doi.org/10.1145/321978.321982
Abstract
It is possible to significantly reduce the average cost of information retrieval from a large shared database by partitioning data items stored within each record into a primary and a secondary record segment. An analytic model, based upon knowledge of data item lengths, transportation costs, and retrieval patterns, is developed to assist an analyst with this assignment problem. The model is generally applicable to environments in which a database resides in secondary storage, and is useful for both uniprogramming and multiprogramming systems. A computationally tractable record design algorithm has been implemented as a Fortran program and applied to numerous problems. Realistic examples are presented which demonstrate a potential for reducing total system cost by more than 65 percent.
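To make the segmentation idea concrete, the following minimal Python sketch compares the expected transfer cost of an unsegmented record with that of a primary/secondary split. It is not the paper's algorithm (the article develops an analytic model and a tractable design procedure implemented in Fortran); the item names, lengths, access probabilities, and overhead value below are invented for illustration, and the partition is found here by brute force.

```python
from itertools import combinations

# Hypothetical record: (item name, length in bytes, probability the item
# is requested by a random retrieval). All values are illustrative.
items = [
    ("key",       8, 1.00),
    ("name",     30, 0.90),
    ("balance",   8, 0.80),
    ("address",  60, 0.15),
    ("history", 400, 0.05),
]

OVERHEAD = 20  # assumed fixed cost of fetching the secondary segment


def expected_cost(primary):
    """Expected bytes transported per retrieval for a given primary-segment set.

    The primary segment is always read; the secondary segment is read only
    when at least one of its items is requested (requests assumed independent).
    """
    secondary = [it for it in items if it[0] not in primary]
    primary_len = sum(l for n, l, _ in items if n in primary)
    secondary_len = sum(l for _, l, _ in secondary)
    p_miss = 1.0
    for _, _, p in secondary:
        p_miss *= (1.0 - p)
    p_secondary = 1.0 - p_miss
    return primary_len + p_secondary * (OVERHEAD + secondary_len)


# Brute-force search over all partitions (fine for a handful of items;
# the paper's contribution is a tractable algorithm for realistic sizes).
names = [n for n, _, _ in items]
best = min(
    (frozenset(c) for r in range(len(names) + 1) for c in combinations(names, r)),
    key=expected_cost,
)
print("best primary segment:", sorted(best))
print("expected cost with split:", round(expected_cost(best), 1))
print("cost with no split:", expected_cost(frozenset(names)))
```

With these invented numbers the best primary segment keeps only the short, frequently requested items, and the expected cost drops from 506 to roughly 138 bytes per retrieval, which is the kind of reduction the abstract's 65 percent figure refers to.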