
General foundations of high‐dimensional model representations


Abstract

A family of multivariate representations is introduced to capture the input–output relationships of high‐dimensional physical systems with many input variables. A systematic mapping procedure between the inputs and outputs is prescribed to reveal the hierarchy of correlations amongst the input variables. It is argued that for most well‐defined physical systems, only relatively low‐order correlations of the input variables are expected to have an impact upon the output. The high‐dimensional model representations (HDMR) utilize this property to present an exact hierarchical representation of the physical system. At each new level of HDMR, higher‐order correlated effects of the input variables are introduced. Tests on several systems indicate that the few lowest‐order terms are often sufficient to represent the model in equivalent form to good accuracy. The input variables may be either finite‐dimensional (i.e., a vector of parameters chosen from the Euclidean space \(\mathcal{R}^n\)) or infinite‐dimensional, as in the function space \({\text{C}}^n \left[ {0,1} \right]\). Each hierarchical level of HDMR is obtained by applying a suitable projection operator to the output function, and these levels are mutually orthogonal with respect to an appropriately defined inner product. A family of HDMRs, each with distinct character, may be generated through different choices of projection operators. Two types of HDMR are illustrated in the paper: ANOVA‐HDMR, which is the same as the analysis of variance (ANOVA) decomposition used in statistics, and cut‐HDMR, which is shown to be computationally more efficient than the ANOVA decomposition. Application of the HDMR tools can dramatically reduce the computational effort needed to represent the input–output relationships of a physical system. In addition, the hierarchy of identified correlation functions can provide valuable insight into the model structure. The notion of a model in the paper also encompasses input–output relationships developed with laboratory experiments, and the HDMR concepts are equally applicable in this domain. HDMRs can be classified as non‐regressive, non‐parametric learning networks. Selected applications of the HDMR concept are presented along with a discussion of its general utility.
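For orientation, the hierarchical expansion described in the abstract has the standard HDMR form (the sketch below follows the customary definitions in the HDMR literature and is added here only for context; the paper's own notation and normalization may differ):

\[ f(x) \;=\; f_0 \;+\; \sum_{i=1}^{n} f_i(x_i) \;+\; \sum_{1 \le i < j \le n} f_{ij}(x_i, x_j) \;+\; \cdots \;+\; f_{12 \cdots n}(x_1, x_2, \ldots, x_n), \]

where \(f_0\) is a constant, the \(f_i(x_i)\) capture the independent action of each input variable, and the higher terms capture progressively higher‐order cooperative effects. In cut‐HDMR the component functions are built from evaluations of \(f\) along lines, planes, and higher‐dimensional cuts through a chosen reference point \(\bar{x}\), e.g., \(f_0 = f(\bar{x})\) and \(f_i(x_i) = f(x_i, \bar{x}^{\,i}) - f_0\), with \(\bar{x}^{\,i}\) denoting \(\bar{x}\) with its \(i\)th coordinate replaced by \(x_i\); in ANOVA‐HDMR the analogous components are defined by averaging \(f\) over the remaining variables with respect to a chosen measure.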




Cite this article

Rabitz, H., Aliş, Ö.F. General foundations of high‐dimensional model representations. Journal of Mathematical Chemistry 25, 197–233 (1999). https://doi.org/10.1023/A:1019188517934
