
Predictions of the variables of interest were evaluated with different methods (neural networks, nearest-neighbor estimators, spline smoothing). The main difference between parametric and nonparametric econometrics is that the latter does not require specifying the functional forms of the objects being estimated. By weakening parametric assumptions, nonparametric econometrics produces predictions that are closer to the empirical data.
Nonparametric econometrics definition
Nonparametric econometrics is a branch of the quantitative science of modeling the economy that uses the data themselves to determine the form of the econometric model.
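As a small illustration of the nonparametric idea, the following sketch uses a nearest-neighbor regression (one of the methods mentioned above) to estimate a conditional mean without assuming any functional form. The data-generating process and the helper `knn_regress` are hypothetical, chosen only for illustration:

```python
import numpy as np

def knn_regress(x_train, y_train, x_query, k=5):
    """Nearest-neighbor regression: predict the mean of the k closest y values.
    No functional form is assumed; the data shape the fit directly."""
    preds = []
    for xq in np.atleast_1d(x_query):
        idx = np.argsort(np.abs(x_train - xq))[:k]  # k nearest training points
        preds.append(y_train[idx].mean())
    return np.array(preds)

# Data from a nonlinear relationship the estimator never needs to know.
rng = np.random.default_rng(42)
x = np.sort(rng.uniform(0, 2 * np.pi, 200))
y = np.sin(x) + rng.normal(scale=0.1, size=200)

# Estimate the conditional mean at x = pi/2, where the true mean is sin(pi/2) = 1.
yhat = knn_regress(x, y, np.array([np.pi / 2]))
```

The estimate tracks the unknown regression function purely from local averages of the data, which is exactly the sense in which the data "form the model itself."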
The origins of econometrics
An important stage in the formation of econometrics was the construction of economic barometers. They are based on the idea that some indicators change earlier than others and can therefore serve as signals of changes in the latter. The first and most famous was the Harvard Barometer, created in 1903 under the leadership of W. Persons and W. Mitchell. It consisted of curves characterizing the stock, commodity, and money markets.
Until the 1970s, econometrics was understood as an empirical evaluation of models created within the framework of economic theory.
An important event in the development of econometrics was the advent of computers. Thanks to them, the statistical analysis of time series developed rapidly. G. Box and G. Jenkins created the ARIMA model in 1970, and C. Sims and other researchers introduced VAR models in the early 1980s. Econometric research was also stimulated by the rapid development of financial markets and derivative instruments; in this context, J. Tobin, winner of the 1981 Nobel Prize in Economics, developed his approach to modeling with censored data. The Norwegian economist Trygve Magnus Haavelmo treated economic series as realizations of random processes. The main problems that arise when working with such data are non-stationarity and high volatility. If the variables are non-stationary, there is a risk of establishing a connection where none exists (a spurious regression). One solution to this problem is the transition from the levels of the series to their differences. In 1989, Haavelmo was awarded the Nobel Prize in Economics “for his clarification of the probability theory foundations of econometrics and his analyses of simultaneous economic structures.”
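The transition from levels to differences can be sketched in a few lines. In this minimal example (the simulated random walk is an assumption for illustration), the levels are non-stationary, but first differences recover the stationary innovations exactly:

```python
import numpy as np

# A random walk y_t = y_{t-1} + e_t is non-stationary: its variance grows with t.
rng = np.random.default_rng(0)
e = rng.normal(size=1000)   # stationary innovations
y = np.cumsum(e)            # levels: a non-stationary random walk

# Moving from levels to first differences: dy_t = y_t - y_{t-1} = e_t.
dy = np.diff(y)

# The differences coincide with the stationary innovations that generated the walk.
recovered = np.allclose(dy, e[1:])
```

Differencing removes the stochastic trend, which is why regressions on differenced series avoid the spurious-regression risk described above.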
The disadvantage of this method is the difficulty of economically interpreting the results. To address this problem, Clive Granger introduced the concept of cointegration: a stationary combination of non-stationary variables. He proposed the error correction model (ECM), for which he developed methods of estimation, generalization, and testing. Cointegration applies when short-term dynamics reflect significant destabilizing factors while long-term dynamics tend toward economic equilibrium. The models created by Granger were generalized by S. Johansen in 1990 to the multidimensional case. In 2003, Granger, together with R. Engle, received the Nobel Prize. Engle, in turn, is known as the creator of models with time-varying volatility (the so-called ARCH models), which have become widespread in financial markets.
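The cointegration idea can be sketched with simulated data. In this hypothetical example, two non-stationary series share a common stochastic trend, a linear combination of them is stationary, and a minimal error-correction regression (a simplified sketch, not Granger's full estimation procedure) shows the pull back toward long-run equilibrium:

```python
import numpy as np

# Two non-stationary series driven by one shared stochastic trend.
rng = np.random.default_rng(1)
trend = np.cumsum(rng.normal(size=2000))             # common I(1) component
x = trend + rng.normal(scale=0.5, size=2000)         # non-stationary
y = 2.0 * trend + rng.normal(scale=0.5, size=2000)   # non-stationary

# Cointegrating combination: y - 2x cancels the common trend and is stationary.
z = y - 2.0 * x

# Minimal error-correction step: changes in y respond to the lagged
# disequilibrium z, pulling the system back toward equilibrium.
dy = np.diff(y)
diseq = z[:-1]

# OLS slope of dy on lagged z: the error-correction coefficient.
# A negative value means deviations from equilibrium are corrected.
alpha = np.cov(diseq, dy)[0, 1] / np.var(diseq)
```

The stationary combination `z` captures the long-run equilibrium relationship, while the short-run dynamics in `dy` react to past deviations from it, which is the economic interpretation Granger's framework restores.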