By Achilleas Zapranis BSc, MSc, PhD, Apostolos-Paul N. Refenes BSc, PhD (auth.)
Neural networks have had considerable success in many disciplines, including engineering, control, and financial modelling. However, a major weakness is the lack of established procedures for testing mis-specified models and the statistical significance of the various parameters which have been estimated. This is particularly important in the majority of financial applications, where the data-generating processes are dominantly stochastic and only partially deterministic. Based on the latest, most significant developments in estimation theory, model selection and the theory of mis-specified models, this volume develops neural networks into an advanced financial econometrics tool for non-parametric modelling. It provides the required theoretical framework, and demonstrates the efficient use of neural networks for modelling complex financial phenomena. Unlike most other books in this area, this one treats neural networks as statistical devices for non-linear, non-parametric regression analysis.
Best econometrics books
This book provides a comprehensive description of the state of the art in modelling global and national economies. It introduces the long-run structural approach to modelling, which can be readily adopted for use in understanding how economies work, and in generating forecasts for decision- and policy-makers.
A comprehensive account of economic size distributions around the world and throughout the years. In the course of the past hundred years, economists and applied statisticians have developed a remarkably diverse variety of income distribution models, yet no single resource convincingly accounts for all of these models, examining their strengths and weaknesses, similarities and differences.
From 1976 to the start of the millennium, covering the quarter-century life span of this book and its predecessor, something remarkable has happened to market response research: it has become practice. Academics who teach in professional fields, as we do, dream of such things. Imagine the satisfaction of knowing that your work has been incorporated into the decision-making routine of brand managers, that category management relies on techniques you developed, that marketing management believes in something you struggled to establish in their minds.
- Macroeconomic analysis: Essays in macroeconomics and econometrics
- Analysis of Financial Time Series, Third Edition (Wiley Series in Probability and Statistics)
- Introduction to Bayesian Econometrics
- Education Statistics of the United States 2003
- Time Series: Theory and Methods
- Research in Finance, Volume 26
Additional info for Principles of Neural Model Identification, Selection and Adequacy: With Applications to Financial Econometrics
A test statistic can be formulated which, under the null hypothesis, follows a known distribution. If the null hypothesis is not refuted, a new model belonging to a class Sq is derived from Sk, where q < k. However, the analysis is complicated, since the distribution of the parameters belongs to the Limited Mixed Gaussian family (Phillips, 1989). The second and most commonly used approach to model selection is the so-called discriminating approach (all the approaches reviewed here belong to this class).
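As a minimal sketch of the nested-model comparison described above, the following numpy-only example fits a full linear model Sk and a restricted model Sq (q < k) by ordinary least squares and forms the standard F statistic for the restriction. The data, variable names, and the choice of an F test are illustrative assumptions, not the book's own procedure.

```python
import numpy as np

rng = np.random.default_rng(0)

# Simulated data: y depends on x1 only; x2 is an irrelevant variable.
n = 200
x1 = rng.normal(size=n)
x2 = rng.normal(size=n)
y = 2.0 * x1 + rng.normal(size=n)

def rss(X, y):
    """Residual sum of squares of an OLS fit."""
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    resid = y - X @ beta
    return float(resid @ resid)

# Full model S_k (k = 3 parameters) vs. restricted model S_q (q = 2).
X_full = np.column_stack([np.ones(n), x1, x2])
X_restr = np.column_stack([np.ones(n), x1])

rss_full = rss(X_full, y)
rss_restr = rss(X_restr, y)

k, q = X_full.shape[1], X_restr.shape[1]
# Under the null (the restriction holds), F follows F(k - q, n - k).
F = ((rss_restr - rss_full) / (k - q)) / (rss_full / (n - k))
print(F)
```

A small F value means the null hypothesis is not refuted, and the restricted class Sq is adopted in place of Sk.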
In contrast, neural model selection was implicitly taken to be the same as model identification. Model adequacy testing was largely ignored, while input variables were selected on the basis of the magnitudes of a number of 'relevance measures', rather than on the basis of their statistical significance. In this chapter we review current practice in all three of these areas, and thus attempt to create the basis on which we build a modular methodology for neural model identification.
The incentives behind this can be briefly summarized as follows:

• eliminate specification bias due to the inclusion of irrelevant variables;
• obtain parsimonious (and thus easier to interpret) models, by using only key variables and relegating all minor or random influences to the stochastic component;
• increase the degrees of freedom for error, by reducing the number of explanatory variables and thus obtaining a more favourable sample-size-to-parameters ratio.

Evaluating the statistical significance of the variables in the linear regression model i = 1, ...
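To make concrete what evaluating the statistical significance of regression variables involves, here is a minimal numpy sketch that computes the usual t statistic for each coefficient of a linear model: the OLS estimate divided by its standard error, with the error variance estimated from the residuals using the degrees of freedom for error. The simulated data and variable names are hypothetical; the book itself develops the analogous machinery for neural models.

```python
import numpy as np

rng = np.random.default_rng(1)

# y = 1 + 3*x1 + noise; x2 carries no signal (an irrelevant variable).
n = 500
X = np.column_stack([np.ones(n), rng.normal(size=n), rng.normal(size=n)])
y = 1.0 + 3.0 * X[:, 1] + rng.normal(size=n)

beta, *_ = np.linalg.lstsq(X, y, rcond=None)
resid = y - X @ beta
dof = n - X.shape[1]                    # degrees of freedom for error
sigma2 = resid @ resid / dof            # unbiased error-variance estimate
cov = sigma2 * np.linalg.inv(X.T @ X)   # covariance of the OLS estimates
t_stats = beta / np.sqrt(np.diag(cov))  # t statistic per coefficient
print(t_stats)
```

Coefficients whose |t| is large relative to the relevant Student-t critical value are retained; variables such as x2 above, with |t| near zero, are candidates for removal, which is exactly the parsimony argument made in the list above.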