Talk:Group method of data handling

Untitled

It is obviously hard to define what GMDH is. Since it is a set of algorithms, I think the definition should be a set of their common properties.

You are right. Almost all GMDH algorithms sort out gradually changing models and check them by an external criterion. Even the OCC algorithm does. Perelom 12:44, 30 October 2007 (UTC)
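
To make "sorting out models and checking them by an external criterion" concrete, here is a minimal sketch (Python/NumPy; the function names, the 50/50 split and the squared-error criterion are illustrative assumptions, not taken from any particular published GMDH implementation):

```python
import numpy as np

def external_criterion(coeffs, X_check, y_check):
    """Regularity-type criterion: squared error on the checking subsample B,
    which was not used to estimate the coefficients."""
    residual = y_check - X_check @ coeffs
    return float(residual @ residual)

def sort_models(candidate_structures, X, y, split=0.5):
    """Fit every candidate structure on subsample A, rank all of them by the
    external criterion computed on subsample B, and return the best one."""
    n_a = int(len(y) * split)
    X_a, y_a = X[:n_a], y[:n_a]      # subsample A: used for fitting
    X_b, y_b = X[n_a:], y[n_a:]      # subsample B: used only for checking
    scored = []
    for cols in candidate_structures:            # each 'cols' is one model structure
        coeffs, *_ = np.linalg.lstsq(X_a[:, cols], y_a, rcond=None)
        scored.append((external_criterion(coeffs, X_b[:, cols], y_b), cols, coeffs))
    scored.sort(key=lambda s: s[0])
    return scored[0]                  # the 'model of optimal complexity'
```

Here candidate_structures would be a list of column-index subsets of gradually changing complexity, e.g. [[0], [1], [0, 1], [0, 1, 2]].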

In the description of GMDH, "..it simultaneously minimize the models error and find out the optimal model structure..", the phrase "minimize the models error" is not a property of GMDH. It is a property of the criterion of regularity, but there are many other criteria for which this is not true.

Of course. But the main point here is that in GMDH this is done simultaneously. Of the three classes of criteria (accuracy, balance and information type), the accuracy criteria are the ones usually used. Perelom 12:44, 30 October 2007 (UTC)
Then "..it usually minimize the models error and simultaneously find out the optimal model structure.." :) I use maximum level of error and penalty for complexity instead of minimization of error. So, I'll suggest changes. Kosh21 21:27, 30 October 2007 (UTC)[reply]

As far as I understand, the only principle of GMDH that is really common to all algorithms is the 'search for a model of optimal complexity'; this principle makes us use 'sample dividing' and gives us 'noise resistance'. It is certainly used in the combinatorial, multilayered and harmonic algorithms.
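
A tiny self-contained illustration of how 'sample dividing' yields 'noise resistance' (the synthetic data and all numbers below are made up purely for illustration):

```python
import numpy as np

rng = np.random.default_rng(0)
x = np.linspace(0.0, 1.0, 40)
y_true = 1.0 + 2.0 * x                               # the true structure is simple
y = y_true + rng.normal(scale=0.3, size=x.size)      # noisy observations

x_a, y_a = x[::2], y[::2]                            # subsample A: fitting
x_b, y_b = x[1::2], y[1::2]                          # subsample B: checking

for degree in range(1, 7):                           # gradually more complex models
    coeffs = np.polyfit(x_a, y_a, degree)
    err_a = np.mean((y_a - np.polyval(coeffs, x_a)) ** 2)   # internal (training) error
    err_b = np.mean((y_b - np.polyval(coeffs, x_b)) ** 2)   # external criterion
    print(degree, round(err_a, 4), round(err_b, 4))
# The internal error keeps falling as complexity grows; the external
# criterion typically passes through a minimum near the true, simple structure.
```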

GMDH algorithms differ from other algorithms of structural identification and from networks in several main respects, which I think should be added to the page:
- the use of external criteria makes it possible to take several a priori uncertainties into account automatically during model construction; only in this way can an optimal model be found that is adequate to the length of the data sample or to its noise level;
- a greater diversity of structure generators: full or reduced sorting of structure variants and various multilayered (iterative) procedures (see the sketch after this list);
- a higher level of automation: only the class of model and the type of external criterion need to be specified;
- the noise-immunity effect: automatic adaptation of the optimal model to the noise level or to statistical violations makes the approach robust;
- implementation of the principle of inconclusive (non-final) decisions during the gradual complication of models. Perelom 13:25, 30 October 2007 (UTC)
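
As an illustration of the multilayered (iterative) procedure and the non-final decisions mentioned above, here is a rough sketch. It uses linear partial descriptions for brevity (quadratic terms are often added), and the parameter names such as freedom are mine; real implementations differ in many details:

```python
import numpy as np
from itertools import combinations

def fit_pair(u, v, y):
    """Fit one partial description y ≈ a0 + a1*u + a2*v on subsample A."""
    Z = np.column_stack([np.ones_like(u), u, v])
    coeffs, *_ = np.linalg.lstsq(Z, y, rcond=None)
    return coeffs

def predict_pair(coeffs, u, v):
    return np.column_stack([np.ones_like(u), u, v]) @ coeffs

def multilayer_gmdh(X_a, y_a, X_b, y_b, freedom=4, max_layers=5):
    """Multilayered (iterative) procedure: the outputs of the best 'freedom'
    partial models of each layer become the inputs of the next layer;
    the external criterion is always computed on subsample B."""
    feats_a, feats_b = list(X_a.T), list(X_b.T)
    best = np.inf
    for _ in range(max_layers):
        scored = []
        for i, j in combinations(range(len(feats_a)), 2):
            c = fit_pair(feats_a[i], feats_a[j], y_a)
            crit = float(np.mean((y_b - predict_pair(c, feats_b[i], feats_b[j])) ** 2))
            scored.append((crit,
                           predict_pair(c, feats_a[i], feats_a[j]),
                           predict_pair(c, feats_b[i], feats_b[j])))
        scored.sort(key=lambda s: s[0])
        if scored[0][0] >= best:     # decisions at each layer are not final:
            break                    # stop once the criterion stops improving
        best = scored[0][0]
        feats_a = [s[1] for s in scored[:freedom]]   # keep the 'freedom' best models
        feats_b = [s[2] for s in scored[:freedom]]
    return best
```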

Second, inductiveness is a property of multilayered GMDH only, i.e. a property of GMDH-type NNs. I can't see any inductiveness in the combinatorial algorithm, because its models are not 'gradually complicated'. Perhaps that is not good, but that is the way it works.

No, the Combinatorial algorithm is a purely 'inductive' sorting GMDH algorithm. By the way, it performs a full sorting of models of not only increasing but also decreasing complexity. Perelom 12:44, 30 October 2007 (UTC)
This point is interesting to discuss. A full combinatorial search can't be stopped at a certain level of complexity because of the numerous local minima - see the picture in the Combinatorial GMDH section of the article. So there is no sense in checking models in order of gradually complicated structure: you can't obtain the optimal model before you have checked all of them (in whatever order they are considered). But if you know how to stop the combinatorial search, then this knowledge is very interesting to me. BTW, I propose to use the GMDH mailing list for the discussion. Kosh21 21:27, 30 October 2007 (UTC)
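
For concreteness, the "full sorting" being discussed could be sketched as below (a minimal version of exhaustive structure search with an external criterion; the efficiency tricks of real implementations are omitted). The point about local minima is visible in the structure of the loop: the outer loop over subset size cannot safely be cut short after a non-improving level.

```python
import numpy as np
from itertools import combinations

def combinatorial_gmdh(X_a, y_a, X_b, y_b):
    """Full sorting: fit every non-empty subset of the candidate terms on
    subsample A and rank all 2**n - 1 structures by the external criterion
    (error on subsample B)."""
    n_terms = X_a.shape[1]
    best = (np.inf, None, None)
    for size in range(1, n_terms + 1):                # complexity level
        for cols in combinations(range(n_terms), size):
            coeffs, *_ = np.linalg.lstsq(X_a[:, list(cols)], y_a, rcond=None)
            err = float(np.mean((y_b - X_b[:, list(cols)] @ coeffs) ** 2))
            if err < best[0]:
                best = (err, cols, coeffs)
    return best   # only known after every structure has been checked
```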

External criterion?

Isn't that what is more commonly called a loss function? QVVERTYVS (hm?) 09:39, 29 July 2015 (UTC)

Algorithm and model

This article is missing descriptions of what the algorithm and the final model look like. The description

GMDH algorithms gradually increase the number of partial model components

could just as well describe a boosting algorithm; it wasn't until I read the Schmidhuber paper that I realized this algorithm is learning layered neural nets. QVVERTYVS (hm?) 10:15, 29 July 2015 (UTC)
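
For what it's worth, the "partial model components" in the multilayer algorithms are typically low-order polynomials of pairs of inputs, for example the quadratic form sketched below (the function name is mine, not code from the article or the paper):

```python
import numpy as np

def fit_quadratic_partial(xi, xj, y):
    """One 'unit' of a GMDH-type network: a quadratic partial description
    y ≈ a0 + a1*xi + a2*xj + a3*xi*xj + a4*xi**2 + a5*xj**2,
    fitted by least squares on the training subsample."""
    Z = np.column_stack([np.ones_like(xi), xi, xj, xi * xj, xi ** 2, xj ** 2])
    coeffs, *_ = np.linalg.lstsq(Z, y, rcond=None)
    return coeffs

# The layered character comes from feeding the outputs of the best such
# units back in as inputs of the next layer, so the final model is a
# composition of low-order polynomials rather than an additive ensemble
# of weak learners as in boosting.
```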