Vladimir Krasilenko, Alexander Nikolsky, Sergei Pavlov: THE ASSOCIATIVE 2D-MEMORIES BASED ON MATRIX-TENSOR EQUIVALENTAL MODELS

UDK 4.93:7.5

THE ASSOCIATIVE 2D-MEMORIES BASED ON MATRIX-TENSOR EQUIVALENTAL MODELS

Vladimir Krasilenko, Alexander Nikolsky, Sergei Pavlov

The new principles of development of associative memory (AM) and neural networks (NN) for 2D images, based on matrix-tensor equivalental models (MTEMs), are considered. The given models use adaptive-equivalental weighting of 2D images in order to increase memory capacity while storing highly correlated 2D images. Architectures of NN and AM are also suggested. The basic devices for implementations on the base of the matrix-tensor equivalental model are matrix-tensor equivalentors (MTE). The MTEMs for two-level and multilevel 2D images are shown. The results of modeling of recognition processes in such AM are given on the example of 2D-image recognition with the number of neurons of the order of thousands. Successful recognition of correlated images is achieved.

Key words: optical neural networks, associative memory, matrix-tensor equivalentor, equivalental model, matrix multilevel logic.

1. INTRODUCTION

The main perspective tendency of development of information-calculating systems and computer technologies is to make them intellectual and similar to human thinking and perception [].
Demonstrations of intelligence, such as associative recall from memory by fragments (features), classification of patterns, their recognition, training, adaptation to the situation (processed data, necessary precision, etc.), auto-regulation and auto-control, appear spontaneously in highly developed systems, both biological and technical. As models of intellectual systems in the fields of neurophysiology, cognitive psychology and artificial intelligence (AI), the following models of auto-processing nets are mainly used: connection models, neural net models, models with parallel distributed processing, and models with activity distribution []. Though there are differences of approach peculiar to the different fields of knowledge where such models are used, basic research reveals a number of important general principles besides spontaneous intellectual qualities. Among them are the principle of parallel information processing on all levels in intellectual systems and the concept of active (not passive) memory, in which the functions of storage and associative processing are distributed among the elements that form the system. The model of a neural net, or the neural model, is a connection model which imitates the biophysical processing of information in the nervous system as in an auto-processing system. The latter shows global system behavior (considered "intelligent") caused by simultaneous local interactions of its numerous elements []. The great interest in neural models forces a revaluation of fundamental theses in many fields of knowledge, including computer technique. Neural models, concepts and paradigms, with their new principles and wide parallelism of processing, are almost the only way forward in the creation of new nontraditional computer technique and active dynamic systems [3].
In science, neural models, as models of brain activity and cognitive processes, will most probably bring prospective results in neurophysiology, pathophysiology, neurology and psychiatry, and lead to the appearance of systems with increased computing resources and intelligent qualities for solving problems of medical informatics and diagnostics. Many failures on the way to artificial intelligence appeared in recent years because, firstly, the chosen computing techniques were not adequate to the important and complicated problems and, secondly, simple and imperfect neural models and nets were applied. Today, the active development of mathematical logic, especially matrix (multivalued, fuzzy, neural) logic [4-8], the accumulation of data about continual (analog) and evidently nonlinear functions of neurons [9-], the elaboration of neural net theory, neurobiology and neurocybernetics, adequate algebro-logical instruments for mathematical description and modeling [-5], and the development of optical technologies create the conditions for building technical systems adequate in resources and architecture to almost any problem of artificial intelligence. The integration of the data of neurophysiology, neuroanatomy, neurogenetics, neurochemistry and neurolinguistics, i.e. of neuroscience, will become a basis for the further development of neural intelligence and neural computers. Generalization of the data of neural science makes it possible to mark out a number of basic neurointellectual structures, among which associative memory for 2D images takes an important place. It makes it possible

to form a hypothesis about an object from a set of features or fragments of an image and to extract an integrated pattern from memory. The nervous system has a faculty for self-organization. Talking about the active character of recognition processes in associative memory and of the processes of training and synthesis of internal images, what is meant here is the possibility to choose, to throw off (decrease) or, on the contrary, to add (increase) separate features and fragments, and to change their weights while forming adequate internal images. In many neural models (with the exception of [5]) used for the optical realization of different associative devices and neural nets, only the carrier sets C_b = {-1; 1}^N and C_u = {0; 1}^N are used, in bipolar (b) and unipolar (u) coding respectively [,3]. Questions connected with increasing the capacity of neural nets and associative memory (AM), especially when storing large and greatly correlated images, have been addressed for a long time by many authors [6-8]. Methods and realizations of effective recognition of greatly correlated memory vectors, based on weighting the input images [7-8] and the weights of interconnections [5, 6], were proposed in [6-8]. But these propositions concerned only the coding of binary and one-dimensional images. The models proposed in [3-5] and called equivalental are more general and suitable for the representation of bipolar and unipolar signals, including multilevel ones. The connections, especially inhibitory ones, are described more naturally there. In such models the basic operation is the standard equivalence of vectors. The models are suitable for different methods of weighting. Considering their prospects, in this work we will show how to build associative (auto-associative and hetero-associative) memory for correlated 2-D images, including multilevel (gray-scale) images, on the basis of matrix-tensor models.

2. CONCEPTUAL BACKGROUND AND THEORY
2.1. Basic neuro-logical operations of normalized equivalence

For the mathematical description of neural net associative memory (NNAM), algebro-logical operations will be considered (equivalental algebra [3-5]), combining linear algebra and neural bio-logic (NBL) [4,6]. Neural bio-logic is an integration (gnoseologically developed and specified) of known logics: multivalued, hybrid, continuous, controlled continuous [], fuzzy etc. At the same time, the integrated operations of fuzzy logic are the operation of fuzzy negation, t-norms and s-norms, and they are dual to each other according to the general form of De Morgan's principle. Examples of t-norms are: logical multiplication (min), algebraic multiplication (a·b), bounded multiplication etc. Examples of s-norms are: logical sum (max), algebraic sum (a + b - a·b), bounded sum (1 ∧ (a + b)), contrast sum etc. []

The basic operations of NBL used in the equivalental models of NNAM [3-5] are the binary operations of equivalence and nonequivalence, which have a few variants. The variants of these equivalence operations on the carrier set C = [0, 1] are, respectively:

a ~1 b = max{min(a, b), min(1 - a, 1 - b)};
a ~2 b = a·b + (1 - a)·(1 - b);
a ~3 b = 1 - |a - b|.    (1)

They are shown in fig. 1 (a, b, c) respectively.

Figure 1 - Operations of equivalence

ISSN 1607-3274 "Radioelektronika. Informatyka. Upravlinnia" (Radio Electronics, Computer Science, Control)
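As a quick numeric check of the variants in (1), here is a small Python sketch; this is our illustration, not the authors' code, and the function names eq1, eq2, eq3 are ours:

```python
# Three variants of the equivalence operation on the carrier set C = [0, 1];
# a sketch reconstructed from eq. (1), with our own function names.

def eq1(a, b):
    # max-min variant: max{min(a, b), min(1 - a, 1 - b)}
    return max(min(a, b), min(1 - a, 1 - b))

def eq2(a, b):
    # algebraic variant: a*b + (1 - a)*(1 - b)
    return a * b + (1 - a) * (1 - b)

def eq3(a, b):
    # distance variant: 1 - |a - b|
    return 1 - abs(a - b)

# on {0, 1} all three variants coincide with logical equivalence (XNOR)
for f in (eq1, eq2, eq3):
    assert f(0, 0) == 1 and f(1, 1) == 1 and f(0, 1) == 0 and f(1, 0) == 0
```

On the boolean corners of the carrier set all three variants reduce to ordinary logical equivalence, which the final loop verifies; they differ only in the interior of [0, 1].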

Their negations are the first, second and third nonequivalence respectively. In the general case, for scalar variables a, b ∈ C = [A, B] (a continuous line segment), the signals themselves, their functions and the segment C can be brought to the segments [-D, D] (or [-1, 1]) in bipolar coding and [0, D] (or [0, 1]) in unipolar coding. Further, the carrier set C = [0, 1] and its variables will be considered. Besides, for easier transformations we can at first limit ourselves to the equivalence (nonequivalence) operation of the second type, namely (~) and (≁) respectively. Extending this basic operation (~) to the vector and matrix case of variables, and making a corresponding normalization for equal registration of the components of a vector or matrix, we get the normalized equivalence (~n) of two sets of variables A = [a_ij]_{m×n} and B = [b_ij]_{m×n}:

A ~n B = (1 / (m·n)) · Σ_i Σ_j (a_ij ~ b_ij),

and the normalized nonequivalence:

A ≁n B = (1 / (m·n)) · Σ_i Σ_j (a_ij ≁ b_ij) = 1 - (A ~n B),

as the integrated negation of the normalized equivalence.
If the excitation matrix (the input 2D image) is S_inp, and the weight matrix of connections of the (k, l)-th neuron-equivalentor with the input image is T^{k,l}, then namely this neuron-equivalentor will complete the operation (~n), because the signal on its output will be:

Ê_{kl} = S_inp ~n T^{k,l},  Ê_{kl} ∈ C = [0, 1].    (2)

The operation (≁n) will be completed by the dual neuron-nonequivalentor with the signal Ē_{kl} ∈ C on its output:

Ē_{kl} = 1 - Ê_{kl} = S_inp ≁n T^{k,l}.    (3)

Let us note that, with the help of the properties of the equivalence operations [5], the signals (direct and dual) on the output of an integrated (dual) neuron-equivalentor (nonequivalentor) can also be represented in such a way:

Ê_{kl} = S_inp ~n T^{k,l} = S̄_inp ~n T̄^{k,l} = S_inp ≁n T̄^{k,l} = S̄_inp ≁n T^{k,l},
Ē_{kl} = S_inp ≁n T^{k,l} = S̄_inp ≁n T̄^{k,l} = S_inp ~n T̄^{k,l} = S̄_inp ~n T^{k,l}.    (4)

Therefore, if such integrated neurons have dual inputs and dual outputs, then each of the latter can be defined in four ways, considering (2)-(4). And that, perhaps, provides a corresponding vitality of neural nets at the expense of such repeated backup. The normalized equivalence (~n) and nonequivalence (≁n) are more general, new complementary metrics in matrix space. In particular, (≁n) is a normalized metric distance d(A, B)/(m·n), and for A, B ∈ {0, 1}^N it turns into the normalized Hamming distance d_n(A, B) = d(A, B)/N. The variants of the operations of equivalence and nonequivalence depend on the different types of t-norm and s-norm operations used in them and on the integrated operations of intersection and union of fuzzy logic. Depending on the type, variants of equivalental algebra (EA) [5] arise as a new algebro-logical instrument for the creation of an equivalental theory of NNAM on the basis of matrix NBL.
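For concreteness, the normalized equivalence of two matrices and its dual can be sketched numerically, assuming the second-type scalar equivalence a ~ b = a·b + (1 - a)(1 - b); the names norm_eq and norm_noneq are ours, not the paper's:

```python
# Sketch of normalized equivalence of two m-by-n matrices and its dual,
# using the second-type equivalence a ~ b = a*b + (1 - a)*(1 - b).

def norm_eq(A, B):
    # average element-by-element equivalence over all m*n components
    n = len(A) * len(A[0])
    return sum(a * b + (1 - a) * (1 - b)
               for ra, rb in zip(A, B) for a, b in zip(ra, rb)) / n

def norm_noneq(A, B):
    # normalized nonequivalence is the negation of normalized equivalence
    return 1 - norm_eq(A, B)

S_inp = [[1, 0], [0, 1]]
T_kl = [[1, 0], [1, 1]]
# one mismatching pixel out of four: equivalence 3/4, which for binary
# images equals 1 minus the normalized Hamming distance
assert norm_eq(S_inp, T_kl) == 0.75
assert norm_noneq(S_inp, T_kl) == 0.25
```

For binary matrices the result reproduces the Hamming-distance relation stated in the text; for gray values in [0, 1] it gives the continuous analogue.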
2.2. Nonlinear transformations

The basic operation of NBL with variables a_ij from a matrix A = [a_ij]_{m×n} over the continuous normalized carrier set C = [0, 1] can be an operation of integrated nonlinear transformation ρ(a, α) with a coefficient α:

ρ'(a, α) = a^α;  ρ''(a, α) = (1 - a)^α;
ρ'''(a, α) = 1 - a^α;  ρ''''(a, α) = 1 - (1 - a)^α;  a ∈ [0, 1], α ∈ (0, ∞).

One should note that for α = 1 the operation ρ''' is in essence the negation operation of continuous and fuzzy logic, ρ'' is then also a negation operation, and ρ'''' coincides with ρ'. Let us introduce two more iterative equivalental operations of nonlinear transformation, defined in the next way:

γ(a, α) = max(a, ā) ~ max(a, ā) ~ … ~ max(a, ā)  ((α - 1) equivalence operations),
γ̄(a, α) = min(a, ā) ~ max(a, ā) ~ … ~ max(a, ā)  ((α - 1) equivalence operations).    (5)

From (5) it can be seen that the second operation is the negation of the first one and vice versa. Besides, for a > 0.5 the first and second functions become:

γ_>(a, α) = a ~ a ~ … ~ a,  γ̄_>(a, α) = ā ~ a ~ … ~ a,    (6)

and for a < 0.5 these functions γ_< and γ̄_< become:

γ_<(a, α) = ā ≁ a ≁ … ≁ a,  γ̄_<(a, α) = a ≁ a ≁ … ≁ a.    (7)

Hence it is obvious that for all a ∈ [0, 1], γ(a, α) ≥ 0.5 and γ̄(a, α) ≤ 0.5. By analogy let us determine these functions for the variable ā:

γ(ā, α) = γ(a, α),  γ̄(ā, α) = γ̄(a, α).    (8)

From (8) it follows that these functions are symmetric with respect to the variables a and ā. Weighting (equivalently) these functions with the variable itself, we get a nonlinear iterative transformation reducing the ratio between the signals a and ā. We call it the competitive nonlinear transformation and define it, taking into account the properties of the operations, as:

a^α_kn = a ~ γ(a, α) = a ~ γ(ā, α),
ā^α_kn = ā ~ γ(a, α) = ā ~ γ(ā, α).    (9)

Expression (9) can be written in a more convenient way, taking (6) and (7) into consideration:

a^α_kn = γ(a, α + 1) for a > 0.5,  a^α_kn = γ̄(a, α + 1) for a < 0.5;
ā^α_kn = γ̄(a, α + 1) for a > 0.5,  ā^α_kn = γ(a, α + 1) for a < 0.5;

or, in another way:

a^α_kn = a ~ a ~ … ~ a  (α times) for a > 0.5,  a^α_kn = a ≁ a ≁ … ≁ a  (α times) for a < 0.5;
ā^α_kn = ā ~ a ~ … ~ a  (α times) for a > 0.5,  ā^α_kn = ā ≁ a ≁ … ≁ a  (α times) for a < 0.5.    (10)

It can be seen from (10) that under the competitive nonlinear transformation any variable a or ā in the range > 0.5 is summed equivalently with itself α times, and a variable in the range < 0.5 is summed nonequivalently with itself α times. Hence we can see that different nonlinear transformations can also be expressed by operations of equivalence (nonequivalence), or by t-norms in the case of the ρ_i(a, α) transformations.

2.3. Matrix-tensor equivalental models (MTEM) of NNAM

For simplification, let us at first consider NNAM for associative recognition and reading of two-level (binary) 2-D images.
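A small numeric sketch of our reading of the competitive nonlinear transformation (both the names and the exact iteration scheme are our reconstruction, not the authors' definition): iterated self-equivalence keeps γ in the upper half-range, and the competitive output is pulled toward the 0.5 midpoint, reducing the ratio between a and its negation as described in the text:

```python
# Our reconstruction (not the authors' code) of the competitive nonlinear
# transformation a_kn = a ~ gamma(a, alpha), built on the second-type
# equivalence a ~ b = a*b + (1 - a)*(1 - b).

def eq2(a, b):
    return a * b + (1 - a) * (1 - b)

def gamma(a, alpha):
    # iterated self-equivalence of max(a, 1 - a): (alpha - 1) operations
    m = max(a, 1 - a)
    r = m
    for _ in range(alpha - 1):
        r = eq2(r, m)
    return r

def a_kn(a, alpha):
    # competitive transformation of the variable a
    return eq2(a, gamma(a, alpha))

for i in range(11):
    a = i / 10
    assert gamma(a, 3) >= 0.5              # gamma stays in the upper range
for i in range(5, 11):
    a = i / 10
    assert 0.5 <= a_kn(a, 3) <= a          # output is pulled toward 0.5
```

The asserted properties match the range claims made around (8)-(10); the precise bracketing of the iterated operations in the paper may differ from this sketch.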
Let the input, output and q-th trained images be, respectively, S_inp = [S_inp,ij] ∈ {0, 1}^N, S_out = [S_out,ij] ∈ {0, 1}^N and S^q = [S^q_ij] ∈ {0, 1}^N (N = m×n), where q ∈ {1 … Q}. Then the tensor of weights of interconnections from the input neurons to the output neurons will be T ∈ [0, 1]^{N×N}, where for simple NNAM

T_{i,j,i',j'} = (1/Q) · Σ_{q=1}^{Q} (S^q_{i,j} ~ S^q_{i',j'}),

with 1 ≤ i ≤ m, 1 ≤ j ≤ n, 1 ≤ i' ≤ m, 1 ≤ j' ≤ n. Considering the determination of the normalized equivalence (~n), the components of the interconnection tensor can be assigned as:

T_{i,j,i',j'} = S_{i,j} ~n S_{i',j'},

where S_{i,j} = (S^1_{i,j}, …, S^q_{i,j}, …, S^Q_{i,j})^t, S_{i',j'} = (S^1_{i',j'}, …, S^Q_{i',j'})^t, and

T_{i,j,i',j'} ∈ {0, 1/Q, …, (Q - 1)/Q, 1}.    (11)

To determine the new state of the (k, l)-th output neuron at the moment (t + 1), it is necessary to consider the contributions of all weights, which means to calculate the normalized equivalence between S_inp(t) and the matrix ((k, l)-th tensor plane) T^{k,l} = [T^{k,l}_{ij}]_N, and to take its threshold function (activation function) [5]:

Θ[x] = 1, if x ≥ x_0;  Θ[x] = 0, if x < x_0.

The expression for net recalculation is:

S^{k,l}_out(t + 1) = Θ[S_inp(t) ~n T^{k,l}],    (12)

or, considering duality:

S^{k,l}_out(t + 1) = Θ[S̄_inp(t) ~n T̄^{k,l}]  and  S̄^{k,l}_out(t + 1) = Θ[S_inp(t) ≁n T^{k,l}].    (13)

Considering (11), expression (12) (let it be basic) can be transformed:

S^{k,l}_out(t + 1) = Θ[(1/Q) · Σ_{q=1}^{Q} (S^q_{k,l} ~ Ê^q)] = Θ[S_{k,l} ~n Ê],

where Ê^q = S_inp(t) ~n S^q and Ê = (Ê^1, …, Ê^q, …, Ê^Q)^t. Combining all (k, l)-th outputs into the matrix S_out(t + 1), one can write an expression for recalculation of the equivalental model of simple NNAM, extending the threshold scalar operator
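One recall step of the simple model can be sketched numerically as follows; this is our own minimal illustration (assuming the second-type equivalence and a 0.5 activation threshold), not the authors' program:

```python
# Minimal sketch of one recall step of a simple equivalental NNAM:
# compare the input with each trained image by normalized equivalence,
# then threshold the equivalently weighted contribution of the patterns.

def eq2(a, b):
    # second-type equivalence on [0, 1]
    return a * b + (1 - a) * (1 - b)

def norm_eq(A, B):
    # normalized equivalence of two equally sized matrices
    n = len(A) * len(A[0])
    return sum(eq2(a, b) for ra, rb in zip(A, B) for a, b in zip(ra, rb)) / n

def recall_step(S_inp, patterns, x0=0.5):
    E = [norm_eq(S_inp, Sq) for Sq in patterns]   # equivalence measures
    m, n, Q = len(S_inp), len(S_inp[0]), len(patterns)
    return [[1 if sum(eq2(patterns[q][i][j], E[q]) for q in range(Q)) / Q >= x0
             else 0 for j in range(n)] for i in range(m)]

P0 = [[1, 1, 1], [0, 0, 0], [1, 1, 1]]
P1 = [[1, 0, 1], [1, 0, 1], [1, 0, 1]]
noisy = [[1, 1, 1], [0, 1, 0], [1, 1, 1]]   # P0 with one flipped pixel
assert recall_step(noisy, [P0, P1]) == P0   # the distorted pattern is restored
```

Even with two correlated stored patterns, one step restores the distorted input to the nearest stored image; the adaptive weighting of section 2.4 is meant to extend exactly this behavior to much more strongly correlated sets.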

Θ(x) to the matrix element-by-element operator Θ[·]:

S_out(t + 1) = Θ[net] = [Θ_{kl}[S_{k,l} ~n Ê]]_{m×n}.    (14)

2.4. MTEM of adaptive-equivalental NNAM

To improve the properties of the models, namely of MTEM of NNAM during recognition of greatly correlated patterns, we introduce two weighting coefficients α^p_{k,l} and γ(Ê^q, p) (in essence an α matrix and an Ê^p_kn vector) and modify training so that the weights of interconnections are calculated in the next way:

T^{αγ}_{i,j,k,l} = (1/Q) · Σ_{q=1}^{Q} (T^q_{i,j,k,l} ~ F_{k,l}(α^p_{k,l}, γ(Ê^q, p))),    (15)

where F_{k,l} is an equivalental operation over the variables α^p and γ(p) of the p-th degree, built from the coefficients

α_{k,l} = ((1/Q) · Σ_q S^q_{k,l}) ~ 0.5  and  Ê^q = S_inp(t) ~n S^q.

The first coefficient takes into consideration the "equivalence" (similarity) of the input image with each of the stored pattern images. The second one accounts for the "equivalence" of the (k, l)-th pixels of the pattern images. The process of recalculation of MTEM of NNAM with such dual adaptive-equivalental weighting comes to matrix-tensor procedures with the operation of "equivalence" over the weighted input S^α_inp = S_inp ~ α^p and the weighted measures F^p_q (other types of equivalence operations are possible). The expression for recalculation of the state of one (k, l)-th neuron (not the whole matrix) in MTEM of NNAM with adaptive-equivalental weighting has a more evident form:

S^{k,l}_out(t + 1) = Θ[(1/Q) · Σ_{q=1}^{Q} (S^q_{k,l} ~ ((α^p ~n S_inp(t) ~n S^q))^p_kn)].    (16)

2.5. MTEM of NNAM with weighting for multilevel images

There are basically two types of still images: two-level and multilevel. A two-level image is often called a "black-and-white" image, whereas a multilevel image is usually called a gray-level image. If we represent each pixel of a multilevel image by means of a k-bit binary code word, then we can decompose this image into k images, each having only two levels. Each such two-level, or 1-bit, image is referred to as a bit plane.

Using different codes for the transformation of a multilevel image into digit-bit planes, we can thus complete an analog-digital transformation of images in any needed code: unit position code, unit normal code at morphological threshold decomposition, binary code at redundant coding, alternating code, Fibonacci codes and others. The devices which implement such transformation in parallel for all pixels, called analog-digital image converters, are considered in [, ]. For our MTEM of NNAM we used mainly two types of coding in the AD-transformations of multilevel images. In the first case we used morphological threshold decomposition with a programmed number of levels or bit planes and low (high) threshold levels. At that, the whole set of digit levels 0 … 255 of every gray-level image (or of every of the three main colors R, G, B) was transformed into a programmed (as a rule, 8 in our experiments) number of ordered bit planes. That led to actual compression of information, and a multilevel image having 256 levels was transformed into a multilevel image with a smaller number of levels. But the levels of the latter remained the same, and that is why the total dynamic range did not change. Applying the operation of pel-by-pel logic sum of the corresponding ordered bit planes, we form (of 8 planes) a result converted bit plane (two-level image), RCBP. Thus any input image and all trained multilevel images are represented by their RCBPs. The latter are used as the input image and the trained images correspondingly in NNAM for two-level images. In the second case we used an 8-digit ADC of picture type (virtual) for the transformation of multilevel images. For that purpose a corresponding subprogram was written. In hardware realization and in this program we use a new algorithm of ADC and a new mathematical model on the basis of neural logic, but we will not describe them here in detail (we will do that in another work).

One should note that all 8 bit planes (input, standard and output of the gray-level image) are processed in parallel (or consecutively, virtually) and used during recognition with the help of 8 independent (possibly dependent) NNAMs for two-level, two-gradation images. Hence, MTEM of NNAM with AE weighting for multilevel images in the first case of ADC does not differ from the one described in section 2.4, since an RCBP, similar in shape but specially prepared, was used instead of ordinary 2D two-level images. For the second case the model consists of several two-level models, i.e. it is a superposition, a combination of digit-by-digit bit-plane models.

3. SYSTEM DESIGN AND PROPOSED IMPLEMENTATIONS

3.1. Systems of NNAM

Fig. 2 shows the structure scheme of a neural net associative memory for multilevel 2D images with the first variant of transformation of the input and trained multi-gradation (colour) 2D images into a result bit plane. An analog-digital converter (ADC) of picture type (PT) and digit-analog converters (DAC) of picture type perform the necessary transformations of multilevel images into a set of bit planes and vice versa. Schemes of coding and decoding of picture type (C of PT and DC of PT) complete the transformation of the set of bit planes S^0_out … S^7_out into one result two-level image RCBP_inp, and the inversion of the output RCBP_out into a set of bit planes S^0_out … S^7_out, respectively. The trained multilevel images S^q, being input into memory with the help of the ADC of PT and C of PT, are transformed into RCBP^q. That is why in the ordinary addressed memory of page type only two-level (binary) images are stored. For the second variant of coding, according to section 2.5, the structure scheme will differ from the one in fig. 2 only in that it does not have the coder and decoder, and the number of blocks of NNAM and blocks of memory of trained images (MTI) will be equal to the number of bit planes. Every i-th bit plane S^i_inp of the input image and S^i_1 … S^i_q … S^i_Q of the trained images is processed on the i-th NNAM_i, which forms the i-th bit plane S^i_out of the output image on its output. All NNAM_0 … NNAM_7 can work independently and jointly. In the latter case the internal result measure of equivalence can be a certain complicated function (of vector normalized equivalence type) of the particular i-th measures of equivalence of the i-th bit planes. Taking into consideration the limitations connected with the size of the article, let us concentrate on the realization of basic associative memory with two-level 2-D images.

3.2. Implementations

As can be seen from section 2, for simultaneous parallel calculation of all components of the Ê vector or Ê^α vector, net vector, vector-matrix and matrix-tensor procedures with the operation of equivalence are necessary to form result sets of values proportional to equivalence.
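The k-bit decomposition into bit planes described above can be sketched as follows; this is our illustration (function names are ours), not the authors' picture-type converter:

```python
# Sketch of bit-plane decomposition: an 8-bit gray-level image is split
# into 8 two-level images (bit planes) and exactly reassembled.

def to_bit_planes(img, bits=8):
    # plane k holds bit k of every pixel (k = 0 is the least significant)
    return [[[(p >> k) & 1 for p in row] for row in img] for k in range(bits)]

def from_bit_planes(planes):
    # inverse transformation: weighted sum of the bit planes
    m, n = len(planes[0]), len(planes[0][0])
    return [[sum(planes[k][i][j] << k for k in range(len(planes)))
             for j in range(n)] for i in range(m)]

img = [[0, 255], [128, 37]]
planes = to_bit_planes(img)
assert from_bit_planes(planes) == img
assert planes[7] == [[0, 1], [1, 0]]   # most significant bit plane
```

Each of the 8 planes is a two-level image suitable for the binary NNAM; morphological threshold decomposition or the other codes named in the text would replace the plain binary split used here.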
Besides, for the simultaneous calculation of all coefficients α_ij, of all functions F^p_q, Ê^p_kn and α^p, matrixes of (2D-array) elements are required for component-by-component calculation of equivalence operations, nonlinear transformations, threshold processing, summation and so on. That is why the proposed MTEMs of NNAM are most easily represented on modern and progressive matrix architectures, multifunctional elements of matrix logic [4,7,3,] and processors of picture type [3, 4, 5]. One should mention that the normalization may be omitted; it is then necessary to change the threshold of the activation function. The device which performs the operation over the S matrix and T tensor, namely S ⊛ T, will be called a matrix-tensor equivalentor (MTE). According to the models elaborated in section 2 (MTEMs of NNAM), the basic structure element for them will be just these MTEs. If the second-type equivalency (~) is used, then an MTE can be built on ordinary digital matrix-tensor multipliers, including optical ones, as:

S ⊛ T = S × T + S̄ × T̄.    (17)

Fig. 3 shows the architecture of NNAM on the basis of two matrix-tensor equivalentors, MTE1 and MTE2. The first nonlinear converter, NC1 of matrix type, is used for the competitive nonlinear transformation (expression (9), section 2.2), and the second one, NC2 of PT, is used for threshold processing according to the activation function. The input commutator-multiplexer M and the output commutator-demultiplexer D are intended to input, output and form the back propagation of iterative recalculations.

Figure 2 - Schematic block diagram of associative memory for multilevel 2D-images
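Under the second-type equivalence, the identity behind this construction (an equivalentor assembled from two ordinary multipliers plus contrast inversion) can be checked numerically; the names below are ours:

```python
# Equivalentor from two multipliers: for a ~ b = a*b + (1 - a)*(1 - b),
# the summed equivalence equals the direct inner product plus the inner
# product of the contrast-inverted operands.

def dot(A, B):
    return sum(a * b for ra, rb in zip(A, B) for a, b in zip(ra, rb))

def inv(A):
    # contrast inversion of a matrix on [0, 1]
    return [[1 - a for a in row] for row in A]

def mte(S, T):
    # matrix-tensor equivalentor output, normalized by the number of pixels
    N = len(S) * len(S[0])
    return (dot(S, T) + dot(inv(S), inv(T))) / N

S = [[1, 0], [1, 1]]
T = [[1, 1], [0, 1]]
assert mte(S, T) == 0.5   # two matching pixels out of four
```

This is why the optical scheme of fig. 4 records both the images and their contrast-inverted copies: two multiplications and a sum realize the equivalence measure.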

Figure 3 - Block diagram of NNAM

Figure 4 - Optical system of MTE (a) and inputting of images into the SLMs (b)

In fig. 4 one of the possible variants of the optical realization of MTE is shown. The input image S_inp is recorded from the managing computer into the first spatial light modulator (SLM1) Q times, i.e. it is multiplied. This multiplication is carried out by the computer during recording into SLM1. The contrast-inverted input image S̄_inp is likewise recorded into SLM3 Q times. On the second modulator, SLM2, all Q trained images (S^1, … S^Q) are recorded, and on the fourth, SLM4, all Q trained contrast-inverted images (S̄^1, … S̄^Q). The lens array LA serves for spatial integration within each q-zone; thus, at the input of every q-th photodetector of the array PDA the signal will be proportional to the normalized equivalence of the input image with the trained image S^q. The nonlinear transformations of the blocks NC1 and NC2 shown in the circuit of fig. 3 are carried out over the PDA output signals in the computer. Instead of the PDA it is possible to use a commercial liquid crystal television (LCTV) with 5x5 pixels and 3 ms refresh rates [6]. In this case it is possible to work with 2D images of dimension 3x3 pixels and x teaching 2D images, since on every LCTV it is necessary to record all Q images of dimension m×n. And that puts the restriction m×n < (5×5)/Q at a given Q, or the restriction Q < (5×5)/(m×n) at a given image format m×n. Let us estimate the productivity of such a NNAM on the basis of the optical realization of MTE. The recording into all SLMs (LCTV) (fig. 4)
and the updating of the data in each iteration is made simultaneously. Nonlinear processing simulating the work of NC1 and NC2 (fig. 3) and the recording of the processed data into the SLMs can be combined. Let us consider, therefore, that the system executes one iterative recalculation of the network in 50 ms. At a dimension of the input image matrix of 32x32 pixels, i.e. about 10^3 elements of a vector (image), the matrix of weight coefficients will have about 10^6 components, or connections. It means that the system in 50 ms calculated in fact about 10^6 connections. Thus, the ratings show that the productivity of a NNAM realized on the simplest traditional circuits (fig. 4) and on the basis of a slowly working LCTV reaches about 10^7 connections/s. There is a quite real opportunity to increase the capacity of SLM up to 5x5 (9 line pairs/mm) pixel resolution and 5 ms response time [7]. In this
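The back-of-envelope estimate above can be reproduced with round illustrative numbers of our own choosing (a 32×32 input image and 50 ms per network recalculation; the exact figures in the text are assumptions here):

```python
# Back-of-envelope productivity estimate in the spirit of the text, with
# assumed (not original) numbers: 32x32 input, 50 ms per iteration.
m = n = 32
neurons = m * n                          # about 10**3 elements in the vector
connections = neurons ** 2               # about 10**6 weight components
t_iter_ms = 50                           # assumed time of one recalculation
rate = connections * 1000 // t_iter_ms   # connections processed per second
assert rate == 20_971_520                # on the order of 10**7 connections/s
```

The estimate scales quadratically with the linear image size, which is why the higher-resolution SLMs mentioned next raise the projected rate by several orders of magnitude.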

case, at a dimension of the D-image equal to x pixels, already 8 connections are calculated in 5 ms. Therefore the ratings show the productivity of the NNAM at a level of 66 connections/s. The common time of reading from the associative memory does not exceed ( -3)τiter. The usage of other, more high-speed optoelectronic matrices [3-5], calculating the necessary operation of equivalence and the matrix-tensor procedures, will allow to reduce significantly the time of reference to the NNAM. In this work we will not concentrate on realizations in detail, as our purpose is to show the new, most general principles of their realization.

4. RESULTS OF MODELING

For modeling we developed a program realization of the MTEM models. The algorithm is realized in a program product on a low-class PC. Technical characteristics of this product:
- any quantity of segments (elements, neurons) of a processed image, with dimensions X x Y up to 3 neurons;
- time of recognition of an image: from a part of a second to a second;
- number of iterations (steps) necessary for recognition of an image: up to 5;
- the training period depends only on the time of data-base insertion and is significantly shorter than that of other well-known neural-net paradigms;
- the correlation between the quantity of sample vectors and the number of neurons in the network (network capacity) is ,5 times more (3%) than that in Hopfield networks (4-5%);
- the level of distortion of an image which still permits its authentic recognition is up to 3%;
- the number of gradations of images under processing is 56 in each color, as well as black-and-white images;
- the quantity of digit layers in the process of image coding: 8 bit layers;
- a possibility of space-invariant recognition of images.
The mathematical models used are equivalental models. The metric system for comparison of images is non-Euclidean. A possibility can be foreseen of an adaptive regulation of the network for renewing and extending the data base and for the optimization of criterion and metric parameters. The results of modeling are shown in fig. 5-8.

Figure 5 - Output of neurons of the open layer for different p

5. CONCLUSION

The suggested matrix-tensor equivalental models (MTEMs) for construction on their basis of neural-net associative memory of two-level and multilevel D-images, comprising models with doubled weighting, are of great importance, since they provide the opportunity to implement a whole number of competitive NNAP systems with increased capacity, productivity and speed. The results of modeling and experiment prove the validity of the theoretical work. Since the suggested MTEMs are realized on the base of matrix-tensor (vector-matrix) procedures and multipliers and equivalentors, it is possible to make a conclusion about the necessity and possibility of construction of complete digital optical NN associative memory and optical pattern recognition systems.
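The recall principle behind these equivalental models can be sketched in software. The Python sketch below is an assumption-laden illustration, not the authors' optical implementation: stored two-level patterns are compared with a probe by a normalized equivalence measure (the fraction of coinciding components, a non-Euclidean similarity in the spirit of the equivalental models), and a simple nonlinear competition between the resulting activations, iterated a few times, selects the dominant pattern. The function names and the power-law sharpening are hypothetical.

```python
import numpy as np

def equivalence(x, y):
    # Normalized equivalence measure: fraction of components where x and y agree.
    return np.mean(x == y)

def recall(patterns, probe, beta=3.0, iters=10):
    """Associative recall: compute the equivalence of the probe with every
    stored pattern, then sharpen the competition (a software stand-in for the
    nonlinear processing of the neuron layers) until one pattern dominates."""
    P = np.asarray(patterns)
    q = np.array([equivalence(p, probe) for p in P])  # equivalence "activations"
    for _ in range(iters):
        q = q ** beta          # nonlinear competition (sharpening)
        q = q / q.sum()        # renormalize activations
    return P[np.argmax(q)]

# Three stored 3x3 two-level "images", flattened to bipolar vectors.
A = np.array([1, 1, 1, -1, -1, -1, 1, 1, 1])
B = np.array([1, -1, 1, 1, -1, 1, 1, -1, 1])
C = np.array([-1, -1, -1, 1, 1, 1, -1, -1, -1])

noisy = A.copy()
noisy[0] = -noisy[0]           # distort one of nine pixels (~11%)
restored = recall([A, B, C], noisy)
```

With the distorted probe above the competition settles on pattern A, which is in line with the distortion tolerance reported for the model.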

Vladimir Krasilenko, Alexander Nikolsky, Sergei Pavlov: THE ASSOCIATIVE D-MEMORIES BASED ON MATRIX-TENSOR EQUIVALENTAL MODELS

Figure 6 - Two-level D-images recognition results
Figure 7 - Multilevel D-images recognition on the base of RCBP result
Figure 8 - Multilevel D-images invariant recognition results

6. REFERENCES

1. N.M. Amosov et al. Neurostructures and Intelligent Robots. Naukova Dumka, Kiev, 1991.
2. D.A. Redgia, G.G. Satton. "Autoprocessing Nets and Their Significance for Biomedical Researches". TIIER, vol. 76, 6, pp. 46-59, 1988.
3. James A. Freeman, D.M. Skapura. Neural Networks: Algorithms, Applications and Programming Techniques. Addison-Wesley Publishing Company, 1991.
4. V.G. Krasilenko, O.K. Kolesnitsky, A.K. Boguhvalsky. "Creation Opportunities of Optoelectronic Continuous Logic Neural Elements, Which are Universal Circuitry Macrobasis of Optical Neural Networks". Proc. SPIE, Vol. 7, pp. 8-7, 1995.
5. V.I. Levin. "Continuous Logic, Its Generalization and Application". Automatica and Telemechanica, 8, pp. 3-, 199.
6. V.G. Krasilenko et al. "Lines of Optoelectronic Neural Elements with Optical Inputs/Outputs Based on BISPIN-devices for Optical Neural Networks". Proc. SPIE, Vol. 7, pp. -7, 1995.
7. V.G. Krasilenko, A.T. Magas. "Fundamentals of Design of Multifunctional Devices of Matrix Multiciphered Logic with Fast Programmed Adjusting". Measuring and Computer Technique in Technological Processes, 4, pp. 3-, 1999.
8. A.S. Abdul Awwal, Khan M. Iftekharuddin. "Computer Arithmetic for Optical Computing. Special Section". Optical Eng., Vol. 38, 3, 1999.
9. E.N. Sokolov, G.G. Vaytkyavichus. Neurointelligence: from Neuron to Neurocomputer. M.: Nauka, 1989.
10. N.V. Pozin. Modeling of Neural Structures. M.: Nauka, 1970.
11. Yu.G. Antomonov. Principles of Neurodynamics. Naukova

Dumka, Kiev, 1974.
12. L.N. Volgin. "Complementary Algebra and Relative Models of Neural Structures with Coding of Channel Numbers". Electrical Modeling, Vol. 6, 3, pp. 5-5, 1994.
13. V.G. Krasilenko, A.K. Bogakhvalskiy, A.T. Magas. "Equivalental Models of Neural Networks and Their Effective Optoelectronic Implementations Based on Matrix Multivalued Elements". Proc. SPIE, Vol. 355, pp. 7-36, 1996.
14. V.G. Krasilenko et al. "Applications of Nonlinear Correlation Functions and Equivalence Models in Advanced Neuronets". Proc. SPIE, Vol. 337, pp. -, 1997.
15. V.G. Krasilenko et al. "Continuous Logic Equivalental Models of Hamming Network Architectures with Adaptive-Correlated Weighting". Proc. SPIE, Vol. 34, pp. 398-48, 1997.
16. Guo Doughui, Chen Zhenxiang, Lui Ruitaugh, Wu Boxi. "A Weighted Bipolar Neural Network with High Capacity of Stable Storage". Proc. SPIE, Vol. 3, pp. 68-68.
17. B. Kiselyov, N. Kulakov, A. Mikaelian, V. Shkitin. "Optical Associative Memory for High-order Correlation Patterns". Opt. Eng., Vol. 3, 4, pp. 7-767.
18. A.L. Mikaelian. "Holographic Memory: Problems and Prospective Applications". Proc. SPIE, Vol. 34, pp. -, 1997.
19. A.L. Mikaelian (Editor). Optical Memory and Neural Networks. Proc. SPIE, Vol. 34, 1997.
20. V.G. Krasilenko et al. "Gnoseological Approach to Search of Most General Functional Model of Neuron". Collected research works according to the results of the 7th STC "Measuring and Computer Technique in Technological Processes", Khmelnitsky, MCTTP, pp. 3-7.
21. O.K. Kolesnitskiy, V.G. Krasilenko. "Analog-Digit Transformers of Picture Type for Digit Optoelectronic Processors (Review)". Autometry, pp. 6-9, 199.
22. O.K. Kolesnitskiy, V.G. Krasilenko. "Analog-to-Digital Image Converters for Parallel Digital Optoelectronic Processors". Pattern Recognition and Image Analysis, Vol. , pp. 7-33, 199.
23. V.G. Krasilenko et al. "Digital Optoelectronic Processor of Multilevel Images". Ellectronnoe Modelirovanie, Vol. 5, 3, pp. 3-8, 1993.
24. Patent 78679 (SU). Device for Multiplication of Square Matrix of Picture-Image. V.G. Krasilenko et al. Publ. at BI 46.
25. A. Huang. "About Architecture of Optical Digit Computing Machine". TIIER, Vol. 7, 7, pp. 34-4, 1984.

A method for optimizing the microcommand addressing circuits of a compositional control unit is proposed. The method is based on encoding the transition numbers of the hard-logic automaton. It allows reducing the number of LSI outputs in the addressing circuits. The method is illustrated by an example.
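The reduction claimed in the abstract can be quantified with a small arithmetic sketch. All figures below are hypothetical, since the article's own worked example does not survive in this source: encoding each of H distinct transitions with a binary number lets the addressing matrix drive only ceil(log2 H) outputs instead of the full next-address-plus-flags bundle.

```python
from math import ceil, log2

def lsi_outputs_direct(addr_bits: int, flag_bits: int) -> int:
    # Direct addressing: the LSI matrix must drive the full next-microinstruction
    # address plus the control flags.
    return addr_bits + flag_bits

def lsi_outputs_encoded(num_transitions: int) -> int:
    # Encoded addressing: the matrix drives only a transition-number code;
    # a downstream decoder expands it into the real address and flags.
    return ceil(log2(num_transitions))

# Hypothetical control unit: 6-bit microinstruction addresses, 2 flags,
# 6 distinct operator transitions.
direct = lsi_outputs_direct(addr_bits=6, flag_bits=2)
encoded = lsi_outputs_encoded(num_transitions=6)
saving = direct - encoded
```

The saving grows with the address width, since the number of distinct transitions is usually much smaller than the address space.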



The problem of combining an ensemble of artificial neural networks that process multivariate signals presented as discrete time series is considered. An algorithm for combining the networks on a finite sample is proposed and its optimality is proven. Recurrent procedures for real-time processing are developed. An algorithm for assessing the "contribution" of each member network to the combined estimate is proposed. The architecture of a neural meta-network for the solution of the considered problem is synthesized. The presented approach improves the precision of the solution in extrapolation, filtering, smoothing, emulation, reverse modeling and other similar problems that can be solved using artificial neural networks based on the supervised learning paradigm.
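The combining scheme described in this abstract can be illustrated with a minimal least-squares sketch. This is an assumption-laden stand-in, not the paper's algorithm (whose body is not recoverable from this source): the member "networks" are simulated as noisy estimators of a common target, and the combining weights are fitted on a finite sample by ordinary least squares.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy finite sample: a target time series and three "member network" estimates,
# each corrupted by its own noise (stand-ins for trained networks' outputs).
t = np.sin(np.linspace(0, 6, 200))
Y = np.column_stack([t + 0.3 * rng.standard_normal(200) for _ in range(3)])

# Optimal (least-squares) combining weights on the finite sample.
w, *_ = np.linalg.lstsq(Y, t, rcond=None)
combined = Y @ w

mse = lambda e: float(np.mean((e - t) ** 2))
member_mses = [mse(Y[:, i]) for i in range(Y.shape[1])]
```

In-sample, the least-squares combination is never worse than the best single member, since using any single member alone is a special case of the weighted combination. A real-time variant would update w with a recursive least-squares step on each new sample instead of re-solving the normal equations.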