Artificial Intelligence. McGraw-Hill, 1991 - 621 pages
Inside the book
Results 1-3 of 28
Page 496
... weights is. Let w be the weight vector (w0, w1, ..., wn), and let X be the subset of training instances misclassified by the current set of weights. Then define the perceptron criterion function, J(w), to be the sum of the ...
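The snippet is cut off before the sum is stated, but the perceptron criterion it names is conventionally J(w) = Σ (-w·x) over the misclassified set X (with each instance sign-normalized so that w·x ≤ 0 marks a mistake). A minimal sketch under that assumption; the function name and array layout are illustrative:

```python
import numpy as np

def perceptron_criterion(w, X_mis):
    """Perceptron criterion J(w): the sum of -w.x over the misclassified set.

    w     : weight vector (w0, w1, ..., wn), w0 being the bias weight
    X_mis : rows are misclassified training instances (augmented with x0 = 1),
            sign-normalized so that w.x <= 0 whenever the instance is wrong.
    J(w) is then nonnegative, and zero only when nothing is misclassified.
    """
    return float(np.sum(-X_mis @ w))
```

Because every row in X_mis satisfies w·x ≤ 0, each term -w·x is nonnegative, which is what makes J(w) a sensible quantity to minimize by gradient descent.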
Page 498
... weights, where the extra input x0 is always set to 1. 2. Initialize the weights (w0, w1, ..., wn) to random real values. 3. Iterate through the training set, collecting all examples misclassified by the current set of weights ...
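The numbered steps quoted above can be sketched as a batch perceptron trainer. The snippet breaks off before the update rule; the step assumed here is the standard gradient move on the perceptron criterion, w ← w + η Σ y·x over the misclassified examples. Function name, learning rate, and epoch cap are illustrative assumptions:

```python
import numpy as np

def train_perceptron(samples, labels, eta=1.0, max_epochs=100, seed=0):
    """Batch perceptron training sketch; labels must be -1 or +1.

    Step 1: augment each input with an extra component x0 = 1.
    Step 2: initialize the weights (w0, w1, ..., wn) to random real values.
    Step 3: iterate through the training set, collecting every example
            misclassified by the current weights, then update on the batch.
    """
    rng = np.random.default_rng(seed)
    X = np.hstack([np.ones((len(samples), 1)), np.asarray(samples, float)])
    y = np.asarray(labels, float)
    w = rng.uniform(-1.0, 1.0, X.shape[1])
    for _ in range(max_epochs):
        mis = y * (X @ w) <= 0          # misclassified under current weights
        if not mis.any():
            break                        # everything separated: converged
        w += eta * (y[mis, None] * X[mis]).sum(axis=0)
    return w
```

For linearly separable data the fixed-increment batch update is guaranteed to converge in a finite number of epochs; for non-separable data the loop simply stops at max_epochs.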
Page 504
... weights. The network adjusts its weights each time it sees an input-output pair. Each pair requires two stages: a forward pass and a backward pass. The forward pass involves presenting a sample input to the network and letting ...
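The two stages described can be sketched for a single-hidden-layer network with sigmoid units and squared error. This is a minimal illustration, not the book's exact algorithm; the layer shapes, learning rate, and function names are assumptions:

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def train_pair(w1, w2, x, target, eta=0.5):
    """Process one input-output pair: a forward pass, then a backward pass.

    w1 : hidden-layer weight matrix (n_hidden x n_inputs)
    w2 : output-layer weight matrix (n_outputs x n_hidden)
    Updates w1 and w2 in place and returns them with the network output.
    """
    # Forward pass: present the sample input and let activity flow forward.
    h = sigmoid(w1 @ x)                      # hidden activations
    y = sigmoid(w2 @ h)                      # output activations
    # Backward pass: propagate the output error back and adjust the weights.
    delta_out = (y - target) * y * (1 - y)   # error signal at the outputs
    delta_hid = (w2.T @ delta_out) * h * (1 - h)
    w2 -= eta * np.outer(delta_out, h)
    w1 -= eta * np.outer(delta_hid, x)
    return w1, w2, y
```

Calling train_pair once per training pair, over many sweeps of the training set, is the online form of backpropagation; each call performs exactly the two stages the passage names.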
Contents
5 | 24 |
Heuristic Search Techniques | 63 |
Knowledge Representation Issues | 105 |
Copyright | |
28 other sections not shown
Common terms and phrases
Abbott agents algorithm answer apply approach ARMEMPTY assertions attributes axioms backpropagation backtracking backward belief best-first search breadth-first search Caesar called Chapter chess clauses complete concept conceptual dependency consider constraints contains contradiction corresponding define depth-first depth-first search described discussed domain fact frame function game tree goal grammar graph heuristic Horn clauses important inference inheritance input instance interpretation isa links John justification knowledge base knowledge representation labeled learning Marcus match minimax move MYCIN natural language node object ON(B operators output parsing particular path perceptron perform players possible preconditions predicate logic problem problem-solving procedure produce PROLOG properties represent result robot rules script Section semantic semantic net sentence shown in Figure simple slot solution solve specific step structure Suppose syntactic task techniques theorem things tree truth maintenance system understanding variables version space