Forbus, K. D., Ferguson, R. W., & Gentner, D. (1994). Incremental structure-mapping. In Proceedings of the Cognitive Science Society, August, 1994.
Abstract

Many cognitive tasks involving analogy, such as understanding metaphors, problem solving, and learning, require the ability to extend mappings as new information is found. This paper describes a new version of SME, called I-SME, that operates incrementally. I-SME is inspired by Keane's IAM model and by the use of incremental mapping in Falkenhainer's PHINEAS learning system. We describe the I-SME algorithm and discuss tradeoffs introduced by incremental mapping, including parallel versus serial processing and pragmatic influences. The utility of I-SME is illustrated by two examples. First, we show that I-SME can account for the psychological results found by Keane on a serial version of the Holyoak & Thagard attribute-mapping task. Second, we describe how I-SME is used in the Minimal Analogical Reasoning System (MARS), which uses analogy to solve engineering thermodynamics problems.

1. Introduction

Many cognitive tasks involving analogy, including metaphor understanding, problem solving, and learning, require the ability to process information incrementally. In metaphor understanding, readers often build up correspondences across several sentences (Boronat & Gentner, in preparation). In problem solving, students using a worked example to solve a related novel problem may go back and forth between them, seeking additional ways to interpret the new problem in light of the old. In conceptual change, new data can lead to analogies being modified or abandoned. Burstein (1986) was the first to computationally model incremental processing in analogical learning. Falkenhainer's (1987, 1990) PHINEAS demonstrated that SME could be used in a map/analyze cycle to model the incremental use of analogy in discovering physical theories. The first general-purpose incremental analogical matcher was Keane's IAM (Keane & Brayshaw, 1988).
This paper presents I-SME, an incremental version of SME (Falkenhainer, Forbus, & Gentner, 1986; 1989) that simulates the computations proposed by structure-mapping theory (Gentner, 1983). I-SME has all the capabilities of SME, including the ability to generate novel candidate inferences. Unlike SME, I-SME can extend its existing interpretations when given new information about base or target. Section 2 discusses psychological and computational issues raised by incremental mapping. Section 3 describes the I-SME algorithm. Section 4 illustrates its operation on two kinds of tasks: (1) an attribute-mapping task that shows I-SME can explain order effects found by Keane (in press), and (2) modeling student behavior in solving thermodynamics problems by referring to worked examples. Section 5 discusses other issues and future work.

Incremental Structure-Mapping. To appear in the Proceedings of the Cognitive Science Society, August, 1994.

Given propositional representations for a base and target,

1. Local Match Construction: For each pair of expressions in base and target, if their predicates are identical, create a match hypothesis (MH) expressing the possibility that they match. Corresponding arguments of items participating in an MH are also matched if either (1) their predicates are identical, (2) the predicates are functions, or (3) they are entities. This network of match hypotheses provides the grist for subsequent processing.

2. Kernel Construction: Find kernel mappings by starting at MHs which are not themselves the argument of any other. For each such MHi, if it is structurally consistent, then it and every MH below it comprise a kernel. Otherwise, recurse on the MHs that are its arguments.

3. Structural Evaluation: Propagate scores through the network of MHs, using argument connections to pass activation downward.
This "trickle-down" of scores implements the systematicity preference. The score of a mapping is the sum of the scores of its MHs.

4. Merge: Construct up to n global interpretations of a match by using a greedy algorithm to find structurally consistent combinations of kernels that maximize structural evaluation scores.

Figure 1: The SME Algorithm

2. Issues in incremental mapping

Serial vs. Parallel: What parts of the matching process should be serial versus parallel? In SME (Figure 1), processing is essentially parallel within the first three stages: only the final step of constructing global mappings is serial. By contrast, IAM is serial throughout: even decisions about local matches are made sequentially, so that exploring alternate interpretations requires backtracking. SME avoids backtracking by creating, in parallel, networks representing all alternative local matches between items plus intermediate clusters (i.e., kernel mappings). We believe that a combination of initial parallel and later serial operation will best model human analogical processing. Some serial processing is essential: one cannot combine all information in parallel when some of it is not yet available. However, we believe the fully serial approach of IAM is unlikely to scale up to cognitively interesting representations. The natural place for serial processing is in the Merge step, because the kernels, which represent coherent, structurally consistent collections of local matches, form a more appropriate unit of analysis for limited-resource serial processing than individual local matches themselves. (Figure 2 shows the kernel mappings for a simple example.)

Figure 2: Kernels for the simple water/heat analogy example. Mappings are constructed by merging consistent collections of kernels. In this example, the structurally best interpretation merges kernels 1 and 2. The worst interpretation merges kernels 4 and 5.
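As a rough illustration of the staged design in Figure 1, the following Python sketch shows a trickle-down structural evaluation and a greedy, 1:1-respecting merge of kernels. This is not the original implementation; the data structures, scores, and the toy water/heat kernels below are simplifying assumptions made for illustration.

```python
# Illustrative sketch only: a "trickle-down" structural evaluation and the
# greedy Merge step over kernels.  All representations are assumptions.

def trickle_down(mh_tree, scores=None, base=1.0, rate=0.1):
    """Score a tree of match hypotheses, given as (name, children).
    Each MH gets a local score plus activation passed down from its
    parent, so MHs under deep shared structure score higher, which is
    one way to realize the systematicity preference."""
    if scores is None:
        scores = {}
    def visit(node, inherited):
        name, children = node
        scores[name] = scores.get(name, 0.0) + base + inherited
        for child in children:
            visit(child, (base + inherited) * rate)
    visit(mh_tree, 0.0)
    return scores

def greedy_merge(kernels):
    """Greedily combine kernels, given as (score, {base_item: target_item}),
    into one global mapping, skipping any kernel whose correspondences
    would violate the 1:1 constraint."""
    mapping, total = {}, 0.0
    for score, corr in sorted(kernels, key=lambda k: -k[0]):
        taken = set(mapping.values())
        consistent = all(
            mapping.get(b, t) == t and (t not in taken or mapping.get(b) == t)
            for b, t in corr.items())
        if consistent:
            mapping.update(corr)
            total += score
    return mapping, total

# Toy version of the water/heat example: the Diameter->Temperature kernel
# conflicts with Pressure->Temperature, so the merge skips it.
kernels = [
    (3.0, {"pressure": "temperature", "beaker": "coffee", "vial": "ice-cube"}),
    (2.0, {"beaker": "coffee", "vial": "ice-cube"}),
    (1.0, {"diameter": "temperature"}),
]
mapping, score = greedy_merge(kernels)
print(mapping, score)
```

Sorting kernels by score before merging is what makes the merge greedy: high-scoring (more systematic) kernels claim correspondences first, and conflicting low-scoring kernels are excluded.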
When base and target share large systematic structures, the number of kernels is small. Serial, capacity-limited merging of kernels could thus provide a plausible explanation for the "More is Less" phenomenon (Gentner & Rattermann, 1991), in which additional knowledge can improve both the rapidity and accuracy of mapping.

Pragmatics: Our original position (Gentner & Clement, 1988; Gentner, 1989) was that pragmatic concerns affected only the pre- and post-processing of the mapping subsystem, with mapping being entirely based on structural consistency and relational identity matches. Others, such as Holyoak (1985), claimed "The distinction [between surface and structural similarities] thus crucially depends on the goal of the problem solver." Both positions have become less extreme. Holyoak & Thagard (1989) have incorporated structural consistency into ACME, and Hummel & Holyoak (1992) have pointed out problems in using pragmatics to influence the online mapping process. On our side, we have adopted for some purposes a technique called pragmatic marking (Forbus & Oblinger, 1990), to filter kernels that cannot yield novel candidate inferences relevant to a goal. Incremental matching is subject to misleading matches when only partial information is available, and early poor decisions can mislead subsequent processing. The use of this limited form of pragmatic information during mapping can help overcome these problems.

1 The fact that most SME implementations happen to run on serial machines is irrelevant here. ACME (Holyoak & Thagard, 1989) is also parallel in conception but is commonly implemented on serial machines, and the extraction of a mapping from its network is a serial process outside the simulation. Hummel & Holyoak (1992) explored incremental mapping, but their scheme does not handle higher-order relations, making it unsuitable for problem solving.
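Pragmatic marking, as described above, filters kernels before merging. The following Python sketch shows one way to picture it, under the simplifying assumption (ours, not the paper's) that each kernel carries the set of predicates appearing in its candidate inferences; the actual Forbus & Oblinger (1990) mechanism operates over full expression structures.

```python
# Hypothetical sketch of pragmatic marking: drop kernels whose candidate
# inferences involve no predicate relevant to the reasoner's current goal.
# The kernel representation here is an illustrative simplification.

def pragmatic_filter(kernels, goal_predicates):
    """Keep only kernels that could yield a goal-relevant candidate inference."""
    def relevant(kernel):
        return any(pred in goal_predicates
                   for pred in kernel["inference_predicates"])
    return [k for k in kernels if relevant(k)]

kernels = [
    {"id": 1, "inference_predicates": {"flow", "pressure"}},
    {"id": 2, "inference_predicates": {"flat-top"}},
]
kept = pragmatic_filter(kernels, goal_predicates={"flow"})
print([k["id"] for k in kept])
```

Because the filter runs before the merge step rather than inside local matching, the matcher retains the leeway to notice unanticipated structural correspondences, as the next paragraph argues.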
However, we continue to believe that in order to capture the generativity of human analogical reasoning, the mapping process needs to have enough leeway to detect unanticipated structural correspondences. The incremental algorithm in this paper cycles between a largely autonomous mapping process and intervention by other goal-oriented processes.

3. The Incremental SME Algorithm

The Incremental SME algorithm (I-SME) accepts information about base and target descriptions to compare. Information can be presented to it all at once, one fact at a time, or in any combination desired. (When I-SME is given every item in base and target at the same time, its results are identical to those of SME.) I-SME maintains a small set of global mappings that represent its best interpretations of how the base and target compare, according to the information it has received. Each ...

[Figure 2 (the kernels for the water/heat analogy, with legend: match hypothesis, base item, target item, argument relation) appears here.]

EXTEND: Given new base items B1..Bi and new target items T1..Tj, and previous global mappings M1..Mk (if any),

1. Extend the set of local match hypotheses (MHs) by testing B1..Bi against new and old target items, and by testing T1..Tj against old and new base items. New elements of the match hypothesis network that violate the parallel connectivity constraint are marked as inconsistent, and nogood links are introduced to enforce the 1:1 constraint in merge steps.

2. Update the set of kernels. Starting with new root MHs (those which are not arguments of other MHs), search downward in parallel until the first structurally consistent MH is found.
It and all of its descendants comprise a new kernel. New match hypotheses can introduce new root mappings or subsume old ones. Let K' be the set of new kernels.

3. Carry out structural evaluation on the new match hypotheses and K'. Filter K' via pragmatic constraints, if any.

4. Extend M1..Mk by merging in kernels from