In computing, memoization or memoisation is an optimization technique used primarily to speed up computer programs by storing the results of expensive function calls and returning the cached result when the same inputs occur again. Memoization has also been used in other contexts (and for purposes other than speed gains), such as in simple mutually recursive descent parsing. Although related to caching, memoization refers to a specific case of this optimization, distinguishing it from forms of caching such as buffering or page replacement. In the context of some logic programming languages, memoization is also known as tabling.


Etymology

The term "memoization" was coined by
Donald Michie Donald Michie (; 11 November 1923 – 7 July 2007) was a British researcher in artificial intelligence. During World War II, Michie worked for the Government Code and Cypher School at Bletchley Park, contributing to the effort to solve " Tunny ...
in 1968 and is derived from the
Latin Latin (, or , ) is a classical language belonging to the Italic branch of the Indo-European languages. Latin was originally a dialect spoken in the lower Tiber area (then known as Latium) around present-day Rome, but through the power of the ...
word "
memorandum A memorandum ( : memoranda; abbr: memo; from the Latin ''memorandum'', "(that) which is to be remembered") is a written message that is typically used in a professional setting. Commonly abbreviated "memo," these messages are usually brief and ...
" ("to be remembered"), usually truncated as "memo" in American English, and thus carries the meaning of "turning
he results of He or HE may refer to: Language * He (pronoun), an English pronoun * He (kana), the romanization of the Japanese kana へ * He (letter), the fifth letter of many Semitic alphabets * He (Cyrillic), a letter of the Cyrillic script called ''He'' in ...
a function into something to be remembered". While "memoization" might be confused with "
memorization Memorization is the process of committing something to memory. It is a mental process undertaken in order to store in memory for later recall visual, auditory, or tactical information. The scientific study of memory is part of cognitive neurosc ...
" (because they are etymological
cognate In historical linguistics, cognates or lexical cognates are sets of words in different languages that have been inherited in direct descent from an etymology, etymological ancestor in a proto-language, common parent language. Because language c ...
s), "memoization" has a specialized meaning in computing.


Overview

A memoized function "remembers" the results corresponding to some set of specific inputs. Subsequent calls with remembered inputs return the remembered result rather than recalculating it, thus eliminating the primary cost of a call with given parameters from all but the first call made to the function with those parameters. The set of remembered associations may be a fixed-size set controlled by a replacement algorithm or a fixed set, depending on the nature of the function and its use. A function can only be memoized if it is referentially transparent; that is, only if calling the function has exactly the same effect as replacing that function call with its return value. (Special case exceptions to this restriction exist, however.) While related to lookup tables, since memoization often uses such tables in its implementation, memoization populates its cache of results transparently on the fly, as needed, rather than in advance. Memoization is a way to lower a function's time cost in exchange for space cost; that is, memoized functions become optimized for speed in exchange for a higher use of computer memory space.
The time/space "cost" of algorithms has a specific name in computing: computational complexity. All functions have a computational complexity in time (i.e. they take time to execute) and in space. Although a space–time tradeoff occurs (i.e., space used is speed gained), this differs from some other optimizations that involve time–space trade-offs, such as strength reduction, in that memoization is a run-time rather than compile-time optimization. Moreover, strength reduction potentially replaces a costly operation such as multiplication with a less costly operation such as addition, and the resulting savings can be highly machine-dependent (non-portable across machines), whereas memoization is a more machine-independent, cross-platform strategy.

Consider the following pseudocode function to calculate the factorial of n:
    function factorial (n is a non-negative integer)
        if n is 0 then
            return 1   [by the convention that 0! = 1]
        else
            return factorial(n - 1) times n   [recursively invoke factorial with the parameter 1 less than n]
        end if
    end function
For every integer n such that n ≥ 0, the final result of the function factorial is invariant; if invoked as x = factorial(3), the result is such that x will always be assigned the value 6. The non-memoized implementation above, given the nature of the recursive algorithm involved, would require n + 1 invocations of factorial to arrive at a result, and each of these invocations, in turn, has an associated cost in the time it takes the function to return the value computed. Depending on the machine, this cost might be the sum of:

1. The cost to set up the functional call stack frame.
2. The cost to compare n to 0.
3. The cost to subtract 1 from n.
4. The cost to set up the recursive call stack frame. (As above.)
5. The cost to multiply the result of the recursive call to factorial by n.
6. The cost to store the return result so that it may be used by the calling context.

In a non-memoized implementation, every top-level call to factorial includes the cumulative cost of steps 2 through 6 proportional to the initial value of n. A memoized version of the factorial function follows:

    function factorial (n is a non-negative integer)
        if n is 0 then
            return 1   [by the convention that 0! = 1]
        else if n is in lookup-table then
            return lookup-table-value-for-n
        else
            let x = factorial(n - 1) times n   [recursively invoke factorial with the parameter 1 less than n]
            store x in lookup-table in the nth slot   [remember the result of n! for later]
            return x
        end if
    end function

In this particular example, if factorial is first invoked with 5, and then invoked later with any value less than or equal to five, those return values will also have been memoized, since factorial will have been called recursively with the values 5, 4, 3, 2, 1, and 0, and the return values for each of those will have been stored. If it is then called with a number greater than 5, such as 7, only 2 recursive calls will be made (7 and 6), and the value for 5! will have been stored from the previous call. In this way, memoization allows a function to become more time-efficient the more often it is called, thus resulting in eventual overall speed-up.
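The same idea can be written directly in a language with dictionaries. The following is a minimal Python sketch of the memoized factorial above; the cache name and the choice of a dict keyed by n are illustrative, not part of the pseudocode itself.

    _factorial_cache = {0: 1}   # by the convention that 0! = 1

    def factorial(n):
        if n not in _factorial_cache:
            # recursively invoke factorial with the parameter 1 less than n,
            # then remember the result of n! for later
            _factorial_cache[n] = factorial(n - 1) * n
        return _factorial_cache[n]

    print(factorial(5))   # 120; fills the cache for 0 through 5
    print(factorial(7))   # 5040; only 7 and 6 are newly computed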


Some other considerations


Functional programming

Memoization is heavily used in compilers for functional programming languages, which often use a call-by-name evaluation strategy. To avoid the overhead of calculating argument values, compilers for these languages heavily use auxiliary functions called thunks to compute the argument values, and memoize these functions to avoid repeated calculations.
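As a sketch of the idea (not a compiler implementation), a thunk can be modeled in Python as an object wrapping an unevaluated, zero-argument computation that memoizes its value so repeated forcing evaluates it only once; the class and attribute names below are illustrative.

    class Thunk:
        _UNSET = object()

        def __init__(self, compute):
            self._compute = compute          # zero-argument function producing the value
            self._value = Thunk._UNSET

        def force(self):
            if self._value is Thunk._UNSET:
                self._value = self._compute()   # evaluate once
                self._compute = None            # drop the closure; keep only the result
            return self._value

    # The expensive argument expression is evaluated at most once,
    # no matter how many times the callee forces it.
    arg = Thunk(lambda: sum(range(10_000_000)))
    print(arg.force())   # computed here
    print(arg.force())   # returned from the memo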


Automatic memoization

While memoization may be added to functions internally and explicitly by a computer programmer in much the same way the above memoized version of factorial is implemented, referentially transparent functions may also be automatically memoized externally. The techniques employed by Peter Norvig have application not only in Common Lisp (the language in which his paper demonstrated automatic memoization), but also in various other programming languages. Applications of automatic memoization have also been formally explored in the study of term rewriting and artificial intelligence.

In programming languages where functions are first-class objects (such as Lua, Python, or Perl), automatic memoization can be implemented by replacing (at run-time) a function with its calculated value once a value has been calculated for a given set of parameters. The function that does this value-for-function-object replacement can generically wrap any referentially transparent function. Consider the following pseudocode (where it is assumed that functions are first-class values):

    function memoized-call (F is a function object parameter)
        if F has no attached array values then
            allocate an associative array called values;
            attach values to F;
        end if;

        if F.values[arguments] is empty then
            F.values[arguments] = F(arguments);
        end if;

        return F.values[arguments];
    end function

In order to call an automatically memoized version of factorial using the above strategy, rather than calling factorial directly, code invokes memoized-call(factorial(n)). Each such call first checks to see if a holder array has been allocated to store results, and if not, attaches that array. If no entry exists at the position values[arguments] (where arguments are used as the key of the associative array), a real call is made to factorial with the supplied arguments. Finally, the entry in the array at the key position is returned to the caller.

The above strategy requires explicit wrapping at each call to a function that is to be memoized. In those languages that allow closures, memoization can be effected implicitly via a
functor factory that returns a wrapped memoized function object in a decorator pattern. In pseudocode, this can be expressed as follows:

    function construct-memoized-functor (F is a function object parameter)
        allocate a function object called memoized-version;

        let memoized-version(arguments) be
            if self has no attached array values then   [self is a reference to this object]
                allocate an associative array called values;
                attach values to self;
            end if;

            if self.values[arguments] is empty then
                self.values[arguments] = F(arguments);
            end if;

            return self.values[arguments];
        end let;

        return memoized-version;
    end function

Rather than call factorial, a new function object memfact is created as follows:

    memfact = construct-memoized-functor(factorial)

The above example assumes that the function factorial has already been defined before the call to construct-memoized-functor is made. From this point forward, memfact(n) is called whenever the factorial of n is desired. In languages such as Lua, more sophisticated techniques exist which allow a function to be replaced by a new function with the same name, which would permit:

    factorial = construct-memoized-functor(factorial)

Essentially, such techniques involve attaching the original function object to the created functor and forwarding calls to the original function being memoized via an alias when a call to the actual function is required (to avoid endless recursion), as illustrated below:

    function construct-memoized-functor (F is a function object parameter)
        allocate a function object called memoized-version;

        let memoized-version(arguments) be
            if self has no attached array values then   [self is a reference to this object]
                allocate an associative array called values;
                attach values to self;

                allocate a new function object called alias;
                attach alias to self;   [for later ability to invoke F indirectly]
                self.alias = F;
            end if;

            if self.values[arguments] is empty then
                self.values[arguments] = self.alias(arguments);   [not a direct call to F]
            end if;

            return self.values[arguments];
        end let;

        return memoized-version;
    end function

(Note: Some of the steps shown above may be implicitly managed by the implementation language and are provided for illustration.)
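In Python, the functor factory above corresponds to a closure (or decorator) that wraps the target function. The following is a minimal sketch under the assumption that the wrapped function's arguments are hashable; the helper name and the dict-based cache are illustrative choices.

    def construct_memoized_functor(f):
        values = {}                               # associative array attached to the wrapper

        def memoized_version(*arguments):
            if arguments not in values:
                values[arguments] = f(*arguments)    # the only direct call to the wrapped f
            return values[arguments]

        return memoized_version

    def factorial(n):
        return 1 if n == 0 else factorial(n - 1) * n

    memfact = construct_memoized_functor(factorial)   # explicit wrapping under a new name
    print(memfact(5))                                  # 120

    # Rebinding the original name, as in the Lua-style example above. Python
    # resolves the global name factorial inside the original body at call time,
    # so after rebinding the recursive calls also pass through the memoized
    # wrapper, while the wrapper keeps its own direct reference to the original
    # f and therefore does not recurse into itself endlessly.
    factorial = construct_memoized_functor(factorial)
    print(factorial(7))                                # 5040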


Parsers

When a top-down parser tries to parse an ambiguous input with respect to an ambiguous context-free grammar (CFG), it may need an exponential number of steps (with respect to the length of the input) to try all alternatives of the CFG in order to produce all possible parse trees. This eventually would require exponential memory space. Memoization was explored as a parsing strategy in 1991 by Peter Norvig, who demonstrated that an algorithm similar to the use of dynamic programming and state-sets in Earley's algorithm (1970), and tables in the CYK algorithm of Cocke, Younger and Kasami, could be generated by introducing automatic memoization to a simple backtracking recursive descent parser to solve the problem of exponential time complexity. The basic idea in Norvig's approach is that when a parser is applied to the input, the result is stored in a memotable for subsequent reuse if the same parser is ever reapplied to the same input.

Richard Frost also used memoization to reduce the exponential time complexity of parser combinators, which can be viewed as a "purely functional top-down backtracking" parsing technique. He showed that basic memoized parser combinators can be used as building blocks to construct complex parsers as executable specifications of CFGs. Memoization was again explored in the context of parsing in 1995 by Johnson and Dörre. In 2002, it was examined in considerable depth by Ford in the form called packrat parsing.

In 2007, Frost, Hafiz and Callaghan described a top-down parsing algorithm that uses memoization to avoid redundant computations and so accommodate any form of ambiguous CFG in polynomial
time (Θ(n^4) for left-recursive grammars and Θ(n^3) for non-left-recursive grammars). Their top-down parsing algorithm also requires polynomial space for potentially exponentially ambiguous parse trees by means of 'compact representation' and 'local ambiguities grouping'. Their compact representation is comparable with Tomita's compact representation of bottom-up parsing. Their use of memoization is not only limited to retrieving previously computed results when a parser is applied to the same input position repeatedly (which is essential for the polynomial time requirement); it is specialized to perform the following additional tasks:

* The memoization process (which could be viewed as a 'wrapper' around any parser execution) accommodates an ever-growing direct left-recursive parse by imposing depth restrictions with respect to input length and current input position.
* The algorithm's memo-table 'lookup' procedure also determines the reusability of a saved result by comparing the saved result's computational context with the parser's current context. This contextual comparison is the key to accommodating indirect (or hidden) left-recursion.
* When performing a successful lookup in a memotable, instead of returning the complete result-set, the process returns only references to the actual result, which eventually speeds up the overall computation.
* While updating the memotable, the memoization process groups the (potentially exponential) ambiguous results and ensures the polynomial space requirement.

Frost, Hafiz and Callaghan also described an implementation of the algorithm in PADL'08 as a set of higher-order functions (called parser combinators) in Haskell, which enables the construction of directly executable specifications of CFGs as language processors. The importance of their polynomial algorithm's power to accommodate 'any form of ambiguous CFG' with top-down parsing is vital with respect to syntax and semantics analysis during natural language processing. The X-SAIGA site has more about the algorithm and implementation details.

While Norvig increased the power of the parser through memoization, the augmented parser was still as time complex as Earley's algorithm, which demonstrates a case of the use of memoization for something other than speed optimization. Johnson and Dörre demonstrate another such non-speed related application of memoization: the use of memoization to delay linguistic constraint resolution to a point in a parse where sufficient information has been accumulated to resolve those constraints. By contrast, in the speed optimization application of memoization, Ford demonstrated that memoization could guarantee that
parsing expression grammars could parse in linear time even those languages that resulted in worst-case backtracking behavior.

Consider the following grammar:

    S → (A c) | (B d)
    A → X (a|b)
    B → X b
    X → x [X]

(Notation note: In the above example, the production S → (A c) | (B d) reads: "An S is either an A followed by a c or a B followed by a d." The production X → x [X] reads "An X is an x followed by an optional X.")

This grammar generates one of the following three variations of string: xac, xbc, or xbd (where x here is understood to mean one or more x's). Next, consider how this grammar, used as a parse specification, might effect a top-down, left-right parse of the string xxxxxbd:

    The rule A will recognize xxxxxb (by first descending into X to recognize one x, and again descending into X until all the x's are consumed, and then recognizing the b), and then return to S, and fail to recognize a c. The next clause of S will then descend into B, which in turn again descends into X and recognizes the x's by means of many recursive calls to X, and then a b, and returns to S and finally recognizes a d.

The key concept here is inherent in the phrase again descends into X. The process of looking forward, failing, backing up, and then retrying the next alternative is known in parsing as backtracking, and it is primarily backtracking that presents opportunities for memoization in parsing.

Consider a function RuleAcceptsSomeInput(Rule, Position, Input), where the parameters are as follows:

* Rule is the name of the rule under consideration.
* Position is the offset currently under consideration in the input.
* Input is the input under consideration.

Let the return value of the function RuleAcceptsSomeInput be the length of the input accepted by Rule, or 0 if that rule does not accept any input at that offset in the string. In a backtracking scenario with such memoization, the parsing process is as follows:

    When the rule A descends into X at offset 0, it memoizes the length 5 against that position and the rule X. After having failed at d, B then, rather than descending again into X, queries the position 0 against rule X in the memoization engine, and is returned a length of 5, thus saving having to actually descend again into X, and carries on as if it had descended into X as many times as before.

In the above example, one or many descents into X may occur, allowing for strings such as xxxxxxxxxxxxxxxxbd. In fact, there may be any number of x's before the b. While the call to S must recursively descend into X as many times as there are x's, B will never have to descend into X at all, since the return value of RuleAcceptsSomeInput(X, 0, xxxxxxxxxxxxxxxxbd) will be 16 (in this particular case).

Those parsers that make use of syntactic predicates are also able to memoize the results of predicate parses as well, thereby reducing such constructions as:

    S → (A)? A
    A → /* some rule */

to one descent into A.

If a parser builds a parse tree during a parse, it must memoize not only the length of the input that matches at some offset against a given rule, but also must store the sub-tree that is generated by that rule at that offset in the input, since subsequent calls to the rule by the parser will not actually descend and rebuild that tree. For the same reason, memoized parser algorithms that generate calls to external code (sometimes called a semantic action routine) when a rule matches must use some scheme to ensure that such rules are invoked in a predictable order. Since, for any given backtracking or syntactic-predicate-capable parser, not every grammar will need backtracking or predicate checks, the overhead of storing each rule's parse results against every offset in the input (and storing the parse tree if the parsing process does that implicitly) may actually slow down a parser. This effect can be mitigated by explicit selection of those rules the parser will memoize.
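The idea of memoizing on (rule, position) can be sketched in Python. The following is a minimal, illustrative recognizer (not a parse-tree builder) for the example grammar above; the rule functions return the number of characters consumed, or None on failure, and functools.lru_cache plays the role of the memotable.

    from functools import lru_cache

    def parse(text):
        @lru_cache(maxsize=None)        # memotable keyed on (rule name, position)
        def accepts(rule, pos):
            return rules[rule](pos)

        def X(pos):                     # X -> x [X]
            if pos < len(text) and text[pos] == 'x':
                rest = accepts('X', pos + 1)
                return 1 + (rest or 0)
            return None

        def A(pos):                     # A -> X (a|b)
            n = accepts('X', pos)
            if n is not None and pos + n < len(text) and text[pos + n] in 'ab':
                return n + 1
            return None

        def B(pos):                     # B -> X b
            n = accepts('X', pos)
            if n is not None and pos + n < len(text) and text[pos + n] == 'b':
                return n + 1
            return None

        def S(pos):                     # S -> (A c) | (B d)
            for sub, tail in (('A', 'c'), ('B', 'd')):
                n = accepts(sub, pos)   # the B alternative reuses the memoized result for X
                if n is not None and pos + n < len(text) and text[pos + n] == tail:
                    return n + 1
            return None

        rules = {'S': S, 'A': A, 'B': B, 'X': X}
        return accepts('S', 0) == len(text)

    print(parse("xxxxxbd"))   # True: B never re-descends into X; its length comes from the memotable
    print(parse("xac"))       # True
    print(parse("xxbc"))      # True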


See also

* Approximate computing – category of techniques to improve efficiency
* Computational complexity theory – more information on algorithm complexity
* Director string – rapidly locating free variables in expressions
* Flyweight pattern – an object programming design pattern that also uses a kind of memoization
* Hashlife – a memoizing technique to speed up the computation of cellular automata
* Lazy evaluation – shares some concepts with memoization
* Materialized view – analogous caching in database queries
* Partial evaluation – a related technique for automatic program optimization


References


External links

Examples of memoization in various programming languages:

* groovy.lang.Closure#memoize() – Memoize is an Apache Groovy 1.8 language feature.
* Memoize – a small library, written by Tim Bradshaw, for performing memoization in Common Lisp.
* IncPy – a custom Python interpreter that performs automatic memoization (with no required user annotations).
* Dave Herman's Macros for defining memoized procedures in Racket.
* Memoize.pm – a Perl module that implements memoized functions.
* Java memoization – an example in Java using dynamic proxy classes to create a generic memoization pattern.
* memoization.java – a Java memoization library.
* C++Memo – a C++ memoization framework.
* C-Memo – generic memoization library for C, implemented using pre-processor function wrapper macros.
* Tek271 Memoizer – open source Java memoizer using annotations and pluggable cache implementations.
* memoizable – a Ruby gem that implements memoized methods.
* Python memoization – a Python example of memoization.
* OCaml memoization – implemented as a Camlp4 syntax extension.
* Memoization in Lua – two example implementations of a general memoize function in Lua.
* Memoization in Mathematica – memoization and limited memoization in Mathematica.
* Memoization in Javascript – extending the Function prototype in JavaScript (archived version of http://talideon.com/weblog/2005/07/javascript-memoization.cfm).
* Memoization in Javascript – examples of memoization in JavaScript using its own caching mechanism and using the YUI library.
* X-SAIGA – eXecutable SpecificAtIons of GrAmmars. Contains publications related to a top-down parsing algorithm that supports left-recursion and ambiguity in polynomial time and space.
* Memoization in Scheme – a Scheme example of memoization on a class webpage.
* Memoization in Combinatory Logic – a web service to reduce Combinatory Logic while memoizing every step in a database.
* MbCache – cache method results in .NET.