Foundations Of Cognitive Science

Context-Free Grammar

Consider a grammar that has an alphabet of nonterminal symbols and an alphabet of terminal symbols.  Let this grammar have rules that conform to one basic pattern: a → b, where a must be a single nonterminal symbol and b is any combination of terminal and nonterminal symbols.  A grammar with these properties is called context-free (Parkes, 2002), because no context is permitted on the left side of a rule (Chomsky, 1965).  For example, symbols are not allowed to flank the a in order to provide a context that might affect how the a is to be rewritten according to the right side of the rule.
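The rule pattern above can be sketched in a few lines of Python.  This is only an illustration, assuming a toy representation in which rules map a left side to a list of right-hand sides; the symbol names are invented for the example.

```python
# Toy alphabets for the sketch (hypothetical symbols).
NONTERMINALS = {"S", "NP", "VP"}
TERMINALS = {"the", "dog", "barks"}

# Each rule has exactly one nonterminal on its left side -- no
# flanking context is allowed, which is what makes the grammar
# context-free.  Right sides may mix terminals and nonterminals.
RULES = {
    "S": [("NP", "VP")],
    "NP": [("the", "dog")],
    "VP": [("barks",)],
}

def is_context_free(rules):
    """Check that every left side is a single nonterminal and every
    right side contains only terminal and nonterminal symbols."""
    for left, right_sides in rules.items():
        if left not in NONTERMINALS:
            return False
        for rhs in right_sides:
            if not all(s in NONTERMINALS or s in TERMINALS for s in rhs):
                return False
    return True

print(is_context_free(RULES))  # True
```

A rule with anything other than a single nonterminal on its left side would fail this check, which is exactly the restriction that distinguishes a context-free grammar from a context-sensitive one.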

Note that in a context-free grammar, expressions can grow to the left, to the right, or from the middle by inserting internal “clauses”.  This is because a context-free grammar is less restricted than a regular grammar.  As a result, the expressions generated by such a grammar can be more complicated, and cannot be accommodated by a finite state automaton.  Context-free grammars are important in cognitive science because they are used to generate the phrase-markers that are manipulated by transformations in a transformational grammar.
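Growth from the middle can be illustrated with the classic toy grammar S → a S b | ab, whose strings aⁿbⁿ require unbounded counting and so cannot be accepted by any finite state automaton.  The sketch below (rule names are hypothetical) simply applies the rules as string rewrites.

```python
def generate(n):
    """Derive a^n b^n by applying S -> a S b a total of n - 1 times,
    then finishing with S -> ab."""
    s = "S"
    for _ in range(n - 1):
        # The nonterminal S stays in the middle, so each rewrite
        # grows the expression from the inside out.
        s = s.replace("S", "aSb")
    return s.replace("S", "ab")

print(generate(3))  # aaabbb
```

Matching the number of a's to the number of b's is precisely what a finite state automaton, which has no memory beyond its current state, cannot do.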


  1. Chomsky, N. (1965). Aspects Of The Theory Of Syntax. Cambridge, MA: MIT Press.
  2. Parkes, A. (2002). Introduction to Languages, Machines and Logic: Computable Languages, Abstract Machines and Formal Logic. London: Springer.

(Added September 2010)