Consider a grammar that has an alphabet of nonterminal symbols and an alphabet of terminal symbols. Let this grammar have rules that conform to one of two patterns. The first pattern is *a* → *b*, where *a* must be a nonterminal symbol, and *b* must be either a terminal symbol or the empty string. The second pattern is *a* → *cd*, where *a* must be a nonterminal symbol, *c* must be a terminal symbol, and *d* must be a nonterminal symbol. A grammar with these properties is called regular (Parkes, 2002). Note that in a regular grammar, expressions grow in only one direction – for instance, under the rules described here, expressions always grow from left to right. One characteristic of this type of grammar is that no rule can insert a clause into the middle of an expression. This is because the rules make it impossible for a nonterminal symbol to be flanked on both sides by terminal symbols.
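The two rule patterns can be made concrete with a small sketch. The grammar below is a hypothetical example (not from Parkes): it has one nonterminal S and the rules S → aS and S → b, both of which fit the patterns above. Expanding the rules shows the left-to-right growth described in the text, since the single nonterminal always sits at the right end of the expression.

```python
import random

# A hypothetical regular grammar over terminals {a, b}.
# Its rules follow the two permitted patterns:
#   S -> a S   (nonterminal -> terminal nonterminal)
#   S -> b     (nonterminal -> terminal)
RULES = {
    "S": [("a", "S"), ("b", None)],
}

def derive(start="S", max_steps=50, rng=random.Random(0)):
    """Expand the current nonterminal until only terminals remain.

    Because every rule is right-linear, the string grows strictly
    left to right: terminals accumulate on the left, and the one
    remaining nonterminal is always at the right end.
    """
    result = []
    symbol = start
    for _ in range(max_steps):
        terminal, next_nonterminal = rng.choice(RULES[symbol])
        result.append(terminal)
        if next_nonterminal is None:
            return "".join(result)
        symbol = next_nonterminal
    raise RuntimeError("derivation did not terminate")

print(derive())  # some number of a's followed by a single b
```

Every string this grammar generates consists of zero or more a's followed by exactly one b, i.e. the language a*b.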

Regular grammars are important in cognitive science because their expressions can be processed by finite state automata, the simplest class of computational device. Other grammars in the Chomsky hierarchy have additional properties that require more complex computational devices to accommodate them. Another way to think about this is that a regular grammar is more restricted than the other grammars in the Chomsky hierarchy. These restrictions – assumed constraints on the grammar – mean that less computational power is required to deal with expressions from the grammar.

**References:**

- Parkes, A. (2002). *Introduction to Languages, Machines and Logic: Computable Languages, Abstract Machines and Formal Logic*. London; New York: Springer.

(Added September 2010)