# Associative property

In mathematics, the associative property is a property of some binary operations, which means that rearranging the parentheses in an expression will not change the result. In propositional logic, associativity is a valid rule of replacement for expressions in logical proofs.

Within an expression containing two or more occurrences in a row of the same associative operator, the order in which the operations are performed does not matter as long as the sequence of the operands is not changed. That is (after rewriting the expression with parentheses and in infix notation if necessary), rearranging the parentheses in such an expression will not change its value. Consider the following equations:

$${\begin{aligned}(2+3)+4&=2+(3+4)=9\\2\times (3\times 4)&=(2\times 3)\times 4=24.\end{aligned}}$$

Even though the parentheses were rearranged on each line, the values of the expressions were not altered. Since this holds true when performing addition and multiplication on any real numbers, it can be said that "addition and multiplication of real numbers are associative operations".

Associativity is not the same as commutativity, which addresses whether the order of two operands affects the result. For example, the order does not matter in the multiplication of real numbers, that is, a × b = b × a, so we say that the multiplication of real numbers is a commutative operation. However, operations such as function composition and matrix multiplication are associative, but (generally) not commutative.

Associative operations are abundant in mathematics; in fact, many algebraic structures (such as semigroups and categories) explicitly require their binary operations to be associative.

However, many important and interesting operations are non-associative; some examples include subtraction, exponentiation, and the vector cross product. In contrast to the theoretical properties of real numbers, the addition of floating point numbers in computer science is not associative, and the choice of how to associate an expression can have a significant effect on rounding error.

## Definition

*A binary operation ∗ on the set S is associative when this diagram commutes; that is, when the two paths from S × S × S to S compose to the same function from S × S × S to S.*

Formally, a binary operation on a set S is called associative if it satisfies the associative law:

(x ∗ y) ∗ z = x ∗ (y ∗ z) for all x, y, z in S.

Here, ∗ is a placeholder for the symbol of the operation, which may be any symbol, or even the absence of a symbol (juxtaposition), as for multiplication:

(xy)z = x(yz) = xyz for all x, y, z in S.

The associative law can also be expressed in functional notation thus: f(f(x, y), z) = f(x, f(y, z)).
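The functional form of the law can be checked directly in code; a minimal Python sketch, using real addition as a stand-in for the operation f:

```python
# Associativity in functional notation: f(f(x, y), z) == f(x, f(y, z)).
# Here f is addition on numbers; any associative operation behaves the same.
def f(x, y):
    return x + y

x, y, z = 2, 3, 4
assert f(f(x, y), z) == f(x, f(y, z)) == 9
```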

## Generalized associative law

*In the absence of the associative property, five factors a, b, c, d, e can yield different products; the possible groupings form a Tamari lattice of order four.*

If a binary operation is associative, repeated application of the operation produces the same result regardless of how valid pairs of parentheses are inserted in the expression.  This is called the generalized associative law. For instance, a product of four elements may be written, without changing the order of the factors, in five possible ways:

• ((ab)c)d
• (ab)(cd)
• (a(bc))d
• a((bc)d)
• a(b(cd))

If the product operation is associative, the generalized associative law says that all these expressions will yield the same result. So unless the expression with omitted parentheses already has a different meaning (see below), the parentheses can be considered unnecessary and "the" product can be written unambiguously as

abcd.

As the number of elements increases, the number of possible ways to insert parentheses grows quickly, but they remain unnecessary for disambiguation.
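The five groupings listed above can be checked mechanically; a small Python sketch, using multiplication as the associative operation:

```python
from operator import mul  # an associative operation on numbers

a, b, c, d = 2, 3, 5, 7

# The five ways to parenthesize a product of four factors, in order:
products = [
    mul(mul(mul(a, b), c), d),  # ((ab)c)d
    mul(mul(a, b), mul(c, d)),  # (ab)(cd)
    mul(mul(a, mul(b, c)), d),  # (a(bc))d
    mul(a, mul(mul(b, c), d)),  # a((bc)d)
    mul(a, mul(b, mul(c, d))),  # a(b(cd))
]

# All five agree, so "abcd" is unambiguous.
assert len(set(products)) == 1
assert products[0] == 210
```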

An example where this does not work is the logical biconditional ↔. It is associative; thus, A ↔ (B ↔ C) is equivalent to (A ↔ B) ↔ C, but A ↔ B ↔ C most commonly means (A ↔ B) and (B ↔ C), which is not equivalent.

## Examples

Some examples of associative operations include the following.

• The concatenation of the three strings "hello", " ", "world" can be computed by concatenating the first two strings (giving "hello ") and appending the third string ("world"), or by joining the second and third string (giving " world") and concatenating the first string ("hello") with the result. The two methods produce the same result; string concatenation is associative (but not commutative).
• In arithmetic, addition and multiplication of real numbers are associative; i.e.,

$\left.{\begin{matrix}(x+y)+z=x+(y+z)=x+y+z\quad \\(x\,y)z=x(y\,z)=x\,y\,z\qquad \qquad \qquad \quad \ \ \,\end{matrix}}\right\}{\mbox{for all }}x,y,z\in \mathbb {R} .$ Because of associativity, the grouping parentheses can be omitted without ambiguity.
• The trivial operation x ∗ y = x (that is, the result is the first argument, no matter what the second argument is) is associative but not commutative. Likewise, the trivial operation x ∘ y = y (that is, the result is the second argument, no matter what the first argument is) is associative but not commutative.
• Addition and multiplication of complex numbers and quaternions are associative. Addition of octonions is also associative, but multiplication of octonions is non-associative.
• The greatest common divisor and least common multiple functions act associatively:
$\left.{\begin{matrix}\operatorname {gcd} (\operatorname {gcd} (x,y),z)=\operatorname {gcd} (x,\operatorname {gcd} (y,z))=\operatorname {gcd} (x,y,z)\quad \\\operatorname {lcm} (\operatorname {lcm} (x,y),z)=\operatorname {lcm} (x,\operatorname {lcm} (y,z))=\operatorname {lcm} (x,y,z)\quad \end{matrix}}\right\}{\mbox{ for all }}x,y,z\in \mathbb {Z} .$
• Taking the intersection or the union of sets:
$\left.{\begin{matrix}(A\cap B)\cap C=A\cap (B\cap C)=A\cap B\cap C\quad \\(A\cup B)\cup C=A\cup (B\cup C)=A\cup B\cup C\quad \end{matrix}}\right\}{\mbox{for all sets }}A,B,C.$
• If M is some set and S denotes the set of all functions from M to M, then the operation of function composition on S is associative:
$(f\circ g)\circ h=f\circ (g\circ h)=f\circ g\circ h\qquad {\mbox{for all }}f,g,h\in S.$
• Slightly more generally, given four sets M, N, P and Q, with h : M → N, g : N → P, and f : P → Q, then
$(f\circ g)\circ h=f\circ (g\circ h)=f\circ g\circ h$
as before. In short, composition of maps is always associative.
• In category theory, composition of morphisms is associative by definition. Associativity of functors and natural transformations follows from associativity of morphisms.
• Consider a set with three elements, A, B, and C. The following operation:

| × | A | B | C |
|---|---|---|---|
| A | A | A | A |
| B | A | B | C |
| C | A | A | A |

is associative. Thus, for example, A × (B × C) = (A × B) × C = A. This operation is not commutative.
• Because matrices represent linear functions, and matrix multiplication represents function composition, one can immediately conclude that matrix multiplication is associative. 
• For real numbers (and for any totally ordered set), the minimum and maximum operation is associative:
$\max(a,\max(b,c))=\max(\max(a,b),c)\quad {\text{ and }}\quad \min(a,\min(b,c))=\min(\min(a,b),c).$

## Propositional logic

### Rule of replacement

In standard truth-functional propositional logic, association, or associativity, refers to two valid rules of replacement. The rules allow one to move parentheses in logical expressions in logical proofs. The rules (using logical connectives notation) are:

$(P\lor (Q\lor R))\Leftrightarrow ((P\lor Q)\lor R)$ and

$(P\land (Q\land R))\Leftrightarrow ((P\land Q)\land R),$ where "$\Leftrightarrow$ " is a metalogical symbol representing "can be replaced in a proof with".

### Truth functional connectives

Associativity is a property of some logical connectives of truth-functional propositional logic. The following logical equivalences demonstrate that associativity is a property of particular connectives. The following (and their converses, since ↔ is commutative) are truth-functional tautologies.

Associativity of disjunction:
$((P\lor Q)\lor R)\leftrightarrow (P\lor (Q\lor R))$

Associativity of conjunction:
$((P\land Q)\land R)\leftrightarrow (P\land (Q\land R))$

Associativity of equivalence:
$((P\leftrightarrow Q)\leftrightarrow R)\leftrightarrow (P\leftrightarrow (Q\leftrightarrow R))$

Joint denial is an example of a truth-functional connective that is not associative.
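These tautologies, and the failure for joint denial, can be verified by brute force over all truth assignments; a short Python sketch:

```python
from itertools import product

def associative(op):
    """Check (P op Q) op R == P op (Q op R) for all truth values."""
    return all(op(op(p, q), r) == op(p, op(q, r))
               for p, q, r in product((False, True), repeat=3))

assert associative(lambda a, b: a or b)            # disjunction
assert associative(lambda a, b: a and b)           # conjunction
assert associative(lambda a, b: a == b)            # equivalence (biconditional)
assert not associative(lambda a, b: not (a or b))  # joint denial (NOR)
```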

## Non-associative operation

A binary operation $*$ on a set S that does not satisfy the associative law is called non-associative. Symbolically,

$(x*y)*z\neq x*(y*z)\qquad {\mbox{for some }}x,y,z\in S.$ For such an operation the order of evaluation does matter. For example:

Subtraction:
$(5-3)-2\neq 5-(3-2)$

Division:
$(4/2)/2\neq 4/(2/2)$

Exponentiation:
$2^{(1^{2})}\neq (2^{1})^{2}$

Vector cross product:
$${\begin{aligned}\mathbf {i} \times (\mathbf {i} \times \mathbf {j} )&=\mathbf {i} \times \mathbf {k} =-\mathbf {j} \\(\mathbf {i} \times \mathbf {i} )\times \mathbf {j} &=\mathbf {0} \times \mathbf {j} =\mathbf {0} \end{aligned}}$$

Also, although addition is associative for finite sums, it is not associative inside infinite sums (series). For example,

$(1-1)+(1-1)+(1-1)+(1-1)+(1-1)+(1-1)+\dots =0$

whereas

$1+(-1+1)+(-1+1)+(-1+1)+(-1+1)+(-1+1)+(-1+1)+\dots =1.$

Some non-associative operations are fundamental in mathematics. They often appear as the multiplication in structures called non-associative algebras, which also have an addition and a scalar multiplication. Examples are the octonions and Lie algebras. In Lie algebras, the multiplication satisfies the Jacobi identity instead of the associative law; this allows one to abstract the algebraic nature of infinitesimal transformations.

Other examples are quasigroups, quasifields, non-associative rings, and commutative non-associative magmas.
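The elementary counterexamples above can be confirmed numerically; a quick Python check:

```python
# Subtraction: grouping changes the result.
assert (5 - 3) - 2 == 0 and 5 - (3 - 2) == 4

# Division:
assert (4 / 2) / 2 == 1.0 and 4 / (2 / 2) == 4.0

# Exponentiation:
assert 2 ** (1 ** 2) == 2 and (2 ** 1) ** 2 == 4
```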

### Nonassociativity of floating point calculation

In mathematics, addition and multiplication of real numbers are associative. By contrast, in computer science, addition and multiplication of floating point numbers are not associative, as rounding errors are introduced when values of dissimilar size are combined.

To illustrate this, consider a floating point representation with a 4-bit mantissa:

$(1.000_2\times 2^0 + 1.000_2\times 2^0) + 1.000_2\times 2^4 = 1.000_2\times 2^1 + 1.000_2\times 2^4 = 1.001_2\times 2^4$
$1.000_2\times 2^0 + (1.000_2\times 2^0 + 1.000_2\times 2^4) = 1.000_2\times 2^0 + 1.000_2\times 2^4 = 1.000_2\times 2^4$

Even though most computers compute with 24 or 53 bits of mantissa,  this is an important source of rounding error, and approaches such as the Kahan summation algorithm are ways to minimise the errors. It can be especially problematic in parallel computing.  
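The same effect is easy to reproduce with standard double-precision floats; a small Python illustration (the standard library's `math.fsum` is shown as one remedy in the same spirit as Kahan-style compensated summation):

```python
import math

# Floating-point addition is not associative:
a, b, c = 0.1, 0.2, 0.3
assert (a + b) + c != a + (b + c)   # 0.6000000000000001 vs 0.6

# Grouping also matters when magnitudes differ widely:
big, tiny = 1e16, 1.0
assert (tiny + tiny) + big != tiny + (tiny + big)

# math.fsum tracks partial sums exactly, avoiding the loss of precision:
assert math.fsum([tiny, tiny, big]) == 2.0 + 1e16
```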

### Notation for non-associative operations

In general, parentheses must be used to indicate the order of evaluation if a non-associative operation appears more than once in an expression (unless the notation specifies the order in another way, like ${\dfrac {2}{3/4}}$ ). However, mathematicians agree on a particular order of evaluation for several common non-associative operations. This is simply a notational convention to avoid parentheses.

A left-associative operation is a non-associative operation that is conventionally evaluated from left to right, i.e.,

$\left.{\begin{matrix}x*y*z=(x*y)*z\qquad \qquad \quad \,\\w*x*y*z=((w*x)*y)*z\quad \\{\mbox{etc.}}\qquad \qquad \qquad \qquad \qquad \qquad \ \ \,\end{matrix}}\right\}{\mbox{for all }}w,x,y,z\in S$ while a right-associative operation is conventionally evaluated from right to left:

$\left.{\begin{matrix}x*y*z=x*(y*z)\qquad \qquad \quad \,\\w*x*y*z=w*(x*(y*z))\quad \\{\mbox{etc.}}\qquad \qquad \qquad \qquad \qquad \qquad \ \ \,\end{matrix}}\right\}{\mbox{for all }}w,x,y,z\in S$ Both left-associative and right-associative operations occur. Left-associative operations include the following:

Subtraction and division of real numbers     
$x-y-z=(x-y)-z$ $x/y/z=(x/y)/z$ Function application
$(f\,x\,y)=((f\,x)\,y)$ This notation can be motivated by the currying isomorphism, which enables partial application.
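Left-associative evaluation is exactly a left fold; a Python sketch using `functools.reduce`, which groups from the left:

```python
from functools import reduce

# reduce applies the operation left-associatively: ((w - x) - y) - z.
w, x, y, z = 20, 5, 3, 2
assert reduce(lambda a, b: a - b, [w, x, y, z]) == ((w - x) - y) - z == 10

# The same holds for division: ((24 / 4) / 3) / 2
assert reduce(lambda a, b: a / b, [24, 4, 3, 2]) == 1.0
```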

Right-associative operations include the following:

Exponentiation of real numbers in superscript notation
$x^{y^{z}}=x^{(y^{z})}$ Exponentiation is commonly used with brackets or right-associatively because a repeated left-associative exponentiation operation is of little use. Repeated powers would mostly be rewritten with multiplication:

$(x^{y})^{z}=x^{(yz)}$

Formatted correctly, the superscript inherently behaves as a set of parentheses; e.g., in the expression $2^{x+3}$ the addition is performed before the exponentiation despite there being no explicit parentheses $2^{(x+3)}$ wrapped around it. Thus, given an expression such as $x^{y^{z}}$, the full exponent $y^{z}$ of the base $x$ is evaluated first. However, in some contexts, especially in handwriting, the difference between ${x^{y}}^{z}=(x^{y})^{z}$, $x^{yz}=x^{(yz)}$ and $x^{y^{z}}=x^{(y^{z})}$ can be hard to see. In such a case, right-associativity is usually implied.
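Many programming languages adopt the same convention; in Python, the `**` operator is right-associative, which a quick check confirms:

```python
# ** groups from the right: 2 ** 3 ** 2 means 2 ** (3 ** 2), not (2 ** 3) ** 2.
assert 2 ** 3 ** 2 == 2 ** (3 ** 2) == 512
assert (2 ** 3) ** 2 == 64
```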

Function definition
$\mathbb {Z} \rightarrow \mathbb {Z} \rightarrow \mathbb {Z} =\mathbb {Z} \rightarrow (\mathbb {Z} \rightarrow \mathbb {Z} )$ $x\mapsto y\mapsto x-y=x\mapsto (y\mapsto x-y)$ Using right-associative notation for these operations can be motivated by the Curry–Howard correspondence and by the currying isomorphism.
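The right-associative reading of ℤ → ℤ → ℤ corresponds to curried functions; a Python sketch of the x ↦ y ↦ x − y example above:

```python
# A curried function: sub takes x and returns a function of y.
# Its "type" reads Z -> (Z -> Z), matching the right-associative arrow.
sub = lambda x: (lambda y: x - y)

assert sub(5)(3) == 2    # application associates left: (sub(5))(3)
partial = sub(10)        # partial application via currying
assert partial(4) == 6
```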

Non-associative operations for which no conventional evaluation order is defined include the following.

Exponentiation of real numbers in infix notation:
$(x^{\wedge }y)^{\wedge }z\neq x^{\wedge }(y^{\wedge }z)$

Knuth's up-arrow operators:
$a\uparrow \uparrow (b\uparrow \uparrow c)\neq (a\uparrow \uparrow b)\uparrow \uparrow c$
$a\uparrow \uparrow \uparrow (b\uparrow \uparrow \uparrow c)\neq (a\uparrow \uparrow \uparrow b)\uparrow \uparrow \uparrow c$

Taking the cross product of three vectors:
${\vec {a}}\times ({\vec {b}}\times {\vec {c}})\neq ({\vec {a}}\times {\vec {b}})\times {\vec {c}}\qquad {\mbox{ for some }}{\vec {a}},{\vec {b}},{\vec {c}}\in \mathbb {R} ^{3}$

Taking the pairwise average of real numbers:
${(x+y)/2+z \over 2}\neq {x+(y+z)/2 \over 2}\qquad {\mbox{for all }}x,y,z\in \mathbb {R} {\mbox{ with }}x\neq z.$

Taking the relative complement of sets:
$(A\backslash B)\backslash C\neq A\backslash (B\backslash C).$

(Compare material nonimplication in logic.)

## History

William Rowan Hamilton seems to have coined the term "associative property"  around 1844, a time when he was contemplating the non-associative algebra of the octonions he had learned about from John T. Graves.