Why is commutativity optional in multiplication for rings?
More precisely, why is it that all rings are required by the axioms to have commutativity in addition, but are not held to the same axiom regarding multiplication? I know that we have commutative and non-commutative rings depending on whether or not they are commutative in multiplication, but I am wondering why it is that the axioms were defined that way, providing us with this option.
I am using this list of axioms, from David Sharpe’s Rings and factorization:
Definition 1.3.1. A ring is a non-empty set $R$ which satisfies the following axioms:
(1) $R$ has a binary operation denoted by $+$ defined on it;
(2) addition is associative, i.e.
\begin{align}
a + \left(b+c\right) = \left(a+b\right) + c \text{ for all } a, b, c \in R
\end{align}
(so that we can write $a+b+c$ without brackets);
(3) addition is commutative, i.e.
\begin{align}
a + b = b + a \text{ for all } a, b \in R;
\end{align}
(4) there is an element denoted by $0$ in $R$ such that
\begin{align}
0 + a = a \text{ for all } a \in R
\end{align}
(there is only one such element because, if $0_1$ and $0_2$ are two such, then $0_1 = 0_1 + 0_2 = 0_2$ and they are the same -- we call $0$ the zero element of $R$);
(5) for every $a \in R$, there exists an element $-a \in R$ such that
\begin{align}
\left(-a\right) + a = 0
\end{align}
(there is only one such element for each $a$, because if $b + a = 0$ and $c + a = 0$, then
\begin{align}
b = 0 + b = \left(c + a\right) + b = c + \left(a + b\right) = c + 0 = c;
\end{align}
we call $-a$ the negative of $a$);
(6) $R$ has a binary operation denoted by multiplication defined on it;
(7) multiplication is associative, i.e.
\begin{align}
a\left(bc\right) = \left(ab\right)c \text{ for all } a, b, c \in R;
\end{align}
(8) multiplication is left and right distributive over addition, i.e.
\begin{align}
a\left(b+c\right) = ab + ac, \quad \left(a+b\right)c = ac + bc
\text{ for all } a, b, c \in R;
\end{align}
(9) there is an element denoted by $1$ in $R$ such that $1 \neq 0$ and
\begin{align}
1 \cdot a = a \cdot 1 = a \text{ for all } a \in R
\end{align}
(as for the zero element, there is only one such element, and it is called the identity element of $R$).
Tags: abstract-algebra, ring-theory, commutative-algebra
For what it's worth, we also have the notion of a "commutative ring," one in which multiplication does commute. (en.wikipedia.org/wiki/Commutative_ring) Of course I get you're trying to get at why we don't require this, and I don't know how to answer you on that, but I figured it's a point worth bringing up.
– Eevee Trainer, Mar 4 at 0:00
@EeveeTrainer Thank you! I dunno if you missed it, but I did express my awareness of that.
– Kusa, Mar 4 at 0:02
Note that matrices are an example where addition is commutative but multiplication is not necessarily.
– J. W. Tanner, Mar 4 at 0:03
Oh, I did miss it, sorry about that Kusa. xD
– Eevee Trainer, Mar 4 at 0:03
The first rings that were considered were generally commutative, but it soon became apparent that rings with noncommutative multiplication were far too common, starting with matrices and, more generally, endomorphism rings of abelian groups. On the other hand, if you take the definition of ring with unity, but omit the condition that addition is commutative, it turns out that you can prove that the other conditions force commutativity of addition.
– Arturo Magidin, Mar 4 at 0:11
asked Mar 3 at 23:54 by Kusa; edited Mar 5 at 3:12 by darij grinberg
4 Answers
The first rings that were considered were generally commutative: polynomial rings, then rings arising from the work of Dedekind on number fields. The properties were then abstracted by Fraenkel and Noether, who still dealt mostly with commutative rings.
However, it soon became apparent that there were too many instances where commutativity of multiplication did not hold. You had, famously, the quaternions, of course, but you also had matrices and, more generally, the endomorphism ring of an abelian group (where "multiplication" is composition of functions). So we have two different, related notions: commutative rings and noncommutative rings, just as we have noncommutative groups and commutative/abelian groups.
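The matrix case is easy to check concretely. A minimal numerical sketch, assuming NumPy is available: in the ring of $2\times 2$ integer matrices, addition always commutes, but a single pair of matrices already shows that multiplication need not.

```python
import numpy as np

# Two 2x2 integer matrices in the matrix ring M_2(Z)
A = np.array([[0, 1],
              [0, 0]])
B = np.array([[0, 0],
              [1, 0]])

print(np.array_equal(A + B, B + A))  # True: matrix addition is commutative
print(np.array_equal(A @ B, B @ A))  # False: A@B = [[1,0],[0,0]] but B@A = [[0,0],[0,1]]
```

These two matrices are the standard "shift" matrices, a common minimal witness of non-commutativity.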
Now, why do this with multiplication and not with addition? Well, if you take your definition of ring above, which includes a unity, but you drop condition (3) (that is, you require everything except that addition be commutative), it turns out that the other eight axioms force commutativity of addition.
Indeed, suppose you have a structure $(R,+,\cdot,0,1)$ that satisfies axioms (1), (2), and (4)-(9) above. I claim that one can deduce (3). Indeed, let $a,b\in R$. Then using distributivity on the left first, and distributivity on the right second, we have
$$\begin{align*}
(1+1)(a+b) &= 1(a+b) + 1(a+b) = a+b+a+b\\
(1+1)(a+b) &= (1+1)a + (1+1)b = a+a+b+b.
\end{align*}$$
From this we get that $a+b+a+b = a+a+b+b$. Now add the inverse of $a$ on the left and the inverse of $b$ on the right on both sides to get
$$\begin{align*}
(-a) + a + b + a + b + (-b) &= 0+b+a+0 = b+a\\
(-a) + a + a + b + b + (-b) &= 0+a+b+0 = a+b
\end{align*}$$
Thus, we conclude that $a+b=b+a$. That is, commutativity of addition is a consequence of the other eight axioms.
The reason we include it is twofold. First, it is much nicer to say that the first few axioms force $(R,+)$ to be a commutative/abelian group. Second, it is also common to consider rings without unity, and if we do that, then it is no longer true that addition is forced to be commutative. To see this, note that if $(G,\cdot)$ is any group with identity element $e_G$, and we define a multiplication on $G$ by letting $a*b=e_G$ for all $a,b\in G$, then $(G,\cdot,*)$ satisfies axioms (1)-(8) given above. But if the original group is not commutative, then the "addition" in this ring is not commutative. So if we want to consider rings without unity, we do want to explicitly require addition to be commutative.
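This trivial-multiplication construction is concrete enough to check by machine. A small sketch (plain Python, standard library only): take the noncommutative group $S_3$ of permutations of $\{0,1,2\}$ as the "addition", with $a*b=e$ as the multiplication; distributivity and associativity of $*$ hold trivially, yet "addition" fails to commute.

```python
# "Addition" = composition in S_3, the permutations of {0,1,2} (a noncommutative group)
def add(p, q):
    return tuple(p[q[i]] for i in range(3))  # the composite p after q

e = (0, 1, 2)  # identity permutation: the "zero" of this rng

def neg(p):
    # Inverse permutation: the "negative" of p, so that add(neg(p), p) == e
    inv = [0, 0, 0]
    for i, v in enumerate(p):
        inv[v] = i
    return tuple(inv)

def mul(p, q):
    return e  # trivial multiplication: always the "zero" element

a, b = (1, 0, 2), (0, 2, 1)  # two transpositions
print(add(a, b) == add(b, a))                          # False: "addition" does not commute
print(mul(a, add(a, b)) == add(mul(a, a), mul(a, b)))  # True: distributivity holds trivially
```

Note that axiom (9) fails here, as it must: no element can act as a unity when every product equals the zero element.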
There is an entire thread on the 4th paragraph; see "Why is ring addition commutative?"
– Bill Dubuque, Mar 4 at 0:27
Nitpick: The way the author of the quote defines $0$ and $-a$, your argument may not work. You are relying on the "sane" definitions of $0$ and $-a$, which require $0$ to be a two-sided neutral element and $-a$ to be a two-sided inverse of $a$.
– darij grinberg, Mar 5 at 3:14
@darijgrinberg: No... if you assume that your (possibly nonabelian) group has a left neutral element and left inverses, then you can deduce they are both two-sided. Problems only arise if you require the neutral and inverses to be on opposite sides.
– Arturo Magidin, Mar 5 at 3:23
@ArturoMagidin: Ah, I see! That's a neat exercise in itself.
– darij grinberg, Mar 5 at 3:32
I don't know about the history, but I think the right way to motivate rings is via their linear action on some set. Even the semi-ring of natural numbers $\mathbb{N}$ should be motivated by the action of counting numbers on objects, where you want the following for any $a,b ∈ \mathbb{N}$ and object collections $X,Y$:
$a·X+a·Y = a·(X+Y)$ [$a$ copies of $X$ plus $a$ copies of $Y$ is $a$ copies of ( $X$ plus $Y$ )]
$a·X+b·X = (a+b)·X$ [$a$ copies of $X$ plus $b$ copies of $X$ is $(a+b)$ copies of $X$]
$a·(b·X) = (a·b)·X$ [$a$ copies of $b$ copies of $X$ is $(a·b)$ copies of $X$]
$1·X = X$ [$1$ copy of $X$ is just $X$]
$0·X + Y = Y$ [$0$ copies of $X$ plus $Y$ is just $Y$]
$X + Y = Y + X$ [Combining collections is symmetric]
Here $\mathbb{N}$ acts via $·$ on the commutative semi-group $C$ of collections of things under combining, and the point is that we can abstract out the counting numbers $\mathbb{N}$ by simply dropping the semi-group $C$ that $\mathbb{N}$ acts on.
Note that associativity and commutativity of $+$ for $\mathbb{N}$ immediately follows from associativity and commutativity for $C$.
Now observe that for any $a,b,c ∈ \mathbb{N}$ and object collection $X$ we have:
$(a·(b+c))·X = a·((b+c)·X) = a·(b·X+c·X) = a·(b·X)+a·(c·X) = (a·b)·X+(a·c)·X$.
$((a+b)·c)·X = (a+b)·(c·X) = a·(c·X)+b·(c·X) = (a·c)·X+(b·c)·X$.
So we have obtained distributivity for $\mathbb{N}$!
But what about commutativity of $·$ for $\mathbb{N}$? That corresponds to:
$a·(b·X) = b·(a·X)$ [$a$ copies of $b$ copies of $X$ is $b$ copies of $a$ copies of $X$]
Is it obviously true? For "copies" in the real world, sure, and hence we get the familiar semi-ring properties of $\mathbb{N}$. Similar motivation involving scalings gives us the semi-ring properties of $\mathbb{R}_{≥0}$.
If we move to the more abstract notion of collections of assets owned/owed, we can easily get the ring $\mathbb{Z}$, and likewise once we consider inverse scalings we get the ring $\mathbb{R}$.
In general, if a ring $R$ acts on a group $G$, then $R$ will automatically acquire associativity, and also naturally acquire commutative addition if $G$ is commutative.
But commutative multiplication is different. For copying and scaling, indeed the action is commutative. But it should be obvious that in general actions are not commutative!
For example, the collection $T$ of rigid transformations acts on the set $S$ of locations (vectors), and certainly $A·(B·X)$ may not equal $B·(A·X)$ for general $A,B ∈ T$ and location $X$ (rotations and translations do not generally commute). So if $T$ is viewed as a ring, with addition being pointwise addition and multiplication being composition, then this ring has commutative addition (since vector addition is commutative) but non-commutative multiplication. And of course $T$ is a subset of the linear operators on the vector space of locations, which can be represented by matrices. After all, matrix multiplication is defined precisely so that it agrees with composition.
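The non-commutativity of composing rigid transformations is visible already in a toy example (a sketch in plain Python; the particular rotation and translation are just illustrative): rotating by 90° and then translating by $(1,0)$ lands at a different point than translating first and then rotating.

```python
def rotate90(p):
    """Rotate a point 90 degrees counterclockwise about the origin."""
    x, y = p
    return (-y, x)

def translate(p):
    """Translate a point by (1, 0)."""
    x, y = p
    return (x + 1, y)

origin = (0, 0)
print(translate(rotate90(origin)))  # (1, 0): rotate first, then translate
print(rotate90(translate(origin)))  # (0, 1): translate first, then rotate
```

The two composites are genuinely different functions, since they already disagree at a single point.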
It is safe to say I've learned a great deal from this comment, thank you sincerely!
– Kusa, Mar 4 at 10:42
@Kusa: You're very welcome! Also see this and other posts linked from my profile under "Rigour with intuition".
– user21820, Mar 4 at 14:47
@Kusa The "linear action" view of rings is that they are isomorphic to a subring of the ring of linear maps on their underlying abelian group (this can be viewed as a ring-theoretic analog of Cayley's theorem that a group may be represented as a subgroup of the group of bijections (permutations) acting on it). See the links here for more on such representation theory (including category-theoretic generalizations).
– Bill Dubuque, Mar 4 at 17:04
Matrix rings are the same as sets of linear transformations ("endomorphisms") after writing them in a fixed basis. These linear transformations are functions of a special kind (satisfying linearity conditions).
Functions taking values in a vector space can be added, and so themselves form a vector space. When the domain and codomain are the same vector space, they can also be composed.
Function addition and composition satisfy the distributive law.
Unfortunately, function composition is rarely commutative.
For example, even in the nicest situation of functions from the real numbers to the real numbers, take $f(x)= \sin x$ and $g(x)= x^2$. Clearly $\sin (x^2)\neq \sin^2 (x)$.
(Or, much simpler, $(x+5)^2\neq x^2 +5$.)
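Both counterexamples can be checked at a single point, since two functions that disagree anywhere are unequal (a trivial sketch in Python):

```python
import math

f = math.sin          # f(x) = sin x
g = lambda x: x ** 2  # g(x) = x^2

# Composing in the two orders disagrees already at x = 2:
print(f(g(2)))  # sin(4), which is negative
print(g(f(2)))  # sin(2)^2, which is positive

# The even simpler pair: h(x) = x + 5 versus g(x) = x^2, at x = 1
h = lambda x: x + 5
print(g(h(1)), h(g(1)))  # 36 6
```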
Since functions and their compositions are mainstream operations in all branches of mathematics (not just algebra), we need a study of these operations, and so we have to accommodate non-commutative rings.
In a different set-up, where there is no ring but just a group: the Galois group of a normal separable finite extension of fields consists of field automorphisms. Automorphisms are, first of all, functions from a field onto itself.
Their compositions make them into a group. Again, as function compositions are rarely commutative, one needs to single out the case where these Galois groups turn out to be abelian and find out what is special about them.
There is a whole subfield (pun intended) of algebraic number theory called Class Field Theory that deals with the cases of abelian Galois groups (and much more). In the simplest case, with the rational numbers as base field, there is a celebrated theorem of Kronecker and Weber describing how to obtain all algebraic number fields that are abelian Galois extensions.
Right, and I hope that my answer succeeds in giving a layman-level motivation of ring action corresponding to the first half of your answer. =)
– user21820, yesterday
Commutativity is optional in the definition of a ring (or even a field) because one can already prove tons of propositions about rings (and even fields and modules and vector spaces) without assuming commutativity. The fewer assumptions one makes, the better one's axioms.
It is more convenient to define a ring in general and to assume commutativity only when it is needed.
French authors (and those who follow Bourbaki) do not assume that "field" implies commutativity. However, pretty much everyone else uses "division ring" for a ring (commutative or not) in which every nonzero element has a multiplicative inverse, and reserves "field" for the commutative case.
– Arturo Magidin, Mar 5 at 2:49
Your claim that axiomatizations are better when they make fewer assumptions is absolutely wrong, and demonstrably so. For example, groups may be axiomatized by only requiring the existence of a left identity and left inverses. One can recover the existence of a right identity and right inverses, but it would be ridiculous to use the former axiomatization. Another example: propositional logic can be reduced to the Sheffer stroke with just modus ponens plus a single short axiom schema, but it is obviously practically useless.
– user21820, yesterday
Furthermore, one can prove the translation (via suitable encoding) of a lot of results in real analysis in very weak systems of arithmetic such as ACA. Does that mean that ACA is better for real analysis than some set theory that actually can construct the real numbers (like Z)? Of course not. It isn't important whether fewer assumptions are made. What's important is how meaningful those assumptions are. You are right that we can prove lots of facts about rings and modules, but it is the general applicability of those facts that makes rings and modules important in the first place.
– user21820, yesterday
If a property (e.g. the existence of a right identity and right inverses) can be deduced from the other axioms, it is not an assumption. Of course, axiomatizing groups by stating only the existence of a left identity and left inverses would be pedantic and impractical.
– Olivier Roche, 19 hours ago
I had difficulty finding good material about non-commutative linear algebra (I recommend van der Waerden's Algebra, btw), just to discover that the results I needed could have been included almost for free in any algebra book!
– Olivier Roche, 19 hours ago
$begingroup$
The first rings that were considered were generally commutative; polynomial rings, then from work of Dedekind in number fields. The properties were then abstracted by Fraenkel and Noether, who still dealt mostly with commutative rings.
However, it soon became apparent that there were too many instances where commutativity of multiplication did not hold. You had famously the quaternions, of course, but you also had matrices and, more generally, the endomorphism ring of an abelian group (where “multiplication” is composition of functions). So that we have two different, related, notions: commutative rings and noncommutative rings, just like we have noncommutative groups and commutative/abelian groups.
Now, why do this with multiplication and not with addition? Well, if you take your definition of ring above, which includes a unity, but you drop condition (3) (that is, you require everything except you do not require that addition be commutative), it turns out that the other eight axioms force commutativity of addition.
Indeed, suppose you have a structure $(R,+,cdot,0,1)$ that satisfies axioms (1), (2), and (4)-(9) above. I claim that one can deduce (3). Indeed, let $a,bin R$. Then using distributivity on the left first, and distributivity on the right second, we have
$$begin{align*}
(1+1)(a+b) &= 1(a+b) + 1(a+b) = a+b+a+b\
(1+1)(a+b) &= (1+1)a + (1+1)b = a+a+b+b.
end{align*}$$
From this we get that $a+b+a+b = a+a+b+b$. Now add the inverse of $a$ on the left and the inverse of $b$ on the right on both sides to get
$$begin{align*}
(-a) + a + b + a + b + (-b) &= 0+b+a+0 = b+a\
(-a) + a + a + b + b + (-b) &= 0+a+b+0 = a+b
end{align*}$$
Thus, we conclude that $a+b=b+a$. That is, commutativity of addition is a consequence of the other eight axioms.
The reason we include it is two-fold: one, is that it is much nicer to say that the first few axioms force $(R,+)$ to be a commutative/abelian group. The second is that it is also common to consider rings without unity, and if we do that, then it is no longer true that addition is forced to be commutative. To see this, note that if $(G,cdot)$ is any group with identity element $e_G$, and we define a multiplication on $G$ by letting $a*b=e_G$ for all $a,bin G$, then $(G,cdot,*)$ satisfies axioms (1)-(8) given above. But if the original group is not commutative, then the “addition” in this ring is not commutative. So if we want to consider rings without unity, we do want to explicitly require addition to be commutative.
$endgroup$
$begingroup$
The first rings to be considered were generally commutative: polynomial rings, and then, from Dedekind's work on number fields, rings of algebraic integers. Their properties were then abstracted by Fraenkel and Noether, who still dealt mostly with commutative rings.
However, it soon became apparent that there were too many important instances where commutativity of multiplication did not hold. There were, famously, the quaternions, of course, but also matrices and, more generally, the endomorphism ring of an abelian group (where “multiplication” is composition of functions). So we have two different but related notions, commutative rings and noncommutative rings, just as we have commutative/abelian groups and noncommutative groups.
Now, why do this with multiplication and not with addition? Well, if you take your definition of ring above, which includes a unity, and drop condition (3) (that is, you require everything except commutativity of addition), it turns out that the other eight axioms force addition to be commutative.
Indeed, suppose you have a structure $(R,+,\cdot,0,1)$ that satisfies axioms (1), (2), and (4)-(9) above. I claim that (3) can be deduced. Let $a,b\in R$. Using left distributivity first, and right distributivity second, we have
$$\begin{align*}
(1+1)(a+b) &= 1(a+b) + 1(a+b) = a+b+a+b\\
(1+1)(a+b) &= (1+1)a + (1+1)b = a+a+b+b.
\end{align*}$$
From this we get that $a+b+a+b = a+a+b+b$. Now add $-a$ on the left and $-b$ on the right of both sides to get
$$\begin{align*}
(-a) + a + b + a + b + (-b) &= 0+b+a+0 = b+a\\
(-a) + a + a + b + b + (-b) &= 0+a+b+0 = a+b.
\end{align*}$$
Thus, we conclude that $a+b=b+a$. That is, commutativity of addition is a consequence of the other eight axioms.
The reason we include it anyway is twofold. One is that it is much nicer to be able to say that the first few axioms make $(R,+)$ a commutative/abelian group. The other is that it is also common to consider rings without unity, and in that setting it is no longer true that addition is forced to be commutative. To see this, note that if $(G,\cdot)$ is any group with identity element $e_G$, and we define a multiplication on $G$ by letting $a*b=e_G$ for all $a,b\in G$, then $(G,\cdot,*)$ satisfies axioms (1)-(8) above (with the group operation $\cdot$ playing the role of addition). But if the original group is not commutative, then the “addition” in this ring is not commutative. So if we want to consider rings without unity, we do want to require explicitly that addition be commutative.
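This counterexample for rings without unity is small enough to check by brute force. Here is a Python sketch, assuming we take $S_3$ (the symmetric group on three letters, under composition) as the nonabelian "addition" and the constant map to the identity as "multiplication"; the names `add`, `mul`, and `E` are our own:

```python
from itertools import permutations

# "Addition": composition in S3, a nonabelian group.
# "Multiplication": the constant map a*b = E.
# Axioms (1)-(8) hold, yet addition is not commutative,
# so without a unity, axiom (3) is independent of the rest.
S3 = list(permutations(range(3)))
E = (0, 1, 2)  # identity permutation: the "zero" of this fake ring

def add(p, q):  # composition p ∘ q, playing the role of +
    return tuple(p[q[i]] for i in range(3))

def mul(p, q):  # trivial multiplication: every product is E
    return E

# (2), (7): both operations are associative
assert all(add(a, add(b, c)) == add(add(a, b), c)
           for a in S3 for b in S3 for c in S3)
assert all(mul(a, mul(b, c)) == mul(mul(a, b), c)
           for a in S3 for b in S3 for c in S3)

# (4), (5): zero element and negatives exist
assert all(add(E, a) == a for a in S3)
assert all(any(add(b, a) == E for b in S3) for a in S3)

# (8): both distributive laws (trivially, since every product is E)
assert all(mul(a, add(b, c)) == add(mul(a, b), mul(a, c))
           for a in S3 for b in S3 for c in S3)
assert all(mul(add(a, b), c) == add(mul(a, c), mul(b, c))
           for a in S3 for b in S3 for c in S3)

# ...but (3) fails: "addition" is not commutative
assert any(add(a, b) != add(b, a) for a in S3 for b in S3)
```

The same check run with a unity axiom would be impossible to satisfy here, which is exactly the point of the argument above.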
$endgroup$
edited Mar 5 at 3:07
darij grinberg
answered Mar 4 at 0:25
Arturo Magidin
1
$begingroup$
There is an entire thread on the 4th paragraph; see Why is ring addition commutative?
$endgroup$
– Bill Dubuque
Mar 4 at 0:27
$begingroup$
Nitpick: The way the author of the quote defines $0$ and $-a$, your argument may not work. You are relying on the "sane" definitions of $0$ and $-a$, which require $0$ to be a two-sided neutral element and $-a$ to be a two-sided inverse of $a$.
$endgroup$
– darij grinberg
Mar 5 at 3:14
$begingroup$
@darijgrinberg: No... if you assume that your (possibly nonabelian) group has a left neutral element and left inverses, then you can deduce they are both two-sided. Problems only arise if you require the neutral and inverses to be on opposite sides.
$endgroup$
– Arturo Magidin
Mar 5 at 3:23
$begingroup$
@ArturoMagidin: Ah, I see! That's a neat exercise in itself.
$endgroup$
– darij grinberg
Mar 5 at 3:32
add a comment |
I don't know about the history, but I think the right way to motivate rings is via their linear action on some set. Even the semi-ring of natural numbers $\mathbb{N}$ should be motivated by the action of counting numbers on objects, where you want the following for any $a,b \in \mathbb{N}$ and object collections $X,Y$:
$a·X+a·Y = a·(X+Y)$ [$a$ copies of $X$ plus $a$ copies of $Y$ is $a$ copies of ( $X$ plus $Y$ )]
$a·X+b·X = (a+b)·X$ [$a$ copies of $X$ plus $b$ copies of $X$ is $(a+b)$ copies of $X$]
$a·(b·X) = (a·b)·X$ [$a$ copies of $b$ copies of $X$ is $(a·b)$ copies of $X$]
$1·X = X$ [$1$ copy of $X$ is just $X$]
$0·X + Y = Y$ [$0$ copies of $X$ plus $Y$ is just $Y$]
$X + Y = Y + X$ [Combining collections is symmetric]
Here $\mathbb{N}$ acts via $·$ on the commutative semi-group $C$ of collections of things under combining, and the point is that we can abstract out the counting numbers $\mathbb{N}$ by simply dropping the semi-group $C$ that $\mathbb{N}$ acts on.
Note that associativity and commutativity of $+$ for $\mathbb{N}$ immediately follow from associativity and commutativity for $C$.
Now observe that for any $a,b,c \in \mathbb{N}$ and object collection $X$ we have:
$(a·(b+c))·X = a·((b+c)·X)$ $= a·(b·X+c·X)$ $= a·(b·X)+a·(c·X)$ $= (a·b)·X+(a·c)·X$.
$((a+b)·c)·X = (a+b)·(c·X)$ $= a·(c·X)+b·(c·X)$ $= (a·c)·X+(b·c)·X$.
So we have obtained distributivity for $\mathbb{N}$!
But what about commutativity of $·$ for $\mathbb{N}$? That corresponds to:
$a·(b·X) = b·(a·X)$ [$a$ copies of $b$ copies of $X$ is $b$ copies of $a$ copies of $X$]
Is it obviously true? For "copies" in the real world, sure, and hence we get the familiar semi-ring properties of $\mathbb{N}$. Similar motivation involving scalings gives us the semi-ring properties of $\mathbb{R}_{\ge 0}$.
If we move to the more abstract notion of collections of assets owned/owed, we can easily get the ring $\mathbb{Z}$, and likewise, once we consider inverse scalings, we get the ring $\mathbb{R}$.
In general, if a ring $R$ acts on a group $G$, then $R$ will automatically acquire associativity, and it also naturally acquires commutative addition if $G$ is commutative.
But commutative multiplication is different. For copying and scaling, the action is indeed commutative. But it should be obvious that actions in general are not commutative!
For example, the collection $T$ of rigid transformations acts on the set $S$ of locations (vectors), and certainly $A·(B·X)$ may not equal $B·(A·X)$ for general $A,B \in T$ and location $X$ (rotations and translations do not generally commute). So if $T$ is viewed as a ring, with addition being pointwise addition and multiplication being composition, then this ring has commutative addition (since vector addition is commutative) but non-commutative multiplication. And of course $T$ is a subset of the linear operators on the vector space of locations, which can be represented by matrices. After all, matrix multiplication is defined precisely so that it agrees with composition.
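The final point can be illustrated concretely. Here is a minimal Python sketch using 2×2 matrices as linear maps on the plane; the specific choice of a 90-degree rotation and a reflection is ours (rigid motions with translations would need affine maps, but linear maps already show the point):

```python
# Linear maps on the plane as 2x2 matrices (tuples of tuples).
# Pointwise addition commutes; composition (matrix product) does not.

def matmul(A, B):  # composition of the linear maps A and B
    return tuple(tuple(sum(A[i][k] * B[k][j] for k in range(2))
                       for j in range(2)) for i in range(2))

def matadd(A, B):  # pointwise addition of the linear maps A and B
    return tuple(tuple(A[i][j] + B[i][j] for j in range(2)) for i in range(2))

R = ((0, -1), (1, 0))   # rotation by 90 degrees
F = ((1, 0), (0, -1))   # reflection across the x-axis

assert matadd(R, F) == matadd(F, R)   # addition is commutative...
assert matmul(R, F) != matmul(F, R)   # ...but multiplication is not
```

So the ring of linear operators on the plane satisfies all the ring axioms, including commutative addition, while its multiplication is noncommutative.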
answered Mar 4 at 6:58
user21820
1
$begingroup$
It is safe to say I’ve learned a great deal from this comment, thank you sincerely!
$endgroup$
– Kusa
Mar 4 at 10:42
$begingroup$
@Kusa: You're very welcome! Also see this and other posts linked from my profile under "Rigour with intuition".
$endgroup$
– user21820
Mar 4 at 14:47
1
$begingroup$
@Kusa The "linear action" view of rings is that they are isomorphic to a subring of the ring of linear maps on their underlying abelian group (this can be viewed as a ring-theoretic analog of Cayley's theorem, which says that a group may be represented as a subgroup of the group of bijections (permutations) acting on it). See the links here for more on such representation theory (including category-theoretic generalizations).
$endgroup$
– Bill Dubuque
Mar 4 at 17:04
add a comment |
$begingroup$
Matrix rings are the same as rings of linear transformations ('endomorphisms') written out in a fixed basis. These linear transformations are functions of a special kind (satisfying the linearity conditions).
Functions taking values in a vector space can be added, and so themselves form a vector space. And since the domain and codomain here are the same vector space, these functions can also be composed.
Function addition and composition satisfy the distributive laws.
Unfortunately, function composition is rarely commutative.
For example, in the nicest situation of functions from the real numbers to the real numbers, take $f(x)= \sin x$ and $g(x)= x^2$. Clearly $\sin (x^2)\neq \sin^2 (x)$.
(Or, much simpler, $(x+5)^2\neq x^2 +5$.)
Since functions and their compositions are mainstream operations in all branches of mathematics (not just algebra), we need a theory of these operations, and so we have to accommodate non-commutative rings.
In a different set-up, where there is no ring, just a group: the Galois group of a finite normal separable extension of fields consists of field automorphisms. Automorphisms are, first of all, functions from a field onto itself.
Their compositions make them into a group. Again, as function compositions are rarely commutative, one needs to single out the case where these Galois groups turn out to be abelian and find out what is special about them.
There is a whole subfield (pun intended) of Algebraic Number Theory called Class Field Theory that deals with the case of abelian Galois groups (and much more). In the simplest case, with the rational numbers as base field, a celebrated theorem of Kronecker-Weber describes how to obtain all algebraic number fields that are abelian Galois extensions.
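The two numerical examples above are easy to verify directly. A quick Python sketch (the sample point $x=1$ is our own choice):

```python
import math

# Composing functions in the two orders gives different results,
# so composition -- the "multiplication" of a ring of functions --
# is not commutative.
def f(x):
    return math.sin(x)

def g(x):
    return x * x

def h(x):
    return x + 5

assert f(g(1.0)) != g(f(1.0))   # sin(x^2) differs from (sin x)^2 at x = 1
assert g(h(1)) != h(g(1))       # (x+5)^2 = 36 but x^2 + 5 = 6 at x = 1
```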
$endgroup$
$begingroup$
Right, and I hope that my answer succeeds in giving a layman-level motivation of ring action corresponding to the first half of your answer. =)
$endgroup$
– user21820
yesterday
add a comment |
$begingroup$
Matrix rings are the same as set of linear transformations ('endomorphisms') after writing them in a fixed basis. These linear transformations are functions of special kind (satisfying linearity conditions).
The functions taking values in vector space can be added and so make a vector space. As the domain and codomain are the same vector spaces, they can be composed.
The function addition and compositions satisfy distributive law.
Unfortunately function compositions are rarely commutative.
For example in the nicest situation of functions from real numbers to real numbers namely $f(x)= sin x, g(x)= x^2$. Clearly $sin (x^2)neq sin^2 (x)$.
(Or much simpler $(x+5)^2neq x^2 +5$.)
Functions and their compositions being mainstream operations in all branches of mathematics (not just algebra) we need a study of these operations and so we have to accommodate non-commutative rings.
In a different set-up, where there is no ring, just a group: the Galois group of a normal, separable, finite extension of fields consists of field automorphisms. Automorphisms are, first of all, functions from a field onto itself.
Their compositions make them into a group. Again, since function compositions are rarely commutative, one needs to single out the cases where these Galois groups turn out to be abelian and find out what is special about them.
There is a whole subfield (pun intended) of Algebraic Number Theory called Class Field Theory that deals with abelian Galois groups (and much more). In the simplest case, with the rational numbers as the base field, a celebrated theorem of Kronecker and Weber describes how to obtain all algebraic number fields that are abelian Galois extensions.
$endgroup$
answered Mar 5 at 3:03
P Vanchinathan
15.3k12136
$begingroup$
Right, and I hope that my answer succeeds in giving a layman-level motivation of ring action corresponding to the first half of your answer. =)
$endgroup$
– user21820
yesterday
$begingroup$
Commutativity is optional in the definition of a ring (or even a field) because one can already prove tons of propositions about rings (and even fields, modules, and vector spaces) without assuming commutativity. The fewer assumptions one makes, the better one's axioms are.
It is more convenient to define a ring in general and to assume commutativity only when it is needed.
$endgroup$
$begingroup$
French authors (and those who follow Bourbaki) do not assume that “field” implies commutativity. However, pretty much everyone else uses “division ring” for a ring (commutative or not) in which every nonzero element has a multiplicative inverse, and reserves “field” for the commutative case.
$endgroup$
– Arturo Magidin
Mar 5 at 2:49
$begingroup$
Your claim that axiomatizations are better when they make fewer assumptions is absolutely wrong, and demonstrably so. For example, groups may be axiomatized by only requiring the existence of left-identity and left-inverses. One can recover the existence of right-identity and right-inverses, but it would be ridiculous to use the former axiomatization. Another example, propositional logic can be reduced to the Sheffer stroke with just modus ponens plus a single short axiom schema, but it is obviously practically useless.
$endgroup$
– user21820
yesterday
$begingroup$
Furthermore, one can prove the translation (via suitable encoding) of a lot of results in real analysis in very weak systems of arithmetic such as ACA. Does that mean that ACA is better than some set theory that actually can construct the real numbers (like Z) for real analysis? Of course not. It isn't important whether fewer assumptions are made. What's important is how meaningful those assumptions are. You are right that we can prove lots of facts about rings and modules, but it is the general applicability of those facts that make rings and modules important in the first place.
$endgroup$
– user21820
yesterday
$begingroup$
If a property (e.g. the existence of a right identity and right inverses) can be deduced from the other axioms, it is not an assumption. Of course, axiomatizing groups by stating only the existence of a left identity and left inverses would be pedantic and impractical.
$endgroup$
– Olivier Roche
19 hours ago
$begingroup$
I had difficulty finding good material about non-commutative linear algebra (I recommend van der Waerden's Algebra, btw), only to discover that the results I needed could have been included almost for free in any algebra book!
$endgroup$
– Olivier Roche
19 hours ago
edited Mar 5 at 2:38
J. W. Tanner
3,1081320
answered Mar 4 at 14:06
Olivier Roche
1
$begingroup$
For what it's worth, we also have the notion of a "commutative ring," one in which multiplication does commute. (en.wikipedia.org/wiki/Commutative_ring) Of course I get you're trying to get at why we don't require this, and I don't know how to answer you on that, but I figured it's a point worth bringing up.
$endgroup$
– Eevee Trainer
Mar 4 at 0:00
$begingroup$
@EeveeTrainer Thank you! I dunno if you missed it, but I did express my awareness of that.
$endgroup$
– Kusa
Mar 4 at 0:02
$begingroup$
Note that matrices are an example where addition is commutative but multiplication is not necessarily commutative.
$endgroup$
– J. W. Tanner
Mar 4 at 0:03
$begingroup$
Oh, I did miss it, sorry about that Kusa. xD
$endgroup$
– Eevee Trainer
Mar 4 at 0:03
$begingroup$
The first rings that were considered were generally commutative, but it soon became apparent that rings with noncommutative multiplication were far too common, starting with matrices and, more generally, endomorphism rings of abelian groups. On the other hand, if you take the definition of ring with unity, but omit the condition that addition is commutative, it turns out that you can prove that the other conditions force commutativity of addition.
$endgroup$
– Arturo Magidin
Mar 4 at 0:11
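The last remark in Arturo Magidin's comment deserves a sketch (mine, not from the comment): in a ring with unity, commutativity of addition is forced by the other axioms. Expand $(1+1)(a+b)$ in two ways, first by left distributivity and then by right distributivity:

```latex
\begin{align}
(1+1)(a+b) &= (1+1)a + (1+1)b = a + a + b + b,\\
(1+1)(a+b) &= 1(a+b) + 1(a+b) = a + b + a + b.
\end{align}
```

Comparing the two expansions gives $a + a + b + b = a + b + a + b$; adding $-a$ on the left and $-b$ on the right of both sides yields $a + b = b + a$.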