In the previous post we saw the following definition for a ring $R$: An element $r\in R$ is called **strongly nilpotent** if every sequence $r = r_0,r_1,r_2,\dots$ such that $r_{n+1}\in r_nRr_n$ is eventually zero. Why introduce this notion?

Well, did you know that every finite integral domain is a field? If $R$ is an integral domain and $a\in R$ is *nonzero*, then the multiplication map $R\to R$ given by $x\mapsto ax$ is injective, since $R$ has no zero divisors. If $R$ is finite, then it must also be surjective, so $a$ is invertible!
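As a quick sanity check, here is a brute-force sketch of this argument in $\mathbb{Z}/7$ (a finite integral domain, since $7$ is prime): for every nonzero $a$, the map $x\mapsto ax$ hits every element, so each $a$ has an inverse.

```python
# Brute-force check of "finite integral domain => field" in Z/7.
p = 7
R = list(range(p))  # the elements of Z/7

for a in R[1:]:  # every nonzero element a
    # x -> a*x is injective (no zero divisors), hence surjective on a finite set:
    image = sorted((a * x) % p for x in R)
    assert image == R
    # surjectivity means some x satisfies a*x = 1, i.e. a is invertible:
    inverse = next(x for x in R if (a * x) % p == 1)
    assert (a * inverse) % p == 1
```

Of course, $\mathbb{Z}/7$ is just one finite integral domain; the point is that the injective-implies-surjective counting argument is exactly what the loop verifies.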

Another way of stating this neat fact is that if $R$ is any ring and $P$ is a prime ideal of $R$ such that $R/P$ is finite, then $P$ is also a maximal ideal. A variation of this idea is that every prime ideal in a finite commutative ring is actually maximal. Yet another is that finite commutative rings have Krull dimension zero.

But let's not get carried away. Remember that we also recently talked about the Jacobson radical of a ring $R$? It is the intersection of all the maximal left ideals of $R$, which turns out to be the same as the intersection of all the maximal right ideals of $R$. From our discussion of finite commutative rings, where every prime ideal is maximal (and, by commutativity, every maximal ideal is prime), we see that the Jacobson radical of a finite commutative ring is exactly the intersection of all its prime ideals.

For any commutative ring, the intersection of all the primes is known as the **nilradical**, and it is so named because the nilradical is also the set of all the nilpotent elements. Thus, in a finite commutative ring, the Jacobson radical consists precisely of the nilpotent elements.
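This coincidence is easy to verify by hand in a small example. Here is a sketch in $\mathbb{Z}/12$, whose maximal ideals are $(2)$ and $(3)$: the nilpotent elements agree with the intersection of the maximal ideals, i.e. with the Jacobson radical.

```python
# Check nilradical = Jacobson radical in the finite commutative ring Z/12.
n = 12
R = list(range(n))

def ideal(g):
    """The principal ideal (g) in Z/n."""
    return {(g * x) % n for x in R}

# a is nilpotent iff a^k = 0 for some k (k <= n suffices here):
nilpotents = {a for a in R if any(pow(a, k, n) == 0 for k in range(1, n + 1))}

# The maximal ideals of Z/12 are (2) and (3); their intersection is J(Z/12).
jacobson = ideal(2) & ideal(3)

assert nilpotents == jacobson == {0, 6}
```

Indeed $6^2 = 36 \equiv 0 \pmod{12}$, and $\{0, 6\} = (2)\cap(3)$.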

What if we drop the word "commutative"? That's why we introduced the notion of **strongly nilpotent**!

We already saw the following example: in any $2\times 2$ matrix ring (over a nonzero ring), the element

$$\begin{pmatrix}0 & 1\\ 0 & 0\end{pmatrix}$$

is nilpotent but not strongly nilpotent. This makes sense from the point of view of the Jacobson radical too. For example, the ring of $2\times 2$ matrices over a field is simple, so its only proper two-sided ideal is zero; since the Jacobson radical is a proper two-sided ideal, it must be zero. Therefore this element is not in the Jacobson radical. Can you find a direct argument that shows that this element is not in the Jacobson radical of the $2\times 2$ matrix ring over any nonzero ring?
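Both halves of the claim can be checked concretely with integer matrices. Writing $E_{12}$ for the matrix above and $E_{21}$ for its transpose, we have $E_{12}^2 = 0$, yet the sequence $r_0 = E_{12}$, $r_{n+1} = r_n E_{21} r_n \in r_n R r_n$ is constant and never reaches zero:

```python
import numpy as np

E12 = np.array([[0, 1], [0, 0]])  # the nilpotent element above
E21 = np.array([[0, 0], [1, 0]])  # its transpose

# Nilpotent: E12 squares to zero.
assert (E12 @ E12 == 0).all()

# Not strongly nilpotent: choosing r_{n+1} = r_n * E21 * r_n gives a
# sequence with r_{n+1} in r_n R r_n that is constantly E12, never zero.
r = E12
for _ in range(10):
    r = r @ E21 @ r
    assert (r == E12).all()
```

The key computation is $E_{12} E_{21} E_{12} = E_{11} E_{12} = E_{12}$, so the sequence repeats forever.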

So that's the reason for introducing the concept of **strongly nilpotent**. It is a plausible generalisation of nilpotency to noncommutative rings, at least from the point of view of the Jacobson radical!