
Over on Reddit there's a discussion where one commenter admitted:

"the whole (^) vs (^^) vs (**) [distinction in Haskell] confuses me."
It's clear to me, but it's something they don't teach in primary school, and it's something most programming languages fail to distinguish. The main problem, I think, for both primary ed and other PLs, is that they have an impoverished notion of what "numbers" could be, and this leads to artificially conflating things which should be kept distinct. I wrote a reply over on Reddit, hoping to elucidate the distinction, but I thought I should repeat it in a more persistent venue so it can reach a wider audience.

First, let us recall the basic laws of powers:

a^0 = e
a^1 = a
(a^x)^y = a^(x*y)
(a^x)*(a^y) = a^(x+y)
(a*b)^x = (a^x)*(b^x)

There are two very important things to notice here. First off, our underlying algebra (the as and bs) only needs to have a notion of multiplication, (*), with identity, e. Second, our powers (the xs and ys) have an entirely different structure; in particular, they form at least a semiring (+,0,*,1). Moreover, if we're willing to give up some of those laws, then we can weaken these requirements. For example, if we get rid of a^0 = e then we no longer need our underlying algebra to be a monoid; being a semigroup is enough. And actually, we don't even need it to be a semigroup: we don't need full associativity, since all we need for this to be consistent is power-associativity.

So we can go weaker and more abstract, but let's stick here for now. Any time we have a monoid, we get a notion of powers for free. This notion is simply iterating our multiplication, and we use the commutative semiring (Natural,+,0,*,1) in order to represent our iteration count. This is the notion of powers that Haskell's (^) operator captures. Unfortunately, since Haskell lacks a standard Natural type (or Semiring class), the type signature for (^) lies and says our powers could be any Integral type, such as Integer (Integral building on Num, which is the closest thing we have to Ring), but the documentation warns that negative powers will throw exceptions.
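
To make this concrete, here is a minimal sketch of monoid powers via repeated squaring. I'm assuming a Natural type such as the one in Numeric.Natural (any natural-number type will do), so that negative powers are ruled out by the type rather than by a runtime exception:

import Numeric.Natural (Natural)

-- Monoid powers: iterate (<>) a Natural number of times.
-- Repeated squaring keeps this logarithmic in the exponent.
mpow :: Monoid a => a -> Natural -> a
mpow _ 0 = mempty
mpow a n
  | even n    = b <> b
  | otherwise = a <> b <> b
  where b = mpow a (n `div` 2)

For example, mpow "ab" 3 is "ababab". (Data.Semigroup's stimes is essentially this, though its exponent type is Integral rather than Natural, so it too must error on negatives.)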

Moving on to the (^^) operator: suppose our monoid is actually a group, i.e. it has a notion of reciprocals. Now we need some way to represent those reciprocals; so if we add subtraction to our powers (yielding the commutative ring (Integer,+,-,0,*,1)), we get the law a^(-x) = 1/(a^x). The important thing here is to recognize that not all monoids form groups. For example, take the monoid of lists with concatenation. We do have a (^) notion of powers, which may be more familiar as the replicate function from the Prelude. But what is the reciprocal of a string? What is the inverse of concatenation? The replicate function simply truncates things and treats negative powers as if they were zero, which is on par with (^) throwing exceptions. It is because not all monoids are groups that we need a notion of powers for monoids (i.e., (^)) which is different from the notion of powers for groups (i.e., (^^)). And though Haskell fails to do so, we can cleanly capture this difference in the type signatures for these operations.
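
As a sketch of how the types could capture it: Haskell has no standard Group class, so the one below is hypothetical (the groups package on Hackage offers an equivalent), but given one, the (^^)-style powers fall out of mpow:

-- A hypothetical Group class: a Monoid whose elements have inverses.
class Monoid a => Group a where
  inverse :: a -> a

-- Group powers: negative exponents become powers of the inverse,
-- per the law a^(-x) = 1/(a^x).
gpow :: Group a => a -> Integer -> a
gpow a n
  | n < 0     = mpow (inverse a) (fromInteger (negate n))
  | otherwise = mpow a (fromInteger n)

Lists have no Group instance, and that's exactly the point: the constraint records which algebras support this stronger notion of powers.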

Further up, we get another notion of powers which Haskell doesn't highlight; namely, the notion of powers that arises from the field (Rational,+,-,0,*,/,1). To get here, we take our group and add to it the ability to take roots. The fractions in powers are now taken to represent the roots, as in the law a^(1/y) = root y a. Again, note the vast discrepancy between our underlying algebra, which has multiplication, reciprocals, and roots, versus our powers, which have addition, subtraction, multiplication, and division.
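
Continuing the hypothetical sketch, a class for groups-with-roots (nothing like it exists in base) shows how the Rational structure of the powers decomposes into the structure of the underlying algebra:

import Data.Ratio (numerator, denominator)

-- Hypothetical: a group whose elements also have y-th roots.
class Group a => Rooted a where
  root :: Natural -> a -> a   -- root y a, i.e. a^(1/y)

-- Rational powers: a^(x/y) = root y (a^x).
qpow :: Rooted a => a -> Rational -> a
qpow a q = root (fromInteger (denominator q)) (gpow a (numerator q))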

Pulling it back a bit, what if our monoid has a notion of roots, but does not have inverses? Here our powers form a semifield, i.e. a commutative semiring where the non-zero elements have multiplicative inverses; e.g., the non-negative rationals. This notion is rather obscure, so I don't fault Haskell for lacking it, though it's worth mentioning.
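
Were we to want it, though, the powers are directly representable, since Ratio Natural is exactly the non-negative rationals; a hypothetical sketch:

import Data.Ratio (Ratio, numerator, denominator)

-- Hypothetical: a monoid with roots but without inverses.
class Monoid a => RootedMonoid a where
  rootM :: Natural -> a -> a

-- Semifield powers: Ratio Natural rules out negative exponents,
-- just as Natural did for plain monoid powers.
spow :: RootedMonoid a => a -> Ratio Natural -> a
spow a q = rootM (denominator q) (mpow a (numerator q))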

Finally, (**) is another beast altogether. In all the previous examples of powers there is a strong distinction between the underlying algebra and the powers over that algebra. But here we get exponentiation; that is, our algebra has an internal notion of powers over itself! This is remarkably powerful and should not be confused with the basic notion of powers. Again, this is easiest to see by looking at where it fails. Consider multiplication of (square) matrices over some semiring. This multiplication is associative, so we can trivially implement (^). If our semiring is actually a field, then almost all (though not all) matrices have inverses, so we can pretend to implement (^^). For some elements we can even go so far as taking roots, though we run into the problem of there being multiple roots. But as for exponentiation? It's not even clear that (**) should be meaningful on matrices. Or rather, to the extent that it is meaningful, it's not clear that the result should be a matrix.
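
To see the (^) half concretely, here's a hypothetical 2x2 matrix type, just for illustration: multiplication is associative with the identity matrix as unit, so mpow applies immediately, whereas nothing here suggests what (**) ought to mean.

-- Hypothetical 2x2 matrices over any Num, a monoid under multiplication.
data M2 a = M2 a a a a deriving Show

instance Num a => Semigroup (M2 a) where
  M2 a b c d <> M2 e f g h =
    M2 (a*e + b*g) (a*f + b*h) (c*e + d*g) (c*f + d*h)

instance Num a => Monoid (M2 a) where
  mempty = M2 1 0 0 1   -- the identity matrix

For instance, mpow (M2 1 1 1 0 :: M2 Integer) 10 is M2 89 55 55 34, the familiar Fibonacci-by-matrix-powers trick.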

N.B., I refer to (**) as exponentials, in contrast to (^), (^^), etc. as powers, following the standard distinction in category theory and elsewhere. Do note, however, that this notion of exponentials is different again from the notion of the antilog exp, i.e. the inverse of log. The log and antilog are maps between additive monoids and multiplicative monoids, with all the higher structure arising from that. We can, in fact, give a notion of antilog for matrices if we assume enough about the elements of those matrices.
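
For instance, if the entries are Doubles then we can sketch that antilog by truncating the power series exp A = I + A + A^2/2! + ...; this reuses the hypothetical M2 type above, and a real implementation would have to worry about convergence and numerical error:

madd :: Num a => M2 a -> M2 a -> M2 a
madd (M2 a b c d) (M2 e f g h) = M2 (a+e) (b+f) (c+g) (d+h)

mscale :: Num a => a -> M2 a -> M2 a
mscale k (M2 a b c d) = M2 (k*a) (k*b) (k*c) (k*d)

-- The matrix antilog, as a truncated power series.
mexp :: M2 Double -> M2 Double
mexp a = foldr madd (M2 0 0 0 0) (take 20 terms)
  where
    powers     = iterate (a <>) mempty             -- I, A, A^2, ...
    factorials = scanl (*) 1 [1 ..] :: [Double]
    terms      = zipWith (mscale . recip) factorials powers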
