# Dual of a Hopf Algebra

Is the dual of a Hopf algebra $A$ its group of characters, and is it also an $A$-module?

1. There are several notions of a dual of a bialgebra (the linear-algebraic dual, the graded dual, the reduced dual/0-dual/Sweedler dual), but none of them consists only of characters. Recall that characters are *algebra* homomorphisms from A to the ground field, so they don’t form a vector space (the sum of two algebra homomorphisms is usually not an algebra homomorphism); they are actually linearly independent (by a suitable generalization of Theorem 3 in http://www.math.uconn.edu/~kconrad/blurbs/galoistheory/linearchar.pdf ), so they form a *basis* of a vector space (but this space is, in general, much smaller than any dual of A). In order to get a vector space, you have to use *vector space* homomorphisms from A to the ground field.
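The linear independence of characters can be seen concretely for the group algebra $\mathbb{C}[\mathbb{Z}/n]$, whose characters are $\chi_j : i \mapsto \omega^{ij}$ for $\omega$ a primitive $n$-th root of unity. They are pairwise orthonormal, hence linearly independent. A quick Python sanity check (my illustrative example, not part of the original discussion):

```python
import cmath

n = 5
omega = cmath.exp(2j * cmath.pi / n)

# The characters of the group algebra C[Z/n] are chi_j : i |-> omega^(i*j);
# as algebra maps to C they are determined by their values on the group Z/n.
chi = lambda j: [omega ** (i * j) for i in range(n)]

def inner(u, v):
    # normalized Hermitian inner product on functions Z/n -> C
    return sum(x * y.conjugate() for x, y in zip(u, v)) / n

# Distinct characters are orthonormal, hence linearly independent:
for j in range(n):
    for k in range(n):
        assert abs(inner(chi(j), chi(k)) - (1 if j == k else 0)) < 1e-9
```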

The *linear-algebraic dual* of A is just the vector space A* of all k-linear maps from A to k. (Here and in the following, k is the ground field.) Unless A is finite-dimensional, this dual A* is merely an algebra with counit, but with no comultiplication to make it a bialgebra.
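To make the algebra structure on A* explicit (my addition, in Sweedler notation): the product on $A^*$ is the convolution, dualizing the comultiplication of $A$,

```latex
(f * g)(x) \;=\; \sum_{(x)} f\!\left(x_{(1)}\right) g\!\left(x_{(2)}\right),
\qquad \text{where } \Delta(x) \;=\; \sum_{(x)} x_{(1)} \otimes x_{(2)},
```

with unit $\epsilon_A$ (the counit of $A$); the counit of $A^*$ is $f \mapsto f(1_A)$.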

If A is a graded vector space, then the *graded dual* of A is the direct sum of the linear-algebraic duals of the graded components of A. This is a subspace of the linear-algebraic dual A* of A, and has the advantage of being a bialgebra if A was a graded bialgebra with all graded components finite-dimensional. (This case occurs frequently in algebraic combinatorics. Usually the n-th graded component of a combinatorial Hopf algebra is spanned by things like graphs on n vertices / partitions of n / subsets of {1,2,…,n} / etc., and is thus finite-dimensional.)
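Writing $A = \bigoplus_{n \ge 0} A_n$, the graded dual (the notation $A^{\vee}$ is my own) sits inside the full dual as

```latex
A^{\vee} \;=\; \bigoplus_{n \ge 0} \left(A_n\right)^{*}
\;\subseteq\;
\prod_{n \ge 0} \left(A_n\right)^{*} \;=\; A^{*}.
```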

In case you are interested, here is the third notion of dual (I don’t think it has much use in combinatorics; it is used in the representation theory of algebraic groups):

The *Sweedler dual* (a.k.a. reduced dual, a.k.a. 0-dual) of a bialgebra A is defined as the subspace of A* (the linear-algebraic dual) consisting of only those linear maps f : A -> k which satisfy any of the following equivalent properties:

1) There exists some natural number n and some linear maps f_1, f_2, …, f_n, g_1, g_2, …, g_n from A to k such that every x in A and y in A satisfy f(xy) = \sum_{i=1}^n f_i(x) g_i(y). (This condition looks messy but in fact it is just a formal way to say “we can define the coproduct of f”. This coproduct, by the way, will then be the tensor \sum_{i=1}^n f_i \otimes g_i.)

2) There is an ideal I of A such that A/I is a finite-dimensional vector space and f(I) = 0.

3) There is a left ideal I of A such that A/I is a finite-dimensional vector space and f(I) = 0.

4) There is a right ideal I of A such that A/I is a finite-dimensional vector space and f(I) = 0.

5) The vector space Af is finite-dimensional. Here, you have to use the left A-module structure on A* which is given by (af)(x) = f(xa) for all a in A, f in A* and x in A.

6) The vector space fA is finite-dimensional. Here, you have to use the right A-module structure on A* which is given by (fa)(x) = f(ax) for all a in A, f in A* and x in A.

This Sweedler dual is often written A^0 or A^{\circ}, whence the name “0-dual”.
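Condition 5) is easy to verify directly for characters: if f is a character, then (af)(x) = f(xa) = f(x) f(a), so af = f(a) f and the space Af is at most one-dimensional; this is one way to see that characters always lie in A^0. A minimal pure-Python sketch (my illustrative example) with A = k[x], polynomials represented as functions, and f = evaluation at a point:

```python
c = 2  # evaluation point (any scalar works)

def f(p):
    # the character "evaluate at c" on k[x]; f(pq) = f(p) f(q)
    return p(c)

def mul(p, q):
    # product of two polynomial functions
    return lambda t: p(t) * q(t)

def act(a, g):
    # left action of A on A*: (a.g)(p) = g(p * a)
    return lambda p: g(mul(p, a))

a = lambda t: t**2 + t  # an algebra element
p = lambda t: t**3 + 1  # a test polynomial

# Since f is a character, a.f = f(a) * f, so A.f is one-dimensional:
assert act(a, f)(p) == f(a) * f(p)
```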

Sorry for the absence of LaTeX; it seems I can never concentrate enough to get it right on WordPress without a preview function.

2. briansdumb

The set of characters of $A$ consists of algebra morphisms $A \rightarrow k$, all of which are also linear functionals, hence elements of the set $A^*$, true? Is your point that $A^*$ also contains non-multiplicative maps? If so, I agree.
Then, for finite-dimensional $A$, do we have a sub-Hopf algebra consisting of the characters?

As for $A$-modules, I think the construction could be done in multiple ways. One might be to have ring elements of $A$ act on the right on elements of the set of characters (or of $A^*$) by evaluation.

3. Yes, A* contains all linear maps, not just the multiplicative ones.

Yes, the subspace of A* generated by the characters is a sub-Hopf algebra of A* if A is finite-dimensional. If A is not finite-dimensional, it is a sub-Hopf algebra of the Sweedler dual A^0 (characters always lie in A^0).
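One way to see why the characters span a subalgebra: the convolution of two characters is again a character (since the coproduct is an algebra map). For the group algebra A = k[G], where Δ(g) = g ⊗ g, convolution of functionals is just pointwise multiplication of functions on G. An illustrative Python check (my example) with G = Z/5:

```python
import cmath

n = 5
omega = cmath.exp(2j * cmath.pi / n)
chi = lambda j: [omega ** (i * j) for i in range(n)]  # j-th character of Z/n

# In C[Z/n] the coproduct is Delta(g) = g (x) g, so the convolution product
# on the dual restricts to POINTWISE multiplication of functions on the group.
conv = [x * y for x, y in zip(chi(2), chi(3))]

# The convolution of two characters is again a character: chi_2 * chi_3 = chi_0 here
assert all(abs(c - d) < 1e-9 for c, d in zip(conv, chi(0)))
```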

As for the A-module structure on A*, I do not think there are many different ways to define it. What you propose (evaluation) doesn’t give an A-module structure, since the evaluation of an element f of A* at an element of A is an element of k, not of A*.

4. briansdumb

Let $a\in A$ act on an element $\alpha\in A^*$ of the $A$-module by $\langle \alpha , a \rangle$. This is again in $A^*$ if we define its action on $h\in A$ as $\alpha(ah)$. We could just as easily have defined it to be $\langle \alpha , S(a) \rangle$, where $S$ is the antipode of $A$. I think any string of elements of $A$ and $S$ could be put in there, and it would still be in $A^*$, which is why I say there are many ways of making $A^*$ into an $A$-module.

5. Ah, right, you could use the antipode, although you would then have to switch left with right (since the antipode is an algebra antihomomorphism). Or you could use powers of the antipode. I forgot that you are talking of a Hopf algebra, not just a bialgebra.

Still, this is not what is usually called “evaluation”. And for it to be an action, you cannot just put any random string of elements: e.g. you cannot define an action by (af)(x) = f(2xa) for all a in A, f in A* and x in A.

• briansdumb

Since to be in our module called $A^*$ you only need to be a linear map (see your comment above), and not an algebra homomorphism, I wonder if the order of the multiplication matters.
As for your comment about strings, could you elaborate on what the trouble would be with this construction?

6. Well, to define an action of an algebra R on a vector space V it is not enough to just define a bilinear map R \times V \to V. We also must ensure that this map satisfies the “associativity” property (ab)v = a(bv) for all a in R, b in R and v in V. And this is where the correct order of multiplication matters (unless R is commutative).
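A concrete illustration of why the order matters (mine, not from the thread), with A the noncommutative algebra of 2×2 matrices and f the trace functional:

```python
def mmul(x, y):
    # product in the noncommutative algebra A of 2x2 matrices
    return [[sum(x[i][k] * y[k][j] for k in range(2)) for j in range(2)]
            for i in range(2)]

tr = lambda m: m[0][0] + m[1][1]   # a sample linear functional f in A*

def act(a, g):
    # the left action from the thread: (a.g)(x) = g(xa)
    return lambda x: g(mmul(x, a))

def bad(a, g):
    # the OTHER order, (a.g)(x) = g(ax): a right action, not a left one
    return lambda x: g(mmul(a, x))

a, b, X = [[1, 2], [3, 4]], [[0, 1], [1, 1]], [[2, 0], [5, 8]]

# (ab).f = a.(b.f) holds for the correct order...
assert act(mmul(a, b), tr)(X) == act(a, act(b, tr))(X)
# ...but fails for the wrong one:
assert bad(mmul(a, b), tr)(X) != bad(a, bad(b, tr))(X)
```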

7. briansdumb

I agree now, good point.

What trouble is there with defining the action to include a random string?

8. The same as with the order. If you define an action by

(af)(x) = f(xS(a)) for all a in A, f in A* and x in A,

then you get a nasty surprise: (ab)f = b(af) rather than a(bf). So you have to make this a right action rather than a left one. And with more complicated strings you get no action at all.
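This order reversal can be checked concretely. In the sketch below (my illustrative example) I use matrix transpose as a stand-in anti-homomorphism for S; the 2×2 matrix algebra is not a Hopf algebra, but the argument only uses that S reverses products:

```python
def mmul(x, y):
    # product in the noncommutative algebra of 2x2 matrices
    return [[sum(x[i][k] * y[k][j] for k in range(2)) for j in range(2)]
            for i in range(2)]

tr = lambda m: m[0][0] + m[1][1]          # a sample linear functional
S = lambda m: [list(r) for r in zip(*m)]  # transpose: S(ab) = S(b) S(a)

# the twisted action from the comment: (a.f)(x) = f(x S(a))
act = lambda a, g: (lambda x: g(mmul(x, S(a))))

a, b, X = [[1, 2], [3, 4]], [[0, 1], [1, 1]], [[2, 0], [5, 8]]

# The twist reverses the order: (ab).f = b.(a.f) ...
assert act(mmul(a, b), tr)(X) == act(b, act(a, tr))(X)
# ... and NOT a.(b.f), so this is a right action, not a left one:
assert act(mmul(a, b), tr)(X) != act(a, act(b, tr))(X)
```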

9. briansdumb

so antipodes are rude, got it…
We agree then to a revised statement?
1. The characters of A span a sub-Hopf algebra of A* (of the Sweedler dual A^0 when A is infinite-dimensional)
2. A* forms an A-module under the action(s) we just discussed
3. The characters of A form a submodule of 2.