\classheader{2012-11-07} Let $\lambda\in\eucal{P}_d$. Recall the various classes of symmetric polynomials we defined last time: \begin{center} \begin{tabular}{c|c} $e_\lambda$ & elementary\\\hline $m_\lambda$ & monomial\\\hline $h_\lambda$ & complete\\\hline $p_\lambda$ & power sum\\\hline $s_\lambda$ & Schur \end{tabular} \end{center} Last time, we defined the quantity \[XY=\prod_{i,j\geq 1}\frac{1}{1-x_iy_j}.\] \begin{proposition}[2nd Cauchy identity] \[XY=\sum_\lambda h_\lambda(x)m_\lambda(y)=\sum_\lambda h_\lambda(y)m_\lambda(x).\] \end{proposition} \begin{proof} Recall that the generating function for the complete symmetric polynomials was \[H(t)=\prod_{i\geq 1}\frac{1}{1-tx_i}.\] Thus, \[XY=\prod_jH(y_j)=\prod_j\sum_{r\geq 0}h_r(x)y_j^r=\sum_{\alpha\in\Z_{\geq 0}^n} h_\alpha(x)y^\alpha=\sum_{\lambda}h_{\lambda}(x)m_{\lambda}(y),\] where in the last equality we grouped the $\alpha$'s according to which $\lambda$ they are a permutation of. \end{proof} \begin{proposition}[3rd Cauchy identity] \[XY=\sum_\lambda s_\lambda(x)s_\lambda(y).\] \end{proposition} \begin{proof} The determinantal formula says that for any $\alpha\in\Z_{\geq 0}^n$, \[a(x^\alpha)=a(x^\rho)\det(h_{\alpha_i-n+j})_{ij},\] where $h$'s with negative indices are declared to be 0, and $\rho$ is as we defined last time, \[\rho=(n-1,n-2,\ldots,1,0).\] Expanding the determinant as a sum over $S_n$ (and writing $h_\gamma:=\prod_i h_{\gamma_i}$ for $\gamma\in\Z^n$), this says \[a(x^\alpha)=a(x^\rho)\sum_{s\in S_n}\sign(s)h_{\alpha-s(\rho)}(x).\] Now we use the 2nd identity to see that \[a(x^\rho)a(y^\rho)\cdot XY=a(x^\rho)\sum_{\substack{s\in S_n\\ \lambda}} h_\lambda(x)\sign(s)y^{s(\rho)}m_\lambda(y),\] where we expanded $a(y^\rho)$ as an alternating sum explicitly. Then, since $m_\lambda(y)$ is the sum of $y^\alpha$ over the rearrangements $\alpha$ of $\lambda$, and $h_\alpha=h_\lambda$ for any such $\alpha$, \[a(x^\rho)a(y^\rho)\cdot XY=a(x^\rho)\sum_{\substack{\alpha\in\Z_{\geq 0}^n\\s\in S_n}} h_\alpha(x)\sign(s)y^{\alpha+s(\rho)}.\] Letting $\beta=\alpha+s(\rho)$, \[a(x^\rho)a(y^\rho)\cdot XY=a(x^\rho)\sum_{\beta,s}\sign(s)h_{\beta-s(\rho)}(x)y^\beta,\] which, for each fixed $\beta$, is precisely the expansion in the determinantal formula.
Thus, \[a(x^\rho)a(y^\rho)\cdot XY=\sum_{\beta\in\Z_{\geq 0}^n}a(x^\beta)y^\beta=\sum_{\substack{\mu\\ s\in S_n}}a(x^{s(\mu)})y^{s(\mu)}=\sum_{\mu,s}a(x^\mu)\sign(s)y^{s(\mu)},\] where $\mu$ runs over the strictly decreasing sequences in $\Z_{\geq 0}^n$ (any $\beta$ with a repeated entry has $a(x^\beta)=0$, and every other $\beta$ is $s(\mu)$ for a unique such pair), and in the last step we used that $a(x^{s(\mu)})=\sign(s)\cdot a(x^\mu)$. Recognizing the inner sum over $s$ as $a(y^\mu)$, and noting that every strictly decreasing $\mu$ is $\lambda+\rho$ for a unique partition $\lambda$, this is equal to \[\sum_{\mu}a(x^\mu)a(y^\mu)=\sum_{\lambda}a(x^{\lambda+\rho})a(y^{\lambda+\rho}).\] Dividing both sides by $a(x^\rho)a(y^\rho)$ and applying the definition of the Schur polynomials, we are done. \end{proof} The Hall inner product $\langle\,\cdot\,,\,\cdot\,\rangle:R\times R\to\Q$ is the bilinear form defined by $\langle h_\mu,m_\lambda\rangle=\delta_{\lambda\mu}$. \begin{lemma} Let $\{u_\lambda\}$ and $\{v_\lambda\}$ be a pair of bases of $R$. Then the following are equivalent: \begin{enumerate} \item $\langle u_\lambda,v_\mu\rangle=\delta_{\lambda\mu}$ \item $\sum_{\lambda} u_\lambda(x)v_\lambda(y)=XY$. \end{enumerate} \end{lemma} \begin{proof} Expanding in the bases, \[u_\lambda=\sum_{\nu}a_{\lambda\nu}h_\nu,\qquad v_\mu=\sum_{\sigma} b_{\mu\sigma}m_\sigma.\] Then 1 is equivalent to \[\sum_\nu a_{\lambda\nu}b_{\mu\nu}=\delta_{\lambda\mu},\] which says that the transpose of the matrix $(b_{\mu\sigma})$ is a right inverse of the matrix $(a_{\lambda\nu})$. Substituting the expansions into \[\sum_{\lambda}u_\lambda(x)v_\lambda(y)=XY\overset{\text{Cauchy 2}}{=}\sum_\nu h_\nu(x)m_\nu(y)\] and comparing coefficients of $h_\nu(x)m_\sigma(y)$, 2 is equivalent to \[\sum_\lambda a_{\lambda\nu}b_{\lambda\sigma}=\delta_{\nu\sigma},\] which says that it is a left inverse. Since a left inverse is also a right inverse (degree by degree, these are finite square matrices), 1 and 2 are equivalent. \end{proof} Applying the 1st Cauchy identity, \[XY=\sum_{\lambda}\frac{1}{z_\lambda}p_\lambda(x)p_\lambda(y),\] which we proved last time, we get as a corollary of this lemma that \[\langle p_\lambda,p_\mu\rangle=z_\lambda\cdot\delta_{\lambda\mu}.\] Another corollary is that $\langle s_\lambda,s_\mu\rangle=\delta_{\lambda\mu}$ (by the 3rd Cauchy identity and the lemma).
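As a sanity check (not part of the lecture; the code below and its names are our own), the 2nd and 3rd Cauchy identities can be compared by brute force in small degree: for each $d$ we verify that $\sum_{\lambda\vdash d} h_\lambda(x)m_\lambda(y)$ and $\sum_{\lambda\vdash d} s_\lambda(x)s_\lambda(y)$ agree as polynomials in finitely many variables, computing $s_\lambda$ from the determinantal formula $s_\lambda=\det(h_{\lambda_i-i+j})$.

```python
# Sanity check of the Cauchy identities, degree by degree, in n variables.
# Polynomials are dicts mapping exponent tuples to integer coefficients.
from itertools import permutations, product

n = 3  # number of x (and of y) variables

def mul(p, q):
    r = {}
    for a, ca in p.items():
        for b, cb in q.items():
            e = tuple(u + v for u, v in zip(a, b))
            r[e] = r.get(e, 0) + ca * cb
    return {e: c for e, c in r.items() if c}

def add(p, q):
    r = dict(p)
    for e, c in q.items():
        r[e] = r.get(e, 0) + c
    return {e: c for e, c in r.items() if c}

def h(r):
    # complete homogeneous symmetric polynomial h_r(x_1,...,x_n)
    if r < 0:
        return {}  # h's with negative index are declared to be 0
    return {e: 1 for e in product(range(r + 1), repeat=n) if sum(e) == r}

def m(lam):
    # monomial symmetric polynomial: one term per distinct rearrangement
    return {e: 1 for e in set(permutations(lam + (0,) * (n - len(lam))))}

def sign(perm):
    # sign via inversion count
    return (-1) ** sum(perm[i] > perm[j]
                       for i in range(len(perm)) for j in range(i + 1, len(perm)))

def s(lam):
    # Schur polynomial via det(h_{lam_i - i + j}), expanded over permutations
    l, det = len(lam), {}
    for perm in permutations(range(l)):
        term = {(0,) * n: 1}
        for i in range(l):
            term = mul(term, h(lam[i] - i + perm[i]))
        det = add(det, {e: sign(perm) * c for e, c in term.items()})
    return det

def partitions(d, maxpart=None):
    if d == 0:
        yield ()
        return
    for k in range(min(d, maxpart or d), 0, -1):
        for rest in partitions(d - k, k):
            yield (k,) + rest

def tensor(px, py):
    # p(x) q(y) as a polynomial in the 2n variables x_1..x_n, y_1..y_n
    return {ex + ey: cx * cy for ex, cx in px.items() for ey, cy in py.items()}

for d in range(1, 5):
    lhs, rhs = {}, {}
    for lam in partitions(d):
        if len(lam) > n:
            continue  # m_lambda and s_lambda both vanish in n variables
        h_lam = {(0,) * n: 1}
        for k in lam:
            h_lam = mul(h_lam, h(k))
        lhs = add(lhs, tensor(h_lam, m(lam)))
        rhs = add(rhs, tensor(s(lam), s(lam)))
    assert lhs == rhs, f"mismatch in degree {d}"
print("h-m and s-s expansions of XY agree for d = 1..4")
```

Partitions with more than $n$ parts are skipped on both sides, which is harmless: $m_\lambda$ and $s_\lambda$ both vanish when there are fewer variables than parts.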
Lastly, we get as a corollary that $\langle\,\cdot\,,\,\cdot\,\rangle$ is a symmetric positive definite bilinear form (this is important because it is defined in an \textit{a priori} non-symmetric way). Now we define an involution $\tau:R\to R$. \begin{definition} Define $\tau:R\to R$ by $\tau(e_r)=h_r$. Because the $e_r$ generate $R$ as an algebra, this determines an algebra endomorphism of $R$, and then $\tau(e_\lambda)=h_\lambda$ for all $\lambda\in\eucal{P}_n$. \end{definition} \begin{proposition} The map $\tau$ is an involution, and $\tau$ respects $\langle\,\cdot\,,\,\cdot\,\rangle$. \end{proposition} \begin{proof} Newton's identities tell us that \[\sum_{r=0}^n(-1)^re_rh_{n-r}=0\qquad(n\geq 1),\] and this relation is symmetric (up to the sign $(-1)^n$) under switching the $e$'s and $h$'s. Applying $\tau$ to it and inducting on $r$, we get $\tau(h_r)=e_r$; hence $\tau^2(e_r)=\tau(h_r)=e_r$, and since the $e_r$ generate $R$, we conclude $\tau^2=\id$. From last time, we have equations expressing $p_r$ in terms of the $e_\lambda$'s and $h_\lambda$'s; these equations and induction imply that $\tau(p_r)=(-1)^{r-1}p_r$, so $\tau(p_\lambda)=\pm p_\lambda$ with a sign depending only on $\lambda$ (in particular, for $\lambda=\mu$ the two signs multiply to $1$). Thus, \[\langle \tau(p_\lambda),\tau(p_\mu)\rangle=\begin{cases} \pm \langle p_\lambda,p_\mu\rangle = 0 & \text{ if }\lambda\neq \mu,\\ \langle p_\lambda,p_\lambda\rangle & \text{ if }\lambda=\mu.
\end{cases}\qedhere\] \end{proof} \subsection*{Young diagrams} The Young diagram of a partition $\lambda=(\lambda_1,\lambda_2,\ldots)$ with $|\lambda|=n$ is \begin{center} \begin{tikzpicture}[scale=0.4] \draw[thick] (0,0) rectangle (1,7); \draw[thick] (0,0) rectangle (1,6); \draw[thick] (0,0) rectangle (1,5); \draw[thick] (0,0) rectangle (1,4); \draw[thick] (0,0) rectangle (1,3); \draw[thick] (0,0) rectangle (1,2); \draw[thick] (0,0) rectangle (1,1); \draw[thick] (0,0) rectangle (2,5); \draw[thick] (0,0) rectangle (2,4); \draw[thick] (0,0) rectangle (2,3); \draw[thick] (0,0) rectangle (2,2); \draw[thick] (0,0) rectangle (2,1); \draw[thick] (0,0) rectangle (3,5); \draw[thick] (0,0) rectangle (3,4); \draw[thick] (0,0) rectangle (3,3); \draw[thick] (0,0) rectangle (3,2); \draw[thick] (0,0) rectangle (3,1); \draw[thick] (0,0) rectangle (8,2); \draw[thick] (0,0) rectangle (7,2); \draw[thick] (0,0) rectangle (6,2); \draw[thick] (0,0) rectangle (5,2); \draw[thick] (0,0) rectangle (4,2); \draw[thick] (0,0) rectangle (9,1); \node at (-0.7,0.5) {$\lambda_1$}; \node at (-0.7,1.5) {$\lambda_2$}; \node at (-0.7,3.2) {$\vdots$}; \end{tikzpicture} %\vspace{-0.1in} \end{center} Thus, the total number of blocks in the diagram is $n$. There is a natural involution on Young diagrams, $\lambda\mapsto \lambda^t$, defined by interchanging rows and columns. There is another determinantal formula, which is symmetric with respect to this involution: \[s_\lambda=\det(e_{\lambda_i^t-i+j}).\] As a corollary, we get that $\tau(s_\lambda)=s_{\lambda^t}$ (apply involution to first determinantal formula; $e$'s and $h$'s are interchanged, and we flip $\lambda$). Now we'll start relating this to the representation theory of $S_n$. Fix a field $k$, and consider the polynomial ring $k[x_1,\ldots,x_n]$. Let $\{1,\ldots,n\}=I_1\sqcup \cdots\sqcup I_k$ be a partition with $\# I_j=\lambda_j$, which collectively we will call $I$. 
This naturally corresponds to $\lambda$, a partition in the sense we've been using. Then we define \[S_I=S_{I_1}\times\cdots\times S_{I_k}\subset S_n.\] We define $D_I$ to be a product of Vandermonde determinants of various sizes, \[D_I=D_{I_1}\cdots D_{I_k},\] where \[D_{I_m}:=\prod_{\substack{i,j\in I_m\\ i<j}}(x_i-x_j).\] \begin{theorem} Let $\operatorname{char}(k)=0$, and let $I$ be a partition as above. Then $V_\lambda:=kS_n\cdot D_I$ is an irrep of $S_n$, and in fact the $V_\lambda$ form a complete list of the irreducibles: $\{V_\lambda\mid \lambda\in \eucal{P}_n\}\cong\widehat{S_n}$. \end{theorem}
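The theorem can be checked by brute force for $S_3$ over $\Q$ (a sketch, not from the lecture; the code and the choice $n=3$ are our own, and we make no claim here about which $\lambda$ indexes which irrep, since that depends on convention). For each set partition $I$ of $\{1,2,3\}$ with block sizes $\lambda_j$, we form $D_I$, act on it by all six permutations, and compute the dimension of the span $kS_3\cdot D_I$:

```python
# Brute-force check for S_3 over Q: build D_I, act by all of S_3,
# and compute the dimension of the linear span of the orbit.
from fractions import Fraction
from itertools import combinations, permutations

n = 3

def D(blocks):
    # D_I = product over blocks of prod_{i<j in the block} (x_i - x_j)
    p = {(0,) * n: Fraction(1)}
    for block in blocks:
        for i, j in combinations(sorted(block), 2):
            q = {}  # multiply p by (x_i - x_j)
            for e, c in p.items():
                for idx, sgn in ((i, 1), (j, -1)):
                    e2 = list(e)
                    e2[idx] += 1
                    e2 = tuple(e2)
                    q[e2] = q.get(e2, 0) + sgn * c
            p = {e: c for e, c in q.items() if c}
    return p

def act(sigma, p):
    # permute the variables of p (since we range over all of S_3,
    # the orientation of the action does not affect the span)
    return {tuple(e[sigma[k]] for k in range(n)): c for e, c in p.items()}

def rank(polys):
    # Gaussian elimination over Q on the coefficient matrix
    monos = sorted({e for p in polys for e in p})
    mat = [[Fraction(p.get(e, 0)) for e in monos] for p in polys]
    r = 0
    for col in range(len(monos)):
        piv = next((i for i in range(r, len(mat)) if mat[i][col]), None)
        if piv is None:
            continue
        mat[r], mat[piv] = mat[piv], mat[r]
        for i in range(len(mat)):
            if i != r and mat[i][col]:
                f = mat[i][col] / mat[r][col]
                mat[i] = [a - f * b for a, b in zip(mat[i], mat[r])]
        r += 1
    return r

# block sizes = parts of lambda (variables are 0-indexed here)
cases = {
    (3,): [(0, 1, 2)],             # D_I = full Vandermonde
    (2, 1): [(0, 1), (2,)],        # D_I = x_0 - x_1
    (1, 1, 1): [(0,), (1,), (2,)], # D_I = 1 (empty products)
}
dims = {lam: rank([act(s, D(b)) for s in permutations(range(n))])
        for lam, b in cases.items()}
print(dims)
assert sum(d * d for d in dims.values()) == 6  # = |S_3|: every irrep appears once
```

The dimensions come out to $1$, $2$, $1$, matching the dimensions of the three irreducibles of $S_3$, and the sum of their squares is $|S_3|=6$, as the theorem predicts.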