Reading notes on Shankar's Principles of Quantum Mechanics (half transcription).

I saw someone else's transcription-style notes on Zhihu, so I wanted to write my own. Even after copying everything out once onto large sheets of paper, I still found it easy to forget what I had learned; on top of that, the paper has become tattered from living in my bag for ages, and I worry the notes could disappear any day now (.

So I am reorganizing them here on the blog, as a form of review.

!!!NOTICE: I transcribed directly from the English edition. I may translate it when I have time, and I will also add some of my own understanding; wherever the English suddenly turns clumsy, that is probably my own writing. Since I am very much a beginner teaching myself, please do not take any part of this on faith.

That's all! Let's begin.

1. Mathematical Introduction

1.1 Linear Vector Spaces

Note that the definition below drops the requirement that every vector have a magnitude and direction.

Definition 1 A linear vector space ​\mathbb{V} is a collection of objects ​|1\rangle,|2\rangle,\dots|v\rangle,\dots|w\rangle,\dots, called vectors, for which there exist:

  • vector sum:​|v\rang+|w\rang

  • multiplication by scalars:​a|v\rangle

  • closure: ​|v\rang+|w\rang \in \mathbb{V}

    • distributive:​a(|v\rang+|w\rang)=a|v\rang+a|w\rang
      ​(a+b)|v\rang=a|v\rang+b|v\rang
    • associative:​a(b|v\rang)=ab|v\rang
    • commutative:​|v\rang+|w\rang=|w\rang+|v\rang
      ​|v\rang+(|w\rang+|z\rang)=(|v\rang+|w\rang)+|z\rang
    • null vector:​|v\rang+|0\rang=|v\rang[exists]
    • inverse:​|-v\rang+|v\rang=|0\rang[exists]

Definition 2 The numbers ​a,b,\dots are called the field over which the vector space is defined.

  • ​|0\rang is unique
  • ​0|v\rang=|0\rang
  • ​|-v\rang=-|v\rang
  • ​|-v\rang is unique for each ​|v\rang

​\vec{v}\neq|v\rang

Definition 3 linearly independent and linearly dependent.

Definition 4 A vector space has dimension ​n if it can accommodate a maximum of ​n linearly independent vectors. It will be denoted by ​\mathbb{V}^n(R) if the field is real and by ​\mathbb{V}^n(C) if the field is complex.

Theorem 1 Any vector ​|v\rang in an n-dimensional space can be written as a linear combination of ​n linearly independent vectors ​|1\rang\dots|n\rang.

Definition 5 A set of ​n linearly independent vectors in an n-dimensional space is called a basis.

Definition 6 The coefficients of expansion ​v_i of a vector in terms of a linearly independent basis are called the components of the vector in that basis.

Theorem 2 The expansion ​|v\rang=\sum\limits_{i=1}^n v_i|i\rang of a vector in a given basis is unique.

1.2 Inner Product Spaces

Analogous to ​\vec{A}\cdot\vec{B}=A_xB_x+A_yB_y+A_zB_z: for ​|v\rang and ​|w\rang we denote the inner product by the symbol ​\lang v|w\rang. It satisfies:

  • skew-symmetry ​\lang v|w\rang=\lang w|v\rang^* .
  • positive semidefiniteness ​\lang v|v\rang \ge 0 ,​0 iff ​|v\rang=|0\rang .
  • linearity in the ket ​\lang v|(a|w\rang +b|z\rang)\equiv\lang v|aw+bz\rang =a\lang v|w\rang+b\lang v|z\rang

Definition 7 A vector space with an inner product is called an inner product space.

Definition 8 We say that two vectors are orthogonal or perpendicular if their inner product vanishes.

Definition 9 We will refer to ​\sqrt{\lang v|v\rang} \equiv|v| as the norm or length of the vector. A normalized vector has unit norm.

Definition 10 A set of basis vectors, all of unit norm, which are pairwise orthogonal will be called an orthonormal basis.

Given ​|v\rang=\sum\limits_i v_i|i\rang and ​|w\rang=\sum\limits_j w_j|j\rang, then ​\lang v|w\rang=\sum\limits_i\sum\limits_j v_i^*w_j\lang i|j\rang

Theorem 3 Given a linearly independent basis, we can form linear combinations of the basis vectors to obtain an orthonormal basis.

If ​|i\rang and ​|j\rang are orthonormal basis vectors, then ​\lang i|j\rang =\begin{cases}1&\text{for } i=j\\0&\text{for } i\neq j\end{cases}\equiv\delta_{ij}.

So, the double sum collapses:​\lang v|w\rang=\sum\limits_iv_i^*w_i.


​|v\rang is uniquely specified by its components in a given basis:​|v\rang\rightarrow\begin{bmatrix}v_1\\ v_2\\ \vdots \\ v_n\end{bmatrix}

likewise: ​|w\rang\rightarrow\begin{bmatrix}w_1\\ w_2\\ \vdots \\ w_n\end{bmatrix}

so ​\lang v|w\rang=\begin{bmatrix}v_1^*&v_2^*&\dots &v_n^*\end{bmatrix}\begin{bmatrix}w_1\\w_2\\ \vdots\\ w_n\end{bmatrix}
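The component formula ​\lang v|w\rang=\sum_i v_i^*w_i is easy to spot-check numerically; a minimal sketch in plain Python (the helper name `inner` and the sample vectors are my own):

```python
# Inner product <v|w> = sum_i v_i^* w_i in an orthonormal basis,
# using plain Python complex numbers (no external libraries).

def inner(v, w):
    """Return <v|w>; note the conjugation on the first argument."""
    return sum(vi.conjugate() * wi for vi, wi in zip(v, w))

v = [1 + 1j, 2]
w = [3, 1j]

vw = inner(v, w)
wv = inner(w, v)

# Skew-symmetry: <v|w> = <w|v>*
assert vw == wv.conjugate()
# Positive semidefiniteness: <v|v> is real and nonnegative
assert inner(v, v).imag == 0 and inner(v, v).real > 0
```

The conjugation on the first slot is exactly what makes ​\lang v|v\rang real and nonnegative.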

1.3 Dual Spaces and the Dirac Notation

adjoint (also called the transpose conjugate)

the rules for taking the adjoint:

  • like: ​|v\rang =\sum\limits_{i=1}v_i|i\rang\rightarrow\lang v|=\sum\limits_{i=1}\lang i|v_i^*
    ​|v\rang=\sum\limits_{i=1}|i\rang\lang i|v\rang\rightarrow\lang v|=\sum\limits_{i=1}\lang v|i\rang\lang i|
  • rule: reverse the order of all factors, exchanging bras and kets, and complex conjugating all coefficients.

Gram-Schmidt Theorem: converting a linearly independent basis into an orthonormal one.

|1\rang=\frac{|I\rang}{|I|}

clearly

\lang 1|1\rang=\frac{\lang I|I\rang}{|I|^2}=1

then

|2^{\prime}\rangle=|II\rangle-|1\rangle\langle1|II\rangle

clearly

\langle1|2^{\prime}\rangle=\langle1|II\rangle-\langle1|1\rangle\langle1|II\rangle=0

then

|3^{\prime}\rangle=|III\rangle-|1\rangle\langle1|III\rangle-|2\rangle\langle2|III\rangle

In general: subtract from each vector its projections along all the previously constructed vectors, then normalize (e.g. ​|2\rang=|2^{\prime}\rang/|2^{\prime}|).
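The steps above can be sketched in code; a minimal real-valued version in plain Python (the helper names `inner` and `gram_schmidt` are mine):

```python
# Gram-Schmidt: subtract from each vector its projections along the
# already-built orthonormal vectors, then normalize.

def inner(v, w):
    return sum(vi * wi for vi, wi in zip(v, w))

def gram_schmidt(vectors):
    basis = []
    for v in vectors:
        # |v'> = |v> - sum_k |k><k|v>
        for e in basis:
            c = inner(e, v)
            v = [vi - c * ei for vi, ei in zip(v, e)]
        norm = inner(v, v) ** 0.5
        basis.append([vi / norm for vi in v])
    return basis

e1, e2 = gram_schmidt([[3.0, 0.0, 0.0], [1.0, 2.0, 0.0]])
assert e1 == [1.0, 0.0, 0.0]
assert abs(inner(e1, e2)) < 1e-12          # orthogonal
assert abs(inner(e2, e2) - 1.0) < 1e-12    # normalized
```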

Theorem 4 The dimensionality of a space equals ​n_{\perp}, the maximum number of mutually orthogonal vectors in it.

Theorem 5 The Schwarz Inequality: ​|\langle V|W\rangle|\leq|V||W|

Theorem 6 The Triangle Inequality: ​|V+W|\leq|V|+|W|
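Both inequalities can be spot-checked numerically; a small sketch in plain Python with my own example vectors:

```python
# Numeric spot-check of the Schwarz and triangle inequalities
# for the standard inner product.
import math

def inner(v, w):
    return sum(vi.conjugate() * wi for vi, wi in zip(v, w))

def norm(v):
    return math.sqrt(inner(v, v).real)

V = [1 + 2j, 3]
W = [2, 1 - 1j]

assert abs(inner(V, W)) <= norm(V) * norm(W)   # Schwarz inequality
S = [v + w for v, w in zip(V, W)]              # |V> + |W>
assert norm(S) <= norm(V) + norm(W)            # triangle inequality
```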

1.4 Subspaces

Definition 11 Given a vector space ​\mathbb{V} a subset of its elements that form a vector space among themselves is called a subspace. We will denote a particular subspace ​i of dimensionality ​n_i by ​\mathbb{V}^{n_i}_i.

Definition 12 Given two subspaces ​\mathbb{V}^{n_i}_i and ​\mathbb{V}^{m_j}_j, we define their sum ​\mathbb{V}^{n_i}_i \oplus \mathbb{V}^{m_j}_j=\mathbb{V}^{m_k}_k as the set containing:

  1. all elements of ​\mathbb{V}^{n_i}_i,
  2. all elements of ​\mathbb{V}^{m_j}_j,
  3. all possible linear combinations of the above. Without the elements in 3, closure would be lost.

1.5 Linear Operators

An operator ​\Omega is an instruction for transforming any given vector ​|v\rang into another, ​|v^{\prime}\rang. We write ​\Omega{|V\rangle=|V^{\prime}\rangle}

We will only be concerned with linear operators.

Example: ​I\rightarrow Leave the vector alone (the identity operator)

(The author is adorable ><)

and: ​\bold{R}(\tfrac{1}{2}\pi \bold{i})\rightarrow Rotate vector by ​\frac{\pi}{2} about the unit vector ​\bold{i}


The order of the operators in a product is important. ​\Omega\Lambda-\Lambda\Omega\equiv[\Omega,\Lambda] is called the commutator of ​\Omega and ​\Lambda.

If the commutator is nonzero, the two operators do not commute.

useful identities:

[\Omega,\Lambda\theta]=\Lambda[\Omega,\theta]+[\Omega,\Lambda]\theta
[\Lambda\Omega,\theta]=\Lambda[\Omega,\theta]+[\Lambda,\theta]\Omega
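The first identity can be verified numerically; a minimal sketch with hand-rolled 2x2 matrix helpers (all names and the sample matrices are mine):

```python
# Checking [Omega, Lambda theta] = Lambda [Omega, theta] + [Omega, Lambda] theta
# on concrete 2x2 integer matrices.

def matmul(a, b):
    n = len(a)
    return [[sum(a[i][k] * b[k][j] for k in range(n)) for j in range(n)]
            for i in range(n)]

def matsub(a, b):
    return [[x - y for x, y in zip(ra, rb)] for ra, rb in zip(a, b)]

def matadd(a, b):
    return [[x + y for x, y in zip(ra, rb)] for ra, rb in zip(a, b)]

def comm(a, b):
    """Commutator [a, b] = ab - ba."""
    return matsub(matmul(a, b), matmul(b, a))

O = [[0, 1], [1, 0]]
L = [[1, 0], [0, -1]]
T = [[2, 3], [4, 5]]

lhs = comm(O, matmul(L, T))
rhs = matadd(matmul(L, comm(O, T)), matmul(comm(O, L), T))
assert lhs == rhs
```

Swapping O, L, T for any other square matrices of equal size leaves the identity intact, since it holds operator-algebraically.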

The inverse of ​\Omega, denoted by ​\Omega^{-1}: ​\Omega\Omega^{-1}=\Omega^{-1}\Omega=I

​(\Omega\Lambda)^{-1}=\Lambda^{-1}\Omega^{-1}

1.6 Matrix Elements of Linear Operators

This section covers matrix elements, which I think is one of the more important ones. Literally, ​\Omega_{ji} is the element in row ​j, column ​i of the operator's matrix.

​\Omega|i\rangle=|i^{\prime}\rangle\rightarrow\langle j|i'\rangle=\langle j|\Omega|i\rangle\equiv\Omega_{ji}

The ​\Omega_{ji} are ​n^2 numbers; they are the matrix elements of ​\Omega in this basis.


projection operators: ​\mathbb{P}_{i}=|i\rangle\langle i| for ​|i\rang

completeness relation: ​I=\sum_{i=1}^{n}|i\rangle\langle i|=\sum_{i=1}^{n}\mathbb{P}_{i}

What is this? The completeness relation! Worth learning!

The name sounds odd: it is called a "relation" yet given as just a single equation, but it is actually quite important, and you will see it in many places.

​\mathbb{P}_i projects out the component of any ket ​|v\rang along the direction ​|i\rang.

This projection operator is also very important. Note that projection operators are Hermitian, so (though I don't yet understand exactly why) each one is an orthogonal projection; that is, projecting an already-projected vector changes nothing.
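The projector properties above, ​\mathbb{P}_i^2=\mathbb{P}_i and the completeness relation ​\sum_i\mathbb{P}_i=I, can be verified in the standard basis; a minimal sketch in plain Python (helper names are mine):

```python
# Projection operators P_i = |i><i| in an orthonormal basis: each has a
# single 1 on the diagonal, is idempotent, and together they sum to I.

def outer(ket, bra):
    # |ket><bra|, with complex conjugation on the bra side
    return [[k * b.conjugate() for b in bra] for k in ket]

def matmul(a, b):
    n = len(a)
    return [[sum(a[i][k] * b[k][j] for k in range(n)) for j in range(n)]
            for i in range(n)]

n = 3
basis = [[1 if i == j else 0 for j in range(n)] for i in range(n)]
projectors = [outer(e, e) for e in basis]

for P in projectors:
    assert matmul(P, P) == P   # idempotent: projecting twice changes nothing

# completeness relation: sum_i P_i = I
total = [[sum(P[i][j] for P in projectors) for j in range(n)] for i in range(n)]
assert total == [[1 if i == j else 0 for j in range(n)] for i in range(n)]
```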

​\lang v|v^{\prime}\rang: scalar--inner product

​|v\rang\lang v^{\prime}|: operator--outer product

I find this interesting: the book treats "inner product" and "outer product" as the familiar reference names, while the formal characterizations are scalar and operator, which I think are the more meaningful ones.

span:

|i\rangle\langle i|\leftrightarrow\begin{bmatrix}0\\0\\\vdots\\1\\0\\0\\\vdots\\0\end{bmatrix}\begin{bmatrix}0&0&\ldots&1&0&\ldots&0\end{bmatrix}=\begin{bmatrix}0&&&\cdots&&&0\\&\ddots&&&&&\\&&0&&&&\\\vdots&&&1&&&\\&&&&0&&\\&&&&&\ddots&\\0&&&&&&0\end{bmatrix}

What is this? Span! Still one of the things I feel I haven't really learned (

The Adjoint of an operator: ​\lang\Omega V|=\lang V|\Omega^{\dagger}, as ​\lang av|=\lang v|a^*.

Hermitian, Anti-Hermitian, and Unitary Operators

Definition 13 An operator ​\Omega is Hermitian if ​\Omega^\dagger=\Omega.

Definition 14 An operator ​\Omega is anti-Hermitian if ​\Omega^\dagger=-\Omega.

  • operator ↔ complex number
  • adjoint ↔ complex conjugate
  • Hermitian ↔ purely real number
  • anti-Hermitian ↔ purely imaginary number

​\Omega=\frac{\Omega+\Omega^\dagger}2+\frac{\Omega-\Omega^\dagger}2, like ​\alpha=\frac{\alpha+\alpha^*}2+\frac{\alpha-\alpha^*}2

I personally think this part is explained brilliantly; for the first time I understood what Hermitian and anti-Hermitian operators really mean. Also, continuing the earlier remark that projection operators are Hermitian, we have:

\lang u|\mathbb{P}v\rang=\lang\mathbb{P}u|v\rang
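The split ​\Omega=\frac{\Omega+\Omega^\dagger}2+\frac{\Omega-\Omega^\dagger}2 can be checked numerically too; a minimal sketch with a hand-rolled `dagger` helper (the matrix values are my own example):

```python
# Splitting an operator into Hermitian and anti-Hermitian parts,
# mirroring alpha = (alpha + alpha*)/2 + (alpha - alpha*)/2.

def dagger(m):
    """Transpose conjugate of a square matrix."""
    return [[m[j][i].conjugate() for j in range(len(m))] for i in range(len(m))]

omega = [[1 + 2j, 3], [1j, 4 - 1j]]
d = dagger(omega)

herm = [[(omega[i][j] + d[i][j]) / 2 for j in range(2)] for i in range(2)]
anti = [[(omega[i][j] - d[i][j]) / 2 for j in range(2)] for i in range(2)]

assert dagger(herm) == herm                                  # Hermitian part
assert dagger(anti) == [[-x for x in row] for row in anti]   # anti-Hermitian part
# the two parts add back up to omega
assert all(herm[i][j] + anti[i][j] == omega[i][j] for i in range(2) for j in range(2))
```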

Definition 15 An operator ​U is unitary if ​UU^{\dagger}=I (​U and ​U^\dagger are inverses, like ​uu^*=1).

Theorem 7 Unitary operators preserve the inner product between the vectors they act on.

Unitary operators are the generalizations of rotation operators.

Theorem 8 If one treats the columns of an ​n\times n unitary matrix as components of n vectors, these vectors are orthonormal. In the same way, the rows may be interpreted as components of n orthonormal vectors.
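Theorem 8 can be checked on a concrete unitary; a minimal sketch using a 2x2 rotation matrix, which is a real unitary (names and the angle are mine):

```python
# The columns and rows of a unitary matrix form orthonormal vectors.
import math

t = 0.3
U = [[math.cos(t), -math.sin(t)],
     [math.sin(t),  math.cos(t)]]   # rotation matrix: a real unitary

def inner(v, w):
    # <v|w> with conjugation (real floats are their own conjugates)
    return sum(vi.conjugate() * wi for vi, wi in zip(v, w))

cols = [[U[i][j] for i in range(2)] for j in range(2)]
rows = U

for vecs in (cols, rows):
    assert abs(inner(vecs[0], vecs[0]) - 1) < 1e-12   # unit norm
    assert abs(inner(vecs[1], vecs[1]) - 1) < 1e-12
    assert abs(inner(vecs[0], vecs[1])) < 1e-12       # mutually orthogonal
```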

1.7 Active and Passive Transformations

unitary transformation: ​|V\rangle\to U|V\rangle

The famous unitary transformation, i.e. a transformation by a unitary operator. It looks unremarkable, but it too is very important.

  • active transformation:​\langle V^{\prime}|\Omega|V\rangle\to\langle UV^{\prime}|\Omega|UV\rangle=\langle V^{\prime}|U^{\dagger}\Omega U|V\rangle
  • passive transformation:​\Omega\rightarrow U^\dagger\Omega U

1.8 The Eigenvalue Problem

A section I personally found hard to follow, but it is extremely important. An eigenvector is only rescaled under the transformation, which is a crucial property. For intuition, see 3blue1brown's visual linear algebra series; the eigenvalue episode alone is enough to appreciate why eigenvalues matter, though the whole series is excellent.

The eigenvalue problem: find the vectors on which the action of ​\Omega is simply a rescaling: ​\Omega|V\rangle=\omega|V\rangle.


A supplement after watching some other courses: how are eigenvalues concretely used in quantum mechanics? Roughly like this:

\hat{\Omega}|\phi_i\rangle=\omega_i|\phi_i\rangle

where ​\hat{\Omega} is a Hermitian operator, and then

\hat{\Omega}=\sum_i\omega_i|\phi_i\rangle\langle\phi_i|

Note that the ​\omega_i are the eigenvalues. Anyone who has studied quantum mechanics will find this expression familiar.
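The spectral form ​\hat{\Omega}=\sum_i\omega_i|\phi_i\rangle\langle\phi_i| can be rebuilt numerically; a minimal sketch for a 2x2 real symmetric matrix whose eigenpairs I supply by hand (my own example values):

```python
# Rebuilding a Hermitian matrix from sum_i w_i |phi_i><phi_i|.
import math

s = 1 / math.sqrt(2)
# eigenpairs of [[2,1],[1,2]]: eigenvalue 3 with (1,1)/sqrt(2),
#                              eigenvalue 1 with (1,-1)/sqrt(2)
eigpairs = [(3.0, [s, s]), (1.0, [s, -s])]

n = 2
# real eigenvectors, so no conjugation is needed in |phi><phi|
O = [[sum(w * phi[i] * phi[j] for w, phi in eigpairs)
      for j in range(n)] for i in range(n)]

expected = [[2.0, 1.0], [1.0, 2.0]]
assert all(abs(O[i][j] - expected[i][j]) < 1e-12
           for i in range(n) for j in range(n))
```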


The Characteristic Equation and the Solution to the Eigenvalue Problem

​(\Omega-\omega I)|V\rangle=|0\rangle. If the inverse existed, then (since ​M^{-1}=\frac{\text{cofactor }M^T}{\det M}) we would get only the trivial solution ​|V\rangle=(\Omega-\omega I)^{-1}|0\rangle=|0\rangle; so a nontrivial solution requires ​\det(\Omega-\omega I)=0. Projecting onto a basis, ​\langle i|(\Omega-\omega I)|V\rangle=0 gives ​\sum_j(\Omega_{ij}-\omega\delta_{ij})v_j=0.

characteristic equation: ​\sum_{m=0}^{n}c_{m}\omega^{m}=0

characteristic polynomial: ​P^n(\omega)=\sum_{m=0}^nc_m\omega^m
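For a 2x2 matrix the characteristic equation is just a quadratic, ​\omega^2-\text{tr}(\Omega)\,\omega+\det(\Omega)=0, so the eigenvalue problem can be solved by hand; a minimal sketch with my own example matrix:

```python
# Eigenvalues of a 2x2 Hermitian matrix from det(Omega - w I) = 0.
import math

O = [[2.0, 1.0], [1.0, 2.0]]   # real symmetric, hence Hermitian

tr = O[0][0] + O[1][1]
det = O[0][0] * O[1][1] - O[0][1] * O[1][0]
disc = math.sqrt(tr * tr - 4 * det)   # real for a Hermitian matrix
w1, w2 = (tr + disc) / 2, (tr - disc) / 2
assert (w1, w2) == (3.0, 1.0)   # Theorem 9: the eigenvalues are real

# verify Omega|v> = w1|v> for the eigenvector (1,1)/sqrt(2)
v = [1 / math.sqrt(2)] * 2
Ov = [sum(O[i][j] * v[j] for j in range(2)) for i in range(2)]
assert all(abs(Ov[i] - w1 * v[i]) < 1e-12 for i in range(2))
```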

Theorem 9 The eigenvalues of a Hermitian operator are real.

Theorem 10 To every Hermitian operator ​\Omega there exists (at least) a basis consisting of its orthonormal eigenvectors. It is diagonal in this eigenbasis and has its eigenvalues as its diagonal entries.

  • eigenspace

Degeneracy

However, I completely failed to follow this passage; the whole section went over my head.

Theorem 11 The eigenvalues of a unitary operator are complex numbers of unit modulus.

Theorem 12 The eigenvectors of a unitary operator are mutually orthogonal. (We assume there is no degeneracy.)


Diagonalization of Hermitian Matrices

If ​\Omega is a Hermitian matrix, there exists a unitary matrix ​U (built out of the eigenvectors of ​\Omega) such that ​U^{\dagger}\Omega U is diagonal.

Simultaneous Diagonalization of Two Hermitian Operators

Theorem 13. If ​\Omega and ​\Lambda are two commuting Hermitian operators, there exists (at least) a basis of common eigenvectors that diagonalizes them both.