# Inner products


An inner product on a real vector space $V$ is a function $\langle\cdot, \cdot\rangle: V \times V \rightarrow \mathbb{R}$ satisfying
(i) $\langle\mathbf{x}, \mathbf{x}\rangle \geq 0$, with equality if and only if $\mathbf{x}=\mathbf{0}$;
(ii) $\langle\alpha \mathbf{x}+\beta \mathbf{y}, \mathbf{z}\rangle=\alpha\langle\mathbf{x}, \mathbf{z}\rangle+\beta\langle\mathbf{y}, \mathbf{z}\rangle$;
(iii) $\langle\mathbf{x}, \mathbf{y}\rangle=\langle\mathbf{y}, \mathbf{x}\rangle$;

for all $\mathbf{x}, \mathbf{y}, \mathbf{z} \in V$ and all $\alpha, \beta \in \mathbb{R}$. A vector space endowed with an inner product is called an inner product space.
Note that any inner product on $V$ induces a norm on $V$:
$$\|\mathbf{x}\|=\sqrt{\langle\mathbf{x}, \mathbf{x}\rangle}$$
One can verify that the axioms for norms are satisfied under this definition and follow (almost) directly from the axioms for inner products. Therefore any inner product space is also a normed space (and hence also a metric space). ${ }^{4}$
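For instance, absolute homogeneity of the induced norm follows directly from linearity (axiom (ii)) together with symmetry (axiom (iii)):

$$\|\alpha \mathbf{x}\|=\sqrt{\langle\alpha \mathbf{x}, \alpha \mathbf{x}\rangle}=\sqrt{\alpha^{2}\langle\mathbf{x}, \mathbf{x}\rangle}=|\alpha|\,\|\mathbf{x}\|$$

The triangle inequality is the step that is not immediate; it requires the Cauchy-Schwarz inequality.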

Two vectors $\mathbf{x}$ and $\mathbf{y}$ are said to be orthogonal if $\langle\mathbf{x}, \mathbf{y}\rangle=0$; we write $\mathbf{x} \perp \mathbf{y}$ for shorthand. Orthogonality generalizes the notion of perpendicularity from Euclidean space. If two orthogonal vectors $\mathbf{x}$ and $\mathbf{y}$ additionally have unit length (i.e. $\|\mathbf{x}\|=\|\mathbf{y}\|=1$), then they are described as orthonormal.
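As an illustration in a non-Euclidean setting (a standard example, not part of the definition above), take the space of continuous functions on $[-1,1]$ with the inner product $\langle f, g\rangle=\int_{-1}^{1} f(x) g(x)\, dx$. Then $f(x)=1$ and $g(x)=x$ are orthogonal:

$$\langle f, g\rangle=\int_{-1}^{1} 1 \cdot x \, dx=\left[\frac{x^{2}}{2}\right]_{-1}^{1}=0$$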
The standard inner product on $\mathbb{R}^{n}$ is given by
$$\langle\mathbf{x}, \mathbf{y}\rangle=\sum_{i=1}^{n} x_{i} y_{i}=\mathbf{x}^{\top} \mathbf{y}$$
The matrix notation on the right-hand side (see the Transposition section if it's unfamiliar) arises because this inner product is a special case of matrix multiplication, where we regard the resulting $1 \times 1$ matrix as a scalar. The inner product on $\mathbb{R}^{n}$ is also often written $\mathbf{x} \cdot \mathbf{y}$ (hence the alternate name dot product). The reader can verify that the two-norm $\|\cdot\|_{2}$ on $\mathbb{R}^{n}$ is induced by this inner product.
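The standard inner product, its induced norm, and an orthogonality check can be sketched in a few lines of plain Python (a minimal illustration using hypothetical helper names, not a library API):

```python
import math

def inner(x, y):
    # Standard inner product on R^n: <x, y> = sum_i x_i * y_i
    assert len(x) == len(y), "vectors must have the same dimension"
    return sum(xi * yi for xi, yi in zip(x, y))

def norm(x):
    # Norm induced by the inner product: ||x|| = sqrt(<x, x>),
    # which coincides with the two-norm on R^n.
    return math.sqrt(inner(x, x))

x = [1.0, 2.0, 2.0]
y = [2.0, 2.0, -3.0]

print(inner(x, y))  # 1*2 + 2*2 + 2*(-3) = 0.0, so x is orthogonal to y
print(norm(x))      # sqrt(1 + 4 + 4) = 3.0
```

In practice one would use `numpy.dot` (or `x @ y`) and `numpy.linalg.norm`, which compute the same quantities.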
