

MSBD5004 Mathematical Methods for Data Analysis Homework 2

Basic proofs. There aren't many problems, but I typed them up anyway.

Q1

Let $(V,|\cdot|)$ be a normed vector space.
(a) Prove that, for all $x, y \in V$,

$$\big|\,|x|-|y|\,\big| \leq |x-y| .$$

(b) Let $\left\{x_{k}\right\}_{k \in \mathbb{N}}$ be a convergent sequence in $V$ with limit $x \in V$. Prove that

$$\lim _{k \rightarrow \infty}\left|x_{k}\right|=|x| .$$

(Hint: Use part (a).)
(c) Let $\left\{x^{(k)}\right\}_{k \in \mathbb{N}}$ be a sequence in $V$ and $x, y \in V .$ Prove that, if

$$\lim _{k \rightarrow \infty} x^{(k)}=x \quad \text{and} \quad \lim _{k \rightarrow \infty} x^{(k)}=y,$$

then $x=y .$ (In other words, the limit of a convergent sequence in a normed vector space is unique.)

(a)

By the triangle inequality of the norm, $|\boldsymbol{a}+\boldsymbol{b}| \leq|\boldsymbol{a}|+|\boldsymbol{b}|$ for all $\boldsymbol{a}, \boldsymbol{b} \in V$.
Taking $\boldsymbol{a}=\boldsymbol{x}-\boldsymbol{y}$ and $\boldsymbol{b}=\boldsymbol{y}$ gives $|\boldsymbol{x}| \leq|\boldsymbol{x}-\boldsymbol{y}|+|\boldsymbol{y}|$, i.e. $|\boldsymbol{x}|-|\boldsymbol{y}| \leq|\boldsymbol{x}-\boldsymbol{y}|$.
Taking $\boldsymbol{a}=\boldsymbol{y}-\boldsymbol{x}$ and $\boldsymbol{b}=\boldsymbol{x}$ gives $|\boldsymbol{y}| \leq|\boldsymbol{y}-\boldsymbol{x}|+|\boldsymbol{x}|$, and since $|\boldsymbol{y}-\boldsymbol{x}|=|\boldsymbol{x}-\boldsymbol{y}|$, this means $|\boldsymbol{y}|-|\boldsymbol{x}| \leq|\boldsymbol{x}-\boldsymbol{y}|$.
Combining the two inequalities, we get $\big|\,|\boldsymbol{x}|-|\boldsymbol{y}|\,\big| \leq|\boldsymbol{x}-\boldsymbol{y}|$. ◾
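A quick numerical sanity check of the inequality in (a), using the Euclidean norm in NumPy (just an illustration, not part of the proof):

```python
import numpy as np

rng = np.random.default_rng(0)
for _ in range(1000):
    x = rng.normal(size=5)
    y = rng.normal(size=5)
    # reverse triangle inequality: | ||x|| - ||y|| | <= ||x - y||
    lhs = abs(np.linalg.norm(x) - np.linalg.norm(y))
    rhs = np.linalg.norm(x - y)
    assert lhs <= rhs + 1e-12  # tolerance for floating-point rounding
print("Reverse triangle inequality held for all random samples.")
```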

(b)

By the definition of convergence, $\lim _{k \rightarrow \infty}\left|\boldsymbol{x}_{k}-\boldsymbol{x}\right|=0$.

Using the conclusion of (a), $\big|\,|\boldsymbol{x}_{k}|-|\boldsymbol{x}|\,\big| \leq\left|\boldsymbol{x}_{k}-\boldsymbol{x}\right| \rightarrow 0$ as $k \rightarrow \infty$.

Then, we get $\lim _{k \rightarrow \infty}\left|\boldsymbol{x}_{k}\right|=|\boldsymbol{x}|$. ◾
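For intuition on (b), a small numerical illustration with a convergent sequence $x_{k}=x+v / k$ (the particular vectors here are made up for the example):

```python
import numpy as np

x = np.array([1.0, -2.0, 3.0])
v = np.array([0.5, 0.5, -1.0])
for k in [1, 10, 100, 1000]:
    xk = x + v / k  # a sequence with x_k -> x
    gap = abs(np.linalg.norm(xk) - np.linalg.norm(x))
    bound = np.linalg.norm(xk - x)
    print(f"k={k:4d}  | ||x_k|| - ||x|| | = {gap:.2e}  <=  ||x_k - x|| = {bound:.2e}")
```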

(c)

Proof: Assume that $x^{(k)} \rightarrow x$ and $x^{(k)} \rightarrow y$ as $k \rightarrow \infty$.

  • Let $\epsilon>0$ be given. Choose $N_{1} \in \mathbb{N}$ such that $\left|x^{(k)}-x\right|<\frac{\epsilon}{2}$ for all $k \geq N_{1}$, and choose $N_{2} \in \mathbb{N}$ such that $\left|x^{(k)}-y\right|<\frac{\epsilon}{2}$ for all $k \geq N_{2}$.
  • Now set $N=\max \left(N_{1}, N_{2}\right)$. Because $N$ is the maximum of $N_{1}$ and $N_{2}$, for every $k \geq N$ both $\left|x^{(k)}-x\right|<\frac{\epsilon}{2}$ and $\left|x^{(k)}-y\right|<\frac{\epsilon}{2}$ hold.
  • By the triangle inequality, for any such $k$,
    $|x-y| \leq\left|x-x^{(k)}\right|+\left|x^{(k)}-y\right|<\frac{\epsilon}{2}+\frac{\epsilon}{2}=\epsilon .$
    Therefore $|x-y|<\epsilon$. Since $|x-y|$ is a fixed nonnegative number and $|x-y|<\epsilon$ for every $\epsilon>0$, we must have $|x-y|=0$, i.e. $x=y$. ◾

Q2

  1. Let $V$ be a vector space, and $\langle\cdot, \cdot\rangle$ be an inner product on $V$. Use the definition of inner product to prove the following.
    (a) Prove that $\langle \boldsymbol{0}, x\rangle=\langle x, \boldsymbol{0}\rangle= 0$ for any $x \in V .$ Here $\boldsymbol{0}$ is the zero vector in $V$.
    (b) Prove that the second condition (linearity) in the definition,
    ① $\left\langle\alpha x_{1}+\beta x_{2}, y\right\rangle=\alpha\left\langle x_{1}, y\right\rangle+\beta\left\langle x_{2}, y\right\rangle \quad \forall x_{1}, x_{2}, y \in V, \ \alpha, \beta \in \mathbb{R},$
    is equivalent to
    ② $\left\langle x_{1}+x_{2}, y\right\rangle=\left\langle x_{1}, y\right\rangle+\left\langle x_{2}, y\right\rangle$ and $\langle\alpha x, y\rangle=\alpha\langle x, y\rangle \quad \forall x_{1}, x_{2}, x, y \in V, \ \alpha \in \mathbb{R}.$

(a)

The definition of an inner product has three conditions:

(1) Positive definiteness: $\langle x, x\rangle \geq 0$ for all $x \in V$, and $\langle x, x\rangle=0$ if and only if $x=\boldsymbol{0}$;
(2) Linearity: $\left\langle\alpha x_{1}+\beta x_{2}, y\right\rangle=\alpha\left\langle x_{1}, y\right\rangle+\beta\left\langle x_{2}, y\right\rangle$ for all $x_{1}, x_{2}, y \in V$ and $\alpha, \beta \in \mathbb{R}$;
(3) Symmetry: $\langle x, y\rangle=\langle y, x\rangle$ for all $x, y \in V$.

Using (3) with $y=\boldsymbol{0}$, we get $\langle \boldsymbol{0}, x\rangle=\langle x, \boldsymbol{0}\rangle$.
Using (2) with $y=x$, $x_1=x_2=\boldsymbol{0}$, and $\alpha=\beta=1$, we get $\langle \boldsymbol{0}, x\rangle=\langle \boldsymbol{0}+\boldsymbol{0}, x\rangle=2\langle \boldsymbol{0}, x\rangle$, so $\langle \boldsymbol{0}, x\rangle=0$.
From the above, $\langle \boldsymbol{0}, x\rangle=\langle x, \boldsymbol{0}\rangle= 0$. ◾

(b)

Proof:

  • $①\rightarrow②$ :
    Setting $\alpha=\beta=1$ in ① gives the first equation of ②: $\left\langle x_{1}+x_{2}, y\right\rangle=\left\langle x_{1}, y\right\rangle+\left\langle x_{2}, y\right\rangle$ for all $x_{1}, x_{2}, y \in V$.
    Setting $x_{1}=x$ and $\beta=0$ in ① gives the second equation of ②: $\langle\alpha x, y\rangle=\alpha\langle x, y\rangle$ for all $x, y \in V$ and $\alpha \in \mathbb{R}$.

  • $②\rightarrow①$ :
    Applying the first equation of ② to $\alpha x_{1}$ and $\beta x_{2}$, and then the second equation of ② to each term, we get
    $\left\langle\alpha x_{1}+\beta x_{2}, y\right\rangle=\left\langle\alpha x_{1}, y\right\rangle+\left\langle\beta x_{2}, y\right\rangle=\alpha\left\langle x_{1}, y\right\rangle+\beta\left\langle x_{2}, y\right\rangle,$
    which is exactly ①. ◾

Q3

$\mathbb{R}^{m \times n}$ is a vector space over $\mathbb{R}$. Show that $\langle\boldsymbol{A}, \boldsymbol{B}\rangle=\operatorname{trace}\left(\boldsymbol{A}^{T} \boldsymbol{B}\right) \text { for } \boldsymbol{A}, \boldsymbol{B} \in \mathbb{R}^{m \times n}$ is an inner product on $\mathbb{R}^{m \times n} .$ Here trace(.) is the trace of a matrix, i.e., the sum of all diagonal entries.

Proof:
For every $A=(A_{ij}) \in \mathbb{R}^{m\times n}$ we have

$$\langle A, A\rangle=\operatorname{trace}\left(A^{T} A\right)=\sum_{j=1}^{n}\left(A^{T} A\right)_{jj}=\sum_{j=1}^{n} \sum_{i=1}^{m} A_{ij}^{2} \geq 0,$$

and $\langle A, A\rangle=0$ if and only if every $A_{ij}=0$, i.e. $A=\boldsymbol{0}$. This gives positive definiteness.
Since $\operatorname{trace}(M)=\operatorname{trace}\left(M^{T}\right)$ for every square matrix $M$, we have

$$\langle A, B\rangle=\operatorname{trace}\left(A^{T} B\right)=\operatorname{trace}\left(\left(A^{T} B\right)^{T}\right)=\operatorname{trace}\left(B^{T} A\right)=\langle B, A\rangle,$$

which gives symmetry.
Since $\operatorname{trace}(X+Y)=\operatorname{trace}(X)+\operatorname{trace}(Y)$ and $\operatorname{trace}(\lambda X)=\lambda \operatorname{trace}(X)$ for every $X, Y \in \mathbb{R}^{n \times n}$ and $\lambda \in \mathbb{R}$, for every $A, B, C \in \mathbb{R}^{m\times n}$ and $\lambda \in \mathbb{R}$ we have

$$\langle\lambda A+B, C\rangle=\operatorname{trace}\left((\lambda A+B)^{T} C\right)=\lambda \operatorname{trace}\left(A^{T} C\right)+\operatorname{trace}\left(B^{T} C\right)=\lambda\langle A, C\rangle+\langle B, C\rangle,$$

which gives linearity. Therefore $\langle A, B\rangle=\operatorname{trace}\left(A^{T} B\right)$ is an inner product on $\mathbb{R}^{m \times n}$. ◾
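As a sanity check (not part of the required proof), the trace inner product can be verified numerically: it equals the entrywise sum of products and satisfies symmetry and linearity, e.g. with NumPy:

```python
import numpy as np

rng = np.random.default_rng(1)
A, B, C = rng.normal(size=(3, 4, 5))   # three random 4x5 matrices
lam = 2.5

def ip(X, Y):
    # <X, Y> = trace(X^T Y), the Frobenius inner product
    return np.trace(X.T @ Y)

assert np.isclose(ip(A, B), np.sum(A * B))                         # entrywise sum of products
assert np.isclose(ip(A, B), ip(B, A))                              # symmetry
assert np.isclose(ip(lam * A + B, C), lam * ip(A, C) + ip(B, C))   # linearity
assert ip(A, A) >= 0                                               # positivity on samples
print("All trace inner product checks passed.")
```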

Q4

Consider the polynomial kernel $K(\boldsymbol{x}, \boldsymbol{y})=\left(\boldsymbol{x}^{T} \boldsymbol{y}+1\right)^{2}$ for $\boldsymbol{x}, \boldsymbol{y} \in \mathbb{R}^{2} .$ Find an explicit feature map $\phi: \mathbb{R}^{2} \rightarrow \mathbb{R}^{6}$ satisfying $\langle\phi(\boldsymbol{x}), \phi(\boldsymbol{y})\rangle= K(\boldsymbol{x}, \boldsymbol{y}),$ where the inner product is the standard inner product in $\mathbb{R}^{6}$.

Solution:
For $\boldsymbol{x}=\left(x_{1}, x_{2}\right)^{T}$ and $\boldsymbol{y}=\left(y_{1}, y_{2}\right)^{T}$, expanding the kernel gives

$$K(\boldsymbol{x}, \boldsymbol{y})=\left(x_{1} y_{1}+x_{2} y_{2}+1\right)^{2}=x_{1}^{2} y_{1}^{2}+x_{2}^{2} y_{2}^{2}+2 x_{1} x_{2} y_{1} y_{2}+2 x_{1} y_{1}+2 x_{2} y_{2}+1 .$$

So we can take

$$\phi(\boldsymbol{x})=\left(x_{1}^{2},\; x_{2}^{2},\; \sqrt{2}\, x_{1} x_{2},\; \sqrt{2}\, x_{1},\; \sqrt{2}\, x_{2},\; 1\right)^{T},$$

and then $\langle\phi(\boldsymbol{x}), \phi(\boldsymbol{y})\rangle=K(\boldsymbol{x}, \boldsymbol{y})$.
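A quick numerical check of this feature map (illustration only):

```python
import numpy as np

def phi(v):
    # explicit feature map for the degree-2 polynomial kernel (x^T y + 1)^2
    x1, x2 = v
    return np.array([x1**2, x2**2, np.sqrt(2)*x1*x2, np.sqrt(2)*x1, np.sqrt(2)*x2, 1.0])

rng = np.random.default_rng(2)
x, y = rng.normal(size=(2, 2))
kernel = (x @ y + 1) ** 2
assert np.isclose(phi(x) @ phi(y), kernel)
print("phi(x)·phi(y) matches (x^T y + 1)^2:", kernel)
```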

Q5

(You don’t need to do anything for this question.) A good Matlab implementation and demonstration of kernel K-means can be found at http://www.dcs.gla.ac.uk/~srogers/firstcourseml/matlab/chapter6/kernelkmeans.html Read the code and, if possible, run it in Matlab to see how kernel K-means works on nonlinear data.
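The linked page has the Matlab version. For readers without Matlab, below is a minimal Python sketch of the same idea (my own simplified version, not the code from the link): kernel K-means assigns each point to the cluster whose feature-space centroid is closest, using only kernel evaluations via
$\left|\phi\left(x_{i}\right)-\mu_{c}\right|^{2}=K_{ii}-\frac{2}{|C|} \sum_{j \in C} K_{ij}+\frac{1}{|C|^{2}} \sum_{j, l \in C} K_{jl}$.

```python
import numpy as np

def rbf_kernel(X, gamma=1.0):
    # Gram matrix K[i, j] = exp(-gamma * ||x_i - x_j||^2)
    sq = np.sum(X**2, axis=1)
    d2 = sq[:, None] + sq[None, :] - 2 * X @ X.T
    return np.exp(-gamma * d2)

def kernel_kmeans(K, n_clusters, n_iters=50, seed=0):
    n = K.shape[0]
    rng = np.random.default_rng(seed)
    labels = rng.integers(n_clusters, size=n)   # random initial assignment
    for _ in range(n_iters):
        dist = np.zeros((n, n_clusters))
        for c in range(n_clusters):
            idx = np.where(labels == c)[0]
            if len(idx) == 0:
                dist[:, c] = np.inf
                continue
            # squared distance to the feature-space centroid of cluster c
            dist[:, c] = (np.diag(K)
                          - 2 * K[:, idx].mean(axis=1)
                          + K[np.ix_(idx, idx)].mean())
        new_labels = dist.argmin(axis=1)
        if np.array_equal(new_labels, labels):
            break
        labels = new_labels
    return labels

# Two concentric rings: not linearly separable, but separable with an RBF kernel.
rng = np.random.default_rng(3)
theta = rng.uniform(0, 2 * np.pi, 200)
r = np.concatenate([np.full(100, 1.0), np.full(100, 3.0)])
X = np.c_[r * np.cos(theta), r * np.sin(theta)] + 0.05 * rng.normal(size=(200, 2))
labels = kernel_kmeans(rbf_kernel(X, gamma=2.0), n_clusters=2)
print("Cluster sizes:", np.bincount(labels))
```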
