
Math Assignment Help | ENGG5501 Convex Optimization

MY-ASSIGNMENTEXPERT™ can provide tutoring, homework, and exam help for the ENGG5501 Convex Optimization course at cuhk.edu.hk!


ENGG5501 Course Overview

Announcements
NEW: Here is the solution to the practice final examination.
NEW: Here are the solutions to Homework 4 and Homework 5.
NEW: The final examination will be held on December 20, 2022, from 2:00pm to 4:00pm, in ERB LT. You can bring the course handouts, homeworks, homework solutions, and the notes you took during lectures to the exam. No other material will be allowed. If you have questions about the rules of the exam, please clarify with the teaching staff as soon as possible.


Here is the midterm solution.
Welcome to ENGG 5501! Students who are interested in taking the course but have not yet registered (or are not able to register) should contact the course instructor.
To better facilitate discussions and Q&As, we have set up a forum on Piazza. Please follow this link to sign up.

ENGG5501 Convex Optimization HELP (EXAM HELP, ONLINE TUTOR)

Problem 1.

(10pts). Let $P \in \mathbb{R}^{n \times n}$ be an orthogonal projection matrix (i.e., $P^2=P$ and $P=P^T$; see Handout $\mathrm{B}$, Section 1.6). Show that $\|P\| \leq 1$, where $\|P\|$ is the largest singular value of $P$.

Since $P$ is a projection matrix, its eigenvalues can only be either $0$ or $1$: if $Pv=\lambda v$ with $v \neq 0$, then $\lambda v=Pv=P^2 v=\lambda^2 v$, so $\lambda^2=\lambda$. Hence, for any eigenvector $v$ of $P$ with eigenvalue $\lambda$, we have:
$$
\|Pv\|^2=\|\lambda v\|^2=|\lambda|^2\|v\|^2 \leq \|v\|^2,
$$
where we used the fact that $|\lambda| \leq 1$.
Therefore, we have $\|Pv\| \leq \|v\|$ for any eigenvector $v$ of $P$. To show that the same bound holds for $\|P\|$ itself, we start from the definition of the spectral norm and obtain:
$$
\begin{aligned}
\|P\|^2 & \leq \max_{v \neq 0} \frac{\|Pv\|^2}{\|v\|^2}=\max_{v \neq 0} \frac{\langle Pv, Pv\rangle}{\langle v, v\rangle}=\max_{v \neq 0} \frac{\left\langle P^2 v, v\right\rangle}{\langle v, v\rangle} \\
& =\max_{v \neq 0} \frac{\langle Pv, v\rangle}{\langle v, v\rangle}=\max_{v \neq 0} \frac{\langle v, Pv\rangle}{\langle v, v\rangle}=\max_{\|v\|=1}\langle v, Pv\rangle .
\end{aligned}
$$

Since $P$ is an orthogonal projection, we have $P^T=P$, which implies that $P$ is a symmetric matrix. Therefore, $\langle v, Pv\rangle=\langle Pv, v\rangle=(Pv)^T v=v^T P^T v=v^T P v$. Hence, we have:
$$
\begin{aligned}
\|P\|^2 & \leq \max_{\|v\|=1}\langle v, Pv\rangle=\max_{\|v\|=1} v^T P v \\
& \leq \max_{\|v\|=1}\|P\|\|v\|^2=\|P\| .
\end{aligned}
$$
The inequality in the second line follows from the Cauchy-Schwarz inequality: $v^T P v=\langle v, Pv\rangle \leq \|v\|\|Pv\| \leq \|P\|\|v\|^2$. We thus obtain $\|P\|^2 \leq \|P\|$; if $\|P\|=0$ the claim is trivial, and otherwise dividing both sides by $\|P\|$ gives $\|P\| \leq 1$, as desired.
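
As an informal numerical sanity check (not part of the required proof), the short NumPy sketch below builds an orthogonal projection onto the column space of a random matrix and confirms that it is idempotent, symmetric, and has spectral norm at most $1$. The construction $P=A(A^TA)^{-1}A^T$ and the matrix sizes are illustrative choices, not taken from the assignment.

```python
import numpy as np

rng = np.random.default_rng(0)

# Orthogonal projection onto the column space of a random 5x2 matrix A:
# P = A (A^T A)^{-1} A^T satisfies P^2 = P and P^T = P.
A = rng.standard_normal((5, 2))
P = A @ np.linalg.inv(A.T @ A) @ A.T

print(np.allclose(P @ P, P))   # True: idempotent (up to floating-point error)
print(np.allclose(P, P.T))     # True: symmetric
print(np.linalg.norm(P, 2))    # largest singular value, here 1.0 (<= 1, as proved)
```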

Problem 2.

(b) (15pts). Give an explicit description of $N(x)$. Simplify your answer as much as possible. Show all your work. Hint: For each $i \in\{1, \ldots, n\}$, consider the cases $x_i>0$ and $x_i=0$ separately.

A projection matrix $P$ is a matrix that satisfies $P^2=P$ and projects vectors onto its range. One property of projection matrices is that their eigenvalues are either $0$ or $1$. Another important property, shown in Problem 1, is that the norm of an orthogonal projection matrix (one that also satisfies $P^T=P$) is at most $1$, i.e., $\|P\| \leq 1$; for a general (oblique) projection this bound need not hold.

To find an example of a projection matrix $P \in \mathbb{R}^{2 \times 2}$ with $\|P\|>1$, we need a matrix that satisfies $P^2=P$ and has a norm greater than $1$. By Problem 1, such a matrix cannot be symmetric, so we look for an oblique (non-orthogonal) projection; its eigenvalues are still $0$ or $1$, but its singular values can exceed $1$.

For example, consider the matrix $P=\left[\begin{array}{ll}1 & 1 \\ 0 & 0\end{array}\right]$. We can check that $P^2=P$ since
$$
P^2=\left[\begin{array}{ll}
1 & 1 \\
0 & 0
\end{array}\right]\left[\begin{array}{ll}
1 & 1 \\
0 & 0
\end{array}\right]=\left[\begin{array}{ll}
1 & 1 \\
0 & 0
\end{array}\right]=P .
$$
Furthermore, the eigenvalues of $P$ are $1$ and $0$, so $P$ is indeed a projection matrix, although not an orthogonal one, since $P^T \neq P$. To compute the norm of $P$, we can use the fact that the norm of a matrix $A$ is equal to the square root of the largest eigenvalue of $A^T A$. In this case, we have
$$
P^T P=\left[\begin{array}{ll}
1 & 0 \\
1 & 0
\end{array}\right]\left[\begin{array}{ll}
1 & 1 \\
0 & 0
\end{array}\right]=\left[\begin{array}{ll}
1 & 1 \\
1 & 1
\end{array}\right] .
$$
The largest eigenvalue of $P^T P$ is $2$, so the norm of $P$ is $\|P\|=\sqrt{\lambda_{\max }\left(P^T P\right)}=\sqrt{2}>1$. Therefore, we have found an example of a projection matrix $P$ with $\|P\|>1$, namely $P=\left[\begin{array}{ll}1 & 1 \\ 0 & 0\end{array}\right]$. This does not contradict Problem 1, because $P$ is not symmetric and hence not an orthogonal projection.
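
As a quick numerical check of the example above (again using NumPy purely as an illustration, not as part of the assignment), the sketch below verifies that this $P$ is idempotent but not symmetric, and that its largest singular value is $\sqrt{2}>1$:

```python
import numpy as np

# The oblique projection from the example above.
P = np.array([[1.0, 1.0],
              [0.0, 0.0]])

print(np.allclose(P @ P, P))   # True: P is idempotent
print(np.allclose(P, P.T))     # False: P is not symmetric
print(np.linalg.norm(P, 2))    # ~1.4142, i.e. sqrt(2) > 1
```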

