【CLP】Gaussian Projection Reduction & Randomized Binary Reduction
Gaussian Projection Reduction
This is a randomized procedure for producing an approximate SDP solution with a desired low rank $d$. Again, let $\mathbf{X}^*$ be an optimal solution of the SDP with rank $r>d$, and factorize $\mathbf{X}^*$ as

$$\mathbf{X}^*=(\mathbf{V}^*)^T\mathbf{V}^*,\quad \mathbf{V}^*\in \mathbb{E}^{r\times n}$$

We then generate i.i.d. Gaussian random variables $\xi_i^j$ with mean $0$ and variance $1/d$, for $i=1,\dots,r$ and $j=1,\dots,d$, and define

$$\hat{\mathbf{X}}=(\mathbf{V}^*)^T\bigg[\sum_{j=1}^d\xi^j(\xi^j)^T\bigg]\mathbf{V}^*\quad\text{// rank is } d$$

Note that the rank of $\hat{\mathbf{X}}$ is at most $d$, and since $\mathbb{E}\big[\sum_{j=1}^d\xi^j(\xi^j)^T\big]=\mathbf{I}$,

$$\mathbb{E}(\hat{\mathbf{X}})=(\mathbf{V}^*)^T\,\mathbf{I}\,\mathbf{V}^*=\mathbf{X}^*\quad\text{// approximate}$$

One can further show that $\hat{\mathbf{X}}$ is a good rank-$d$ approximate SDP solution in many cases.
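A minimal NumPy sketch of this procedure, assuming the factor $\mathbf{V}^*$ is obtained from an eigendecomposition (the function and variable names below are illustrative, not from the text):

```python
import numpy as np

def gaussian_projection_reduction(X_star: np.ndarray, d: int, seed=None) -> np.ndarray:
    """Return a random PSD matrix X_hat with rank <= d and E[X_hat] = X_star."""
    rng = np.random.default_rng(seed)
    # Factorize X* = (V*)^T V* via a symmetric eigendecomposition, keeping
    # only the eigenpairs with numerically positive eigenvalues.
    w, U = np.linalg.eigh(X_star)
    keep = w > 1e-10
    V = np.sqrt(w[keep])[:, None] * U[:, keep].T        # V* has shape (r, n)
    r = V.shape[0]
    # xi^j, j = 1..d: i.i.d. N(0, 1/d) entries, stacked as an (r, d) matrix.
    Xi = rng.normal(0.0, np.sqrt(1.0 / d), size=(r, d))
    # X_hat = (V*)^T [ sum_j xi^j (xi^j)^T ] V* = (Xi^T V*)^T (Xi^T V*).
    W = Xi.T @ V                                        # shape (d, n), so rank <= d
    return W.T @ W
```

In expectation `Xi @ Xi.T` is the identity ($d$ terms, each with covariance $\tfrac{1}{d}\mathbf{I}$), which is exactly why $\mathbb{E}(\hat{\mathbf{X}})=\mathbf{X}^*$.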
Randomized Binary Reduction
In the QP setting, we want to produce a vector $\mathbf{x}$ whose entries are each either $-1$ or $1$. A procedure to achieve this is as follows. Let $\mathbf{X}^*$ be any optimal solution of the SDP and factorize $\mathbf{X}^*$ as

$$\mathbf{X}^*=(\mathbf{V}^*)^T\mathbf{V}^*,\quad \mathbf{V}^*\in\mathbb{E}^{n\times n}$$

Then generate a random $n$-dimensional vector $\xi$ whose entries are i.i.d. Gaussian random variables with mean $0$ and variance $1$, and define

$$\hat{\mathbf{x}}=\operatorname{sgn}\big((\mathbf{V}^*)^T\xi\big)$$
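A minimal NumPy sketch of this rounding step (the function name `binary_round` and the eigendecomposition-based factorization are illustrative choices, not from the text):

```python
import numpy as np

def binary_round(X_star: np.ndarray, seed=None) -> np.ndarray:
    """Given an SDP solution X*, return x_hat = sgn((V*)^T xi) with xi ~ N(0, I)."""
    rng = np.random.default_rng(seed)
    # Factorize X* = (V*)^T V* via a symmetric eigendecomposition.
    w, U = np.linalg.eigh(X_star)
    V = np.sqrt(np.clip(w, 0.0, None))[:, None] * U.T   # V* has shape (n, n)
    xi = rng.standard_normal(X_star.shape[0])            # xi ~ N(0, I_n)
    x_hat = np.sign(V.T @ xi)
    x_hat[x_hat == 0] = 1.0   # sgn(0) occurs with probability zero; break ties to +1
    return x_hat
```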
It was proved by Sheppard that

$$\mathbb{E}[\hat{x}_i\hat{x}_j]=\frac{2}{\pi}\arcsin(\mathbf{X}^*_{ij}),\quad i,j=1,2,\dots,n$$
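Sheppard's identity is easy to check empirically. The Monte Carlo sketch below is illustrative only and uses a random unit-diagonal PSD matrix as a stand-in for an SDP optimum $\mathbf{X}^*$:

```python
import numpy as np

rng = np.random.default_rng(0)
n, trials = 5, 200_000

# Random unit-diagonal PSD matrix standing in for an SDP optimum X*.
B = rng.normal(size=(n, n))
S = B @ B.T
X_star = S / np.outer(np.sqrt(np.diag(S)), np.sqrt(np.diag(S)))

# Factorize X* = (V*)^T V* and round `trials` independent Gaussian vectors at once.
w, U = np.linalg.eigh(X_star)
V = np.sqrt(np.clip(w, 0.0, None))[:, None] * U.T   # V* has shape (n, n)
Xi = rng.standard_normal((n, trials))                # one xi per column
signs = np.sign(V.T @ Xi)                            # each column is one x_hat

empirical = (signs @ signs.T) / trials               # estimate of E[x_hat x_hat^T]
theoretical = (2 / np.pi) * np.arcsin(np.clip(X_star, -1.0, 1.0))
print(np.max(np.abs(empirical - theoretical)))       # small, roughly O(1/sqrt(trials))
```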
Consider the primal homogeneous binary quadratic maximization problem

$$z^*:=\max\ \mathbf{x}^T\mathbf{Q}\mathbf{x}\quad\text{s.t.}\quad x_j\in\{1,-1\},\ j=1,\dots,n$$

where we assume $\mathbf{Q}$ is positive semidefinite. Its SDP relaxation is

$$z^{SDP}:=\max\ \mathbf{Q}\cdot\mathbf{X}\quad\text{s.t.}\quad \mathbf{I}_j\cdot\mathbf{X}=1,\ j=1,\dots,n,\quad \mathbf{X}\in\mathcal{S}_+^n$$
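A minimal sketch of this relaxation using cvxpy (an assumed third-party dependency with an SDP-capable solver such as SCS installed; none of these names come from the text):

```python
import cvxpy as cp
import numpy as np

def solve_sdp_relaxation(Q: np.ndarray):
    """Solve  max Q . X  s.t.  X_jj = 1 for all j,  X PSD;  return (X*, z^SDP)."""
    n = Q.shape[0]
    X = cp.Variable((n, n), PSD=True)          # X in S^n_+
    constraints = [cp.diag(X) == 1]            # I_j . X = X_jj = 1, j = 1..n
    problem = cp.Problem(cp.Maximize(cp.trace(Q @ X)), constraints)
    problem.solve()
    return X.value, problem.value
```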
Let $\mathbf{X}^*$ be any optimal solution of this relaxation, from which we produce a random binary vector $\hat{\mathbf{x}}$ as above. We then evaluate the expected objective value:

$$\mathbb{E}(\hat{\mathbf{x}}^T\mathbf{Q}\hat{\mathbf{x}})=\mathbb{E}(\mathbf{Q}\cdot\hat{\mathbf{x}}\hat{\mathbf{x}}^T)=\mathbf{Q}\cdot\mathbb{E}(\hat{\mathbf{x}}\hat{\mathbf{x}}^T)=\mathbf{Q}\cdot\frac{2}{\pi}\arcsin[\mathbf{X}^*]=\frac{2}{\pi}\big(\mathbf{Q}\cdot\arcsin[\mathbf{X}^*]\big)$$

where $\arcsin[\mathbf{X}^*]\in\mathcal{S}^n$ denotes the entrywise arcsine of $\mathbf{X}^*$, and furthermore

$$\arcsin[\mathbf{X}^*]-\mathbf{X}^*\succeq\mathbf{0}$$

Since $\mathbf{Q}\succeq\mathbf{0}$, we have

$$\mathbf{Q}\cdot\arcsin[\mathbf{X}^*]\geq\mathbf{Q}\cdot\mathbf{X}^*=z^{SDP}\geq z^*$$

so the rounded vector $\hat{\mathbf{x}}$ achieves, in expectation, at least $\frac{2}{\pi}z^{SDP}\geq\frac{2}{\pi}z^*$.
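As a sanity check (illustrative only, not from the text), the key inequality $\mathbf{Q}\cdot\arcsin[\mathbf{X}^*]\geq\mathbf{Q}\cdot\mathbf{X}^*$ can be verified numerically with a random PSD $\mathbf{Q}$ and a random unit-diagonal PSD matrix standing in for $\mathbf{X}^*$:

```python
import numpy as np

rng = np.random.default_rng(1)
n = 8

A = rng.normal(size=(n, n))
Q = A @ A.T                                   # random PSD Q
B = rng.normal(size=(n, n))
S = B @ B.T
d = np.sqrt(np.diag(S))
X = S / np.outer(d, d)                        # unit-diagonal PSD stand-in for X*

lhs = np.sum(Q * np.arcsin(np.clip(X, -1.0, 1.0)))   # Q . arcsin[X]  (entrywise)
rhs = np.sum(Q * X)                                   # Q . X
print(lhs >= rhs - 1e-9)                              # expected: True
print(2 / np.pi * lhs, 2 / np.pi * rhs)               # (2/pi)-scaled comparison
```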