I was surprised to get this error from sklearn's PCA:
ValueError: Complex data not supported
after trying to fit complex-valued data. Is this simply something that isn't implemented? Should I go ahead and do it "manually" with an SVD, or is there some gotcha with complex values?
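For reference, roughly what I am doing (the shapes and values here are only illustrative):

import numpy as np
from sklearn.decomposition import PCA

X = np.random.randn(10, 4) + 1j * np.random.randn(10, 4)  # complex-valued data
PCA(n_components=2).fit(X)  # raises ValueError: Complex data not supported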
Apparently this functionality was left out intentionally, see here. I'm afraid you have to use SVD, but that should be fairly straightforward:
import numpy as np

def pca(X):
    # Center the data and take the SVD of the scaled, centered matrix;
    # the squared singular values are the PCA variances and the rows of
    # pcs are the principal components.
    mean = X.mean(axis=0)
    center = X - mean
    _, stds, pcs = np.linalg.svd(center / np.sqrt(X.shape[0]))
    return stds**2, pcs
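A quick sanity check of the round trip on some toy complex data (the matrix below is only an illustration): pcs holds the conjugate-transposed right singular vectors, so projecting with pcs.conj().T and reconstructing with pcs should recover the centered data when all components are kept.

import numpy as np

rng = np.random.default_rng(0)
X = rng.standard_normal((100, 5)) + 1j * rng.standard_normal((100, 5))

variances, pcs = pca(X)
centered = X - X.mean(axis=0)
scores = centered @ pcs.conj().T          # project onto the principal components
reconstructed = scores @ pcs + X.mean(axis=0)
print(np.allclose(reconstructed, X))      # True when all components are kept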
My implementation mimics the original PCA interface exactly, so any existing code that deals with PCA should run on it seamlessly.
import numpy as np

class ComplexPCA:
    def __init__(self, n_components):
        self.n_components = n_components
        self.u = self.s = self.components_ = None
        self.mean_ = None

    @property
    def explained_variance_ratio_(self):
        # Fraction of the total variance carried by each component.
        return self.s**2 / np.sum(self.s**2)

    def fit(self, matrix, use_gpu=False):
        self.mean_ = matrix.mean(axis=0)
        data = matrix - self.mean_  # center the data, as sklearn's PCA does
        if use_gpu:
            import tensorflow as tf  # torch doesn't handle complex values.
            tensor = tf.convert_to_tensor(data)
            # tf.linalg.svd returns (s, u, v), with v *not* conjugate-transposed.
            s, u, v = tf.linalg.svd(tensor, full_matrices=False)  # full=False ==> num_pc = min(N, M)
            self.s = s.numpy()
            vh = np.conj(v.numpy()).T
            # It would be faster if the SVD was truncated to only n_components instead of min(M, N).
        else:
            _, self.s, vh = np.linalg.svd(data, full_matrices=False)  # full=False ==> num_pc = min(N, M)
            # It would be faster if the SVD was truncated to only n_components instead of min(M, N).
        self.components_ = vh  # already conjugated.
        # Leave the components as rows of the matrix so it is compatible with sklearn's PCA.

    def transform(self, matrix):
        data = matrix - self.mean_
        result = data @ self.components_.T
        return result

    def inverse_transform(self, matrix):
        result = matrix @ np.conj(self.components_)
        return self.mean_ + result
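A quick round-trip check, assuming complex input and all components kept (the toy data below is only an illustration):

import numpy as np

rng = np.random.default_rng(0)
X = rng.standard_normal((50, 4)) + 1j * rng.standard_normal((50, 4))

cpca = ComplexPCA(n_components=4)
cpca.fit(X)
scores = cpca.transform(X)
X_hat = cpca.inverse_transform(scores)

print(np.allclose(X_hat, X))           # True: exact reconstruction with all components
print(cpca.explained_variance_ratio_)  # per-component fraction of total variance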