Polynomial autoencoder
Summary
The post presents a closed-form polynomial autoencoder that extends PCA for embedding compression. It keeps PCA as the encoder and adds a quadratic decoder, fit in closed form with ridge regression, to capture nonlinear structure in transformer-style embeddings. Results on BEIR/FiQA show the poly-AE improves NDCG@10 over plain PCA across multiple dimension budgets, with caveats about corpus size, the transductive fitting setup, and limited applicability to non-MRL models.
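The construction described above can be sketched in a few lines of NumPy: a PCA projection as the encoder, quadratic feature expansion of the code, and a closed-form ridge solve for the decoder weights. This is a minimal illustration of the general idea, not the post's actual implementation; all function names, the feature layout, and the regularization value are assumptions for the sketch.

```python
import numpy as np

def fit_poly_autoencoder(X, k, lam=1e-3):
    """Sketch (assumed setup): PCA encoder to k dims, quadratic ridge decoder.

    X   : (n, d) embedding matrix
    k   : code dimension (compression budget)
    lam : ridge regularization strength
    """
    mu = X.mean(axis=0)
    Xc = X - mu
    # PCA encoder: top-k right singular vectors of the centered data.
    _, _, Vt = np.linalg.svd(Xc, full_matrices=False)
    P = Vt[:k].T                    # (d, k) projection matrix
    Z = Xc @ P                      # (n, k) codes
    # Quadratic features: the code itself plus all pairwise products z_i * z_j
    # (upper triangle only, to avoid duplicates).
    iu = np.triu_indices(k)
    Phi = np.hstack([Z, (Z[:, :, None] * Z[:, None, :])[:, iu[0], iu[1]]])
    # Closed-form ridge regression: W = (Phi^T Phi + lam I)^{-1} Phi^T Xc.
    A = Phi.T @ Phi + lam * np.eye(Phi.shape[1])
    W = np.linalg.solve(A, Phi.T @ Xc)
    return mu, P, W, iu

def decode(Z, mu, W, iu):
    """Reconstruct embeddings from k-dim codes via the quadratic decoder."""
    Phi = np.hstack([Z, (Z[:, :, None] * Z[:, None, :])[:, iu[0], iu[1]]])
    return Phi @ W + mu
```

Because the quadratic feature map contains the code itself as a linear term, the decoder's training reconstruction error can never be worse than plain PCA's (up to the small ridge penalty); any gain comes from the pairwise-product terms fitting curvature the linear map misses.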