Do Quantum Transformers Help? A Systematic VQC Architecture Comparison on Tabular Benchmarks

April 27, 2026
arXiv:2604.23931

Authors

Chi-Sheng Chen, En-Jui Kuo

Abstract

Variational quantum circuits (VQCs) are a leading approach to quantum machine learning on near-term devices, yet it remains unclear which circuit architecture yields the best accuracy-parameter trade-off on classical tabular data. We present a systematic empirical comparison of four VQC families -- multi-layer fully-connected (FC-VQC), residual (ResNet-VQC), hybrid quantum-classical transformer (QT), and fully quantum transformer (FQT) -- across five regression and classification benchmarks.
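To make the FC-VQC idea concrete, here is a minimal statevector sketch of a fully-connected variational circuit on two qubits: tabular features are angle-encoded, then a stack of parameterized rotation layers with entangling CNOTs is applied, and a Pauli-Z expectation is read out as the model output. This is an illustrative toy, not the authors' implementation; the paper's FC-VQC uses more qubits, richer gates, and the Type4 connectivity discussed below, none of which are reproduced here.

```python
import numpy as np

# Toy 2-qubit fully-connected VQC, simulated as a plain statevector.
# Sketch only: the paper's FC-VQC is larger and structured differently.

def ry(theta):
    """Single-qubit Y-rotation gate."""
    c, s = np.cos(theta / 2), np.sin(theta / 2)
    return np.array([[c, -s], [s, c]])

# CNOT with qubit 0 (most significant) as control: swaps |10> and |11>.
CNOT = np.array([[1, 0, 0, 0],
                 [0, 1, 0, 0],
                 [0, 0, 0, 1],
                 [0, 0, 1, 0]], dtype=float)

def fc_vqc(x, weights):
    """Angle-encode two features, apply variational layers, return <Z_0>."""
    state = np.zeros(4)
    state[0] = 1.0                                 # start in |00>
    state = np.kron(ry(x[0]), ry(x[1])) @ state    # data encoding
    for layer in weights:                          # variational layers
        state = np.kron(ry(layer[0]), ry(layer[1])) @ state
        state = CNOT @ state                       # entangling gate
    z0 = np.diag([1.0, 1.0, -1.0, -1.0])           # Pauli-Z on qubit 0
    return float(state @ z0 @ state)

rng = np.random.default_rng(0)
weights = rng.normal(size=(3, 2))   # depth 3, where expressibility saturates
print(fc_vqc(np.array([0.4, -0.7]), weights))
```

The scalar expectation in [-1, 1] would feed a classical loss for regression or classification; in practice one measures several observables and trains the rotation angles by gradient descent.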

Our key findings are: (i)~FC-VQCs achieve 90-96% of the $R^2$ of attention-based VQCs while using 40-50% fewer parameters, and consistently outperform equal-capacity MLPs (mean $R^2{=}0.829$ vs.\ MLP$_{720}$'s $0.753$ on Boston Housing, 3-seed average); (ii)~FC-VQC's Type4 inter-block connectivity provides partial cross-token mixing that approximates the role of attention -- explicit quantum self-attention yields only marginal gains on most datasets while significantly increasing parameter count; (iii)~expressibility saturates at circuit depth $\approx 3$, explaining why shallow VQCs already cover the Hilbert space effectively; (iv)~LayerNorm on the fully quantum transformer improves classification accuracy, suggesting normalization is important when all operations are quantum; (v)~in our noise study on Boston Housing, FQT degrades gracefully under depolarizing noise while QT collapses. All results are validated across three random seeds.
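A simple way to see what "degrading gracefully under depolarizing noise" means: for a global depolarizing channel of strength $p$, the expectation of any traceless observable is uniformly damped toward zero, $\langle O\rangle_p = (1-p)\,\langle O\rangle_{\text{ideal}}$. The sketch below applies that damping to a hypothetical noiseless output; it is a toy noise model, not the authors' experimental setup.

```python
# Toy model of global depolarizing noise on a measured expectation value.
# <O>_p = (1 - p) * <O>_ideal for a traceless observable O.
# Illustrative only; the paper's noise study is not reproduced here.

def depolarized_expval(ideal, p):
    """Expectation value after global depolarizing noise of strength p."""
    return (1.0 - p) * ideal

ideal = 0.82  # hypothetical noiseless model output
for p in (0.0, 0.05, 0.10, 0.20):
    print(f"p={p:.2f}  <O>={depolarized_expval(ideal, p):.3f}")
# p=0.00  <O>=0.820
# p=0.05  <O>=0.779
# p=0.10  <O>=0.738
# p=0.20  <O>=0.656
```

Graceful degradation means the model's predictions shrink smoothly like this as $p$ grows, rather than the output distribution collapsing, as the abstract reports for QT.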

These findings provide practical architectural guidance for deploying VQCs on near-term quantum hardware.
