Theory of nonlinear microwave absorption by interacting two-level systems

The microwave absorption and noise caused by quantum two-level systems (TLSs) dramatically suppress the coherence of Josephson junction qubits, which are promising candidates for quantum information applications. Microwave absorption by TLSs is not yet clearly understood because of the complexity of their interactions, which lead to spectral diffusion. Here, a theory of nonlinear absorption in the presence of spectral diffusion is developed using the generalized master equation formalism. The theory predicts that the linear absorption regime holds as long as the TLS Rabi frequency is smaller than their phase decoherence rate. At higher external fields, a novel nonlinear absorption regime is found, with the loss tangent inversely proportional to the intensity of the field. The theory can be generalized to acoustic absorption and to the lower dimensions realized in superconducting qubits.
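A schematic way to summarize the two regimes stated above, written here with assumed notation (the symbols $\Omega_R$ for the TLS Rabi frequency, $\Gamma_2$ for the phase decoherence rate, $F$ for the driving-field amplitude, and $I \propto |F|^2$ for its intensity are our labels, not necessarily the paper's), is

% sketch of the field dependence of the loss tangent in the two regimes;
% the abstract only fixes the proportionality, not the crossover prefactor
\[
  \tan\delta \;\approx\; \tan\delta_0
  \quad (\Omega_R \lesssim \Gamma_2, \ \text{linear regime}),
  \qquad
  \tan\delta \;\propto\; \frac{1}{\Omega_R^{2}} \;\propto\; \frac{1}{I}
  \quad (\Omega_R \gtrsim \Gamma_2, \ \text{nonlinear regime}),
\]

since $\Omega_R \propto F$, a loss tangent scaling as $1/\Omega_R^{2}$ is equivalent to the stated inverse proportionality to the field intensity.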