Abstract:
Suppression of the critical temperature in homogeneously disordered superconducting films is a consequence of the disorder-induced enhancement of Coulomb repulsion. We demonstrate that for the majority of thin films studied now this effect cannot be completely explained under the assumption of two-dimensional diffusive nature of electron motion. The main contribution to the suppression of $T_c$ arises from the correction to the electron-electron interaction constant coming from small scales of the order of the Fermi wavelength that leads to the critical temperature shift $\delta T_c/T_{c0} \sim - 1/k_Fl$, where $k_F$ is the Fermi momentum and l is the mean free path. Thus almost for all superconducting films that follow the fermionic scenario of $T_c$ suppression with decreasing the film thickness, this effect is caused by the proximity to the three-dimensional Anderson localization threshold and is controlled by the parameter $k_Fl$ rather than the sheet resistance of the film.