A Brief History of Superconductivity
Before the discovery of superconductivity, it was already known that cooling
a metal increases its conductivity, owing to reduced electron-phonon
scattering (detailed in the Theory section).
In 1913, lead was found to become superconducting at 7.2 K. It was
another 17 years before niobium was found to superconduct at a higher
temperature, 9.2 K.
It was not until 1933 that physicists became aware of the other defining property of superconductors: perfect diamagnetism. Meissner and Ochsenfeld discovered that a superconducting material cooled below its critical temperature in a magnetic field expels the magnetic flux from its interior. This has become known as the Meissner effect.
The maximum external magnetic field strength at which a superconductor
can still exclude the field is known as the critical field strength.
In 1935, Fritz and Heinz London proposed a pair of equations to explain the Meissner effect and to predict how far a magnetic field can penetrate into a superconductor. It was not until 1950, however, that further major theoretical progress was made, with Ginzburg-Landau theory, which described superconductivity phenomenologically and from which the London equations can be derived.
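The Londons' penetration prediction can be sketched in modern notation (the symbols below are the standard textbook ones, not necessarily those of the original papers). Combining the second London equation with Ampère's law gives a screening equation for the magnetic field inside the superconductor:

```latex
\[
  \nabla^{2}\mathbf{B} = \frac{\mathbf{B}}{\lambda_{L}^{2}},
  \qquad
  \lambda_{L} = \sqrt{\frac{m}{\mu_{0}\, n_{s}\, e^{2}}}
\]
% m, e : electron mass and charge
% n_s  : density of superconducting electrons
% lambda_L : the London penetration depth

% For a field applied parallel to a plane surface at x = 0,
% the solution decays exponentially into the material:
\[
  B(x) = B(0)\, e^{-x/\lambda_{L}}
\]
```

So the field is not excluded abruptly at the surface but falls off over a characteristic depth lambda_L, typically tens of nanometres in elemental superconductors, which is why a bulk sample appears perfectly diamagnetic.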
Ginzburg-Landau theory has since been largely superseded by BCS theory
(1957), which treats superconductivity at a more microscopic level.
The highest known superconducting transition temperature increased slowly
as scientists found new materials with higher values of Tc, but in 1986
a Ba-La-Cu-O system was discovered to superconduct at 35 K, by far the
highest temperature then found. This was remarkable, as BCS theory had
predicted a theoretical limit to Tc of roughly 30-40 K (set by thermal
vibrations).