A quantum limit in physics is a limit on measurement accuracy at quantum scales.
Depending on the context, the limit may be absolute (such as the Heisenberg limit), or it may only apply when the experiment is conducted with naturally occurring
quantum states (e.g. the standard quantum limit in interferometry) and can be circumvented with advanced state preparation and measurement schemes.
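In interferometry, for instance, these two limits take a simple form: with <math>N</math> uncorrelated photons the phase sensitivity of an interferometer is bounded by the standard quantum limit (the shot-noise limit), while nonclassical input states can in principle reach the Heisenberg limit:

:<math>\Delta\varphi_{\mathrm{SQL}} = \frac{1}{\sqrt{N}}\,, \qquad \Delta\varphi_{\mathrm{HL}} = \frac{1}{N}\,.</math>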
The usage of the term standard quantum limit or SQL is, however, broader than just interferometry. In principle, any linear measurement of a quantum mechanical
observable of a system under study that does not
commute with itself at different times leads to such limits. In short, it is the
Heisenberg uncertainty principle that is the cause.
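For example, the Heisenberg-picture position of a free mass <math>m</math> evolves as <math>\hat{x}(t) = \hat{x}(0) + \hat{p}(0)\,t/m</math>, so it does not commute with itself at different times:

:<math>[\hat{x}(t), \hat{x}(t')] = \frac{i\hbar}{m}\,(t' - t) \neq 0\,,</math>

which is why even repeated position measurements of a free mass are subject to such a limit.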
A more detailed explanation would be that any measurement in
quantum mechanics involves at least two parties, an Object and a Meter. The former is the system whose observable, say <math>\hat{x}</math>, we want to measure. The latter is the system we couple to the Object in order to infer the value of <math>\hat{x}</math> of the Object by recording some chosen observable, <math>\hat{\mathcal{O}}</math>, of this system, ''e.g.'' the position of the pointer on a scale of the Meter. This, in a nutshell, is a model of most of the measurements happening in physics, known as ''indirect'' measurements (see pp. 38–42 of
). So any measurement is a result of an interaction, and that interaction acts in both ways. Therefore, the Meter acts on the Object during each measurement, usually via the quantity <math>\hat{\mathcal{F}}</math> conjugate to the readout observable <math>\hat{\mathcal{O}}</math>, thus perturbing the value of the measured observable <math>\hat{x}</math> and modifying the results of subsequent measurements. This is known as the quantum back action of the Meter on the system under measurement.
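A minimal sketch of such a coupling, assuming a linear interaction with strength <math>g</math> (a constant introduced here for illustration), is the von Neumann measurement Hamiltonian <math>\hat{H}_{\mathrm{int}} = g\,\hat{x}\,\hat{\mathcal{F}}</math>. With <math>[\hat{\mathcal{F}}, \hat{\mathcal{O}}] = -i\hbar</math> and <math>[\hat{x}, \hat{p}] = i\hbar</math>, the Heisenberg equations of motion read

:<math>\frac{d\hat{\mathcal{O}}}{dt} = \frac{i}{\hbar}\,[\hat{H}_{\mathrm{int}}, \hat{\mathcal{O}}] = g\,\hat{x}\,, \qquad \frac{d\hat{p}}{dt} = \frac{i}{\hbar}\,[\hat{H}_{\mathrm{int}}, \hat{p}] = -g\,\hat{\mathcal{F}}\,,</math>

so the pointer integrates the signal <math>\hat{x}</math>, while the Object's momentum is kicked by the fluctuations of <math>\hat{\mathcal{F}}</math>, which is precisely the back action described above.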
At the same time, quantum mechanics prescribes that the readout observable of the Meter should have an inherent uncertainty, <math>\delta\hat{\mathcal{O}}</math>, additive to and independent of the value of the measured quantity <math>\hat{x}</math>. This is known as ''measurement imprecision'' or ''measurement noise''. Because of the
Heisenberg uncertainty principle, this imprecision cannot be arbitrary and is linked to the back-action perturbation by the
uncertainty relation:
:<math>\Delta \mathcal{O} \cdot \Delta \mathcal{F} \geqslant \frac{\hbar}{2}\,,</math>
where <math>\Delta a = \sqrt{\langle \hat{a}^2 \rangle - \langle \hat{a} \rangle^2}</math> is the standard deviation of an observable <math>\hat{a}</math> and <math>\langle \hat{a} \rangle</math> stands for the expectation value of <math>\hat{a}</math> in whatever quantum state the system is in. The equality is reached if the system is in a ''minimum uncertainty state''. The consequence for our case is that the more precise our measurement, ''i.e.'' the smaller <math>\Delta \mathcal{O}</math> is, the larger the perturbation the Meter exerts on the measured observable <math>\hat{x}</math>. Therefore, the readout of the Meter will, in general, consist of three terms:
:<math>\hat{\mathcal{O}} = \hat{x}_{\mathrm{free}} + \delta\hat{\mathcal{O}} + \delta\hat{x}_{\mathrm{BA}}\,,</math>
where <math>\hat{x}_{\mathrm{free}}</math> is the value of <math>\hat{x}</math> that the Object would have, were it not coupled to the Meter, and <math>\delta\hat{x}_{\mathrm{BA}}</math> is the perturbation of the measured observable caused by the back action of the Meter.
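To see how these three terms produce a standard quantum limit, the following sketch (a hypothetical numerical illustration; the test mass, measurement time, and the simple error model are assumptions, not taken from the text above) balances imprecision against back action for a free mass: a position readout with imprecision <math>\Delta x</math> kicks the momentum by at least <math>\hbar/(2\Delta x)</math>, which after a time <math>\tau</math> grows into a position error <math>\hbar\tau/(2m\Delta x)</math>, and minimizing the total error over <math>\Delta x</math> recovers <math>\Delta x_{\mathrm{SQL}} = \sqrt{\hbar\tau/m}</math>.

<syntaxhighlight lang="python">
import numpy as np

hbar = 1.054571817e-34  # reduced Planck constant, J*s

def total_error(dx_meas, m, tau):
    """RMS position error of a free mass after time tau.

    dx_meas -- imprecision of the position readout (the delta-O term).
    The back-action momentum kick is taken at the Heisenberg bound,
    dp_BA = hbar / (2 * dx_meas), and turns into a position error
    dp_BA * tau / m; the two contributions are assumed independent,
    so their variances add.
    """
    dx_ba = hbar * tau / (2.0 * m * dx_meas)
    return np.sqrt(dx_meas**2 + dx_ba**2)

# Assumed parameters: a 1 g test mass observed for 1 s.
m, tau = 1e-3, 1.0
dx = np.logspace(-20, -14, 2001)   # trial imprecision values, metres
err = total_error(dx, m, tau)

best = np.argmin(err)
sql = np.sqrt(hbar * tau / m)      # analytic standard quantum limit
print(f"numerical minimum: {err[best]:.3e} m at dx_meas = {dx[best]:.3e} m")
print(f"analytic SQL:      {sql:.3e} m")
</syntaxhighlight>

The minimum sits at <math>\Delta x = \sqrt{\hbar\tau/2m}</math>, where imprecision and back action contribute equally; making the readout more precise than this only increases the total error, which is the hallmark of the standard quantum limit.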