Re: [Corpora-List] QM analogy and grammatical incompleteness

From: Dominic Widdows (widdows@maya.com)
Date: Sun Dec 18 2005 - 07:00:08 MET

    Dear Rob,

    > For instance, famously, you can perfectly describe the momentum or the
    > position of a particle, but not both at the same time. This is
    > Heisenberg's Uncertainty Principle.

    It is, and the Uncertainty Principle is perhaps the clearest formal
    expression we have of the homely truth "you can't know everything at
    once". A more classical version of the the same argument would follow
    from the observation that, if you had a machine that tried to run a
    Laplacian / deterministic model of the universe, its physical size and
    the speed of light would limit the amount of information it could
    synchronously process.

    > So it is not so much the fact of going from a continuous quality to a
    > discrete quality which is interesting, it is the necessary
    > incompleteness of description in terms of discrete qualities abstracted
    > from a distribution which is where I think we should be focusing, in
    > analogy with the Uncertainty Principle of physics.

    Is this similar to asking whether all such quantization is a "lossy"
    transformation? Is this what you mean by incompleteness?
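
    If so, here is a throwaway numerical sketch of what I mean by "lossy"
    (toy numbers of my own, nothing to do with your data): once a continuous
    quality has been binned into a few discrete categories, the original
    values cannot be recovered.

        import numpy as np

        # Toy sketch: binning a continuous quality into a few discrete
        # categories is a many-to-one map, so information is lost.
        signal = np.linspace(0.0, 1.0, 11)           # a "continuous" quality
        levels = np.array([0.0, 0.5, 1.0])           # three discrete categories
        quantized = levels[np.argmin(np.abs(signal[:, None] - levels), axis=1)]

        print(signal)      # 0.0, 0.1, ..., 1.0
        print(quantized)   # many distinct inputs collapse onto the same level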

    > Dominic, I have only read the publicly available chapter of your book.
    > You mention a "vector model" for quantum mechanics. Do you have
    > anything on the Web which talks about that? I can only recall ever
    > having met descriptions of QM in terms of functions.

    The article at http://plato.stanford.edu/entries/qm/ looks like a good
    place to begin for QM and vectors.

    In broad strokes, the histories of vectors and functional analysis became
    very closely linked in the 1840s and 1850s, partly through Hamilton's
    work on quaternions and the theory of analytic functions on 4-space.
    Functions over the real numbers form a vector space - you can add two
    functions together, and multiply any function by a scalar. As a result,
    mathematicians came to realize that Fourier analysis could be described
    in vectors - each of the functions sin(nx) and cos(nx) (for x a real
    number, n an integer) is a basis vector, and any piecewise smooth
    function can be expanded (uniquely) using these functions as a basis.
    The Fourier series coefficients are thus interpreted as the
    coordinates of a vector in this basis. This vector space is clearly
    infinite-dimensional, because a Fourier series expansion can be
    infinitely long. (Note again that this means you will never work with
    complete information once you've quantized your functions.) Make your
    functions complex-valued, and introduce a metric based on
    complex-conjugation, and you've got Hilbert spaces, around 1900 I
    think.
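
    A quick numerical sketch of the last point, if it helps (my own toy
    example, not something from the book): keep only the first few sine
    coefficients of a square wave and you get an approximation, not the
    function itself.

        import numpy as np

        # Toy sketch: Fourier coefficients as coordinates in the basis
        # {sin(nx)}; truncating the expansion only approximates the function.
        x = np.linspace(-np.pi, np.pi, 4000, endpoint=False)
        dx = x[1] - x[0]
        f = np.sign(x)                     # an odd, piecewise smooth function

        def sine_coefficient(n):
            # b_n = (1/pi) * integral of f(x) sin(nx) over [-pi, pi]
            return np.sum(f * np.sin(n * x)) * dx / np.pi

        N = 5                              # keep only the first five coordinates
        approx = sum(sine_coefficient(n) * np.sin(n * x) for n in range(1, N + 1))
        print(np.max(np.abs(f - approx)))  # well above zero: truncation is lossy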

    In the 1930s, Paul Dirac, John von Neumann, and others used this
    formulation of functional analysis as the basis for formal quantum
    theory, much of which boils down to the analysis of self-adjoint
    operators on Hilbert space. Each function is a state vector, which can
    be normalized and operated on. The resulting operator algebra (group of
    linear transformations under composition) is non-commutative, and this
    is how the formal theory accounts for the Uncertainty Principle - the
    lower bound on the uncertainty of two observables is given by (half)
    the magnitude of the expected value of their commutator.

    Clear as mud? ;)
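
    In case a small worked example is clearer than the prose, here is a toy
    sketch of the commutator bound (my own illustration, using the Pauli
    matrices as stand-in observables on a 2-dimensional state space):

        import numpy as np

        # Two self-adjoint operators that do not commute: the Pauli matrices
        # sigma_x and sigma_z, acting on a 2-dimensional state space.
        sigma_x = np.array([[0, 1], [1, 0]], dtype=complex)
        sigma_z = np.array([[1, 0], [0, -1]], dtype=complex)
        commutator = sigma_x @ sigma_z - sigma_z @ sigma_x
        print(commutator)                  # non-zero, so the order matters

        # Robertson's inequality for a normalized state psi:
        #   std(A) * std(B) >= |<psi| [A, B] |psi>| / 2
        psi = np.array([1, 1j], dtype=complex) / np.sqrt(2)

        def std_dev(op):
            mean = np.vdot(psi, op @ psi).real
            mean_sq = np.vdot(psi, op @ op @ psi).real
            return np.sqrt(mean_sq - mean ** 2)

        lower_bound = abs(np.vdot(psi, commutator @ psi)) / 2
        print(std_dev(sigma_x) * std_dev(sigma_z), ">=", lower_bound)

    For this particular state the two sides come out equal, so the bound is
    tight - about as neat as a toy example gets.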

    > I agree completely with your message, but would only add that while
    > quantum analogies can be very informative for lexis, where I think it
    > really gets interesting is in syntax, which responds very nicely to a
    > kind of "quantum" analysis in terms of generating new quantum qualities
    > (particles?), a new one for each new sentence.

    This may be partly because composition is so far modelled more robustly
    in syntax than it is in (parts of) semantics? Just trying to figure out
    what compositional rules apply to a list of noun-noun compounds extracted
    from a corpus is very hard - and this is just combining 2 "particles"!
    Some of the most interesting structures that arise in QM involve
    entanglement, and I dare say that some of the structures in syntax are
    as rich in this "multiply composed, new particles / systems arise"
    property. I don't have the expertise to do any proper analysis here,
    though.
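
    For what it's worth, here is the kind of minimal sketch I have in mind
    when I say this is hard (the "word vectors" below are made up, not corpus
    counts): adding the vectors for the two nouns loses word order entirely,
    whereas a tensor (outer) product keeps it and puts the compound in a
    larger space, rather as a composite quantum system lives in the tensor
    product of its parts.

        import numpy as np

        # Purely illustrative 3-dimensional "word vectors" - made up, not
        # derived from any corpus.
        olive = np.array([0.9, 0.1, 0.3])
        oil   = np.array([0.2, 0.8, 0.5])

        # Additive composition forgets order: "olive oil" == "oil olive".
        print(np.allclose(olive + oil, oil + olive))                    # True

        # A tensor (outer) product keeps order and lives in a larger space,
        # analogous to the state space of a composite quantum system.
        print(np.allclose(np.outer(olive, oil), np.outer(oil, olive)))  # False
        print(np.outer(olive, oil).shape)                               # (3, 3)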

    Best wishes,
    Dominic


