I read the New Scientist article this morning with mixed feelings: the bittersweet taste of being right. I suppose this is what intellectuals generally feel when they finally see their (good) ideas vindicated after the community of experts (and purported, self-proclaimed experts) had initially dismissed them as at best preposterous, if not nonsensical.
Below is the idea, posted years ago on the Philosophy Forums with the justifications of a physics freshman (at the time I had first-year thermodynamics and quantum physics under my belt) and the audacity of a philosopher. Needless to say, it was met with scorn, to say the least.
My Philosophy Forums entry, from 4th March 2010 (emphasis added):
* * *
Subject: Uncertainty and Entropy 04/03/10
This rumination does not purport to be scientific, so I guess it may be free of the objection of pseudoscience. This disclaimer, I think, is useful in order to appreciate the creative aspect/potential of the idea behind the post, and will hopefully make it more digestible to all the physicists cringing at the eclectic application of physical concepts in what follows.
The main idea has the following conceptual chain: uncertainty → knowledge → information → entropy. Hence a connection between Heisenberg's principle (quantum mechanics) and the second law of thermodynamics (classical mechanics).
Some useful quotes from Knight. As for photons: "The fact that waves are spread out makes it meaningless to specify an exact frequency and an exact arrival time simultaneously. This is an inherent feature of waviness that applies to all waves." For matter particles: "Our knowledge about a particle is inherently uncertain. Why? Because of the wave-like nature of matter. The 'particle' is spread out in space, so there simply is not a precise value of its position x. Similarly, the de Broglie relationship between momentum and wavelength implies that we cannot know the momentum of a wave packet any more exactly than we can know its wavelength or frequency."
Now, insofar as knowledge is knowledge of something, i.e. information (see Shannon), there is, I suggest, a positive correlation between knowledge and entropy.
I conclude that the uncertainty principle is (also) a classical statement about the entropy of a system in which the agent's knowledge (neural firing pattern(?)) is taken to be part of the experimental system. The random nature (conforming to some wavefunction, where position and momentum are random variables) of the "potential observable" within the broader system "potential observable + brain of scientist" would proceed to a lower entropy state if the experimenter yielded more precise information than the uncertainty principle allows, and thereby violate the second law of thermodynamics.
* * *
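In hindsight, the connection I was groping toward has a precise modern form. The entropic uncertainty relation of Białynicki-Birula and Mycielski (1975) bounds the sum of the Shannon (differential) entropies of a particle's position and momentum distributions (stated here for reference; the constant depends on one's Fourier-transform conventions):

$$
h(x) + h(p) \;\geq\; \ln(e \pi \hbar),
\qquad
h(x) = -\int |\psi(x)|^2 \ln |\psi(x)|^2 \, dx,
$$

with $h(p)$ defined analogously from the momentum-space wavefunction. It is exactly this entropic formulation of uncertainty that work like Wehner and Hänggi's builds on.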
And here is the link to the patronising, stubborn and contemptuous replies I received from the pseudo-experts on the forum (I think one of them is even a moderator in the matters of physics).
And below is the New Scientist article which both did and didn't make my day this morning ;)
From the New Scientist 23rd June 2012 (emphasis added):
Stephanie Wehner and Esther Hänggi at the National University of Singapore's Centre for Quantum Technology have taken a new tack, recasting the uncertainty principle in the language of information theory.
First, they suggest that the two properties of a single object that cannot be known simultaneously can be thought of as two streams of information encoded in the same particle. In the same way that you can't know a particle's momentum and location to an arbitrarily high level of accuracy, you also can't completely decode both of these messages. If you figure out how to read message 1 more accurately, then your ability to decrypt message 2 becomes more limited.
Next the pair calculate what happens if they loosen the limits of the uncertainty principle in this scenario, allowing the messages to be better decoded and letting you access information that you wouldn't have had when the uncertainty principle was in force.
Wehner and Hänggi conclude that this is the same as getting more useful energy, or work, out of a system than is put in, which is forbidden by the second law of thermodynamics. That is because both energy and information are needed to extract work from a system.
To understand why, imagine trying to drive a piston using a container full of heated gas. If you don't know in which direction the gas particles are moving, you may angle the piston wrongly and get no useful work out of the system. But if you do know which way they are moving, you will be able to angle the piston so that the moving particles drive it. You will have converted the heat into useful work in the second scenario, even though the same amount of energy is available as in the first scenario.
Being able to decode both of the messages in Wehner and Hänggi's imaginary particle suddenly gives you more information. As demonstrated by the piston, this means you have the potential to do more work. But this extra work comes for free so is the same as creating a perpetual motion machine, which is forbidden by thermodynamics (arxiv.org/abs/1205.6894v1).
"The second law of thermodynamics is something which we see everywhere and basically no one is questioning," says Mario Berta, a theoretical physicist from the Swiss Federal Institute of Technology in Zurich, who was not involved in the work. "Now we know that without an uncertainty principle we could break the second law."
Jessica Giggs "To be quantum is to be uncertain", New Scientist v214 n2870 (online here)
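The piston argument in the article is essentially Szilard's engine: one bit of information about the gas lets you extract at most kT ln 2 of work from a single heat bath. A minimal numeric sketch (my own illustration, not from the article; the function name is mine):

```python
import math

K_B = 1.380649e-23  # Boltzmann constant, J/K (exact SI value)

def max_work_per_bit(temperature, bits=1.0):
    """Szilard-engine bound: work (J) extractable from a heat bath at
    `temperature` kelvin given `bits` of information about the system,
    e.g. which side of the piston a gas molecule is on."""
    return bits * K_B * temperature * math.log(2)

w = max_work_per_bit(300.0)
print(f"Max work from one bit at 300 K: {w:.3e} J")  # ~2.87e-21 J
```

Tiny per bit, but nonzero: if the uncertainty principle were loosened and both "messages" in a particle could be decoded, each extra bit would buy this much free work, which is precisely the perpetual-motion violation the authors identify.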
Ironically, being a philosophy major, I identify more with a community that includes people like Stephanie and Esther than with one that grants positions of authority (such as moderator roles on Philosophy Forums) to pseudo-intellectuals of the kind shown above to be resistant to new ideas.