Born to learn?

June 22, 2010

There’s a debate in linguistics over the extent to which humans acquire language via learning and statistical generalization, and the extent to which language acquisition is aided by innate biases. Theoretical arguments have been devised to show that general-purpose learning would require too much data and that innate bias is therefore necessary. An eprint claims to provide mathematical and empirical evidence against such impossibility results and in favor of the possibility of language acquisition through general-purpose learning. The learning method being advanced is a Minimum Description Length type of method.

A. S. Hsu, N. Chater and P. M. B. Vitanyi. The probabilistic analysis of language acquisition: Theoretical, computational, and experimental analysis. arXiv:1006.3271 (2010)
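
For a feel of what an MDL-type method does, here is a minimal sketch (a toy of my own devising, not the paper's actual method): among candidate hypotheses, choose the one minimizing L(H) + L(D|H), the bits needed to describe the hypothesis plus the bits needed to describe the data given the hypothesis.

```python
from math import log2
from collections import Counter

def description_length(data, order, param_bits=8):
    """Two-part code length, in bits: an order-k Markov model plus the
    data encoded with that model's (maximum-likelihood) predictions."""
    contexts, joint = Counter(), Counter()
    for i in range(order, len(data)):
        ctx = data[i - order:i]
        contexts[ctx] += 1
        joint[(ctx, data[i])] += 1
    # L(D|H): negative log-likelihood of the data under the fitted model
    data_bits = -sum(n * log2(n / contexts[ctx]) for (ctx, _), n in joint.items())
    # L(H): crude fixed cost per fitted parameter (one per context/symbol pair)
    model_bits = param_bits * len(joint)
    return model_bits + data_bits

data = "abababababababababababab"
best = min(range(3), key=lambda k: description_length(data, k))
print(best)  # 1: the order-1 model's predictions pay for its parameters
```

In this toy, the order-1 Markov model wins: its near-perfect predictions of the alternating string are worth the cost of its parameters, while higher orders add parameters without improving prediction.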


Update on entropy and memory loss

June 21, 2010

A while back I noted an interesting paper about the arrow of time and memory loss in quantum mechanics. Unfortunately for that idea, a new Comment by Jennings and Rudolph [PRL 104:148901, 2010] refutes it by exhibiting a counterexample.


Prevalence of global warming contrarians among experts

June 21, 2010

A new survey sheds more light on who the anthropogenic global warming contrarians are. The abstract is self-explanatory and the full text is open access.

W. R. L. Anderegg et al. Expert credibility in climate change. PNAS, doi: 10.1073/pnas.1003187107

Abstract: Although preliminary estimates from published literature and expert surveys suggest striking agreement among climate scientists on the tenets of anthropogenic climate change (ACC), the American public expresses substantial doubt about both the anthropogenic cause and the level of scientific agreement underpinning ACC. A broad analysis of the climate scientist community itself, the distribution of credibility of dissenting researchers relative to agreeing researchers, and the level of agreement among top climate experts has not been conducted and would inform future ACC discussions. Here, we use an extensive dataset of 1,372 climate researchers and their publication and citation data to show that (i) 97–98% of the climate researchers most actively publishing in the field support the tenets of ACC outlined by the Intergovernmental Panel on Climate Change, and (ii) the relative climate expertise and scientific prominence of the researchers unconvinced of ACC are substantially below that of the convinced researchers.


Generic optimization in infinitely large search domains

January 2, 2010

In 1995 and the following few years, Wolpert and Macready published technical reports and a journal article on what has come to be called the No Free Lunch theorems. These theorems say that all optimization algorithms yield the same average performance in optimizing a completely random function. Read the rest of this entry »
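
The flavor of the theorems can be checked by brute force on a toy domain. Below is a short illustration of my own (not Wolpert and Macready's formal setting): averaged over every possible binary function on a four-point domain, two different fixed search orders need exactly the same expected number of evaluations to find the maximum.

```python
from itertools import product

X = range(4)   # a four-point search domain
Y = (0, 1)     # binary objective values

def evals_to_find_max(order, f):
    """Evaluations a fixed search order needs before it first sees max(f)."""
    best = max(f)
    for i, x in enumerate(order, start=1):
        if f[x] == best:
            return i

order_a = [0, 1, 2, 3]  # left-to-right sweep
order_b = [2, 0, 3, 1]  # an arbitrary other permutation

all_functions = list(product(Y, repeat=len(X)))  # all |Y|^|X| = 16 functions
avg_a = sum(evals_to_find_max(order_a, f) for f in all_functions) / len(all_functions)
avg_b = sum(evals_to_find_max(order_b, f) for f in all_functions) / len(all_functions)
print(avg_a, avg_b)  # identical: 1.6875 1.6875
```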


A condensed matter interpretation of the Standard Model

December 29, 2009

Condensed matter physics provides mathematical analogies with particle physics. Quasiparticles, i.e. particle-like excitations of a given ground state, often share many physical properties with more fundamental particles. Among the many attempts to gain deeper insight into the Standard Model is the export of analogies in the other direction: from condensed matter physics to fundamental physics. Here’s one such line of work:

I. Schmelzer. A condensed matter interpretation of SM fermions and gauge fields. arXiv:0908.0591

Abstract: We present the bundle (Aff(3) ⊗ C ⊗ Λ)(R^3), with a geometric Dirac equation on it, as a three-dimensional geometric interpretation of the SM fermions. Each (C ⊗ Λ)(R^3) describes an electroweak doublet. The Dirac equation has a doubler-free staggered spatial discretization on the lattice space Aff(3) ⊗ C(Z^3). This space allows a simple physical interpretation as a phase space of a lattice of cells in R^3. We find the SM SU(3)_c × SU(2)_L × U(1)_Y action on (Aff(3) ⊗ C ⊗ Λ)(R^3) to be a maximal anomaly-free special gauge action preserving E(3) symmetry and symplectic structure, which can be constructed using two simple types of gauge-like lattice fields: Wilson gauge fields and correction terms for lattice deformations. The lattice fermion fields we propose to quantize as low energy states of a canonical quantum theory with Z_2-degenerated vacuum state. We construct anticommuting fermion operators for the resulting Z_2-valued (spin) field theory. A metric theory of gravity compatible with this model is presented too.

A recent follow-up paper focuses on neutrinos:

I. Schmelzer. Neutrinos as pseudo-acoustic ether phonons. arXiv:0912.3892

Abstract: Recently [arXiv:0908.0591] the author has proposed a condensed matter model which gives all fermions and gauge fields of the standard model of particle physics. In the model, the inertness of right-handed neutrinos is explained by an association with translational symmetry. We argue that this association may be used as well to explain the small neutrino masses. They appear to be pseudo-Goldstone particles associated with an approximate translational symmetry of a subsystem. Then we propose to explain the masslessness of SU(3)_c × U(1)_em with an unbroken SU(3) × U(1) gauge symmetry of the model. We also detect a violation of a necessary symmetry property in the lattice Dirac equation and present a fix for this problem.


Convex analysis and thermodynamics

November 23, 2009

A previous post briefly reviewed convex analysis. Here I’ll review the application of convexity in basic thermodynamics.

Equilibrium states

The concept of thermodynamic equilibrium is a generalization of mechanical equilibrium, where all forces and torques cancel each other. Informally, the idea is that a system in thermodynamic equilibrium has stable, unchanging macroscopic properties, which may be characterized by an n-tuple of extensive variables. Read the rest of this entry »
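
As a taste of where this is going, one standard instance of the convexity machinery (written in my own notation, which may differ from the post's conventions) is the Legendre transform that trades an extensive variable for its conjugate intensive one:

```latex
% U(S, V, N) is convex in its extensive arguments; the Helmholtz free
% energy is its Legendre transform with respect to the entropy S,
% exchanging S for the conjugate intensive variable, the temperature T:
F(T, V, N) = \inf_{S}\, \bigl[\, U(S, V, N) - T S \,\bigr],
\qquad T = \left. \frac{\partial U}{\partial S} \right|_{V, N}.
```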


I, AIXI

September 10, 2009

A cool new eprint has appeared:

J. Veness et al. A Monte Carlo AIXI Approximation. arXiv:0909.0801

The question that defines the context for this article is: How should probabilities be assigned? One way, with much to recommend itself, is to take them to be algorithmic probabilities or universal priors. Suppose one has observed the first N values of a discrete time series, maybe a byte stream, and wishes to predict or make a bet about the next value. Is there a general probability measure appropriate for all cases that fit this abstract setting and, if so, which one? Read the rest of this entry »
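
To make the universal-prior idea concrete, here is a toy Bayesian mixture of my own (the model class and the invented description lengths are illustrative only; Solomonoff's actual mixture ranges over all programs and is incomputable, which is why approximations like the paper's are needed):

```python
# Toy mixture predictor: a handful of hand-picked models for a bit
# stream, each weighted by 2^(-description length) in the spirit of a
# universal prior. The "description lengths" here are invented.

# each model maps a history (tuple of bits) to P(next bit = 1)
models = {
    "uniform":  lambda h: 0.5,
    "mostly-1": lambda h: 0.9,
    "mostly-0": lambda h: 0.1,
    "sticky":   lambda h: 0.9 if (h and h[-1] == 1) else 0.1,
}
# prior weight ~ 2^(-invented description length in bits)
prior = {"uniform": 2**-1, "mostly-1": 2**-3, "mostly-0": 2**-3, "sticky": 2**-4}

def predict(history):
    """Mixture probability that the next bit is 1, given the history."""
    # posterior weight of each model = prior * likelihood of the history
    weights = {}
    for name, m in models.items():
        likelihood = 1.0
        for i, bit in enumerate(history):
            p1 = m(history[:i])
            likelihood *= p1 if bit == 1 else (1.0 - p1)
        weights[name] = prior[name] * likelihood
    z = sum(weights.values())
    return sum(w * models[n](history) for n, w in weights.items()) / z

print(predict((1, 1, 1, 1, 1)))  # ~0.83: repetition-friendly models dominate
```

After five 1s in a row, the mixture leans heavily toward the "mostly-1" and "sticky" models, so it bets well above even odds on another 1; that is the qualitative behavior a universal prior formalizes.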