GRAND CHALLENGES

Each year, we issue a new Grand Challenge in mathematics, physics, or the biological sciences. We then assemble a dedicated team of fellows who are highly motivated and well equipped to pursue the research question. The results undergo full peer review and are then made freely available to the public.

PREVIOUS AND ONGOING GRAND CHALLENGES

2018-2022: The physical basis of consciousness

A mechanistic explanation of consciousness has long eluded scientists and philosophers. We simply do not understand what perceptual experience is, or how mental information content arises from the operations of neural networks. The goal of this Grand Challenge was to devise a theoretical framework for understanding consciousness in accordance with physical law. The effort generated a theory that satisfies these requirements by combining the laws of neuroscience, information theory, and computational physics. This work is currently undergoing peer review; the preprints are posted here.

Ongoing: Tackling computational complexity

The complexity of a computational problem can be classified by the time required to find a solution. Problems that can be solved in polynomial time by a deterministic Turing machine belong to the class P. Problems whose proposed solutions can be verified in polynomial time belong to the class NP; whether every NP problem can also be solved in polynomial time by deterministic methods is the open P versus NP question. A problem H is NP-hard if every problem L in NP can be reduced to H in polynomial time, making H at least as hard as any problem in NP; NP-hard problems include decision, search, and optimization problems. NP-complete problems are those that are both in NP and NP-hard. The outstanding question is whether problems in this complexity class can be practically solved with non-deterministic computing methods. This effort focuses on the operational laws and algebras that govern non-deterministic computing at ambient temperatures.
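The asymmetry at the heart of this class can be made concrete with a small sketch. The subset-sum problem below is a standard NP-complete example (chosen here for illustration; it is not drawn from the Challenge itself): finding a subset of numbers that sums to a target may require searching exponentially many candidates, but verifying a proposed solution takes only polynomial time.

```python
# Illustrative sketch: polynomial-time verification of an NP certificate.
# The problem (subset sum) and function name are chosen for this example.

def verify_subset_sum(numbers, target, certificate):
    """Check whether `certificate` is a valid subset of `numbers`
    whose elements sum to `target`. Runs in polynomial time."""
    remaining = list(numbers)
    for x in certificate:
        if x not in remaining:
            return False  # certificate uses a value not available
        remaining.remove(x)  # consume the value so duplicates are honest
    return sum(certificate) == target

# Finding a solution by brute force means testing up to 2^n subsets,
# but checking any single candidate is fast:
print(verify_subset_sum([3, 34, 4, 12, 5, 2], 9, [4, 5]))   # True
print(verify_subset_sum([3, 34, 4, 12, 5, 2], 9, [3, 34]))  # False
```

The gap between the cost of the search and the cost of the check is exactly what separates solving from verifying in the P versus NP question.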

Ongoing: Living in an information-filled world

In the nineteenth century, Ludwig Boltzmann and Willard Gibbs formulated the mathematical law for entropy, a quantifiable amount of disorder in a thermodynamic system. Claude Shannon later derived an astonishingly similar equation to measure information, the amount of disorder or incompressibility in a dataset, and John von Neumann formulated the analogous quantity for quantum mechanics. Yet a functional link between these laws has remained out of reach, despite the probabilistic mechanics common to the foundations of thermodynamics, computing, and quantum theory. This Grand Challenge addresses the following questions: How are information and entropy related? Do they play any significant role in the structure and operation of our universe? What are the implications of living in a probabilistic world, one which generates information and entropy? And how do biological organisms achieve a more ordered system state over time through physically instantiated Bayesian inference?
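The similarity referred to above is easiest to see with the three quantities written side by side (standard textbook forms, included here for reference rather than taken from the Challenge text):

```latex
S = -k_B \sum_i p_i \ln p_i
\quad \text{(Gibbs entropy; for equiprobable microstates, } S = k_B \ln W \text{)}

H = -\sum_i p_i \log_2 p_i
\quad \text{(Shannon entropy, in bits)}

S(\rho) = -\operatorname{Tr}\!\left(\rho \ln \rho\right)
\quad \text{(von Neumann entropy of a density matrix } \rho \text{)}
```

All three are expectations of a logarithm over a probability distribution, differing only in units and in whether the probabilities describe microstates, symbols, or quantum states.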

Ongoing: The emergent structure of the universe

A long-standing challenge in physics is reconciling quantum mechanics with general relativity. At quantum scales, a fundamental uncertainty in the position and momentum of a particle makes it difficult to measure the curvature of space-time at the particle's exact location. Meanwhile, at cosmological scales, the curvature of space-time appears irregular and dynamic. The goal of this Grand Challenge is to devise a theoretical framework that describes the curvature of the universe using mechanical laws rather than empirically derived constants that are not valid at every scale, with a metric tensor that emerges directly from an extended model of particle physics.
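The two regimes mentioned above are governed by well-known relations (standard forms, added for reference, not drawn from the Challenge itself): the Heisenberg uncertainty principle at quantum scales, and the Einstein field equations at cosmological scales, in which the metric tensor g determines the curvature of space-time:

```latex
\Delta x \,\Delta p \;\ge\; \frac{\hbar}{2}
\qquad\qquad
G_{\mu\nu} + \Lambda\, g_{\mu\nu} \;=\; \frac{8\pi G}{c^{4}}\, T_{\mu\nu}
```

A framework of the kind described would need the metric tensor on the right to emerge from the particle-scale physics constrained by the relation on the left.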

 
