Wednesday, April 29, 2015

Entanglement: Quantum Inspired Computing (QuIC) Approach to Precompilation of Entanglers

Entanglement and Entanglers

Quantum entanglement is an observable phenomenon that is dependent on strong correlations being measurable between superficially disparate things.  In a nutshell, entangled particles are precompiled (i.e. prepared) into states in which entanglement is observable and measurable.

I think of entanglement and quantum state preparation in particular as a precompilation step.  The preparation process is, in my mind, just precompilation.

"The best way to predict the future is to invent it" - Alan Kay

Note of Caution

I personally hate redefining terms or introducing new ones, except where they contribute a conceptual mental model that could not otherwise be easily conveyed.

My reading recommendation is to review Baez's Crackpot Index or Siegel's Quack Index.

With those words of wisdom and caution, I would like to add a few conceptual colors to terms such as Entangler and Precompilation.

Entangler

An entangler is a process that accepts unentangled data representations and outputs entangled ones.  The representations entering and leaving the entangler may use the same or different representation models, as long as the qualities of entanglement are observable and measurable after the entangler has operated on the inputs.
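To make the input/output contract concrete, here is a minimal toy sketch in Python (the function name and the blending scheme are invented for illustration, and the correlation it produces is purely classical, not quantum entanglement): it takes two independent marginal distributions and blends their independent joint with a maximally correlated one, so that correlation is observable and measurable at the output.

```python
import numpy as np

def entangle_marginals(p_a, p_b, weight=1.0):
    """Toy 'entangler' on probability tables: blend the independent joint
    p_a (x) p_b with a maximally correlated joint (mass only on the diagonal).
    weight=0 leaves the inputs independent; weight=1 is fully correlated.
    This is classical correlation only, used to illustrate the contract."""
    independent = np.outer(p_a, p_b)
    correlated = np.diag((np.asarray(p_a) + np.asarray(p_b)) / 2.0)
    joint = (1 - weight) * independent + weight * correlated
    return joint / joint.sum()

p_a = np.array([0.5, 0.5])
p_b = np.array([0.5, 0.5])
print(entangle_marginals(p_a, p_b, weight=1.0))
# [[0.5 0. ]
#  [0.  0.5]]  -- the two outcomes are now perfectly correlated
```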

Precompiler

So what is precompilation?

Precompilation is a way of thinking about preprocessing and partial evaluation as a step that sets up the state of a computation so that subsequent compilation and execution are efficient.
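As a minimal (and entirely classical) sketch of the idea, consider ordinary ahead-of-time preparation in Python: the pattern below is compiled once, so that every later use runs against an already-prepared object.

```python
import re

# A minimal sketch of precompilation as preprocessing: the pattern-compilation
# work is done once, ahead of time, so that every later use runs against an
# already-prepared object.  (re.compile is just a convenient classical stand-in.)

pattern = re.compile(r"entangle\w*")     # the "precompilation" step
text = "entanglers entangle; entangled states stay entangled"
print(pattern.findall(text))             # repeated use of the prepared object is cheap
```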

I have not seen a lot of dialogue on quantum computing and the concept of precompilation, though, implicitly, any state preparation could be considered precompilation.

A compiler, in the quantum sense, is then the configuration of the computing material: material that has been prepared and configured into a form that runs the computational model, either by generating or by accepting data, and thereby produces the results of a simulation or computation.

Generic Quantum Inspired Precompilation

Okay, so here is the juice: taking some inspiration from Pitowsky correlation polytopes and the relationships between bicomplex numbers and the vertices of various interesting polytopes, such as the Birkhoff polytope (an extended formulation of the permutohedron), it struck me that using Pitowsky's construction and a few other gems (which I will get into next post) you can formulate a series of approximations that appear to provide some of the inspirational features of quantum computing.

For example, by associating a complementary probability density with the vertices of a correlation polytope using an extended formulation, an approach based on an Estimation of Distribution Algorithm (EDA) would seem to fit the purpose of getting from joint probabilities of concurrent or seriated events to the components of a solution space.  The applications would include the Travelling Salesman Problem, matrix permanents, Consecutive Ones, seriation, k-sums, and many, many other problem sets.
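To make this less hand-wavy, here is a toy sketch of a permutation EDA applied to a small random Travelling Salesman instance.  It is my own illustrative construction, not a reference implementation: the model is a row-stochastic position-marginal matrix whose uniform starting point 1/n is the barycentre of the Birkhoff polytope mentioned above, and all parameter values are arbitrary.

```python
import numpy as np

# Toy Estimation of Distribution Algorithm over permutations (illustrative only).
# The model P is a row-stochastic matrix: P[i, j] estimates the probability that
# city j sits at position i.  Fitness is the length of a toy TSP tour.

rng = np.random.default_rng(42)
n = 8
cities = rng.random((n, 2))                               # random points in the unit square
dist = np.linalg.norm(cities[:, None] - cities[None, :], axis=-1)

def tour_length(perm):
    return sum(dist[perm[i], perm[(i + 1) % n]] for i in range(n))

def sample_perm(P):
    # Sample a permutation position by position from the marginal model.
    remaining = list(range(n))
    perm = []
    for i in range(n):
        w = P[i, remaining]
        j = rng.choice(len(remaining), p=w / w.sum())
        perm.append(remaining.pop(j))
    return perm

P = np.full((n, n), 1.0 / n)                              # uniform starting model
for generation in range(60):
    population = [sample_perm(P) for _ in range(200)]
    population.sort(key=tour_length)
    elites = population[:40]                              # truncation selection
    counts = np.full((n, n), 0.1)                         # small smoothing prior
    for perm in elites:
        for position, city in enumerate(perm):
            counts[position, city] += 1.0
    P = counts / counts.sum(axis=1, keepdims=True)        # re-estimate the distribution

print("best tour length found:", round(tour_length(population[0]), 3))
```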

In short, by precompiling a probability distribution together with a correlation polytope, expressed through the extended formulation of a combinatorial polyhedron, there seems to be an opportunity for some quite clever coding that produces a non-commutative geometrical probability density useful for problem solving through approximation.

In other words, the propositions at the vertices of such a polytope are characterized by the probability density as a function of the geometry, rather than the other way around.  In the limit, these distributions are directional non-commutative probability densities on the sphere (or torus, or some other similarly smooth manifold).

I would also bet that the tangent spaces would behave more linearly than the actual kissing points of the vertices on the surface, but I will defer those thoughts to another day.

For now, let's just call these new objects (Probability) Density Polytopes, since they are a combination of a correlation polytope with a probability distribution.  If anyone out there knows what these things are already called, please let me know!

Density polytopes could carry different kinds of distributions and functions, so they could come in a variety of specifications: von Mises-Fisher, Mallows, Watson, Gaussian distributions, and so on.
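Here is one hedged reading of that idea in code: attach von Mises-Fisher style directional weights to the (normalized) vertices of the hexagonal permutohedron of {1, 2, 3}, giving a discrete density over the vertices.  The construction and parameter choices are speculative and purely illustrative.

```python
import numpy as np
from itertools import permutations

# One toy reading of a "(probability) density polytope": the vertices of the
# permutohedron of {1, 2, 3}, each weighted by a von Mises-Fisher style
# directional density evaluated at its normalized direction.

def vmf_weight(x, mu, kappa=4.0):
    # von Mises-Fisher density up to its normalizing constant
    return np.exp(kappa * np.dot(mu, x))

perms = list(permutations([1, 2, 3]))
verts = np.array(perms, dtype=float)                           # 6 hexagon vertices
dirs = verts / np.linalg.norm(verts, axis=1, keepdims=True)    # push them onto the sphere
mu = np.array([1.0, 2.0, 3.0])
mu /= np.linalg.norm(mu)                                       # mean direction: the identity ordering

weights = np.array([vmf_weight(d, mu) for d in dirs])
density = weights / weights.sum()                              # discrete density over the vertices
for perm, prob in zip(perms, density):
    print(perm, round(float(prob), 3))
```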

The Entangler

Therefore, the entangler in this case becomes the EDA itself, and the objects become the populations of density polytopes.  The features of quantum superposition and interference can occur as by-products of choosing mixtures of probability distribution functions and ensuring a means to filter and detect those potentially useful quantum-like features.  In any case, it would seem that the total system is classical and is only superficially inspired by quantum thinking.
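As a miniature, purely classical sketch of that loop: mix two vertex densities over the hexagon of permutations of {1, 2, 3}, sample from the mixture, and filter the samples with a simple predicate standing in for the detector.  The "superposition" here is only a mixture, by loose analogy.

```python
import numpy as np
from itertools import permutations

# Purely classical sketch: a 50/50 mixture of two vertex densities over the
# hexagon of permutations of {1, 2, 3}, sampled and then filtered.

perms = list(permutations([1, 2, 3]))
verts = np.array(perms, dtype=float)
dirs = verts / np.linalg.norm(verts, axis=1, keepdims=True)

def vertex_density(mean_perm, kappa=4.0):
    mu = np.array(mean_perm, dtype=float)
    mu /= np.linalg.norm(mu)
    w = np.exp(kappa * dirs @ mu)
    return w / w.sum()

mix = 0.5 * vertex_density((1, 2, 3)) + 0.5 * vertex_density((3, 2, 1))

rng = np.random.default_rng(7)
samples = rng.choice(len(perms), size=1000, p=mix)
kept = [perms[i] for i in samples if perms[i][0] != 2]   # toy filter/detector step
print(len(kept), "of 1000 samples pass the filter")
```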

Until next time: the Density Polytope.




Thursday, April 2, 2015

QuIC (Quantum Inspired Computing): Amplituhedral structures and Grassmannians


Last time I looked at some structures based on the Bloch sphere and its variants.  While the Amplituhedron is not yet a popular or well-known structure for performing quantum inspired computation, it is an amazing piece of machinery for simplifying the thousands of pages of equations that would otherwise be needed for scattering amplitude calculations (which you can read about in terms of the Amplituhedron here) into just a few, with a commensurate speedup in computation.

The reference work is:


And the best background available is this (download the "other formats" compressed file to get the Mathematica notebook): 


Of course, there is comedy online in the form of Scott Aaronson's Unitarihedron post (more here:  http://www.scottaaronson.com/blog/?p=1537 )

With all the kidding aside, there is perhaps a use for such a structure, possibly in representing a probabilistic infon that is itself a multivariate probability density over permutations (the objects being permuted can represent things like rules, identifications, tracks, stories, scenarios, and other combinatorially complex data).

One possibility is to combine, or be inspired by, geometric approaches (of which the Amplituhedron is one) together with other models such as stabilizer states, and possibly extensions of novel graph-based formalisms, within some physically inspired model (such as orbital angular momentum).

The main problem is to identify the appropriate surrogate structures that would enable a quantum inspired computing software implementation to be realized.  A rather nice view is presented in Gil Kalai's blog (all of his posts are a must-read).

Okay, so here is the prospective connection as a data structure: we can represent permutations and combinations in terms of geometric structures whose local neighborhoods represent a probability density, as long as the object (for example, a Permutohedron or Combinohedron kind of object) lives in a real subspace that acts as its embedding (this is what the Grassmannian in effect provides).

For example, for the permutations of the set {1,2,3}, there are six in total and the geometry is that of a polygon, a simple hexagon.  The permutations live at the vertices, but we can define this object (the hexagon) in terms of the lines and subspaces (in 2D or 3D) through which its vertices are defined implicitly.
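A quick sanity check of that claim, using the usual embedding of the permutohedron in R^3:

```python
import numpy as np
from itertools import permutations

# The six permutations of {1, 2, 3}, taken as points in R^3, all lie on the
# plane x + y + z = 6 and are equidistant from their centre: a hexagon.

verts = np.array(list(permutations([1, 2, 3])), dtype=float)
print(verts.sum(axis=1))                    # [6. 6. 6. 6. 6. 6.] -> a 2D affine slice

centre = verts.mean(axis=0)                 # (2, 2, 2)
radii = np.linalg.norm(verts - centre, axis=1)
print(np.round(radii, 3))                   # all equal (sqrt(2)): vertices of a hexagon
```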

For example, we can represent any point within the hexagon by an equation that amounts to a linear combination of its vertices, or any of its sides in terms of the partitioning k-planes (hyperplanes, which in this case are lines) that act as the generators of the hexagon.   This is expressed as a positivity condition.
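Here is a small sketch of that positivity condition, posed as a feasibility linear program: a point lies inside the hexagon exactly when it is a nonnegative combination of the vertices with weights summing to one.  The test point and the use of SciPy's linprog are just for illustration.

```python
import numpy as np
from itertools import permutations
from scipy.optimize import linprog

# Positivity condition as a feasibility LP: a point is inside the hexagon iff
# it can be written as a convex combination (nonnegative weights summing to 1)
# of the six vertices.

verts = np.array(list(permutations([1, 2, 3])), dtype=float)   # 6 x 3 vertex matrix
point = np.array([2.0, 2.0, 2.0])                              # the centre, clearly inside

# Find lambda >= 0 with verts.T @ lambda = point and sum(lambda) = 1.
A_eq = np.vstack([verts.T, np.ones((1, 6))])
b_eq = np.append(point, 1.0)
result = linprog(c=np.zeros(6), A_eq=A_eq, b_eq=b_eq, bounds=[(0, None)] * 6)

print("inside the hexagon:", result.success)
print("one set of convex weights:", np.round(result.x, 3))
```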

Finally, we note that any combination or permutation can be associated with this type of geometry, and that if the faces represent probability densities then, in the case of the hexagon, it is the lines connecting the vertices and their subspaces (i.e. the triangles within) that determine the probability distributions over the permutation space.
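One way to read the "triangles within" remark in code: fan-triangulate the hexagon around its centre, weight the triangles (here by area, though any nonnegative weights would do), and sample by first picking a triangle and then a uniform point inside it.  The construction is my own illustrative choice.

```python
import numpy as np
from itertools import permutations

# Fan-triangulate the hexagon of permutations of {1, 2, 3} and sample from it:
# the triangle weights are what shape the density over the polytope.

rng = np.random.default_rng(0)
verts3d = np.array(list(permutations([1, 2, 3])), dtype=float)
centre = verts3d.mean(axis=0)                              # (2, 2, 2)

# Orthonormal basis of the plane x + y + z = 6, giving 2D coordinates.
u = np.array([1.0, -1.0, 0.0]) / np.sqrt(2)
v = np.array([1.0, 1.0, -2.0]) / np.sqrt(6)
pts = np.c_[(verts3d - centre) @ u, (verts3d - centre) @ v]
pts = pts[np.argsort(np.arctan2(pts[:, 1], pts[:, 0]))]    # vertices in cyclic order

def tri_area(a, b, c):
    # absolute area from the 2D cross product of two edge vectors
    return abs((b - a)[0] * (c - a)[1] - (b - a)[1] * (c - a)[0]) / 2.0

tris = [(pts[i], pts[(i + 1) % 6], np.zeros(2)) for i in range(6)]   # fan around centre
weights = np.array([tri_area(a, b, c) for a, b, c in tris])          # any nonnegative weights work
probs = weights / weights.sum()

def sample_point():
    a, b, c = tris[rng.choice(6, p=probs)]
    r1, r2 = rng.random(2)
    if r1 + r2 > 1:                        # fold back to stay inside the triangle
        r1, r2 = 1 - r1, 1 - r2
    return a + r1 * (b - a) + r2 * (c - a)

print(np.round(sample_point(), 3))         # one 2D sample drawn from the hexagon
```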

In this way, we can use the Amplituhedron-like model to connect arbitrary, possibly hierarchical and nested structures (an "amplitu-nestohedron"-like structure) that represent conditional probability densities over factorially large spaces and for which, thanks to the Grassmannian, we can always build some embedding.  And then we can ask wild questions, as food for thought, about whether a Majorana representation, or an inspiration drawn from one, can be connected to such a geometric representation, given that in the Majorana representation a spin-J state is modelled as a set of 2J points on the Bloch sphere.

Ideally, for a quantum inspired computation model, we might like to see how we can integrate the concepts from the Bloch sphere and its variants into models inspired by the Amplituhedron.  Of course, realizing that this is a stretch and not an easy task, it is left as an exercise for the reader ;)

Until next time!