Sunday, May 10, 2015

Classical Mental Models are Incommensurate with the Quantum Mindset

The Classical and Quantum Mindset

My blog today aims to draw the reader into questioning their own mindset when attempting to conceive and imagineer new quantum-inspired or quantum computing models.  I will return shortly to the probability density polytope from the previous post.

I would like to start out with a quote from a really good book on Quantum Computing:

"Algorithm design for quantum computers is hard because designers face two difficult problems not faced in the construction of algorithms for classical computers. First, our human intuition is rooted in the classical world. If we use that intuition as an aid to the construction of algorithms, then the algorithmic ideas we come up with will be classical ideas. To design good quantum algorithms one must ‘turn off’ one’s classical intuition for at least part of the design process, using truly quantum effects to achieve the desired algorithmic end. Second, to be truly interesting it is not enough to design an algorithm that is merely quantum mechanical. The algorithm must be better than any existing classical algorithm! Thus, it is possible that one may find an algorithm which makes use of truly quantum aspects of quantum mechanics, that is nevertheless not of widespread interest because classical algorithms with comparable performance characteristics exist. The combination of these two problems makes the construction of new quantum algorithms a challenging problem for the future."   Nielsen, Michael A.; Chuang, Isaac L. (2010-12-09). Quantum Computation and Quantum Information. Cambridge University Press. Kindle Edition. 

In simple terms, this hits at the heart of the Church-Turing Thesis (which has never been seriously challenged, disproved, nor proved) - so it would seem that it will be very hard to build efficient true quantum algorithms, even harder to build quasi-quantum or pseudo-quantum algorithms, and plausibly only moderately hard to build quantum-inspired algorithms.

Ultimately, and at the foundation of current systems, whichever variety of the Church-Turing Thesis you prefer implies that any conception of a programming language must be a Turing machine.  If, however, that foundation is changed, then conceptually we are in a place where the history of science has not yet ventured, and I have no idea what that might look like.  I admit such a thing could exist, but the foundations would have to be radically different from how theories in modern physics are conceived today.

However, in attempting to get outside the box of classical thinking, it does serve some purpose to expound upon a simple concept: the Qubit and the Bit.

In reviewing the traditional understanding of a Qubit, it is often stated that a qubit is just like a classical bit except that it can probabilistically flip between a state of "0" and a state of "1".   Therefore, the question often comes up: why can we not just use a statistical (hidden) variable with a probability distribution function that yields "0" or "1" with a certain probability?

Here is the answer: because in the classical case, the probability of each outcome is already fixed in advance, while in the quantum case the outcome only comes into being when we measure and interact with the system.

This, in a deep way, implies that there is no way to describe a qubit classically. This point goes back to the essence of the quote from Nielsen and Chuang's text on Quantum Computation and Quantum Information.

Another difference is that a bit is a point-instance object while a qubit is a system: it is a quantum-mechanical system and can hold more information than just one bit. The outcome of this is that you cannot redefine a qubit as an interpolation between a function that outputs "0" and "1" in the same way that you can describe, say, a fair coin toss.  The two are simply not equivalent.
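
To make the difference concrete, here is a minimal, purely illustrative Python sketch (my own toy example, not drawn from any quantum library): a fair coin stays 50/50 no matter how many times you re-randomize it, while a qubit sent twice through a Hadamard gate interferes with itself and returns deterministically to "0".

```python
import numpy as np

# A classical "fair coin" bit: a probability vector over {0, 1}.
# Randomizing it again leaves it 50/50 -- probabilities only ever mix.
mix = np.array([[0.5, 0.5], [0.5, 0.5]])
bit = np.array([1.0, 0.0])                    # start certainly at "0"
print(mix @ (mix @ bit))                      # -> [0.5 0.5]

# A qubit: a vector of complex amplitudes over (|0>, |1>).
H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)  # Hadamard gate
psi = np.array([1.0, 0.0])                    # start in |0>
psi = H @ (H @ psi)                           # "randomize" twice
print(np.abs(psi) ** 2)                       # -> [1. 0.]: amplitudes interfere
```

No hidden-variable coin can reproduce that second behaviour, which is exactly why the qubit resists a classical redefinition.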

However, my personal view is that while we cannot yet work effectively and efficiently at the level of individual qubits (though research is making progress), what about an ensemble?  An ensemble of quantum systems will exhibit statistical properties, and the relevance of individual qubits becomes subsumed within the ensemble, so that, at least from the algorithmic viewpoint, we have indirect access to qubits through those statistics.  Is that possible?

The implication of that last statement is that ensembles, as long as they are made up of quantum systems, store individual qubits but can be indirectly manipulated via classical techniques at the level of the ensemble.  In the real world, for example, even though we do not have access to the individual quantum systems of a superconductor, we do have access to the ensemble effects of those quantum systems (i.e. its superconductivity properties), and indeed companies like D-Wave have taken advantage of this.

What about the research on individual qubits? Qubits are idealized mathematical objects, and in nature, discoveries on how to implement the qubit abstraction have included, for example, the states of electrons orbiting an atom, nuclear spins in magnetic fields, and the polarization of a photon.

And in between the pure ensemble model and the individual qubit model is the model of the topological qubit, which can be implemented in terms of states of matter (either via an ensemble or via a unit), for example in Graphene.  My personal belief is that, in the future, novel functionally graded doped versions of Graphene will yield topological qubits.

What does all this tell us about the Classical versus the Quantum mental models or mindsets that we use?  I claim that it tells us the following: that we need to base our thinking on mathematical abstractions first, and then seek to discover design models for these abstractions in terms of real-world physics, or equivalently, for us computer geeks, in terms of Turing Machine models.

Until Next Time!


Monday, May 4, 2015

Quantum Inspired Computing - the Density Polytope Quanton (and Kyndi's Computation Model)

Quantons (aka Kyndi's Model for Quantum Inspired Computing)


First, I would like to start off with a quote:

"Tensor networks provide a mathematical tool capable of doing just that. In this view, space-time arises out of a series of interlinked nodes in a complex network, with individual morsels of quantum information fitted together like Legos. Entanglement is the glue that holds the network together. If we want to understand space-time, we must first think geometrically about entanglement, since that is how information is encoded between the immense number of interacting nodes in the system." - Tensor Networks

Why I mention this is that the MERA (multiscale entanglement renormalization ansatz) model has potentially interesting surrogates in terms of classical structures. One of the computer science components at the heart of these concepts is variable orderings and permutations with respect to network structures (specifically tree tensor network diagrams).  However, none of this matters if it cannot be done fast.   So the crux is to identify the mappings between structural regularity and analytic precision, between representations like tensor networks and surrogates that are faster but essentially equivalent.
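
As a toy illustration of the kind of structure meant here (the tensor shapes and the bond dimension are arbitrary choices of mine, nothing canonical), a tiny three-node tree tensor network can be contracted with a single einsum call:

```python
import numpy as np

# A toy tree tensor network: two leaf tensors feeding one root tensor.
chi = 4                                # bond dimension (arbitrary choice)
leaf_a = np.random.rand(2, chi)        # physical index i, internal bond a
leaf_b = np.random.rand(2, chi)        # physical index j, internal bond b
root   = np.random.rand(chi, chi)      # joins the two internal bonds

# Contract the network: sum over the internal bonds a and b.
# The result is a (2, 2) tensor over the two physical indices.
state = np.einsum('ia,ab,jb->ij', leaf_a, root, leaf_b)
print(state.shape)  # (2, 2)
```

The contraction order (the "variable ordering" above) is exactly where the cost lives, which is why fast surrogates matter.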

One way to approach this is through geometry.

Tensor networks, and matroids in general, can be seen from the viewpoint of multiplihedra, associahedra and related polytopes (zonotopes), which have a very interesting structure. 

More on tensor networks and matroids in another posting - there are several intertwined concepts that still need to be surfaced.  Mathematical ideas without understanding are, in my view, junk factoids - so I will do my best to craft a path from where we are (introducing ideas, specifically an interesting object, the Quanton) to where we need to get - so please bear with me.

One of the interesting structures is the Birkhoff polytope because it corresponds to permutations (i.e. it is what is called an extended representation of the Permutohedron). 

Given a Birkhoff polytope, for example for 4 integers, it will encode the 24 permutations of the 4 integers as an extended representation of the 4-Permutohedron.  Of course, this structure has 24 vertices, but there are also 24! ways of choosing an ordering of those vertices (in other words, there is a very large number of different labeled Permutohedra!).
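
Here is a small Python sketch of my own for n = 4; the lift from permutation points to permutation matrices is the "extended representation" mentioned above, and the uniform doubly stochastic matrix at the end is the Birkhoff-von Neumann centroid of all 24 vertices:

```python
import numpy as np
from itertools import permutations

n = 4
perms = list(permutations(range(n)))
print(len(perms))                       # 24 = 4! vertices of the 4-Permutohedron

# Permutohedron vertices: the permutations of (1, 2, 3, 4) as points in R^4.
permutohedron = np.array(perms) + 1

# Birkhoff polytope vertices: the corresponding permutation matrices
# (each point-vertex lifts to an n x n matrix -- the extended representation).
def perm_matrix(p):
    m = np.zeros((n, n))
    for row, col in enumerate(p):
        m[row, col] = 1.0
    return m

birkhoff_vertices = [perm_matrix(p) for p in perms]

# Any doubly stochastic matrix is a convex combination of these vertices
# (Birkhoff-von Neumann); the uniform average is one interior point:
interior = sum(birkhoff_vertices) / len(birkhoff_vertices)
print(interior)                         # all entries 1/4
```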

The interesting element comes when the polytopes are embedded into manifolds, which could be (hyper)spheres or (hyper)tori - imagine each vertex associated with a probability density, where the tangent-density polytope is the tangent space at a vertex of the polytope associated to a probability density (the tangent space of the sphere providing a nice linear subspace, if you like).

What this is really about is precompiling a number of general relationships about a data structure, such as a certain kind of Orbitope, so that the operations are effectively "parallel" across the various general relationships that are codified within the structure: to do this, it usually means you also have to come up with a way to associate a nicely computable manifold with the orbitope - that is where the hard work comes in.  However, once you can precompile an orbitope and its embedding, some unique and powerful properties emerge, including the ability to effectively approximate some specialized quantum operations, most notable of which are Topological Quantum Computing and the braiding operations used to build gates.

I will get into how this can be achieved as a sort of "poor-man's" surrogate for a real topological quantum computing fabric (pun intended) in which the gate processes are woven (punning again!) together to provide a computing model along the lines of permutational quantum computing.

An exercise for the reader is the following question: which orbitopes admit an analytic representation that is "good enough" on a manifold, as embeddings for simplified calculations that are reversible?  More on this in the near future.  I have a conference talk to give and promised I would save this for later.

Quantum, Classical and Quantum Inspired Computing Revisited


Today, physical theory tends to dominate progress in the field of quantum computing, as theoretical quantum science is only really in its infancy. The basic thrust, unproven, is that real physically implemented quantum computers running quantum algorithms will solve the computational complexity problems and open realms of possibilities beyond classical computing. Unfortunately it isn't so. 

The realms of possibilities still remain to be discovered in the classical realm, though perhaps, taking inspiration from the quantum world.

With respect to so-called pure or true quantum algorithms, there are no proofs that I have found anywhere showing that a quantum algorithm will always outclass a well-designed classical algorithm for the same task.

If you want quantum-like computing then, perhaps, it will be achievable through new algorithms and methods as well as through more processor cores on a chip:  we still don't understand the dynamics of dense core communications, behaviours, quasi-chaos, and the distribution and access of resources in such systems.   Again, we are in our infancy.

Building a real hardware quantum computer is rife with experimental difficulties such as fault tolerance, scalability (beyond a few qubits), and demonstrating that entanglement can surmount complexity issues (since you can have classical systems that exhibit quantization, superpositioning and parallelism - all that really makes purified quantum computing unique is entanglement).  Of course, some approaches, like Topological Quantum Computing, promise to alleviate many of these issues.

But even for entanglement, we can ask if there are surrogate ideas in the form of correlation functions (not in the sense of quantum, but mimicking the fact that two observables can be strongly related) and whether or not these can be maximized in number to act-as-if there were deeper parallelism.

We suffer significant limitations even in classical complexity theory, with no proofs to guide the design of algorithms for classically hard problems - there are lots of proofs for upper and lower bounds, but where we still have to innovate is in proofs about design. Proof of classical hardness is notoriously difficult and there have been many false starts, but proofs about design are still non-existent (anyone out there know if this is truly the case? Any pointers welcome!).

Quantons


So in the realm of quantum-inspired computing, we are really forging a bridge to interconnected, strongly co-dependent data structures on which certain kinds of computations are extremely efficient and that mimic our ideas about the benefits of quantum computing:  in other words, the hard work is clever algorithm design that draws upon analogies from the hard-core science of quantum and field-theoretic physics.

So what are quantons?

Well, my blog time is over for today but I promise to introduce and explore the quantum inspired concept of Quantons.

Until next time.

Wednesday, April 29, 2015

Entanglement: Quantum Inspired Computing (QuIC) Approach to Precompilation of Entanglers

Entanglement and Entanglers

Quantum entanglement is an observable phenomenon that is dependent on strong correlations being measurable between superficially disparate things.  In a nutshell, entangled particles are precompiled (i.e. prepared) into states in which entanglement is observable and measurable.

I think of entanglement and quantum state preparation in particular as a precompilation step.  The preparation process is, in my mind, just precompilation.

"The best way to predict the future is to invent it" - Alan Kay

Note of Caution

I personally hate redefining or using new terms except where they contribute a conceptual mental model that could not otherwise be easily expressed.

My reading recommendation is to review Baez's Crackpot Index or Siegel's Quack Index.

With those words of wisdom and caution, I would like to add a few conceptual colors to terms such as Entangler and Precompilation.

Entangler

An entangler is a process that accepts un-entangled data representations and outputs entangled representations.  The representations entering and leaving the entangler may have the same or different representation models as long as the qualities of entanglement are observable and measurable after the entangler has operated on the inputs.

Precompiler

So what is precompilation?

Precompilation is a way of thinking about preprocessing and partial evaluation as a step that sets up the state of a computation so that compilation and execution are efficient.
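
As a classical analogy (nothing quantum here, just the partial-evaluation idea), here is a minimal sketch in Python: the setup stage is paid once, and the residual function runs cheaply thereafter.

```python
# "Precompilation" in the partial-evaluation sense: fix part of the input
# early, pay the setup cost once, and leave a fast residual program.
def make_poly_evaluator(coeffs):
    # Setup ("preparation") stage: normalize and freeze the coefficients.
    coeffs = tuple(float(c) for c in coeffs)

    def evaluate(x):
        # Residual stage: Horner's rule over the precompiled coefficients.
        acc = 0.0
        for c in coeffs:
            acc = acc * x + c
        return acc

    return evaluate

p = make_poly_evaluator([1, 0, -2])   # x^2 - 2, "compiled" once
print(p(1.5), p(2.0))                 # evaluated cheaply many times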

I have not seen a lot of dialog on quantum computing and the concept of precompilation though, implicitly, any state preparation could be considered precompilation.

A compiler, in the quantum sense, is then the configuration of the computing material that has been prepared and arranged into a form that runs the computational model, either generating or accepting data to produce the results of a simulation or computation.

Generic Quantum Inspired Precompilation

Okay, so here is the juice:  taking some inspiration from Pitowsky's correlation polytopes and the relationships between bicomplex numbers and the vertices of various interesting polytopes, such as the Birkhoff Polytope (an extended formulation of the Permutohedron), it struck me that using Pitowsky's results and a few other gems (which I will get into next post) one can formulate a series of approximations that appear to provide some of the inspirational features of quantum computing.

For example, by associating a complementary probability density to the vertices of a correlation polytope using an extended formulation, an approach using an Estimation of Distribution Algorithm (EDA) would seem to fit the purpose of getting at joint probabilities of concurrent or seriated events as components of a solution space:  the applications would include the Travelling Salesman Problem, Matrix Permanents, Consecutive Ones, Seriation, K-Sums, and many, many other problem sets.
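
To make the EDA idea concrete, here is a deliberately simplified sketch on a toy Travelling Salesman instance. The position-by-city marginal model below is my own minimal stand-in for the richer density-polytope machinery discussed here, not anyone's actual product method; all sizes and rates are arbitrary.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 6
cities = rng.random((n, 2))
dist = np.linalg.norm(cities[:, None] - cities[None, :], axis=-1)

def tour_length(p):
    return sum(dist[p[i], p[(i + 1) % n]] for i in range(n))

def sample_perm(P):
    # Sample a permutation position by position, renormalizing the
    # marginals over the cities not yet placed.
    perm, free = [], list(range(n))
    for pos in range(n):
        w = P[pos, free]
        choice = rng.choice(len(free), p=w / w.sum())
        perm.append(free.pop(choice))
    return perm

P = np.full((n, n), 1.0 / n)              # P[pos, city]: positional marginals
for gen in range(50):
    pop = sorted((sample_perm(P) for _ in range(200)), key=tour_length)
    elite = pop[:20]
    # Re-estimate the distribution from the elite sample (with smoothing):
    counts = np.full((n, n), 0.1)
    for p in elite:
        for pos, city in enumerate(p):
            counts[pos, city] += 1.0
    P = counts / counts.sum(axis=1, keepdims=True)

print(tour_length(pop[0]))                # best tour found
```

The "estimate, sample, select" loop is the essence of every EDA; the density-polytope idea amounts to replacing these flat marginals with a geometrically structured distribution over the permutation polytope.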

In short, by precompiling a probability distribution with a correlation polytope, in terms of the extended formulation of a combinatorial polyhedron, the opportunity seems to exist to do some quite clever coding and produce a non-commutative geometrical probability density that could be useful for problem solving through approximation.

In other words, the propositions at the vertices of such a polytope are characterized by the probability density as a function of the geometry, rather than the other way around.   In the limit, these distributions are directional non-commutative probability densities on the sphere (or torus or some other similar smooth manifold).

I would also bet that the tangent spaces would be better for linearity than the actual kissing point of each vertex on the surface, but I will defer those thoughts to another day.

For now, let's just call these new objects (Probability) Density Polytopes, since they are a combination of a correlation polytope with a probability distribution.  If anyone out there knows what these things are called, please let me know!

Density polytopes could have differing kinds of distributions and functions, so they could come in a variety of specifications: von Mises-Fisher, Mallows, Watson, Gaussian distributions, etc.
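
As a sketch of what a density polytope might look like in code (the octahedron vertices, mean direction, and concentration parameter are all arbitrary illustrative choices of mine), here is a von Mises-Fisher density assigning mass to sphere-embedded vertices:

```python
import numpy as np
from scipy.special import iv  # modified Bessel function of the first kind

def vmf_density(x, mu, kappa):
    # von Mises-Fisher density on the unit sphere S^(p-1) in R^p.
    p = len(mu)
    norm = kappa ** (p / 2 - 1) / ((2 * np.pi) ** (p / 2) * iv(p / 2 - 1, kappa))
    return norm * np.exp(kappa * np.dot(mu, x))

# Weight the (sphere-embedded) vertices of a polytope by the vMF density:
verts = np.array([[1, 0, 0], [0, 1, 0], [0, 0, 1],
                  [-1, 0, 0], [0, -1, 0], [0, 0, -1]], dtype=float)  # octahedron
mu = np.array([1.0, 0.0, 0.0])    # mean direction
weights = np.array([vmf_density(v, mu, kappa=4.0) for v in verts])
print(weights / weights.sum())    # a "density polytope": mass per vertex
```

Swapping in a Mallows or Watson density is just a change of the kernel; the polytope-plus-distribution pairing is what makes the object.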

The Entangler

Therefore, the entangler in this case becomes the EDA itself, and the objects become populations of density polytopes.  The features of quantum superposition and interference can occur as by-products of choosing probability distribution function mixtures and ensuring a means to filter and detect those potentially useful quantum features.  In any case, it would seem that the total system is classical and is only superficially inspired by quantum thinking.

Until next time: the Density Polytope.




Thursday, April 2, 2015

QuIC (Quantum Inspired Computing): Amplituhedral structures and Grassmannians

QuIC (Quantum Inspired Computing): Amplituhedral structures and Grassmannians

Last time I looked at some structures based on the Bloch Sphere and its variants.  While the Amplituhedron is not yet a popular or well-known structure for performing quantum-inspired computation, it is an amazing piece of machinery for simplifying the thousands of pages of equations that would otherwise be needed for scattering amplitude calculations (which you can read about in terms of the Amplituhedron here) into just a few (with a commensurate speedup in computation).

The reference work is:


And the best background available is this (download the "other formats" compressed file to get the Mathematica notebook): 


Of course, there is comedy online in the form of Scott Aaronson's Unitarihedron post (more here:  http://www.scottaaronson.com/blog/?p=1537 )

With all kidding aside, there is perhaps a use for such a structure, possibly in representing a probabilistic infon that is itself a multivariate probability density over permutations (the objects of the permutation can represent things like rules, identifications, tracks, stories, scenarios, and other combinatorially complex data).

One possibility is to combine, or be inspired by, geometric approaches (which the Amplituhedron is) with other models such as stabilizer states, and possible extensions of novel graph-based formalisms, within some physically inspired model (such as Orbital Angular Momentum). 

The main problem is to identify the appropriate surrogate structures that would enable a quantum inspired computing software implementation to be realized.  A rather nice view is presented in Gil Kalai's blog (and all of his blogs are a must-read).

Okay, so here is the prospective connection as a data structure:  we can represent any permutation and combination in terms of geometric structures that have a local neighborhood representing a probability density, as long as this object (for example, a Permutohedron or Combinohedron kind of object) lives in a real subspace that acts as its embedding (this is what the Grassmannian in effect provides). 

For example, for the permutations of the set {1,2,3}, there are six in total and the geometry is that of a polygon, a simple hexagon.  The permutations live at the vertices, but we can define this object (the hexagon) in terms of its lines and subspaces (in 2D or 3D), through which its vertices are defined implicitly.

For example, we can represent any point within the hexagon by an equation that amounts to a linear combination of its vertices, or any of its sides in terms of its partitioning k-planes (hyperplanes, which in this case are lines) that act as the generators of the hexagon.   This is expressed as a positivity condition.
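
A small sketch of that positivity condition in Python (the projection basis is one arbitrary orthonormal choice for the plane x + y + z = 6 on which the six permutations live):

```python
import numpy as np
from itertools import permutations

# The six permutations of {1,2,3} lie on the plane x + y + z = 6;
# project onto an orthonormal basis of that plane to get the hexagon.
u = np.array([1, -1, 0]) / np.sqrt(2)
v = np.array([1, 1, -2]) / np.sqrt(6)
verts3d = np.array(list(permutations([1, 2, 3])), dtype=float)
verts = np.stack([verts3d @ u, verts3d @ v], axis=1)

# Order the vertices counterclockwise around the centroid.
c = verts.mean(axis=0)
order = np.argsort(np.arctan2(*(verts - c).T[::-1]))
hexagon = verts[order]

def inside(q):
    # The "positivity condition": q is in the hexagon iff it lies on the
    # non-negative side of every edge (all 2D cross products >= 0).
    e = np.roll(hexagon, -1, axis=0) - hexagon
    r = q - hexagon
    return bool(np.all(e[:, 0] * r[:, 1] - e[:, 1] * r[:, 0] >= 0))

print(inside(c))                          # True: the centroid is interior
print(inside(c + np.array([10.0, 0.0])))  # False: far outside
```

The half-plane (edge) tests are exactly the "generators" of the hexagon: the vertices never appear explicitly in the membership test.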

Finally, we note that any combination or permutation can be associated with this type of geometry, and that if the faces represent probability densities then, in the case of the hexagon, it is the lines connecting the vertices and their subspaces (i.e. the triangles within) that determine the probability distributions of the permutation space.

In this way, we can use the Amplituhedron-like model to connect arbitrary, possibly hierarchical and nested structures (aka an "amplitu-nestohedron" like structure) that represent conditional probability densities of factorially large spaces and for which, thanks to the Grassmannian, we can always build some embedding.  And then, as food for thought, we can ask wild questions about whether a Majorana Representation, or an inspiration drawn from one, can be connected to such a geometric representation, given that in the Majorana representation a spin-J system state is modelled as a set of 2J points on the Bloch sphere.

Ideally, for a quantum inspired computation model we might like to see how we can integrate the concepts from Bloch-Spheres and its variants into models inspired by the Amplituhedron.  Of course, realizing that this is a stretch and not an easy task, it is left as an exercise for the reader ;)

Until next time!

Tuesday, March 17, 2015

Quantum Inspired Computing (QUIC) : The Bloch Sphere and its Variations

 Quantum Inspired Computing (QUIC) - 1

Last time we made a list of the types of models for data structures for quantum entities.  The first item on the list was entitled "Bloch-Hyperspheres in the sense of extensions of the Bloch-Sphere".

Let's unpack what that means in the context I am thinking about.

In terms of a short list it means how to represent quantum entities in software using models drawn from:

1. The traditional Bloch Sphere (aka the Poincaré sphere, aka the Qubit).
2. The Riemann Sphere.
3. The Majorana Representation (and as points on the Riemann Sphere).
4. The Quasi-Probabilistic Frame Representation.
5. Other possibilities for representation (including ad-hoc computer languages).

1. Bloch-Sphere Qubit

The two-level quantum bit is called a Qubit, and the data representation that is useful for computations is known as the Bloch or Poincaré sphere, depending on whether you are talking to physicists or mathematicians.  There is already so much information online about the Bloch Sphere that the reader can find many articles and introductions to this basic representation.
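
For completeness, here is the standard mapping from a qubit's two angles to its Bloch-sphere point, as a minimal sketch:

```python
import numpy as np

def bloch_vector(theta, phi):
    # |psi> = cos(theta/2)|0> + e^(i*phi) sin(theta/2)|1>
    # maps to the point (sin t cos p, sin t sin p, cos t) on the unit sphere.
    return np.array([np.sin(theta) * np.cos(phi),
                     np.sin(theta) * np.sin(phi),
                     np.cos(theta)])

print(bloch_vector(0.0, 0.0))        # |0> -> north pole (0, 0, 1)
print(bloch_vector(np.pi, 0.0))      # |1> -> south pole (0, 0, -1)
print(bloch_vector(np.pi / 2, 0.0))  # (|0>+|1>)/sqrt(2) -> equator (1, 0, 0)
```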

The problem with the Bloch sphere is that it is solely a representation for a two-level system.

2. The Riemann Sphere

The Riemann Sphere generalizes the Bloch Sphere and can handle d-dimensional quantum entities (Qudits).   The advantage is that the representation is a well-known mapping, so all the tools of algebra and geometry, as well as directional statistics, can be used.
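
The mapping itself is just inverse stereographic projection; a minimal sketch:

```python
import numpy as np

def riemann_point(z):
    # Inverse stereographic projection: a complex number z maps to a
    # point on the unit sphere (the Riemann sphere).
    d = 1.0 + abs(z) ** 2
    return np.array([2 * z.real / d, 2 * z.imag / d, (abs(z) ** 2 - 1) / d])

print(riemann_point(0 + 0j))   # the origin -> south pole (0, 0, -1)
print(riemann_point(1 + 0j))   # the unit circle -> the equator (1, 0, 0)
# A qubit a|0> + b|1> corresponds to the ratio z = b / a (for a != 0).
```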

3. Majorana Representation 

The Majorana Representation provides a fully generalized representation for quantum entities. The representation dates back to 1932 and has a permutation and symmetry underpinning with nice mathematical properties that are useful in software, such as ease of implementation and visualization.  In fact, a good and easy-to-read review can be found here. The Majorana representation forms a complete, visually appealing representation of the wave function and therefore carries complete quantum information about the state of the system.
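
A minimal sketch of the idea follows. Note that sign and ordering conventions for the Majorana polynomial vary between authors; this is one common choice, and the handling of roots at infinity is my own.

```python
import numpy as np
from math import comb

def majorana_stars(c):
    # Amplitudes c = (c_j, c_{j-1}, ..., c_{-j}) of a spin-j state map to
    # the roots of the Majorana polynomial
    #   p(z) = sum_k (-1)^k sqrt(C(2j, k)) c_{j-k} z^(2j-k)
    # (one common convention); each root then maps to a sphere point.
    d = len(c) - 1                       # d = 2j
    coeffs = np.array([(-1) ** k * np.sqrt(comb(d, k)) * c[k]
                       for k in range(d + 1)], dtype=complex)
    # Leading zero coefficients mean roots at infinity, i.e. stars at the
    # north pole, which np.roots silently drops -- count them explicitly.
    lead = int(np.argmax(coeffs != 0))
    stars = [(0.0, 0.0, 1.0)] * lead
    for z in np.roots(coeffs[lead:]):
        n = 1.0 + abs(z) ** 2
        stars.append((2 * z.real / n, 2 * z.imag / n, (abs(z) ** 2 - 1) / n))
    return stars

# Spin-1 state |1, m=0>: amplitudes (0, 1, 0) -> antipodal stars at the poles.
print(majorana_stars([0.0, 1.0, 0.0]))
```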

4. Quasi-Probabilistic Frame Representation 

The paper on arXiv by C. Ferrie and his thesis provide the best introduction to the idea of extending probabilistic frames to the quantum case.   The representations hinge on the connection between classical phase space and quantum states through quasi-probability distributions.  A tutorial can be found here.  As stated in the tutorial, "It furnishes a third, alternative, formulation of quantum mechanics, independent of the conventional Hilbert space, or path integral formulations."

5. Other Possibilities for representation  

There are several other, more esoteric and less well understood models for representation that the daring may seek out.  For example, the paper Quantum Theta Functions and Gabor Frames for Modulation Spaces expresses one of these more esoteric approaches.  For an overview of using Haskell to represent quantum computing, see this paper, or this one on logic programming in pure Prolog as a way to compute, or this interesting paper on quantum-inspired Interclausal Variables.

Which Choice?

At this point it is still too early to state a choice, because choices will invariably be tied to the context of data processing or the types of problems being addressed.

Until next time!

Sunday, March 1, 2015

Data and Algorithm Part-1 (Continued): Quantum Inspired System Semantics (QUISS) - Overview

Quantum Inspired Computing (QUIC)

There is a lot of information available on the Internet for the definitions of Qubits, Qutrits and Qudits so there is no need to repeat those here.

What is not discussed is that these objects are all geometric in nature and have more than one dimensionality to them.

There are many options for quantum-inspired computing (QUIC), and for a QUIC list of options, we can choose to represent the properties of data using one, some combination, or all of our own short selection of ten such options:
  1. Bloch-Hyperspheres, in the sense of extensions of the Bloch-Sphere;
  2. Amplituhedral structures and Grassmannians, in the sense of Trnka, Postnikov, Bourjaily et al., in which volumes produce probabilities;
  3. Polyhedral and Combinatohedral structures (e.g. the Permutohedron), in which directional probabilities are represented by permutation polyhedra where each vertex represents a permutation (there are N! vertices for an N-element permutation);
  4. Topological structures such as Topoi, simplices and generalized maps, in which involutions and functions with higher-order symmetries (complex involutions) define the skeletal structures;
  5. Non-Binary Base Numbers (Complex, Figural, Tree, Functional and Mixed Radix), in which properties of big numbers that can represent the Goedel numberings of various structures are combined with probabilities (for example, the real and imaginary parts treated as one single whole but entwining different conceptual bases);
  6. Quantum random walks and quasi-quantum-like stochastic walks on classical structures like graphs or lattices that represent properties of the data of interest (see the sketch after this list);
  7. Field-Structured Representations and quasi-quantum/analog representations, such as particle swarms, represented in the complex plane as well as the real plane;
  8. Quantum-like entanglement, defined as any correlation in complementary bases of representation of information.  For example, measures of discord or mutual ignorance may co-correlate with measures of informativeness, and these may produce some quantum-like effects (though not true quantum entanglement ... we are, after all, working on QUIC);
  9. Virtual machine designs and architectures that represent quantum-like properties, such as variables that entangle (as high-order "sharing") or produce uncertainty between clause definitions, or reflective simulations of quantum particles as computational analogs of quantum processes (such as treating text in terms of Bose-Einstein condensates);
  10. Genetic, parallel, distributed systems as quantum analogs of real quantum systems.
These are just some ideas for top-level concepts for representing quantum inspired data structures that encapsulate the desirable properties of being able to quantize, superpose and correlate information at higher dimensions (i.e. entangle) in a possibly useful way.
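
As one concrete instance of option 6, here is a minimal discrete-time coined quantum walk on a line (a standard textbook construction; the lattice size, step count, and initial coin are arbitrary choices of mine):

```python
import numpy as np

# Discrete-time coined quantum walk on a line of 101 sites.
steps, n = 40, 101
H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)      # Hadamard coin

# State: amplitude[site, coin], with coin 0 meaning "step left", 1 "right".
amp = np.zeros((n, 2), dtype=complex)
amp[n // 2, 0] = 1.0 / np.sqrt(2)
amp[n // 2, 1] = 1j / np.sqrt(2)                  # symmetric initial coin

for _ in range(steps):
    amp = amp @ H.T                               # toss the coin everywhere
    shifted = np.zeros_like(amp)
    shifted[:-1, 0] = amp[1:, 0]                  # coin-0 amplitude moves left
    shifted[1:, 1] = amp[:-1, 1]                  # coin-1 amplitude moves right
    amp = shifted

prob = (np.abs(amp) ** 2).sum(axis=1)
sites = np.arange(n)
mean = (sites * prob).sum()
std = np.sqrt(((sites - mean) ** 2 * prob).sum())
print(std)  # spread grows linearly in steps (ballistic), vs sqrt(steps) classically
```

The ballistic spread is the classically simulable "quantum-like" property that makes such walks interesting as data-exploration primitives.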

For other sources of data structures and examples the following sources are also very useful and inspiring:

Quantum Computing since Democritus  and its companion website online course.
Quantum Machine Learning: What Quantum Computing Means to Data Mining
Principles of Quantum Artificial Intelligence

There are, of course, many sources around, but we shall attempt to look into a few algorithms and representations in the coming months to see what the possibilities can be.




Sunday, February 15, 2015

Data and Algorithm Part-1: Quantum Inspired System Semantics (QUISS) - Overview

 

Introduction


The next set of posts will introduce several elements that build a mindset, or mental model, and a point of view about quantum inspired system semantics, focusing on the actual path to building operational quantum-like data representations, processing and algorithms.

Quantum Inspired System Semantics, or QUISS for short, follows the KISS principle.  That is, I will do my best to get to the point quickly and efficiently.  However, let's not forget what Feynman and Nielsen said about trying to explain quantum computing or other challenging emerging science.  If it could be written and explained to everyone or anyone, then it would not be new but commonplace and old-hat.

In terms of the approach, we will take things from the beginning and provide some concretely usable concepts, in several cases implementations as well, and example data inputs/outputs and value.

Bits, Bytes and the Qubit

 

Traditional computing uses the familiar abstraction of the Bit over its implementation as an electrical circuit, or even a mechanical circuit (like Turing's original designs) whose voltage or mechanical lever is either above or below a certain threshold.

The concept of aggregating a number of bits, in order, using the binary base numbering system was all that was needed to complete the most basic and fundamental data structure:  the byte.  It is on bits and bytes that all of modern computing rests.

From a linguistic perspective it was natural, therefore, that a name containing the word "bit" would be chosen as the simplest term for a quantum datum.   But unlike the word "Bit", which is crisp, clear and simple, with an underlying abstraction built on electrical or mechanical circuits, adding the word "Quantum" changes everything, so much so that the word "QuBit" leads one into a morass.

The best explanation is not the Wiki Qubit, but the David Deutsch article "It from Qubit".

A Qubit is neither crisp, clear nor simple: it is a superposition of states of objects representing states, such as state vectors whose elements are some other sorts of mathematical objects that have nothing to do with voltage thresholds or the positions of mechanical levers.

In fact, the Qubit concept as you find it online incorporates two notions:  first, that there is a probability scale between the elements that make up a qubit, and second, that knowledge of the first element automatically implies knowledge of the second.  So, as an example, consider the question of whether it will rain tonight: instead of a classical "yes" / "no" stated as binary {1} or {0}, you have a range [0, 1] representing the probability of the outcomes that it will rain or not rain.  But the underlying elements of a qubit are not probabilities: they are wavefunction amplitudes, and only when the amplitudes are squared in magnitude does the complex plane drop away to produce a real value that is then interpreted as a probability.
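
A tiny sketch of that last point: two states with identical measurement probabilities but different relative phases are distinguishable after a rotation, which no single classical probability value in [0, 1] could capture.

```python
import numpy as np

# Two "will it rain" qubits, both 50/50 when measured directly...
plus  = np.array([1,  1]) / np.sqrt(2)   # amplitudes for (|rain>, |no rain>)
minus = np.array([1, -1]) / np.sqrt(2)   # same magnitudes, opposite phase
print(np.abs(plus) ** 2, np.abs(minus) ** 2)   # both [0.5 0.5]

# ...yet one rotation (a Hadamard) separates them perfectly: the relative
# phase, invisible to any classical probability, is real information.
H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)
print(np.abs(H @ plus) ** 2)    # [1. 0.]  -> certainly rain
print(np.abs(H @ minus) ** 2)   # [0. 1.]  -> certainly no rain
```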

The issue is that the underlying abstraction for a qubit is not a circuit but a complex vector space, and, in turn, the underlying abstraction of that complex vector space is the mental model of a quantum entity as inferred from field theory and other branches of physics.

A quantum observable is neither a real variable, like a classical degree of freedom or a probability measurement, nor a discrete variable, like a classical bit or a random choice, but a much more fascinating and deeply significant object that entangles and integrates all the discrete and continuous aspects across all the dimensional, as well as non-dimensional but measurable, phenomena.

The implication is actually staggering:  how does a quantum entity make the transition from one state to another?  Does it just "jump" between states? The answer given by quantum theory is incredible: the transition is made continuously but is only observable discretely!

The point here is that the Qubit is nothing like the Bit, not even remotely.  It would, in my opinion, have been better to choose some other, completely different name to avoid the confusion this now produces. 

Qubits, Qutrits, Qudits

The key goal in representation is to figure out which data structures are most useful and enable superposition, quantization and entanglement.  If we take a step back to see the forest for the trees, we realize that combinatorial structures and geometries have a utility all their own:  Alexander Grothendieck developed a theory of "places", formally called Topos Theory, as a replacement for and an advancement of the theory of categories, in which the intuitive patterns of geometry could be localized and captured in a model for reuse.

The major idea we are presenting here is that geometry is intimately tied to combinatorics, which is the key to the quantum probabilistic description of information structure.
 

[TO BE CONTINUED]