# P≠NP

Now that I have your full and undivided attention...

## P→NP

**P≠NP**, but **P → NP** as a matter of Dimensions of Complexity. Every problem can be categorized by the number of Dimensions of its Challenge (think Domain), the number of Dimensions of its Solution (think Range), and the difference between those two numbers. If you've ever wondered why GPUs are so good at helping CPUs solve certain problems, this is why.

## GPUs Rescue the Travelling Salesman!

GPUs use Hamiltonian operators to handle 3D environments, which allows the simultaneous comparison of three values in an orthogonal relationship. This vastly improves the computational power of any algorithm that can make good use of such comparisons -- but does it go far enough? In the Travelling Salesman problem, each route can be compared against any other route in a right triangle, giving an imaginary value to the preference of one route over the other, whether the two are deemed mutually exclusive or can be used in the same solution. If we could compare the solution values of **all** routes simultaneously, all that would remain would be to compute a route that makes the best use of these comparisons -- that is, one that gives the optimum solution value for the problem as stated. This puts the number of comparisons to be made somewhere on the order of the number of route pairs, and the number of routes itself grows factorially with the number of cities.
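To make the cost of "comparing all routes" concrete, here is a minimal brute-force sketch of my own (the city names, coordinates, and helper functions are invented for illustration, not taken from the text): it enumerates every tour of a few made-up cities and keeps the shortest. Even with the first city pinned to remove rotations, the tour count is (n−1)!, which is why exhaustive comparison collapses so quickly.

```python
from itertools import permutations
import math

# Made-up coordinates for a tiny example instance.
cities = {"A": (0, 0), "B": (1, 5), "C": (4, 2), "D": (6, 6)}

def tour_length(order):
    # Total distance of the closed tour visiting cities in the given order.
    pts = [cities[c] for c in order] + [cities[order[0]]]
    return sum(math.dist(p, q) for p, q in zip(pts, pts[1:]))

names = list(cities)
# Fix the first city so rotations of the same tour aren't counted twice.
tours = [(names[0],) + p for p in permutations(names[1:])]
best = min(tours, key=tour_length)
print(len(tours), best, round(tour_length(best), 3))
```

With 4 cities this is only 3! = 6 tours; with 20 cities it is already 19! ≈ 1.2 × 10¹⁷, which no amount of orthogonal comparison hardware enumerates outright.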

If , then we only need H operators for D dimensions. (I'm definitely certain that's a silly-ass guess.)

## Doing the Impossible

If we think about the **absolutely impossible problem** of Trisecting the Angle (**P;D**), there is something to be learned here as well. Think of a 3D version of the problem which **can** be solved, then project a **shadow** of that solution back into the original 2D space: we arrive at an **approximation** of a solution in 2D, but we just can't get an **exact** one. So the Challenge is posed in 2D, there is a transformation into 3D and a solution there, and then that solution is projected back into a 2D Solution ("Range", remember?), where the best solution **to be had** is then available.
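One way to make that "shadow approximation" concrete (my own sketch, using a standard fact rather than anything from the text above): exact trisection is impossible with compass and straightedge, but repeated *bisection* is constructible, and since 1/3 = 1/4 + 1/16 + 1/64 + ..., a sequence of bisections converges to the trisection without ever reaching it exactly.

```python
# Approximate one third of an angle by summing the series
# angle * (1/4 + 1/16 + 1/64 + ...), each term of which is
# reachable by compass-and-straightedge bisections.
def approx_third(angle, steps):
    return sum(angle * 0.25 ** k for k in range(1, steps + 1))

angle = 90.0
for n in (1, 3, 6, 10):
    print(n, approx_third(angle, n), abs(angle / 3 - approx_third(angle, n)))
```

The error shrinks by a factor of 4 per term: an arbitrarily good approximation, never an exact solution -- exactly the situation described above.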

## Fermat's Last Theorem

Our options in cubes are:

a³ + b³ = c³ (no whole-number solutions, per FLT), or

a³ + b³ + c³ = d³, for example 3³ + 4³ + 5³ = 6³

(yep, 27 + 64 + 125 = 216 -- check it yourself)

so that if we use Hamiltonian operations, problems like FLT can be re-checked with greater confidence, and even thornier problems can be resolved using roughly the same methods.
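The arithmetic above is easy to check mechanically. Here is a small search of my own devising (the ranges and variable names are arbitrary choices) for sums of three cubes that are themselves perfect cubes -- an illustration only, since FLT proper concerns the two-term case:

```python
# Map each cube up to 49^3 back to its root for O(1) lookup.
cubes = {n ** 3: n for n in range(1, 50)}

assert 3**3 + 4**3 + 5**3 == 6**3  # 27 + 64 + 125 = 216

# Find all a <= b <= c below 30 whose cubes sum to a perfect cube.
hits = []
for a in range(1, 30):
    for b in range(a, 30):
        for c in range(b, 30):
            s = a**3 + b**3 + c**3
            if s in cubes:
                hits.append((a, b, c, cubes[s]))
print(hits[:5])
```

The search confirms (3, 4, 5, 6) and also turns up others, such as (1, 6, 8, 9), i.e. 1 + 216 + 512 = 729 = 9³.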

## Cubology and Group Theory

If a maximum of 27 moves is necessary to solve any given cube (25 moves being the most frequent), what then are H and D for this problem, and for any problem in the same class?
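Whatever the exact bound, an information-theoretic sanity check is possible. The figures below are standard published numbers, not taken from this article: the 3×3×3 cube has about 4.3 × 10¹⁹ reachable states, each move is one of 18 face turns, and n moves can therefore reach at most 18ⁿ states -- giving a lower bound on the worst-case move count (the true value, "God's number", is known to be exactly 20 in this metric).

```python
import math

states = 43_252_003_274_489_856_000  # reachable positions of the 3x3x3 cube
moves_per_step = 18                  # 6 faces x 3 turn amounts

# Smallest n with 18^n >= states, i.e. ceil(log_18(states)).
lower_bound = math.ceil(math.log(states) / math.log(moves_per_step))
print(lower_bound)
```

The bound comes out to 16, comfortably below both 20 and the 27 supposed above -- counting arguments alone can't pin the answer down, which is rather the point of asking what H and D are.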

## Further Considerations

I plan to make a scan of all of my musings available as soon as is practical, but in the meantime, let's look at the problem set to us in terms of whether P and NP are equal or not.

All previous attempts at constructing mathematical expressions have been rooted in a high regard for simplicity of expression and computation, rather than in a prioritization of accuracy and precision; since we really haven't **historically** had this much computing power at our fingertips, that's to be expected. We are now in a position, however, to demand ever higher, even seemingly arbitrary, levels of precision. Remember that the square root of 2 was an **absolute scandal** in Pythagoras' day! (I was there, that's how I know. P;D)

## Economics and Computability

In Economics, we would like to control some of the fluctuations in values that can cause economic hardship, or difficulties in designing and enforcing foreign, domestic, or trade policy. If, however, we are projecting our efforts from a space with a small number of dimensions into a space with a far larger number of dimensions, then the nature and magnitude of the effort applied will have to be selected with a great deal of care as well, or the effort may be entirely diluted in the larger space. We might easily stir the cauldron, but is our control transient, and largely illusory? Thus, computational effort can be shown to be irreplaceable in many policy decisions, academic fields, and other similar endeavors.

As a rough guess, something like 90% of global computing power either concerns negotiable instruments directly or tracks data associated with such instruments. That is, money has become a grammar of computability, and Economics becomes a study of those forms of Computability which can be translated directly and easily to and from that grammar. This is probably the first, and currently the best, application of orthogonal value computation, but probably not the last. All of the social sciences could benefit from analytic and synthetic tools which presume individuality of views (many tiny levers of power, in an extremely large number of dimensions) rather than a small set of objective standards (a few big levers of power in few dimensions). Our increasing Autonomy as Individuals is reflected in our growing awareness of the miniaturization of institutions and institutional power. (INSERT EVIL CHUCKLE HERE)