Complexity Analysis

A while ago I was trying to pick my first software-as-a-service project. Unfortunately that is a hard thing, but I'm persistent. So, as a researcher, I'm going to tell you what this topic is about and what really matters in it.

I'm going to start with three basic categories of running time.

  1. Exponential Time
  2. Linear Time
  3. Log Time

These categories describe how long an algorithm takes as a function of its input size. Most everyday cases fall into 2 and 3; today I'll focus on 1.
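To make the three categories concrete, here is a small sketch. The formulas are my own illustrative stand-ins (binary search for log time, a single scan for linear time, trying every subset for exponential time), not measurements of real programs:

```python
import math

def steps_log(n):
    """Roughly how many steps binary search takes on n items."""
    return max(1, math.ceil(math.log2(n)))

def steps_linear(n):
    """Scanning a list of n items once: n steps."""
    return n

def steps_exp(n):
    """Trying every subset of n items: 2**n candidates."""
    return 2 ** n

for n in (10, 20, 40):
    print(f"n={n}: log={steps_log(n)}, linear={steps_linear(n)}, exp={steps_exp(n)}")
```

Notice how the exponential column explodes while the other two stay manageable; that gap is the whole story of this post.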

Exponential time refers to algorithms whose running time blows up as the input grows; for example, the input could be a file's size. This category belongs to intractable problems, for which most, and maybe all, engineers have no idea how to reduce the time per task.
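A classic example of this category is subset sum: given some numbers, find a subset adding up to a target. The only obvious approach, sketched below, checks every subset, and with n numbers there are 2**n subsets:

```python
from itertools import combinations

def subset_sum_bruteforce(nums, target):
    """Try every subset of nums: 2**len(nums) candidates, i.e. exponential time.
    Returns one matching subset as a list, or None if there is none."""
    for r in range(len(nums) + 1):
        for combo in combinations(nums, r):
            if sum(combo) == target:
                return list(combo)
    return None

print(subset_sum_bruteforce([3, 9, 8, 4, 5, 7], 15))
```

Add ten more numbers and the work grows about a thousandfold; that is what "exponential" means in practice.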

This is the famous P vs NP problem, one of the Millennium Prize Problems. The question can be reduced to: is it possible to take a problem's solution from category 1 down to category 2? We know that computing the output is possible; what we don't know is whether it can be done in efficient time, or, put simply, quickly.
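One way to see the asymmetry behind P vs NP, using the subset-sum problem as a stand-in for the whole class: *checking* a proposed answer is cheap, even though *finding* one seems to take exponential time. A minimal sketch:

```python
from collections import Counter

def verify_certificate(nums, target, candidate):
    """Check a proposed subset-sum answer in time roughly proportional
    to the input size (fast), even though finding `candidate` in the
    first place may require exponential search."""
    available = Counter(nums)
    chosen = Counter(candidate)
    uses_only_given_numbers = all(available[x] >= c for x, c in chosen.items())
    return uses_only_given_numbers and sum(candidate) == target

print(verify_certificate([3, 9, 8, 4, 5, 7], 15, [8, 7]))  # True
print(verify_certificate([3, 9, 8, 4, 5, 7], 15, [9, 9]))  # False
```

Problems whose answers can be verified quickly like this are "in NP"; P vs NP asks whether quick verification implies quick finding.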

Exponential time is the barrier between Big Data and classic software, for the simple reason that even a small amount of data can require days or years per task.
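Here is a back-of-the-envelope estimate of that barrier. Assuming a machine that performs about a billion simple steps per second (an assumption I picked for round numbers), 2**n steps go from instant to hopeless very fast:

```python
OPS_PER_SECOND = 10 ** 9  # assumption: ~1 billion simple steps per second

def years_for_exponential(n, ops_per_second=OPS_PER_SECOND):
    """Rough wall-clock time, in years, to perform 2**n steps."""
    seconds = 2 ** n / ops_per_second
    return seconds / (60 * 60 * 24 * 365)

for n in (30, 50, 80):
    print(f"n={n}: about {years_for_exponential(n):.2e} years")
```

With n = 30 the task finishes in about a second; with n = 80, a mere 50 items more, it takes tens of millions of years.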

The bad news is that finding an efficient way to get these results would be a problem for security: it could break ciphers like RSA and ECC. There would be no limit once we cross that barrier.
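To see why: RSA's security rests on the assumption that factoring a huge number is infeasible. The naive method below tries divisors up to the square root, which is about 2**(bits/2) candidates, exponential in the key's bit length. If someone found a fast route from category 1 to category 2 for problems like this, that assumption would collapse:

```python
def trial_division(n):
    """Factor n by trying divisors up to sqrt(n).
    sqrt(n) is about 2**(bits/2), so the runtime grows exponentially
    in the bit length of n; hopeless for real RSA-sized moduli."""
    factors = []
    d = 2
    while d * d <= n:
        while n % d == 0:
            factors.append(d)
            n //= d
        d += 1
    if n > 1:
        factors.append(n)  # whatever remains is prime
    return factors

print(trial_division(3 * 5 * 7))  # → [3, 5, 7]
```

A 2048-bit RSA modulus would need on the order of 2**1024 trial divisions, far beyond any hardware.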

The good news is that we are getting close. Keep reading :).