Once an algorithm is given for a problem and decided (somehow) to be correct, the next step is to determine how many resources it requires. Processes that run on real computers have finite resources, and they consume mainly two of them:
- Processing time
- Space, or memory use
[Figure: Space and Time Battle in the CPU]
An algorithm that solves a problem but requires a year to run is hardly of any use. Likewise, an algorithm that requires several gigabytes of main memory is not useful on most machines (at least at the time of this writing). The method of determining the efficiency of algorithms that allows us to rate them independently of platform-dependent timings is called complexity analysis.
When run on the same problem or data sets, processes that consume fewer resources are of higher quality than processes that consume more, and so are the corresponding algorithms. Some algorithms consume an amount of time or memory that falls below a threshold of tolerance; for example, most users are happy with any algorithm that loads a file in less than one second.
When choosing algorithms, we often have to settle for a space/time tradeoff: an algorithm can be designed to gain faster run times at the cost of using extra space (memory), or the other way around. Some users might be willing to pay for more memory to get a faster algorithm, whereas others would rather settle for a slower algorithm that economizes on memory.
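A classic illustration of the space/time tradeoff (not taken from this text, just a common teaching example) is computing Fibonacci numbers: a plain recursive version uses almost no extra memory but repeats work exponentially, while a memoized version spends memory on a cache to achieve a dramatically faster run time.

```python
from functools import lru_cache

# Plain recursion: minimal extra space, but exponential time
# because the same subproblems are recomputed over and over.
def fib_slow(n):
    if n < 2:
        return n
    return fib_slow(n - 1) + fib_slow(n - 2)

# Memoized recursion: spends extra space on a cache of results,
# reducing the running time to linear in n.
@lru_cache(maxsize=None)
def fib_fast(n):
    if n < 2:
        return n
    return fib_fast(n - 1) + fib_fast(n - 2)

print(fib_fast(30))  # same answer as fib_slow(30), far fewer calls
```

Both functions return the same values; they differ only in how they spend the two resources discussed above.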
In any case, because efficiency is a desirable feature of algorithms, it is important to pay attention to the potential of some algorithms for poor performance. The computer and compiler used can do little about this; the major factors to consider are the algorithm's logic and the size of its input.
Measuring the run time of an algorithm
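One simple way to measure run time empirically (a minimal sketch, assuming Python and the standard-library `time` module; the helper name `running_time` is my own) is to record a high-resolution clock before and after a call:

```python
import time

def running_time(func, *args):
    """Return the elapsed wall-clock time, in seconds, of one call to func."""
    start = time.perf_counter()   # high-resolution timer
    func(*args)
    return time.perf_counter() - start

# Example: time how long it takes to sort 100,000 numbers.
data = list(range(100_000, 0, -1))
elapsed = running_time(sorted, data)
print(f"sorted() took {elapsed:.4f} seconds")
```

Such timings depend on the machine, the load, and the compiler or interpreter, which is precisely why complexity analysis is preferred for comparing algorithms in general.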
Measuring the memory used by an algorithm
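Memory use can also be observed empirically. A minimal sketch, assuming Python's standard-library `tracemalloc` module, which tracks allocations made while it is active:

```python
import tracemalloc

tracemalloc.start()
squares = [n * n for n in range(100_000)]          # allocate something sizable
current, peak = tracemalloc.get_traced_memory()    # bytes currently held / peak
tracemalloc.stop()

print(f"current: {current} bytes, peak: {peak} bytes")
```

As with timing, the exact byte counts vary by platform and interpreter version, so they serve as a rough empirical check rather than a platform-independent measure.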
Big-O Notation/Order of Complexity
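To see why order of complexity matters more than raw timings, one can count the basic operations an algorithm performs as the input grows. The sketch below (my own illustration, not from this text) counts comparisons made by a linear search, which is O(n), versus a binary search on sorted input, which is O(log n):

```python
def linear_search_count(items, target):
    """Count comparisons made by a linear search: O(n) in the worst case."""
    count = 0
    for item in items:
        count += 1
        if item == target:
            break
    return count

def binary_search_count(items, target):
    """Count comparisons made by a binary search on sorted input: O(log n)."""
    count = 0
    lo, hi = 0, len(items) - 1
    while lo <= hi:
        count += 1
        mid = (lo + hi) // 2
        if items[mid] == target:
            break
        elif items[mid] < target:
            lo = mid + 1
        else:
            hi = mid - 1
    return count

data = list(range(1_000_000))
print(linear_search_count(data, 999_999))   # on the order of 1,000,000 steps
print(binary_search_count(data, 999_999))   # on the order of 20 steps
```

The gap between the two counts widens as the input grows, which is exactly what Big-O notation captures, independently of any particular machine.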