# Values and principles

## Sustainability

The overarching goal of *greeNsort ^{®}* is to provide algorithms that are sustainable in the sense of minimizing \(eFootprint\) and that can replace less sustainable algorithms in many places. Hence the following values guided the development:

## Generality

The new algorithms should be applicable in as many contexts as possible. This implies comparison sorting on arbitrary data types, including the ability to sort elements of varying size.

## Robustness

The new algorithms should perform robustly on different types of hardware, whatever the particular features regarding random access, cache size, branch prediction etc.; hence comparisons are done between algorithms not tuned to specific hardware. This implies that *greeNsort ^{®}* does not assume a specific machine model such as a random-access model with constant costs for single-element access or a disk model with constant costs for access to blocks of data. It is simply assumed that there is a monotonic relation between access distance and access cost.

## Resilience

The algorithmic portfolio should include algorithms that perform well even in extreme situations and on old hardware with little memory (RAM).

## Scalability

The new algorithms should reliably achieve \(N \log{N}\) runtime in the worst case, hence only divide&conquer algorithms are considered.
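The \(N \log{N}\) bound of balanced divide&conquer is a textbook result (not specific to *greeNsort ^{®}*): splitting into two halves and combining in linear time gives the recurrence

\[
T(N) = 2\,T\!\left(\tfrac{N}{2}\right) + cN
\quad\Longrightarrow\quad
T(N) = cN \log_2{N} + N\,T(1) \in O(N \log{N}).
\]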

## Reliability

The new algorithms should reliably perform \(N \log{N}\) or better for all patterns of input data. Deterministic algorithms have advantages here.

## Adaptivity

The new algorithms should be adaptive, but not tuned to best cases. The main task of sorting algorithms is to sort data that is not already sorted, hence the algorithms are developed to perform well in the worst (or average) case.

## Concurrency

The new algorithms should allow proper and easy parallelization, hence the algorithms are described and demonstrated as top-down divide&conquer, not bottom-up. Experts can still implement them without recursion.
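As a minimal illustration of why the top-down formulation eases parallelization, consider a generic textbook mergesort (a sketch, not *greeNsort ^{®}* code): the two recursive calls operate on disjoint halves and are independent, so a parallel scheduler could execute them concurrently before the sequential combine step.

```c
#include <string.h>

/* Merge the sorted halves x[l..m) and x[m..r) via buffer buf. */
static void merge(int *x, int *buf, size_t l, size_t m, size_t r) {
  size_t i = l, j = m, k = l;
  while (i < m && j < r) buf[k++] = (x[j] < x[i]) ? x[j++] : x[i++];
  while (i < m) buf[k++] = x[i++];
  while (j < r) buf[k++] = x[j++];
  memcpy(x + l, buf + l, (r - l) * sizeof(int));
}

/* Top-down divide&conquer sort of x[l..r).  The two recursive calls
   touch disjoint index ranges, so they could run in parallel tasks
   (e.g. via OpenMP tasks or a thread pool) with a join before merge. */
static void msort(int *x, int *buf, size_t l, size_t r) {
  if (r - l < 2) return;        /* base case: 0 or 1 element */
  size_t m = l + (r - l) / 2;   /* divide */
  msort(x, buf, l, m);          /* conquer left  (parallelizable) */
  msort(x, buf, m, r);          /* conquer right (parallelizable) */
  merge(x, buf, l, m, r);       /* combine (sequential) */
}
```

A bottom-up variant would iterate over runs of growing size instead; it is equivalent sequentially, but the recursive form makes the independent subproblems explicit.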