Fragile Complexity of Comparison-Based Algorithms
Rolf Fagerberg
University of Southern Denmark

We initiate a study of algorithms with a focus on the computational complexity of individual elements, and introduce the fragile complexity of comparison-based algorithms as the maximal number of comparisons any individual element takes part in. We give a number of upper and lower bounds on the fragile complexity of fundamental problems, including Minimum, Median, Selection, Sorting, and Heap Construction. The results include both deterministic and randomized upper and lower bounds, and demonstrate a separation between the two settings for a number of problems. The depth of a comparator network is a straightforward upper bound on the worst-case fragile complexity of the corresponding fragile algorithm. However, we prove that fragile complexity is a different and strictly easier property than the depth of comparator networks, in the sense that for some problems a fragile complexity equal to the best network depth can be achieved with less total work, and that with randomization, an even lower fragile complexity is possible.
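As a minimal illustration of the measure (not taken from the paper), the following Python sketch instruments a comparison-based sort and reports the maximum number of comparisons any single element takes part in on a given input; the function name fragile_complexity_of_sort is hypothetical, and the algorithm measured is simply Python's built-in Timsort rather than any algorithm analyzed in the work.

```python
# Sketch: measure the fragile complexity of a comparison-based sort on one
# input, i.e. the maximum number of comparisons any single element joins.
# (Illustrative only; measures Python's built-in sort, not the paper's algorithms.)
import random


def fragile_complexity_of_sort(values):
    counts = [0] * len(values)  # comparisons each element takes part in

    class Tracked:
        def __init__(self, idx):
            self.idx = idx

        def __lt__(self, other):
            # Charge the comparison to both participating elements.
            counts[self.idx] += 1
            counts[other.idx] += 1
            return values[self.idx] < values[other.idx]

    sorted(Tracked(i) for i in range(len(values)))  # sort via __lt__ only
    return max(counts)


if __name__ == "__main__":
    data = random.sample(range(10_000), 1_000)
    print("fragile complexity on this input:", fragile_complexity_of_sort(data))
```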

Joint work with Peyman Afshani, David Hammer, Riko Jacob, Irina Kostitsyna, Ulrich Meyer, Manuel Penschuck, and Nodari Sitchinava.