Running time of a simple per-pixel algorithm on an image

Let's say we have an algorithm that takes an image and a set of "centers", and changes each pixel in the image one by one after calculating which center is closest to that pixel. Is our runtime O(n) or O(cn), where n is the number of pixels and c is the number of centers? And if c were a constant fraction of n, would the factor of c count as O(1), leaving just O(n)?
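
To make the setup concrete, here is a minimal sketch of the per-pixel nearest-center pass described above. The representation of pixels and centers as RGB triples and the name `assign_to_nearest_center` are just illustrative assumptions, not part of the question; the point is that the inner scan over all centers is what contributes the factor of c:

```python
def assign_to_nearest_center(pixels, centers):
    """Replace each pixel with its nearest center (squared Euclidean distance).

    For each of the n pixels we scan all c centers, so the total work
    is on the order of c * n: the inner min over centers is the c factor.
    """
    result = []
    for p in pixels:  # n iterations, one per pixel
        # c distance computations for this pixel
        best = min(
            centers,
            key=lambda ctr: sum((pc - cc) ** 2 for pc, cc in zip(p, ctr)),
        )
        result.append(best)
    return result

# Example: 4 "pixels" (RGB triples) and c = 2 centers.
pixels = [(250, 10, 10), (5, 240, 12), (200, 30, 30), (10, 200, 40)]
centers = [(255, 0, 0), (0, 255, 0)]
print(assign_to_nearest_center(pixels, centers))
# -> [(255, 0, 0), (0, 255, 0), (255, 0, 0), (0, 255, 0)]
```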