
I was reading a data structures and algorithms book in which the author said that the time complexity of an algorithm or program will almost never change across programming languages. When, or due to what features of a programming language, might that rare case arise where the Big O time complexity does change?

Often, unless a programming language deliberately slows down processing, it will not matter much. Furthermore, it depends on how you implement an algorithm. For example, in functional programming languages most data is immutable, so one uses data structures like a finger tree to perform efficient "updates". Likewise, a single-tape Turing machine can simulate a machine with k tapes that runs in T(n) steps using at most O(T(n)^2) steps: users.cs.duke.edu/~reif/courses/complectures/books/AB/… – Willem Van Onsem
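(To illustrate the immutability point: a minimal Python sketch, where the language and the helper name set_item are my own choices, not from the comment. A naive "update" of an immutable sequence copies it, so a single element write costs O(n); persistent structures like finger trees exist to bring that cost down.)

```python
# Hypothetical helper for illustration: "update" an immutable tuple
# by building a new one. The slicing copies the sequence, so this
# single write is O(n), not O(1).
def set_item(immutable_seq, index, value):
    return immutable_seq[:index] + (value,) + immutable_seq[index + 1:]

nums = (1, 2, 3, 4)
print(set_item(nums, 2, 99))  # (1, 2, 99, 4); nums itself is unchanged
```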

1 Answer


Well, that really depends on what you mean by the complexity.

Let's take the case of the much-maligned bubble sort, for example. It is basically O(n²) in time complexity. However, that relies on the fact that the underlying operations that need to be done (comparing and swapping items) are all O(1).
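To make that concrete, here's a minimal sketch (Python is my choice here; the answer itself names no language) in which both of those operations really are O(1):

```python
def bubble_sort(items):
    # O(n^2): two nested passes over the list.
    n = len(items)
    for i in range(n - 1):
        for j in range(n - 1 - i):
            if items[j] > items[j + 1]:  # O(1) comparison
                # In-place swap: O(1) on a standard array/list.
                items[j], items[j + 1] = items[j + 1], items[j]
    return items

print(bubble_sort([5, 2, 4, 1]))  # [1, 2, 4, 5]
```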

You could argue that, if swapping were an O(n) operation (say it could only be done by deleting and then inserting, with the shifting that both of those operations require), the complexity becomes O(n³).
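A sketch of that scenario under the same assumptions: on a Python list, pop and insert each shift the elements after the target position, so each "swap" costs O(n) and the same loop structure becomes O(n³) overall:

```python
def bubble_sort_slow_swap(items):
    # Same algorithm, but each swap is a delete followed by an insert.
    n = len(items)
    for i in range(n - 1):
        for j in range(n - 1 - i):
            if items[j] > items[j + 1]:
                bigger = items.pop(j)        # O(n): shifts later elements left
                items.insert(j + 1, bigger)  # O(n): shifts them back right
    return items
```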

As an aside, I know of no language that actually does this, but it's easy to envision a very simple scripting language where all arrays are associative arrays (with strings as keys, and unsorted), so that looking up a given element is a sequential search.
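A sketch of how such a hypothetical language's arrays might behave (the class name SequentialArray is invented for illustration):

```python
class SequentialArray:
    # Hypothetical array type: keys are strings, entries are unsorted,
    # and every lookup is a sequential search, so indexing is O(n).
    def __init__(self):
        self._entries = []  # unsorted list of (key, value) pairs

    def __setitem__(self, key, value):
        for i, (k, _) in enumerate(self._entries):
            if k == key:
                self._entries[i] = (key, value)  # overwrite existing key
                return
        self._entries.append((key, value))

    def __getitem__(self, key):
        for k, v in self._entries:  # O(n) scan instead of O(1) indexing
            if k == key:
                return v
        raise KeyError(key)

a = SequentialArray()
a["0"] = 42
print(a["0"])  # 42, found by scanning the entries one at a time
```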

You could equally argue that the underlying operations have nothing to do with the complexity of the algorithm; they're just an extra cost introduced by the underlying limitations. I would tend to go with that definition, since you're discussing the complexity of the algorithm itself, not how it performs on different underlying systems.