I am currently learning about big-O notation in my algorithms class and stumbled upon a problem that perplexed me. If the formal definition holds, i.e. f(n) <= c * n * lg(n) for some constant c and sufficiently large n, can we represent O(n lg n) as c * n * lg(n)? And if that assumption is true, could we then do:
= lg(O(n lg n))
= lg(c * n * lg(n))
= lg(c) + lg(n) + lg(lg(n))
Since lg(n) is the highest-order term here, would this simplify to O(lg(n))? That is, would the lower-order terms lg(c) and lg(lg(n)) eventually be dominated by the highest-order term?
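A quick numerical sanity check (a sketch in Python, not part of the original question) supports this: the ratio lg(n * lg n) / lg(n) tends toward 1 as n grows, since lg(lg(n)) / lg(n) goes to 0.

```python
import math

# Compare lg(n * lg n) against lg(n) for increasingly large n.
# If lg(n lg n) is O(lg n), the ratio should stay bounded;
# in fact it approaches 1, because lg(lg n) / lg(n) -> 0.
for n in [2**10, 2**20, 2**40, 2**80]:
    lhs = math.log2(n * math.log2(n))  # lg(n lg n) = lg(n) + lg(lg(n))
    rhs = math.log2(n)
    print(f"n = 2^{rhs:.0f}: ratio = {lhs / rhs:.4f}")
```

The printed ratios shrink steadily toward 1 (about 1.33 at n = 2^10 and about 1.08 at n = 2^80), which is what O(lg n) with lower-order terms dominated would predict.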

As n grows arbitrarily large, the lg(n) term will behave like a constant times n, so you would be left with lg(c * n). So I think your reasoning is correct. - Tim Biegeleisen