The best-case complexity of an algorithm is the minimum amount of time the algorithm will take to accomplish its task. We know that the best-case complexity of algorithms like merge sort, quick sort, etc. is Ω(n log(n)), which defines the lower bound of these algorithms.
As we know, in asymptotic notation:
O(n) + O(n log(n)) = O(n log(n))
also,
Ω(n) + Ω(n log(n)) = Ω(n log(n))
So, if in these sorting algorithms we first traverse the entire array in O(n) time to determine whether it is already sorted in ascending or descending order (returning immediately if it is), then asymptotically their average-case and worst-case complexities would remain the same, but their best-case complexity would now become Ω(n).
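To make the idea concrete, here is a minimal sketch of the modification I am describing, applied to merge sort (the function names are my own, purely illustrative). The O(n) pre-check short-circuits on already-sorted input, so the best case drops to Θ(n) while the recursion is otherwise unchanged:

```python
def is_sorted(arr):
    """O(n) pre-check: True if arr is already in ascending order."""
    return all(arr[i] <= arr[i + 1] for i in range(len(arr) - 1))

def merge(left, right):
    """Standard O(len(left) + len(right)) merge of two sorted lists."""
    out, i, j = [], 0, 0
    while i < len(left) and j < len(right):
        if left[i] <= right[j]:
            out.append(left[i]); i += 1
        else:
            out.append(right[j]); j += 1
    out.extend(left[i:])
    out.extend(right[j:])
    return out

def merge_sort_with_precheck(arr):
    """Merge sort preceded by an O(n) sortedness scan.

    Best case (already sorted): Theta(n) - the scan alone.
    Worst/average case: still Theta(n log n), since the extra
    scans add at most O(n) work per recursion level.
    """
    if len(arr) <= 1 or is_sorted(arr):  # O(n) short-circuit
        return list(arr)
    mid = len(arr) // 2
    return merge(merge_sort_with_precheck(arr[:mid]),
                 merge_sort_with_precheck(arr[mid:]))
```

On a sorted input the function returns after a single pass; on any other input it pays the extra linear scan and then sorts as usual, which is exactly the O(n) + O(n log(n)) = O(n log(n)) sum above.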
Logically there must be a flaw in my understanding of these asymptotic notations; otherwise someone would surely have pointed this out when asymptotic notations were being developed or becoming popular for measuring sorting algorithms. Am I correct in assuming that this is a plausible flaw in asymptotic notations, or am I missing some rule of asymptotic notation?