The perils of using a monotonically increasing key (like a traditional timestamp) are pretty clearly laid out in the docs.
What is less clear, at the time of this writing, is the likely impact of using a monotonically decreasing pattern in a key, an approach the docs suggest when you regularly want to retrieve "the most recent records first".
Can anyone speak with authority on the effects of decreasing keys compared to increasing keys, perhaps one of: "comparable hotspotting", "reduced hotspotting", or "no hotspotting, but other undesirable/catastrophic behavior"?
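To make the pattern concrete, here is a rough sketch of the kind of "decreasing" key I mean; the ceiling constant and field names are purely illustrative, not anything from the docs:

```python
import time

# Rough sketch of the "decreasing" pattern: the row key leads with
# (MAX - timestamp), so the newest rows sort first and a single range scan
# from the start of the table returns the most recent records.
MAX_MICROS = 10**19  # illustrative ceiling, comfortably above microsecond epochs

def reverse_ts_row_key(ts_micros: int, event_id: str) -> bytes:
    reversed_ts = MAX_MICROS - ts_micros
    # Zero-pad so lexicographic (byte) order matches numeric order.
    return f"{reversed_ts:020d}#{event_id}".encode()

print(reverse_ts_row_key(int(time.time() * 1_000_000), "event-123"))
```

The concern is that every new write still lands at one end of the keyspace (the front instead of the back), which is why I'm asking whether the hotspotting story actually changes.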
P.S. Granted, I may not (and may never) have "big" enough data to make Bigtable an appropriate datastore choice. Still, it is unclear to me why Bigtable is described as "a natural fit" for time series data when the best practices for a likely reader (i.e., range scans over keys, probably clustered by timestamp) seem directly inconvenienced by the best practices for a likely writer (i.e., avoid timestamps in keys except to the extent they can be "de-clustered" by promoted fields, salt shards, or random entropy). Maybe I'm missing something, or perhaps this is just the state of the art?
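For what it's worth, here is the write-side pattern I'm picturing, which seems to illustrate that tension; the salt count and field names are again made up for the sketch:

```python
import hashlib
import time

# Sketch of the write-side "best practice" as I understand it: promote a
# salt derived from a non-timestamp field to the front of the key so writes
# spread across tablets instead of clustering at one end of the keyspace.
NUM_SALTS = 8

def salted_row_key(sensor_id: str, ts_micros: int) -> bytes:
    salt = int(hashlib.md5(sensor_id.encode()).hexdigest(), 16) % NUM_SALTS
    reversed_ts = 10**19 - ts_micros  # still most-recent-first within each prefix
    return f"{salt:02d}#{sensor_id}#{reversed_ts:020d}".encode()

# The read-side cost: "give me the most recent rows overall" now requires
# NUM_SALTS separate prefix scans merged client-side, not one range scan.
print(salted_row_key("sensor-42", int(time.time() * 1_000_000)))
```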