99% of the time, any floating-point data type of your choice should be sufficient.
Yes, I know floating-point types can't always represent a value exactly out to 15 decimal places, but that doesn't matter. Lat/long to 15 decimal places is specifying a position with a (claimed) accuracy on the order of the width of a couple of hydrogen atoms. Six decimal places already gets you to roughly 10 cm, comfortably within a meter, which is sufficient for most cases. Very few applications need more than centimeter-level accuracy (7 places), and 10 or even 100 meter accuracy will be good enough for many.
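If you want to sanity-check those figures, here's a quick back-of-the-envelope sketch in Python. The 111 km-per-degree figure is the usual rough value for a degree of latitude, and the example longitude is just an arbitrary value I picked:

```python
import struct

METERS_PER_DEGREE = 111_000  # rough ground length of 1 degree of latitude

# Ground distance that each additional decimal place of a coordinate resolves.
for places in (5, 6, 7, 8, 15):
    print(f"{places:>2} decimal places ~ {METERS_PER_DEGREE / 10**places:.2e} m")
# 6 places ~ 1.11e-01 m (about 11 cm); 15 places ~ 1.11e-10 m (atomic scale)

# Worst case for 32-bit floats: round-trip an arbitrary longitude through
# float32 and see how far the stored value lands from the original.
lon = 179.123456
lon32 = struct.unpack("f", struct.pack("f", lon))[0]
print(f"float32 rounding error: ~{abs(lon32 - lon) * METERS_PER_DEGREE:.2f} m")
# A 64-bit double at the same magnitude rounds to within a few nanometers,
# so either type is far more precise than a typical GPS fix.
```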
And, of course, that's all assuming the original data source is that accurate in the first place. My phone's GPS rarely claims accuracy better than about 10 m (and even then, half the time the map doesn't show my actual location inside the 10 m circle where it claims I should be), so storing that position to greater precision would be pointless and misleading. Granted, you may be using a more accurate data source than the cheap GPS receiver integrated into a cellphone, but even so, it's unlikely to be accurate all the way down to the molecular scale.