
I've been trying to figure this out, and I have absolutely no clue why it happens. Here is what I'm trying to do:

I am building a small GPS application, and I want things like coordinates, altitude, and speed displayed on my main view. I also want to show the current horizontal accuracy.

I do this by reading the "location.horizontalAccuracy" property, and it all displays fine, except that I set the "locationManager.desiredAccuracy" property to 100 beforehand, yet I get results far more accurate than 100 meters; it is mostly around 80 or so.
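For context, here is roughly what my setup looks like (a minimal sketch; the class name GPSViewController and the logging are placeholders, not my real code):

    #import <UIKit/UIKit.h>
    #import <CoreLocation/CoreLocation.h>

    @interface GPSViewController : UIViewController <CLLocationManagerDelegate>
    @property (nonatomic, strong) CLLocationManager *locationManager;
    @end

    @implementation GPSViewController

    - (void)viewDidLoad {
        [super viewDidLoad];
        self.locationManager = [[CLLocationManager alloc] init];
        self.locationManager.delegate = self;
        // I expected this to cap the accuracy at roughly 100 meters.
        self.locationManager.desiredAccuracy = 100.0;
        [self.locationManager startUpdatingLocation];
    }

    - (void)locationManager:(CLLocationManager *)manager
        didUpdateToLocation:(CLLocation *)newLocation
               fromLocation:(CLLocation *)oldLocation {
        // This regularly prints values around 80, i.e. better than 100.
        NSLog(@"horizontal accuracy: %.0f m", newLocation.horizontalAccuracy);
    }

    @end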

That is really frustrating, because at some point I want the user to be able to choose the accuracy and make it as inaccurate or as accurate as they want. Any ideas?

I also tried the "LocateMe" sample app from Apple, which lets you choose the accuracy at the start, and the same problem happens there: if I look at the location details, the results are always more accurate than what I set.

Also, I thought that the interval at which the

    - (void)locationManager:(CLLocationManager *)manager didUpdateToLocation:(CLLocation *)newLocation fromLocation:(CLLocation *)oldLocation

method is called depends on "locationManager.distanceFilter". Is that right?
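For reference, this is how I set it (the 50-meter value is just an example):

    // I expected this to mean "only call the delegate again after the
    // device has moved at least 50 meters from the last reported location".
    self.locationManager.distanceFilter = 50.0;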

Please help me out here, because I'm going crazy! Thanks in advance!


1 Answer


The "desiredAccuracy" property is not there to add "noise" to the samples; it is mostly used to determine whether the GPS receiver needs to be on at all (it is turned off when it isn't needed, to save power). Obviously, once it is set small enough for the GPS to turn on, you will get samples that are as accurate as the hardware can deliver. If you want inaccurate coordinates (for privacy protection or whatnot), add the inaccuracy yourself.
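For example, something like this (a rough sketch; BlurredCoordinate is a made-up helper, and the meters-to-degrees conversion is a flat-earth approximation that is only reasonable for small radii):

    #import <CoreLocation/CoreLocation.h>
    #include <math.h>
    #include <stdint.h>
    #include <stdlib.h>

    // Hypothetical helper: shift a coordinate by a random offset of up to
    // `radius` meters, so the reported position is deliberately imprecise.
    static CLLocationCoordinate2D BlurredCoordinate(CLLocationCoordinate2D coord,
                                                    CLLocationDistance radius) {
        // Pick a random direction and a random distance within the radius.
        double angle    = ((double)arc4random() / UINT32_MAX) * 2.0 * M_PI;
        double distance = ((double)arc4random() / UINT32_MAX) * radius;
        double metersPerDegreeLat = 111320.0; // ~meters per degree of latitude
        double dLat = (distance * cos(angle)) / metersPerDegreeLat;
        double dLon = (distance * sin(angle)) /
                      (metersPerDegreeLat * cos(coord.latitude * M_PI / 180.0));
        return CLLocationCoordinate2DMake(coord.latitude + dLat,
                                          coord.longitude + dLon);
    }

In your delegate callback you would then display BlurredCoordinate(newLocation.coordinate, 100.0) instead of the raw coordinate, using whatever radius the user picked.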