
I'm using two separate iOS libraries that make use of the device's camera.

The first is a library used to capture regular photos with the camera. The second is a library that uses ARKit to measure the world.

Somehow, after the ARKit code runs, the regular camera (with the exact same settings and initialization code) produces a much lower quality preview and captured image: there is a lot of noise in the image, as if post-processing were missing. A full app restart is required to restore the original camera quality.

I know this may be vague, but here's the code for each library (more or less). Any ideas what could be missing? Why would ARKit permanently change the camera's settings? I could easily fix it if I knew which setting gets lost or changed after ARKit is used.
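One way I might narrow it down is to snapshot the capture device's quality-related properties before and after the ARKit screen runs and diff the output. A minimal sketch: `dumpCameraSettings` is a hypothetical helper of mine, but the properties it reads are all standard AVCaptureDevice APIs:

import AVFoundation

// Hypothetical helper: log the settings that commonly affect image
// quality, so the values can be diffed before and after ARKit runs.
func dumpCameraSettings(_ device: AVCaptureDevice, label: String) {
    print("=== \(label) ===")
    print("activeFormat:     \(device.activeFormat)")
    print("activeColorSpace: \(device.activeColorSpace.rawValue)")
    print("videoHDREnabled:  \(device.isVideoHDREnabled)")
    print("exposureMode:     \(device.exposureMode.rawValue)")
    print("whiteBalanceMode: \(device.whiteBalanceMode.rawValue)")
    print("focusMode:        \(device.focusMode.rawValue)")
    print("iso:              \(device.iso)")
    print("exposureDuration: \(device.exposureDuration.seconds)")
}

// Usage: call once before presenting the AR screen, once after returning.
if let camera = AVCaptureDevice.default(.builtInWideAngleCamera, for: .video, position: .back) {
    dumpCameraSettings(camera, label: "before ARKit")
}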

Code sample for iOS image capture (removed error checking and boilerplate):

- (void)initializeCaptureSessionInput
{
    AVCaptureDevice *captureDevice = [self getDevice];
    [self.session beginConfiguration];
    NSError *error = nil;
    AVCaptureDeviceInput *captureDeviceInput = [AVCaptureDeviceInput deviceInputWithDevice:captureDevice error:&error];
    // Photo preset selects the full high-quality still image pipeline.
    self.session.sessionPreset = AVCaptureSessionPresetPhoto;
    [self.session addInput:captureDeviceInput];
    self.videoCaptureDeviceInput = captureDeviceInput;
    // `orientation` is set in the boilerplate that was removed.
    [self.previewLayer.connection setVideoOrientation:orientation];
    [self.session commitConfiguration];
}


- (void)startSession
{
    AVCaptureStillImageOutput *stillImageOutput = [[AVCaptureStillImageOutput alloc] init];
    if ([self.session canAddOutput:stillImageOutput]) {
        // Maximum JPEG quality and full-resolution stills.
        stillImageOutput.outputSettings = @{AVVideoCodecKey : AVVideoCodecJPEG, AVVideoQualityKey : @(1.0)};
        [self.session addOutput:stillImageOutput];
        [stillImageOutput setHighResolutionStillImageOutputEnabled:YES];
        self.stillImageOutput = stillImageOutput;
    }

    [self.session startRunning];
}


[self initializeCaptureSessionInput];
[self startSession];


AVCaptureConnection *connection = [self.stillImageOutput connectionWithMediaType:AVMediaTypeVideo];
[connection setVideoOrientation:orientation];

[self.stillImageOutput captureStillImageAsynchronouslyFromConnection:connection completionHandler:^(CMSampleBufferRef imageSampleBuffer, NSError *error) {
    // photo result here...
}];

Code for ARKit:

private var sceneView = ARSCNView()
... other vars...
... init code ...

let configuration = ARWorldTrackingConfiguration()

configuration.planeDetection = [.vertical, .horizontal]

// On devices with a LiDAR scanner, scene reconstruction
// greatly improves measurement accuracy.
if #available(iOS 13.4, *),
   ARWorldTrackingConfiguration.supportsSceneReconstruction(.mesh) {
    configuration.sceneReconstruction = .mesh
}

//sceneView.preferredFramesPerSecond = 30
sceneView.automaticallyUpdatesLighting = true
//sceneView.debugOptions = [.showFeaturePoints]
sceneView.showsStatistics = false
sceneView.antialiasingMode = .multisampling4X

// Set the view's delegate and session delegate
sceneView.delegate = self
sceneView.session.delegate = self

// Run the view's session
arReady = false
arStatus = "off"
measuringStatus = "off"
sceneView.session.run(configuration)

Image samples:

high quality: https://zinspectordev2.s3.amazonaws.com/usi/2/16146392129fa3017be37a4b63bbfd0e753a62c462.JPEG

low quality: https://zinspectordev2.s3.amazonaws.com/usi/2/1614639283607613c3083344f39adc3c40c74f0217.JPEG

Comment: Confirmed to be a bug with the iPhone 7, since it doesn't happen on a 12 Pro. – Cristiano Coelho

2 Answers


It happens because ARKit's maximum video resolution is lower than what the camera can deliver for stills. You can check ARWorldTrackingConfiguration.supportedVideoFormats for the list of ARConfiguration.VideoFormat values, which shows every resolution available on the current device.
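For example (a minimal sketch; supportedVideoFormats and videoFormat are public ARKit APIs available since iOS 11.3):

import ARKit

// Print every video format ARKit supports on this device.
for format in ARWorldTrackingConfiguration.supportedVideoFormats {
    print("\(format.imageResolution) @ \(format.framesPerSecond) fps")
}

// Optionally run the session with the highest-resolution format available.
let configuration = ARWorldTrackingConfiguration()
if let best = ARWorldTrackingConfiguration.supportedVideoFormats
    .max(by: { $0.imageResolution.width < $1.imageResolution.width }) {
    configuration.videoFormat = best
}

Note that this only controls the resolution of the ARKit feed itself; it doesn't explain why the regular AVCaptureSession stays degraded afterwards.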


No workarounds found. However, this is definitely an Apple bug, as it doesn't happen on newer devices. Looking forward to a fix for the iPhone 7.