I'm an engineer on the Camera Software team.

iPhone is the most popular camera in the world, and for many years, developers have been taking advantage of its powerful camera systems to provide a diverse set of world-class experiences, from professional photography apps to video streaming tools. Today I will be walking you through some exciting photo quality improvements we made with our most popular video formats, and how your applications can make use of them to deliver an even better experience.

Different scenarios call for different levels of photo quality. For example, apps dedicated to taking still photos will demand the absolute best quality that the cameras can provide. A social app, on the other hand, might need to apply face effect overlays on top of the video frames being streamed, and this custom rendering might be computationally expensive. In order to avoid frame drops, the developer might prefer lower resolution frames so there are fewer pixels to process per frame. This diversity in use cases calls for an easy way to specify where you want to land on the scale of quality versus performance.

Before we dive into photo quality, however, let's have a brief refresher on how photos are taken on iOS in general. We will start with an AVCaptureSession object, around which we can build our object graph. Since we are taking photos, we will use a camera as our AVCaptureDevice. Then, an AVCaptureDeviceInput is instantiated based on that device, and it will provide input data to the session. An AVCapturePhotoOutput will then be added to the graph as the recipient of the photos. And all these elements are then connected together using an AVCaptureConnection. After the session has started running, we can capture photos by calling the capturePhoto method on the AVCapturePhotoOutput instance. Further customization can be done using the AVCapturePhotoSettings object passed to the capturePhoto method. The captured photo will be represented as an AVCapturePhoto object that you will receive in your delegate method. We had a very detailed discussion on these APIs in the 2016 session, Advances in iOS Photography. Please check it out if you haven't already.

Now that we know how to take photos on iOS in general, let's see how high-quality photos can be taken. Historically, if you wanted to capture photos of the best possible quality, you would set the isAutoStillImageStabilizationEnabled property on your AVCapturePhotoSettings to true, and that's because still image stabilization was the main method for getting higher quality photos. But over the years, we have been continuously evolving our photo quality-enhancing algorithms. In addition to still image stabilization, we now have a far richer set of techniques to draw from, such as a variety of multi-image fusion technologies, including Smart HDR and Deep Fusion. Consequently, the name isAutoStillImageStabilizationEnabled has become quite obsolete as a proxy for high photo quality. To solve this problem, in iOS 13, we introduced AVCapturePhotoOutput.QualityPrioritization. It's a very easy way to tell AVCapturePhotoOutput how to prioritize quality in your photo captures.

We haven't had a chance to talk about this important API at previous WWDCs, so let's now take a moment to see how it works. There are three quality prioritization levels to choose from: speed, balanced, and quality. With speed, you are telling the framework that the speed of the capture is what you care about the most, even at the expense of photo quality. If a balance needs to be struck between photo quality and delivery speed, balanced should be used. With quality, you are telling the framework that image quality should be prioritized first and foremost, while the potential slowness of the capture process can be tolerated.
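The capture object graph described in this session can be sketched in a few lines of Swift. The following is a minimal, illustrative sketch only; the class name PhotoCaptureController and the choice of the back wide-angle camera are my own assumptions, not part of the session, and real apps should also handle camera authorization and run the session off the main thread:

```swift
import AVFoundation

final class PhotoCaptureController: NSObject, AVCapturePhotoCaptureDelegate {
    let session = AVCaptureSession()
    let photoOutput = AVCapturePhotoOutput()

    func configure() throws {
        session.beginConfiguration()
        // Since we are taking photos, use a camera as our AVCaptureDevice.
        guard let camera = AVCaptureDevice.default(.builtInWideAngleCamera,
                                                   for: .video,
                                                   position: .back) else {
            session.commitConfiguration()
            return
        }
        // Wrap the device in an AVCaptureDeviceInput that feeds the session.
        let input = try AVCaptureDeviceInput(device: camera)
        if session.canAddInput(input) { session.addInput(input) }
        // Add the AVCapturePhotoOutput; the session forms the
        // AVCaptureConnection between input and output for us.
        if session.canAddOutput(photoOutput) { session.addOutput(photoOutput) }
        session.commitConfiguration()
    }

    func takePhoto() {
        // In production, call startRunning() once, from a background queue.
        if !session.isRunning { session.startRunning() }
        // Per-capture customization goes on AVCapturePhotoSettings.
        let settings = AVCapturePhotoSettings()
        photoOutput.capturePhoto(with: settings, delegate: self)
    }

    // Delegate callback: the captured photo arrives as an AVCapturePhoto.
    func photoOutput(_ output: AVCapturePhotoOutput,
                     didFinishProcessingPhoto photo: AVCapturePhoto,
                     error: Error?) {
        guard error == nil, let data = photo.fileDataRepresentation() else { return }
        // Save or display `data` (encoded image bytes) as needed.
        _ = data
    }
}
```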
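The three prioritization levels map onto a pair of properties, which the following Swift fragment sketches. One caveat worth noting: the per-capture photoQualityPrioritization on AVCapturePhotoSettings may not exceed the output's maxPhotoQualityPrioritization (whose default is balanced), so an app that wants quality must raise the cap on the output first:

```swift
import AVFoundation

let photoOutput = AVCapturePhotoOutput()

// Opt in to the most expensive processing the output may ever use.
// Set this when configuring the output, before capturing.
photoOutput.maxPhotoQualityPrioritization = .quality

// Then choose a level per capture: .speed, .balanced, or .quality.
let settings = AVCapturePhotoSettings()
settings.photoQualityPrioritization = .balanced
```

Keeping the cap high while varying the per-capture level lets the same output serve both fast, responsive captures and slower, maximum-quality ones.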