Initial setup to edit, play and export video

Hi there! 👋

Welcome to my first post about video editing with Swift and AVFoundation. Today, we’ll start with setting up the environment. It’s a basic but important step for the subsequent tasks.

You can use the sample project's skeleton as a basis for building your video editor. I will do the same here, replacing the skeleton project with each post's sample code. To download the sample code, click "View Contents" at the bottom of the post.

I believe it’s important to begin by discussing the requirements and what we want from the video editor in a broad sense.

Common tasks

Basically, when editing videos with the AVFoundation framework and Swift, you will encounter three common tasks.

  • Playback: This involves viewing the edited video in action.
  • Editing: In this task, we aim to have a set of routines that can modify the video. An important requirement here is to avoid long processing times. Changes applied to the video should appear almost immediately.
  • Export: This is the final step in video editing. Exporting entails saving the video to a file and sharing it with the Photos Library or any other app on the platform.

In the next sections we will solve all three tasks.

Playback

Let's create a new project in Xcode named "EnvironmentSetup", just like in the sample code. It should be a SwiftUI application written in Swift.

First, we will tackle the Playback task. Don’t forget to import AVFoundation:

import AVFoundation

We will address the Playback task by utilizing AVPlayer and AVPlayerItem, which are fundamental but highly powerful components.

Now, let’s write the initial lines of code for the video editor:

class VideoEditor {
    // Kept ready at all times so the UI can display and play the video.
    var avPlayerItem: AVPlayerItem? = nil
    var avPlayer: AVPlayer? = nil

    // The asset backing the player item; created from the source video URL.
    private var asset: AVAsset
}

The main concept here is to always have AVPlayer and AVPlayerItem readily available for display. In our sample code, the AVPlayerItem is supported by AVAsset, so we also store the AVAsset.

Since the video editor works with a single source video, let's write an initializer for the VideoEditor class that takes its URL.

init(url: URL) {
    let asset = AVAsset(url: url)
    let item = AVPlayerItem(asset: asset)

    self.asset = asset
    self.avPlayerItem = item
    self.avPlayer = AVPlayer(playerItem: item)
}

Great! Based on this, we can already output the video to the UI and play it. Somewhere in your SwiftUI code, remember to import the necessary module:

import AVKit

And then, somewhere in your SwiftUI view code, you can use the imported module to incorporate the video playback functionality:

struct MyAppView: View {
    let videoEditor = VideoEditor(url: /* URL of local video */)

    var body: some View {
        VideoPlayer(player: videoEditor.avPlayer)
    }
}
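
The URL must point to a local video. For quick testing, one option (assuming you add a video file to the app bundle; the name sample.mp4 here is just a placeholder) is:

// Hypothetical example: a video named "sample.mp4" bundled with the app.
let url = Bundle.main.url(forResource: "sample", withExtension: "mp4")!
let videoEditor = VideoEditor(url: url)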

We have only added a few lines of code, but now we have a fully functional video player in our app! It supports play, pause, rate change, displays the progress, and allows jumping to a specific time.
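
VideoPlayer provides the on-screen controls, but the stored AVPlayer also lets you drive playback from code when needed. A minimal sketch (the helper functions below are just examples, not part of the sample project):

import AVFoundation

// Example helpers for programmatic playback control via the stored AVPlayer.
func togglePlayback(of editor: VideoEditor) {
    guard let player = editor.avPlayer else { return }
    if player.rate == 0 {
        player.play()    // start or resume playback
    } else {
        player.pause()   // pause playback
    }
}

func jump(to seconds: Double, in editor: VideoEditor) {
    let time = CMTime(seconds: seconds, preferredTimescale: 600)
    editor.avPlayer?.seek(to: time)    // jump to a specific time
}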

Editing

Of course, we won't be able to address every editing task in this post. Instead, we will lay out a basic code structure for editing built around AVVideoComposition.

AVVideoComposition is an aggregation of parameters for each video track in the resulting video. These parameters include frame rate, crop rectangle, opacity, and transformations. The collection of parameters is referred to as an instruction. Each instruction is linked to a video track and has its own timing for when the instruction takes effect. Essentially, instructions determine how each video frame is composed.

Let’s update the code for the VideoEditor class and add a private method.

private func buildComposition() -> AVVideoComposition {
    let composition = AVMutableVideoComposition(propertiesOf: asset)

    // TODO: setup your composition

    return composition
}

This method creates an AVVideoComposition based on the settings of the current AVAsset, which was created from the URL passed into the initializer. It already includes an instruction to display the video as it is, without any modifications.
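
If you later want to go beyond the default behavior, the TODO marker is where custom instructions would go. As a rough sketch only (the half-opacity value and the full-duration time range are arbitrary examples, and the code assumes the asset has a video track):

if let track = asset.tracks(withMediaType: .video).first {
    let instruction = AVMutableVideoCompositionInstruction()
    instruction.timeRange = CMTimeRange(start: .zero, duration: asset.duration)

    let layerInstruction = AVMutableVideoCompositionLayerInstruction(assetTrack: track)
    layerInstruction.setOpacity(0.5, at: .zero)                     // example: render the track at half opacity
    layerInstruction.setTransform(track.preferredTransform, at: .zero)

    instruction.layerInstructions = [layerInstruction]
    composition.instructions = [instruction]                        // replace the default instructions
}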

Let’s add another method to the VideoEditor class.

func buildVideo() {
    guard let item = avPlayerItem else { return }

    item.videoComposition = buildComposition()
}

This is where the magic happens! AVVideoComposition can be assigned to AVPlayerItem, and the video currently playing in AVPlayer will be composed based on the instructions provided by AVVideoComposition.
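
This is also what keeps processing times short, as required earlier: whenever an editing parameter changes, rebuilding and reassigning the composition is enough for the running player to pick up the change almost immediately. A hypothetical example (the stored frameDuration property below is made up for illustration and would have to be read by buildComposition()):

// Hypothetical: `frameDuration` is an invented stored property used by buildComposition().
func setFrameDuration(_ duration: CMTime) {
    frameDuration = duration   // update the editing parameter
    buildVideo()               // reassign the composition; playback reflects the change right away
}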

Let’s also update the initializer to build the video:

init(url: URL) {
    ...

    buildVideo()
}

Build and run the project. You will see the video playing without any changes. Let's now modify some basic properties of AVVideoComposition inside buildComposition(), in place of the TODO:

composition.renderScale = 2.0                                  // scale factor applied to the rendered frames
composition.renderSize = CGSize(width: 500.0, height: 400.0)   // output frame size
composition.frameDuration = CMTime(value: 1, timescale: 1)     // one frame per second, i.e. 1 fps

Build the project once again and run it. You will notice that the video appears different from the original. This is because AVVideoComposition applies changes to the video!

Export

As a final task, after the editing process, you may want to save or share the video. This can be accomplished using AVAssetExportSession. It allows you to save the video to a file using specified audio and video settings. The key feature of AVAssetExportSession is its ability to be linked with AVVideoComposition as well.

Let’s add some code to the VideoEditor class to handle this functionality.

var exportSession: AVAssetExportSession? {
    let preset = AVAssetExportPreset1920x1080
    guard let session = AVAssetExportSession(asset: asset, presetName: preset)
    else { return nil }

    session.outputURL = ... // specify output file here
    session.outputFileType = .mov
    session.videoComposition = buildComposition()

    return session
}
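
The output URL is left for you to fill in. One possible choice (just an example) is a uniquely named file in the temporary directory, which also avoids failing because a file already exists at the destination:

// Example only: export to a unique .mov file in the temporary directory.
let outputURL = FileManager.default.temporaryDirectory
    .appendingPathComponent(UUID().uuidString)
    .appendingPathExtension("mov")
session.outputURL = outputURL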

The exportSession property provides a preconfigured AVAssetExportSession with an output resolution of 1920×1080. Pay attention to the following line:

session.videoComposition = buildComposition()

It's important to specify the AVVideoComposition so that the changes are applied to the video being written.

Now we are ready to export the video. Somewhere in your UI code, you can invoke it like this:

func exportVideo() {
    guard let session = videoEditor.exportSession else { return }

    session.exportAsynchronously {
        // Check that the export actually finished before using the output file.
        guard session.status == .completed, let url = session.outputURL else { return }
        // `url` points to the recently saved video
    }
}
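
Since the Export task also mentioned sharing with the Photos Library, here is one way to do that once the export finishes. This is a sketch under the assumption that you link the Photos framework, add the photo-library-add usage description to Info.plist, and have obtained the user's permission:

import Photos

// Sketch: save the exported file to the user's photo library.
func saveToPhotoLibrary(_ url: URL) {
    PHPhotoLibrary.shared().performChanges({
        PHAssetChangeRequest.creationRequestForAssetFromVideo(atFileURL: url)
    }) { success, error in
        // Handle `success` / `error` as needed for your UI.
    }
}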

Conclusion

We just solved three common tasks for video editing with Swift and AVFoundation. It's important to remember that each frame of the resulting video is composed using instructions from AVVideoComposition.

Throughout the post, we provided code snippets and explanations to guide you through the process of building a basic video editor in Swift. While this post covers only the foundational aspects of video editing, it provides a solid starting point for further exploration and customization.

Let's recap the core points mentioned above.

  • An important requirement is to avoid long processing times.
  • The main concept is to always have AVPlayer and AVPlayerItem readily available for display.
  • AVVideoComposition aggregates per-track parameters (frame rate, crop rectangle, opacity, transformations) into instructions; each instruction is linked to a video track, has its own timing, and determines how the video frames are composed.

Tags

Swift, SwiftUI, AVFoundation, AVVideoComposition, AVPlayer, AVPlayerItem, AVAsset, AVAssetExportSession, AVKit, Xcode