For the last few months I have been working as part of a team to create an awesome instructional video app for a client here at Jackrabbit Mobile. In the process I learned a lot about video and audio editing, and I wanted to share a little about what I’ve learned so far.
I’ll be splitting this tutorial up into a few parts — in the first part (this tutorial) we’ll make a simple video playing app.
1. Setup Your View
Create a new project in Xcode. The first thing we’re going to do is create our own subclass to be used as a “player” view.
Essentially this class will simply use an AVPlayerLayer to use the output from an AVPlayer. Apple uses one of these in their Stitched Stream Player example. One thing to mention is that you can also set the fill of the view here, so if you don’t want your video to use up all the space (and consequently crop some of it), you can change the videoGravity to AVLayerVideoGravityResizeAspect.
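A minimal version of that subclass might look like the following (the class name JRMPlayerView matches the next step; the rest is a sketch of the standard layerClass override):

```objectivec
// JRMPlayerView.h
#import <UIKit/UIKit.h>
#import <AVFoundation/AVFoundation.h>

@interface JRMPlayerView : UIView
@property (nonatomic) AVPlayer *player;
@end

// JRMPlayerView.m
#import "JRMPlayerView.h"

@implementation JRMPlayerView

// Back this view with an AVPlayerLayer instead of a plain CALayer.
+ (Class)layerClass {
    return [AVPlayerLayer class];
}

- (AVPlayer *)player {
    return [(AVPlayerLayer *)self.layer player];
}

- (void)setPlayer:(AVPlayer *)player {
    [(AVPlayerLayer *)self.layer setPlayer:player];
    // Fill the whole view (cropping if needed); swap in
    // AVLayerVideoGravityResizeAspect to letterbox instead.
    ((AVPlayerLayer *)self.layer).videoGravity = AVLayerVideoGravityResizeAspectFill;
}

@end
```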
Now we can setup the majority of our view. In your storyboard, add a view to your main view controller and pin it to each side. Change its class to the JRMPlayerView you just created.
Then add a button to the main view (not the JRMPlayerView) with constraints to align it along the X axis of its superview, as well as a top space to superview (around 50). Set the button’s title to “Upload.” Your storyboard should look like this so far:
That’s good enough for now. Next let’s set up video importing.
2. Upload a video
We’ll need to import a few things to be able to access our camera. Include the following files and add the two delegates below:
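At the top of your view controller, that looks roughly like this (kUTTypeMovie lives in MobileCoreServices, and UIImagePickerController requires its delegate to adopt both protocols):

```objectivec
#import <MobileCoreServices/MobileCoreServices.h>
#import <AVFoundation/AVFoundation.h>

@interface ViewController () <UIImagePickerControllerDelegate,
                              UINavigationControllerDelegate>
@end
```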
Next, ctrl+drag the Upload button you created in the storyboard to create both a property and Touch Up Inside event. We’ll also add a property for a UIImagePickerController — a class that will manage our camera and photo album.
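Your class extension should now contain something like this (the property names are my choices; use whatever you typed during the ctrl+drag):

```objectivec
@property (weak, nonatomic) IBOutlet UIButton *uploadButton;
@property (strong, nonatomic) UIImagePickerController *imagePickerController;
```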
We’ll add the following code to our event outlet:
So, when a user touches the upload button:
- We’ll initialize our UIImagePickerController.
- Then we’ll set it up with a mediaType: in our case we’re only going to allow users to upload or take a video (kUTTypeMovie).
- We’ll set the delegate to ourself (remember that UIImagePickerControllerDelegate earlier?) so that we’ll get a callback whenever the user finishes.
- Then we’ll create a UIAlertController. This will present an action sheet that will come from the bottom up and allow the user to pick three options: Cancel, Camera, and Photo Album.
- In our handler for the camera, we’ll set the source to UIImagePickerControllerSourceTypeCamera, and then present the UIImagePickerController. Similarly, we’ll set the source type to UIImagePickerControllerSourceTypePhotoLibrary in the photo album’s handler before presenting.
(If you are using this in production, don’t forget to verify for permissions and add any additional error handling here!)
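Putting those steps together, the action might look like this (uploadButtonPressed: is whatever you named your Touch Up Inside action; as noted above, permission checks are left out):

```objectivec
- (IBAction)uploadButtonPressed:(id)sender {
    // Steps 1–3: create and configure the picker.
    self.imagePickerController = [[UIImagePickerController alloc] init];
    self.imagePickerController.mediaTypes = @[(NSString *)kUTTypeMovie];
    self.imagePickerController.delegate = self;

    // Step 4: action sheet with the three options.
    UIAlertController *alert =
        [UIAlertController alertControllerWithTitle:nil
                                            message:nil
                                     preferredStyle:UIAlertControllerStyleActionSheet];

    [alert addAction:[UIAlertAction actionWithTitle:@"Camera"
                                              style:UIAlertActionStyleDefault
                                            handler:^(UIAlertAction *action) {
        self.imagePickerController.sourceType = UIImagePickerControllerSourceTypeCamera;
        [self presentViewController:self.imagePickerController animated:YES completion:nil];
    }]];

    [alert addAction:[UIAlertAction actionWithTitle:@"Photo Album"
                                              style:UIAlertActionStyleDefault
                                            handler:^(UIAlertAction *action) {
        self.imagePickerController.sourceType = UIImagePickerControllerSourceTypePhotoLibrary;
        [self presentViewController:self.imagePickerController animated:YES completion:nil];
    }]];

    [alert addAction:[UIAlertAction actionWithTitle:@"Cancel"
                                              style:UIAlertActionStyleCancel
                                            handler:nil]];

    [self presentViewController:alert animated:YES completion:nil];
}
```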
Go ahead and run it (you’ll need to run it on your own device, since the simulator doesn’t have a camera or videos in its library). You should be able to press the Upload button > pick Camera (and only see an option to record) or pick Photo Album (and only see videos in your local photo album). Select or take a video and voila!…nothing happens. That’s because we didn’t set a callback for our UIImagePickerController! Let’s do that now.
The delegate method didFinishPickingMediaWithInfo will be called after a user selects a video from the album or presses “Choose” after taking their own video. We’ll need to dismiss the view controller ourselves in this callback. We’ll then need to get the local URL for this video from the dictionary sent by the controller. For now let’s just print it in the console to verify everything is working. Go ahead and test it out now.
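A minimal version of that callback (UIImagePickerControllerMediaURL is the info-dictionary key holding the local URL of the picked video):

```objectivec
- (void)imagePickerController:(UIImagePickerController *)picker
didFinishPickingMediaWithInfo:(NSDictionary *)info {
    // We're responsible for dismissing the picker ourselves.
    [self dismissViewControllerAnimated:YES completion:nil];

    NSURL *videoURL = info[UIImagePickerControllerMediaURL];
    NSLog(@"Picked video at %@", videoURL);
}
```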
3. Create the Player
Head back to the storyboard. Drag a button on to the screen, and like the upload button, align it horizontally and this time pin it to the bottom of the view. Set its title to “Play” and check the box for “Hidden.”
Ctrl drag the play button to your view controller to create a property and an action. While you’re at it, drag the JRMPlayerView into the controller to create a property for it as well. We’ll need these for later.
Now head back to the didFinishPickingMediaWithInfo method. We’ll need to create an AVPlayerItem using the url of the chosen video. The AVPlayerItem class “represents the presentation state of an asset that’s played by an AVPlayer object, and lets you observe that state.” As briefly mentioned before, an AVPlayer is Apple’s way of controlling media playback. Its AVPlayerLayer (which we’ve set up in our JRMPlayerView class) displays the visual content of that playback. We’ll need to create two more properties: one for the player item and the other for the player:
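In the class extension, those two properties might look like:

```objectivec
@property (strong, nonatomic) AVPlayerItem *playerItem;
@property (strong, nonatomic) AVPlayer *player;
```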
Then we’ll initialize both with the URL that came back from the image picker, and then set the player for our JRMPlayerView that we set up earlier:
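Inside imagePickerController:didFinishPickingMediaWithInfo:, that could look like this (playerView, playButton, and uploadButton are the outlets from earlier; the last two lines swap the buttons’ visibility):

```objectivec
NSURL *videoURL = info[UIImagePickerControllerMediaURL];
self.playerItem = [AVPlayerItem playerItemWithURL:videoURL];
self.player = [AVPlayer playerWithPlayerItem:self.playerItem];

// Hand the player to our custom view's AVPlayerLayer.
self.playerView.player = self.player;

self.uploadButton.hidden = YES;
self.playButton.hidden = NO;
```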
Go ahead and run it and check it out. After choosing, the play button should be visible, the upload button hidden, and you should see a still image of the first frame of your video. Now let’s get it to play!
4. Pressing play
Unfortunately, Apple doesn’t make it easy to tell whether an AVPlayer is playing or not. So we’re going to create a BOOL property called isPlaying and keep track of it manually (you can see other ways of doing it in this stackoverflow post). Head back to your play/pause button’s IBAction:
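With the isPlaying property declared, the action could be as simple as (playButtonPressed: is whatever you named the action):

```objectivec
@property (nonatomic) BOOL isPlaying;

// ...

- (IBAction)playButtonPressed:(id)sender {
    if (self.isPlaying) {
        [self.player pause];
        [self.playButton setTitle:@"Play" forState:UIControlStateNormal];
    } else {
        [self.player play];
        [self.playButton setTitle:@"Pause" forState:UIControlStateNormal];
    }
    self.isPlaying = !self.isPlaying;
}
```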
It’s pretty simple, really. Since we’ve already got the player all set up, all we need to do is play or pause the player and change the title of the button. Check it out! Here is me testing my play and pause with my adorable dog, Demeco.
But you say, “Caroline this is great, but when it gets to the end of the video the button still says ‘Pause’ and I can’t play it again!” Well don’t get in a tizzy — we’re about to change all that.
The AVPlayer does not provide a delegate protocol, but its player item does post notifications through NSNotificationCenter. Update your didFinishPickingMediaWithInfo to add your view controller as an observer of the AVPlayerItemDidPlayToEndTimeNotification notification.
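That registration is one call (itemDidFinishPlaying: is the method we create in the next step):

```objectivec
// In imagePickerController:didFinishPickingMediaWithInfo:,
// after creating self.playerItem:
[[NSNotificationCenter defaultCenter] addObserver:self
                                         selector:@selector(itemDidFinishPlaying:)
                                             name:AVPlayerItemDidPlayToEndTimeNotification
                                           object:self.playerItem];
```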
Create the method itemDidFinishPlaying — this will be called when the player reaches the end of its file. Inside this method we’ll simply seek back to the start of the player, mark our isPlaying BOOL, and reset the button. Don’t forget to remove the observer in your dealloc method as well.
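Those three steps, plus the dealloc cleanup, might look like:

```objectivec
- (void)itemDidFinishPlaying:(NSNotification *)notification {
    // Jump back to the first frame and reset our manual state.
    [self.player seekToTime:kCMTimeZero];
    self.isPlaying = NO;
    [self.playButton setTitle:@"Play" forState:UIControlStateNormal];
}

- (void)dealloc {
    [[NSNotificationCenter defaultCenter] removeObserver:self];
}
```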
Now you should be able to play your video all the way through, and replay from the beginning. At this point, you have a functional video player! You’re awesome. You can stop here or keep going to learn how to add a slider and a time label.
5. Creating a seeking slider
If you want to really add some jazz to your video player, you’d allow the user to seek through playback — so that’s exactly what we’re going to do. Drag out a slider on to your view controller in Storyboard (make sure it doesn’t accidentally get added as a subview of your player view). Pin it to the left and right of its superview, and to the top of the Play button.
Go ahead and set its current value to 0 and check the “Hidden” box. Ctrl+drag twice onto your view controller’s .m file to create a property as well as a “valueDidChange” action.
Next we need to unhide the slider and set its length (the slider’s maximum value) after the video is finished uploading. So add the following two lines to the end of your imagePickerController:didFinishPickingMediaWithInfo method:
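Something like the following (the slider property name is my assumption; I’m reading the duration off the item’s underlying asset, since it’s available immediately):

```objectivec
self.slider.hidden = NO;
// Scale the slider in seconds: 0 ... total duration of the video.
self.slider.maximumValue = CMTimeGetSeconds(self.playerItem.asset.duration);
```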
Next we’re going to implement our slider’s valueDidChange action. Add one simple line:
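The one line is a seekToTime: call (the surrounding action is whatever ctrl+drag generated for you):

```objectivec
- (IBAction)sliderValueChanged:(id)sender {
    [self.player seekToTime:CMTimeMakeWithSeconds(self.slider.value, NSEC_PER_SEC)
            toleranceBefore:kCMTimeZero
             toleranceAfter:kCMTimeZero];
}
```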
Alright, so I know CMTime can be scary, so let’s walk through this. The CMTimeMakeWithSeconds takes two values: the seconds (duh), and a timescale. A timescale?? It’s essentially a resolution. You’re storing time as a rational number — value/timescale = seconds. Using NSEC_PER_SEC (nanoseconds per second) essentially means max resolution. If you’re still confused, you can see a better explanation and discussion here.
The toleranceBefore:toleranceAfter parameters will simply allow for more “detailed” scrubbing. See Apple’s explanation below:
“The time seeked to will be within the range [time-toleranceBefore, time+toleranceAfter] and may differ from the specified time for efficiency. Pass kCMTimeZero for both toleranceBefore and toleranceAfter to request sample accurate seeking which may incur additional decoding delay.”
Build and run your app and you should be able to scrub through your video.
The last part of the slider is keeping it going with the play/pause button. For this we’re going to need a timer. Go ahead and create a new property:
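The property is just:

```objectivec
@property (strong, nonatomic) NSTimer *timer;
```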
We’re going to initialize and invalidate it in the play/pause IBAction:
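These are the two additions to the action from step 4 (updateSlider is the method we write next, hence the compiler warning mentioned below):

```objectivec
// In the play branch of the play/pause action:
self.timer = [NSTimer scheduledTimerWithTimeInterval:0.1
                                              target:self
                                            selector:@selector(updateSlider)
                                            userInfo:nil
                                             repeats:YES];

// In the pause branch:
[self.timer invalidate];
```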
These lines simply create a timer that calls the updateSlider method every .1 seconds, and then invalidates (i.e. stops) it when the video player is paused. Let’s get rid of that warning and add our updateSlider method:
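The method itself is one line:

```objectivec
- (void)updateSlider {
    // The timer fires every 0.1s, and the slider is on a scale of seconds.
    self.slider.value = self.slider.value + 0.1;
}
```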
It’s pretty simple. We’re simply adding the interval of our timer (.1 seconds) to our slider’s value (remember the slider is on a scale of seconds).
The last bit we’ll need to do is add two lines to the end of our itemDidFinishPlaying method. This will 1) invalidate the timer and 2) set the value of the slider back to zero.
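That is:

```objectivec
// At the end of itemDidFinishPlaying:
[self.timer invalidate];
self.slider.value = 0;
```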
Check it out! You can now play, pause, and scrub your way through the hundreds of videos of your dog. Or…you know, something else (if you have a life).
6. Creating a duration label
Last but not least, I’ll be showing you how to create a current time label. Drag a view onto the storyboard and set its alpha to .5. Drag a label into the view and pin it to all four edges. Then pin its container view to the top and to the right like so:
Next, unhide the timeLabelContainerView in your didFinishPickingMediaWithInfo method like you did with the play/pause button and the slider. Then let’s create an updateTimeLabel method. We’ll use a helper method to format the seconds from our player’s currentTime value into a readable time label.
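One way to write the pair (timeLabel is my assumed outlet name for the label inside the container view; the helper formats a second count as mm:ss):

```objectivec
- (void)updateTimeLabel {
    double seconds = CMTimeGetSeconds(self.player.currentTime);
    self.timeLabel.text = [self formattedTimeFromSeconds:seconds];
}

- (NSString *)formattedTimeFromSeconds:(double)totalSeconds {
    int minutes = (int)totalSeconds / 60;
    int seconds = (int)totalSeconds % 60;
    return [NSString stringWithFormat:@"%02d:%02d", minutes, seconds];
}
```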
Let’s also update our imagePickerController:didFinishPickingMediaWithInfo method to show the total duration of the video when it’s first uploaded:
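For example (again reading the duration from the item’s asset, and formatting it inline as mm:ss):

```objectivec
// At the end of imagePickerController:didFinishPickingMediaWithInfo:
self.timeLabelContainerView.hidden = NO;
double duration = CMTimeGetSeconds(self.playerItem.asset.duration);
self.timeLabel.text = [NSString stringWithFormat:@"%02d:%02d",
                       (int)duration / 60, (int)duration % 60];
```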
All that’s left is to call [self updateTimeLabel]; in both your updateSlider method as well as your sliderValueChanged method. Piece of cake!
Here’s the final app and a video of a squirrel getting the better of Demeco (this project made me realize I might be a crazy dog lady…):
So that’s it! You can now make a fully functioning video player for your apps. You can find the full project on my Github page. I hope you enjoyed this intro to Apple’s AVPlayer. Keep a lookout for the part two coming in the next few months where you’ll learn how to mark up the video and export it back to your photo albums.
Caroline Harrison is a computer science graduate from the University of Texas at Austin and an iOS developer here at Jackrabbit Mobile.