iPhone Life magazine

Coding tips: creating PNG (JPG) and QuickTime files – a benchmark video generator for iOS + OS X

UPDATE (09/12/2012): I've created two truly 1080p versions of the counter video. They're HERE (non-streaming-optimized) and HERE (optimized for streaming with Subler). The former has an H.264 level of 4.2 (the latter is 4.1) and, consequently, can't be played in the Web browser of iOS devices. (Neither 4.1 nor 4.2 1080p60 videos can be synced to iOS devices directly. Playback from third-party apps using hardware decoding works, of course.) HERE's a 4.1-level version of the same file.

Original article:

 

For my forthcoming article on all things video playback (also see a related article HERE [backup mirror HERE]), I've decided to make some serious video benchmarks similar to what I did back in the Windows Mobile days when evaluating H.264 codecs (see my H.264 bible [alternative backup links 1, 2, 3, 4 – use them if you want non-messed-up UTF-8 encoding and want to see the pictures!]) or when comparing device controller apps (first article HERE [backup links 1, 2]).

The current Mac-based screen capture tools (at least the ones I've tested: the free, Web-based Screencast-o-matic and the latest ScreenFlow) are just too slow for creating a 30 fps test video (let alone a 60 fps one!) that I could later convert to different video formats for objective benchmarks, and I haven't found anything similar ready-made (just a video file with a big counter running at 30/60 frames per second). Therefore, I've decided to create the video strictly programmatically, without any help from a screen recorder tool or, even worse, a physical camera. Fortunately, it has turned out to be as easy a task as creating a counter in C# (see the source code HERE) was back in the Windows Mobile days.

iOS on-screen counter

If you take a look at my C# source and/or the screen output it produces, you'll already guess that I've made an attempt at re-creating it for iOS as a native app. As has already been explained, in the end I haven't made much use of it, as the screen recorders simply aren't up to the task of recording this video and recording the screen with a physical camera wouldn't have been ideal or even scientific. Nevertheless, if you would like to know what the native iOS equivalent of my old C# counter looks like, it's very, very simple: it's just an iPad View-Based Application with a big UILabel occupying almost the entire screen surface, with a text size of 300 points. The counter in the label is increased every 1/60th of a second by a recursive method in the View Controller, shown in the listing below, where label is declared (and synthesized) as @property (retain) IBOutlet UILabel* label; and the method is first called ([self incCounter];) from - (void)viewDidLoad. All this is available as a full iOS project HERE.
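
Here's that recursive method as a separate, formatted listing:

-(void)incCounter
{
    static int cnt = 0;
    // Increase the counter and display it in the big label
    label.text = [NSString stringWithFormat:@"%i", ++cnt];
    // Schedule the next increase in 1/60th of a second
    [self performSelector:@selector(incCounter) withObject:nil afterDelay:1.0/60];
}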

Because of the OS X screen recorder problems I've outlined above, I quickly abandoned the project and went with generating the movie file programmatically.

Programmatically generating a movie file

First of all, easily generating a movie file is only possible under OS X, via QTKit. This is the easiest way of producing decent, standards-compliant output without using third-party code or having to learn the intricacies of a video format well enough to create the files bit-by-bit yourself. Using QTKit is explained in section “Creating an empty movie and adding images to it” in Technical Note TN2138 - QTKit Frequently Asked Questions. An example, QTKitCreateMovie, shows how 60 JPG images stored in the bundle are added to the movie. In the app, the QTMovie extension QTMovieExtensions does the real work: its addImagesAsMPEG4 method uses QTMovie's addImage to add the JPG files, and apart from that, all it does locally is iterate over the JPG paths passed in as an NSArray. Note that the caller of addImagesAsMPEG4 doesn't have the number of actual images wired in; it uses NSArray *imagesArray = [[NSBundle mainBundle] pathsForResourcesOfType:@"jpg" inDirectory:nil]; to populate the array. Of particular interest is timeValue, which (together with the time scale of 600) sets the frame rate: the default of 30 results in 20 fps, 20 results in 30 fps (this is what we'll need for 30 fps videos) and 100 in 6 fps. This means that, should you want to make a video of your own JPG files (instead of the ones shipped in the bundle), all you would need to do is delete the old JPG files from the bundle and copy yours there instead. That is, apart from changing the above-mentioned timeValue (should you want to change the frame rate), little needs to be done to fine-tune the source.
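
To make the frame-rate arithmetic explicit: each image is added for a duration of timeValue/timeScale seconds, so with the 600-unit time scale the mapping looks like this (QTMakeTime is the same call the command-line tool later in this article uses):

// With timeScale = 600, each frame lasts timeValue/600 seconds:
//   timeValue = 30  ->  30/600 s per frame -> 20 fps
//   timeValue = 20  ->  20/600 s per frame -> 30 fps
//   timeValue = 10  ->  10/600 s per frame -> 60 fps
//   timeValue = 100 -> 100/600 s per frame ->  6 fps
QTTime duration = QTMakeTime(timeValue, timeScale); // e.g. QTMakeTime(20, 600) for 30 fps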

However, as you'll see, when there are more than a handful (say, 100-200) of source images, this example project becomes exceedingly slow. That is, don't start throwing more than a hundred images at it; otherwise, it'll just stall. Read on to see how it should be rewritten. First, however, I'll explain how the source image files themselves need to be created programmatically so that you don't need to create them in an image editor tool, number-by-number.

Creating source counter images

Programmatically creating these files under iOS (and then just copying them to the bundle of the above-introduced QTKitCreateMovie example) is done the following way. First, saving an existing, filled-in UIImage to a PNG file takes just two lines:

NSString *pngPath = [NSHomeDirectory() stringByAppendingPathComponent:@"Documents/Test.png"];
[UIImagePNGRepresentation(image) writeToFile:pngPath atomically:YES];

That is, all we need to do is create a UIImage with the right content. This is easy. To copy the contents of the current view into a UIImage, use the following instructions:

UIGraphicsBeginImageContext([self.view bounds].size);
[[self.view layer] renderInContext:UIGraphicsGetCurrentContext()];
UIImage *image = UIGraphicsGetImageFromCurrentImageContext();
UIGraphicsEndImageContext();

Now, all you need to do is add the previous instructions to the code that increases the number in the UILabel and save each new screen image to a separate file with a constantly increasing filename suffix. The main worker method of our first application (that is, the one that just presented an increasing counter on-screen but didn't save it to any file) becomes the following:

-(void)incCounter
{
    static int cnt = 0;
    // Increase the counter and display it in the label
    label.text = [NSString stringWithFormat:@"%i", ++cnt];
    // Render the current view into a UIImage
    UIGraphicsBeginImageContext([self.view bounds].size);
    [[self.view layer] renderInContext:UIGraphicsGetCurrentContext()];
    UIImage *image = UIGraphicsGetImageFromCurrentImageContext();
    UIGraphicsEndImageContext();
    // Save the image as a PNG with a constantly increasing filename suffix
    NSString *pngPath = [NSString stringWithFormat:@"/cnt%04d.png", cnt];
    [UIImagePNGRepresentation(image) writeToFile:pngPath atomically:YES];
    // Schedule the next frame in 1/60th of a second
    [self performSelector:@selector(incCounter) withObject:nil afterDelay:1.0/60];
}

It stores the files in the root of the file system. (Feel free to modify the path at NSString *pngPath = [NSString stringWithFormat:@"/cnt%04d.png", cnt]; so that the (numerous) files are stored elsewhere.)
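
If you'd rather collect the numbered images in the app's Documents folder (the same location the single Test.png was written to earlier), a change along these lines should do (treat it as a sketch, not as what the downloadable project ships with):

// Sketch: store the numbered PNGs under the app's Documents directory instead of the root
NSString *docsDir = [NSHomeDirectory() stringByAppendingPathComponent:@"Documents"];
NSString *pngPath = [docsDir stringByAppendingPathComponent:
                        [NSString stringWithFormat:@"cnt%04d.png", cnt]];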

The source of this app is HERE as a full project.

You can safely run the tool to generate any number of files. (In practice, you won't need more than several hundred or a few thousand images for video benchmarking, though.)

A much better QuickTime generator tool

If you generate more than a few hundred source image files, the original QuickTime movie generator application, QTKitCreateMovie, will quickly become useless (don't forget to change all occurrences of JPG to PNG in it so that it reads PNG files, should you want to give it a try!). The sole reason for this is that it tries to cache all the image files in memory (it uses an array to store them and pass them around). Even 100 Mbytes of PNG images will quickly result in the app taking up more than 2 Gbytes of memory, and it'll never finish, as the runtime system / garbage collector will force it to start allocating memory for the same images again and again, forcing the app into an endless cycle.

Therefore, what you need is a barebones OS X application that doesn't cache these source images in memory. Fortunately, it's much easier than you may think if you don't need a menu-based application. And why would you want one? After all, all you need is to convert the files without any human intervention, not a fancy “About” box and several kinds of Open / Save menu items...

First, create a new project of type OS X Application / Command Line Tool. Make sure its type is not the default “C” but “Foundation” so that you can freely access the Objective-C class libraries (and write Objective-C code). Then, make the contents of the main (and only) .m file the following:


#import <Foundation/Foundation.h>
#import <Cocoa/Cocoa.h>
#import <QTKit/QTKit.h>

int main (int argc, const char * argv[]) {
    NSAutoreleasePool * pool = [[NSAutoreleasePool alloc] init];

    // Codec settings for -addImage:forDuration:withAttributes: (MPEG-4 video, high quality)
    NSDictionary *myDict = [NSDictionary dictionaryWithObjectsAndKeys:
                               @"mp4v", QTAddImageCodecType,
                               [NSNumber numberWithLong:codecHighQuality], QTAddImageCodecQuality,
                               nil];

    // Per-frame duration is timeValue/timeScale seconds
    long long timeValue = 20; // 100 = 6 fps, 10 = 60 fps, 20 = 30 fps
    long timeScale = 600;
    QTTime duration = QTMakeTime(timeValue, timeScale);

    // Temporary, writable movie the frames are appended to
    QTMovie *mMovie = [[QTMovie alloc] initToWritableFile:@"tmp30fps.tmp" error:NULL];

    // Load and add the source PNGs one by one, so they're never all held in memory at once
    for (int i = 1; i < 3340; i++) {
        NSURL *fileUrl = [NSURL fileURLWithPath:[NSString stringWithFormat:@"/Users/werner/imgsrcs/cnt%04d.png", i]];
        NSImage *anImage = [[NSImage alloc] initWithContentsOfURL:fileUrl];
        [mMovie addImage:anImage
             forDuration:duration
          withAttributes:myDict];
        [anImage release]; // release the image right away to keep memory usage low
    }

    // Write the result as a flattened (self-contained) movie file
    myDict = [NSDictionary dictionaryWithObject:[NSNumber numberWithBool:YES]
                                         forKey:QTMovieFlatten];
    [mMovie writeToFile:@"30fps-2.mov" withAttributes:myDict];

    [pool drain];
    return 0;
}

Here, change “long long timeValue = 20;” to, e.g., 10 if you need 60 fps video, and so on. Also, feel free to modify the input (here: NSURL *fileUrl = [NSURL fileURLWithPath:[NSString stringWithFormat:@"/Users/werner/imgsrcs/cnt%04d.png", i]];) and the output ([mMovie writeToFile:@"30fps-2.mov" withAttributes:myDict];) file path and/or name. You can also change the name of the temporary file – but it's not strictly needed.

The full project (including the full source) is HERE.

After having created some test videos with the tool, all you need to do is transcode them to any format / resolution you want in order to test playback efficiency and speed. You'll need to resize them, should you want to test video playback at different source resolutions (e.g., 720p, 1080p, 480p, PAL SD, etc.).

Feel free to (re)use the sources for your own projects.

 


Werner Ruotsalainen is an iOS and Java programming lecturer who is well-versed in programming, hacking, operating systems, and programming languages. Werner tries to generate unique articles on subjects not widely discussed. Some of his articles are highly technical and are intended for other programmers and coders.

Werner also is interested in photography and videography. He is a frequent contributor to not only mobile and computing publications, but also photo and video forums. He loves swimming, skiing, going to the gym, and using his iPads. English is one of several languages he speaks.