How do I add overlay text to a video and then re-encode it?

Date: 2022-01-07 06:23:33

I want to edit video from my iOS application. I want to put some text on the source video as language subtitles, and then save the video with that text overlaid. The text is not just for display purposes; when I open the edited video, it should show the text rendered into it.

Is this possible in an iOS application? If so, how?

3 Answers

#1 (score: 9)

- (void)addAnimation
{       
    NSString *filePath = [[NSBundle mainBundle] pathForResource:videoName ofType:ext]; // videoName / ext are assumed to name the bundled source movie

    AVURLAsset* videoAsset = [[AVURLAsset alloc]initWithURL:[NSURL fileURLWithPath:filePath]  options:nil];

    AVMutableComposition* mixComposition = [AVMutableComposition composition];

    AVMutableCompositionTrack *compositionVideoTrack = [mixComposition addMutableTrackWithMediaType:AVMediaTypeVideo preferredTrackID:kCMPersistentTrackID_Invalid];

    AVAssetTrack *clipVideoTrack = [[videoAsset tracksWithMediaType:AVMediaTypeVideo] objectAtIndex:0];

    [compositionVideoTrack insertTimeRange:CMTimeRangeMake(kCMTimeZero, videoAsset.duration) ofTrack:clipVideoTrack atTime:kCMTimeZero error:nil];

    [compositionVideoTrack setPreferredTransform:[[[videoAsset tracksWithMediaType:AVMediaTypeVideo] objectAtIndex:0] preferredTransform]];

    CGSize videoSize = [clipVideoTrack naturalSize];

    UIImage *myImage = [UIImage imageNamed:@"29.png"];
    CALayer *aLayer = [CALayer layer];
    aLayer.contents = (id)myImage.CGImage;
    aLayer.frame = CGRectMake(videoSize.width - 65, videoSize.height - 75, 57, 57);
    aLayer.opacity = 0.65;
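
    // Build the layer tree the export will render: video frames are composited
    // into videoLayer, while any sublayers stacked above it (this watermark image
    // and the subtitle text added below) get burned into the exported movie.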
    CALayer *parentLayer = [CALayer layer];
    CALayer *videoLayer = [CALayer layer];
    parentLayer.frame = CGRectMake(0, 0, videoSize.width, videoSize.height);
    videoLayer.frame = CGRectMake(0, 0, videoSize.width, videoSize.height);
    [parentLayer addSublayer:videoLayer];
    [parentLayer addSublayer:aLayer];

    CATextLayer *titleLayer = [CATextLayer layer];
    titleLayer.string = @"Text goes here";
    titleLayer.font = (__bridge CFTypeRef)@"Helvetica"; // CFBridgingRetain here would leak; the layer retains the font name itself
    titleLayer.fontSize = videoSize.height / 6;
    //?? titleLayer.shadowOpacity = 0.5;
    titleLayer.alignmentMode = kCAAlignmentCenter;
    titleLayer.bounds = CGRectMake(0, 0, videoSize.width, videoSize.height / 6); //You may need to adjust this for proper display
    [parentLayer addSublayer:titleLayer]; //ONLY IF WE ADDED TEXT

    AVMutableVideoComposition* videoComp = [AVMutableVideoComposition videoComposition];
    videoComp.renderSize = videoSize;
    videoComp.frameDuration = CMTimeMake(1, 30);
    videoComp.animationTool = [AVVideoCompositionCoreAnimationTool videoCompositionCoreAnimationToolWithPostProcessingAsVideoLayer:videoLayer inLayer:parentLayer];

    AVMutableVideoCompositionInstruction *instruction = [AVMutableVideoCompositionInstruction videoCompositionInstruction];
    instruction.timeRange = CMTimeRangeMake(kCMTimeZero, [mixComposition duration]);
    AVAssetTrack *videoTrack = [[mixComposition tracksWithMediaType:AVMediaTypeVideo] objectAtIndex:0];
    AVMutableVideoCompositionLayerInstruction* layerInstruction = [AVMutableVideoCompositionLayerInstruction videoCompositionLayerInstructionWithAssetTrack:videoTrack];
    instruction.layerInstructions = [NSArray arrayWithObject:layerInstruction];
    videoComp.instructions = [NSArray arrayWithObject: instruction];

    AVAssetExportSession *assetExport = [[AVAssetExportSession alloc] initWithAsset:mixComposition presetName:AVAssetExportPresetHighestQuality];//AVAssetExportPresetPassthrough
    assetExport.videoComposition = videoComp;

    NSArray *paths = NSSearchPathForDirectoriesInDomains(NSDocumentDirectory, NSUserDomainMask, YES);
    NSString *documentsDirectory = [paths objectAtIndex:0];
    NSString* VideoName = [NSString stringWithFormat:@"%@/mynewwatermarkedvideo.mov",documentsDirectory]; // .mov extension to match AVFileTypeQuickTimeMovie set below


    //NSString *exportPath = [NSTemporaryDirectory() stringByAppendingPathComponent:VideoName];
    NSURL *exportUrl = [NSURL fileURLWithPath:VideoName];

    if ([[NSFileManager defaultManager] fileExistsAtPath:VideoName])
    {
        [[NSFileManager defaultManager] removeItemAtPath:VideoName error:nil];
    }

    assetExport.outputFileType = AVFileTypeQuickTimeMovie;
    assetExport.outputURL = exportUrl;
    assetExport.shouldOptimizeForNetworkUse = YES;

    //[strRecordedFilename setString: exportPath];

    [assetExport exportAsynchronouslyWithCompletionHandler:
     ^(void ) {
         dispatch_async(dispatch_get_main_queue(), ^{
             [self exportDidFinish:assetExport];
         });
     }
     ];
}

-(void)exportDidFinish:(AVAssetExportSession*)session
{
    NSURL *exportUrl = session.outputURL;
    ALAssetsLibrary *library = [[ALAssetsLibrary alloc] init];

    if ([library videoAtPathIsCompatibleWithSavedPhotosAlbum:exportUrl])
    {
        [library writeVideoAtPathToSavedPhotosAlbum:exportUrl completionBlock:^(NSURL *assetURL, NSError *error)
         {
             dispatch_async(dispatch_get_main_queue(), ^{
                 if (error) {
                     UIAlertView *alert = [[UIAlertView alloc] initWithTitle:@"Error" message:@"Video Saving Failed"
                                                                    delegate:nil cancelButtonTitle:@"OK" otherButtonTitles:nil];
                     [alert show];
                 } else {
                     UIAlertView *alert = [[UIAlertView alloc] initWithTitle:@"Video Saved" message:@"Saved To Photo Album"
                                                                    delegate:self cancelButtonTitle:@"OK" otherButtonTitles:nil];
                     [alert show];
                 }
             });
         }];

    }
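
    // Note: the log and alert below run immediately, before the asynchronous
    // save to the photo album above has finished.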
    NSLog(@"Completed");
    UIAlertView *alert = [[UIAlertView alloc] initWithTitle:@"AlertView" message:@"Video is edited successfully." delegate:self cancelButtonTitle:@"OK" otherButtonTitles:nil];
    [alert show];
}
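
ALAssetsLibrary and UIAlertView used above are long deprecated; on current iOS the save step is normally done with the Photos framework instead. A minimal sketch of that replacement (the method name saveEditedVideoAtURL: is just for illustration, and the app needs photo-library add permission, i.e. NSPhotoLibraryAddUsageDescription in Info.plist):

#import <Photos/Photos.h>

- (void)saveEditedVideoAtURL:(NSURL *)outputURL
{
    // Ask the Photos library to copy the exported file into the user's library.
    [[PHPhotoLibrary sharedPhotoLibrary] performChanges:^{
        [PHAssetChangeRequest creationRequestForAssetFromVideoAtFileURL:outputURL];
    } completionHandler:^(BOOL success, NSError * _Nullable error) {
        dispatch_async(dispatch_get_main_queue(), ^{
            NSLog(@"Saved to Photos: %d, error: %@", success, error);
        });
    }];
}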

#2 (score: 2)

One way is to create your text overlay as a CoreAnimation CATextLayer, attach it to an AVAssetExportSession's videoComposition, then export your video. The resulting video will have the overlay rendered onto it.
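
In essence the wiring is just a few lines (a condensed sketch; answer #1 above is a complete implementation, and renderSize / videoComposition / exportSession here are placeholder names for whatever you already have):

// Layer tree: video frames land in videoLayer, the text layer sits on top of it.
CATextLayer *subtitleLayer = [CATextLayer layer];
subtitleLayer.string = @"Text goes here";
subtitleLayer.frame = CGRectMake(0, 0, renderSize.width, 60);

CALayer *parentLayer = [CALayer layer];
CALayer *videoLayer = [CALayer layer];
parentLayer.frame = CGRectMake(0, 0, renderSize.width, renderSize.height);
videoLayer.frame = parentLayer.frame;
[parentLayer addSublayer:videoLayer];
[parentLayer addSublayer:subtitleLayer];

// Attach the layer tree to the video composition and hand it to the export session.
videoComposition.animationTool =
    [AVVideoCompositionCoreAnimationTool videoCompositionCoreAnimationToolWithPostProcessingAsVideoLayer:videoLayer
                                                                                                  inLayer:parentLayer];
exportSession.videoComposition = videoComposition;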

This brings some benefits:

  1. You don't have to stop at CATextLayer - you can construct CALayer trees containing CAGradientLayer, CAShapeLayer, whatever.
  2. Being Core Animation layers, many of their properties are animatable, so you get smooth, iOS-style animations in your video for free (see the sketch after this list).
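
One caveat for animated overlays: during an export the animations are driven by the video timeline, and Core Animation treats a beginTime of 0 as "now", so use AVCoreAnimationBeginTimeAtZero to mean "start of the video". A small sketch, assuming the titleLayer from answer #1:

CABasicAnimation *fadeIn = [CABasicAnimation animationWithKeyPath:@"opacity"];
fadeIn.fromValue = @0.0;
fadeIn.toValue = @1.0;
fadeIn.beginTime = AVCoreAnimationBeginTimeAtZero; // 0 would be interpreted as "current time"
fadeIn.duration = 1.0;
fadeIn.fillMode = kCAFillModeForwards;
fadeIn.removedOnCompletion = NO; // keep the final opacity after the animation ends
[titleLayer addAnimation:fadeIn forKey:@"fadeIn"];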

Sounds great, right? There is one little side effect: depending on the export preset you use, your video will inevitably be re-encoded at a constant framerate - for me it was 30fps. To keep file sizes small, I'd deliberately lowered my framerate by omitting redundant frames, so for the sake of a static banner, this was a dealbreaker for me.
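
For what it's worth, you can at least choose what that constant output rate is through the composition's frameDuration (a one-line sketch, using the videoComp from answer #1); it won't restore variable frame timing, though:

// frameDuration is the inverse of the output frame rate:
// CMTimeMake(1, 30) means 30 fps; CMTimeMake(1, 15) means 15 fps.
videoComp.frameDuration = CMTimeMake(1, 15);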

There is some Apple sample code called AVEditDemo that demonstrates this feature, among other things. There are instructions for finding it here.

#3 (score: 1)

Using Chaitali Jain's code, the new videos are saved without audio. Does anyone have an idea about this issue? Thanks!
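
A hedged guess at the cause: the composition in answer #1 only inserts the video track, so the source audio never makes it into the export. The usual fix is to copy the audio track into the composition as well; a sketch, reusing the mixComposition / videoAsset names from that answer and assuming the source actually contains an audio track:

// Copy the source audio into the composition alongside the video track.
if ([[videoAsset tracksWithMediaType:AVMediaTypeAudio] count] > 0) {
    AVAssetTrack *sourceAudioTrack = [[videoAsset tracksWithMediaType:AVMediaTypeAudio] objectAtIndex:0];
    AVMutableCompositionTrack *compositionAudioTrack =
        [mixComposition addMutableTrackWithMediaType:AVMediaTypeAudio
                                    preferredTrackID:kCMPersistentTrackID_Invalid];
    NSError *audioError = nil;
    [compositionAudioTrack insertTimeRange:CMTimeRangeMake(kCMTimeZero, videoAsset.duration)
                                   ofTrack:sourceAudioTrack
                                    atTime:kCMTimeZero
                                     error:&audioError];
}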
