[PATCH] ui/cocoa: capture screencast with AVAssetWriter

Zhang Chen posted 1 patch 2 years, 3 months ago
Test checkpatch failed
Patches applied successfully (tree, apply log)
git fetch https://github.com/patchew-project/qemu tags/patchew/20220111070258.2983-1-tgfbeta@me.com
Maintainers: Gerd Hoffmann <kraxel@redhat.com>, Peter Maydell <peter.maydell@linaro.org>
[PATCH] ui/cocoa: capture screencast with AVAssetWriter
Posted by Zhang Chen 2 years, 3 months ago
To record a screencast, AVAssetWriter APIs are called on each
cocoa_update call.

Commands to start and stop recording are added to the View menu.

AVFoundation, CoreMedia and CoreVideo are added as link
dependencies.

Signed-off-by: Zhang Chen <tgfbeta@me.com>
---
 meson.build    |   6 +++
 ui/cocoa.m     | 132 +++++++++++++++++++++++++++++++++++++++++++++++--
 ui/meson.build |   1 +
 3 files changed, 136 insertions(+), 3 deletions(-)
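
For reviewers unfamiliar with AVFoundation, the recording flow used below
boils down to roughly the following sketch (illustration only: the
standalone helper names are not part of the patch, and error handling is
omitted):

#import <AVFoundation/AVFoundation.h>

static AVAssetWriter *writer;
static AVAssetWriterInput *input;
static AVAssetWriterInputPixelBufferAdaptor *adaptor;

/* Create the writer, attach one H.264 video input and start the session. */
static void capture_start(NSURL *url, int width, int height)
{
    NSError *err = nil;
    writer = [[AVAssetWriter alloc] initWithURL:url
                                       fileType:AVFileTypeMPEG4
                                          error:&err];
    input = [[AVAssetWriterInput alloc]
        initWithMediaType:AVMediaTypeVideo
           outputSettings:@{ AVVideoCodecKey: AVVideoCodecTypeH264,
                             AVVideoWidthKey: @(width),
                             AVVideoHeightKey: @(height) }];
    [writer addInput:input];
    adaptor = [[AVAssetWriterInputPixelBufferAdaptor alloc]
        initWithAssetWriterInput:input sourcePixelBufferAttributes:nil];
    [writer startWriting];
    [writer startSessionAtSourceTime:kCMTimeZero];
}

/* Append one frame; driven from cocoa_update whenever the guest display
 * changes. */
static void capture_frame(CVPixelBufferRef frame, double seconds)
{
    if ([input isReadyForMoreMediaData]) {
        [adaptor appendPixelBuffer:frame
              withPresentationTime:CMTimeMakeWithSeconds(seconds, 1000000)];
    }
}

/* Close the session and finalize the MP4 file. */
static void capture_stop(double seconds)
{
    [input markAsFinished];
    [writer endSessionAtSourceTime:CMTimeMakeWithSeconds(seconds, 1000000)];
    [writer finishWritingWithCompletionHandler:^{ /* file written */ }];
}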

diff --git a/meson.build b/meson.build
index 886f0a9343..70c38c4135 100644
--- a/meson.build
+++ b/meson.build
@@ -281,6 +281,9 @@ emulator_link_args = []
 nvmm = not_found
 hvf = not_found
 host_dsosuf = '.so'
+avfoundation = []
+coremedia = []
+corevideo = []
 if targetos == 'windows'
   socket = cc.find_library('ws2_32')
   winmm = cc.find_library('winmm')
@@ -292,6 +295,9 @@ if targetos == 'windows'
   host_dsosuf = '.dll'
 elif targetos == 'darwin'
   coref = dependency('appleframeworks', modules: 'CoreFoundation')
+  avfoundation = dependency('appleframeworks', modules: 'AVFoundation')
+  coremedia = dependency('appleframeworks', modules: 'CoreMedia')
+  corevideo = dependency('appleframeworks', modules: 'CoreVideo')
   iokit = dependency('appleframeworks', modules: 'IOKit', required: false)
   host_dsosuf = '.dylib'
 elif targetos == 'sunos'
diff --git a/ui/cocoa.m b/ui/cocoa.m
index 69745c483b..6a0fc24414 100644
--- a/ui/cocoa.m
+++ b/ui/cocoa.m
@@ -25,6 +25,7 @@
 #include "qemu/osdep.h"
 
 #import <Cocoa/Cocoa.h>
+#import <AVFoundation/AVFoundation.h>
 #include <crt_externs.h>
 
 #include "qemu-common.h"
@@ -309,6 +310,12 @@ static void handleAnyDeviceErrors(Error * err)
     BOOL isMouseGrabbed;
     BOOL isFullscreen;
     BOOL isAbsoluteEnabled;
+    AVAssetWriter *capture;
+    AVAssetWriterInput *captureInput;
+    AVAssetWriterInputPixelBufferAdaptor *captureInputAdaptor;
+    BOOL isCapturing;
+    NSDate *captureStart;
+    CVPixelBufferRef captureBuffer;
 }
 - (void) switchSurface:(pixman_image_t *)image;
 - (void) grabMouse;
@@ -332,6 +339,9 @@ static void handleAnyDeviceErrors(Error * err)
 - (float) cdy;
 - (QEMUScreen) gscreen;
 - (void) raiseAllKeys;
+- (void) startCapture;
+- (void) stopCapture;
+- (BOOL) isCapturing;
 @end
 
 QemuCocoaView *cocoaView;
@@ -364,6 +374,10 @@ QemuCocoaView *cocoaView;
     [super dealloc];
 }
 
+- (BOOL) isCapturing {
+    return isCapturing;
+}
+
 - (BOOL) isOpaque
 {
     return YES;
@@ -425,6 +439,81 @@ QemuCocoaView *cocoaView;
     [NSCursor unhide];
 }
 
+- (void) startCapture
+{
+    NSError *err;
+
+    NSString *outputPath = [NSString stringWithFormat:@"/tmp/capture_%.1f.mp4", [[NSDate now] timeIntervalSinceReferenceDate]];
+    NSURL *fileURL = [NSURL fileURLWithPath:outputPath];
+    capture = [[AVAssetWriter alloc] initWithURL:fileURL fileType:AVFileTypeMPEG4 error:&err];
+
+    captureInput = [[AVAssetWriterInput alloc] initWithMediaType:AVMediaTypeVideo
+                                                   outputSettings:@{
+        AVVideoCodecKey: AVVideoCodecTypeH264,
+        AVVideoWidthKey: [NSNumber numberWithInt:screen.width],
+        AVVideoHeightKey: [NSNumber numberWithInt:screen.height],
+    }];
+    NSParameterAssert([capture canAddInput:captureInput]);
+    [capture addInput:captureInput];
+    captureInputAdaptor = [[AVAssetWriterInputPixelBufferAdaptor alloc] initWithAssetWriterInput:captureInput sourcePixelBufferAttributes:nil];
+
+    CVReturn bufferStatus = CVPixelBufferCreateWithBytes(NULL,
+                                                         screen.width,
+                                                         screen.height,
+                                                         kCVPixelFormatType_32BGRA,
+                                                         pixman_image_get_data(pixman_image),
+                                                         pixman_image_get_stride(pixman_image),
+                                                         NULL,
+                                                         NULL,
+                                                         NULL,
+                                                         &captureBuffer);
+    if (bufferStatus != kCVReturnSuccess) {
+        NSLog(@"err creating pixel buf: %d", bufferStatus);
+        return;
+    }
+    captureStart = [NSDate new];
+    [capture startWriting];
+    [capture startSessionAtSourceTime:kCMTimeZero];
+    isCapturing = TRUE;
+    [self captureFrame];
+}
+
+- (void) stopCapture
+{
+    if (isCapturing) {
+        isCapturing = FALSE;
+        NSDate *now = [NSDate now];
+        NSTimeInterval interval = [now timeIntervalSinceDate:captureStart];
+        CMTime ts = CMTimeMakeWithSeconds(interval, 1000000);
+        [captureInput markAsFinished];
+        [capture endSessionAtSourceTime:ts];
+        pixman_image_ref(pixman_image);
+        [capture finishWritingWithCompletionHandler:^() {
+            NSLog(@"finishWriting");
+            [captureInputAdaptor release];
+            [captureInput release];
+            CFRelease(captureBuffer);
+            pixman_image_unref(pixman_image);
+        }];
+        [captureStart release];
+    }
+}
+
+- (void) captureFrame
+{
+    if (isCapturing && captureBuffer && [captureInput isReadyForMoreMediaData]) {
+        NSDate *now = [NSDate now];
+        NSTimeInterval interval = [now timeIntervalSinceDate:captureStart];
+        CMTime ts = CMTimeMakeWithSeconds(interval, 1000000);
+        CFRetain(captureBuffer);
+        pixman_image_ref(pixman_image);
+        [captureInputAdaptor appendPixelBuffer:captureBuffer
+                           withPresentationTime:ts];
+        CFRelease(captureBuffer);
+        pixman_image_unref(pixman_image);
+    }
+}
+
 - (void) drawRect:(NSRect) rect
 {
     COCOA_DEBUG("QemuCocoaView: drawRect\n");
@@ -573,6 +662,7 @@ QemuCocoaView *cocoaView;
     bool isResize = (w != screen.width || h != screen.height || cdx == 0.0);
 
     int oldh = screen.height;
+    BOOL needsRestartCapture = isResize && isCapturing;
     if (isResize) {
         // Resize before we trigger the redraw, or we'll redraw at the wrong size
         COCOA_DEBUG("switchSurface: new size %d x %d\n", w, h);
@@ -580,6 +670,7 @@ QemuCocoaView *cocoaView;
         screen.height = h;
         [self setContentDimensions];
         [self setFrame:NSMakeRect(cx, cy, cw, ch)];
+        [self stopCapture];
     }
 
     // update screenBuffer
@@ -588,7 +679,9 @@ QemuCocoaView *cocoaView;
     }
 
     pixman_image = image;
-
+    if (needsRestartCapture) {
+        [self startCapture];
+    }
     // update windows
     if (isFullscreen) {
         [[fullScreenWindow contentView] setFrame:[[NSScreen mainScreen] frame]];
@@ -1117,6 +1210,8 @@ QemuCocoaView *cocoaView;
 - (IBAction) do_about_menu_item: (id) sender;
 - (void)make_about_window;
 - (void)adjustSpeed:(id)sender;
+- (IBAction) startCapture:(id)sender;
+- (IBAction) stopCapture:(id)sender;
 @end
 
 @implementation QemuCocoaAppController
@@ -1175,8 +1270,10 @@ QemuCocoaView *cocoaView;
 {
     COCOA_DEBUG("QemuCocoaAppController: dealloc\n");
 
-    if (cocoaView)
+    if (cocoaView) {
+        [cocoaView stopCapture];
         [cocoaView release];
+    }
     [super dealloc];
 }
 
@@ -1569,6 +1666,23 @@ QemuCocoaView *cocoaView;
     COCOA_DEBUG("cpu throttling at %d%c\n", cpu_throttle_get_percentage(), '%');
 }
 
+- (IBAction) startCapture:(id)sender
+{
+    [sender setEnabled:NO];
+    [cocoaView startCapture];
+    if ([cocoaView isCapturing]) {
+        [[[sender menu] itemWithTitle:@"Stop Capture"] setEnabled: YES];
+    }
+}
+
+- (IBAction) stopCapture:(id)sender
+{
+    [sender setEnabled: NO];
+    [cocoaView stopCapture];
+    if (![cocoaView isCapturing]) {
+        [[[sender menu] itemWithTitle:@"Capture"] setEnabled: YES];
+    }
+}
 @end
 
 @interface QemuApplication : NSApplication
@@ -1623,8 +1737,18 @@ static void create_initial_menus(void)
 
     // View menu
     menu = [[NSMenu alloc] initWithTitle:@"View"];
+    [menu setAutoenablesItems: NO];
     [menu addItem: [[[NSMenuItem alloc] initWithTitle:@"Enter Fullscreen" action:@selector(doToggleFullScreen:) keyEquivalent:@"f"] autorelease]]; // Fullscreen
     [menu addItem: [[[NSMenuItem alloc] initWithTitle:@"Zoom To Fit" action:@selector(zoomToFit:) keyEquivalent:@""] autorelease]];
+    [menu addItem:[NSMenuItem separatorItem]];
+    menuItem = [[[NSMenuItem alloc] initWithTitle:@"Capture" action:@selector(startCapture:) keyEquivalent:@""] autorelease];
+    [menu addItem: menuItem];
+    [menuItem setTag:1200];
+    [menuItem setEnabled: YES];
+    menuItem = [[[NSMenuItem alloc] initWithTitle:@"Stop Capture" action:@selector(stopCapture:) keyEquivalent:@""] autorelease];
+    [menu addItem: menuItem];
+    [menuItem setTag:1201];
+    [menuItem setEnabled: NO];
     menuItem = [[[NSMenuItem alloc] initWithTitle:@"View" action:nil keyEquivalent:@""] autorelease];
     [menuItem setSubmenu:menu];
     [[NSApp mainMenu] addItem:menuItem];
@@ -1962,7 +2086,9 @@ static void cocoa_update(DisplayChangeListener *dcl,
     NSAutoreleasePool * pool = [[NSAutoreleasePool alloc] init];
 
     COCOA_DEBUG("qemu_cocoa: cocoa_update\n");
-
+    if ([cocoaView isCapturing]) {
+        [cocoaView captureFrame];
+    }
     dispatch_async(dispatch_get_main_queue(), ^{
         NSRect rect;
         if ([cocoaView cdx] == 1.0) {
diff --git a/ui/meson.build b/ui/meson.build
index 64286ba150..cef7e90319 100644
--- a/ui/meson.build
+++ b/ui/meson.build
@@ -25,6 +25,7 @@ softmmu_ss.add(when: 'CONFIG_LINUX', if_true: files(
   'udmabuf.c',
 ))
 softmmu_ss.add(when: cocoa, if_true: files('cocoa.m'))
+softmmu_ss.add(when: avfoundation, if_true: [avfoundation, coremedia, corevideo])
 
 vnc_ss = ss.source_set()
 vnc_ss.add(files(
-- 
2.30.2


Re: [PATCH] ui/cocoa: capture screencast with AVAssetWriter
Posted by Peter Maydell 2 years, 3 months ago
On Tue, 11 Jan 2022 at 07:09, Zhang Chen <tgfbeta@me.com> wrote:
>
> To record a screencast, AVAssetWriter APIs are called on each
> cocoa_update call.
>
> Commands to start and stop recording are added to the View menu.

This seems a bit of an odd feature -- why doesn't the OS just
permit screen recording of any application without the app
having to have code for it specifically ?

-- PMM

Re: [PATCH] ui/cocoa: capture screencast with AVAssetWriter
Posted by Chen Zhang 2 years, 3 months ago
Granted, this patch might not be a good fit for the main branch, but I hope this snippet can help someone in need.

The screen-recording feature shipped with macOS does support capturing the screen, but only the whole screen or a selected rectangle, not a selected window the way the screenshot feature does.

Also, its pixels are not sourced from the QEMU display frame buffer, so the recording would be scaled by contentsScale (on Retina screens) and would include the window titlebar.
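
For reference, the recording in this patch is fed straight from the guest frame buffer: the pixman image backing the Cocoa view is wrapped in a CVPixelBuffer without any copy or rescale, roughly like this (simplified from the patch; "adaptor" and "elapsed" are placeholder names, error handling omitted):

/* Wrap the pixman frame buffer in place; no copy, no scaling. */
CVPixelBufferRef frame;
CVPixelBufferCreateWithBytes(kCFAllocatorDefault,
                             pixman_image_get_width(image),
                             pixman_image_get_height(image),
                             kCVPixelFormatType_32BGRA,
                             pixman_image_get_data(image),
                             pixman_image_get_stride(image),
                             NULL, NULL, NULL, &frame);

/* Timestamp is seconds elapsed since the recording session started. */
[adaptor appendPixelBuffer:frame
      withPresentationTime:CMTimeMakeWithSeconds(elapsed, 1000000)];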



Best Regards

> On 11 Jan 2022, at 16:31, Peter Maydell <peter.maydell@linaro.org> wrote:
> 
> On Tue, 11 Jan 2022 at 07:09, Zhang Chen <tgfbeta@me.com> wrote:
>> 
>> To record a screencast, AVAssetWriter APIs are called on each
>> cocoa_update call.
>> 
>> Commands to start and stop recording are added to the View menu.
> 
> This seems a bit of an odd feature -- why doesn't the OS just
> permit screen recording of any application without the app
> having to have code for it specifically ?
> 
> -- PMM