MAUI iOS Broadcast extension, writeback pool capacity exceed

Siu Pang 0 Reputation points
2026-03-04T17:37:26.42+00:00

I am developing an iOS broadcast extension with .NET MAUI on .NET 10. My problem is that ProcessSampleBuffer() only gets called every 6+ seconds, although I can get the image from the CMSampleBuffer object.

I monitored the extension with Console and see a lot of these messages. Has anyone experienced this before?

default 17:24:56.282676+0000 BroadcastExt [INFO] -[RPBroadcastSampleHandler _processPayloadWithAudioSample:type:]:180 Broadcast extension received audio payload from replayd

default 17:24:56.282944+0000 replayd [INFO] -[RPSystemBroadcastSession notifyExtensionOfAudioSampleBuffer:withType:]_block_invoke_2:652 Sending 2 audio payload to broadcast extension...

default 17:24:56.290646+0000 replayd <<<< FigVirtualDisplayProcessor >>>><private> fvdp_ensureWritebackPixelBuffer: frame size: <private>

default 17:24:56.290769+0000 replayd <<<< FigVirtualDisplayProcessor >>>> writebackCacheAllocateBuffer: writeback pool capacity exceeded

default 17:24:56.290967+0000 replayd <<<< FigVirtualDisplayProcessor >>>> writebackCacheAllocateBuffer: 1515: got error -6689

default 17:24:56.291047+0000 replayd <<<< FigVirtualDisplayProcessor >>>><private> fvdp_ensureWritebackPixelBuffer: 4074: got error -6689

Developer technologies | .NET | .NET MAUI

2 answers

  1. Nancy Vo (WICLOUD CORPORATION) 880 Reputation points Microsoft External Staff Moderator
    2026-03-05T10:11:07.14+00:00

    Hi @Siu Pang ,

    Thank you for reaching out.

    I recommend making ProcessSampleBuffer() return immediately and avoiding heavy work inside it. Apple's ReplayKit calls this method on a real-time pipeline, so it must stay fast (based on the official documentation).

    While this is a non-Microsoft link, it’s official Apple Developer documentation and is safe to visit.

    I suggest doing minimal work inside ProcessSampleBuffer() and not doing heavy processing on the main thread. Furthermore, remember to always release buffers promptly. Here is a code example you can refer to:

    public override void ProcessSampleBuffer(CMSampleBuffer sampleBuffer, RPSampleBufferType sampleBufferType)
    {
        if (sampleBufferType != RPSampleBufferType.Video)
            return;

        // Process asynchronously and return immediately - don't wait!
        // The managed CMSampleBuffer wrapper keeps the native buffer
        // alive until it is disposed in the background task.
        Task.Run(() => ProcessVideoFrameAsync(sampleBuffer));
    }

    private async Task ProcessVideoFrameAsync(CMSampleBuffer sampleBuffer)
    {
        try
        {
            // Do your heavy work here, off the ReplayKit callback thread
            var image = GetImageFromBuffer(sampleBuffer);
            await ProcessImageAsync(image);
            await UploadToServerAsync(image);
        }
        finally
        {
            // Dispose when done so ReplayKit can recycle the pixel buffer
            // (CMSampleBuffer has no public Retain/Release in .NET for iOS;
            // Dispose releases the underlying native buffer)
            sampleBuffer.Dispose();
        }
    }
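
    One more consideration (my own addition, a sketch rather than part of the sample above): if processing a frame can take longer than the interval between frames, fire-and-forget Task.Run calls will accumulate retained buffers and can still exhaust the pool. A simple guard is to drop frames while the previous one is still in flight, assuming ProcessVideoFrameAsync disposes the buffer when it finishes:

    private int _busy; // 0 = idle, 1 = a frame is being processed

    public override void ProcessSampleBuffer(CMSampleBuffer sampleBuffer, RPSampleBufferType sampleBufferType)
    {
        if (sampleBufferType != RPSampleBufferType.Video)
            return;

        // Drop this frame if the previous one is still being processed,
        // releasing it immediately so ReplayKit can recycle it.
        if (System.Threading.Interlocked.CompareExchange(ref _busy, 1, 0) != 0)
        {
            sampleBuffer.Dispose();
            return;
        }

        Task.Run(async () =>
        {
            try
            {
                await ProcessVideoFrameAsync(sampleBuffer);
            }
            finally
            {
                // Mark idle so the next frame can be accepted
                System.Threading.Interlocked.Exchange(ref _busy, 0);
            }
        });
    }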
    

    Hope this helps. If you found my response helpful or informative, I would greatly appreciate it if you could accept the answer or provide feedback. Thank you.


  2. Q&A Assist
    2026-03-04T17:42:58.13+00:00
    AI generated content. This question contains an answer created with AI from Q&A Assist.

    The log messages from replayd and FigVirtualDisplayProcessor indicate that the internal writeback pixel buffer pool used by ReplayKit is running out of reusable buffers:

    • writeback pool capacity exceeded
    • writebackCacheAllocateBuffer: ... got error -6689

    When this pool is exhausted, ReplayKit cannot allocate pixel buffers for new video frames, so the broadcast extension receives video samples only intermittently. That is why RPBroadcastSampleHandler.ProcessSampleBuffer(CoreMedia.CMSampleBuffer sampleBuffer, RPSampleBufferType sampleBufferType) is only being called roughly every 6 seconds instead of continuously.

    In a ReplayKit broadcast extension, each incoming CMSampleBuffer represents media samples (video or audio). To keep the pipeline flowing, these buffers must be handled quickly and then released so that ReplayKit can recycle the underlying pixel buffers.

    To avoid exhausting the writeback pool:

    1. Keep ProcessSampleBuffer lightweight
      • In RPBroadcastSampleHandler.ProcessSampleBuffer(...), do only the minimum work needed while the CMSampleBuffer is in scope.
      • If additional processing is required (e.g., encoding, analysis), copy out the needed data and pass that copy to background work instead of holding the original CMSampleBuffer.
    2. Do not retain CMSampleBuffer instances
      • Do not store CMSampleBuffer objects (or objects that retain them) in fields, static variables, or long-lived collections.
      • Avoid designs where CMSampleBuffer is cached for later use; treat it as a short-lived container.
    3. Check for indirect retention
      • Any helper classes or managers that interact with the sample buffers must not keep strong references to them beyond immediate processing.

    RPBroadcastSampleHandler is intended to process ReplayKit buffer objects as they arrive, and CMSampleBuffer is the media container for those samples. Ensuring that each CMSampleBuffer is processed promptly and then allowed to be released will prevent the writeback pool from filling up and should restore normal, frequent calls to ProcessSampleBuffer.
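
    One way to follow points 1 and 2 above is to copy the pixel data out synchronously and dispose the buffer before returning. This is a sketch, not part of the original answer: it assumes usings for CoreMedia, CoreVideo, and ReplayKit, and _frameQueue is a hypothetical thread-safe collection consumed by background work.

    public override void ProcessSampleBuffer(CMSampleBuffer sampleBuffer, RPSampleBufferType sampleBufferType)
    {
        if (sampleBufferType != RPSampleBufferType.Video)
            return;

        using (sampleBuffer)
        {
            if (sampleBuffer.GetImageBuffer() is not CVPixelBuffer pixelBuffer)
                return;

            pixelBuffer.Lock(CVPixelBufferLock.ReadOnly);
            try
            {
                // Copy only the bytes needed for later processing; the
                // original buffer is released as soon as this method returns.
                var bytes = new byte[(int)(pixelBuffer.BytesPerRow * pixelBuffer.Height)];
                System.Runtime.InteropServices.Marshal.Copy(pixelBuffer.BaseAddress, bytes, 0, bytes.Length);
                _frameQueue.Add(bytes); // hypothetical background queue
            }
            finally
            {
                pixelBuffer.Unlock(CVPixelBufferLock.ReadOnly);
            }
        }
    }

    Because only the copied bytes leave this method, the CMSampleBuffer and its underlying pixel buffer are returned to ReplayKit's pool immediately.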



