Dart: render Flutter animations directly to video

vlurs2pr · published 2023-01-28 in Other
Follow (0) | Answers (3) | Views (149)

Given that Flutter uses its own graphics engine, is there a way to render Flutter animations directly to video, or to create screenshots frame by frame?
One use case is that this would make it much easier to produce demos for an audience.
For example, an author writing a Flutter animation tutorial could build a demo app and a companion blog post, illustrated with animated GIFs/videos rendered directly from Flutter.
Another example: a developer outside the UI team finds a small bug in a complex animation. Instead of having to learn the animation code, they could render the animation to video, edit a short annotated clip, and send it to the UI team for diagnosis.


h7wcgrx3 · Answer #1

It's not pretty, but I've managed to get a prototype working. First, the whole animation needs a master animation controller so that we can step to any point of the animation we want. Second, the widget tree we want to record must be wrapped in a RepaintBoundary with a global key. The RepaintBoundary and its key can then produce snapshots of the widget tree like this:

Future<Uint8List> _capturePngToUint8List() async {
    // renderBoxKey is the GlobalKey attached to my RepaintBoundary
    RenderRepaintBoundary boundary =
        renderBoxKey.currentContext!.findRenderObject()! as RenderRepaintBoundary;

    // pixelRatio lets you render at a higher resolution than the widget's
    // actual size in the application.
    ui.Image image = await boundary.toImage(pixelRatio: 2.0);
    ByteData byteData = (await image.toByteData(format: ui.ImageByteFormat.png))!;
    Uint8List pngBytes = byteData.buffer.asUint8List();

    return pngBytes;
  }
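For reference, the wrapping mentioned above would look something like this (a minimal sketch, not code from the answer; it assumes a State class that holds `renderBoxKey` as a field, with `myAnimatedWidget` standing in for the subtree being recorded):

```dart
// The GlobalKey that _capturePngToUint8List() looks up.
final GlobalKey renderBoxKey = GlobalKey();

@override
Widget build(BuildContext context) {
  return RepaintBoundary(
    key: renderBoxKey,
    child: myAnimatedWidget, // the subtree you want to record
  );
}
```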

This method can then be used in a loop that captures the widget tree into pngBytes and advances the animationController by a delta t derived from your target framerate, like this:

double t = 0;
int i = 1;

setState(() {
  animationController.value = 0.0;
});

Map<int, Uint8List> frames = {};
double dt = (1 / 60) / animationController.duration.inSeconds.toDouble();

while (t <= 1.0) {
  print("Rendering... ${(t * 100).toStringAsFixed(1)}%");
  var bytes = await _capturePngToUint8List();
  frames[i] = bytes;

  t += dt;
  setState(() {
    animationController.value = t;
  });
  i++;
}
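To make the step size concrete (the numbers here are assumed, just following the dt formula above): for a 2-second animation captured at 60 fps, each step advances the timeline by (1/60)/2 of its length, giving 121 captures from t = 0.0 to 1.0 inclusive:

```dart
void main() {
  const fps = 60;
  const durationSeconds = 2.0; // assumed animationController.duration

  // Fraction of the timeline to advance per captured frame.
  final dt = (1 / fps) / durationSeconds;

  // t runs from 0.0 to 1.0 inclusive, so one extra capture at t = 0.
  final frameCount = (1.0 / dt).round() + 1;

  print(dt);         // 0.008333...
  print(frameCount); // 121
}
```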

Finally, all these png frames can be piped into an ffmpeg subprocess to write the video. I hadn't managed to get that part working well at first (*update*: scroll down for the solution to this), so what I did instead was write all the png frames to actual png files and then run ffmpeg manually in the folder they were written to. (Note: I used Flutter desktop to access my local ffmpeg install, but there is a package on pub.dev to get ffmpeg on mobile too.)

List<Future<File>> fileWriterFutures = [];

frames.forEach((key, value) {
  fileWriterFutures.add(_writeFile(bytes: value, location: r"D:\path\to\my\images\folder\" + "frame_$key.png"));
});

await Future.wait(fileWriterFutures);

_runFFmpeg();

Here is my file-writer helper function:

Future<File> _writeFile({required String location, required Uint8List bytes}) async {
  File file = File(location);
  return file.writeAsBytes(bytes);
}

And this is my FFmpeg runner function:

void _runFFmpeg() async {
  // ffmpeg -y -r 60 -start_number 1 -i frame_%d.png -c:v libx264 -preset medium -tune animation -pix_fmt yuv420p test.mp4
  var process = await Process.start(
      "ffmpeg",
      [
        "-y", // replace output file if it already exists
        "-r", "60", // framerate
        "-start_number", "1",
        "-i", r"./test/frame_%d.png", // <- Change to location of images
        "-an", // don't expect audio
        "-c:v", "libx264rgb", // H.264 encoding with RGB pixel formats
        "-preset", "medium",
        "-crf",
        "10", // Ranges 0-51 indicates lossless compression to worst compression. Sane options are 0-30
        "-tune", "animation",
        "-preset", "medium",
        "-pix_fmt", "yuv420p",
        r"./test/test.mp4" // <- Change to location of output
      ],
      mode: ProcessStartMode.inheritStdio // This mode causes some issues at times, so just remove it if it doesn't work. I use it mostly to debug the ffmpeg process' output
   );

  print("Done Rendering");
}

Update:

Since posting this answer, I've figured out how to pipe the images directly into ffmpeg without writing all the files first. Below is the updated render function from one of my widgets. Some variables live in the widget's context, but I hope their values can be inferred from it:

void render([double? pixelRatio]) async {
    // If already rendering, return
    if (isRendering) return;

    String outputFileLocation = "final.mp4";

    setState(() {
      isRendering = true;
    });

    timeline.stop();

    await timeline.animateTo(0.0, duration: const Duration(milliseconds: 700), curve: Curves.easeInOutQuad);
    setState(() {
      timeline.value = 0.0;
    });

    await Future.delayed(const Duration(milliseconds: 100));

    try {
      int width = canvasSize.width.toInt();
      int height = canvasSize.height.toInt();
      int frameRate = 60;
      int numberOfFrames = frameRate * (timeline.duration!.inSeconds);

      print("starting ffmpeg..");
      var process = await Process.start(
          "ffmpeg",
          [
            "-y", // replace output file if it already exists
            // "-f", "rawvideo",
            // "-pix_fmt", "rgba",
            "-s", "${width}x$height", // size
            "-r", "$frameRate", // framerate
            "-i", "-",
            "-frames", "$numberOfFrames",
            "-an", // don't expect audio
            "-c:v", "libx264rgb", // H.264 encoding with RGB pixel formats
            "-preset", "medium",
            "-crf",
            "10", // Ranges 0-51 indicates lossless compression to worst compression. Sane options are 0-30
            "-tune", "animation",
            "-preset", "medium",
            "-pix_fmt", "yuv420p",
            "-vf",
            "pad=ceil(iw/2)*2:ceil(ih/2)*2", // ensure width and height is divisible by 2
            outputFileLocation
          ],
          mode: ProcessStartMode.detachedWithStdio,
          runInShell: true);

      print("writing to ffmpeg...");
      RenderRepaintBoundary boundary = paintKey.currentContext!.findRenderObject()! as RenderRepaintBoundary;

      pixelRatio = pixelRatio ?? 1.0;
      print("Pixel Ratio: $pixelRatio");

      for (int i = 0; i <= numberOfFrames; i++) {
        Timeline.startSync("Render Video Frame");
        double t = (i.toDouble() / numberOfFrames.toDouble());
        // await timeline.animateTo(t, duration: Duration.zero);
        timeline.value = t;

        ui.Image image = await boundary.toImage(pixelRatio: pixelRatio);
        ByteData? rawData = await image.toByteData(format: ui.ImageByteFormat.png);
        var rawIntList = rawData!.buffer.asInt8List().toList();
        Timeline.finishSync();

        if (i % frameRate == 0) {
          print("${((t * 100.0) * 100).round() / 100}%");
        }

        process.stdin.add(rawIntList);

        image.dispose();
      }
      await process.stdin.flush();

      print("stopping ffmpeg...");
      await process.stdin.close();
      process.kill();
      print("done!");
    } catch (e) {
      print(e);
    } finally {
      await timeline.animateTo(beforeValue, duration: const Duration(milliseconds: 500), curve: Curves.easeInOutQuad);
      setState(() {
        isRendering = false;
      });
    }
  }

wfveoks0 · Answer #2

I used Erik's answer as the starting point for my own implementation and would like to add to his original answer.
After saving all the png images to the target location, I used the flutter_ffmpeg package to create a video from all the images. Since it took a while to find the right settings to produce a video that QuickTime Player can also play, I want to share them with you:

final FlutterFFmpeg _flutterFFmpeg =
    new FlutterFFmpeg(); // Create new ffmpeg instance somewhere in your code

// Function to create the video. All png files must be available in your target location prior to calling this function.
Future<String> _createVideoFromPngFiles(String location, int framerate) async {
  final dateAsString = DateFormat('ddMMyyyy_hhmmss').format(DateTime.now());
  final filePath =
      "$location/video_$dateAsString.mov"; // had to use mov to be able to play the video on QuickTime

  var arguments = [
    "-y", // Replace output file if it already exists
    "-r", "$framerate", // Your target framerate
    "-start_number", "1",
    "-i",
    "$location/frame_%d.png", // The location where you saved all your png files
    "-an", // Don't expect audio
    "-c:v",
    "libx264", // H.264 encoding, make sure to use the full-gpl ffmpeg package version
    "-preset", "medium",
    "-crf",
    "10", // Ranges 0-51 indicates lossless compression to worst compression. Sane options are 0-30
    "-tune", "animation",
    "-preset", "medium",
    "-pix_fmt",
    "yuv420p", // Set the pixel format to make it compatible for QuickTime
    "-vf",
    "pad=ceil(iw/2)*2:ceil(ih/2)*2", // Make sure that height and width are divisible by 2
    filePath
  ];

  final result = await _flutterFFmpeg.executeWithArguments(arguments);
  return result == 0
      ? filePath
      : ''; // Result == 0 indicates that video creation was successful
}

If you are using libx264, make sure to follow the instructions for the flutter_ffmpeg package: you must use the full-gpl version, which includes the x264 library.
Depending on the animation length, the desired framerate, the pixel ratio, and device memory, holding all frames in memory before writing the files can cause memory problems. Depending on your use case, you may therefore need to pause/resume the animation and write the files in multiple batches so you don't exceed the available memory.
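One way to sketch that batching idea (this is my own illustration, not code from the answer): plan the 1-based frame indices in fixed-size batches, then capture and write each batch to disk before starting the next, releasing the frame bytes in between:

```dart
// Splits 1-based frame indices into batches of at most `batchSize`, so each
// batch can be captured, written to disk, and freed before the next begins.
List<List<int>> planBatches(int frameCount, int batchSize) {
  final batches = <List<int>>[];
  for (var start = 1; start <= frameCount; start += batchSize) {
    batches.add([
      for (var i = start; i < start + batchSize && i <= frameCount; i++) i
    ]);
  }
  return batches;
}

void main() {
  print(planBatches(5, 2)); // [[1, 2], [3, 4], [5]]
}
```

Each inner list maps to one pause/capture/write cycle of the animation.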


eivnm1vs · Answer #3

Update 2023

I have since developed a high-level render package that heavily optimizes the repaint-boundary capturing approach described by @Erik W.
Wrap your widget with the Render widget:

import 'package:render/render.dart';

final controller = RenderController();

@override
Widget build(BuildContext context) {
   return Render(
      controller: controller,
      child: Container(),
   );
}

Then capture the motion with the controller:

final result = await controller.captureMotion(
     duration,
     format: Format.gif,
);

final file = result.output;
