unity3d: Sharing the Oculus Quest screen with Agora

gt0wga4j · posted on 2023-02-13

I am using Agora.io in Unity to do screen sharing, and it works fine when two desktop PCs are involved. Now I am trying to achieve the same thing with an Oculus Quest and one PC. The PC has a RawImage texture that should display the Oculus screen view. Unfortunately no feed comes through at all, just a black screen. Mind you, it works fine when two PCs or even an Android phone are connected: the screen view shows up. It only fails when the Oculus Quest connects. I even granted the Oculus all the permissions needed for this, but it does not work.
Edit: I learned that I have to replace Screen.width and Screen.height with a custom RenderTexture and attach it to a camera. I did that as well, but this time the output is empty even in desktop mode.
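For reference, a minimal sketch of that setup (captureCamera and customRT are placeholder names, not from the original project):

using UnityEngine;

public class RenderTextureCapture : MonoBehaviour {
    // Assign the camera whose view should be shared in the Inspector
    public Camera captureCamera;
    RenderTexture customRT;

    void Awake () {
        // Render the camera into an off-screen texture instead of the screen
        customRT = new RenderTexture (1280, 720, 24);
        captureCamera.targetTexture = customRT;
    }
}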

using System;
using System.Collections;
using System.Collections.Generic;
using System.Globalization;
using System.Runtime.InteropServices;
using agora_gaming_rtc;
using UnityEngine;
using UnityEngine.UI;
public class ScreenShare : MonoBehaviour {
    Texture2D mTexture;
    Rect mRect;
    [SerializeField]
    private string appId = "Your_AppID";
    [SerializeField]
    private string channelName = "agora";
    public IRtcEngine mRtcEngine;
    int i = 100;
    void Start () {
        Debug.Log ("ScreenShare Activated");
        mRtcEngine = IRtcEngine.getEngine (appId);
        // enable log
        mRtcEngine.SetLogFilter (LOG_FILTER.DEBUG | LOG_FILTER.INFO | LOG_FILTER.WARNING | LOG_FILTER.ERROR | LOG_FILTER.CRITICAL);
        // set callbacks (optional)
        mRtcEngine.SetParameters ("{\"rtc.log_filter\": 65535}");
        //Configure the external video source
        mRtcEngine.SetExternalVideoSource (true, false);
        // Start video mode
        mRtcEngine.EnableVideo ();
        // allow camera output callback
        mRtcEngine.EnableVideoObserver ();
        // join channel
        mRtcEngine.JoinChannel (channelName, null, 0);
        //Create a rectangle the size of the screen
        mRect = new Rect (0, 0, Screen.width, Screen.height);
        //Create a texture the size of the rectangle you just created
        mTexture = new Texture2D ((int) mRect.width, (int) mRect.height, TextureFormat.BGRA32, false);
    }
    void Update () {
        //Start the screenshare Coroutine
        StartCoroutine (shareScreen ());
    }
    //Screen Share
    IEnumerator shareScreen () {
        yield return new WaitForEndOfFrame ();
        //Read the Pixels inside the Rectangle
        mTexture.ReadPixels (mRect, 0, 0);
        //Apply the Pixels read from the rectangle to the texture
        mTexture.Apply ();
        // Copy the raw texture data into a byte array
        byte[] bytes = mTexture.GetRawTextureData ();
        // Compute the byte size of the buffer (not used further in this sample)
        int size = Marshal.SizeOf (bytes[0]) * bytes.Length;
        // Check to see if there is an engine instance already created
        IRtcEngine rtc = IRtcEngine.QueryEngine ();
        //if the engine is present
        if (rtc != null) {
            //Create a new external video frame
            ExternalVideoFrame externalVideoFrame = new ExternalVideoFrame ();
            //Set the buffer type of the video frame
            externalVideoFrame.type = ExternalVideoFrame.VIDEO_BUFFER_TYPE.VIDEO_BUFFER_RAW_DATA;
            // Set the video pixel format
            externalVideoFrame.format = ExternalVideoFrame.VIDEO_PIXEL_FORMAT.VIDEO_PIXEL_BGRA;
            //Attach the raw pixel data read from the rectangle to the video frame
            externalVideoFrame.buffer = bytes;
            //Set the width of the video frame (in pixels)
            externalVideoFrame.stride = (int) mRect.width;
            //Set the height of the video frame
            externalVideoFrame.height = (int) mRect.height;
            //Remove pixels from the sides of the frame
            externalVideoFrame.cropLeft = 10;
            externalVideoFrame.cropTop = 10;
            externalVideoFrame.cropRight = 10;
            externalVideoFrame.cropBottom = 10;
            //Rotate the video frame (0, 90, 180, or 270)
            externalVideoFrame.rotation = 180;
            // Use a monotonically increasing value as the frame timestamp
            externalVideoFrame.timestamp = i++;
            //Push the external video frame with the frame we just created
            int a = rtc.PushVideoFrame (externalVideoFrame);
            Debug.Log (" pushVideoFrame =       " + a);
        }
    }
}

um6iljoc #1

How do you manage the RenderTexture? Is it linked to a camera? You should assign the RenderTexture to a camera and grab the data from it. Below is an example from a different project where you can see how the RenderTexture data is used.
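In outline, that pattern looks like this (a sketch rather than the original example; it assumes the question's mTexture field and a placeholder RenderTexture sourceRT assigned as the camera's targetTexture):

IEnumerator CaptureFromRenderTexture (RenderTexture sourceRT) {
    yield return new WaitForEndOfFrame ();
    // ReadPixels copies from whichever RenderTexture is currently active
    RenderTexture previous = RenderTexture.active;
    RenderTexture.active = sourceRT;
    // mTexture must match sourceRT's dimensions
    mTexture.ReadPixels (new Rect (0, 0, sourceRT.width, sourceRT.height), 0, 0);
    mTexture.Apply ();
    RenderTexture.active = previous;
    // From here, build the ExternalVideoFrame and call PushVideoFrame exactly as in the question
}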
Also note that you are following an outdated tutorial: the API changed slightly after an SDK update, and the example above is outdated in the same way. The pixel format should be RGBA rather than BGRA for cross-platform compatibility.

externalVideoFrame.format = ExternalVideoFrame.VIDEO_PIXEL_FORMAT.VIDEO_PIXEL_RGBA;
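With RGBA frames, the Texture2D that supplies the raw bytes must use a matching format, so the BGRA32 texture created in the question's Start() would also change:

mTexture = new Texture2D ((int) mRect.width, (int) mRect.height, TextureFormat.RGBA32, false);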

t40tm48m #2

Switch from Vulkan to OpenGLES3.
My guess is that the ReadPixels() function does not work on the Quest 2 under the Vulkan API.
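The switch lives under Project Settings > Player > Android > Other Settings > Graphics APIs: uncheck Auto Graphics API and leave only OpenGLES3. A minimal editor-script sketch of the same change (the menu path is a made-up example):

using UnityEditor;
using UnityEngine.Rendering;

public static class GraphicsApiSwitcher {
    // Editor-only: place this file under an Editor/ folder
    [MenuItem ("Tools/Use OpenGLES3 on Android")]
    static void UseGles3 () {
        // Stop Unity from picking the graphics API automatically (newer versions prefer Vulkan)
        PlayerSettings.SetUseDefaultGraphicsAPIs (BuildTarget.Android, false);
        // Build with OpenGLES3 only
        PlayerSettings.SetGraphicsAPIs (BuildTarget.Android,
            new [] { GraphicsDeviceType.OpenGLES3 });
    }
}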
