This article collects Java code examples for the com.jme3.texture.FrameBuffer class and shows how the class is used. The examples were extracted from selected projects on platforms such as GitHub, Stack Overflow, and Maven, and should serve as useful references. Details of the FrameBuffer class:
Package path: com.jme3.texture.FrameBuffer
Class name: FrameBuffer
FrameBuffers are rendering surfaces allowing off-screen rendering and render-to-texture functionality. Instead of the scene rendering to the screen, it is rendered into the FrameBuffer; the result can be either a texture or a buffer.
A FrameBuffer supports two methods of rendering: using a Texture or using a buffer. When using a texture, the result of the rendering is rendered onto the texture, after which the texture can be placed on an object and rendered as if it had been loaded from disk. When using a buffer, the result is rendered into a buffer located on the GPU whose data is not accessible to the user. Buffers are useful if one wishes to retrieve only the color content of the scene but still needs depth testing (which requires a depth buffer). Buffers can be copied to other framebuffers, including the main screen, by using Renderer#copyFrameBuffer(com.jme3.texture.FrameBuffer, com.jme3.texture.FrameBuffer). The content of a RenderBuffer can be retrieved by using Renderer#readFrameBuffer(com.jme3.texture.FrameBuffer, java.nio.ByteBuffer).
FrameBuffers have several attachment points: multiple color attachment points and a single depth attachment point. The color attachment points support image formats such as Format#RGBA8, allowing the color content of the scene to be rendered. The depth attachment point requires a depth image format.
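As a minimal sketch of the render-to-texture setup described above (assuming a jMonkeyEngine 3 classpath; the class and method below are illustrative, not taken from any of the projects that follow):

```java
import com.jme3.texture.FrameBuffer;
import com.jme3.texture.Image.Format;
import com.jme3.texture.Texture2D;

public class OffscreenSketch {
    // Builds a framebuffer with one color attachment rendered into a texture
    // and a GPU-side depth buffer so depth testing still works.
    public static FrameBuffer createRenderToTexture(int width, int height) {
        FrameBuffer fb = new FrameBuffer(width, height, 1); // 1 sample = no MSAA
        fb.setDepthBuffer(Format.Depth);                    // depth attachment point
        Texture2D colorTex = new Texture2D(width, height, Format.RGBA8);
        fb.setColorTexture(colorTex);                       // color attachment point
        return fb;
    }
}
```

A viewport would then be pointed at it with viewPort.setOutputFrameBuffer(fb), after which the color texture (reachable via fb.getColorBuffer().getTexture()) can be applied to geometry like any texture loaded from disk.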
Code example source: origin: jMonkeyEngine/jmonkeyengine
/**
 * Initializes the pass; called internally.
 * @param renderer the renderer
 * @param width the width of the render target
 * @param height the height of the render target
 * @param textureFormat the format of the color texture
 * @param depthBufferFormat the format of the depth buffer
 * @param numSamples the number of MSAA samples
 * @param renderDepth true to also render depth into a texture
 */
public void init(Renderer renderer, int width, int height, Format textureFormat, Format depthBufferFormat, int numSamples, boolean renderDepth) {
    Collection<Caps> caps = renderer.getCaps();
    if (numSamples > 1 && caps.contains(Caps.FrameBufferMultisample) && caps.contains(Caps.OpenGL31)) {
        renderFrameBuffer = new FrameBuffer(width, height, numSamples);
        renderedTexture = new Texture2D(width, height, numSamples, textureFormat);
        renderFrameBuffer.setDepthBuffer(depthBufferFormat);
        if (renderDepth) {
            depthTexture = new Texture2D(width, height, numSamples, depthBufferFormat);
            renderFrameBuffer.setDepthTexture(depthTexture);
        }
    } else {
        renderFrameBuffer = new FrameBuffer(width, height, 1);
        renderedTexture = new Texture2D(width, height, textureFormat);
        renderFrameBuffer.setDepthBuffer(depthBufferFormat);
        if (renderDepth) {
            depthTexture = new Texture2D(width, height, depthBufferFormat);
            renderFrameBuffer.setDepthTexture(depthTexture);
        }
    }
    renderFrameBuffer.setColorTexture(renderedTexture);
}
Code example source: origin: jMonkeyEngine/jmonkeyengine
offBuffer = new FrameBuffer(width, height, 1);
offBuffer.setDepthBuffer(Format.Depth);
offBuffer.setColorBuffer(Format.RGBA8);
Code example source: origin: jMonkeyEngine/jmonkeyengine
public void updateFrameBuffer(FrameBuffer fb) {
    if (fb.getNumColorBuffers() == 0 && fb.getDepthBuffer() == null) {
        throw new IllegalArgumentException("The framebuffer: " + fb
                + "\nDoesn't have any color/depth buffers");
    }
    int id = fb.getId();
    if (id == -1) {
        glfbo.glGenFramebuffersEXT(intBuf1);
        id = intBuf1.get(0);
        fb.setId(id);
        objManager.registerObject(fb);
        statistics.onNewFrameBuffer();
    }
    bindFrameBuffer(fb);
    FrameBuffer.RenderBuffer depthBuf = fb.getDepthBuffer();
    if (depthBuf != null) {
        updateFrameBufferAttachment(fb, depthBuf);
    }
    for (int i = 0; i < fb.getNumColorBuffers(); i++) {
        FrameBuffer.RenderBuffer colorBuf = fb.getColorBuffer(i);
        updateFrameBufferAttachment(fb, colorBuf);
    }
    setReadDrawBuffers(fb);
    checkFrameBufferError();
    fb.clearUpdateNeeded();
}
Code example source: origin: jMonkeyEngine/jmonkeyengine
/**
 * Creates an offscreen frame buffer.
 *
 * @param mapSize the width and height of the (square) frame buffer
 * @param offView the viewport that will render into the frame buffer
 * @return the new frame buffer
 */
protected FrameBuffer createOffScreenFrameBuffer(int mapSize, ViewPort offView) {
    // create offscreen framebuffer
    final FrameBuffer offBuffer = new FrameBuffer(mapSize, mapSize, 1);
    offBuffer.setDepthBuffer(Image.Format.Depth);
    offView.setOutputFrameBuffer(offBuffer);
    return offBuffer;
}
Code example source: origin: jMonkeyEngine/jmonkeyengine
public void deleteFrameBuffer(FrameBuffer fb) {
    if (fb.getId() != -1) {
        if (context.boundFBO == fb.getId()) {
            glfbo.glBindFramebufferEXT(GLFbo.GL_FRAMEBUFFER_EXT, 0);
            context.boundFBO = 0;
        }
        if (fb.getDepthBuffer() != null) {
            deleteRenderBuffer(fb, fb.getDepthBuffer());
        }
        if (fb.getColorBuffer() != null) {
            deleteRenderBuffer(fb, fb.getColorBuffer());
        }
        intBuf1.put(0, fb.getId());
        glfbo.glDeleteFramebuffersEXT(intBuf1);
        fb.resetObject();
        statistics.onDeleteFrameBuffer();
    }
}
Code example source: origin: jMonkeyEngine/jmonkeyengine
private void createLumShaders() {
    int w = mainSceneFB.getWidth();
    int h = mainSceneFB.getHeight();
    hdr64 = createLumShader(w, h, 64, 64, LUMMODE_ENCODE_LUM, maxIterations, mainScene);
    hdr8 = createLumShader(64, 64, 8, 8, LUMMODE_NONE, maxIterations, scene64);
    hdr1 = createLumShader(8, 8, 1, 1, LUMMODE_NONE, maxIterations, scene8);
}
Code example source: origin: jMonkeyEngine/jmonkeyengine
/**
 * Creates a framebuffer for an eye.
 */
public void setupFramebuffers(int eye) {
    // Find the chain length
    IntBuffer length = BufferUtils.createIntBuffer(1);
    ovr_GetTextureSwapChainLength(session, chains[eye], length);
    int chainLength = length.get();
    LOGGER.fine("HMD Eye #" + eye + " texture chain length: " + chainLength);
    // Create the frame buffers
    framebuffers[eye] = new FrameBuffer[chainLength];
    for (int i = 0; i < chainLength; i++) {
        // Find the GL texture ID for this texture
        IntBuffer textureIdB = BufferUtils.createIntBuffer(1);
        OVRGL.ovr_GetTextureSwapChainBufferGL(session, chains[eye], i, textureIdB);
        int textureId = textureIdB.get();
        // TODO less hacky way of getting our texture into JMonkeyEngine
        Image img = new Image();
        img.setId(textureId);
        img.setFormat(Image.Format.RGBA8);
        img.setWidth(textureW);
        img.setHeight(textureH);
        Texture2D tex = new Texture2D(img);
        FrameBuffer buffer = new FrameBuffer(textureW, textureH, 1);
        buffer.setDepthBuffer(Image.Format.Depth);
        buffer.setColorTexture(tex);
        framebuffers[eye][i] = buffer;
    }
}
Code example source: origin: jMonkeyEngine/jmonkeyengine
/**
 * Creates a BasicShadowRenderer.
 * @param manager the asset manager
 * @param size the size of the shadow map (the map is square)
 */
public BasicShadowRenderer(AssetManager manager, int size) {
    shadowFB = new FrameBuffer(size, size, 1);
    shadowMap = new Texture2D(size, size, Format.Depth);
    shadowFB.setDepthTexture(shadowMap);
    shadowCam = new Camera(size, size);
    // DO NOT COMMENT THIS OUT (it prevents the OSX incomplete read buffer crash)
    dummyTex = new Texture2D(size, size, Format.RGBA8);
    shadowFB.setColorTexture(dummyTex);
    shadowMapSize = (float) size;
    preshadowMat = new Material(manager, "Common/MatDefs/Shadow/PreShadow.j3md");
    postshadowMat = new Material(manager, "Common/MatDefs/Shadow/BasicPostShadow.j3md");
    postshadowMat.setTexture("ShadowMap", shadowMap);
    dispPic.setTexture(manager, shadowMap, false);
    for (int i = 0; i < points.length; i++) {
        points[i] = new Vector3f();
    }
}
Code example source: origin: jMonkeyEngine/jmonkeyengine
fb.dispose();
fb = null;
fb = new FrameBuffer(width, height, 1);
fb.setDepthBuffer(Format.Depth);
fb.setColorBuffer(Format.RGB8);
fb.setSrgb(srgb);
Code example source: origin: us.ihmc.thirdparty.jme/jme3-lwjgl
if (fb == null || !fb.isUpdateNeeded()) {
    return;
}
// …
for (int i = 0; i < context.boundFB.getNumColorBuffers(); i++) {
    RenderBuffer rb = context.boundFB.getColorBuffer(i);
    Texture tex = rb.getTexture();
    if (tex != null /* … */) {
        // …
    }
}
// …
if (fb.getNumColorBuffers() == 0 && fb.getDepthBuffer() == null) {
    throw new IllegalArgumentException("The framebuffer: " + fb
            + "\nDoesn't have any color/depth buffers");
}
if (fb.isUpdateNeeded()) {
    updateFrameBuffer(fb);
}
if (context.boundFBO != fb.getId()) {
    glBindFramebufferEXT(GL_FRAMEBUFFER_EXT, fb.getId());
    statistics.onFrameBufferUse(fb, true);
    setViewPort(0, 0, fb.getWidth(), fb.getHeight());
    context.boundFBO = fb.getId();
} else {
    statistics.onFrameBufferUse(fb, false);
}
if (fb.getNumColorBuffers() == 0) {
    // …
}
if (fb.getNumColorBuffers() > maxFBOAttachs) {
    throw new RendererException("Framebuffer has more color " /* … */);
}
Code example source: origin: jMonkeyEngine/jmonkeyengine
if (fb == null || !fb.isUpdateNeeded()) {
    return;
}
// …
for (int i = 0; i < context.boundFB.getNumColorBuffers(); i++) {
    RenderBuffer rb = context.boundFB.getColorBuffer(i);
    Texture tex = rb.getTexture();
    if (tex != null /* … */) {
        // …
    }
}
// …
if (fb == null) {
    // …
    setReadDrawBuffers(null);
} else {
    if (fb.isUpdateNeeded()) {
        updateFrameBuffer(fb);
    } else {
        // …
    }
    setViewPort(0, 0, fb.getWidth(), fb.getHeight());
    assert fb.getId() > 0;
    assert context.boundFBO == fb.getId();
}
Code example source: origin: jMonkeyEngine/jmonkeyengine
FrameBuffer offBuffer = new FrameBuffer(512, 512, 1);
offBuffer.setDepthBuffer(Format.Depth);
offBuffer.setMultiTarget(true);
offBuffer.addColorTexture(offTex, TextureCubeMap.Face.NegativeX);
offBuffer.addColorTexture(offTex, TextureCubeMap.Face.PositiveX);
offBuffer.addColorTexture(offTex, TextureCubeMap.Face.NegativeY);
offBuffer.addColorTexture(offTex, TextureCubeMap.Face.PositiveY);
offBuffer.addColorTexture(offTex, TextureCubeMap.Face.NegativeZ);
offBuffer.addColorTexture(offTex, TextureCubeMap.Face.PositiveZ);
Code example source: origin: us.ihmc/IHMCJMonkeyEngineToolkit
public LidarDistortionProcessor(JMERenderer jmeRenderer, int scansPerSweep, int scanHeight, int numberOfCameras, float startAngle, float fieldOfView,
        LidarSceneViewPort[] lidarSceneProcessors)
{
    ViewPort viewport = jmeRenderer.getRenderManager().createPostView("LidarDistortionViewport", new Camera(scansPerSweep, scanHeight));
    this.scansPerSweep = scansPerSweep;
    this.scanHeight = scanHeight;
    this.scan = new float[scanHeight * scansPerSweep];
    this.lidarOutFloatBuffer = BufferUtils.createFloatBuffer(scansPerSweep * scanHeight);
    FrameBuffer distortionFrameBuffer = new FrameBuffer(scansPerSweep, scanHeight, 1);
    distortionFrameBuffer.setColorBuffer(Format.RGBA32F);
    Material distortionMaterial = createDistortionMaterial(jmeRenderer.getAssetManager(), scansPerSweep, numberOfCameras, startAngle, fieldOfView,
            lidarSceneProcessors);
    Picture distortionPicture = new Picture("Distortion");
    distortionPicture.setMaterial(distortionMaterial);
    distortionPicture.setHeight(scanHeight);
    distortionPicture.setWidth(scansPerSweep);
    distortionPicture.setQueueBucket(Bucket.Gui);
    distortionPicture.setCullHint(CullHint.Never);
    viewport.attachScene(distortionPicture);
    viewport.setClearFlags(true, true, true);
    viewport.setOutputFrameBuffer(distortionFrameBuffer);
    viewport.addProcessor(this);
    distortionPicture.updateGeometricState();
}
Code example source: origin: jMonkeyEngine/jmonkeyengine
public void simpleInitApp() {
    this.flyCam.setMoveSpeed(10);
    cam.setLocation(new Vector3f(0.028406568f, 2.015769f, 7.386517f));
    cam.setRotation(new Quaternion(-1.0729783E-5f, 0.9999721f, -0.0073241726f, -0.0014647911f));
    makeScene();
    // Creating the main viewport post processor
    FilterPostProcessor fpp = new FilterPostProcessor(assetManager);
    fpp.addFilter(new ColorOverlayFilter(ColorRGBA.Blue));
    viewPort.addProcessor(fpp);
    // Creating a frame buffer for the main viewport
    FrameBuffer mainVPFrameBuffer = new FrameBuffer(cam.getWidth(), cam.getHeight(), 1);
    Texture2D mainVPTexture = new Texture2D(cam.getWidth(), cam.getHeight(), Image.Format.RGBA8);
    mainVPFrameBuffer.addColorTexture(mainVPTexture);
    mainVPFrameBuffer.setDepthBuffer(Image.Format.Depth);
    viewPort.setOutputFrameBuffer(mainVPFrameBuffer);
    // Creating the post processor for the GUI viewport
    final FilterPostProcessor guifpp = new FilterPostProcessor(assetManager);
    guifpp.setFrameBufferFormat(Image.Format.RGBA8);
    guifpp.addFilter(new ColorOverlayFilter(ColorRGBA.Red));
    // This will compose the main viewport texture with the GUI viewport back buffer.
    // Note that you can switch the order of the filters so that GUI viewport filters
    // are or are not applied to the main viewport texture.
    guifpp.addFilter(new ComposeFilter(mainVPTexture));
    guiViewPort.addProcessor(guifpp);
    // Compositing is done by mixing textures depending on the alpha channel,
    // so it's important that the GUI viewport clear color alpha value is set to 0.
    guiViewPort.setBackgroundColor(ColorRGBA.BlackNoAlpha);
    guiViewPort.setClearColor(true);
}
Code example source: origin: jMonkeyEngine/jmonkeyengine
if (fb.getWidth() > rbSize || fb.getHeight() > rbSize) {
    throw new RendererException("Resolution " + fb.getWidth()
            + ":" + fb.getHeight() + " is not supported.");
}
GLImageFormat glFmt = texUtil.getImageFormatWithError(rb.getFormat(), fb.isSrgb());
if (fb.getSamples() > 1 && caps.contains(Caps.FrameBufferMultisample)) {
    int samples = fb.getSamples();
    int maxSamples = limits.get(Limits.FrameBufferSamples);
    if (maxSamples < samples) {
        // …
    }
    glfbo.glRenderbufferStorageMultisampleEXT(GLFbo.GL_RENDERBUFFER_EXT,
            samples,
            glFmt.internalFormat,
            fb.getWidth(),
            fb.getHeight());
} else {
    glfbo.glRenderbufferStorageEXT(GLFbo.GL_RENDERBUFFER_EXT,
            glFmt.internalFormat,
            fb.getWidth(),
            fb.getHeight());
}
Code example source: origin: jMonkeyEngine/jmonkeyengine
if (src != null && src.isUpdateNeeded()) {
    updateFrameBuffer(src);
}
if (dst != null && dst.isUpdateNeeded()) {
    updateFrameBuffer(dst);
}
// …
if (src == null) {
    // …
    srcY1 = vpY + vpH;
} else {
    glfbo.glBindFramebufferEXT(GLFbo.GL_READ_FRAMEBUFFER_EXT, src.getId());
    srcX1 = src.getWidth();
    srcY1 = src.getHeight();
}
if (dst == null) {
    // …
    dstY1 = vpY + vpH;
} else {
    glfbo.glBindFramebufferEXT(GLFbo.GL_DRAW_FRAMEBUFFER_EXT, dst.getId());
    dstX1 = dst.getWidth();
    dstY1 = dst.getHeight();
}
Code example source: origin: jMonkeyEngine/jmonkeyengine
if (/* … */) {
    return false;
}
if (fb.getSamples() > 1
        && !caps.contains(Caps.FrameBufferMultisample)) {
    return false;
}
RenderBuffer depthBuf = fb.getDepthBuffer();
if (depthBuf != null) {
    Format depthFmt = depthBuf.getFormat();
    // …
}
for (int i = 0; i < fb.getNumColorBuffers(); i++) {
    if (!supportsColorBuffer(caps, fb.getColorBuffer(i))) {
        return false;
    }
}
Code example source: origin: jMonkeyEngine/jmonkeyengine
fb = new FrameBuffer(w, h, 1);
fb.setDepthTexture(depthData);
fb.addColorTexture(diffuseData);
fb.addColorTexture(normalData);
fb.addColorTexture(specularData);
fb.setMultiTarget(true);
Code example source: origin: jMonkeyEngine/jmonkeyengine
scene64FB = new FrameBuffer(64, 64, 1);
scene64 = new Texture2D(64, 64, lumFmt);
scene64FB.setColorTexture(scene64);
scene64.setMagFilter(fbMagFilter);
scene64.setMinFilter(fbMinFilter);
scene8FB = new FrameBuffer(8, 8, 1);
scene8 = new Texture2D(8, 8, lumFmt);
scene8FB.setColorTexture(scene8);
scene8.setMagFilter(fbMagFilter);
scene8.setMinFilter(fbMinFilter);
scene1FB[0] = new FrameBuffer(1, 1, 1);
scene1[0] = new Texture2D(1, 1, lumFmt);
scene1FB[0].setColorTexture(scene1[0]);
scene1FB[1] = new FrameBuffer(1, 1, 1);
scene1[1] = new Texture2D(1, 1, lumFmt);
scene1FB[1].setColorTexture(scene1[1]);
Code example source: origin: jMonkeyEngine/jmonkeyengine
leftEyeTexture = (Texture2D) getLeftViewPort().getOutputFrameBuffer().getColorBuffer().getTexture();
rightEyeTexture = (Texture2D) getRightViewPort().getOutputFrameBuffer().getColorBuffer().getTexture();
leftEyeDepth = (Texture2D) getLeftViewPort().getOutputFrameBuffer().getDepthBuffer().getTexture();
rightEyeDepth = (Texture2D) getRightViewPort().getOutputFrameBuffer().getDepthBuffer().getTexture();