
Hello everyone!
I want to use EffectComposer to render two scenes, just like I can with the WebGLRenderer, like this:

renderer.render(scene, camera);
renderer.autoClear = false;
renderer.render(scene2, camera);
renderer.autoClear = true;

What should I do?

Set autoClear just once during setup and use this pattern:

const renderer = new THREE.WebGLRenderer();
renderer.autoClear = false;

function animate() {
    requestAnimationFrame( animate );
    renderer.clear(); // manual clear
    renderer.render(scene, camera);
    renderer.render(scene2, camera); // drawn on top, no clear in between
}
animate();

In the context of post-processing you normally don’t need two render calls. Have you considered using two instances of RenderPass?
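Not from the original posts: a minimal sketch of the two-RenderPass suggestion, assuming `renderer`, `scene`, `scene2`, and `camera` already exist, and assuming scene2 (e.g. helpers) should simply be drawn on top of scene.

```js
import { EffectComposer } from 'three/addons/postprocessing/EffectComposer.js';
import { RenderPass } from 'three/addons/postprocessing/RenderPass.js';

const composer = new EffectComposer(renderer);

const pass1 = new RenderPass(scene, camera);
const pass2 = new RenderPass(scene2, camera);
pass2.clear = false;     // don't erase pass1's output before drawing scene2
pass2.clearDepth = true; // assumption: scene2 should ignore scene1's depth buffer

composer.addPass(pass1);
composer.addPass(pass2);
// any post-processing passes added after this point affect both scenes

function animate() {
    requestAnimationFrame( animate );
    composer.render();
}
animate();
```

Note the caveat: because both RenderPass instances feed the same composer, every later pass (bloom, FXAA, …) is applied to the combined image, which is exactly the limitation discussed below.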

If one scene needs post-processing and the other does not, how do we render both scenes in the same frame? :thinking: This doesn’t work:

const renderScene = new RenderPass(scene, camera);
const bloomPass = new UnrealBloomPass(
    new THREE.Vector2(dom.offsetWidth, dom.offsetHeight)
);

composer = new EffectComposer(renderer);
composer.addPass(renderScene);
composer.addPass(bloomPass);
renderer.autoClear = false;

function animate() {
    requestAnimationFrame( animate );
    renderer.clear();
    composer.render();
    renderer.render(scene2, camera);
}
animate();

I saw this demo, but all Object3Ds in it are in one scene. I need two scenes to keep things separate: one scene stores my objects and the other stores helpers. Is there any other solution besides this demo?

    const renderScene = new RenderPass(scene, camera);
    const renderScene2 = new RenderPass(scene2, camera);

    const bloomPass = new UnrealBloomPass(new THREE.Vector2(window.innerWidth, window.innerHeight), 1.5, 0.4, 0.85);
    bloomPass.threshold = params.bloomThreshold;
    bloomPass.strength = params.bloomStrength;
    bloomPass.radius = params.bloomRadius;

    const pixelRatio = renderer.getPixelRatio();
    const fxaaPass = new ShaderPass(FXAAShader);
    fxaaPass.material.uniforms['resolution'].value.x = 1 / (window.innerWidth * pixelRatio);
    fxaaPass.material.uniforms['resolution'].value.y = 1 / (window.innerHeight * pixelRatio);

    const bloomComposer = new EffectComposer(renderer);
    bloomComposer.renderToScreen = false;
    bloomComposer.addPass(renderScene);
    bloomComposer.addPass(bloomPass);

    const bloomComposer2 = new EffectComposer(renderer);
    bloomComposer2.renderToScreen = false;
    bloomComposer2.addPass(renderScene2);
    bloomComposer2.addPass(fxaaPass);

    const finalPass = new ShaderPass(
      new THREE.ShaderMaterial({
        uniforms: {
          baseTexture: { value: null },
          bloomTexture: { value: bloomComposer.renderTarget2.texture }
        },
        vertexShader: document.getElementById('vertexshader').textContent,
        fragmentShader: document.getElementById('fragmentshader').textContent,
        defines: {}
      }), "baseTexture"
    );
    finalPass.needsSwap = true;

    const finalPass2 = new ShaderPass(
      new THREE.ShaderMaterial({
        uniforms: {
          baseTexture: { value: null },
          bloomTexture: { value: bloomComposer2.renderTarget2.texture }
        },
        vertexShader: document.getElementById('vertexshader').textContent,
        fragmentShader: document.getElementById('fragmentshader').textContent,
        defines: {}
      }), "baseTexture"
    );
    finalPass2.needsSwap = true;

    const finalComposer = new EffectComposer(renderer);
    finalComposer.addPass(renderScene);
    finalComposer.addPass(finalPass);
    finalComposer.addPass(finalPass2);
    finalComposer.addPass(fxaaPass);

I’m not sure if my approach is reasonable, but it does help me render things in two scenes. What do you think? @Mugen87 @prisoner849 @Harold

const renderScene = new RenderPass(scene, camera);
let target = new THREE.WebGLRenderTarget(window.innerWidth, window.innerHeight);

// per frame:
renderer.clear();
renderer.setRenderTarget(target);
renderer.render(scene2, camera);
renderer.setRenderTarget(null);
bloomComposer.render(); // renders the scene that needs post-processing

const finalPass = new ShaderPass(
      new THREE.ShaderMaterial({
        uniforms: {
          baseTexture: { value: null },
          bloomTexture: { value: bloomComposer.renderTarget2.texture }
        },
        vertexShader: document.getElementById('vertexshader').textContent,
        fragmentShader: document.getElementById('fragmentshader').textContent,
        defines: {}
      }), "baseTexture"
    );
    finalPass.needsSwap = true;

const finalPass2 = new ShaderPass(
      new THREE.ShaderMaterial({
        uniforms: {
          baseTexture: { value: null },
          bloomTexture: { value: target.texture }
        },
        vertexShader: document.getElementById('vertexshader').textContent,
        fragmentShader: document.getElementById('fragmentshader').textContent,
        defines: {}
      }), "baseTexture" // must match a uniform declared above; "baseTexture2" does not exist in this shader
    );
    finalPass2.needsSwap = true;

const finalComposer = new EffectComposer(renderer);
finalComposer.addPass(renderScene);
finalComposer.addPass(finalPass);
finalComposer.addPass(finalPass2);
finalComposer.render();

What I want to know is: to mix the rendering results of the two scenes, is adding finalPass and finalPass2 to the EffectComposer like this the intended approach?

Harold:

Then as a final post processing pass, you simply blend the final results of both layers together

How do I mix the final results of two layers? :sob:
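Not from the original posts: one common way to blend the layers (the approach used by the three.js selective-bloom example) is a final ShaderPass whose fragment shader combines the composer's read buffer with the other layer's render-target texture. A hedged sketch, where `otherTarget` is an assumed WebGLRenderTarget holding the second layer's output:

```js
import { ShaderPass } from 'three/addons/postprocessing/ShaderPass.js';

const combinePass = new ShaderPass(
  new THREE.ShaderMaterial({
    uniforms: {
      baseTexture: { value: null }, // wired automatically via the "baseTexture" textureID below
      bloomTexture: { value: otherTarget.texture }
    },
    vertexShader: /* glsl */ `
      varying vec2 vUv;
      void main() {
        vUv = uv;
        gl_Position = projectionMatrix * modelViewMatrix * vec4( position, 1.0 );
      }
    `,
    fragmentShader: /* glsl */ `
      uniform sampler2D baseTexture;
      uniform sampler2D bloomTexture;
      varying vec2 vUv;
      void main() {
        // additive blend of the two layers; use mix() instead for a weighted blend
        gl_FragColor = texture2D( baseTexture, vUv ) + texture2D( bloomTexture, vUv );
      }
    `
  }), 'baseTexture'
);
combinePass.needsSwap = true;
finalComposer.addPass(combinePass);
```

The second argument to ShaderPass ('baseTexture') tells the pass which uniform should receive the previous pass's output, so only the other layer needs to be supplied by hand.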