I'm working on the interface for a game and I want to apply a post effect to the UI elements while leaving the background image unaffected.
I have set up the background and the UI elements to use separate cameras with different depths. That all looks fine. When I apply an image effect using OnRenderImage like this:
void OnRenderImage(RenderTexture source, RenderTexture destination)
{
    Graphics.Blit(source, destination, material);
}
The shader / effect also distorts the background. This makes sense when you consider that the output of all the previous cameras is probably being fed in.
Next I tried calling the UI camera's Render() method manually from a script I put on the background camera, and then blitting the result to the destination like this:
// Called by the background camera to apply the image effect
void OnRenderImage(RenderTexture source, RenderTexture destination)
{
    // Blit the unmodified background straight to the destination
    Graphics.Blit(source, destination);

    // Render the UI camera into a temporary texture
    RenderTexture rt = RenderTexture.GetTemporary(Screen.width, Screen.height, 16);
    UICamera.targetTexture = rt;
    UICamera.Render();
    UICamera.targetTexture = null;

    // Apply the effect to the UI layer and composite it over the background
    Graphics.Blit(rt, destination, material);
    RenderTexture.ReleaseTemporary(rt);
}
At a glance that looked like it was working, but on closer inspection the blending didn't look right. I have some semi-transparent black boxes that seem to disappear when I do this. I believe the UI camera's Clear Flags are affecting this, but I can't quite figure out how.
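In case it matters, this is the sort of clear-flags setup I've been experimenting with before the manual Render() call — just a sketch, and the transparent clear color is my guess at what the temporary texture needs:

```csharp
// Clear to a fully transparent color so the render texture's alpha
// channel records UI coverage rather than an opaque backdrop.
UICamera.clearFlags = CameraClearFlags.SolidColor;
UICamera.backgroundColor = new Color(0f, 0f, 0f, 0f);
```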
Another thing I tried was rendering the background to a RenderTexture and then feeding that texture into the shader I use to process the foreground, blending the two based on the alpha value of the UI — but I ended up with exactly the same result as my earlier code.
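On the C# side, that attempt looked roughly like this — a sketch, where _BackgroundTex is a property name I made up for the shader's background sampler, and backgroundRT / uiRT are the two render textures described above:

```csharp
// Feed the background into the post-effect material, which blends
// its output over the background using the UI layer's alpha.
material.SetTexture("_BackgroundTex", backgroundRT);
Graphics.Blit(uiRT, destination, material);
```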
I've tried using different combinations of blend modes as well, but nothing seems to work the way I want.
Any help would be greatly appreciated.