Pages: [1] 2   Go Down
  Print  
Author Topic: Point cloud and volume texture in Q3D5 !!!!  (Read 6317 times)
May 18, 2012, 09:42:58 pm
Does anyone have any idea how we could feed the new volumetric texture with point cloud XML data? I insist on point cloud data rather than a texture, because I don't have any free buffer in my deferred setup, and since I already have depth (z) and world normals I guess it's possible to do this. But the first issue is how to feed a volume texture with XML data. I don't even know if it's possible or not. If you are interested in helping, I can set up a complete scene with point cloud data (baked in Turtle) and we can move on; if not, I don't know where else I should go.
Btw, Turtle's point cloud system is very powerful: it supports both static and dynamic objects, even with a movable sunlight, local lights, and many more interesting features...

ali-rahimi.net
May 18, 2012, 10:15:04 pm
A few shots in the Maya viewport.


* 01.jpg (176.23 KB, 1920x1154 - viewed 399 times.)

* 02.jpg (180.5 KB, 1920x1154 - viewed 380 times.)

* 03.jpg (188.22 KB, 1920x1154 - viewed 331 times.)

* 04.jpg (151.68 KB, 1920x1154 - viewed 324 times.)

* 05.jpg (154.46 KB, 1920x1154 - viewed 329 times.)

* 06.jpg (144.47 KB, 1920x1154 - viewed 333 times.)

ali-rahimi.net
May 18, 2012, 10:38:26 pm
Turtle: baking Next-Gen Illumination to Point Clouds tutorial  Wink

* pointcloud.pdf (825.34 KB - downloaded 596 times.)

ali-rahimi.net
May 20, 2012, 01:56:02 pm
So it's not about adding a point cloud, but adding lighting information stored in a point cloud.

I looked at this a little bit, but there is no easy or direct way to add this to a rendering pipeline. The advantage is that you basically bake the light flow in open space instead of on the surfaces, making it easier to relight highly detailed objects. (Including animated objects.)

We don't use turtle, but I think you can use this approach with any software by placing a number of spheres in open space set to receive illumination, but not cast it. After baking to the spheres, you can convert the maps into spherical harmonics.
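Converting the baked sphere maps into spherical harmonics amounts to projecting the sampled radiance onto the SH basis over the sphere. A minimal sketch in Python, not Turtle or Quest3D code; the two-band basis, the uniform-sampling assumption, and all names here are illustrative:

```python
import math

# Real spherical harmonics basis, bands 0-1 (4 coefficients), for a unit direction.
def sh_basis(x, y, z):
    return [
        0.282095,      # Y_0^0  (constant band)
        0.488603 * y,  # Y_1^-1
        0.488603 * z,  # Y_1^0
        0.488603 * x,  # Y_1^1
    ]

def project_to_sh(samples):
    """samples: list of ((x, y, z) unit direction, radiance) pairs, assumed
    to cover the sphere roughly uniformly (Monte Carlo projection)."""
    coeffs = [0.0] * 4
    weight = 4.0 * math.pi / len(samples)  # solid angle per sample
    for (x, y, z), radiance in samples:
        for i, b in enumerate(sh_basis(x, y, z)):
            coeffs[i] += radiance * b * weight
    return coeffs
```

For constant radiance over the sphere, only the band-0 coefficient survives; the directional band-1 terms cancel, which is a quick sanity check on any SH projector.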

Then you would store all this data in one or more textures that contain the lighting info for the whole scene. The main issue I see is that you have to structure this data in some way, so you know where to look in the texture when lighting the scene. (Which pixel uses which lighting information.) The most logical layout seems to be a 2D grid dropped to, let's say, 1 meter above the ground. But some models might need additional layered structures to complete the light information. You would then do a full-screen pass per light structure to relight the scene. From Crytek I saw that they use this basic grid approach, but allow for manual relocation of the points.
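The grid indexing described above boils down to a world-position-to-texel lookup. A Python sketch under the assumption of a regular axis-aligned grid with one probe per cell, packed into a 2D texture; all names and the packing scheme are hypothetical:

```python
# Hypothetical regular probe grid over the scene: one probe per cell, probes
# packed into a 2D texture with one texture row per (layer, depth-row) pair.
def world_to_probe_texel(pos, grid_min, cell_size, grid_dims):
    """pos: (x, y, z) world position; grid_min: grid origin;
    grid_dims: (nx, ny, nz) cell counts. Returns the (u, v) texel of the
    probe whose cell contains pos (clamped to the grid)."""
    nx, ny, nz = grid_dims
    cell = []
    for p, o, n in zip(pos, grid_min, (nx, ny, nz)):
        i = int((p - o) // cell_size)
        cell.append(min(max(i, 0), n - 1))  # clamp to the grid bounds
    ix, iy, iz = cell
    # Stack the (iy, iz) layers vertically so the lookup texture stays 2D.
    return (ix, iy * nz + iz)
```

The clamp means positions outside the grid reuse the border probes, which is one simple way to avoid out-of-range lookups in the full-screen pass.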

It's a complex technique to get working and I've seen reports on light leaking issues. Once it works it is a great way to light a scene with high quality without being dependent on the object quality. (So it's great for many instanced plants for example.) It looks promising, but the engine and shader design is not easy it seems.
May 20, 2012, 02:13:53 pm
Yes, it's not that easy. I think we should go for the Far Cry 3 method. Lots of work: http://engineroom.ubi.com/wp-content/bigfiles/farcry3_drtv_lowres.pdf

ali-rahimi.net
May 21, 2012, 08:55:27 am
Yes, that is the one I was referring to. The idea is "simple": store directional light flow in open space instead of at surfaces. The whole indexing system makes it a bit tricky. For most of a scene a fixed set of layers in a regular grid will probably do, but you will probably need extra probes around a high-rise building or inside an interior. Then the whole indexing system will fail. Let alone the common issues with filtering and light leaking.
May 21, 2012, 01:07:31 pm
Do we have all the tools for rebuilding the Far Cry 3 GI solution? Just want to be sure before investing more in this paper  Wink Its performance and memory usage are fantastic, and it's designed for a fully deferred pipeline.

ali-rahimi.net
May 21, 2012, 04:11:13 pm
Well, I think you could bake it with any tool, and with HLSL you can shade it in Quest3D.

You do need something in between to encode the probe positions into some indexed map and place the light samples in a texture accordingly. So custom programming is probably required.
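That "something in between" could be an indirection table: a coarse index map over the scene that stores, per cell, which probe's samples to fetch. This also supports manually relocated probes, since moving a probe only changes its position, not the baked data. A rough Python sketch; nearest-probe assignment is just one possible policy, and all names are illustrative:

```python
# Indirection table: each scene cell stores the index of the probe whose
# baked samples it should use. Relocating a probe just changes its entry in
# probe_positions; the baked sample texture itself is untouched.
def build_index_map(cell_centers, probe_positions):
    """cell_centers, probe_positions: lists of (x, y, z) world positions.
    Assigns each cell the nearest probe (squared distance)."""
    index_map = []
    for c in cell_centers:
        best_i, best_d = 0, float("inf")
        for i, p in enumerate(probe_positions):
            d = sum((a - b) ** 2 for a, b in zip(c, p))
            if d < best_d:
                best_i, best_d = i, d
        index_map.append(best_i)
    return index_map
```

In practice this map would itself be written into a small texture so the shader can do the indirection in one extra fetch.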
May 24, 2012, 12:26:24 pm
I think first I should go for global ambient occlusion. It doesn't need any SH calculation or color. My main issue is only how to bake it into a volume texture. Actually, I don't understand volume textures completely; they look more like a 3D planar technique to my eyes. How should I put a single point in 3D space with a volume texture? Does it work like a 1x16 or 4x4 grid? Since its depth can be set to any power-of-two value, how does it work in general?

Found this one: http://msdn.microsoft.com/en-us/library/windows/desktop/bb206344(v=vs.85).aspx

Btw, DxTex.exe is not working correctly and the DirectX SDK installation returns an error http://www.microsoft.com/en-us/download/details.aspx?id=6812 So I am in a bad situation and can hardly see what a volume texture looks like. My only tool is the NVIDIA Photoshop plugin.

ali-rahimi.net
May 24, 2012, 02:02:56 pm
The volume texture is laid out with slices side by side, and each mip level blends from one slice to the next.

When you use a 3D sampler you supply a uvw texture coordinate, where w goes from 0 to 1. The slices are interpolated as you move through the volume.
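The interpolation along w can be illustrated in isolation. A small Python sketch that mimics linear filtering between slice centers at (i + 0.5)/n, which is how GPU samplers typically blend adjacent slices; one number stands in for a whole 2D slice here, and this is generic sampler behavior, not Quest3D-specific:

```python
import math

# slices: one representative value per slice. Slice centers sit at
# (i + 0.5) / n along w; between two centers the values are blended linearly.
def sample_depth(slices, w):
    n = len(slices)
    t = w * n - 0.5                            # continuous slice coordinate
    i0 = min(max(int(math.floor(t)), 0), n - 1)
    i1 = min(i0 + 1, n - 1)                    # clamp at the last slice
    f = min(max(t - i0, 0.0), 1.0)             # blend factor
    return slices[i0] * (1.0 - f) + slices[i1] * f
```

So with two slices, w = 0.25 still lands exactly on slice one, w = 0.5 is an even blend, and w = 0.75 lands on slice two.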
May 24, 2012, 03:08:40 pm
Thanks Tom. I need to find out, for example, how I can put a single point in a volume texture at the (0,0,0) world-space position. When I open this 3D texture in Q3D, in the Texture 3D property I can see 8x-1920. Why -1920? And which slice represents which direction?


* volume-texture-result.jpg (116.38 KB, 1440x986 - viewed 360 times.)

* volume-texture-photoshop.jpg (37.32 KB, 2048x256 - viewed 303 times.)
* volumeTexture.dds (439.12 KB - downloaded 139 times.)

ali-rahimi.net
May 24, 2012, 03:13:05 pm
Could you send me a volume texture with a single point in it? Then I will open it in Photoshop to see how it works.

ali-rahimi.net
May 24, 2012, 03:50:44 pm
It is all about the texture coordinates in the sampler. The texture coordinates are u, v, w, where w is the depth in the volume. The coordinates go from 0 to 1 relative to the texture. You have to map world space to these coordinates in your shader.
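The world-to-texture mapping is just a normalization against the volume's world-space bounds. A CPU-side Python sketch of the same math the shader does; `volume_min` and `volume_size` are illustrative parameters for a cube-shaped volume:

```python
# Normalize a world position against the volume's world-space bounds to get
# uvw in [0, 1]. Assumes a cube volume of edge length volume_size whose
# minimum corner sits at volume_min.
def world_to_uvw(pos, volume_min, volume_size):
    return tuple((p - o) / volume_size for p, o in zip(pos, volume_min))
```

Anything outside the bounds falls outside [0, 1] and is then handled by the sampler's address mode (wrap, clamp, and so on).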

To make a volume texture, use the DirectX Texture Tool, and the NVIDIA plugins for Photoshop to load it. It is very simple: the slices (think of a CT scan) are placed side by side.
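Since the strip seen in Photoshop is just the slices side by side (for example, a 2048x256 image holding eight 256x256 slices, as in the attachment above), splitting it back into individual slices is a simple reindexing. A Python sketch with pixels as a flat row-major list; the function name is illustrative:

```python
# Split a side-by-side slice strip (width = n * slice_size) back into its
# individual square slices. pixels is a flat row-major list of values.
def strip_to_slices(pixels, width, height, slice_size):
    n = width // slice_size
    slices = [[] for _ in range(n)]
    for y in range(height):
        for x in range(width):
            # Column block x // slice_size decides which slice a pixel joins.
            slices[x // slice_size].append(pixels[y * width + x])
    return slices
```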
May 24, 2012, 04:37:50 pm
So, for example, is this how I should place a point in 3D space?


* volume-texture-2.jpg (20.89 KB, 2048x256 - viewed 340 times.)

ali-rahimi.net
May 24, 2012, 04:56:33 pm
Wow, great. Thanks a lot, Tom. The trick was to use a world matrix multiply for the UVs, just like clipmap terrain. Now I can bake global AO into the volume texture and the rest. Good progress  Grin
Code:
float4x4 world        : World;
// Vertex shader: transform the scaled position into 0..1 volume coordinates.
Out.pos_uv = mul(In.position / Volume_Scale, world).xyz;
// Pixel shader: sample the volume texture with the interpolated uvw.
float3 Volume_Texture = tex3D(Volume_Texture_Sampler, In.pos_uv).rgb;

ali-rahimi.net