"Marriage is neither heaven nor hell, it is simply purgatory."
- Abraham Lincoln
More pages: 1 ... 11 ... 21 ... 31 ... 41 ... 51 ... 61 ... 71 ... 81 ... 91 ... 101 ... 110 111 112 113 114 115 116 117 118 119 120 ... 131 ... 141 ... 151 ... 161 ... 171 ... 181 ... 191 ... 201 ... 211 ... 221 ... 231 ... 241 ... 251 ... 261 ... 271 ... 281 ... 291 ... 301 ... 311 ... 321 ... 331 ... 341 ... 351 ... 361 ... 371 ... 381 ... 391 ... 401 ... 411 ... 421 ... 431 ... 438
zhugel_007
Tuesday, March 15, 2011

Actually, because the depth buffer is not used, this method would not work for alpha-tested objects.

Zhugel_007
Monday, March 14, 2011

Actually, something bizarre is going on if linear depth is used. It looks like the per-pixel depth is not interpolated correctly.

Zhugel_007
Monday, March 14, 2011

Really nice article. It answers a lot of questions I had. One more question though: I understand what you've explained in your article, but what if I force Z to be linear? (Play with the projection matrix by dividing by fFar:
mProj._33/=fFar;
mProj._43/=fFar;
and multiply by W in the vertex shader:
float4 vPos = mul(Input.Pos, worldViewProj);
vPos.z = vPos.z * vPos.w;
Output.Pos = vPos;
Would there be any visual difference in the final rendering result compared with using hyperbolic Z? I did this test, and visually it looks the same as with hyperbolic Z. Also, would this break Hi-Z? (It looks OK in PIX, though.)
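
Editor's note: for readers wondering why this modification yields a linear depth value, here is a small standalone C++ sketch reproducing the arithmetic. It is not code from the demo; the near/far values are made up, and it assumes the standard D3D row-vector projection where only _33 and _43 contribute to z (z_clip = z_view * _33 + _43, w_clip = z_view).

#include <cstdio>

int main()
{
    const float zn = 1.0f, zf = 100.0f;          // arbitrary near/far planes for illustration
    const float m33 = zf / (zf - zn);            // _33 and _43 of D3DXMatrixPerspectiveFovLH
    const float m43 = -zn * zf / (zf - zn);

    // The commenter's change: divide the z-producing entries by fFar.
    const float lm33 = m33 / zf;
    const float lm43 = m43 / zf;

    for (float z = zn; z <= zf; z += 24.75f)
    {
        // Normal pipeline: z_clip / w_clip with w_clip = z_view, giving hyperbolic depth.
        float hyperbolic = (z * m33 + m43) / z;
        // With vPos.z *= vPos.w in the vertex shader, the divide by w cancels out,
        // leaving z * lm33 + lm43 = (z - zn) / (zf - zn), which is linear in view-space z.
        float linear = ((z * lm33 + lm43) * z) / z;
        printf("z=%6.2f  hyperbolic=%.4f  linear=%.4f  (z-zn)/(zf-zn)=%.4f\n",
               z, hyperbolic, linear, (z - zn) / (zf - zn));
    }
    return 0;
}

One caveat, and presumably what the follow-up comment above about per-pixel interpolation is running into: the hardware interpolates the post-divide depth linearly in screen space, which is exactly what makes the usual hyperbolic Z perspective-correct. A value that is linear in view-space z interpolated the same way is no longer correct inside a triangle, so per-pixel depth can differ even when the per-vertex values match.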

Humus
Sunday, March 13, 2011

TomF: Forgot to answer the copying question ... it needs the backbuffer as an input while also rendering to it. If the hardware could read from and write to the same surface in the same draw call, no copy would be needed. In DX10 it's explicitly disallowed, but I guess it might work on some hardware in DX9, and I suppose it might work on consoles.
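
Editor's note: as an illustration of that restriction, here is a minimal sketch of the kind of copy involved, assuming a D3D10 setup. The function and parameter names are hypothetical, not from the demo.

#include <d3d10.h>

// Hypothetical helper: works around D3D10 forbidding the same surface to be bound
// as both render target and shader resource in one draw call, by duplicating the
// freshly rendered frame before the GPAA pass samples it.
void CopyBackbufferForGPAA(ID3D10Device* device,
                           ID3D10Texture2D* backbuffer,          // current render target
                           ID3D10Texture2D* backbufferCopy,      // same size/format, created with D3D10_BIND_SHADER_RESOURCE
                           ID3D10ShaderResourceView* backbufferCopySRV)
{
    // Duplicate the rendered frame into a readable texture.
    device->CopyResource(backbufferCopy, backbuffer);

    // The GPAA pass then samples the copy while writing to the real backbuffer.
    device->PSSetShaderResources(0, 1, &backbufferCopySRV);
}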

WheretIB: Well, antialiasing the edges of transparent surfaces should work fine, I think (I haven't tested it, though). However, background edges behind a transparent surface would smear pixels onto the transparent surface. This may not be very visible, though. But in any case it should be possible to separate the rendering into two passes: opaque geometry followed by its GPAA pass, then transparent surfaces on top of that, and then GPAA of that.
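
Editor's note: a rough sketch of that pass ordering, with placeholder function names (assumptions for illustration, not code from the demo); only the ordering is the point.

// Placeholder declarations; each pass stands in for the demo's own rendering code.
void RenderOpaqueGeometry();      // opaque scene into the backbuffer
void RunGPAAPass();               // copy backbuffer, draw geometric edges as lines, blend
void RenderTransparentGeometry(); // transparent surfaces composited on top

void RenderFrame()
{
    RenderOpaqueGeometry();
    RunGPAAPass();                // antialias the opaque edges first
    RenderTransparentGeometry();  // transparent surfaces cover the already-antialiased background
    RunGPAAPass();                // then antialias the transparent surfaces' own edges
}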

WheretIB
Sunday, March 13, 2011

Since the depth-buffer is used to remove lines that are hidden, I figure that it doesn't work with transparent surfaces?

Humus
Sunday, March 13, 2011

Tom: The depth-buffer is not used other than for removing lines that are hidden. No particular information is extracted from it during antialiasing, but the deferred rendering part of the demo does of course read it.

TomF: I was unaware of that paper, but the idea is indeed quite similar. The difference is that I blend as a post-step over the final image, whereas AFAICT he's rendering antialiased lines and doing the shading of both sides of the edge.

Ben
Sunday, March 13, 2011

It looks really really good! I'm curious to see scenes with more geometric detail.
