
View Full Version : Fragment Program and Pixel Locations



potmat
04-12-2005, 07:45 AM
Hello all,

I know that it's theoretically not possible for a fragment shader to affect pixel location. But does anyone know any hacks or work-arounds for this?

Can fragment shaders write to arbitrary locations in texture memory?

Basically what I'm asking is, if we think of the output of the rendering pass as a 2D array, is it possible for a fragment shader to write to any arbitrary location in that array?

Relic
04-12-2005, 08:19 AM
Absolutely no way.
Imagine a super-duper chip which hardwires one pipe per pixel on your screen :cool: .
As soon as you are in that pipe, you're locked down to its final position.

The workaround would be to render a grid of GL_POINTS carefully adjusted to pixel centers (no FSAA!) and program the vertex pipeline to do the offset math.
If you have a texture dependent displacement in mind, you'd need a GeForce 6 using vertex textures.

Or, if your algorithm is able to store a unique inverse offset per pixel ("From where should I read?") inside a floating point texture, this could be done with a dependent texture lookup in the second pass of a fragment program.
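A minimal GLSL sketch of that second pass (the texture names and the offset encoding are illustrative assumptions, not a fixed recipe):

// Fragment shader for a full-screen pass over the previously rendered image.
uniform sampler2D sceneTex;   // result of the first rendering pass
uniform sampler2D offsetTex;  // per-pixel inverse offsets (r = du, g = dv), float texture

varying vec2 uv;              // screen-space texcoord in [0,1], from the vertex shader

void main()
{
    vec2 inverseOffset = texture2D(offsetTex, uv).rg;        // "from where should I read?"
    gl_FragColor = texture2D(sceneTex, uv + inverseOffset);  // dependent read
}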

Obli
04-12-2005, 10:41 AM
Well, some limited fake displacement seems to be possible.
On the NVidia developer site http://developer.nvidia.com they're giving away some free chapters from the latest GPU Gems book. There's a chapter which explains per-fragment displacement mapping.
It's not quite as flexible as you may need, but maybe you should take a look.

dorbie
04-12-2005, 11:51 AM
That is a displaced, texture-dependent read, not moving the pixel. You can attempt to move texture fetches, but ultimately the fragments you get are those from the primitive being rasterized. The net result may be a pretty good approximation of what you would have done with displaced pixels.

potmat
04-13-2005, 09:19 AM
Thanks all, that's pretty much what I've been trying, but, as you all point out, those methods are messy and not very robust.

What about vertex programs? Can they write to arbitrary locations, in texture memory or otherwise? I suspect that they can, but I'm really just getting started with shaders.

LarsMiddendorf
04-13-2005, 09:38 AM
Carmack talked about screen space displacement mapping at the last QuakeCon.
You could project the offset vector from (steep) parallax or relief mapping into screen space. Then you've got the vector by which the current pixel should be moved. But how can you get the inverse offset ("From where should I read?")?
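Roughly, the forward projection might look like this in GLSL (names are illustrative and a per-surface TBN basis is assumed; this only gives you the forward offset, not the inverse one):

uniform mat3 tangentToEye;   // TBN basis for the surface (assumed available)
uniform mat4 projection;     // current projection matrix
uniform vec2 viewport;       // viewport size in pixels

// tangentOffset: the offset found by parallax/relief mapping, in tangent space
// eyePos:        the fragment's eye-space position (passed in as a varying)
vec2 screenSpaceOffset(vec3 tangentOffset, vec3 eyePos)
{
    vec3 eyeOffset = tangentToEye * tangentOffset;            // tangent -> eye space
    vec4 p0 = projection * vec4(eyePos, 1.0);                 // original position
    vec4 p1 = projection * vec4(eyePos + eyeOffset, 1.0);     // displaced position
    vec2 ndcDelta = p1.xy / p1.w - p0.xy / p0.w;              // clip -> NDC
    return 0.5 * ndcDelta * viewport;                         // NDC -> pixels
}

Inverting that mapping is exactly the open question.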

michagl
04-14-2005, 01:31 PM
there is a boiling desert heat type effect that the playstation 2 seems to do very easily, i.e. the whole scene swims globally.

i figure for non-ps2 hardware it would be achieved by rendering the scene to a texture slightly larger than the screen, and then, once you are through, writing that texture to the frame buffer while perturbing the fragment reads.

for the ps2 though it seems to be a built-in effect.
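a minimal glsl sketch of what i mean for the second pass (texture and uniform names are just made up for illustration):

// Second pass over a full-screen quad; sceneTex holds the scene rendered
// slightly larger than the screen so the perturbed reads never fall off the edge.
uniform sampler2D sceneTex;
uniform float time;

varying vec2 uv;

void main()
{
    // cheap analytic wobble; a scrolling noise texture would work just as well
    vec2 wobble = 0.004 * vec2(sin(uv.y * 60.0 + time * 3.0),
                               cos(uv.x * 60.0 + time * 2.0));
    gl_FragColor = texture2D(sceneTex, uv + wobble);
}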

would it be worthwhile to have programmable frame buffer swapping units?

sincerely,

michael

michagl
04-14-2005, 01:44 PM
Originally posted by LarsMiddendorf:
Carmack talked about screen space displacement mapping at the last QuakeCon.
You could project the offset vector from (steep) parallax or relief mapping into screen space. Then you've got the vector by which the current pixel should be moved. But how can you get the inverse offset ("From where should I read?")?

are you sure??? wouldn't this leave all sorts of holes? i can't imagine what kind of technique you could use to ensure that no holes would be created. and how do you assign proper depth values? or are they talking about doing this before the fragment shader, entirely in hardware?

LarsMiddendorf
04-15-2005, 12:16 AM
I have some interesting thoughts for being able to do sort of a screen space displacement mapping, where we render different offsets into the screen and then go back and render the scene warping your things as necessary for that, which would solve the T-junction cracking problem that you get when using real displacement mapping across surfaces where the edges don't necessarily line up.
http://www.gamedev.net/community/forums/topic.asp?topic_id=266373

This is an interesting idea. But how to get the correct offset if you only know how far the current pixel should be moved? Perhaps some kind of filtering or flow simulation?

dorbie
04-15-2005, 01:15 AM
It initially seems possible with an image based approach, but the question of where you read the image from embodies a complex problem, though perhaps not for the reasons assumed. You have an offset vector that could approximate the destination, depending on how the assets are represented (you can choose to have the polygon hull enclose an entirely negatively offset displacement relative to the true surface, for example). The real issue is resolving occlusion among the offset fragments. Each sample gets a single read on the subsequent pass and a single offset value. Cracks are a non-issue due to texture filtering; every fragment gets hit. The silhouette is described by alpha: the rgb image is actually RGBA, with the alpha term determining the final displaced silhouette.

So making this work is actually a question of rendering your offset vector map in a way that resolves the occluded offset issue. That may be possible when rendering the offset vectors to screen space.

Without resolving occlusion in the offset vector map, I think you could fix this with a depth map and multiple iterative image fetches, almost like a search; it's no worse than the iteration through the map now for accurate tracing. It just does it in screen space.

Definitely doable I'd say, worth it? I dunno.. One problem would be newly revealed fragments on the same model. For example, a body displacing in to reveal a hidden limb. Perhaps that could be fixed by a hull with all positive displacements, but it's a second order effect. The magnitude of the displacement and the required search is a performance limiting factor, and a positively displaced hull has no starting vector for the displacement search.
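As a rough GLSL sketch of that kind of search (the encoding of the offset map is an assumption: each source texel stores where it wants to move to, in texture coordinates; this illustrates the iteration, not a complete answer to the occlusion issue):

// Fixed-point search for the inverse of a forward offset map.
uniform sampler2D sceneTex;
uniform sampler2D offsetTex;   // at each *source* texel: the vector it wants to move by

varying vec2 uv;

void main()
{
    vec2 src = uv;                          // initial guess: no displacement
    for (int i = 0; i < 8; ++i)             // a handful of refinement steps
    {
        vec2 fwd = texture2D(offsetTex, src).rg;
        src = uv - fwd;                     // step toward the true source
    }
    gl_FragColor = texture2D(sceneTex, src);
}

This only converges where the warp is well behaved; several sources landing on the same destination (the occlusion case above) still has to be resolved some other way, e.g. with the depth map mentioned.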

michagl
04-25-2005, 08:06 PM
i lost track of this topic... not that it went too far.

anyhow, here is an answer to my own question that i ran into today:

http://www.ati.com/developer/gdc/Tatarchuk-ParallaxOcclusionMapping-FINAL_Print.pdf

i wouldn't call this 'screen space' displacement mapping (which would be awesome and probably voxel related). i guess it is texture space displacement mapping. the silhouettes always reduce this sort of stuff to little more than a visual hack for me. bump mapping at the end of the day is really only good for speckle type surfaces, and faces facing the camera.

edit: traditional silhouetteless displacement bump mapping also works well with inlays such as on jewelry, where the silhouette can easily be hidden... but let's talk about silhouette bump mapping please!

michagl
04-25-2005, 08:13 PM
i dunno, all the images in that paper were clipped at the silhouettes, but it seems like, since it uses negative displacement mapping, if the outer geometry completely encompassed the inner geometry (a hull, as dorbie suggested)... it seems to me like it might be possible to use this texel space raytracing approach to achieve proper silhouettes by assigning alpha values to silhouette rays which escape the surface... you might even be able to do some sort of antialiasing on the silhouettes.
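something like this sketch is what i have in mind, assuming the relief map stores height in its alpha channel and we march a fixed number of linear steps (all names and conventions here are just assumptions, not what the paper does):

uniform sampler2D reliefMap;   // assumed: height in alpha, 1.0 = hull surface, 0.0 = deepest

// entryTS: entry point of the view ray on the hull, in tangent space (z = 1.0)
// stepTS:  per-iteration step along the view ray (stepTS.z < 0.0, marching downward)
bool marchRay(vec3 entryTS, vec3 stepTS, out vec2 hitUV)
{
    vec3 p = entryTS;
    for (int i = 0; i < 32; ++i)
    {
        p += stepTS;
        if (p.x < 0.0 || p.x > 1.0 || p.y < 0.0 || p.y > 1.0)
        {
            hitUV = p.xy;
            return false;                            // ray escaped the tile: silhouette
        }
        if (p.z <= texture2D(reliefMap, p.xy).a)
        {
            hitUV = p.xy;                            // hit the displaced surface
            return true;
        }
    }
    hitUV = p.xy;
    return true;                                     // ran out of steps: treat as a hit
}

// in main(): if (!marchRay(entry, step, hitUV)) { gl_FragColor.a = 0.0; }  // or discard

instead of a hard cut, the escaping case could write a partial alpha, which is where the crude antialiasing idea would come in.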

maybe i should give that paper another look over.

sincerely,

michael

edit: sorry dorbie about not noticing your suggestion to use alpha fragments on the silhouettes (admittedly, in a hurry, i sort of skimmed your post because it looked wordy and complicated). maybe someone here might find that paper interesting. i'm considering looking into doing this... i think it would work very well with the system i'm building right now... i need some way to sop up gpu cycles as it is.

michagl
04-25-2005, 08:31 PM
oooo, i find this an exciting prospect.

anyone think it might be possible to use a thin filtered 3d texture (say 512x512x8) with this sort of technique to lay down a 'pockmarked' surface?

about the most complicated microscopic terrain i can think of is soft soil/mud rutted up by horses... or maybe a loose gravel road.

so might it be possible to capture a loose gravel road in a 4~8 texel deep filtered 3D texture with this technique?

could even do a scaly type surface maybe.

LarsMiddendorf
04-26-2005, 03:24 AM
You could possibly get correct silhouettes by transforming the three edges of the triangle into texture space and killing those pixels that lie outside of the triangle. That would be three DP3 instructions and a KIL.
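In GLSL terms (discard plays the role of ARB_fragment_program's KIL), a sketch could look like this, with the three edge equations precomputed per triangle and fed in however you like; the names are purely illustrative:

// Edge i of the triangle, expressed in texture space as a line:
// edgeI.x * u + edgeI.y * v + edgeI.z >= 0 for points inside the triangle.
uniform vec3 edge0;
uniform vec3 edge1;
uniform vec3 edge2;

void clipToTriangle(vec2 displacedUV)
{
    vec3 p = vec3(displacedUV, 1.0);
    // three dot products, then kill anything outside: DP3 + KIL in ARB terms
    if (dot(edge0, p) < 0.0 || dot(edge1, p) < 0.0 || dot(edge2, p) < 0.0)
        discard;
}

The test is applied to the texture coordinate the displaced fetch came from, so fragments whose hit point falls outside the current triangle's texture footprint get dropped.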

knackered
04-26-2005, 04:15 AM
http://www.opengl.org/discussion_boards/cgi_directory/ultimatebb.cgi?ubb=get_topic;f=3;t=012842

michagl
04-26-2005, 09:08 AM
Originally posted by LarsMiddendorf:
You could possibly get correct silhouettes by transforming the three edges of the triangle into texture space and killing those pixels that lie outside of the triangle. That would be three DP3 instructions and a KIL.

i was thinking yesterday after i logged off that there is probably no straightforward way to determine silhouette pixels after displacement. i could be very wrong... but really tangent space is flat locally, i believe... so there seems to be no way of determining in the fragment shader alone if a ray cast escapes tangent space.

btw, what is a 'kill' and how do you do it?

michagl
04-26-2005, 09:30 AM
Originally posted by knackered:
http://www.opengl.org/discussion_boards/cgi_directory/ultimatebb.cgi?ubb=get_topic;f=3;t=012842

wow! this is simply awesome... i didn't realize so much could be achieved so flawlessly...

i definitely want to do this. the latest system i'm working on (which i've been most vocal about in this forum) is the perfect platform for this technology. i think i will use a multi-pass shader with mipmapping to vary the effects and sampling frequency, as suggested in the paper i posted.

did fpo never release any high-level shader code for this effect?

it would be very cool if fpo would chime in here.

i have a real-time clod nurbs system with geometric displacement mapping running for macroscopic displacement. tangent space on the fly is practically a necessity for normal generation anyhow. the density of the screen space vertices can be leveraged to ensure that the triangles requiring the deepest paths / higher sampling frequency can be allotted fewer pixels.

anyhow, i would be very interested in either some high-level cg/glsl code, or a fairly detailed breakdown of the method.

sincerely,

michael

Brolingstanz
04-26-2005, 05:50 PM
potmat, might i ask what specifically you had in mind for this, or was this sort of a generic kind of inquiry?

personally i don't think the texture kludge is all that ugly, but this clearly depends on the objective, and of that you would certainly know better than i. since hacks are very task specific, some task specifics might be helpful. by that i mean the higher level objective beyond the lower level implementation of "moving pixels", if such a distinction can be made.

respectfully,
bonehead

michagl
04-27-2005, 09:13 AM
as far as potmat is concerned, there is no literal way to move fragments with a frag shader... these are viable alternatives though. remember a forum isn't here just for potmat or any individual, it is for everyone, and for positive discourse.

bbs forums bother me a bit i must admit. every time i see a new response to a thread i'm happy to see discussion moving (wherever it goes) but i still can't resist crossing my fingers and praying that the new posts are positive.

opengl isn't the product of any particular society for any particular society. just loosen up and enjoy yourself. why would you want to force everyone to act like you, as long as the intended direction is positive?

if potmat wanted something more specific geared to their application, then just like everyone else, they are free to chime in.

FPO's work is very exciting in my opinion. if he is not going to grace us with the promised paper and demo, then at the least i think he owes us an explanation of his disposition.

i have a vague idea of what is going on with his routine, but it would save me and probably a lot of others some heartache if fpo would be more giving.

that is all i'm waiting for.

if people like, i would be quite happy to start a discussion of the ins and outs of this approach.

i'm especially interested in how fpo's curvature metric is derived. is it literal gaussian curvature or what, and how is it approximated and utilized?

i can make guesses, but i also believe that such forums should be used to make people's lives easier for the betterment of the scientific knowledge of the human race in general, and not just as a last resort after the internet and pricey textbooks and commercial books have been exhausted.... i mean it's not as if this forum is so flooded with discussion that it can't be managed. it's pretty dry actually in my opinion.

sincerely,

michael

michagl
04-27-2005, 10:14 AM
sorry, but i felt like i had to add some more.

i'm personally very excited about this business. i can see a sizable chunk of the future of computer graphics in it.

we can't throw triangles at the rasterizer that are no bigger than a pixel... that defeats the purpose of using triangles in the first place. but on the other hand we need believable silhouettes right down to the pixel.

i believe the future is in environments that are so large in scope and scalability that it is preposterous to precompute everything and store it on disk... that is, every vertex should be sampled at run-time, most likely from some infinitely scalable parametric base geometry (nurbs control mesh) and a combination of various sorts of map encoded data.

i've done a lot of work to manage clod systems. and my focus has changed in the meantime from trying to beat static precomputed algorithms to simply managing an environment where all data is sampled at run-time. that is where the real bottleneck is. it's not necessarily about efficiently displaying data, as much as it is about retrieving the data.

this texture space displacement algorithm allows the need for high resolution detail in the geometry to be pushed back even further, meaning that run-time sampling can be relaxed because you can rely on the fragment shader to pick up the slack in the geometry department. (which is especially helpful once you get into deformable run-time sampled geometry).

this fragment geometry is awesome really, it's like a little barycentric lattice deformer. how long will we have to wait until hardware supports *robust* vertex lattice deformations? but for fragment shaders we have them right here.

i just think this is awesome.

i think this linear/binary sampling should be handled in a single instruction entirely on hardware.

i would really like to talk about how this silhouette capable curvature based shader differs from, say, the shader outlined in the ATI paper i referenced earlier.

i'm assuming the ray being cast is parabolic rather than linear. the curvature 'bends' the shape of the ray.

i would like to know if the curvature can be sampled from a nurbs surface straightforwardly via a derived surface.

the ATI presentation says that the u and v vectors of tangent space are derived from 'b' and 't' basis vectors... does some relationship between these vectors have something to do with the curvature of the surface?

any other ideas?

i fully intend to do whatever investigation i can in my free time. i will have to drag out some books and hit the internet i guess. i wish i could give this investigation a higher priority. that is why i'm hoping for some leads here.

sincerely,

michael

dorbie
04-28-2005, 09:58 AM
Guys, behave, I've had to delete the flames in this thread and that's no fun.

@michaelgl, as knackered gently (and correctly) pointed out you should take a break between posts and maybe wait for a reply or try an edit or two. If readers see a whole series of posts from one poster it merges into a rambling monologue that nobody will read. Yes this is cool stuff, but take a breather between posts and read the posts of others in the thread to get a discussion going.

potmat
04-29-2005, 10:10 AM
Hello all, sorry I haven't checked in for a while, but I thought it was pretty much sewn up with the fact that there's really no way for the fragment shader to affect pixel position. Why was I trying this? Basically, if I render to an FBO I have an image stored on the graphics card. If I then want to do some kind of image based rendering using that image (or several) I may need to arbitrarily rearrange them. So far the only thing that works is rendering a mesh of vertices the size of the screen (one vertex per pixel) and then using displacement mapping, as stated early in the thread. I was hoping there would be some faster/easier way.
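For reference, the vertex-side half of that working approach might look roughly like this in GLSL (it needs vertex texture fetch, i.e. the GeForce 6 class hardware Relic mentioned; the names and the offset encoding are my assumptions):

// Vertex shader for a grid of GL_POINTS, one vertex per pixel, texcoords at pixel centres.
uniform sampler2D offsetTex;   // per-pixel destination offset, in pixels
uniform vec2 viewport;         // e.g. vec2(1024.0, 768.0)

void main()
{
    vec2 uv = gl_MultiTexCoord0.xy;                       // this point's pixel centre, in [0,1]
    vec2 offs = texture2DLod(offsetTex, uv, 0.0).rg;      // where the pixel should end up
    vec2 ndc = ((uv * viewport + offs) / viewport) * 2.0 - 1.0;
    gl_Position = vec4(ndc, 0.0, 1.0);                    // already in clip space
    gl_FrontColor = gl_Color;                             // or do the colour fetch in the fragment stage
}

The application side just draws width x height points with their texcoords snapped to pixel centres, as described earlier in the thread.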

dorbie
04-29-2005, 10:43 AM
This is not the only way, see fpo's recent post where he iterates a parabola through a tangent space height map with negative bump displacement relative to his mesh.

Also, the image based pass described in my post above could use dependent reads for the final fetches based on shaded displacement vectors (or at least tangent space partial derivatives). As I said, that could be an iterative search.

I don't understand why folks are posting to every thread but the interesting one where fpo actually presents a working solution.

michagl
04-29-2005, 04:49 PM
potmat, just a guess, but are you sure you can't just render a single quad on the second pass and do the fragment displacements entirely in the frag shader somehow... at the least that would avoid the geometry overhead.

----

sorry dorbie, but i'm not much for self regulation. first of all, if i tried to second guess myself i'm sure i would never go anywhere. i have enough trouble getting through the day under the radar. as for the deleted posts, you didn't delete anything i'll miss. i'm not going to let someone step on my toes, but if you had deleted the negative posting before anyone caught wind of it, no one would've had to reply to it.

dorbie
04-29-2005, 07:45 PM
michaelgl, I have better things to do with my time than nanny flamers all day long :-). The reason I deleted the flamefest was that I thought it directed unfair criticism at you; if I'd read it sooner I'd have deleted it sooner.

As for self regulation, it's your reputation, not mine. You're obviously a smart guy; that gets lost on some when you make a multi-post monologue, post without reading the contents of your own links, download something but post redundant questions before your download is done, or don't check the contents of the thread you hastily post to.

Don't let me cramp your style :-)

Happy posting.

P.M. me if you want to discuss this.

michagl
04-29-2005, 08:28 PM
[OFF-TOPIC]

no problem, your points are taken, and i hope you didn't get the impression that i was faulting you for any reason... i just meant 'if' as in literally 'if'... not as in implying you are at any fault.

sorry, i'm very literal... my brain isn't well adapted, i'm afraid, for inexplicit meaning.

as for my posting irregularities and laziness... a great deal of it is laziness. i also like to think that the first words out of the horse's mouth are the most honest, but mostly it's just my 300mhz internet terminal and a weak internet connection. moving about the internet for me is clunky, and can get distracting as i move my attention to other machines.

unfortunately having development machines on the internet is a security risk. but i'm expecting a new (second hand) 1.5ghz internet terminal any day now, as much as i hate to further demote my beloved portable. (i really need to put linux on it, but the only solution on the market is a commercial app, PartitionMagic).

and this is the last i have to say. i don't want to muck up any thread, but would also prefer to air these grievances publicly. a little personal banter never hurt anyone, even in a discretionary public forum.

basically i have to do things my way or no way. i like it here... most comfortable bbs i've encountered so far. i'm confrontational and accustomed to taking flak. i'm used to it, so please forget it.

thanks for the 'smart guy' bit as well... i'm flattered really. i try my best to keep my priorities in order.

michagl
04-29-2005, 08:40 PM
just for the record:


post redundant questions before your download is done, or don't check the contents of the thread you hastily post to.

i've never posted any redundant questions (go ahead and try to fault me), and as for checking contents and hasty posting, there are a lot of people in this forum who have been guilty of as much... especially when a thread gets grueling, it's easy to make an oversight. and if you have a thought, get it out before you lose it is my motto.

dorbie
04-29-2005, 09:06 PM
i've never posted any redundant questions

No?

"i'm downloading... just because you didn't say so explicitly, do these demos demonstrate proper silhouettes?"

You need a new motto :-)

FWIW, I'm not coming down on you, just offering some friendly advice, and you're right, this is off topic, but my intent is to nip this in the bud.

P.S. I do see the contributions you make; that's why the personal attack I deleted here was so unfair. More people could benefit from your contributions if you kept your powder dry; that's why both knackered and I came to your defense and offered the advice we did. Thanks for listening, neither of us is perfect, heck if knackered and I are offering you advice on netiquette then maybe there's something behind it :-)

michagl
04-30-2005, 11:36 AM
"i'm downloading... just because you didn't say so explicitly, do these demos demonstrate proper silhouettes?"

honestly i thought it would be useful for anyone reading the thread to know what was in the demos without having to download them... they are decent-sized downloads, and since i don't have the hardware to just up and run them, the only way i could verify this without asking would be to get around to picking the shaders apart myself, or maybe pick apart the discussion for hints (whatever might be easier, a straight statement would've been easier for everyone)... plus at the same time i was kind of making it known that some people would like to see demos with silhouettes even if they aren't perfect. as soon as i get my hands on some hardware that can run this stuff, you can bet i will give some thought to perfecting the silhouettes if someone doesn't beat me to it... i'm also curious if the silhouettes are just 'not good enough' for extreme displacement, or if they work well enough for subtle displacement... personally the screens all look really good to me whenever the surfaces are not obsessively displaced.



FWIW, I'm not coming down on you, just offering some friendly advice, and you're right, this is off topic, but my intent is to nip this in the bud.

believe me, i try to keep everyone happy. but it's easier for you to sit back and complain; for me it's difficult to empathize with you because i can't really grasp what your issues are, or even how i would begin to reform my behavior without driving myself crazy.

all i expect of people is personal empathy. you can't force people to behave a certain way without asking really too much of them psychologically. i wouldn't inflict anything on anyone here that i wouldn't inflict on myself... and i don't expect more than that of anyone.

i will try to keep people happy... but my reality/culture doesn't really resemble silicon valley office etiquette or an american sitcom (both of which i find equally confounding), so it's personally difficult for me to relate. but i try nevertheless.
i'm not going to let people's requests of me warp my brain though. so if i'm really too much to handle, i can just walk away, or resort only to antiseptic language, which will undoubtedly limit my interaction with this community purely to one of absolute necessity as opposed to comfort and outgoing support.

i'm all for healthy bbs communities... unfortunately in my experience most bbs communities do not really meet this criterion. this bbs so far is quite bearable and informative. i'm really shocked how much i've frequented this forum since i registered, and in relative comfort.

for now though, this is enough words on such matters from me. let's move forward please. not that these sorts of diversions are not a necessary part of a healthy introspective bbs community.

i would be very happy to discuss silhouette conformant relief mapping in here as well as other inverse fragment sampling techniques in this thread as far as potmat and others would allow it.

i really believe a derivative of this technique will be a major cornerstone of low-level 'image synthesis' in the years to come, and i would be happy to discuss it in any sort of divergent fashion.
----

i'm mostly interested in harnessing the technique for sub-vertex smoothing and not merely detail displacement mapping. preferably i would like to see a sub-vertex smoothing technique that is independent of the linearity of the base geometry... i.e. the displacement map should not reflect the linear nature of the approximated geometry it is applied to.

you can easily generate a displacement map by projecting a high resolution model onto a low resolution model, but how could you use a displacement map designed to be applied to a 'perfect' approximation of the geometry, even when the geometry is less than perfect?

also interested in better techniques for producing per-vertex curvature approximations, especially generating consistent curvature data given a control mesh and a parametric sampling technique, i.e. nurbs.

PS: still interested in what a fragment 'KIL' operation is and where i can read about it (i have looked)... it's not in my dated Cg docs best i can tell, and i can't figure out what to give a browser to work with. should i just download new docs?

EDIT: found the DISCARD operator in a Cg frag program.

dorbie
04-30-2005, 07:14 PM
Sigh, I could kick myself for encouraging this.

Please keep your ill-founded stereotypes to yourself and no more threadjacking.