Little bug in Nvidia's SRAA algorithm.



 
 
  #1   March 12th 11, 04:44 AM
Skybuck Flying
Posted to: alt.comp.borland-delphi, alt.comp.periphs.videocards.nvidia, alt.comp.periphs.videocards.nvidia.programming, comp.graphics.algorithms, comp.graphics.api.opengl

It seems Nvidia recently invented something similar, but for whole scenes
instead of one little pixel like mine.

They call it "SRAA", which stands for "sub-pixel reconstruction
antialiasing". I came across the term while reading some DICE publications
about the Frostbite 2 engine and Battlefield 3, which of course interests me
as a shooter fan! =D

A Google search took me to these documents/webpages:

http://anteru.net/projects/research/...-antialiasing/

http://research.nvidia.com/sites/def...ions/I3D11.pdf

Picture 2a is a two-dimensional version of what I was talking about in my
original posting (see below): bitmap pixels lying halfway between the screen
pixels.

Picture 2d is Nvidia's attempt at antialiasing it (SRAA), which sort of
reminds me of my own results... (two greys for one black pixel)

Picture 2e is apparently the reference result, which is, I guess, how it
should look... but I have my doubts about whether it is even correct or the
best way
(hence the related blending question below, which maybe is still unsolved by
Nvidia as well?!)

But let's assume picture 2e is the best for now.

Comparing the pictures clearly shows that SRAA comes closest to how it should
look, again assuming that picture 2e is the best one.

However, to me it seems picture 2d also shows a bug in Nvidia's SRAA
algorithm... which can further be seen in their code sample.

In picture 2d the vertical black line on the left doesn't seem correct to
me... it should also be grey (picture 2e serves as verification).

I shall give a little hint about the potential bug.

At most 4 subpixels contribute to the final pixel.

However, that's not what Nvidia's code is doing... it is actually summing up
the results from 9 subpixels, which would be wrong.

I shall give some hints from my solution without entirely revealing it. My
solution is simply to zero out the "weights" (as they call them) for
subpixels that do not contribute to the final pixel.

The slightly more difficult part is figuring out how to determine which
subpixels contribute to the final pixel and which do not. It took my
experienced magical brain about two days to come up with a nice, probably
highly performant solution. The first solution was a slow verification
solution, the second was the high-performance one. The two were compared and
were practically identical, with only very slight floating-point differences
on the order of something like 0.0000000000000xxxx, which can be neglected.
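
To make that concrete, here is a minimal sketch (my own toy C code, not
Nvidia's SRAA sample and not my full solution) of what "zero the
non-contributing weights" amounts to when the reconstruction filter is plain
bilinear: the output position falls inside exactly one cell of the subsample
grid, so only that cell's 4 corner subsamples ever get a non-zero weight, and
anything else in a wider (say 3x3) window must be weighted zero rather than
summed in.

#include <math.h>

/* Illustration only: bilinear reconstruction of one output pixel from a
 * regular grid of subsamples.  The output position falls inside exactly one
 * cell of the subsample grid, so only that cell's 4 corner subsamples get a
 * non-zero weight; any other subsample in a wider (say 3x3) window must have
 * its weight forced to zero, otherwise it is counted where it should not be. */
float reconstruct_bilinear(const float *samples, int stride,
                           float px, float py)  /* output pos in sample coords */
{
    int   ix = (int)floorf(px);
    int   iy = (int)floorf(py);
    float fx = px - (float)ix;      /* fractional position inside the cell */
    float fy = py - (float)iy;

    /* only these 4 weights are ever non-zero */
    float w00 = (1.0f - fx) * (1.0f - fy);
    float w10 = fx          * (1.0f - fy);
    float w01 = (1.0f - fx) * fy;
    float w11 = fx          * fy;

    return w00 * samples[ iy      * stride + ix    ]
         + w10 * samples[ iy      * stride + ix + 1]
         + w01 * samples[(iy + 1) * stride + ix    ]
         + w11 * samples[(iy + 1) * stride + ix + 1];
}

SRAA's real weights are also bilateral (the follow-up posts below mention the
bilateral stuff), so this only shows the purely geometric part.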

If Nvidia is interested in hearing my theories about the bug and how to
solve it, they can contact me at skybuck 2000 at hotmail dot com

I shall require either some big credits or some financial compensation just
for the kicks of it and as a token of appreciation ! =D

Though it's probably not that difficult for them to figure out...

It shall be most amusing to see if they can figure it out and come up with a
solution...

If not, my offer stands! =D

I shall also re-post this with a new subject line and include the Nvidia
newsgroup, because it might otherwise slip past their attention.

Heck... if I am doing "research for Nvidia", then I should get the same
compensation as their own researchers, correct?! =D

Bye,
Skybuck =D


"Skybuck Flying" wrote in message
.home.nl...
Hello,

Suppose the center of each screen pixel is at (0.5, 0.5).

Now suppose a bitmap/single pixel lies exactly on top of it.

The resulting color would be exactly the same and thus bright.

Now try to imagine the same bitmap pixel lying at the boundary edge of the
screen pixel.

So the bitmap/single pixel lies exactly between the screen pixels like so:



+---+---+
|   |   |
+---+---+
    ^
    Bitmap pixel lies between screen pixels.

The bitmap pixel falls on 50% of the left screen pixel and on 50% of the
right screen pixel.

The question is: how to color/blend these two screen pixels?

Should the formula simply be 50% of the bitmap pixel color for the left
screen pixel and 50% of the bitmap pixel color for the right screen pixel?
(and blend with 50% of the background pixels...)

If this is done, the pixel will appear to blink/flicker a bit... from bright
to less bright and back to bright again as it moves across the screen pixel
boundaries.

Is there maybe a better formula, so it will seem to blink/flicker less
(when moving across screen pixel boundaries)?

Perhaps some kind of logarithmic formula?
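
To make the question concrete, here is a one-dimensional toy sketch in C
(illustration only; the names and layout are my own) of the 50%/50% formula
being asked about: each of the two touched screen pixels gets the bitmap
color weighted by its covered area and keeps the background for the
remainder.

#include <math.h>

/* Illustration only: the 50%/50% split described above, in one dimension.
 * A 1-pixel-wide bitmap pixel centred at x (screen pixel i covers [i, i+1))
 * overlaps at most two screen pixels; each receives the bitmap colour in
 * proportion to the area covered and keeps its background for the rest.
 * When x sits exactly on a pixel boundary the split is 50/50; when x sits
 * on a pixel centre, that pixel gets 100% and its neighbour gets nothing. */
void splat_1d(float x, float src, float *screen, int width)
{
    int   left = (int)floorf(x - 0.5f);     /* leftmost screen pixel touched  */
    float f    = (x - 0.5f) - (float)left;  /* fraction spilling to the right */

    if (left >= 0 && left < width)
        screen[left]     = (1.0f - f) * src + f          * screen[left];
    if (left + 1 >= 0 && left + 1 < width)
        screen[left + 1] = f          * src + (1.0f - f) * screen[left + 1];
}

For what it's worth, part of the apparent brightness dip at the 50/50
position usually comes from doing this blend on gamma-encoded values;
blending in linear light and re-encoding afterwards is the commonly
suggested remedy, which is at least in the spirit of the "logarithmic
formula" idea.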

Bye,
Skybuck.







  #2   March 12th 11, 12:49 PM
Skybuck Flying

Hmm, this morning when I woke up I tried to reproduce the supposed Nvidia
bug in my simple little pixel test program, and now I am not sure anymore
whether it's truly a bug... could be... or maybe not...

When I try to reproduce the bug, the surrounding pixels at the edges start to
colorize... (because of the blending with the background and probably
out-of-range weights), which Nvidia's picture does not seem to do... so now I
am not sure if it's still a bug or not... it could be that their 3D and
bilateral stuff is somehow/kinda hiding the bug a little bit and still
leading to ok-ish results... (or perhaps their blending is less)

It could also be that there is no bug... difficult for me to say... trying
out my ideas on how to prove it might show whether it truly is/was a bug or
not.

Bye,
Skybuck.

"Skybuck Flying" wrote in message
.home.nl...
It seems nvidia recently invented something samiliar but for whole scenes
instead of one little pixel like mine

They call it "sraa" which stands for "sub-pixel reconstruction
antialiasing". I came across this term when I was
reading some Dice publications about frost engine 2 and Battlefield 3
which ofcourse interests me as a shooter fan ! =D

A google took me to this document/webpage:

http://anteru.net/projects/research/...-antialiasing/

http://research.nvidia.com/sites/def...ions/I3D11.pdf

Picture 2a is a two dimensional version of what I was talking about in my
original posting, see below, bitmap pixels lieing half-way the screen
pixels.

Picture 2d is nvidia's attempt at antialiasing it (sraa), which reminds me
of my results sort of... (two greys for one black pixel)

Picture 2e is apperently the reference result which is I guess how it
should look like... but I have doubts about that if it's even correct or
the best way
(thus the blending question/related which maybe is still unsolved by
nvidia as well ?! )

But let's assume picture 2e is best for now

Comparing the pictures clearly shows that sraa is closest to how it should
look like again assuming that picture 2e is the best one.

However to me it seems picture 2d also shows a bug in nvidia's sraa
algorithm... which can further be seen in their code sample.

In picture 2d the vertical black line to the left doesn't seem correct to
me... this should also be grey, this can be seen in picture 2e as
verification).

I shall give a little hint about the potential bug.

At most 4 subpixels contribute to the final pixel.

However that's not what nvidia's code is doing... it's actually summing up
the results from 9 subpixels which would be wrong.

I shall give some hints from my solution without entirely revealing it
My solution is to simply zero-fy the "weights" (as they call it) to zero
if they are not contributing to the final pixel.

The slightly more difficult part is to figure out a way how to determine
which subpixels are contributing and which are not contributing to the
final pixel. It took my experienced magical brain about two days to come
up with a nice probably high performant solution. The first solution was a
slow verification solution, the second solution was the high performant
solution. The solutions were compared and were practically identical, only
very slight floating point differences in the order of something like
0.0000000000000xxxx which can be neglected.

If nvidia is interested in hearing my theories about the bug and how to
solve it they can contact me at skybuck 2000 at hotmail dot com

I shall require either some big credits or some financial compensation
just for the kicks of it and as a token of appreciation ! =D

Though it's probably not that difficult for them to figure out...

It shall be most amuzzzing to see if they can figure it out and can come
up with a solution...

If not my offer stands ! =D

I shall also re-post this with a new subject line and include the nvidia
newsgroup because it might otherwise slip past their attention.

Heck... if I am doing "research for nvidia" then I should get the same
compensation as their own researchers correct ?! =D

Bye,
Skybuck =D


"Skybuck Flying" wrote in message
.home.nl...
Hello,

Suppose the center of each screen pixels is 0.5 and 0.5.

Now suppose a bitmap/single pixel lies exactly on top of it.

The resulting color would be exactly the same and thus bright.

Now try to imagine the same bitmap pixel lieing at the boundery edge of
the
screen pixel.

So the bitmap/single pixel lies exactly between the screen pixels like
so:



+---+---+
| | |
+---+---+

^
Bitmap pixel lies between screen pixels.

The bitmap pixel falls on 50% of the left screen pixel and on 50% of the
right screen pixel.

The question is how to color/blend these two screen pixels ?

Should the formula simply be 50% of bitmap pixel color for left screen
pixel
and 50% of bitmap pixel color for right screen pixel ?
(and blend with 50% of background pixels... )

If this is done then the pixel will start to appear to blink/flicker a
bit... from bright to less bright and back to bright again when it moves
across the screen pixel boundaries.

Is there maybe a better formula ? So it will seem to blink/flicker less
(when moving across screen pixel boundaries) ?

Perhaps some kind of logarithmic formula ?

Bye,
Skybuck.









  #3   March 12th 11, 01:10 PM
Skybuck Flying

One simple possible explanation that came to mind is that Nvidia's
floating-point values are always limited/clamped to the range 0.0 to 1.0, so
the weights could never go out of range... if that's the case, then there
probably is no bug...

Bye,
Skybuck.


  #4   March 12th 11, 01:20 PM
Skybuck Flying


"Skybuck Flying" wrote in message
b.home.nl...
One simple possible explanation which came to mind is nvidia's floating
points are always limited/clamped to a range of 0.0 to 1.0 therefore the
weights could never go out of range... if that's the case then there
probably is no bug...


Then again, that would probably still lead to wrong calculations in the
formulas... so that's probably not what's going on...
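
A toy example of that point (made-up numbers, mine, not Nvidia's code): even
if every individual weight is clamped to [0, 1], summing nine of them still
over-counts unless the five non-contributing weights are zeroed or the total
is renormalised.

#include <stdio.h>

/* Clamping each weight to [0, 1] keeps every individual weight "in range",
 * but nine clamped weights can still sum to well over 1, so the resolved
 * colour is still over-weighted unless the non-contributing weights are
 * zeroed (or the whole sum is renormalised).  Numbers are made up. */
static float clamp01(float w)
{
    return w < 0.0f ? 0.0f : (w > 1.0f ? 1.0f : w);
}

int main(void)
{
    float w[9] = { 0.25f, 0.25f, 0.25f, 0.25f,          /* 4 real contributors */
                   0.10f, 0.10f, 0.10f, 0.10f, 0.10f }; /* should be zeroed    */
    float sum = 0.0f;
    for (int i = 0; i < 9; ++i)
        sum += clamp01(w[i]);   /* clamping changes nothing for these values */
    printf("sum of clamped weights = %.2f (a proper resolve needs 1.00)\n", sum);
    return 0;                   /* prints 1.50 */
}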

It could also be a problem with the weights not being correctly calculated...
or being overly applied... or not being zeroed, or whatever.

The picture does seem a bit weird... what could be the explanation for the
vertical line that looks bad, huh?!

So from the looks of it, there does seem to be a bug somewhere! =D

Bye,
Skybuck.


 



