Gaussian Blur - standard deviation, radius and kernel size

[愿得一人] 2020-12-29 09:04

I've implemented a Gaussian blur fragment shader in GLSL. I understand the main concepts behind all of it: convolution, separation of x and y using linearity, multiple passes

1 Answer
  • 2020-12-29 09:19

    What's the relationship between sigma and radius?

    I think the terms are somewhat interchangeable depending on your implementation. Most GLSL implementations of Gaussian blur use the sigma value to control the amount of blur, while the radius is the 'blur radius': how far out from the centre pixel the kernel samples. Both are measured in pixel space.
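    Concretely, sigma and radius are linked by how quickly the Gaussian's tails fall off. The sketch below (Python for illustration; the function name is mine, not from the question) computes what fraction of the Gaussian's total weight falls within a given radius: by three sigmas you have captured roughly 99.7%, so samples beyond that radius contribute almost nothing.

    ```python
    import math

    def mass_within_radius(sigma: float, radius: float) -> float:
        """Fraction of a 1-D Gaussian's total weight lying within +/- radius."""
        # For N(0, sigma^2) this is erf(radius / (sigma * sqrt(2))).
        return math.erf(radius / (sigma * math.sqrt(2.0)))

    # ~68% at 1 sigma, ~95% at 2 sigma, ~99.7% at 3 sigma:
    for k in (1, 2, 3):
        print(f"radius = {k}*sigma -> {mass_within_radius(1.0, float(k)):.4f}")
    ```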

    How do I choose sigma?

    This will define how much blur you want, which corresponds to the size of the kernel to be used in the convolution. Bigger values will result in more blurring.

    The NVidia implementation uses a kernel size of int(sigma*3).

    For performance reasons you may experiment with a smaller kernel size combined with higher values of sigma. These are free parameters to experiment with: the kernel size defines how many pixels to sample, and sigma defines how much weight each sampled pixel contributes to the result.
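    To make the int(sigma*3) rule concrete, here is a small sketch (Python for illustration; the helper names are mine). The per-side radius is truncated at three sigmas, and the full kernel width is the centre tap plus that many taps on each side:

    ```python
    def kernel_radius(sigma: float, truncate: float = 3.0) -> int:
        # Truncate the kernel where the tail weight becomes negligible;
        # truncate=3.0 mirrors the int(sigma*3) rule mentioned above.
        return int(truncate * sigma)

    def kernel_size(sigma: float, truncate: float = 3.0) -> int:
        # Full width: the centre tap plus `radius` taps on each side.
        return 2 * kernel_radius(sigma, truncate) + 1

    print(kernel_size(2.0))  # 13 taps for sigma = 2
    ```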

    What's a good size for the kernel, and how does it relate to sigma?

    Based on the sigma value you will want to choose a corresponding kernel size. The kernel size determines how many pixels to sample during the convolution, and sigma determines how strongly each of those samples is weighted.

    You may want to post some code for a more detailed explanation. NVidia's GPU Gems has a pretty good chapter on how to build a Gaussian kernel; look at Example 40-1.
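    For reference, the weights themselves are just the Gaussian sampled at integer offsets and normalized. A minimal sketch (Python rather than GLSL; in a shader these weights would typically be precomputed and passed in as uniforms):

    ```python
    import math

    def gaussian_kernel_1d(sigma: float, radius: int) -> list[float]:
        """Normalized 1-D Gaussian weights for one pass of a separable blur."""
        weights = [math.exp(-(x * x) / (2.0 * sigma * sigma))
                   for x in range(-radius, radius + 1)]
        total = sum(weights)
        # Normalize so the weights sum to 1 and the blur preserves brightness.
        return [w / total for w in weights]

    kernel = gaussian_kernel_1d(sigma=1.0, radius=3)
    print([round(w, 4) for w in kernel])
    ```

    Because the 2-D Gaussian is separable, the same 1-D kernel is applied once horizontally and once vertically, which is what makes the two-pass approach from the question equivalent to a full 2-D convolution.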
