Gaussian Blur - standard deviation, radius and kernel size

Aaron Hagan

What's the relationship between sigma and radius?

I think the terms here are used somewhat interchangeably, depending on the implementation. Most GLSL implementations of Gaussian blur use the sigma value (the standard deviation of the Gaussian) to define the amount of blur. The radius can be considered the 'blur radius': how far out from the center pixel the kernel extends. Both are measured in pixel space.
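To make the relationship concrete, here is a minimal Python sketch (Python rather than GLSL, and the 3-sigma cutoff is a common convention, not something the answer above prescribes). It shows how quickly the Gaussian weight falls off with distance, which is why the blur radius is usually taken as a small multiple of sigma:

```python
import math

def gaussian(x, sigma):
    """Unnormalized 1D Gaussian weight at distance x (in pixels)."""
    return math.exp(-(x * x) / (2.0 * sigma * sigma))

sigma = 4.0
for r in (sigma, 2 * sigma, 3 * sigma):
    # Weight relative to the center pixel: a pixel beyond ~3*sigma
    # contributes almost nothing, so radius = ceil(3 * sigma) is a
    # common choice.
    print(f"x = {r:4.1f} px  ->  relative weight = {gaussian(r, sigma):.4f}")
```

Running this prints roughly 0.61 at one sigma, 0.14 at two, and 0.01 at three, so truncating the kernel at three sigma loses very little.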

How do I choose sigma?

Sigma defines how much blur you want, and the size of the kernel used in the convolution follows from it. Bigger values result in more blurring.

The NVidia implementation uses a kernel size of int(sigma*3).

You may experiment with a smaller kernel size and higher values of sigma for performance reasons. These are free parameters to experiment with: the kernel size defines how many pixels are sampled, and sigma defines how strongly each sampled pixel is weighted in the result.
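As a rough sketch of how these two parameters fit together, the following builds a normalized 1D kernel (the function name is mine, and the default radius of int(sigma * 3) mirrors the NVidia choice mentioned above; treat this as illustrative, not the canonical implementation):

```python
import math

def make_gaussian_kernel(sigma, radius=None):
    """Build a normalized 1D Gaussian kernel.

    radius defaults to int(sigma * 3), following the convention
    mentioned above; the full kernel is 2 * radius + 1 taps wide.
    """
    if radius is None:
        radius = int(sigma * 3)
    weights = [math.exp(-(x * x) / (2.0 * sigma * sigma))
               for x in range(-radius, radius + 1)]
    total = sum(weights)
    # Normalize so the weights sum to 1 and the blur preserves brightness.
    return [w / total for w in weights]

kernel = make_gaussian_kernel(sigma=2.0)
print(len(kernel), sum(kernel))  # 13 taps, weights sum to ~1.0
```

Passing an explicit, smaller radius is exactly the performance trade-off described above: fewer taps per pixel at the cost of truncating more of the Gaussian.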

What's a good size for a kernel, and how does it relate to sigma?

Based on the sigma value, you will want to choose a corresponding kernel size. The kernel size determines how many pixels to sample during the convolution, and sigma defines how much to weight each of them.
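One way to judge whether a kernel size is "good enough" for a given sigma is to check how much of the continuous Gaussian's area the truncated kernel captures (a hedged sketch; the helper name is mine, and erf gives the exact mass inside ±radius):

```python
import math

def gaussian_coverage(sigma, radius):
    """Fraction of the continuous Gaussian's area inside [-radius, radius]."""
    return math.erf(radius / (sigma * math.sqrt(2.0)))

sigma = 2.0
for radius in (int(sigma), int(2 * sigma), int(3 * sigma)):
    size = 2 * radius + 1
    print(f"kernel size {size:2d} (radius {radius}) captures "
          f"{gaussian_coverage(sigma, radius):.1%} of the Gaussian")
```

For sigma = 2.0 this reports about 68% coverage at radius sigma, 95% at two sigma, and 99.7% at three sigma, which is why kernel sizes around 2 * 3 * sigma + 1 are a common default.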

You may want to post some code for a more detailed explanation. NVidia has a pretty good chapter (GPU Gems 3, Chapter 40) on how to build a Gaussian kernel; look at Example 40-1.
