I have searched all over for a kernel that performs adaptive thresholding on iOS, but unfortunately I don't understand the kernel language or the logic behind it.
How does this look? I've used Core Image's CIBoxBlur (although the dedicated convolution filters may be faster) and passed its output into my existing threshold filter. The blurred image gives each pixel the average luminance of its neighbourhood, and that local average becomes the per-pixel threshold, which is what makes the thresholding adaptive.
class AdaptiveThresholdFilter: CIFilter
{
    var inputImage: CIImage?

    // Compares each pixel's luminance against the luminance of the matching
    // pixel in a blurred copy of the image, i.e. against its local average.
    var thresholdKernel = CIColorKernel(string:
        "kernel vec4 thresholdFilter(__sample image, __sample threshold)" +
        "{" +
        // Rec. 709 coefficients convert RGB to perceived brightness (luma).
        "    float imageLuma = dot(image.rgb, vec3(0.2126, 0.7152, 0.0722));" +
        "    float thresholdLuma = dot(threshold.rgb, vec3(0.2126, 0.7152, 0.0722));" +
        // step() returns 1.0 (white) where the pixel is at least as bright as
        // its local average, and 0.0 (black) where it is darker, so ink goes
        // black and paper goes white.
        "    return vec4(vec3(step(thresholdLuma, imageLuma)), 1.0);" +
        "}"
    )

    override var outputImage: CIImage!
    {
        guard let inputImage = inputImage,
            thresholdKernel = thresholdKernel else
        {
            return nil
        }

        // The box blur supplies each pixel's local average for the kernel
        // to threshold against.
        let blurred = inputImage.imageByApplyingFilter("CIBoxBlur",
            withInputParameters: [kCIInputRadiusKey: 9])

        let extent = inputImage.extent
        let arguments = [inputImage, blurred]

        return thresholdKernel.applyWithExtent(extent, arguments: arguments)
    }
}
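A note on how the kernel works: step(edge, x) returns 0.0 when x is below edge and 1.0 otherwise, so each pixel goes white when it is at least as bright as its local average and black when it is darker. On noisy or unevenly lit pages it can help to demand that a pixel be darker than its neighbourhood by a small margin before it turns black. That's a one-constant change to the kernel; here's a sketch (the 0.05 bias is my own illustrative value to tune per image, not part of the filter above):

// Variant of the threshold kernel with a bias: a pixel must be more than
// 0.05 darker than its local average before it is forced to black.
// The 0.05 value is an assumption, chosen only for illustration.
let biasedKernel = CIColorKernel(string:
    "kernel vec4 biasedThresholdFilter(__sample image, __sample threshold)" +
    "{" +
    "    float imageLuma = dot(image.rgb, vec3(0.2126, 0.7152, 0.0722));" +
    "    float thresholdLuma = dot(threshold.rgb, vec3(0.2126, 0.7152, 0.0722));" +
    "    return vec4(vec3(step(thresholdLuma - 0.05, imageLuma)), 1.0);" +
    "}"
)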
I found this image of a shaded page, and with this code:
let page = CIImage(image: UIImage(named: "son1.gif")!)
let filter = AdaptiveThresholdFilter()
filter.inputImage = page
let final = filter.outputImage
I got this result:
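One caveat in case it trips anyone up: outputImage is still only a recipe at this point; Core Image doesn't render anything until you draw it. To get an actual bitmap you render through a CIContext. A minimal sketch using the same Swift 2-era APIs as above (and assuming final is non-nil):

// Render the lazy CIImage into a CGImage, then wrap it for UIKit display.
let context = CIContext(options: nil)
let cgImage = context.createCGImage(final, fromRect: final.extent)
let renderedImage = UIImage(CGImage: cgImage)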
Cheers!
Simon