Ever wanted to play with bilateral filtering inside of Core Image? Well you've come to the right place.
Inside you'll find a basic toolkit for switching images and tinkering around with parameters:
It's almost certainly not the fastest/best implementation out there, but for small images (< 1000px square, give or take) it's capable of running in real time.
Inside the project is the class `NTJBilateralCIFilter`. It extends `CIFilter`, so feel free to use it in other projects like you would any other `CIFilter`. Or take a look at `NTJBilateralFilterRunner` in the demo app for some ideas on how to load content in. If you're manually including the code in your project, make sure to also bring `NTJBilateralCIFilter.cikernel` along with it.
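For context, a `.cikernel`-backed `CIFilter` subclass typically reads its kernel source out of the app bundle at runtime, which is why the file needs to ship with your target. Here's a minimal sketch of that pattern; the helper name and resource lookup details are illustrative, not necessarily the project's actual code:

```objc
#import <CoreImage/CoreImage.h>

// Sketch: load kernel source from a bundled .cikernel file.
// If the file isn't copied into the app bundle, there is no source
// to compile, which is why it must be added to your target.
static CIKernel *NTJLoadBilateralKernel(void)
{
    NSURL *url = [[NSBundle mainBundle] URLForResource:@"NTJBilateralCIFilter"
                                         withExtension:@"cikernel"];
    NSString *source = [NSString stringWithContentsOfURL:url
                                                encoding:NSUTF8StringEncoding
                                                   error:NULL];
    // kernelWithString: compiles Core Image kernel language source at runtime.
    return [CIKernel kernelWithString:source];
}
```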
The three parameters are:

- `inputImage`: A `CIImage` to filter
- `sigma_R`: The range of blur. Higher value = more blur. (Default value: 15)
- `sigma_S`: The spatial parameter. Higher value = more features getting flattened. (Default value: 0.2)
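Assuming the parameters above are exposed as KVC keys on the filter (the usual convention for `CIFilter` subclasses), applying it from Objective-C might look something like this sketch; the header name and image URL are placeholders:

```objc
#import <CoreImage/CoreImage.h>
#import "NTJBilateralCIFilter.h" // assumed header name

CIImage *input = [CIImage imageWithContentsOfURL:
                     [[NSBundle mainBundle] URLForResource:@"sample" withExtension:@"png"]];

// Instantiate the filter directly, since it's an ordinary CIFilter subclass.
NTJBilateralCIFilter *filter = [[NTJBilateralCIFilter alloc] init];
[filter setValue:input forKey:@"inputImage"];
[filter setValue:@20.0 forKey:@"sigma_R"]; // more blur than the default of 15
[filter setValue:@0.3  forKey:@"sigma_S"]; // flatten features a bit more than the default 0.2

// Render the result; a CIContext does the actual work.
CIImage *output = [filter valueForKey:@"outputImage"];
CIContext *context = [CIContext contextWithOptions:nil];
CGImageRef result = [context createCGImage:output fromRect:[output extent]];
// ... use result, then CGImageRelease(result);
```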
- Make it work properly on iOS. The image gets darker instead of blurring; maybe the shader language differs somehow, or Core Image works differently there? Not sure.