Matt Rajca

Generating Random Psychedelic Art with Core Image

December 31, 2013

On his wonderful Math ∩ Programming blog, Jeremy Kun published an article on algorithmically generating random “psychedelic” art. By combining common mathematical functions, we can generate images such as the one pictured below (they even make great iOS 7 wallpapers)!

The implementation referenced in Jeremy’s blog post uses Python, so I set out to implement some of the ideas discussed as a native OS X app, PsychedelicArt, using Core Image to generate the images on the GPU. While I won’t rehash the original article or walk through a bunch of Cocoa boilerplate code, the Core Image logic is noteworthy. PsychedelicArt is available on GitHub if you want to follow along.

The Core Image framework on OS X (unlike iOS) allows developers to define custom kernel functions that process image pixels in parallel on the GPU. These kernels are very similar to OpenCL kernels, which, as of OS X 10.9 “Mavericks”, power Core Image on supported hardware. By leveraging a technology like Core Image, we can tap into the computing power of modern GPUs.

Sure enough, images in PsychedelicArt are generated with a custom Core Image filter. Unlike most Core Image filters, our drawing filter – a generator – does not process an input image; rather, it takes three mathematical functions as input. These functions determine the red, green, and blue components of each pixel. Each function takes normalized (x, y) coordinates in [-1, 1] and returns a value in [-1, 1], which is ultimately mapped to a color component in [0, 1]. Because the input functions can be composed arbitrarily, the variety of images we can produce is limitless.

The interface of the filter is trivial, as expected:

@interface DrawingFilter : CIFilter

- (instancetype)initWithRedFunction:(MathFunction *)rf
					  greenFunction:(MathFunction *)gf
					   blueFunction:(MathFunction *)bf
							   size:(CGSize)size;

- (CIImage *)outputImage;

@end


The implementation compiles a Core Image kernel dynamically based on the three input functions given. These are instances of the seven MathFunction subclasses included in the project. MathFunction itself is modeled as an abstract syntax tree (with its arguments defined recursively):

@interface MathFunction : NSObject

- (CGFloat)evaluateWithX:(CGFloat)x y:(CGFloat)y;

- (NSString *)stringRepresentation;

@end


@interface MathFunctionSinPi : MathFunction

@property (nonatomic, strong) MathFunction *argument;

@end

String representations of the input functions are baked into the underlying kernel; the kernel is then compiled. Conveniently enough, CIKernel sports a string-based constructor. The full implementation of the filter is shown below.

@implementation DrawingFilter {
	CIKernel *_kernel;
	CGSize _size;
}

static NSString * const kKernelFormat =
@"kernel vec4 drawRfGfBf(sampler src) {"\
@"    float pi = 3.14159265;"\
@"    vec2 coord = samplerCoord(src);"\
@"    vec2 size = samplerSize(src);"\
@"    float hw = size.x / 2.0;"\
@"    float hh = size.y / 2.0;"\
@"    float x = (coord.x / hw) - 1.0;"\
@"    float y = (coord.y / hh) - 1.0;"\
@"    float r = %@;"\
@"    float rt = (r + 1.0) / 2.0;"\
@"    float g = %@;"\
@"    float gt = (g + 1.0) / 2.0;"\
@"    float b = %@;"\
@"    float bt = (b + 1.0) / 2.0;"\
@"    return vec4(rt, gt, bt, 1.0);"\
@"}";

- (instancetype)initWithRedFunction:(MathFunction *)rf greenFunction:(MathFunction *)gf
					   blueFunction:(MathFunction *)bf size:(CGSize)size {

	NSParameterAssert(!CGSizeEqualToSize(size, CGSizeZero));

	CIKernel *kernel = [CIKernel kernelsWithString:[NSString stringWithFormat:kKernelFormat,
													[rf stringRepresentation],
													[gf stringRepresentation],
													[bf stringRepresentation]]][0];
	if (!kernel) {
		return nil;
	}

	self = [super init];
	if (self) {
		_kernel = kernel;
		_size = size;
	}
	return self;
}

- (CIImage *)outputImage {
	CIColor *black = [CIColor colorWithRed:0 green:0 blue:0];
	CGRect extentRect = CGRectMake(0, 0, _size.width, _size.height);
	CIImage *inputImage = [CIImage imageWithColor:black];
	inputImage = [inputImage imageByCroppingToRect:extentRect];

	CISampler *sampler = [CISampler samplerWithImage:inputImage];
	NSArray *outputExtent = @[ @0, @0,
							   @([inputImage extent].size.width),
							   @([inputImage extent].size.height) ];

	return [self apply:_kernel, sampler, kCIApplyOptionExtent, outputExtent, nil];
}

@end


Now, given some input functions, all we need to do is use our drawing filter to output a CIImage, and draw the image in a custom view. This is fairly standard Core Image code:

DrawingFilter *dw = [[DrawingFilter alloc] initWithRedFunction:rf
												 greenFunction:gf
												  blueFunction:bf
														  size:CGSizeMake(W, W)];
CIImage *outputImage = [dw outputImage];


@implementation RenderView

- (void)setImage:(CIImage *)image {
	if (_image != image) {
		_image = image;
		[self setNeedsDisplay:YES];
	}
}

- (void)drawRect:(NSRect)dirtyRect {
	if (!_image)
		return;

	CIContext *ctx = [[NSGraphicsContext currentContext] CIContext];
	[ctx drawImage:_image inRect:[self bounds] fromRect:[_image extent]];
}

@end


That’s all there is to it – algorithmically generated, “psychedelic” art on the GPU with Core Image.

Have fun!

Source code on GitHub

Tip: playing with Core Image for the first time? Quartz Composer’s Core Image patch is a phenomenal debugging tool.