I need to create a Base64 string representation of an NSImage Cocoa object. What's the best way of handling this? Apple's documentation seems to be a little short on the subject.
Swift 3
extension NSImage {
    var base64String: String? {
        guard let rep = NSBitmapImageRep(
            bitmapDataPlanes: nil,
            pixelsWide: Int(size.width),
            pixelsHigh: Int(size.height),
            bitsPerSample: 8,
            samplesPerPixel: 4,
            hasAlpha: true,
            isPlanar: false,
            colorSpaceName: NSDeviceRGBColorSpace,
            bytesPerRow: 0,
            bitsPerPixel: 0
        ) else {
            print("Couldn't create bitmap representation")
            return nil
        }

        NSGraphicsContext.saveGraphicsState()
        NSGraphicsContext.setCurrent(NSGraphicsContext(bitmapImageRep: rep))
        draw(at: NSZeroPoint, from: NSZeroRect, operation: .sourceOver, fraction: 1.0)
        NSGraphicsContext.restoreGraphicsState()

        guard let data = rep.representation(using: NSBitmapImageFileType.PNG, properties: [NSImageCompressionFactor: 1.0]) else {
            print("Couldn't create PNG")
            return nil
        }

        return data.base64EncodedString(options: [])
    }
}
An NSImage is a very abstract object. It doesn't really care whether it holds a raster image or a vector image; a single NSImage object can have raster, vector, and even programmatic representations all at once. It's that general.
Before you can generate Base64 data, you must decide what you want to encode.
The first step is to decide whether you want to encode raster data or vector data. The former is quite easy, and I'd guess it's probably what you meant. However, an NSImage could have come from a vector source, such as a PDF document.
If you know you created the image from a raster source, you can just encode that source data.
If it came from a vector source, you can still just encode that if you know the application on the decoding end will be able to handle it (e.g., if it's another Cocoa or Cocoa Touch app). On the other hand, if the app on the decoding end may be unable to handle vector data, then you should avoid this tactic.
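For example, if you still have the image's original source file (raster or vector), you can just Base64-encode that file's bytes directly. A minimal sketch, with a made-up file path:

import Foundation

// Hypothetical example: encode the original source file directly,
// without round-tripping through NSImage at all.
let sourceURL = URL(fileURLWithPath: "/path/to/image.png")
if let sourceData = try? Data(contentsOf: sourceURL) {
    let base64 = sourceData.base64EncodedString()
    print(base64.prefix(40)) // first characters of the encoded string
}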
The one solution that works in all cases is to use NSBitmapImageRep to create a raster capture of the image: lock focus on the image, create an NSBitmapImageRep from the focused area (initWithFocusedViewRect:), then unlock focus. Then use representationUsingType:properties: to generate PNG (or whatever format is appropriate) data for the image, and finally Base64-encode that data.
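A minimal sketch of those steps in Swift, assuming an SDK where NSImage.lockFocus() and NSBitmapImageRep(focusedViewRect:) are available (the method name here is made up for illustration):

import AppKit

extension NSImage {
    // Raster capture of whatever the image holds, then PNG, then Base64.
    func base64PNGByCapturing() -> String? {
        lockFocus()
        let rep = NSBitmapImageRep(focusedViewRect: NSRect(origin: .zero, size: size))
        unlockFocus()
        guard let data = rep?.representation(using: .png, properties: [:]) else { return nil }
        return data.base64EncodedString()
    }
}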
I've got a bunch of code, including Base64 parsing based on the implementation in OmniFoundation, in my toolkit on GitHub. In particular, look at Extensions/NSData+Base64.h.
Swift 4
extension NSImage {
    var base64String: String? {
        // Create an empty RGBA bitmap matching the image's size (in points).
        guard let rep = NSBitmapImageRep(
            bitmapDataPlanes: nil,
            pixelsWide: Int(size.width),
            pixelsHigh: Int(size.height),
            bitsPerSample: 8,
            samplesPerPixel: 4,
            hasAlpha: true,
            isPlanar: false,
            colorSpaceName: .calibratedRGB,
            bytesPerRow: 0,
            bitsPerPixel: 0
        ) else {
            print("Couldn't create bitmap representation")
            return nil
        }

        // Draw the image into the bitmap through a temporary graphics context.
        NSGraphicsContext.saveGraphicsState()
        NSGraphicsContext.current = NSGraphicsContext(bitmapImageRep: rep)
        draw(at: NSZeroPoint, from: NSZeroRect, operation: .sourceOver, fraction: 1.0)
        NSGraphicsContext.restoreGraphicsState()

        // Encode the bitmap as PNG, then Base64-encode the PNG data.
        guard let data = rep.representation(using: NSBitmapImageRep.FileType.png, properties: [NSBitmapImageRep.PropertyKey.compressionFactor: 1.0]) else {
            print("Couldn't create PNG")
            return nil
        }

        // With prefix (data URI):
        // return "data:image/png;base64,\(data.base64EncodedString(options: []))"
        // Without prefix:
        return data.base64EncodedString(options: [])
    }
}
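Usage might look like this (the file path is just a placeholder):

// Hypothetical usage of the extension above.
if let image = NSImage(contentsOfFile: "/path/to/image.png"),
   let base64 = image.base64String {
    print(base64)
}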
Apple doesn't provide any particular help here, so you do have to tackle the complexity on your own, one way or another.
Fortunately, there are some resources available to make this easier. The first approach is to literally do the encoding and decoding yourself. Google Toolbox for Mac has a good example of this approach, and you might be able to just use this source file as-is:
http://code.google.com/p/google-toolbox-for-mac/source/browse/trunk/Foundation/GTMBase64.m
If you're building only for the Mac, where the OpenSSL libraries are available, then you could take advantage of some functions in those libraries to do the encoding and decoding:
http://www.dribin.org/dave/blog/archives/2006/03/12/base64_cocoa/
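Note that on current SDKs (OS X 10.9 / iOS 7 and later) Foundation itself provides Base64 encoding and decoding on NSData/Data, so a third-party implementation is no longer required for the encoding step. A minimal sketch:

import Foundation

// Foundation's built-in Base64 support, shown with plain Data for illustration.
let raw = Data("hello".utf8)
let encoded = raw.base64EncodedString()     // "aGVsbG8="
let decoded = Data(base64Encoded: encoded)  // restores the original bytes
print(encoded, decoded == raw)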
Here is a Swift 2 extension to convert NSImage into a base64 string:
private extension NSImage {
    var base64String: String? {
        guard let rep = NSBitmapImageRep(
            bitmapDataPlanes: nil,
            pixelsWide: Int(size.width),
            pixelsHigh: Int(size.height),
            bitsPerSample: 8,
            samplesPerPixel: 4,
            hasAlpha: true,
            isPlanar: false,
            colorSpaceName: NSDeviceRGBColorSpace,
            bytesPerRow: 0,
            bitsPerPixel: 0
        ) else {
            print("Couldn't create bitmap representation")
            return nil
        }

        NSGraphicsContext.saveGraphicsState()
        NSGraphicsContext.setCurrentContext(NSGraphicsContext(bitmapImageRep: rep))
        drawAtPoint(NSZeroPoint, fromRect: NSZeroRect, operation: .CompositeSourceOver, fraction: 1.0)
        NSGraphicsContext.restoreGraphicsState()

        guard let data = rep.representationUsingType(NSBitmapImageFileType.NSPNGFileType, properties: [NSImageCompressionFactor: 1.0]) else {
            print("Couldn't create PNG")
            return nil
        }

        return data.base64EncodedStringWithOptions([])
    }
}