In Objective-C, we use this code to set RGB color codes for views:
#define UIColorFromRGB(rgbValue) \
    [UIColor colorWithRed:((float)((rgbValue & 0xFF0000) >> 16))/255.0 \
                    green:((float)((rgbValue & 0x00FF00) >> 8))/255.0 \
                     blue:((float)(rgbValue & 0x0000FF))/255.0 \
                    alpha:1.0]
You cannot use a complex macro like #define UIColorFromRGB(rgbValue) in Swift. The replacement for a simple macro in Swift is a global constant, for example:
let FADE_ANIMATION_DURATION = 0.35
Complex macros that accept parameters are still not supported by Swift; you can use functions instead.
Complex macros are used in C and Objective-C but have no counterpart in Swift. Complex macros are macros that do not define constants, including parenthesized, function-like macros. You use complex macros in C and Objective-C to avoid type-checking constraints or to avoid retyping large amounts of boilerplate code. However, macros can make debugging and refactoring difficult. In Swift, you can use functions and generics to achieve the same results without any compromises. Therefore, the complex macros that are in C and Objective-C source files are not made available to your Swift code.
Excerpt from Using Swift with Cocoa and Objective-C
Check @Nate Cook's answer for the Swift version of that function, to be used here.
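As a quick illustration of the excerpt above, here is a hedged sketch (the SQUARE macro and square(_:) function are hypothetical examples, not from the question, and the Numeric constraint needs Swift 4 or later) of how a function-like macro maps onto a type-checked Swift function, with generics covering the "works for any type" flexibility that macros used to give:
// Objective-C: a function-like macro, with no type checking
// #define SQUARE(x) ((x) * (x))

// Swift: a generic function is type-checked and works for any Numeric type
func square<T: Numeric>(_ x: T) -> T {
    return x * x
}

let nine = square(3)      // Int
let area = square(2.5)    // Double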
You can use this:
// The color RGB #85CC4B
let newColor = UIColor(red: CGFloat(0x85) / 255,
                       green: CGFloat(0xCC) / 255,
                       blue: CGFloat(0x4B) / 255,
                       alpha: 1.0)
The simplest way to add a color programmatically is by using ColorLiteral.
Just type ColorLiteral as shown in the example, and Xcode will prompt you with a whole list of colors to choose from. The advantage of doing it this way is less code, and you can add HEX or RGB values. You will also get the colors recently used in the storyboard.
Example: self.view.backgroundColor = ColorLiteral
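Behind the scenes, a color literal is written in source as #colorLiteral(red:green:blue:alpha:). A minimal sketch, using the #85CC4B color from the answer above (the component values are approximate):
self.view.backgroundColor = #colorLiteral(red: 0.5215686275, green: 0.8, blue: 0.2941176471, alpha: 1.0)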
import Cocoa

class ViewController: NSViewController {
    override func viewDidLoad() {
        super.viewDidLoad()
        // Give the view a backing layer so its background color can be set
        self.view.wantsLayer = true
        self.view.layer?.backgroundColor = NSColor(hexString: "#4d9b48").cgColor
    }
}
extension NSColor {
    convenience init(hexString: String, alpha: CGFloat = 1.0) {
        let hexString: String = hexString.trimmingCharacters(in: CharacterSet.whitespacesAndNewlines)
        let scanner = Scanner(string: hexString)
        if hexString.hasPrefix("#") {
            scanner.scanLocation = 1   // skip the leading "#"
        }
        var color: UInt32 = 0
        scanner.scanHexInt32(&color)
        // Split the 0xRRGGBB value into its three 8-bit components
        let mask = 0x000000FF
        let r = Int(color >> 16) & mask
        let g = Int(color >> 8) & mask
        let b = Int(color) & mask
        let red = CGFloat(r) / 255.0
        let green = CGFloat(g) / 255.0
        let blue = CGFloat(b) / 255.0
        self.init(red: red, green: green, blue: blue, alpha: alpha)
    }

    func toHexString() -> String {
        var r: CGFloat = 0
        var g: CGFloat = 0
        var b: CGFloat = 0
        var a: CGFloat = 0
        getRed(&r, green: &g, blue: &b, alpha: &a)
        // Pack the components back into a single 0xRRGGBB value
        let rgb: Int = (Int)(r * 255) << 16 | (Int)(g * 255) << 8 | (Int)(b * 255) << 0
        return String(format: "#%06x", rgb)
    }
}
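A short usage sketch for the extension above (brandGreen is just an illustrative name; the round-tripped string can differ by one unit per channel because of floating-point rounding):
let brandGreen = NSColor(hexString: "#4d9b48")
let hex = brandGreen.toHexString()   // "#4d9b48", give or take rounding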
This worked for me in Swift. Try this:
bottomBorder.borderColor = UIColor(red: 255.0/255.0, green: 215.0/255.0, blue: 60.0/255.0, alpha: 1.0).CGColor
Here's a Swift version of that function (for getting a UIColor representation of a UInt value):
func UIColorFromRGB(rgbValue: UInt) -> UIColor {
    return UIColor(
        red: CGFloat((rgbValue & 0xFF0000) >> 16) / 255.0,
        green: CGFloat((rgbValue & 0x00FF00) >> 8) / 255.0,
        blue: CGFloat(rgbValue & 0x0000FF) / 255.0,
        alpha: CGFloat(1.0)
    )
}
view.backgroundColor = UIColorFromRGB(0x209624)
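Note that this is Swift 2 call syntax. From Swift 3 onwards the call site needs the argument label, UIColorFromRGB(rgbValue: 0x209624), unless you declare the parameter with an underscore: func UIColorFromRGB(_ rgbValue: UInt) -> UIColor.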