Just converted a project to Swift 3 and can't figure out the following error:

"Argument labels '(_:)' do not match any available overloads"

public func currencyString(_ decimals: Int) -> String {
    let formatter = NumberFormatter()
    formatter.numberStyle = .currency
    formatter.maximumFractionDigits = decimals
    return formatter.string(from: NSNumber(product.introPrice))! // error here
}
You can do it this way:

public func currencyString(_ decimals: Int) -> String {
    let formatter = NumberFormatter()
    formatter.numberStyle = .currency
    formatter.maximumFractionDigits = decimals
    // the NSNumber should wrap the price being formatted, not the decimals count
    return formatter.string(from: NSNumber(value: product.introPrice))!
}
Because if you check NSNumber, you will find predefined initializers like:
public init(value: Int)
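Foundation declares one such labeled overload per numeric type, for example:

public init(value: Double)
public init(value: Float)
public init(value: Bool)

// So these calls all resolve correctly:
let n = NSNumber(value: 42)      // init(value: Int)
let p = NSNumber(value: 7.99)    // init(value: Double)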
public func currencyString(_ decimals: Int) -> String {
    let numberFormatter = NumberFormatter()
    numberFormatter.numberStyle = .currency
    numberFormatter.formatterBehavior = .default
    let priceString = numberFormatter.string(from: NSNumber(value: product.introPrice))
    return priceString!
}
Difference between the old and the new syntax:
// Old code
formatter.string(from: NSNumber(product.introPrice))!

// Swift 3.0.1
formatter.string(from: NSNumber(value: product.introPrice))!
While the accepted answer shows how the NSNumber initializer should be called correctly, it is good to know that there is no reason to convert Swift numbers to NSNumber at all if we use the string(for:) method instead of string(from:).
return formatter.string(for: self)!
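A minimal sketch of that approach, assuming the function lives in an extension on Double (the original post does not show the enclosing type):

extension Double {
    public func currencyString(_ decimals: Int) -> String {
        let formatter = NumberFormatter()
        formatter.numberStyle = .currency
        formatter.maximumFractionDigits = decimals
        // string(for:) accepts Any?, so self needs no NSNumber wrapping
        return formatter.string(for: self)!
    }
}

// Usage:
let label = 7.99.currencyString(2)   // e.g. "$7.99" in a US locale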
To clarify the confusion as to what the error is: NSNumber(x) is an attempt to call the NSNumber.init(value:) initializer to instantiate an NSNumber object, and because the value: label is missing, the compiler reports:

"Argument labels '(_:)' do not match any available overloads"
The code produces the error because NSNumber is not a primitive type; rather, it is a class with members. NSNumber(...) instantiates an object whose value holds the result of (1.0 / 1.29). This is not a type conversion or cast as in C/C++, where you are simply telling the compiler to reinterpret a value:
float y = 1.3;
int x = int( y );
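For comparison, the equivalent conversion in Swift also goes through an initializer, but one that is declared with an unlabeled argument:

let y: Float = 1.3
let x = Int(y)   // Int.init(_:) is unlabeled; x == 1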
NSNumber is not a primitive type like int, float, or char.
The error comes into play because there are several overloads of NSNumber.init(value:). Swift requires that you say explicitly that you want the value of the NSNumber to contain x.
// These fail to compile with the error above:
let localRate = NSNumber(1.0 / 1.29)
var y = NSNumber(0)
var b = NSNumber(false)

// These are correct:
let localRate = NSNumber(value: 1.0 / 1.29)
var y = NSNumber(value: 0)
var b = NSNumber(value: false)
The confusion might come into play because this works:

let w = String("4")

String declares unlabeled initializers such as init(_:), so no argument label is needed, while NSNumber's value initializers all require the value: label.
Perhaps this is because NSNumber is a legacy Objective-C class that Foundation bridges into Swift with explicitly labeled initializers.
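As a side note (standard Foundation behavior, not mentioned in the original post): NSNumber conforms to the literal-expressible protocols, so when the contextual type is known you can skip the initializer entirely:

let y: NSNumber = 0        // ExpressibleByIntegerLiteral
let b: NSNumber = false    // ExpressibleByBooleanLiteral
let r: NSNumber = 1.29     // ExpressibleByFloatLiteral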