This is a question about programming style in Swift, specifically Int vs UInt.
The Swift Programming Language Guide advises programmers to use the generic signed integer type Int even when variables are known to be non-negative. From the guide:
Use UInt only when you specifically need an unsigned integer type with the same size as the platform’s native word size. If this is not the case, Int is preferred, even when the values to be stored are known to be non-negative. A consistent use of Int for integer values aids code interoperability, avoids the need to convert between different number types, and matches integer type inference, as described in Type Safety and Type Inference.
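To make the guide's interoperability point concrete, here is a small sketch of my own (not from the guide): integer literals and standard-library counts infer Int, so a UInt variable forces an explicit conversion at every boundary.

let scores = [90, 85, 72]
let total = scores.count                // Array.count returns Int
let quota: UInt = 5

// let shortfall = quota - total        // compile error: cannot mix UInt and Int
let shortfall = Int(quota) - total      // an explicit conversion is required
print(shortfall)                        // prints 2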
However, UInt, like Int, is a 32-bit type on 32-bit architectures and a 64-bit type on 64-bit architectures, so there is no performance benefit to using Int over UInt.
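A quick sanity check of that word-size claim (my own sketch, standard library only):

print(MemoryLayout<Int>.size, MemoryLayout<UInt>.size)   // e.g. 8 8 on a 64-bit platform
print(Int.bitWidth, UInt.bitWidth)                        // e.g. 64 64; both print 32 on 32-bit platforms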
By contrast, the Swift guide gives a later example:
let age = -3
assert(age >= 0, "A person's age cannot be less than zero")
// this causes the assertion to trigger, because age is not >= 0
Here, a runtime issue could be caught at compile time if the code had been written as:
let age: UInt = -3
// this causes a compiler error because -3 is negative
There are many other cases (for example anything that will index a collection) where using a UInt would catch issues at compile time rather than runtime.
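As a sketch of what I mean by the collection-index case, here is a hypothetical element(of:at:) helper whose index parameter is a UInt: a negative literal is rejected by the compiler, although the body still has to convert back to Int because Array's own subscript takes Int.

func element<T>(of array: [T], at index: UInt) -> T {
    return array[Int(index)]             // Array's subscript still takes Int
}

let names = ["Anna", "Brian", "Craig"]
let first = element(of: names, at: 0)    // fine
// let bad = element(of: names, at: -1)  // compile error: -1 cannot be stored in a UInt
print(first)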
So the question: is the advice in the Swift Programming Language Guide sound, and do the benefits of using Int "even when the values to be stored are known to be non-negative" outweigh the safety advantages of using UInt?
Additional note: Having used Swift for a couple of weeks now, it's clear that for interoperability with Cocoa, UInt is required. For example, the AVFoundation framework uses unsigned integers anywhere a "count" is required (number of samples / frames / channels, etc.). Converting these values to Int could lead to serious bugs where values are greater than Int.max.
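To handle that, a conversion like the sketch below seems safer than a bare Int(_:) initializer, which traps at runtime when the value does not fit; the UInt64 count here is a stand-in for a framework value, not a specific AVFoundation property.

let frameCount: UInt64 = UInt64.max     // pretend this came from a Cocoa API

if let frames = Int(exactly: frameCount) {
    print("processing \(frames) frames")
} else {
    print("frame count exceeds Int.max on this platform")  // handled instead of trapping
}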
Comments:
int – Jack James
for loop can easily go wrong; e.g. for (unsigned a = 10; a > 0; --a) is wrong because a is always > 0 by definition. – alastair
unsigned a isn't always > 0, it's >= 0 – j b
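For reference, the countdown pitfall from the comments looks roughly like this in Swift, where unsigned underflow traps at runtime rather than silently wrapping as it would in C (a sketch):

var i: UInt = 10
while i > 0 {       // written against > 0, this terminates correctly
    print(i)
    i -= 1
}
// With the condition written as i >= 0 instead, it is always true for a UInt,
// and the i -= 1 at i == 0 would trap at runtime with an overflow error.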