Given the following code and a device running iOS 7.1 or later:
NSDictionary *fontTraitsDictionary = @{UIFontWeightTrait : @(-1.0)};
NSDictionary *attributesDictionary = @{
    UIFontDescriptorFamilyAttribute : @"Helvetica Neue",
    UIFontDescriptorTraitsAttribute : fontTraitsDictionary
};
UIFontDescriptor *ultraLightDescriptor = [UIFontDescriptor fontDescriptorWithFontAttributes:attributesDictionary];
UIFont *shouldBeAnUltraLightFont = [UIFont fontWithDescriptor:ultraLightDescriptor size:24];
NSLog(@"%@", shouldBeAnUltraLightFont);
I would expect the value of shouldBeAnUltraLightFont
to be an instance of HelveticaNeue-UltraLight, but instead it is:
<UICTFont: 0x908d160> font-family: "Helvetica"; font-weight: normal; font-style: normal; font-size: 24.00pt
I am following the Apple documentation as far as I understand it. Why is the font family and font weight information completely ignored?
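As a sanity check, the faces registered for that family can be listed (a throwaway snippet using the same family name as above; on an iOS 7.1 simulator the list should include HelveticaNeue-UltraLight):
// List the face names UIKit has registered for the family, to confirm that
// an ultra-light face exists at all before blaming the descriptor.
NSArray *helveticaNeueFaces = [UIFont fontNamesForFamilyName:@"Helvetica Neue"];
NSLog(@"%@", helveticaNeueFaces);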
Things I’ve Tried
- I've tried other family names like Helvetica, Avenir, etc.
- I've tried other font weights in the valid range from -1 to 1, in increments of 0.25
Regardless of these changes, the font returned is always a vanilla instance of Helvetica at normal weight.
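For example, here is one of those variations spelled out (Avenir with a weight of -0.5; the specific values are just illustrative), and it still logs a plain Helvetica font:
// Same approach as above with a different family and weight; the logged font
// is still Helvetica at normal weight rather than a lighter Avenir face.
NSDictionary *avenirAttributes = @{
    UIFontDescriptorFamilyAttribute : @"Avenir",
    UIFontDescriptorTraitsAttribute : @{ UIFontWeightTrait : @(-0.5) }
};
UIFontDescriptor *avenirDescriptor = [UIFontDescriptor fontDescriptorWithFontAttributes:avenirAttributes];
NSLog(@"%@", [UIFont fontWithDescriptor:avenirDescriptor size:24]);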
Comments
- If you remove the UIFontDescriptorTraitsAttribute key from the dictionary, then the resulting font is of the correct family ("Helvetica Neue", not "Helvetica"). Probably a bug. It seems more likely that you should be using fontDescriptorWithSymbolicTraits to get the correct font, but UIFontDescriptorSymbolicTraits is lacking a value for light/ultralight. Probably an oversight. – blork
- […] (<UICTFont: 0x7b22c690> font-family: "Helvetica Neue"; font-weight: normal; font-style: normal; font-size: 24.00pt), so there's some progress. – Arek Holko
- […] <UICTFont: 0x7bf91af0> font-family: "HelveticaNeue-UltraLight"; font-weight: normal; font-style: normal; font-size: 16.00pt. – Nate Cook
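For completeness, the symbolic-traits route mentioned in the comments would look roughly like this (a sketch only; since UIFontDescriptorSymbolicTraits has no light/ultralight value, bold is used here just to show the mechanism):
// Symbolic traits can express bold/italic but not a light weight, so this
// should resolve to a bold Helvetica Neue face, with no equivalent way to
// request UltraLight through this API.
UIFontDescriptor *baseDescriptor = [UIFontDescriptor fontDescriptorWithFontAttributes:
    @{ UIFontDescriptorFamilyAttribute : @"Helvetica Neue" }];
UIFontDescriptor *boldDescriptor = [baseDescriptor fontDescriptorWithSymbolicTraits:UIFontDescriptorTraitBold];
NSLog(@"%@", [UIFont fontWithDescriptor:boldDescriptor size:24]);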