0
votes

I get the ASCII code from a char like this (using the C language):

#include <string.h>
#include <GL/gl.h>   /* for GLfloat (OpenGL header) */

char *text = "5°";
int sizeText = strlen(text);
int i = 0;
float asciiCode;
for (i = 0; i < sizeText; i++) {
    asciiCode = (GLfloat)text[i];   /* numeric code of the current character */
}

It works well for the ASCII table (characters 0 to 127), but it doesn't work with characters from the extended ASCII table. For example, I get -62 for the symbol "°" (it should be 248).

I tried several encodings, such as UTF-8 and ISO 8859-1 (I'm using Eclipse CDT, by the way); I got a different code each time, but never the right one :/

Do you have any idea why it is not working? How can I get it to work?

Thanks.

2
How are you displaying the values? Why are you assigning an ASCII code to a float? If I do printf("5°") it works just fine. The problem is that you're assigning a signed char holding byte value 194 (which, as a signed 8-bit number, is -62) to a float, which keeps it as -62; from then on it's just a number. – lurker
You don't know for sure that "it should be 248". The signedness of char is platform-dependent. – Bathsheba
Why in the world would you use a "graphics type" like GLfloat to manage char values, which are integers anyway? – Guiroux
Thanks for your answers! GLfloat is a float; using an int didn't solve the problem (even if I agree it's better), but using the type wchar_t did! Now I get 176 as the code for "°", and I'm wondering where the -62 was coming from! – Epiliptik
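Note on Epiliptik's follow-up: with a UTF-8 source file, "°" is encoded as the two bytes 0xC2 0xB0, i.e. 194 and 176 unsigned, and on a platform where char is signed, 194 wraps around to 194 - 256 = -62. The 248 the question expects is where "°" sits in IBM code page 437, just one of several incompatible "extended ASCII" tables. A minimal sketch that makes the raw bytes visible, assuming the source file is saved as UTF-8:

#include <stdio.h>
#include <string.h>

int main(void) {
    const char *text = "5°";   /* 3 bytes in UTF-8: '5', 0xC2, 0xB0 */
    size_t len = strlen(text);
    size_t i;
    for (i = 0; i < len; i++) {
        /* cast through unsigned char to read the raw byte value 0..255 */
        printf("byte %zu: signed %d, unsigned %u\n",
               i, (int)text[i], (unsigned)(unsigned char)text[i]);
    }
    return 0;   /* expected: 53/53, -62/194, -80/176 with a signed char */
}

ISO 8859-1 encodes "°" as the single byte 0xB0 (176), which is why switching the file encoding changes the numbers: the loop is reading encoding bytes, not one universal character code.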

2 Answers

0
votes

Because a signed char ranges from -128 to 127, you can't hold extended ASCII values (128 to 255) in it like that.

I suggest you use wchar_t instead. Check the wchar.h and wctype.h manual pages.
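A minimal sketch of that approach, assuming a UTF-8 locale and a platform (such as glibc) where wchar_t holds Unicode code points; it prints 176 for "°", matching what Epiliptik reports in the comments:

#include <locale.h>
#include <stdio.h>
#include <wchar.h>

int main(void) {
    setlocale(LC_ALL, "");        /* use the environment's locale for wide output */
    const wchar_t *text = L"5°";  /* wide literal: one wchar_t per character, not per byte */
    size_t len = wcslen(text);
    size_t i;
    for (i = 0; i < len; i++) {
        wprintf(L"char %zu: code %d\n", i, (int)text[i]);   /* 53, then 176 */
    }
    return 0;
}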

0
votes

I think your problem is that asciiCode is a float; try declaring it as an unsigned int.
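Changing the destination type alone is not quite enough, though: by the time the signed char is converted, it is already -62, and assigning -62 to an unsigned int yields a huge value rather than 194. A minimal sketch of the usual fix, casting through unsigned char first (source file assumed UTF-8-encoded):

#include <stdio.h>

int main(void) {
    const char *text = "5°";
    /* unsigned char first, so the byte is read as 0..255 instead of
       being sign-extended from -62 */
    unsigned int code = (unsigned char)text[1];
    printf("%u\n", code);   /* 194 (0xC2, the first UTF-8 byte of "°") */
    return 0;
}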