I am trying to convert an input ASCII string to hex format using the format specifier "%.2X". I have pasted the program at the end which does this. The program works fine on the AIX server, but on the Linux server it gives improper hex output. Below I have mentioned the output of the program in both environments. The AIX environment gives the equivalent hex value for the given input and works well. The Linux environment returns the equivalent hex value for the normal ASCII characters but not for the extended ASCII characters (i.e. Latin characters, codes 128-255).

Output in AIX
---------------
INput chunk = [<8f><80>âY^SZÃpS ðï^Y]
Read buf size = [30]
ls_input_chunk_raw = [3C38663E3C38303EC383C2A2595E535AC38370532020C3B0C383C2AF5E59]

Output in Linux
-----------------
INput chunk = [<8f><80>âY^SZÃpS ðï^Y]
Read buf size = [30]
ls_input_chunk_raw = [3C38663E3C38303EFFFFFFFF595E535AFFFF70532020FFFFFFFFFFFF5E59]

I suspect the Linux processor (which I am using) doesn't support the extended ASCII chars (I am not sure about this).

Server Information:
----------------------
AIX server:
------------
$uname -a
AIX serv01 3 5 00CBCEFC4C00

Linux server:
--------------
$uname -a
Linux serv02 2.6.9-42.0.0.0.1.ELsmp #1 SMP Sun Oct 15 15:13:57 PDT 2006 x86_64 x86_64 x86_64 GNU/Linux

I am expecting the same output in the Linux environment as in the AIX environment. Please let me know your suggestions to resolve this issue so it works in both environments.

Code:

/***********C Program ***************************/
#include <stdio.h>
#include <string.h>   /* for strcpy, strlen, memset */

void text_to_hex(char *is_ascii_string, char *os_hex_string, int ii_ascii_string_size);

int main()
{
    FILE *pf_in_file = NULL;
    char ps_in_filename[20] = "";
    int  pi_input_chunk_size = 0;
    char ps_input_chunk[161] = "";
    long pl_bytes_to_read = 160;
    char ls_input_chunk_raw[220] = "";

    strcpy(ps_input_chunk, "<8f><80>âY^SZÃpS ðï^Y");
    pi_input_chunk_size = strlen(ps_input_chunk);

    printf("INput chunk = [%s]\n", ps_input_chunk);
    printf("Read buf size = [%d]\n", pi_input_chunk_size);

    text_to_hex(ps_input_chunk, ls_input_chunk_raw, pi_input_chunk_size);

    printf("ls_input_chunk_raw = [%s]\n", ls_input_chunk_raw);

    return (0);
}

void text_to_hex(char *is_ascii_string, char *os_hex_string, int ii_ascii_string_size)
{
    char *function = "text_to_hex";
    char ls_hex[3] = "";
    int i = 0;
    int j = 0;

    memset(ls_hex, '\0', 3);

    j = 0;
    for (i = 0; i < ii_ascii_string_size; i++)
    {
        sprintf(ls_hex, "%.2X", is_ascii_string[i]);
        os_hex_string[j++] = ls_hex[0];
        os_hex_string[j++] = ls_hex[1];
    }
    os_hex_string[j] = '\0';
}
/****************** Program End ******************/
Please read the "Before you make a query" thread and learn how to use code tags. It's definitely easier than programming.

ASCII only defines 128 characters, 0-127. What the remaining values 128-255 represent depends on the system, its locale settings, or its "code page." Any "defined" behaviour you achieve will be "defined" only for implementations that comply with that particular choice.
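To illustrate the point, here is a minimal sketch (not from your program; the sample bytes are arbitrary) showing that a byte's numeric value is portable while the glyph a terminal draws for values above 127 is not:

Code:

#include <stdio.h>

/* Sketch: walk a byte string and print each byte's numeric value
 * alongside whatever symbol the current terminal/locale renders for it. */
int main(void)
{
    const unsigned char sample[] = { 'A', 0x31, 0x80, 0xC3, 0xE2, 0x00 };
    int i;

    for (i = 0; sample[i] != 0x00; i++)
    {
        /* The value is well defined (0-255); the glyph shown for
         * values above 127 depends on the locale/code page. */
        printf("value = %3u (0x%02X)  rendered as [%c]\n",
               sample[i], sample[i], sample[i]);
    }
    return 0;
}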
Thanks for your advice. So, you mean to say that the Sun & Linux servers will not support the 256-character ASCII set and only support the standard 0-127 ASCII characters, whereas the IBM AIX server supports both the standard and the extended ASCII character set, i.e. 0-255. Please let me know if my understanding is wrong. Also, could you please suggest a method/command to identify the character set supported by the server?
They will all support values from 0-255. There is no standard definition for the symbolic representations in the upper half of the set. Some might want line-drawing characters, some might want more accented characters, some might want to provide switchable choices. None of your systems is "wrong"; they have just made different choices.
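As for identifying the character set in use: a hedged sketch, assuming a POSIX system. The C library reports the codeset chosen by the environment's locale settings, and the `locale` shell command (or `locale charmap`) shows the same kind of information from the command line. This reflects the configured locale, not every encoding the machine could support.

Code:

#include <stdio.h>
#include <locale.h>
#include <langinfo.h>

/* Sketch: report the codeset selected by the environment's locale
 * settings (LANG/LC_CTYPE). nl_langinfo(CODESET) returns the encoding
 * name, e.g. "ISO-8859-1" or "UTF-8". */
int main(void)
{
    /* Adopt the locale configured in the environment. */
    setlocale(LC_CTYPE, "");
    printf("Codeset: %s\n", nl_langinfo(CODESET));
    return 0;
}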
Could you please suggest a way by which I can get the correct hex value for a given extended ASCII character without using sprintf(..., "%.2X", ...)? I don't think any built-in function is available in C for hex conversion. If there is one, could you please suggest it?
I think you're still missing the point that a binary value and a representation of that value are two different things. For instance, when you print the value 1, you are not printing the value 1 but the character '1', which (in ASCII) has a value of 49, or 0x31. The ASCII symbol for the representation of the value 1 is clearly defined. The symbol for the representation of the value 200 is not clearly defined: it might be ╚ and it might be something else.

It is relatively trivial to come up with the hexadecimal symbolic representation of actual values. Here is one method, for the values 0 through 255. Note that this method extracts the least significant digit first; the resulting string is then reversed to get MSD-first order. I think you can see how this could be adapted for binary, octal, or any other base. All you need are the symbols for that base and a suitable divisor.

Personally, I'd use sprintf. In other words, the wheel has already been invented.

Code:

#include <stdio.h>
#include <string.h>

/* Reverse a string in place (portable replacement for _strrev,
 * which is not available on AIX or Linux). */
static void reverse(char *s)
{
    char tmp;
    size_t i, len = strlen(s);

    for (i = 0; i < len / 2; ++i)
    {
        tmp = s[i];
        s[i] = s[len - 1 - i];
        s[len - 1 - i] = tmp;
    }
}

int main()
{
    char symbols[] = "0123456789abcdef";
    char result[4];
    unsigned i, j, lsd, hexNum;

    for (i = 0; i < 256; ++i)
    {
        hexNum = i;
        j = 0;
        strcpy(result, "0");
        do
        {
            /* Extract the least significant hex digit first. */
            lsd = hexNum % 16;
            result[j++] = symbols[lsd];
            hexNum /= 16;
        } while (hexNum);
        result[j] = '\0';

        /* Digits were produced LSD-first; reverse for MSD-first output. */
        reverse(result);
        printf("Hex: %s\n", result);
    }
    return (0);
}
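One hedged note on the sprintf route your original program already uses: on platforms where plain char is a signed type (typically the case on x86 Linux, unlike AIX/PowerPC), a byte above 0x7F is sign-extended to a negative int when passed to %X, which is where the runs of FF in your Linux output can come from. Casting each byte to unsigned char before formatting sidesteps that. A minimal sketch follows; the function name, variable names, and test bytes are illustrative, not taken from your program, and this only addresses the FF padding, not any difference in how the two machines encode the source string itself.

Code:

#include <stdio.h>

/* Sketch of the same text-to-hex conversion with the byte forced to
 * unsigned before formatting. */
static void bytes_to_hex(const char *in, char *out, int len)
{
    int i;
    for (i = 0; i < len; i++)
    {
        /* Cast to unsigned char so a byte like 0x8F stays 0x8F instead
         * of being sign-extended to 0xFFFFFF8F on signed-char platforms. */
        sprintf(out + 2 * i, "%.2X", (unsigned char)in[i]);
    }
    out[2 * len] = '\0';
}

int main(void)
{
    const char sample[] = { 'A', (char)0x8F, (char)0xE2, 0 };  /* arbitrary test bytes */
    char buf[16];

    bytes_to_hex(sample, buf, 3);
    printf("[%s]\n", buf);   /* expected: [418FE2] */
    return 0;
}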