I need to write an assembly language program for the MC68HC12 processor that converts an ASCII string into the 8-bit signed number it represents. Can someone give me some direction on this?

# ASCII binary conversion


The usual approach is to walk the string one character at a time, keeping a running sum: for each digit, sum = sum*10 + (character - '0'). So let's take the string '123'. sum=0, and the first character is '1'. '1'-'0'=1, so sum*10+1=1.

Second character is '2', so sum=1*10+2=12.

And so on to '3' where sum=12*10+3=123.
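Here is roughly what that loop might look like in HC12 assembly. This is only a sketch, not tested code: it assumes the string is NUL-terminated, that the result fits in 8 bits, and the labels (STRING, LOOP, DONE) are placeholders for whatever you use in your own program. The CPU12 MUL instruction multiplies A by B into D, so for small sums the part we care about ends up in B.

```
        LDX   #STRING     ; X -> first character of the string (assumed NUL-terminated)
        CLRA              ; A = running sum, starts at 0
LOOP    LDAB  1,X+        ; fetch next character, post-increment the pointer
        CMPB  #'0'
        BLO   DONE        ; below '0' (including the NUL terminator) -> finished
        CMPB  #'9'
        BHI   DONE        ; above '9' -> finished
        SUBB  #'0'        ; ASCII digit -> value 0..9
        PSHB              ; save the digit value
        LDAB  #10
        MUL               ; D = A*B = sum*10; the low byte (all we need) is in B
        TBA               ; sum*10 back into A
        PULB              ; recover the saved digit
        ABA               ; A = sum*10 + digit
        BRA   LOOP
DONE                      ; A now holds the converted value
```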

If you haven't got a multiply instruction then you can do shifts and adds to get the desired result. x*10=x*8+x*2, so you can shift x left 1, copy that to the result, then shift it left twice more and add that to the result, i.e.

sum=0

shift left x 1 (x now contains 2* the original x)

add x to sum (sum now contains x*2)

shift left x 1

shift left x 1 (x now contains 8* the original x)

add x to sum (sum now contains x*2+x*8, or x*(2+8), or 10x).
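In HC12 terms that shift-and-add sequence might look something like this (again just a sketch, assuming x is in B and that 10*x still fits in 8 bits):

```
        ; B holds x; build 10*x in A using shifts and adds
        ASLB              ; B = 2*x
        TBA               ; copy to the result: A = 2*x
        ASLB              ; B = 4*x
        ASLB              ; B = 8*x
        ABA               ; A = 2*x + 8*x = 10*x
```

On the HC12 this is mostly academic since it has MUL, but the same pattern works on parts that don't.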

If you want to be able to handle negative numbers as well then simply note somewhere if the first character is a minus, then at the end of the conversion just 2's complement the result (or flip all the bits and add 1, which is the same thing). So 123 is 7b in hex and 0111 1011 in binary, and -123 is 1000 0101.
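For example (only a sketch): assuming the magnitude ended up in A and you stored a flag byte, here called SIGN (a made-up label), that is non-zero when the first character was '-', the fix-up at the end could be:

```
        ; after the conversion loop, A holds the magnitude
        TST   SIGN        ; flag set earlier if the first character was '-'
        BEQ   NOTNEG      ; zero -> no minus sign, leave A alone
        NEGA              ; two's complement: A = -A (flip all bits and add 1)
NOTNEG                    ; A now holds the 8-bit signed result
```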



sum = sum*10 + digit just builds the number up as we encounter more digits.

So at '1', sum=0*10+1=1. At '2', sum=1*10+2=12. At '3', sum=12*10+3=123.

So for each digit we encounter, we multiply sum by 10 and add the new digit.