char problem in C code kernel

I have come across a problem in C code that is called from a script in a motion program, and it surprised me (it's probably something simple)!

My C function had a variable declared like this:

char initialCubicTrue = 0;

In the code the value of initialCubicTrue is set to a positive or negative value to control logic in the code. I found that the logic didn't work when the value was -3 (for instance) and I'm guessing that it didn't work for any negative number.

The problem was resolved when I changed the variable declaration to:

signed char initialCubicTrue = 0;

My understanding of the C compiler used is that it would default to treating char as signed - but in this case it seems not. (I am using a 465 card running firmware).

Can you let me know if there are any other similar things to beware of?

We believe unsigned is the normal default for char on this platform. Every character in the ASCII table is represented by a non-negative integer.
Searching this issue on the web, I found the comment "Most systems, including x86 GNU/Linux and Microsoft Windows, use signed char, but those based on PowerPC and ARM processors typically use unsigned char."

This presumably explains the behaviour I saw since the 485 is a PowerPC chip.

Am I right in thinking there is an Omron "Delta Tau" product on the way that uses an x86 processor? If so, it seems that we will need to watch code portability here.
The Omron IPC (Industrial PC Platform) product is x86 based - the NY51[]-A.

Just use an "int" declaration. Memory is cheap, and your compiler may be aligning the char values to 64-bit boundaries anyway. If all you need is a multistate value, you would be better off using an enum, which is an int under the hood but with a little extra type checking and syntactic sugar.

