On my system this is a working and reasonable function to calculate the integer square root of an unsigned long integer:
unsigned long int sqrt_utility(unsigned long int input) {
    unsigned long int k = 0;
    unsigned long int rv = input;
    unsigned long int y;

    if (input < 2)
        return input; /* 0 and 1 are their own roots; also avoids input/rv dividing by zero when input == 0 */

    while (rv > 0) {
        ++k;
        rv >>= 1;
    }
    /* k now contains the number of binary digits of input */

    /* Now the good old Newton algorithm, starting from
       1 << ceil(k/2), which is always >= the true root */
    rv = 1;
    ++k;
    rv <<= k/2;
    y = (rv + input/rv)/2;
    while (y < rv) {
        rv = y;
        y = (y + input/y)/2;
    }
    return rv;
}
But I was wondering: does the operator << have the same meaning on all machines?
Or do left/right shifts change meaning depending on the endianness of the machine?
Or is it portable?
Thanks
Edit: fixed a bug in the code. k/2 must be rounded up, not down.
It doesn't change the meaning of the question, but if anyone searching for an integer square root ends up here, they will find correct code.
Last edited by ezzetabi (2008-04-02 07:49:25)
I never tried this really, but I imagine that even if the low-level CPU shift instructions work based on endianness and all that fun stuff, C was designed so we don't have to think about it. Unless you're talking about the high-order bits, overflows, and things like that, the compiler will emit different instructions to do the exact same thing.
Exactly, the compiler makes sure it does the same thing everywhere.
Offline
Besides shifting by negative or out-of-range amounts (> 31 for 32-bit ints), which is undefined behavior, a C left shift will produce the same result on any architecture (i.e. a << b == a * 2^b). Right shifts are a little different, though. An arithmetic shift left and a logical shift left produce the same results, whereas an arithmetic shift right preserves the sign bit while a logical shift right shifts in 0s. For unsigned ints a right shift is always a logical shift, but for signed ints I'm pretty sure the ANSI/ISO C standard _doesn't_ require an arithmetic shift; the result for negative values is implementation-defined. Still, in practice a right shift on signed ints does produce an arithmetic shift: in every version of gcc I've used, across various architectures, I've never seen it do otherwise on signed integers.