
2010/8/23 <200901104@daiict.ac.in>:
> This function takes 1.8 seconds to convert 2000 integers of length 10^13000. I need it to be smaller than 0.5 sec. Is it possible?
2000 integers of magnitude 10^13000 amount to about 26 MBytes of data (2000 numbers, each 13000 digits long). Rounding 1.8 seconds up to two seconds, that means you are processing the input at roughly 13 MBytes per second. Assuming a CPU clock frequency near 2.6 GHz, you are spending about 200 clock cycles per input digit.

Now, 10^13000 is roughly 2^43000, which is about 1350 32-bit words of representation. Each nextN = (n*10) + currentDigit conversion step near the end of a number has to touch every one of those words, so your budget works out to about 200/1350, i.e. less than one clock cycle per word. Either I erred in my math, or you're performing better than most of us could expect. Maybe you are off in your 10^13000 by an order of magnitude.
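
For what it's worth, here is the kind of digit-at-a-time loop I am assuming you use (a sketch of my own, not your actual function), together with the back-of-envelope arithmetic above, in Python. The sizes and the 3.32 bits-per-decimal-digit constant are taken from the reasoning in this message, not from your code:

import time

# Back-of-envelope check of the numbers above (all assumptions mine):
DIGITS_PER_INT = 13000                            # "magnitude 10^13000"
NUM_INTS = 2000
total_bytes = DIGITS_PER_INT * NUM_INTS           # 26,000,000 chars, ~26 MBytes
mbytes_per_sec = total_bytes / 2.0 / 1e6          # ~13 MBytes/s at ~2 seconds
cycles_per_digit = 2.6e9 / (total_bytes / 2.0)    # ~200 cycles per input digit
words_per_int = DIGITS_PER_INT * 3.32 / 32        # ~1350 32-bit words per integer

def naive_str_to_int(s):
    # Schoolbook digit-at-a-time conversion: n = n*10 + digit.
    # Near the end of a long number every step has to carry through
    # all ~1350 words, which makes this O(n^2) in the digit count.
    n = 0
    for ch in s:
        n = n * 10 + (ord(ch) - ord("0"))
    return n

if __name__ == "__main__":
    print("~%.0f MBytes total, ~%.0f MBytes/s, ~%.0f cycles/digit, ~%.0f words/int"
          % (total_bytes / 1e6, mbytes_per_sec, cycles_per_digit, words_per_int))
    # Time a scaled-down run (20 integers instead of 2000) so the O(n^2)
    # pure-Python loop finishes quickly; multiply by 100 to compare
    # against the 2000-integer figure quoted above.
    sample = ["9" * DIGITS_PER_INT] * 20
    start = time.perf_counter()
    for s in sample:
        naive_str_to_int(s)
    print("%.2f s for %d conversions" % (time.perf_counter() - start, len(sample)))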