No, that is OK. :-) The issue is that I've implemented two
algorithms solving the same problem. For one of them, I can
easily derive the time complexity and even verify it by performing
several tests and then doing a least-squares fit. For the other
algorithm, however, the expected time complexity ( quite well known
in general :-) ) and the measured values do not fit together. Since I
try to use inputs as large as possible as well as short ones,
I was wondering whether there might be some distortion from
values added to the reduction counter by garbage
collection. Now I know there is none ( verified by
changing the amount of memory allocated to Hugs and running the
same command twice => once with GC, once without :-) ).
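As an aside, the least-squares check mentioned above can be done by fitting a line to the measurements on a log-log scale: if T(n) ~ c * n^k, then log T is linear in log n with slope k. A minimal sketch (the data points below are made up for illustration, they are not from my measurements):

```haskell
-- Estimate k and c in T(n) ~ c * n^k from (n, T) pairs by
-- ordinary least squares on (log n, log T).
fitExponent :: [(Double, Double)] -> (Double, Double)
fitExponent pts = (slope, exp intercept)
  where
    xs = map (log . fst) pts
    ys = map (log . snd) pts
    n  = fromIntegral (length pts)
    mx = sum xs / n
    my = sum ys / n
    slope = sum (zipWith (\x y -> (x - mx) * (y - my)) xs ys)
          / sum (map (\x -> (x - mx) ^ (2 :: Int)) xs)
    intercept = my - slope * mx

main :: IO ()
main = print (fst (fitExponent [(10, 100), (100, 10000), (1000, 1000000)]))
-- slope comes out as (approximately) 2.0 for this quadratic-looking data
```

If the fitted exponent disagrees with the expected one even on clean data like this, the algorithm (or the measurement) is the culprit, not the fit.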

Sorry for the noise on the list.

Dusan

Daniel Fischer wrote:
Am Mittwoch, 24. August 2005 16:55 schrieb Dusan Kolar:
Hello,

  Even though I know the number of reductions should not be used
for anything important, I'm quite confused by the values I get.
Does garbage collection somehow affect the number of
reductions? I have always thought not, but... ;-)

  Thx,

   Dusan


What is confusing you?
Different numbers of reductions for the same computation?
That would probably be because named entities are stored and not
re-evaluated.

Cheers,

Daniel
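Daniel's point about named entities being stored can be seen in a small sketch (the names here are made up; Debug.Trace is used only to make the single evaluation visible):

```haskell
import Debug.Trace (trace)

-- 'expensive' is a named top-level entity, so its value is computed
-- once on first demand and then stored; later uses reuse the stored
-- result, so they add no further reductions.
expensive :: Integer
expensive = trace "evaluating expensive" (sum [1..1000000])

main :: IO ()
main = do
  print expensive  -- first use: forces evaluation, trace message appears
  print expensive  -- second use: stored result, no trace, no re-evaluation
```

So evaluating the same named expression twice in one session can show very different reduction counts the first and second time.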

-- 

 Dusan Kolar            tel: +420 54 114 1238
 UIFS FIT VUT Brno      fax: +420 54 114 1270
 Bozetechova 2       e-mail: kolar@fit.vutbr.cz
 Brno 612 66
 Czech Republic
