
patients, I wanted to be sure not to save wrong information. It wouldn't matter if the clock said we were in the 17th century, as long as 10 seconds was never recorded as 10.1.
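To illustrate that distinction, a monotonic counter such as QueryPerformanceCounter measures elapsed intervals regardless of what the wall clock says, or whether it jumps. A minimal sketch (the timed activity here is just a placeholder Sleep):

    #include <windows.h>
    #include <cstdio>

    int main() {
        LARGE_INTEGER freq, start, stop;
        QueryPerformanceFrequency(&freq);   // counter ticks per second
        QueryPerformanceCounter(&start);

        Sleep(10000);                       // stand-in for the activity being timed

        QueryPerformanceCounter(&stop);
        double seconds = double(stop.QuadPart - start.QuadPart) / double(freq.QuadPart);
        // Even if the system date is wrong or gets adjusted mid-run,
        // this elapsed interval is unaffected.
        printf("elapsed: %.6f s\n", seconds);
        return 0;
    }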
What interval durations do you need to measure? Since they come from the equipment, what does its spec say?
I read from a serial port. When the equipment was available for testing, we measured the output at a few thousand bytes per second. The program is supposed to run on Windows computers in physicians' offices (the only requirement is that they run Windows and have a serial port). Errors of 1 byte per second would be acceptable, even if they accumulate. I have no equipment specs and, right now, no equipment to test with. (Yes, I know this is crazy.) So I need to know what kinds of problems to expect, so I can deal with them. Thanks, Maurício
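Since no spec is available, one way to see what the equipment actually delivers is to timestamp each ReadFile on the COM port with the same monotonic counter and log bytes per elapsed second. This is only a sketch; the port name, baud rate, buffer size, and timeouts below are assumptions to adjust for the real device:

    #include <windows.h>
    #include <cstdio>

    int main() {
        // Port name and line settings are assumptions; adjust for the actual machine.
        HANDLE port = CreateFileA("COM1", GENERIC_READ, 0, NULL, OPEN_EXISTING, 0, NULL);
        if (port == INVALID_HANDLE_VALUE) { printf("cannot open COM1\n"); return 1; }

        DCB dcb = { sizeof(DCB) };
        GetCommState(port, &dcb);
        dcb.BaudRate = CBR_57600;           // a few thousand bytes/s needs roughly 38400 baud or more
        dcb.ByteSize = 8; dcb.Parity = NOPARITY; dcb.StopBits = ONESTOPBIT;
        SetCommState(port, &dcb);

        COMMTIMEOUTS to = {0};
        to.ReadIntervalTimeout = 50;        // return after a 50 ms gap in incoming data
        to.ReadTotalTimeoutConstant = 1000;
        SetCommTimeouts(port, &to);

        LARGE_INTEGER freq, t0, t;
        QueryPerformanceFrequency(&freq);
        QueryPerformanceCounter(&t0);

        char buf[512];
        DWORD got = 0, total = 0;
        while (ReadFile(port, buf, sizeof(buf), &got, NULL) && got > 0) {
            total += got;
            QueryPerformanceCounter(&t);
            double s = double(t.QuadPart - t0.QuadPart) / double(freq.QuadPart);
            // Log the observed data rate so timing drift or lost bytes show up early.
            printf("%.3f s  %lu bytes  (%.1f bytes/s)\n",
                   s, (unsigned long)total, s > 0 ? total / s : 0.0);
        }
        CloseHandle(port);
        return 0;
    }

Logging this on the target machines would also reveal whether the 1 byte/second error budget is being met before any patient data is saved.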