Why do QueryPerformanceCounter and GetTickCount give such different results?
A very simple little function compares the results of GetTickCount, clock, and QueryPerformanceCounter. Some context first: I am timing the copy of a 303-byte file. QueryPerformanceFrequency reports a frequency of 3,579,545 ticks per second on my machine, which tells me two things: 1. the high-resolution performance counter is available on this machine; 2. its resolution reaches the microsecond level (one tick is about 0.28 µs). Since the other two functions return millisecond results, I convert the QueryPerformanceCounter result to milliseconds as well:
#include <windows.h>
#include <stdio.h>
#include <time.h>

#define DEBUG printf   /* DEBUG is just printf in this test */

void test1()
{
    DWORD start = 0;
    DWORD end = 0;
    clock_t c_start, c_end;
    LARGE_INTEGER freqc = {0};
    LARGE_INTEGER llStart = {0};
    LARGE_INTEGER llEnd = {0};
    BOOL ret;

    QueryPerformanceFrequency(&freqc);
    DEBUG("frequency is %lld\n", freqc.QuadPart);

    start = GetTickCount();
    c_start = clock();
    ret = QueryPerformanceCounter(&llStart);

    system("copy 4.txt 4.txt.bak");

    end = GetTickCount();
    c_end = clock();
    ret = QueryPerformanceCounter(&llEnd);

    DEBUG("\nruns for %lu ms, start = %lu, end = %lu\n",
          end - start, start, end);
    DEBUG("\nruns for %ld ms, c_start = %ld, c_end = %ld\n",
          (long)(c_end - c_start), (long)c_start, (long)c_end);
    DEBUG("\nruns for %lld ms, llStart = %lld, llEnd = %lld\n",
          ((llEnd.QuadPart - llStart.QuadPart) * 1000) / freqc.QuadPart,
          llStart.QuadPart, llEnd.QuadPart);
}
GetTickCount and clock agree closely with each other, differing by at most 1 ms. But QueryPerformanceCounter, called immediately after them, produces a result that can differ from the other two by as much as 10 ms.
Isn't QueryPerformanceCounter supposed to be the more precise one? Why is it so far off from the other functions? What timing mechanisms do they each use, and which one should I trust?