I’ve always wondered if a hard drive can make much difference to the speed of a system.
My suspicion is that newer HDDs have a larger cache and can “slurp” up more data in one disk rotation than older ones.
I recently had a chance to test out my theory.
I had 2 very different hard drives and systems:
- HDD: 4.0 GB Quantum Fireball CR
- HDD: 250 GB Seagate ST3250820A
- System: Pentium D 2.66 GHz, 2 GB RAM
- System: Pentium III 550 MHz, 256 MB RAM
Now, I can’t say this is a totally scientific test; I just wanted to test an assumption that I had.
With each hard drive, I’d place it into a system, repartition the hard drive, then do a fresh install of Windows XP SP2.
Once XP was installed, I would update all the device drivers.
I would not defragment, install any antimalware, or do anything else that might alter the results. That way, I would have the kind of system many average users would get “out of the box”.
I would then time the startup, from when the Windows logo first appears until the Start button first appears.
I would then restart the system twice, so that I would have three boot times for each configuration.
Pentium D system:
4GB HDD: 33.9 sec, 27.3 sec, 32.9 sec
250GB HDD: 24.1 sec, 24.0 sec, 24.0 sec
Pentium III system:
4GB HDD: 26.0 sec, 25.6 sec, 26.3 sec
250GB HDD: 25.3 sec, 24.4 sec, 25.1 sec
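To make the comparison easier to eyeball, the three runs per configuration can be averaged with a short script (the configuration labels are mine, just for readability):

```python
# Boot times (seconds) from the measurements above,
# keyed by a made-up "system + drive" label.
times = {
    "Pentium D + 4GB Quantum":     [33.9, 27.3, 32.9],
    "Pentium D + 250GB Seagate":   [24.1, 24.0, 24.0],
    "Pentium III + 4GB Quantum":   [26.0, 25.6, 26.3],
    "Pentium III + 250GB Seagate": [25.3, 24.4, 25.1],
}

for config, t in times.items():
    avg = sum(t) / len(t)
    # Show the mean plus the spread across the three runs.
    print(f"{config}: avg {avg:.1f} s (min {min(t):.1f}, max {max(t):.1f})")
```

The averages come out to roughly 31.4, 24.0, 26.0, and 24.9 seconds respectively, which makes the pattern in the next paragraph a bit clearer.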
It looks like the timespan I was measuring might have included many points where XP was waiting on timeout conditions, as there doesn’t seem to be much difference between the HDDs. The biggest difference (about 3 to 9 seconds) was on the Pentium D system… but surprisingly, the P3 system seemed to beat it while using the 4GB drive… curious.
I’m not sure I can draw any real conclusions from all this… I also have two virtually identical HP systems (one with a 6GB HDD, the other with a 20GB HDD), and I could swear the 20GB system feels faster to me… maybe my mind is playing tricks on me.