Long ago I understood the basics of fragmentation: the file system prefers contiguous files, but it's the nature of the beast that files have to get broken up, and you get fragmentation. Solution: have the system rewrite the drive so that there are more contiguous files and less fragmentation.
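Just so we're all picturing the same thing, here's a toy sketch (plain Python, numbers and allocator totally made up for illustration, nothing to do with any Norton internals) of how it happens: delete a file in the middle of the disk and the next big file has no single gap to fit in, so it gets split.

```python
# Toy disk: a list of blocks, '.' = free, letters = file data.
disk = list("AAAABBBBCCCC....")

# Delete file B: free space is now split into two gaps.
disk = ['.' if b == 'B' else b for b in disk]
print("".join(disk))  # AAAA....CCCC....

def allocate(disk, name, size):
    """First-fit allocator: grab free blocks left to right.
    Returns how many separate fragments the file ended up in."""
    placed, fragments, in_run = 0, 0, False
    for i, b in enumerate(disk):
        if b == '.' and placed < size:
            disk[i] = name
            placed += 1
            if not in_run:
                fragments += 1
                in_run = True
        else:
            in_run = False
    return fragments

# A 6-block file no longer fits in either 4-block gap, so it splits.
frags = allocate(disk, 'D', 6)
print("".join(disk), "-> D is in", frags, "fragment(s)")
```

A defragmenter's whole job is to shuffle those blocks back into contiguous runs, which is why a real pass is so disk-intensive.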
Great idea, so how come nobody agrees on how to do this? I am mostly avoiding the SSD question because the waters are already muddy enough. My machine is Win7 with multiple drives, and I have several choices for optimization. (For the purposes of this thread I will be using "optimization" and "defragmentation" synonymously.)
Now, my do NOT like list:
- that NIS only lets you run the optimizer; you can't just check the status of a drive (though the built-in Windows tool can do an analyze-only pass, see the sketch after this list)
- that NIS starts by saying "testing drive". Does that mean it has figured out that C: is an SSD and shouldn't be touched?
- that NIS nearly always says that the disk fragmentation is 0% or just slightly over that
- that another Norton product, NU16, has a disk optimizer that behaves completely differently from NIS, AND comes up with completely different numbers!
- that if I run a third-party defragmenter, I get still different results from NIS or NU16
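On the "can't just check status" gripe: Win7's built-in defrag.exe has an analyze-only switch, so you can get a fragmentation report without optimizing anything. A rough sketch in Python (the /A and /V switches are from the Win7 tool as I remember it; run "defrag /?" to confirm, and it needs an elevated prompt):

```python
import subprocess

# Ask the built-in Windows defragmenter for an analysis-only report.
# /A = analyze without defragmenting, /V = verbose output.
result = subprocess.run(
    ["defrag", "C:", "/A", "/V"],
    capture_output=True, text=True
)
print(result.stdout)
```

Interesting to compare its percentage against whatever NIS and NU16 report for the same drive.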
It goes against my grain that NIS runs so quickly and generates such low numbers. If it sounds too good to be true, it probably isn't true. Just food for thought. On the other hand, the other defragmenters usually offer a choice between a quick and a deep defrag, and they often take the drive out of service for quite a while.
So why are the fragmentation numbers so wildly different? A given file is either contiguous or it's not. The degree to which it's not could be weighted in different ways, but how does that figure into the overall percentage? Does the "deepness" of the optimizing pass no longer matter? What do you think?
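Here's my suspicion, sketched in Python with made-up numbers: several perfectly defensible metrics applied to the exact same drive give wildly different "fragmentation %", because the debatable part is exactly what you count and how you weight it.

```python
# Hypothetical extent map for five files on the same drive:
# (file size in MB, number of fragments the file is split into)
files = [(2000, 1), (500, 1), (100, 12), (50, 4), (1, 30)]

# Metric 1: percent of FILES that are fragmented at all.
frag_files = sum(1 for _, frags in files if frags > 1)
pct_files = 100.0 * frag_files / len(files)

# Metric 2: percent of DATA (by size) sitting in fragmented files.
total_mb = sum(size for size, _ in files)
frag_mb = sum(size for size, frags in files if frags > 1)
pct_mb = 100.0 * frag_mb / total_mb

# Metric 3: excess fragments as a share of all fragments --
# roughly "how far is the layout from fully contiguous".
total_frags = sum(frags for _, frags in files)
excess_frags = sum(frags - 1 for _, frags in files)
pct_excess = 100.0 * excess_frags / total_frags

print(f"fragmented files: {pct_files:.0f}%")   # 60%
print(f"fragmented data:  {pct_mb:.1f}%")      # ~5.7%
print(f"excess fragments: {pct_excess:.0f}%")  # ~90%
```

Same drive, and depending on what you count it's 60%, 6%, or 90% fragmented. It wouldn't surprise me if NIS and NU16 are simply counting different things, and a quick pass that only fixes the worst offenders could legitimately claim a near-zero number under one metric while a deep pass chases the rest.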