The purpose of this blog post is to provide some data about fragmentation on the Mac that I have not seen researched or published elsewhere.
Mac OS X has a defragmenter built into the file system itself. Because the relevant file system code is open source (published as part of Darwin), we were able to look at it.
During a file open, the file gets defragmented on the fly if all of the following conditions are met:
1. The file is less than 20MB in size.
2. The file has more than 7 fragments.
3. The system has been up for more than 3 minutes.
4. The file is a regular file.
5. The file system is journaled.
6. The file system is not read-only.
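Condensed into code, the check looks something like the sketch below. This is a simplified paraphrase of the logic in the open-source HFS+ code, not a verbatim excerpt; the struct layouts and helper names here are hypothetical.

    #include <stdbool.h>

    /* Hypothetical stand-ins for the kernel's file and volume state. */
    struct file_info   { long long size; int fragment_count; bool is_regular_file; };
    struct volume_info { bool is_journaled; bool is_read_only; };

    long uptime_seconds(void);  /* hypothetical helper: seconds since boot */

    #define DEFRAG_MAX_SIZE   (20 * 1024 * 1024)  /* 1: under 20MB             */
    #define DEFRAG_MIN_FRAGS  7                   /* 2: more than 7 fragments  */
    #define DEFRAG_MIN_UPTIME (3 * 60)            /* 3: up more than 3 minutes */

    static bool should_defrag_on_open(const struct file_info *fp,
                                      const struct volume_info *vp)
    {
        if (fp->size >= DEFRAG_MAX_SIZE)            return false;  /* 1 */
        if (fp->fragment_count <= DEFRAG_MIN_FRAGS) return false;  /* 2 */
        if (uptime_seconds() < DEFRAG_MIN_UPTIME)   return false;  /* 3 */
        if (!fp->is_regular_file)                   return false;  /* 4 */
        if (!vp->is_journaled)                      return false;  /* 5 */
        if (vp->is_read_only)                       return false;  /* 6 */
        return true;  /* eligible for on-the-fly defragmentation */
    }

If every test passes, the file is relocated into a single contiguous extent as part of the open.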
So what’s Apple’s take on the subject? An Apple technical article states this:
Do I need to optimize?
You probably won’t need to optimize at all if you use Mac OS X. Here’s why:
- Hard disk capacity is generally much greater now than a few years ago. With more free space available, the file system doesn’t need to fill up every “nook and cranny.” Mac OS Extended formatting (HFS Plus) avoids reusing space from deleted files as much as possible, to avoid prematurely filling small areas of recently-freed space.
- Mac OS X 10.2 and later includes delayed allocation for Mac OS X Extended-formatted volumes. This allows a number of small allocations to be combined into a single large allocation in one area of the disk.
- Fragmentation was often caused by continually appending data to existing files, especially with resource forks. With faster hard drives and better caching, as well as the new application packaging format, many applications simply rewrite the entire file each time. Mac OS X 10.3 Panther can also automatically defragment such slow-growing files. This process is sometimes known as “Hot-File-Adaptive-Clustering.”
- Aggressive read-ahead and write-behind caching means that minor fragmentation has less effect on perceived system performance.
For these reasons, there is little benefit to defragmenting.
Note: Mac OS X systems use hundreds of thousands of small files, many of which are rarely accessed. Optimizing them can be a major effort for very little practical gain. There is also a chance that one of the files placed in the “hot band” for rapid reads during system startup might be moved during defragmentation, which would decrease performance.
If your disks are almost full, and you often modify or create large files (such as editing video, but see the Tip below if you use iMovie and Mac OS X 10.3), there’s a chance the disks could be fragmented. In this case, you might benefit from defragmentation, which can be performed with some third-party disk utilities.
Here is my take on that information:
While I have no problem with the lead-in, which says you "probably" won't need to optimize, the reasons given are theoretical. Expressing a theory, and then an opinion based on that theory, is fine, so long as you properly indicate it is an opinion. The problem I do have is with the last sentence before the note, "For these reasons, there is little benefit to defragmenting." That is passing off theory as fact.
Theory, and therefore "reasons," must be substantiated by an actual scientific process that applies the theory and then either validates or invalidates it. A common example of theory-as-fact we hear is the statement "SSDs don't have moving parts and don't need to be defragmented." Given that our primary business is large enterprise corporations, we hear a lot of theory about the need (or lack thereof) to defragment complex and expensive storage systems. In all those cases, testing proves that fragmentation (of files, free space, or both) slows computers down. The reasons sound logical, which dupes readers and listeners into believing the statements are true.
On that note, while the first three reasons are logical, the last "reason" is most likely wrong. Block-based read-ahead caching is predicated on file data being located sequentially (or interleaved) on the same disk "tracks," and file-based read-ahead still has to issue additional I/Os when a file is fragmented. Fragmentation of data essentially breaks read-ahead efforts. Could the Mac be predicting file access and pre-loading files into memory well in advance of use? Sure. If that is the case I could agree with the last point (i.e., "perceived system performance"), but I find it unlikely (anyone reading this is welcome to comment).
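To make that concrete, here is a back-of-the-envelope model (hypothetical numbers, not measured data): assume read-ahead can issue requests of up to 128KB, but a single request cannot span a fragment boundary on disk.

    /* Minimum I/Os to read a whole file with read-ahead, under the
     * (hypothetical) assumptions that one request cannot cross a
     * fragment boundary and all fragments are the same size.
     */
    #include <stdio.h>

    #define READAHEAD_KB 128  /* max size of one read-ahead request */

    static unsigned min_ios(unsigned file_kb, unsigned fragments)
    {
        unsigned frag_kb = file_kb / fragments;
        /* each fragment needs ceil(frag_kb / READAHEAD_KB) requests */
        return fragments * ((frag_kb + READAHEAD_KB - 1) / READAHEAD_KB);
    }

    int main(void)
    {
        printf("contiguous 10MB file:  %u I/Os\n", min_ios(10240, 1));   /* 80  */
        printf("same file, 320 frags: %u I/Os\n", min_ios(10240, 320));  /* 320 */
        return 0;
    }

In this toy model a contiguous 10MB file is read in 80 requests, while the same file split into 320 fragments needs 320, and each of those also pays a seek to reach the next fragment.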
They do also qualify the reason by saying "minor fragmentation," to which I would add that minor fragmentation on Windows may not have a "perceived" impact either.
I do agree with the final statement that "you might benefit from defragmentation" when using large files, although I think "might" is too indecisive.
Where my opinion comes from:
A few years ago (spring/summer of 2009) we ran a research project to understand how much fragmentation existed on Apple Macs. We wrote and sent a fragmentation/performance analysis tool to select customers who also had Macs at their homes or businesses. We collected data from 198 volumes on 82 Macs (OS X 10.4.x and 10.5.x); 30 of those systems had been in use for one to two years.
While the system specifics are confidential (testers provided the data under non-disclosure agreements), we found that free space fragmentation was particularly bad in many cases (worse than on Windows). We also found an average of a little over 26,000 fragments per Mac, with an average expected performance gain from defrag of about 8%. Our research also found that the more severe cases of fragmentation, where we saw 70k-100k+ fragments, were on systems low on available free space (substantiating that last paragraph in the Apple tech article).
This article also provides some fragmentation studies as well as performance tests. Their data likewise validates Apple's last paragraph and makes the "might benefit" statement look a bit understated.
Your Mileage May Vary (YMMV):
So, in summary, I would recommend defragmenting your Mac. As with Windows, the benefit from defragmenting is proportionate to the amount of fragmentation. Defrag will help; the question is "does defrag help enough to be worth the time and money?" The good thing is that most Mac defragmenters, just like Diskeeper for Windows, have free demo versions you can trial to see if it's worth spending the money.
Here are some options:
+ iDefrag (used by a former Diskeeper Corp employee who did graphic design on a Mac)
+ Drive Genius suite (a company we have spoken with in the past)
Perhaps this article raises the question (and the rumor) "will there be a Diskeeper for Mac?", to which I would answer "unlikely, but not impossible." The reason is that we already have a very full development schedule, with opportunities in other areas that we plan to pursue.
We are keeping an “i” on it though ;-).
Sorry for the off-topic question. It would be nice if you could blog about the NTFS MFT. After one project, for example, I ended up with an MFT containing nearly two hundred thousand ghost records floating around. What is your take on this? Do ghost records have any negative effect on performance, and have you ever considered adding an MFT compact and truncate feature (at least Total Defrag has one) to Diskeeper?

Thanks Joakim,
It is a good topic for another post. The MFT is an undocumented data structure, so features that shrink or grow it carry risk (the structure can change with each service pack). For perspective, assuming the default 1KB MFT file-record size, two hundred thousand ghost records would tie up roughly 200MB of MFT space. Generally, unless you have hundreds of thousands of files, delete them, and then write a small number of large files, shrinking the MFT is not needed, but the topic is worth a separate post.