Explained: The thinking behind the 32GB Windows Format limit on FAT32

‘Because I said so’

There is at last a definitive answer to the question of why the Windows UI slapped a 32GB limit on the formatting of FAT32 volumes and it’s “because I said so,” according to the engineer responsible.

While many welcomed 2021 within the walls of their own home, retired Microsoft engineer Dave Plummer marked the end of 2020 with the confession in the latest of a series of anecdotes hosted on his YouTube channel Dave’s Garage.

The limit has always seemed somewhat arbitrary, particularly when one considers the theoretical 16TB maximum volume size of the file system. Using a different formatting tool or dropping into the command line can handily override the presets, but sticking with the stock UI meant sticking with Plummer’s Format dialog. And that meant 32GB for FAT32.
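For the curious, the arithmetic behind that 16TB ceiling is straightforward. The snippet below is a rough back-of-the-envelope sketch (in Python, and nothing to do with Plummer's own code), assuming the commonly cited figures of roughly 2^28 addressable clusters and a 64KB maximum cluster size.

```python
# Rough sketch of the FAT32 theoretical ceiling (illustrative only):
# commonly cited figures are ~2**28 addressable clusters (28 usable bits
# per FAT entry) and a 64 KB maximum cluster size.
MAX_CLUSTERS = 2 ** 28          # ~268 million cluster addresses
MAX_CLUSTER_BYTES = 64 * 1024   # 64 KB clusters, the largest in common use

max_volume_bytes = MAX_CLUSTERS * MAX_CLUSTER_BYTES
print(f"FAT32 theoretical maximum volume size: {max_volume_bytes / 2**40:.0f} TB")
# FAT32 theoretical maximum volume size: 16 TB
```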

In the closing years of the last century, Plummer was involved in porting the Windows 95 shell to Windows NT. Part of that was a redo of Windows Format (“it had to be a replacement and complete rewrite since the Win95 system was so markedly different”) and, as well as the grungy lower-level bits going down to the API, he also knocked together the classic, stacked Format dialog over the course of an hour of UI creativity.

As he admired his design genius, he pondered what cluster sizes to offer the potential army of future Windows NT 4.0 users. The options would cap the size of the volume, since FAT32 supports only a fixed maximum number of clusters per volume. Making those clusters huge would make for an equally huge volume, but at a horrifying cost in wasted space: select a 32-kilobyte cluster size and even the few bytes needed by a “Hello World” file would snaffle the full 32k.

“We call it ‘Cluster Slack’,” explained Plummer, “and it is the unavoidable waste of using FAT32 on large volumes.”

“How large is too large? At what point do you say, ‘No, it’s too inefficient, it would be folly to let you do that’? That is the decision I was faced with.”
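To put numbers on the waste Plummer describes, here is a minimal sketch (Python again, with a made-up 12-byte file size purely for illustration) of how much of a cluster a tiny file leaves unused at various cluster sizes.

```python
import math

def cluster_slack(file_bytes: int, cluster_bytes: int) -> int:
    """Bytes wasted when a file is rounded up to a whole number of clusters."""
    clusters_used = max(1, math.ceil(file_bytes / cluster_bytes))
    return clusters_used * cluster_bytes - file_bytes

# A 12-byte "Hello World" file (size chosen for illustration only)
tiny_file = 12
for cluster_kb in (4, 8, 16, 32):
    wasted = cluster_slack(tiny_file, cluster_kb * 1024)
    print(f"{cluster_kb:>2} KB clusters: {wasted:,} bytes of slack for a {tiny_file}-byte file")
```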

At the time, the largest memory card Plummer could lay his hands on for testing had an impossibly large 16-megabyte capacity.

“Perhaps I multiplied its size by a thousand,” he said, “and then doubled it again for good measure, and figured that would more than suffice for the lifetime of NT 4.0. I picked the number 32G as the limit and went on with my day.”

While Microsoft’s former leader may have struggled to put clear water between himself and the infamous “640K” quote of decades past, Plummer was clear that his decision was aimed at NT 4.0 and was only ever meant as a temporary measure until the UI was revised.

“That, however, is a fatal mistake on my part that no one should be excused for making. With the perfect being the enemy of the good, ‘good enough’ has persisted for 25 years and no one seems to have made any substantial changes to Format since then…”

NTFS and exFAT (now widely used in removable storage devices, and whose specification was recently published by Microsoft) will cheerfully dispense with the limits imposed by Plummer’s decades-old design choices (and there is always the command-line option “so you can make a disk as big and inefficient as you’d like, subject to the FAT32 limits”).

However, as Plummer put it: “At the end of the day, it was a simple lack of foresight combined with the age-old problem of the temporary solution becoming de-facto permanent.”

Been there, done that? Just perhaps not in an operating system running on billions of PCs around the world? Perhaps an email to Who, Me? might be in order. ®
