Some good advice about large folders on NTFS volumes:
Here's some advice from someone whose environment has folders containing tens of millions of files.
- A folder stores the index information (links to child files & child folders) in an index file. This file gets very large when you have a lot of children. Note that it doesn't distinguish between a child that's a folder and a child that's a file; the only real difference is that the child's content is either the child folder's index or the child file's data. Note: I am simplifying this somewhat, but this gets the point across.
- The index file will get fragmented. When it gets too fragmented, you will be unable to add files to that folder, because there is a limit on the # of fragments allowed. This is by design; I've confirmed it with Microsoft in a support incident call. So although the theoretical limit is several billion files, good luck once you start hitting tens of millions of files.
- It's not all bad, however. You can use the tool contig.exe to defragment this index. It will not reduce the size of the index (which can reach several gigabytes for tens of millions of files), but it will reduce the # of fragments. Note: the Disk Defragmenter tool will NOT defrag the folder's index; it only defrags file data. Only contig.exe will defrag the index. FYI: you can also use it to defrag an individual file's data.
- If you DO defrag, don't wait until you hit the max # of fragments limit. I have a folder I cannot defrag because I waited until it was too late. My next test is to move some files out of that folder into another folder and see whether I can defrag it then. If that fails, what I would have to do is: 1) create a new folder; 2) move a batch of files into the new folder; 3) defrag the new folder; repeat steps 2 & 3 until done; then 4) remove the old folder and rename the new folder to match the old. (See the scripted sketch right after this list.)
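The batch-move-and-defrag procedure above can be scripted. Below is a minimal Python sketch of that loop, under a few assumptions of my own: the two folders sit on the same NTFS volume, contig.exe (Sysinternals Contig) lives at the path shown, and passing it a folder path defragments that folder's index as the quoted answer describes. All paths and the batch size are placeholders.

```python
import itertools
import os
import shutil
import subprocess

# Hypothetical paths and batch size -- adjust for your environment.
OLD_DIR = r"C:\data\bigfolder"        # the over-full folder
NEW_DIR = r"C:\data\bigfolder.new"    # replacement folder on the same volume
CONTIG = r"C:\tools\contig.exe"       # Sysinternals Contig
BATCH_SIZE = 100_000

def take_batch(folder, limit):
    """Collect up to `limit` file paths without listing the whole folder at once."""
    with os.scandir(folder) as entries:
        files = (e.path for e in entries if e.is_file())
        return list(itertools.islice(files, limit))

os.makedirs(NEW_DIR, exist_ok=True)

while True:
    batch = take_batch(OLD_DIR, BATCH_SIZE)
    if not batch:
        break
    for path in batch:
        # Same volume, so each move is a rename: file data is not copied.
        shutil.move(path, NEW_DIR)
    # Defragment the new folder's index after each batch, as the answer
    # suggests (equivalent to running: contig.exe C:\data\bigfolder.new).
    subprocess.run([CONTIG, NEW_DIR], check=True)

os.rmdir(OLD_DIR)            # succeeds only once the old folder is empty
os.rename(NEW_DIR, OLD_DIR)  # give the new folder the old folder's name
```

Contig also has an -a switch that reports fragmentation, so instead of defragmenting after every batch you could check the fragment count first and run the defrag only when it starts to climb.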
To answer your question more directly: if you're looking at 100K entries, no worries; go knock yourself out. If you're looking at tens of millions of entries, then either:
a) Make plans to sub-divide them into sub-folders (e.g., let's say you have 100M files; it's better to store them in 1000 folders, so that you have only 100,000 files per folder, than to store them in 1 big folder. This creates 1000 folder indices instead of a single big one that is more likely to hit the max # of fragments limit; see the bucketing sketch at the end of this answer), or
b) Make plans to run contig.exe on a regular basis to keep your big folder's index defragmented.
Source
Contig is free.
If you're not comfortable with the command line, you can use Power Defragmenter, a GUI written for Contig.
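On option a), a common way to keep the per-folder count bounded is to derive the sub-folder from a stable hash of the file name. This is a minimal Python sketch, not from the quoted answer; the root path, the 1000-bucket count, and the helper names are illustrative.

```python
import hashlib
import os
import shutil

ROOT = r"C:\data\files"   # hypothetical root that holds the 1000 sub-folders
NUM_BUCKETS = 1000

def bucket_for(name: str) -> str:
    """Map a file name to one of NUM_BUCKETS sub-folders using a stable hash."""
    digest = hashlib.md5(name.encode("utf-8")).hexdigest()
    return f"{int(digest, 16) % NUM_BUCKETS:03d}"   # '000' .. '999'

def store(src_path: str) -> str:
    """Move a file into its bucket sub-folder and return its new path."""
    name = os.path.basename(src_path)
    target_dir = os.path.join(ROOT, bucket_for(name))
    os.makedirs(target_dir, exist_ok=True)
    return shutil.move(src_path, os.path.join(target_dir, name))
```

Hashing the name keeps the distribution roughly even (about 100,000 files per sub-folder for 100M files) and lets you recompute a file's location from its name alone, so no single folder index ever has to grow into the tens of millions of entries.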