
In this situation, arcpy.da.Walk doesn't get you anything over os.walk because arcpy.da.Walk focuses on the contents of geospatial data stores (feature classes, raster catalogs, tables, etc.) rather than the data stores themselves (file geodatabases, folders, enterprise geodatabases, etc.). arcpy.da.Walk can do the job; it just does it more slowly because of the extra overhead of enumerating the geospatial data inside each data store.

Since you seem primarily interested in the size of the file geodatabases rather than the specifics of what is inside them, I recommend using plain os.walk to find the file geodatabases and determine their sizes.

import os

def get_size(start_path='.'):
    """Recursively sum the sizes (in bytes) of all files under start_path."""
    total_size = 0
    for dirpath, dirnames, filenames in os.walk(start_path):
        for f in filenames:
            fp = os.path.join(dirpath, f)
            total_size += os.path.getsize(fp)
    return total_size

output = 'gdb_sizes.csv'  # output file (example name; use your own path)
path = '.'                # root/parent directory to start the recursive search

with open(output, 'w') as f:
    for dirpath, dirnames, filenames in os.walk(path):
        for dirname in dirnames:
            # A file geodatabase is simply a folder whose name ends in .gdb.
            if dirname.endswith('.gdb'):
                gdb = os.path.join(dirpath, dirname)
                size = get_size(gdb)
                f.write("{},{}\n".format(gdb, size))

The get_size function comes from the accepted answer by monkut to the following Stack Overflow post: Calculating a directory size using Python?
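As a quick sanity check (illustrative only, not part of the original answer), you can verify that get_size returns the expected byte count by running it against a throwaway folder; the demo.gdb name and file contents below are made up for the test:

```python
import os
import tempfile

def get_size(start_path='.'):
    """Recursively sum the sizes (in bytes) of all files under start_path."""
    total = 0
    for dirpath, dirnames, filenames in os.walk(start_path):
        for name in filenames:
            total += os.path.getsize(os.path.join(dirpath, name))
    return total

# Build a temporary "geodatabase-like" folder and confirm the byte count.
with tempfile.TemporaryDirectory() as root:
    gdb = os.path.join(root, 'demo.gdb')  # hypothetical .gdb folder
    os.makedirs(gdb)
    with open(os.path.join(gdb, 'a00000001.gdbtable'), 'wb') as fh:
        fh.write(b'x' * 1024)
    with open(os.path.join(gdb, 'a00000001.gdbtablx'), 'wb') as fh:
        fh.write(b'y' * 512)
    print(get_size(gdb))  # 1536
```

Note that os.path.getsize reports logical file sizes, so the total can differ slightly from the on-disk usage a file manager reports (block allocation, compression, etc.).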