How to reduce the size of the File System

We're in the On-Demand environment, and Sugar Insights shows our File System size at 90% of its allocation and our Database size at 61% of its allocation.

The Database size has come down in recent months thanks to our clean-up project (deleting very old, no-longer-useful records), but the File System size continues to grow.

What can be done to reduce the size of the File System?

  • Hi,

    I can't explain it perfectly, because I'm not a developer myself, but we regularly do a kind of deduplication in the upload directory (we are on-prem, I should add): we identify duplicate files (which often occur through, e.g., images in email signatures and the like) with rdfind, hard-link all notes/emails that refer to the same content to a single copy of that file, and delete the rest.

    root@XXXXXXXX:/var/www/html/sugarcrm/upload [22:42] $ rdfind -removeidentinode false -deleteduplicates true -makehardlinks true . 
    Now scanning ".", found 1113514 files. 
    Now have 1113514 files in total. 
    Total size is 139555040280 bytes or 130 GiB 
    Removed 53148 files due to unique sizes from list.1060366 files left. 
    Now eliminating candidates based on first bytes:removed 87369 files from list.972997 files left. 
    Now eliminating candidates based on last bytes:removed 32170 files from list.940827 files left. 
    Now eliminating candidates based on sha1 checksum:removed 13379 files from list.927448 files left. 
    It seems like you have 927448 files that are not unique 
    Totally, 71 GiB can be reduced. 
    Now making results file results.txt 
    Now making hard links. 
    Making 829566 links.
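
    The three-stage idea rdfind uses above (group candidates by size, confirm duplicates by checksum, then replace copies with hard links) can be sketched in Python. This is a minimal illustration, not rdfind itself; the function name `dedupe_hardlink` is mine, and a real run against a Sugar upload directory should be preceded by a backup:

    ```python
    import hashlib
    import os
    from collections import defaultdict

    def dedupe_hardlink(root: str) -> int:
        """Replace duplicate regular files under `root` with hard links.

        Stage 1: group files by size (cheap; unique sizes can't be dupes).
        Stage 2: confirm duplicates within a size group via SHA-1 digest.
        Stage 3: unlink each duplicate and hard-link it to one kept copy.
        Returns the number of files replaced by hard links.
        """
        by_size = defaultdict(list)
        for dirpath, _dirnames, filenames in os.walk(root):
            for name in filenames:
                path = os.path.join(dirpath, name)
                if os.path.isfile(path) and not os.path.islink(path):
                    by_size[os.path.getsize(path)].append(path)

        linked = 0
        for paths in by_size.values():
            if len(paths) < 2:
                continue  # unique size: cannot have a duplicate
            by_hash = defaultdict(list)
            for path in paths:
                h = hashlib.sha1()
                with open(path, "rb") as f:
                    for chunk in iter(lambda: f.read(1 << 20), b""):
                        h.update(chunk)
                by_hash[h.hexdigest()].append(path)
            for dupes in by_hash.values():
                keep, rest = dupes[0], dupes[1:]
                for dup in rest:
                    if os.path.samefile(keep, dup):
                        continue  # already hard-linked together
                    os.unlink(dup)
                    os.link(keep, dup)
                    linked += 1
        return linked
    ```

    Hard links only work within one filesystem, and only help when the duplicates really are byte-identical, which is exactly the email-signature-image case described above.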

    Maybe it helps someone else as well. :)

    BR

    Julia Weinhold

