To compress a tar.gz archive with the best possible compression, use gzip's --best option:

GZIP=--best tar cfz archive.tar.gz lesFichiersEtOuDossiersACompresser

or

tar cf - lesFichiersEtOuDossiersACompresser | gzip --best -c - > archive.tar.gz
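
With recent versions of GNU tar, you can also pass the compression command and its options directly on the command line instead of going through the GZIP environment variable (which newer gzip releases document as obsolescent). A possible equivalent, assuming your GNU tar is recent enough to accept arguments inside --use-compress-program (short option -I):

tar --use-compress-program='gzip --best' -cf archive.tar.gz lesFichiersEtOuDossiersACompresser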

For details, see the GNU tar manual (GNU tar: an archiver tool), section "Creating and Reading Compressed Archives":

‘-z’
‘--gzip’
‘--ungzip’

    Filter the archive through gzip.

   You can use ‘--gzip’ and ‘--gunzip’ on physical devices (tape drives, etc.) and remote files as well as on normal files; data to or from such devices or remote files is reblocked by another copy of the tar program to enforce the specified (or default) record size. The default compression parameters are used; if you need to override them, set the GZIP environment variable, e.g.:

   $ GZIP=--best tar cfz archive.tar.gz subdir

   Another way would be to avoid the ‘--gzip’ (‘--gunzip’, ‘--ungzip’, ‘-z’) option and run gzip explicitly:

   $ tar cf - subdir | gzip --best -c - > archive.tar.gz

   About corrupted compressed archives: gzip'ed files have no redundancy, for maximum compression. The adaptive nature of the compression scheme means that the compression tables are implicitly spread all over the archive. If you lose a few blocks, the dynamic construction of the compression tables becomes unsynchronized, and there is little chance that you could recover later in the archive.

   There are pending suggestions for having per-volume or per-file compression in GNU tar. This would allow for viewing the contents without decompression, and for resynchronizing decompression at every volume or file, in case of corrupted archives. Doing so, we might lose some compressibility. But it would make recovery easier. So, there are pros and cons. We'll see!
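
Since, as the manual explains above, a corrupted gzip stream is essentially unrecoverable, it can be worth checking an archive right after creating it. Two quick checks using only standard tools (with archive.tar.gz from the examples above):

gzip -t archive.tar.gz
tar tzf archive.tar.gz > /dev/null

gzip -t checks the integrity of the compressed stream; listing the contents with tar tzf additionally verifies that the tar structure inside is readable. Both exit with a non-zero status on a damaged archive.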