#compression


🆕 blog! “Would adding Brotli Compression help shrink ePubs?”

The ePub format is the cross-platform way to package an eBook. At its heart, an ePub is just a bundled webpage with extra metadata - that makes it extremely easy to build workflows to create them and apps to read them.

Once you've finished authoring your ePub, you've got a folder full of HTML, CSS,…

👀 Read more: shkspr.mobi/blog/2025/07/would
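Since ePub readers are only required to understand the zip container's Deflate compression, a quick way to gauge what Brotli could buy is to compare the two on one chapter by hand. A minimal sketch, assuming the gzip and brotli CLIs are installed and chapter1.xhtml is a placeholder file from an unpacked ePub:

$ wc -c < chapter1.xhtml                  # raw size
$ gzip -9 -c chapter1.xhtml | wc -c       # Deflate-class baseline at max effort
$ brotli -q 11 -c chapter1.xhtml | wc -c  # Brotli at its highest quality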

#compression #epub


🗜️ #compression #7zip
To illustrate the "maximum compression" settings I talked about here: sebsauvage.net/links/?s0zmfA

Example with a game: Loophole.

Uncompressed: 3.18 GB
7z "ultra" (-mx=9): 1.33 GB
7z with my settings: 0.53 GB
(And zpaq -m4 does slightly better: 0.49 GB)

Of course, this is an example where it works well; the gain won't necessarily be as good on other data.
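For reference, the "ultra" result above is just 7-Zip's stock maximum preset, i.e. something like (archive and folder names assumed):

$ 7z a -mx=9 Loophole.7z Loophole/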


👑 #ArchiveDeJeux
#Compression #7zip
Since I was asked, here are the settings I now use to get the best compression out of 7-Zip.

Downsides:
- it becomes single-threaded (compression takes *much* longer; decompression will be somewhat slower).
- it uses more RAM (10 GB to compress, 1 GB to decompress).

1 GB for decompression doesn't strike me as unreasonable for today's machines.
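The exact switches are behind the link above. As a hedged illustration only, settings in this spirit (a single thread plus a very large LZMA2 dictionary) reproduce the trade-offs described, since LZMA2 needs roughly ten times the dictionary size in RAM to compress and roughly the dictionary size to decompress:

$ 7z a -t7z -m0=lzma2 -mx=9 -md=1024m -mfb=273 -ms=on -mmt=1 out.7z data/
# -md=1024m : 1 GB dictionary, so ~10 GB RAM to compress, ~1 GB to decompress
# -mmt=1    : single thread, much slower, but one stream sees all the data
# -ms=on    : solid archive, so redundancy across files can be exploited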

So I have hundreds of ~1-minute videos recorded on my phone ~10 years ago, and they generally don't have great compression, nor are they stored in a modern, efficient video format.

For archiving purposes, I want to take advantage of my workstation's mighty GPU to re-encode them so that the quality stays approximately the same but the file size is strongly reduced.

That said, compressing video is terribly hard, and way more complex than compressing pictures, so I wouldn't really know how to do this: what format to use, what codec, what bitrate, what parameters to keep an eye on, etc.

I don’t care if the compression takes a lot of time, I just want smaller but good looking videos.

Any tips? (Links to guides and tutorials are ok too)

Also, unfortunately I am forced to use Windows for this (don’t ask me why 🫠), but I know nothing about Windows because I hate it. Practical software suggestions are very much welcome, too!
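Not the poster's answer, just one commonly suggested starting point: when encode time doesn't matter, CPU x265 typically beats GPU encoders on size at equal quality, and ffmpeg runs fine on Windows. Filenames and the CRF value are placeholders to tune:

$ ffmpeg -i input.mp4 -c:v libx265 -crf 24 -preset slow -c:a copy output.mkv
# -crf 24      : constant quality; lower = better quality, bigger file (try 22-28)
# -preset slow : spends more CPU time to shrink the file
# -c:a copy    : leaves the audio track untouched

If the GPU really must do the work, hevc_nvenc (NVIDIA) is the usual drop-in for libx265, at some cost in compression efficiency.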

#ffmpeg #help #askFedi

Fascinating.

tmp $ wc -c < somefile.xopp 
735772
tmp $ file somefile.xopp 
somefile.xopp: gzip compressed data, from Unix, original size modulo 2^32 2086031
tmp $ gunzip < somefile.xopp |file -
/dev/stdin: XML 1.0 document, ASCII text, with very long lines (12483)
tmp $ gunzip < somefile.xopp |wc
    937  204466 2086031
tmp $ gunzip < somefile.xopp |bzip2 -9 |wc -c
619543
tmp $ gunzip < somefile.xopp |bzip3 |wc -c
575115
tmp $ gunzip < somefile.xopp |xz -9e |wc -c
519764
tmp $ gunzip < somefile.xopp |grep -m1 "^.stroke" |cut -c 1-160
<stroke tool="pen" color="#3333ccff" width="2.26 0.72691752 0.73026261 0.73809079 0.74588449 0.74364294 0.72915908 0.71467521 0.71133013 0.70908858 0.7057435 0.
tmp $ gunzip < somefile.xopp |grep -oE "\<[0-9]+\.[0-9]+\>" |wc -l
201692
tmp $ echo "735772/201692" |bc -l
3.64799793744917993772
tmp $ echo "519764/201692" |bc -l
2.57701842413184459472
tmp $ echo "2086031/201692" |bc -l
10.34265612914741288697
tmp $ 
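In other words: the file is dominated by ~200k coordinate numbers stored as ASCII, which cost about 10.3 bytes each in the raw XML, about 3.6 each under the built-in gzip, and about 2.6 each under xz -9e.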

#Compression #XML #Xournal #Xournalpp #Xournal++

So @rl_dane introduced me to #bzip3 as a replacement for #bzip2. Let's turn some bz2 files into bz3 and see the difference.

First example: 90k opus files

The "hey snips" wake-word dataset: ~90k Opus files in a 3.1 GB tar. bzip2 produces the same 3.1 GB, which is expected since Opus data is already compressed. bzip3 managed 3.0 GB but used tons of computation power. Not worth it for 100 MB.
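(For each case the comparison boils down to something like the following; -k keeps the input around, and bzip3 is left at its defaults since its tuning flags vary by version:)

$ bzip2 -9 -k dataset.tar   # -> dataset.tar.bz2
$ bzip3 -e dataset.tar      # -> dataset.tar.bz3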

Second example: Windows 7 virtual box VM image

Windows7.vdi is a Windows 7 VM image kept for the "special" days. I think I have to get rid of it, but while it's still there, let's see how each performs. Uncompressed it's 16 GB; bzip2 -9 gets it to 7.0 GB, while bzip3 reaches 6.3 GB at the expense of roughly 3x the CPU time. Deleting all of them anyway. Down with Windows.

Third example: Pure XML text file

A pure XML file of Persian and English text. Uncompressed it's 1.7 GB; bzip2 -9 gives 276 MB, while bzip3 gives 260 MB.

Final example: Creating a simple compression bomb

So I did this:

dd if=/dev/zero of=./justzero bs=2G count=6   # 6 x 2 GB = 12 GB of zeros

So now I have a 12 GB file containing only zero bytes. bzip2 -9 squeezes it down to 672 KB; bzip3 manages 46 KB.
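The gap is presumably mostly block size: bzip2 never looks at more than 900 kB at a time, so even pure zeros pay the per-block overhead thousands of times over, while bzip3 can work with far larger blocks and amortize that cost.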

Conclusion

Thank you @rl_dane

Real nice thing!

#compression #gzip #zip

So I was short on storage on my archive drive. I saw the LibreWolf source code sitting there: a tar.gz of ~800 MB. I decompressed it, then recompressed it with bzip2 -9, and now it's ~600 MB. Generally #bzip2 compresses this kind of data better than #gzip.

Edit: But don't use bzip2 -9 for everything, everywhere. Sometimes -4 compresses about as well as -9, with -9 being tons slower. There's also pbzip2 to use all your CPU cores.
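A sketch of that recompression, with a placeholder filename; the piped variant spares the crowded drive the intermediate tar and lets pbzip2 use every core:

$ gunzip librewolf.tar.gz && bzip2 -9 librewolf.tar
# or, in one pass and in parallel:
$ gunzip -c librewolf.tar.gz | pbzip2 -9 > librewolf.tar.bz2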