I've discovered an interesting bug.
For my own needs, I wanted to save bandwidth while sending JPEGs over the internet.
Luckily, I know C# and WPF, and a quick search showed that JpegBitmapEncoder was designed with size optimization in mind. I wanted to know how much I could save, so I decided to do a little coding. The idea was to take a bunch of files and measure the compressed size at several quality levels.
using System;
using System.Collections.Generic;
using System.Linq;
using System.Text;
using System.IO;
using System.Windows.Media.Imaging;
using System.Reflection;
using System.Threading.Tasks;
using System.Threading;
using System.Diagnostics;

class Program
{
    static int step = 1;
    StreamWriter swLog;

    void Log(string s, params object[] args)
    {
        Console.WriteLine(s, args);
        if (swLog != null)
            swLog.WriteLine(s, args);
    }

    /// <summary>
    /// Prints beautiful numbers like 1 234 567
    /// </summary>
    static string a2s(long l)
    {
        if (l == 0)
            return "0";
        var lst = new List<char>();
        const int i0 = (int)'0';
        for (int cnt = 0; l > 0; cnt++)
        {
            lst.Add((char)(i0 + (l % 10)));
            if (cnt % 3 == 2)
                lst.Add(' ');
            l /= 10;
        }
        lst.Reverse();
        return new string(lst.ToArray());
    }

    static long AsCompressed(BitmapFrame bf, int quality)
    {
        var stream = new MemoryStream();
        var enc = new JpegBitmapEncoder();
        //Console.WriteLine("{0}", enc.QualityLevel); // check for default value: it prints 75
        enc.QualityLevel = quality;
        enc.Frames.Add(bf);
        enc.Save(stream);
        return stream.Length;
    }

    static long AsCompressed(BitmapSource bs, int quality)
    {
        var bf = BitmapFrame.Create(bs);
        return AsCompressed(bf, quality);
    }

    void ExperimentWith(BitmapSource bs)
    {
        bs.Freeze();
        var p = Process.GetCurrentProcess();
        for (int q = step; q < 100; q += step)
        {
            var tsBefore = p.TotalProcessorTime;
            var cmp = AsCompressed(bs, q);
            var tsSpent = p.TotalProcessorTime - tsBefore;
            Log("q = {0,3} length = {1,9} time = {2} ms", q, a2s(cmp), tsSpent);
        }
    }

    void DoTheJob()
    {
        var di = new DirectoryInfo(".");
        var ffi = di.GetFiles("*.png")
                    .Union(di.GetFiles("*.jpg"));
        foreach (var fi in ffi)
        {
            Log("{0,40} ||| {1}", fi.Name, a2s(fi.Length));
            using (var stream = new FileStream(fi.FullName, FileMode.Open))
            {
                var bf = BitmapFrame.Create(stream);
                ExperimentWith(bf);
            }
        }
    }

    void Run()
    {
        using (swLog = new StreamWriter("Testing JpegBitmapEncoder.txt"))
            DoTheJob();
    }

    static void Main(string[] args)
    {
        if (args.Length > 0)
            int.TryParse(args[0], out step);
        new Program().Run();
    }
}
The typical output looks like this:
                            DSC04936.JPG ||| 4 807 010
q =   5 length =   390 219 time = 00:00:00.8424054 ms
q =  10 length =   501 659 time = 00:00:00.2028013 ms
q =  15 length =   598 753 time = 00:00:00.2028013 ms
q =  20 length =   688 350 time = 00:00:00.2028013 ms
q =  25 length =   778 698 time = 00:00:00.2184014 ms
q =  30 length =   854 326 time = 00:00:00.2184014 ms
q =  35 length =   912 867 time = 00:00:00.2340015 ms
q =  40 length = 1 005 425 time = 00:00:00.2184014 ms
q =  45 length = 1 070 221 time = 00:00:00.2340015 ms
q =  50 length = 1 159 822 time = 00:00:00.2340015 ms
q =  55 length = 1 219 095 time = 00:00:00.2340015 ms
q =  60 length = 1 318 958 time = 00:00:00.2496016 ms
q =  65 length = 1 406 811 time = 00:00:00.2340015 ms
q =  70 length = 1 549 384 time = 00:00:00.2496016 ms
q =  75 length = 3 420 505 time = 00:00:00.3432022 ms
q =  80 length = 2 063 555 time = 00:00:00.2652017 ms
q =  85 length = 2 398 605 time = 00:00:00.2808018 ms
q =  90 length = 3 277 471 time = 00:00:00.3276021 ms
q =  95 length = 4 043 037 time = 00:00:00.3588023 ms
I went on and tested a hundred different files. There is always a peak at 75; even step = 1 shows it.
And yes, 75 is the default value, as I discovered next.
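The check is trivial to reproduce: the commented-out line in the program above prints the encoder's QualityLevel before it is assigned. As a standalone sketch (WPF, so it needs a reference to PresentationCore and an STA thread):

using System;
using System.Windows.Media.Imaging;

class DefaultQualityCheck
{
    [STAThread]
    static void Main()
    {
        var enc = new JpegBitmapEncoder();
        // On the affected framework versions this prints 75 —
        // the same value where the size/time spike occurs.
        Console.WriteLine(enc.QualityLevel);
    }
}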
And wow, the bug has apparently been there since October 15, 2008, as Google kindly informed me when I searched for "JpegBitmapEncoder 75".
Not only is the output significantly bigger, it's also slower to produce. There must be some hard-core optimization under the hood.
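If you need a quality around 75, one possible workaround is simply to sidestep the anomalous value. This is only a sketch under the assumption, suggested by the step = 1 runs above, that the spike occurs at exactly 75 and nowhere else; AsCompressedSafe is my own hypothetical name:

static long AsCompressedSafe(BitmapFrame bf, int quality)
{
    var enc = new JpegBitmapEncoder();
    // Assumption: only q = 75 exhibits the spike, so nudging
    // to 76 gives near-identical visual quality without it.
    enc.QualityLevel = (quality == 75) ? 76 : quality;
    using (var stream = new MemoryStream())
    {
        enc.Frames.Add(bf);
        enc.Save(stream);
        return stream.Length;
    }
}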