ImageSqueeze - lossy image codec

Program author: Svjatoslav Agejenko
Homepage: http://svjatoslav.eu
Email: svjatoslav@svjatoslav.eu

1 General


1.1 Source code

  • Download latest snapshot in TAR GZ format
  • Browse Git repository online
  • Clone Git repository using the command:

    git clone https://www2.svjatoslav.eu/git/imagesqueeze.git

2 Overview

ImageSqueeze is a lossy image codec, optimized for photos. I developed
it to test out image compression ideas.
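
The comparison further below was produced by compressing the photo and
then decompressing it again. A minimal round-trip harness for that kind
of experiment is sketched here; ImageIO's built-in JPEG writer serves
only as a stand-in encoder so the sketch runs on a stock JDK. It is not
the ImageSqueeze codec itself, which lives in the repository linked
above.

  import java.awt.image.BufferedImage;
  import java.io.ByteArrayInputStream;
  import java.io.ByteArrayOutputStream;
  import java.io.File;
  import javax.imageio.ImageIO;

  // Round trip: load a photo, compress it into memory, decode it again
  // and report the compressed size. The JPEG writer is a placeholder
  // standing in for the actual ImageSqueeze encoder.
  public class RoundTripSketch {
      public static void main(String[] args) throws Exception {
          BufferedImage original = ImageIO.read(new File(args[0]));

          // JPEG cannot store an alpha channel, so copy onto a plain RGB image.
          BufferedImage rgb = new BufferedImage(
                  original.getWidth(), original.getHeight(),
                  BufferedImage.TYPE_INT_RGB);
          rgb.createGraphics().drawImage(original, 0, 0, null);

          ByteArrayOutputStream buffer = new ByteArrayOutputStream();
          ImageIO.write(rgb, "jpg", buffer);               // stand-in encoder
          byte[] compressed = buffer.toByteArray();

          BufferedImage restored = ImageIO.read(new ByteArrayInputStream(compressed));
          System.out.printf("%d x %d pixels -> %d bytes compressed%n",
                  restored.getWidth(), restored.getHeight(), compressed.length);
          ImageIO.write(restored, "png", new File("restored.png"));
      }
  }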

I believe my algorithm has some interesting advantages; they are listed
in section 3.1 below.

Below are the original photo and the same image after being compressed
down to ~93 KB and then decompressed.

[Figure: originalAndCompressed.png]

When looking very closely, slight graininess, loss of color precision,
and blurriness (loss of detail) can be noticed as compression
artifacts. Still, sharp edges are always preserved, and the block
artifacts typical of JPEG never appear. I think that is an awesome
result for just ~2.5 bits per pixel on that color photo.
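
As a quick sanity check of those numbers, the small program below relates
the quoted ~93 KB file size and ~2.5 bits per pixel to a pixel count and
to the size of an uncompressed 24-bit image. It assumes 1 KB = 1024 bytes;
the photo's exact dimensions are not stated in this document.

  // Rough arithmetic behind the figures quoted above.
  public class BitsPerPixel {
      public static void main(String[] args) {
          long compressedBytes = 93L * 1024;  // ~93 KB, assuming 1 KB = 1024 bytes
          double bitsPerPixel = 2.5;          // ~2.5 bpp quoted above

          double pixels = compressedBytes * 8 / bitsPerPixel;
          double uncompressedKB = pixels * 3 / 1024;      // 24-bit RGB baseline
          System.out.printf("covers roughly %.0f pixels%n", pixels);
          System.out.printf("uncompressed 24-bit size: about %.0f KB%n", uncompressedKB);
          System.out.printf("compression ratio: about %.1f : 1%n", 24 / bitsPerPixel);
      }
  }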

3 Algorithm description


3.1 Algorithm advantages

  • It can be applied to any number of dimensions, even to sound and
    volumetric data.
  • The algorithm can operate in both lossy and lossless mode.
  • The algorithm naturally handles progressive loading: a low-resolution
    thumbnail of the entire image is available immediately and gradually
    becomes denser throughout the loading process (see the sketch after
    this list).
  • The algorithm naturally supports variable resolution: different areas
    can be encoded with different resolutions / pixel densities.
  • Fast: very few computations per pixel.
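
To make the progressive-loading behavior concrete, here is a small,
self-contained sketch. It is not the ImageSqueeze algorithm or bitstream
format; it only mimics the viewing experience the list describes: the
whole image appears immediately as coarse block averages and gets denser
with every pass.

  import java.awt.image.BufferedImage;
  import java.io.File;
  import javax.imageio.ImageIO;

  // Generic illustration of progressive refinement (not the actual
  // ImageSqueeze codec): each pass halves the block size, so a preview
  // of the whole image gets gradually denser.
  public class ProgressivePreview {

      // Paint dst with per-block average colors of src.
      static void renderPass(BufferedImage src, BufferedImage dst, int block) {
          for (int y = 0; y < src.getHeight(); y += block)
              for (int x = 0; x < src.getWidth(); x += block) {
                  int w = Math.min(block, src.getWidth() - x);
                  int h = Math.min(block, src.getHeight() - y);
                  long r = 0, g = 0, b = 0;
                  for (int j = 0; j < h; j++)
                      for (int i = 0; i < w; i++) {
                          int rgb = src.getRGB(x + i, y + j);
                          r += (rgb >> 16) & 0xFF;
                          g += (rgb >> 8) & 0xFF;
                          b += rgb & 0xFF;
                      }
                  long n = (long) w * h;
                  int avg = (int) ((r / n) << 16 | (g / n) << 8 | (b / n));
                  for (int j = 0; j < h; j++)
                      for (int i = 0; i < w; i++)
                          dst.setRGB(x + i, y + j, avg);
              }
      }

      public static void main(String[] args) throws Exception {
          BufferedImage src = ImageIO.read(new File(args[0]));
          BufferedImage dst = new BufferedImage(
                  src.getWidth(), src.getHeight(), BufferedImage.TYPE_INT_RGB);
          for (int block = 64; block >= 1; block /= 2) {  // 64, 32, ..., 1
              renderPass(src, dst, block);
              ImageIO.write(dst, "png", new File("preview-" + block + ".png"));
          }
      }
  }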

4 TODO Things to improve

  • Documentation describing the idea behind the algorithm is still
    missing (lack of time).
  • Code documentation is weak.
  • Better sample applications are needed: a command-line image
    conversion utility and an image viewer.