
Mad Penguin (Administrator)
Data compression from Google ...
« on: July 30, 2013, 04:25:57 pm »
There's quite a nice compression library called "snappy" knocking about on Google Code for anyone who's looking for something quick to compress / decompress data in their code. Here's the URL: https://code.google.com/p/snappy/

To compress:

Code: [Select]
  // at the top of the file - snappy-c.h is the C binding header shipped with the library
  #include <stdio.h>
  #include <snappy-c.h>

  char inp_buf[4096];
  char out_buf[5200];  // watch this, compressed data 'can' be larger than the raw data!
  size_t inp_len = sizeof(inp_buf);
  size_t out_len = sizeof(out_buf);

  snappy_status status = snappy_compress(inp_buf, inp_len, out_buf, &out_len);
  if (status != SNAPPY_OK) printf("** Compression Error! (%d)\n", status);
  // on success, out_len now holds the actual compressed size
And to decompress (assuming compressed_len is the exact size snappy_compress reported back in out_len):

Code: [Select]
  char inp_buf[5200];              // holds the compressed data
  char out_buf[4096];
  size_t inp_len = compressed_len; // must be the exact compressed size, not sizeof(inp_buf) -
                                   // trailing garbage will make decompression fail
  size_t out_len = sizeof(out_buf);

  snappy_status status = snappy_uncompress(inp_buf, inp_len, out_buf, &out_len);
  if (status != SNAPPY_OK) printf("** Decompression Error! (%d)\n", status);
  // on success, out_len now holds the number of uncompressed bytes written
It should be able to compress at around 250 MB/s and decompress at around 500 MB/s on a modern processor. Empty 4K blocks compress by as much as 20:1, and typical uncompressed data averages better than 2:1, which isn't a bad saving :) After messing around with other libraries I was fairly impressed by how clean the implementation is and how easily it slots into existing code.
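If anyone wants to see the ratio for themselves, here's a small self-contained round-trip sketch (the zero-filled buffer mirrors the "empty 4K block" case above; link with -lsnappy):

Code: [Select]
  #include <stdio.h>
  #include <string.h>
  #include <snappy-c.h>

  int main(void)
  {
      char inp_buf[4096];                  // raw data - an "empty" (zero-filled) 4K block
      char cmp_buf[5200];                  // big enough for the worst case on a 4K input
      char out_buf[4096];
      size_t cmp_len = sizeof(cmp_buf);
      size_t out_len = sizeof(out_buf);

      memset(inp_buf, 0, sizeof(inp_buf));

      if (snappy_compress(inp_buf, sizeof(inp_buf), cmp_buf, &cmp_len) != SNAPPY_OK)
          return 1;
      printf("4096 bytes -> %zu bytes compressed\n", cmp_len);

      if (snappy_uncompress(cmp_buf, cmp_len, out_buf, &out_len) != SNAPPY_OK)
          return 1;
      printf("round trip ok: %s\n", memcmp(inp_buf, out_buf, out_len) == 0 ? "yes" : "no");
      return 0;
  }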

 

