Hi!

In this file I will try to explain how I did Blue's.com ...
First I analysed the given text.txt. Counting the words gives you:

 count:   9  word: ARE
 count:   8  word: THE
 count:   7  word: OF
 count:   6  word: PLACES
 count:   6  word: SOME
 count:   5  word: IN
 count:   5  word: PEOPLE
 count:   4  word: EARTH
 count:   4  word: LIVE
 count:   4  word: PLANTS
 count:   3  word: ALL
 count:   3  word: DIFFERENT
 count:   3  word: KINDS
 count:   3  word: OTHER
 count:   3  word: THAT
 count:   3  word: YOU
 count:   2  word: AND
 count:   2  word: ANIMALS
 count:   2  word: CAN
 count:   2  word: EAT
 count:   2  word: FOOD
 count:   2  word: FROM
 count:   2  word: LIKE
 count:   2  word: LIVING
 count:   2  word: MANY
 count:   2  word: THINGS
 count:   2  word: WAYS
 count:   2  word: YOURS

 ----------------------------------------------------------------------------
 
 count:   1  word: -
 count:   1  word: AIR
 count:   1  word: ALWAYS
 count:   1  word: BUT
 count:   1  word: CERTAIN
 count:   1  word: CHANGE
 count:   1  word: COLD
 count:   1  word: COMES
 count:   1  word: DO
 count:   1  word: DRY
 count:   1  word: FIND
 count:   1  word: FOR
 count:   1  word: GREEN
 count:   1  word: HILLY
 count:   1  word: HOW
 count:   1  word: IMPORTANT
 count:   1  word: INTO
 count:   1  word: IS
 count:   1  word: KIND
 count:   1  word: LAND
 count:   1  word: LEVEL
 count:   1  word: MATERIALS
 count:   1  word: MUCH
 count:   1  word: MUST
 count:   1  word: NOW
 count:   1  word: ON
 count:   1  word: ONLY
 count:   1  word: OR
 count:   1  word: OTHERS
 count:   1  word: OUR
 count:   1  word: OUT
 count:   1  word: PLACE
 count:   1  word: PLANET
 count:   1  word: QUITE
 count:   1  word: RAINY
 count:   1  word: REALLY
 count:   1  word: REASON
 count:   1  word: SAME
 count:   1  word: SEE
 count:   1  word: SHARE
 count:   1  word: SO
 count:   1  word: SUNLIGHT
 count:   1  word: SUNSHINE
 count:   1  word: THERE
 count:   1  word: THIS
 count:   1  word: TO
 count:   1  word: USE
 count:   1  word: VERY
 count:   1  word: WARM
 count:   1  word: WATER
 count:   1  word: WHICH
 count:   1  word: WHY
 count:   1  word: WILL
 count:   1  word: WINTER

 total number of words in text: 154
 total number of different words: 82
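The counts above can be reproduced with a few lines of Python. This is a hypothetical reconstruction; the tool actually used (and its exact tokenising rules) is not part of this file:

```python
# Hypothetical reconstruction of the word count shown above; upper-casing
# first makes "The" and "the" fall together, as in the list above.
import re
from collections import Counter

def count_words(text):
    # Letters-only tokenising is an assumption about the original tool.
    return Counter(re.findall(r"[A-Z]+", text.upper()))

counts = count_words("Some plants are green. Some plants are not.")
for word, n in sorted(counts.items(), key=lambda kv: (-kv[1], kv[0])):
    print(f" count: {n:3d}  word: {word}")
```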

 Looking at the characters used:
  ',':     6
  '-':     1
  '.':    13
  '?':     2
  'B':     1
  'I':     1
  'N':     1
  'O':     1
  'P':     2
  'S':     6
  'T':     2
  'Y':     1
  'a':    61
  'c':    15
  'd':    16
  'e':    85
  'f':    19
  'g':     7
  'h':    33
  'i':    41
  'k':     6
  'l':    47
  'm':    19
  'n':    44
  'o':    48
  'p':    21
  'q':     1
  'r':    42
  's':    37
  't':    47
  'u':    14
  'v':     8 
  'w':    11
  'y':    16

 total number of different characters: 34
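That character count is the tight part: 34 distinct characters would not fit into the 32 values of 5 bits directly. Folding the eight capitals into lower case, as the encoding described below does, leaves only 27 distinct symbols. A quick check, with the symbol sets transcribed by hand from the table above (so treat it as illustrative):

```python
# Feasibility check for 5-bit character codes, using the character
# sets from the table above.
upper = set("BINOPSTY")                  # capitals occurring in the text
lower = set("acdefghiklmnopqrstuvwy")    # lower-case letters in the text
punct = set(",-.?")

folded = lower | {c.lower() for c in upper} | punct
needed = len(folded) + 1   # +1 for the upcase-flag code; spaces and
                           # line breaks are handled separately
print(len(folded), needed, needed <= 32)   # → 27 28 True
```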

So my first idea was - and I'm still using it - to include the words that
appear multiple times only once and remember where to use them.
Another thing was to encode the used characters in only 5 bits, which is
possible if you use a flag that tells you when the next character should be
upcased, instead of including all the used upcased characters.
The third trick I used is to encode the number of spaces in front of each
word in 2 extra bits.
And last but not least, I do not encode the LF/CR bytes, but fill them
in after decoding the text by counting characters ...
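The second trick can be sketched like this; the concrete alphabet and the value of the shift code are invented for illustration, since the original's code table is not listed in this file:

```python
# Sketch of the upcase-flag idea: every character is a 5-bit code, and a
# reserved SHIFT code means "upcase the next character". The alphabet
# and the code numbering here are made up for the example.
ALPHABET = "abcdefghijklmnopqrstuvwxyz,.?-"   # 30 symbols, codes 0-29
SHIFT = 30                                    # code 30: upcase next char

def encode5(text):
    codes = []
    for ch in text:
        if ch.isupper():
            codes.append(SHIFT)   # one flag code instead of 8 capital codes
            ch = ch.lower()
        codes.append(ALPHABET.index(ch))
    return codes

def decode5(codes):
    out, shift = [], False
    for c in codes:
        if c == SHIFT:
            shift = True
        else:
            ch = ALPHABET[c]
            out.append(ch.upper() if shift else ch)
            shift = False
    return "".join(out)

assert decode5(encode5("Earth.")) == "Earth."
```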

These thoughts resulted in the following bitstream to encode the text:

            Bit   description
           
                1 0 - Chars follow
                  1 - Word follows
            2 - 3 Number of Spaces in front of word/chars (0-3)
           
 Chars:     4 - 6 number of chars that follow (1-9)
            7 -11 1st character (5 bit)
           12 -16 2nd character (5 bit)
             ...  and so on :-)
           
 Word:      4 - 8 Wordnumber

The words used multiple times are encoded like the Chars in a separate
bitstream. This way, the encoded text uses only about 400 bytes.
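The layout above can be read with a small bit-reader. The following Python sketch is only an illustration of the format, not the original crunched decoder; the MSB-first bit order, the word table (the 28 repeated words), and reading the 3-bit char count as value+1 are my assumptions:

```python
# Decoder sketch for the bitstream layout above (bit order and field
# interpretations are assumptions, see the note above).
class BitReader:
    def __init__(self, data):
        self.data, self.pos = data, 0

    def read(self, n):
        # Read n bits, most significant bit of each byte first.
        v = 0
        for _ in range(n):
            byte = self.data[self.pos // 8]
            v = (v << 1) | ((byte >> (7 - self.pos % 8)) & 1)
            self.pos += 1
        return v

def decode(stream, alphabet, words, n_items):
    r, out = BitReader(stream), []
    for _ in range(n_items):
        is_word = r.read(1)          # bit 1: 0 = chars follow, 1 = word
        out.append(" " * r.read(2))  # bits 2-3: spaces in front (0-3)
        if is_word:
            out.append(words[r.read(5)])   # bits 4-8: word number
        else:
            length = r.read(3) + 1         # bits 4-6: char count
            out.append("".join(alphabet[r.read(5)] for _ in range(length)))
    return "".join(out)
```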

The only thing left was to write a small decoder. My first implementation
needed approx. 200 bytes - but I was able to crunch it to about 150 bytes.


