reply to post by sirnex
Again, you don't have a clue as to what you're talking about.
You're just quoting stuff on information theory and saying look, this agrees with me.
You sound idiotic.
I ask you this:
WHAT?????
You're truly making yourself look bad because it's obvious you're making it up as you go.
Where has anyone said you can only store high information entropy on bits? LOL
Where has anyone used the term meaningful packets of information?
Where has anyone used the term meaningless packets of bits?
What are packets of bits and what makes them meaningless?
What are packets of information and where are these packets found and what equation governs these packets?
You said this:
What are you talking about? Are you now saying you don't agree with information theory at all?
Where does information theory or Shannon talk about meaningful packets of bits or meaningless chunks of bits?
This is just idiotic.
A bit is a unit of information. So a string of letters is information; it's just in a high information entropy state. It's not meaningless chunks of bits LOL.
I can take a string of letters or a pile of rocks and order them in a way to give them a low information entropy state.
Both of these states are bits of information.
You're just quoting stuff and you have no clue what it means.
You quoted this:
Shannon's entropy represents an absolute limit on the best possible lossless compression of any communication, under certain constraints:
treating messages to be encoded as a sequence of independent and identically-distributed random variables, Shannon's source coding theorem shows
that, in the limit, the average length of the shortest possible representation to encode the messages in a given alphabet is their entropy divided by
the logarithm of the number of symbols in the target alphabet.
Idiot,
What this is saying is that even abcdefghijklmnop...... are bits of information, and when you order these letters they are just in a lower state of entropy because you have reduced uncertainty.
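Here's a quick sketch of what I mean (my own illustration, not anything from Shannon's paper; the helper name shannon_entropy is just something I made up), estimating entropy from character frequencies:

```python
import math
from collections import Counter

def shannon_entropy(s):
    """Estimate Shannon entropy in bits per symbol from character frequencies."""
    counts = Counter(s)
    n = len(s)
    return sum(-(c / n) * math.log2(c / n) for c in counts.values())

# One repeated letter carries zero uncertainty per symbol;
# 16 distinct, equally frequent letters carry log2(16) = 4 bits per symbol.
print(shannon_entropy("aaaaaaaaaaaaaaaa"))  # 0.0
print(shannon_entropy("abcdefghijklmnop"))  # 4.0
```

Both strings are made of bits of information; the ordered one just has less uncertainty per symbol.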
Again, show me in one peer reviewed paper on information theory where they talk about "meaningless chunks of bits."
Like smoke signals.
Smoke contains bits of information, and when these bits are turned into signals they go from a high information entropy state to a low one because you have reduced uncertainty.
You say some of the dumbest things like this:
Bits, plural, more than one as in a random string of bits don't contain the information to build a car as is dictated by Shannon entropy. So,
your now saying Shannon entropy is pure nonsense?
Yes, random bits contain the information to build a car. Random bits actually contain more information than ordered bits.
For instance, a junky room will contain more information than a clean and ordered room.
This is because the junky room has more possible ways to be arranged than the clean and ordered room. So a junky room is in a high state of entropy while a clean and ordered room is in a low state of entropy.
We then convert energy from the sun in order to do work and order the room.
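This is just counting arrangements. Here's a rough sketch of the idea (the numbers are mine, and the room analogy is loose):

```python
from math import factorial, log2

# 16 distinct objects scattered around a room can be arranged in 16! ways;
# that huge multiplicity is what makes the "junky" state high entropy.
arrangements = factorial(16)
print(arrangements)        # 20922789888000
print(log2(arrangements))  # about 44.25 bits of uncertainty over which arrangement you see
```

The tidy room is one particular arrangement out of all of those, so knowing the room is tidy removes almost all of that uncertainty.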
So the alphabet will contain more information than this sentence: I went to the park.
This is because the alphabet can be ordered in more ways than the sentence can.
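You can see the compression bound from the quote directly: a repetitive, low-entropy string squeezes down to almost nothing, while near-uniform bytes barely compress. A sketch using Python's zlib (seeded so the sizes are repeatable; the exact byte counts don't matter, the ratio does):

```python
import random
import zlib

random.seed(0)
low = ("I went to the park. " * 50).encode()              # 1000 bytes, very repetitive
high = bytes(random.randrange(256) for _ in range(1000))  # 1000 near-uniform bytes

print(len(zlib.compress(low)))   # tiny: the redundancy compresses away
print(len(zlib.compress(high)))  # roughly 1000: nothing left to squeeze
```

That's Shannon's source coding theorem in action: the high-entropy string needs more bits to represent, which is exactly what "contains more information" means here.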
Like I said you don't understand what you're talking about.
Why do you think they hide messages in a string of bits? It's because the string of bits contains more information.
Have you ever heard of steganography?
If I send a message that says "Meet Pete at three," it is in a low state of entropy. If I type abmadebyetrtdspage..... the message "Meet Pete at three" is embedded after every two letters. This is in a high information entropy state.
If I wanted to hide the message "meet Pete at three," I would do so in a string of bits because it contains more information.
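The scheme I just described, two cover letters then one message letter, is trivial to sketch (the helper names embed and extract are my own, and the cover text is arbitrary):

```python
import itertools

def embed(secret, cover):
    """After every two cover letters, insert one letter of the secret."""
    pad = itertools.cycle(cover)
    return "".join(next(pad) + next(pad) + ch for ch in secret)

def extract(stego):
    """The secret sits at positions 2, 5, 8, ... -- every third letter."""
    return stego[2::3]

stego = embed("meetpeteatthree", "ab")
print(stego)           # abmabeabeabt...
print(extract(stego))  # meetpeteatthree
```

To anyone who doesn't know the rule, the stego string looks like a high-entropy jumble, but the message is right there in it.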
There's no such thing as meaningless chunks of bits. Shannon is talking about uncertainty. A random string of bits has more uncertainty than an ordered string of bits.
Again, there's no such thing as meaningless chunks of bits LOL.
Try typing a string of meaningless chunks of bits.
This random string of bits you keep quoting contains information.
fHJZXpVVbuqKbaazaaw
This information is in a high information entropy state but it's not meaningless. It's just information with more uncertainty than the sentence:
I went to the park.
You really don't understand what you're talking about. This is why you keep quoting these things out of context but you can't explain what you're
saying. This is because you don't understand it.
Bits and qubits store both high information entropy and low information entropy. There are no meaningless chunks of bits. Again, please type a string
of meaningless chunks of bits or show me where anyone in information theory used this silly term LOL.
You still haven't answered any of these questions:
Where did this sentient life gather the information from? Did the information just pop into existence when the sentient being was born? So are you saying the laws of physics stored this information, or that the information appeared out of nowhere when the laws of physics accidentally made sentient beings that are full of information that just popped into existence? Do sentient beings have the power to move the laws of physics to create information out of nothing? Where do sentient beings get this capacity?
You're really making yourself look bad. I suggest you read up on information theory before you try to debate it and you start making stuff up.
Shannon entropy is a measure of uncertainty not meaningless chunks of bits.
[edit on 19-5-2010 by Matrix Rising]