Originally posted by boaby_phet
Originally posted by sirnex
First I have to laugh.
Secondly, I have to ask where does this capacity for atoms to form a car exist until a car is actually formed?
From the laws of physics (for lack of better words), and no, I don't mean laws as defined by human language. I mean the set of forces and constants that pre-existed before any sentient life could impart "information" in the form of language or knowledge.
Again, your response begs me to ask you:
In what context do you define information when claiming information is unbounded by space and time?
[edit on 18-5-2010 by sirnex]
Our predetermined path is determined by the choices we make before we make them (think about that one for a second).
From the laws of physics (for lack of better words), and no, I don't mean laws as defined by human language. I mean the set of forces and constants that pre-existed before any sentient life could impart "information" in the form of language or knowledge.
Where did this sentient life gather the information from?
You're just stalling because you're not making any sense. This is because your "entropy is everything" theory is a made-up fantasy in your mind. It has no equations, published papers, or tested theories to support it.
I keep saying information in the form of bits and qubits.
These bits and qubits are stored on things like atoms and subatomic particles.
Since we know that matter and energy can be neither created nor destroyed, you can't create new information, because you can't create new matter or energy.
So the universe contains all the information from the past and the future of our universe. It goes even further but I will stop right there.
Sentient beings can just process and discover this information.
How do you explain Programming to your kids?
I tell them that it says everything in the universe is made of bits. Not chunks of stuff, but chunks of information - ones and zeros.
I've just put on your magic glasses, and looking around I see that, oh my gosh, everything is computing. Is this just fashionable?
Computers are our favorite metaphor at the moment, so maybe we see everything as computers. But this view is not that facile. Statistical mechanics, which underlies all chemistry, grew out of the realization that the world is information. The mathematical definition of a bit was first postulated not during the 1930s and '40s when Claude Shannon and Norbert Wiener started information theory but by James Clerk Maxwell and Ludwig Boltzmann during their 19th-century explorations of the nature of the atom. They were working on thermodynamics, but they discovered that the world was made of information.
Where did this sentient life gather the information from? Did the information just pop into existence when the sentient being was born? So are you saying the laws of physics stored this information, or that the information appeared out of nowhere when the laws of physics accidentally made sentient beings that are full of information that just popped into existence? Do sentient beings have the power to move the laws of physics to create information out of nothing? Where do sentient beings get this capacity?
Again, you're saying information is created when it's discovered and we would have to throw out everything from quantum mechanics to information theory to accept something that makes no sense.
link
In layman's terms, information entropy is the same as randomness. A string of random letters along the lines of "fHJZXpVVbuqKbaazaaw" can be said to have high information entropy, in other words large amounts of entropy: the complete works of Shakespeare, by contrast, have lower information entropy, because when forming meaningful words certain combinations of letters are more likely to occur than others. For example in English if you see a q in a word you can be almost certain it is followed by a 'u'. In other words, using the coin example you can be more confident betting on the outcome of the next letter in a story than you would betting on the next letter in a random string.
As you can no doubt readily see, the bits may still exist, but as meaningful information, such as building a car or using language to colorfully describe an idiot, the bits need to have a lower ordered entropy.
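The entropy comparison in the quoted passage can be checked numerically with a quick per-letter Shannon entropy calculation. A minimal Python sketch; the random string is the one from the quote, and the English sample sentence is just an illustration:

```python
import math
from collections import Counter

def letter_entropy(text):
    """Per-symbol Shannon entropy (in bits) of a string's letter frequencies."""
    counts = Counter(text)
    total = len(text)
    return -sum((n / total) * math.log2(n / total) for n in counts.values())

random_string = "fHJZXpVVbuqKbaazaaw"
english_text = "to be or not to be that is the question"

# English prose reuses a few letters in predictable patterns, so its
# per-letter entropy comes out lower than the near-random string's.
print(letter_entropy(random_string))
print(letter_entropy(english_text))
```

Running this shows the random string scoring noticeably higher bits per symbol than the sentence, which is exactly the "you can bet more confidently on the next letter of a story" point.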
Where did this sentient life gather the information from? Did the information just pop into existence when the sentient being was born? So are you saying the laws of physics stored this information, or that the information appeared out of nowhere when the laws of physics accidentally made sentient beings that are full of information that just popped into existence? Do sentient beings have the power to move the laws of physics to create information out of nothing? Where do sentient beings get this capacity?
This supports what I'm saying.
Bits and qubits store these different configurations of matter, whether it's a string of letters or a car.
Again, neither matter nor energy can be created or destroyed,
so every configuration of matter is stored on bits and qubits, from the simple (poking two holes in a piece of paper) to something more complex (a car). This information is stored on the bits and qubits inherent in nature.
The problem we have here is that you're not debating anything I'm saying; you're trying to debate your own point of view and silly theories.
Now after all this time you're admitting what I have been saying from the beginning. That these bits exist. Did these bits exist 2,000 years ago? LOL
Now you're saying the bits need to have a lower ordered entropy state. Whether it's a higher information entropy state or a lower information entropy state, it's still a configuration of matter and energy that's stored on bits and qubits that you now say exist.
I also see that you keep ducking my questions. This is because you don't want a debate, you want me to debate you based on your wild beliefs and speculation.
Now, the atoms, or bits as you like to call them, hold no meaningful information about the car. They have the potential to become a car, but despite all the required bits being there, they're nowhere near a car, or information about a car.
I'm using your terminology: you defined bits as atoms, so by your definition the two terms are interchangeable in discussion. Note: meaningless chunks of bits do not equal meaningful packets of information.
These bits and qubits are stored on things like atoms and subatomic particles. Since we know that neither matter nor energy can be created or destroyed
Now you're saying the bits need to have a lower ordered entropy state. Whether it's a higher information entropy state or a lower information entropy state, it's still a configuration of matter and energy that's stored on bits and qubits that you now say exist.
meaningless chunks of bit do not equal meaningful packets of information
Where did this sentient life gather the information from? Did the information just pop into existence when the sentient being was born? So are you saying the laws of physics stored this information, or that the information appeared out of nowhere when the laws of physics accidentally made sentient beings that are full of information that just popped into existence? Do sentient beings have the power to move the laws of physics to create information out of nothing? Where do sentient beings get this capacity?
Boy, you just make up stuff as you go. Like I said, you're not trying to debate what I'm saying; you're trying to conform what I'm saying to your idiotic "entropy is everything" theory.
You had to be one of those kids in school that nobody liked.
This is just nonsense. What is this based on? Where does it say that bits can only encode meaningless information? Have you typed on your computer lately? Bits can store complex or simple information. Where do you get this nonsense?
Again, you just sound silly. You're not trying to debate; you're just running to a website and posting about information theory, which you don't understand. I remember you did this in another post, and you looked silly then and you look silly now.
Where did I say bits and atoms were interchangeable?
I never said they were interchangeable.
The problem here is that you're ignorant about these things, because your silly "entropy is everything" theory doesn't make any sense.
A key measure of information in the theory is known as entropy, which is usually expressed by the average number of bits needed for storage or communication. Intuitively, entropy quantifies the uncertainty involved when encountering a random variable. For example, a fair coin flip (2 equally likely outcomes) will have less entropy than a roll of a die (6 equally likely outcomes).
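The coin-versus-die comparison in that quoted passage can be verified directly: for n equally likely outcomes the entropy is log2(n) bits. A minimal sketch, assuming fair (uniform) distributions:

```python
import math

def uniform_entropy(n_outcomes):
    """Shannon entropy in bits of n equally likely outcomes: log2(n)."""
    return math.log2(n_outcomes)

coin = uniform_entropy(2)  # fair coin flip
die = uniform_entropy(6)   # fair six-sided die

print(coin)  # 1.0 bit
print(die)   # about 2.585 bits
```

So a die roll carries more uncertainty, hence more entropy, than a coin flip, exactly as the quote says.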
I suggest you go and read about these things before you debate them, because you're just cutting and pasting things out of context without knowing what they mean.
A key measure of information in the theory is known as entropy, which is usually expressed by the average number of bits needed for storage or communication. Intuitively, entropy quantifies the uncertainty involved when encountering a random variable. For example, a fair coin flip (2 equally likely outcomes) will have less entropy than a roll of a die (6 equally likely outcomes).
Again, I don't have any problem with Shannon and did you read what I said:
Now you're saying the bits need to have a lower ordered entropy state. Whether it's a higher information entropy state or a lower information entropy state, it's still a configuration of matter and energy that's stored on bits and qubits that you now say exist.
In layman's terms, information entropy is the same as randomness. A string of random letters along the lines of "fHJZXpVVbuqKbaazaaw" can be said to have high information entropy, in other words large amounts of entropy: the complete works of Shakespeare, by contrast, have lower information entropy, because when forming meaningful words certain combinations of letters are more likely to occur than others. For example in English if you see a q in a word you can be almost certain it is followed by a 'u'. In other words, using the coin example you can be more confident betting on the outcome of the next letter in a story than you would betting on the next letter in a random string.
What's so hard to understand about that?
First you said nothing could exist in the future or the past because of your silly "entropy is everything" theory.
A few pages ago you were throwing out everything from Einstein to information theory; now you admit that these bits exist, but you say there's a limit on how much information these bits can hold, based on your profound theory you just made up called "meaningful packets of information." LOL.
In layman's terms, information entropy is the same as randomness. A string of random letters along the lines of "fHJZXpVVbuqKbaazaaw" can be said to have high information entropy, in other words large amounts of entropy: the complete works of Shakespeare, by contrast, have lower information entropy, because when forming meaningful words certain combinations of letters are more likely to occur than others. For example in English if you see a q in a word you can be almost certain it is followed by a 'u'. In other words, using the coin example you can be more confident betting on the outcome of the next letter in a story than you would betting on the next letter in a random string.
What are meaningful packets of information? Give me the mathematical description of the meaningful packets of information LOL. This is too funny. You're just making up stuff.
For your information, bits and qubits can store "abdftyhvdersc" or "I went to the market." Bits and qubits can store both simple and complex information.
If you have bits, they store information:
11001011
10010111
01111111
10000000
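Those same four bit strings can be read back as numbers, which is one way to see that they carry information either way. A quick sketch; interpreting them as unsigned 8-bit integers is just one arbitrary choice of meaning:

```python
patterns = ["11001011", "10010111", "01111111", "10000000"]

# Each 8-bit string picks out one of 256 possible values; whether that
# value is "meaningful" depends only on the agreed interpretation.
values = [int(p, 2) for p in patterns]
print(values)  # [203, 151, 127, 128]
```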
Bits and qubits store information; there's no such thing as "meaningless chunks of bits" or "meaningful packets of information."
You're just making stuff up as you go and you look silly. You should first try to understand these things before you comment on them.
You're living in a fantasy. First you don't subscribe to quantum mechanics or Einstein, and you have this silly theory that says entropy is everything, yet you don't know anything about the theory. Now you're trying to debate information theory, and you're talking about meaningless chunks of bits and meaningful packets of information.
A bit of information can be encoded on a piece of paper by punching holes in it. Someone can look at the paper, and two holes means meet me at Burger King while one hole means meet me at Wendy's. I have just transferred information, and the universe can store simple and complex information on bits and qubits.
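That paper-and-holes scheme is literally a one-bit code. A minimal sketch; the messages are the ones from the example, and labeling one hole as 0 and two holes as 1 is an arbitrary convention:

```python
# One bit distinguishes two agreed-upon messages, exactly like the
# number of holes punched in the paper in the example above.
MEANINGS = {
    0: "meet me at Wendy's",       # one hole
    1: "meet me at Burger King",   # two holes
}

def decode(bit):
    """Recover the agreed message from the single transmitted bit."""
    return MEANINGS[bit]

print(decode(1))  # meet me at Burger King
```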
In layman's terms, information entropy is the same as randomness. A string of random letters along the lines of "fHJZXpVVbuqKbaazaaw" can be said to have high information entropy, in other words large amounts of entropy: the complete works of Shakespeare, by contrast, have lower information entropy, because when forming meaningful words certain combinations of letters are more likely to occur than others. For example in English if you see a q in a word you can be almost certain it is followed by a 'u'. In other words, using the coin example you can be more confident betting on the outcome of the next letter in a story than you would betting on the next letter in a random string.
Now, let's use our brain for a second, if you can manage that. Random invariably would be meaningless, sort of like your delusional crazy ideas. Ordered would obviously be meaningful compared to randomness. Do you agree or do you still want to make up some crazy excuses pulled out of your ass?
As you can no doubt readily see, the bits may still exist, but as meaningful information, such as building a car or using language to colorfully describe an idiot, the bits need to have a lower ordered entropy.
A bit or binary digit is the basic unit of information in computing and telecommunications; it is the amount of information that can be stored by a digital device or other physical system that can normally exist in only two distinct states. These may be the two stable positions of an electrical switch, two distinct voltage or current levels allowed by a circuit, two distinct levels of light intensity, two directions of magnetization or polarization, etc.
Where did this sentient life gather the information from? Did the information just pop into existence when the sentient being was born? So are you saying the laws of physics stored this information or that the information appeared out of nowhere when the laws of physics accidently made sentient beings that are full of information that just popped into existence? Do sentient beings have the power to move the laws of physics to create information out of nothing? Where do sentient beings get this capacity?
At first he rejected everything from QM, Einstein, information theory, theoretical physics and more; now he's trying to conform information theory to his silly theory without understanding what he's talking about.
Bits and qubits can store either a string of letters or a complex design. Bits and qubits can store every configuration of matter, and like I said earlier, many people believe information exists at the fundamental level even when space and time evolve into infinity at Planck scales.
You're a BS artist who doesn't have a clue, and before I discuss anything with you, you have to answer some questions instead of posting links out of context when it's obvious you have no clue what they mean.
A key measure of information in the theory is known as entropy, which is usually expressed by the average number of bits needed for storage or communication. Intuitively, entropy quantifies the uncertainty involved when encountering a random variable. For example, a fair coin flip (2 equally likely outcomes) will have less entropy than a roll of a die (6 equally likely outcomes).
In layman's terms, information entropy is the same as randomness. A string of random letters along the lines of "fHJZXpVVbuqKbaazaaw" can be said to have high information entropy, in other words large amounts of entropy: the complete works of Shakespeare, by contrast, have lower information entropy, because when forming meaningful words certain combinations of letters are more likely to occur than others. For example in English if you see a q in a word you can be almost certain it is followed by a 'u'. In other words, using the coin example you can be more confident betting on the outcome of the next letter in a story than you would betting on the next letter in a random string.
What are meaningful packets of information? I want you to show me one peer-reviewed paper on information theory that uses this term. You keep quoting a passage that just says high information entropy can exist in, say, a string of letters, and low information entropy can exist in, say, a sentence. Where does it say anything about bits and qubits only being able to store high information entropy? LOL.
What are meaningless chunks of bits???????
Give me one peer reviewed paper on information theory that uses this nonsense.
Here's a journal on Information Theory. Maybe you can learn something by looking for "meaningless chunks of bits" or "meaningful packets of information."
ieeexplore.ieee.org...
All I want is one peer-reviewed paper that says bits and qubits can't store "meaningful packets of information" because they're "useless chunks of bits." LOL.
When most people don't understand something they learn about it first before they debate it and that way they don't look so silly.
link
The concept of information is closely related to notions of constraint, communication, control, data, form, instruction, knowledge, meaning, mental stimulus, pattern, perception, and representation. In its most restricted technical meaning, information is an ordered sequence of symbols.
link
A bit or binary digit is the basic unit of information in computing and telecommunications; it is the amount of information that can be stored by a digital device or other physical system that can normally exist in only two distinct states. These may be the two stable positions of an electrical switch, two distinct voltage or current levels allowed by a circuit, two distinct levels of light intensity, two directions of magnetization or polarization, etc.
link
A qubit has some similarities to a classical bit, but is overall very different. Like a bit, a qubit can have two possible values—normally a 0 or a 1. The difference is that whereas a bit must be either 0 or 1, a qubit can be 0, 1, or a superposition of both.
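The superposition mentioned in that quote can be written as a pair of amplitudes whose squared magnitudes must sum to 1. A minimal sketch with plain Python, no quantum library assumed; the equal-superposition amplitudes are just the standard illustrative choice:

```python
import math

# A qubit state is a pair of amplitudes (a, b) with |a|^2 + |b|^2 = 1.
# Measuring yields 0 with probability |a|^2 and 1 with probability |b|^2.
zero = (1, 0)                                 # definitely 0, like a classical bit
one = (0, 1)                                  # definitely 1
plus = (1 / math.sqrt(2), 1 / math.sqrt(2))   # equal superposition of 0 and 1

def measure_probs(state):
    """Probabilities of reading out 0 and 1 from a qubit state."""
    a, b = state
    return abs(a) ** 2, abs(b) ** 2

print(measure_probs(zero))
print(measure_probs(plus))
```

The `zero` and `one` states behave exactly like classical bits; only `plus` shows the extra "0, 1, or both" behavior the quote describes.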
I mean, you come on a message board about science and technology and you discuss fantasy. You first talk about entropy being everything and say you don't subscribe to QM or Einstein, and you don't provide one equation, one peer-reviewed paper, or one tested theory that would get anyone with half a brain to throw out QM or Einstein in favor of your wild speculation.
Now you're talking about information theory, and it's obvious that you don't understand what you're talking about.
This is just pure nonsense and again you're just making things up because you don't understand it.
link
In information science, irrelevant or meaningless data is considered to be noise. Noise consists of a large number of transient disturbances with a statistically randomized time distribution.
link
ordering - order: the act of putting things in a sequential arrangement;
After you saw how silly you looked, because it's obvious that information has to be stored somewhere and doesn't appear out of nowhere, you now say bits exist but claim they can only store high information entropy, like a random string of bits, and can't store low information entropy, like a sentence.
WHAT?????
You're truly making yourself look bad because it's obvious you're making it up as you go.
Where has anyone said you can only store high information entropy on bits? LOL
Where has anyone used the term meaningful packets of information?
Where has anyone used the term meaningless packets of bits?
What are packets of bits and what makes them meaningless?
What are packets of information and where are these packets found and what equation governs these packets?
Again, you're talking about Shannon and random strings vs. a sentence, but what is your point? Nobody has said there aren't high information entropy states and low information entropy states.
You have to show that bits and qubits only store high information entropy states LOL.
Shannon's entropy represents an absolute limit on the best possible lossless compression of any communication, under certain constraints: treating messages to be encoded as a sequence of independent and identically-distributed random variables, Shannon's source coding theorem shows that, in the limit, the average length of the shortest possible representation to encode the messages in a given alphabet is their entropy divided by the logarithm of the number of symbols in the target alphabet.
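The limit stated in that quote, entropy divided by the log of the target alphabet size, can be computed for a toy source. A sketch under a made-up example distribution (the four probabilities are illustrative only):

```python
import math

def entropy_bits(probs):
    """Shannon entropy in bits of a discrete probability distribution."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# A toy 4-symbol source with illustrative probabilities.
probs = [0.5, 0.25, 0.125, 0.125]
H = entropy_bits(probs)  # 1.75 bits per source symbol

# Shortest possible average codeword length when encoding into a
# 4-letter target alphabet: entropy / log2(alphabet size).
alphabet_size = 4
limit = H / math.log2(alphabet_size)  # 0.875 target symbols per source symbol

print(H, limit)
```

No lossless code for this source can average fewer target symbols per message symbol than `limit`, which is what the source coding theorem guarantees.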
As you can no doubt readily see, the bits may still exist, but as meaningful information, such as building a car or using language to colorfully describe an idiot, the bits need to have a lower ordered entropy.
Again, PURE NONSENSE.
Do you even know what bits are? Bits don't require a lower entropy state; bits can store a low or high information entropy state.