This week Big G was slightly up, the charge on the electron was down and the speed of light held steady.
Originally posted by dominicus
Freedom of Speech, Censor nothing.
The talk makes various points, but keep in mind a few things:
1. Arrogance is a beeatch!!!! We think we know a lot today, but in the year 3000 the scientists of that time will look at us and either facepalm or say we were super primitive in our knowledge. It's all relative.
Originally posted by ChaoticOrder
reply to post by BlueMule
This week Big G was slightly up, the charge on the electron was down and the speed of light held steady.
Good talk. I was surprised to learn how they measure the value of "Big G", that is very interesting indeed...
After due diligence, including a survey of published scientific research and recommendations from our Science Board and our community, we have decided that Graham Hancock’s and Rupert Sheldrake’s talks from TEDxWhitechapel should be removed from distribution on the TEDx YouTube channel.
We’re not censoring the talks. Instead we’re placing them here, where they can be framed to highlight both their provocative ideas and the factual problems with their arguments. See both talks after the jump.
All talks on the TEDxTalks channel represent the opinion of the speaker, not of TED or TEDx, but we feel a responsibility not to provide a platform for talks which appear to have crossed the line into pseudoscience.
UPDATE: Please find Rupert Sheldrake’s response below the video window.
According to our science board, Rupert Sheldrake bases his argument on several major factual errors, which undermine the arguments of the talk. For example, he suggests that scientists reject the notion that animals have consciousness, despite the fact that it's generally accepted that animals have some form of consciousness, and there's much research and literature exploring the idea.
He also argues that scientists have ignored variations in the measurements of natural constants, citing as his primary example the supposedly dogmatic assumption that a constant must be constant, with the speed of light as his case in point.
But, in truth, there has been a great deal of inquiry into the nature of scientific constants, including published, peer-reviewed research investigating whether certain constants – including the speed of light – might actually vary over time or distance. Scientists are constantly questioning these assumptions. For example, just this year Scientific American published a feature on the state of research into exactly this question. (“Are physical constants really constant?: Do the inner workings of nature change over time?”) Physicist Sean Carroll wrote a careful rebuttal of this point.
In addition, Sheldrake claims to have “evidence” of morphic resonance in crystal formation and rat behavior. The research has never appeared in a peer-reviewed journal, despite attempts by other scientists eager to replicate the work.
I don't think so. Even though we now know that Sir Isaac Newton wasn't 100% correct, he was close enough that we still use what he came up with for most engineering purposes on human scales.
Instead of pointing to his ignorance for not yet knowing some details he didn't yet have a way to know, we actually admire how advanced his knowledge was, and how well it has withstood the test of time. I think people will still appreciate the accuracy of Newtonian mechanics as much in the year 3000 as we did in the year 2000, and the fact that it's slightly wrong in some extremes doesn't make us do a facepalm and talk about how ignorant he was.
However, reading your post was almost enough to give me a face palm.
You won't learn much from this video about how Big G is measured.
All I got out of the lecture was another new age huckster pitching a strawman argument to sell a book (what a disappointing excuse for a TED talk; it is fairly easy to see why the organisation chose not to feature the content).
You will learn a little bit more about the difficulties in measuring G here than from Sheldrake's video:
Originally posted by ChaoticOrder
reply to post by Arbitrageur
You won't learn much from this video about how Big G is measured.
Ok... so they don't take all the measurements from labs around the world and average them? And the values produced by labs around the world are always extremely close to each other? It seemed to me like he had really looked into this and knew how it worked...
That's a little dated but even the latest measurements are usually accompanied with remarks about the measurement difficulty and the recommendation that more measurements continue to be made independently in different labs, like this for example:
Several measurements in the past decade did not succeed in improving our knowledge of big G's value. To the contrary, the variation between different measurements forced the CODATA committee, which determines the internationally accepted standard values, to increase the uncertainty from 0.013% for the value quoted in 1987 to the twelve times larger uncertainty of 0.15% for the 1998 "official" value. This situation is an embarrassment to modern physics, considering that the intrinsic strength of electromagnetism, for instance, is known 2.5 million times more precisely and is steadily being improved. (The situation of G becomes more understandable if one considers the weakness of gravity: the total gravitational force twisting the pendulum of a typical Cavendish torsion balance is only equivalent to the weight of a bacterium, and that small force must be measured very precisely.)
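The "twelve times larger" figure in the quote above is easy to check from the two quoted relative uncertainties; a quick sketch:

```python
# Arithmetic check of the CODATA uncertainty figures quoted above:
# 0.013% for the 1987 value of G vs 0.15% for the 1998 value.
rel_unc_1987 = 0.013e-2  # relative uncertainty of G, 1987 CODATA value
rel_unc_1998 = 0.15e-2   # relative uncertainty of G, 1998 CODATA value

ratio = rel_unc_1998 / rel_unc_1987
print(f"1998 uncertainty is {ratio:.1f}x the 1987 uncertainty")  # ~11.5x, i.e. roughly "twelve times larger"
```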
The possibility that unknown systematic errors still exist in traditional measurements makes it important to measure G with independent methods.
You weren't paying attention. He did say that he thought it might have some kind of cyclical variation; did you miss that? The only problem is, he never published his idea even though he claims to be applying science. A scientist would publish such a claim in a paper, so contrary to what he says, he's not being scientific.
Originally posted by ImaFungi
He's not saying that in reality big G isn't a constant.
This isn't Sheldrake again; this is a "Frequently Asked Questions" answer on a university physics website. So it's not true that constants are automatically assumed to be constant. However, we also have some idea how precise our measurements are, and we know it's difficult to measure big G. Given this knowledge, when we see different measurements, isn't it kind of silly to assume G is changing instead of appreciating that it's just hard to measure, because gravity is such a weak force?
The fundamental laws of physics, as we presently understand them, depend on about 25 parameters, such as Planck's constant h, the gravitational constant G, and the mass and charge of the electron. It is natural to ask whether these parameters are really constants, or whether they vary in space or time.
...
Over the past few decades, there have been extensive searches for evidence of variation of fundamental "constants." Among the methods used have been astrophysical observations of the spectra of distant stars, searches for variations of planetary radii and moments of inertia, investigations of orbital evolution, searches for anomalous luminosities of faint stars, studies of abundance ratios of radioactive nuclides, and (for current variations) direct laboratory measurements.
I would be happy to take part in a public debate with a scientist who disagrees with the issues I raise in my talk. This could take place online, or on Skype. My only condition is that it be conducted fairly, with equal time for both sides to present their arguments, and with an impartial moderator, agreed by both parties.
Therefore I ask Chris Anderson to invite a scientist from TED’s Scientific Board or TED’s Brain Trust to have a real debate with me about my talk, or if none will agree to take part, to do so himself.
Are the constants really constant? The measured values continually change, as I show in my book Science Set Free (The Science Delusion in the UK). They are regularly adjusted by international committees of experts known as metrologists. Old values are replaced by new “best values”, based on the recent data from laboratories around the world.
Within their laboratories, metrologists strive for ever-greater precision. In so doing, they reject unexpected data on the grounds that they must be errors. Then, after deviant measurements have been weeded out, they average the values obtained at different times and subject the final value to a series of corrections. Finally, in arriving at the latest “best values”, international committees of experts select, adjust and average the data from an international selection of laboratories.
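The averaging step described above is usually an inverse-variance weighted mean, where more precise measurements count for more. A minimal sketch, with made-up lab values (not real CODATA inputs):

```python
# Inverse-variance weighted average: the standard way to combine
# independent measurements into one "best value".
# The numbers below are hypothetical, for illustration only.
values = [6.674, 6.672, 6.676]  # G in units of 1e-11 m^3 kg^-1 s^-2 (hypothetical)
sigmas = [0.002, 0.003, 0.001]  # one-standard-deviation errors (hypothetical)

weights = [1 / s**2 for s in sigmas]  # more precise labs get larger weights
best = sum(w * v for w, v in zip(weights, values)) / sum(weights)
best_sigma = (1 / sum(weights)) ** 0.5  # uncertainty of the weighted mean
print(f"best value = {best:.4f} +/- {best_sigma:.4f}")
```

Note that the combined uncertainty is smaller than any single lab's, which only holds if the labs' errors really are independent, exactly the assumption at issue in this thread.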
Despite these variations, most scientists take it for granted that the constants themselves are really constant; the variations in their values are simply the result of experimental errors.
The oldest of the constants, Newton’s Universal Gravitational Constant, known to physicists as Big G, shows the largest variations. As methods of measurement became more precise, the disparity in measurements of G by different laboratories increased, rather than decreased.
Between 1973 and 2010, the lowest average value of G was 6.6659, and the highest 6.734, a 1.1 percent difference. These published values are given to at least three decimal places, and sometimes to five, with estimated errors of a few parts per million. Either this appearance of precision is illusory, or G really does change. The difference between recent high and low values is more than 40 times greater than the estimated errors (expressed as standard deviations).
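The spread Sheldrake quotes can be recomputed from his own endpoints; it comes out at about 1.0%, close to the "1.1 percent" he cites:

```python
# Check the spread between the extreme published values of big G quoted
# above (6.6659 and 6.734, in units of 1e-11 m^3 kg^-1 s^-2).
low, high = 6.6659, 6.734
spread_pct = (high - low) / low * 100
print(f"spread = {spread_pct:.2f}% of the low value")  # about 1.0%
```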
What if G really does change? Maybe its measured value is affected by changes in the earth’s astronomical environment, as the earth moves around the sun and as the solar system moves within the galaxy. Or maybe there are inherent fluctuations in G. Such changes would never be noticed as long as measurements are averaged over time and averaged across laboratories.
In 1998, the US National Institute of Standards and Technology published values of G taken on different days, revealing a remarkable range. On one day the value was 6.73, a few months later it was 6.64, 1.3% lower. (The references for all the data cited in this blog are given in Science Set Free/The Science Delusion).
In 2002, a team led by Mikhail Gershteyn, of the Massachusetts Institute of Technology, published the first systematic attempt to study changes in G at different times of day and night. G was measured around the clock for seven months, using two independent methods. They found a clear daily rhythm, with maximum values of G 23.93 hours apart, correlating with the length of the sidereal day, the period of the earth's rotation in relation to the stars.
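The 23.93-hour figure quoted above is indeed the length of a sidereal day, which is shorter than the 24-hour solar day because the Earth makes one extra rotation per year relative to the stars:

```python
# The sidereal day: the Earth completes (days_per_year + 1) rotations
# relative to the stars in one year of days_per_year solar days.
solar_day_h = 24.0
days_per_year = 365.25
sidereal_day_h = solar_day_h * days_per_year / (days_per_year + 1)
print(f"sidereal day ~ {sidereal_day_h:.2f} hours")  # ~23.93 hours
```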
Gershteyn’s team looked only for daily fluctuations, but G may well vary over longer time periods as well; there is already some evidence of an annual variation.
By comparing measurements from different locations, it should be possible to find more evidence of underlying patterns. Such measurements already exist, buried in the files of metrological laboratories. The simplest and cheapest starting point for this enquiry would be to collect the measurements of G at different times from laboratories all over the world. Then these measurements could be compared to see if the fluctuations are correlated. If they are, we will discover something new.
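The comparison proposed above amounts to checking whether fluctuations recorded by different labs are correlated in time. A minimal sketch with synthetic data (a real analysis would use archived lab records, which are not reproduced here):

```python
# Sketch: do two labs' G measurements fluctuate together?
# Synthetic data: both series share a hypothetical 12-step cycle plus
# independent measurement noise. All numbers are made up for illustration.
import math
import random

random.seed(0)
common = [math.sin(2 * math.pi * t / 12) for t in range(24)]  # shared cycle
lab_a = [6.674 + 0.002 * c + random.gauss(0, 0.0005) for c in common]
lab_b = [6.674 + 0.002 * c + random.gauss(0, 0.0005) for c in common]

def pearson(xs, ys):
    """Pearson correlation coefficient of two equal-length series."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

r = pearson(lab_a, lab_b)
print(f"correlation between labs: r = {r:.2f}")  # strongly positive when a shared cycle exists
```

If independent labs' residuals showed a correlation like this, it would point to a common external influence; if r were near zero, the fluctuations would look like ordinary uncorrelated measurement error.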