
# Ask any question you want about Physics


posted on Apr, 26 2016 @ 09:31 PM

originally posted by: greenreflections
a reply to: Arbitrageur

Why do you think all object inside gravity well fall at the same speed?

They don't.

they do.

Maybe you mean "accelerate at the same rate", there's a definite difference between that and "fall at the same speed".
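The distinction can be illustrated with a quick sketch (my own toy numbers, not from the thread): in a vacuum both objects share the same acceleration g, yet their speeds depend on how long they have been falling.

```python
# Toy illustration: same acceleration, different speeds depending on fall time.
g = 9.81  # m/s^2, approximate surface gravity (assumed value)

def speed_after(t, g=g):
    """Speed of an object dropped from rest after falling t seconds (vacuum)."""
    return g * t

# Both objects accelerate at the same rate...
a_feather = a_hammer = g
assert a_feather == a_hammer

# ...but an object that has been falling longer is moving faster.
print(speed_after(1.0))  # 9.81 m/s
print(speed_after(3.0))  # ~29.4 m/s
```

So "accelerate at the same rate" holds, while "fall at the same speed" only holds for objects that have been falling for the same time from the same state.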

posted on Apr, 26 2016 @ 09:37 PM

originally posted by: greenreflections
It is. It is made of stuff. It is made of quark cloud.
Before you said "proton cloud", now you're saying "quark cloud", as if a quark and a proton are the same thing. They are not. Most of the mass of a proton doesn't come from quarks, only a tiny fraction does.

originally posted by: greenreflections
they do.
So you're saying .89G at the altitude of the ISS equals 1.0G at Earth's surface? It doesn't. I explained how you can correct the question, but you have to be more specific than just "gravity well" to get the same acceleration. A single gravity well includes many different accelerations at different altitudes.
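For anyone who wants to check the ~0.89 figure, here is a rough Newtonian sketch; the constants are my own assumed approximate values, not numbers from the thread.

```python
# Rough check of g at two altitudes using Newton's g = GM/r^2.
GM_EARTH = 3.986004e14   # m^3/s^2, Earth's standard gravitational parameter (assumed)
R_EARTH  = 6.371e6       # m, mean Earth radius (assumed)
ISS_ALT  = 4.08e5        # m, typical ISS altitude, ~408 km (assumed)

def g_at(r):
    """Gravitational acceleration at distance r from Earth's center."""
    return GM_EARTH / r**2

g_surface = g_at(R_EARTH)
g_iss     = g_at(R_EARTH + ISS_ALT)
print(round(g_surface, 2))          # ~9.82 m/s^2
print(round(g_iss, 2))              # ~8.67 m/s^2
print(round(g_iss / g_surface, 2))  # ~0.88, close to the 0.89 figure above
```

The same gravity well gives a different acceleration at each altitude, which is the whole point of the objection.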

posted on Apr, 26 2016 @ 10:15 PM

originally posted by: Bedlam

originally posted by: greenreflections
a reply to: Arbitrageur

Why do you think all object inside gravity well fall at the same speed?

They don't.

they do.

Maybe you mean "accelerate at the same rate", there's a definite difference between that and "fall at the same speed".

are you f u ing with me dude?

posted on Apr, 26 2016 @ 10:22 PM

originally posted by: Arbitrageur

originally posted by: greenreflections
It is. It is made of stuff. It is made of quark cloud.
Before you said "proton cloud", now you're saying "quark cloud", as if a quark and a proton are the same thing. They are not. Most of the mass of a proton doesn't come from quarks, only a tiny fraction does.

originally posted by: greenreflections
they do.
So you're saying .89G at the altitude of the ISS equals 1.0G at Earth's surface? It doesn't. I explained how you can correct the question, but you have to be more specific than just "gravity well" to get the same acceleration. A single gravity well includes many different accelerations at different altitudes.

this test will prove my point.

posted on Apr, 26 2016 @ 10:56 PM

originally posted by: greenreflections

originally posted by: Bedlam

originally posted by: greenreflections
a reply to: Arbitrageur

Why do you think all object inside gravity well fall at the same speed?

They don't.

they do.

Maybe you mean "accelerate at the same rate", there's a definite difference between that and "fall at the same speed".

Not at all. Gravity causes acceleration. Therefore, given a vacuum, nothing falls at a fixed rate. It's unlikely that ANYTHING falls at the same speed, exactly.
edit on 26-4-2016 by Bedlam because: (no reason given)

posted on Apr, 27 2016 @ 04:29 AM
a reply to: dragonridr

What is the longest chunk of DNA used (past tense) in technology that is sequencing using the Sanger method? (Before assembly and alignment)

I think I've already said way too much. Perhaps I should just give a clue.
Revisiting the mouse mitochondrial DNA sequence. - PubMed - NCBI:

2003 Sep 15
...
The existence of reliable mtDNA reference sequences for each species is of great relevance in a variety of fields, from phylogenetic and population genetics studies...
We present compelling evidence for the existence of sequencing errors on the current mouse mtDNA reference sequence.

This is just a tiny clue as to the motivations for my questions; there is no need for anyone to quote the last line of that abstract to me and point out what you believe is the case regarding the genome sequences used in popular databases. I'm also interested in accurate information on the topic of accuracy itself, which is puzzling to me especially for some of the newer sequencing methods; 90% really doesn't sound reliable to me if that can't be pushed any higher. (What happens if you compare sequences derived from this method with sequences from other methods with varying accuracies, say 85-95%? It's a rhetorical question. I am aware there are other sequencing methods in use for maintaining sequence databases; just focus on my opening question. This is all meant to clarify what I'm asking for, not to start a discussion about how accurate or reliable sequence databases are, or the arguments derived from them.)
Just as a reminder quoting from TEO...'s article:

The NIST study suggests the method could identify about 66 billion bases--the smallest units of genetic information--per second with 90 percent accuracy and no false positives.

Phantom423 sadly didn't quote anything about accuracy.

I have a feeling that someone experienced in both older and newer sequencing methods will know why I'm asking about the Sanger method (before assembly and alignment choices), even though the name escaped me when I was making my previous comments. Of course there could be other methods that are named differently but still use the same core technique of sequencing in sections and then doing alignment+assembly. I would also be referring to those methods if they are not labeled as the "Sanger method" but still use the same technique to determine the bases in the sequence being analyzed, or a technique with the same accuracy per sequenced base (before alignment+assembly or other techniques are applied, other than repetition to rule out and discover sequencing errors). And the question applies to those sequencing techniques or methods as well: what's the largest chunk?

Perhaps I can rephrase my initial question now:

What maximum length (number of nucleobases) can be sequenced in one go for a particular section of DNA without doing any assembly or aligning, or any other technique than repetition to rule out and discover sequencing errors; given our current sequencing technology that resembles or operates according to the Sanger method (before assembly and alignment and other techniques except repetition)? Or a method with the same accuracy under the same conditions (before assembly and alignment, and other techniques except repetition).
edit on 27-4-2016 by whereislogic because: addition

posted on Apr, 27 2016 @ 07:02 AM
Just to be really really clear, with the phrase "other techniques" I am not referring to the last step described in this video at 2:50:

All of the above would be included in my usage of the phrase "sequenced in one go for a particular section of DNA without doing any assembly or aligning...".

So what's depicted at and before 2:50 does not fall under my usage of the phrases "assembly or aligning" or "other techniques".

The video above shows an example of the "chunk of DNA" that I'm talking about at the start. I am interested in the size of that chunk.

So forget about all the rest I said and focus on the question:

What is the longest chunk of DNA used (past tense) in technology that is sequencing using the Sanger method? (Before assembly and alignment)

Or the longest you know of.
edit on 27-4-2016 by whereislogic because: addition

posted on Apr, 27 2016 @ 07:19 AM
a reply to: whereislogic

Start a new thread - that's a complicated topic - it goes to the heart of the instrumentation and methodology which should be thoroughly understood before you can know the limitations of the question you're asking.

If you had read the references which I posted, you would have some understanding how these newer methods stand up against Sanger - or at the very least, you could have come to some conclusion yourself.

edit on 27-4-2016 by Phantom423 because: (no reason given)

posted on Apr, 27 2016 @ 07:41 AM

originally posted by: dragonridr
We can now sequence 300 kilobases up to 1 terabase in a single run.

Just to be clear, this is NOT what I'm talking about, it's not using the Sanger Method (so that's why I asked just about the Sanger method now at the start of my first response to you, best focus on that and nothing else, view it as a question about the history of science). And forget I even mentioned other possible methods in my previous comment.
edit on 27-4-2016 by whereislogic because: addition

posted on Apr, 27 2016 @ 08:06 AM

originally posted by: Phantom423
a reply to: whereislogic
If you had read the references which I posted, you would have some understanding how these newer methods stand up against Sanger - or at the very least, you could have come to some conclusion yourself.

There was a line on the first link that was a bit puzzling to me:

"The feasibility of reaching read-lengths of over 1000 bases of DNA has recently been achieved."

The feasibility?

Sanger sequencing is not dead? | WIRED:

Sanger sequencing is still widely used for small-scale experiments...

I doubt anyone here truly understands why; the article is certainly confusing enough to give the wrong impression of exactly why that is the case.

Read length is absolutely crucial when it comes to assembling accurate sequence, especially for genomes as complex and repetitive as the human genome. If a repetitive region is much longer than a platform’s read length, it can’t really be accurately assembled – so human genomes sequenced with current next-gen platforms actually consist of hundreds of thousands of accurately sequenced fragments interspersed by gaps. That’s ... by no means a complete genome sequence.

But perhaps those who haven't given me any numbers so far already knew that when they put up links to the even newer next-gen stuff? I remember when they used to use the word "turbo" a lot (not thinking about DNA sequencing anymore).

Anyone know of a Sanger based technology that has a possible (true Sanger-style) readlength of 6000 or more bases? See comment.

"True Sanger-style" is referring to the way the word "readlength" is applied in discussions about the Sanger method.
edit on 27-4-2016 by whereislogic because: addition
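The read-length problem quoted from the WIRED article above can be shown with a toy sketch (letters standing in for bases; `reads` is a hypothetical helper of my own, not a real sequencing tool): if a repeat is longer than the read length, two different genomes can yield exactly the same set of short reads, so assembly alone cannot tell them apart.

```python
# Toy illustration: repeats longer than the read length make assembly ambiguous.

def reads(genome, k):
    """All overlapping error-free reads of length k, as a set (full coverage)."""
    return {genome[i:i + k] for i in range(len(genome) - k + 1)}

genome_a = "ABRRRCD"    # contains the repeat 'RRR'  (length 3)
genome_b = "ABRRRRCD"   # contains the repeat 'RRRR' (length 4)

# Reads of length 3 (shorter than the longer repeat): the read sets are identical,
# so no assembler could distinguish the two genomes from these reads alone.
print(reads(genome_a, 3) == reads(genome_b, 3))  # True -- ambiguous

# Reads of length 5 (long enough to span the repeat): the sets now differ.
print(reads(genome_a, 5) == reads(genome_b, 5))  # False -- resolvable
```

This is the sense in which longer reads (Sanger's strength) buy you unambiguous assembly that short next-gen reads cannot.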

posted on Apr, 27 2016 @ 08:36 AM

originally posted by: whereislogic

originally posted by: Phantom423
a reply to: whereislogic
If you had read the references which I posted, you would have some understanding how these newer methods stand up against Sanger - or at the very least, you could have come to some conclusion yourself.

There was a line on the first link that was a bit puzzling to me:

"The feasibility of reaching read-lengths of over 1000 bases of DNA has recently been achieved."

The feasibility?

I suggest that you go through ThermoFisher's website - they have a very nice explanation of the differences between Sanger and new generation sequencing techniques - the video is very graphic as to how it's done and why it's accurate. Then look through the "compare workflows" section -

I'm not a molecular biologist, but I do know that if you want to know why and how something works, start with the Materials and Methods section of a research paper and understand the instrumentation - that's something I do know about. The instruments and their output are key.

I’m new to Sequencing

If you are new to sequencing, you’ve come to the right place to learn more about sequencing and determine which sequencing technologies will best fit your needs. We are a leader in sequencing technologies, with our Applied Biosystems® genetic analyzers and Ion Torrent™ next-generation sequencing systems. Our sequencing platforms are prominent in the history of sequencing, and we strive to actively shape the future of sequencing technology.

www.thermofisher.com...

As to your question on feasibility of read lengths, the references in the original paper should give you your answer.

www.pnas.org...

BTW, the first paper you cited was from 2003. That's ancient history in molecular biology. Better to research more up-to-date papers, especially when it comes to instrumentation.

edit on 27-4-2016 by Phantom423 because: (no reason given)

posted on Apr, 27 2016 @ 09:34 AM

originally posted by: greenreflections
this test will prove my point.
That test has nothing to do with whether acceleration of .89G in orbit is equal to acceleration of 1.0G on Earth's surface. We've already confirmed those values are different. .89G does not equal 1.0G, so at two different altitudes in the same gravity well you don't get the same rate of acceleration.

originally posted by: whereislogic
I have a feeling that someone experienced in both older and newer sequencing methods can know why I'm asking about the Sanger method...
Maybe, but I once again remind you that the people on ATS with such expertise may not even be reading this thread, so if you want better chances of people with such experience to see your question, I think you need a tread with a title specifically targeted at getting those experienced people to read and respond to your thread, which is why you need to...

originally posted by: Phantom423
a reply to: whereislogic

Start a new thread - that's a complicated topic
Yes starting a new thread makes a lot of sense for this kind of specific and complicated topic.

As I said I'm no expert on this topic but apparently the higher error rates of high throughput sequencers can be mitigated, according to this paper:

High-throughput DNA sequencing errors are reduced by orders of Magnitude

A major limitation of high-throughput DNA sequencing is the high rate of erroneous base calls produced. For instance, Illumina sequencing machines produce errors at a rate of ∼0.1–1 × 10⁻² per base sequenced. These technologies typically produce billions of base calls per experiment, translating to millions of errors. We have developed a unique library preparation strategy, “circle sequencing,” which allows for robust downstream computational correction of these errors. In this strategy, DNA templates are circularized, copied multiple times in tandem with a rolling circle polymerase, and then sequenced on any high-throughput sequencing machine. Each read produced is computationally processed to obtain a consensus sequence of all linked copies of the original molecule. Physically linking the copies ensures that each copy is independently derived from the original molecule and allows for efficient formation of consensus sequences. The circle-sequencing protocol precedes standard library preparations and is therefore suitable for a broad range of sequencing applications. We tested our method using the Illumina MiSeq platform and obtained errors in our processed sequencing reads at a rate as low as 7.6 × 10⁻⁶ per base sequenced, dramatically improving the error rate of Illumina sequencing and putting error on par with low-throughput, but highly accurate, Sanger sequencing.

So according to that the higher error rates can be mitigated and brought to a level comparable to Sanger sequencing error. How true that is and how the details work I can't answer since I'm not an expert in this subject, and you need experts to answer those types of questions, which you have a better chance of attracting if you start a new thread. I don't understand your reluctance to do that.
edit on 27-4-2016 by Arbitrageur because: clarification
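As a toy illustration only (a simple majority-vote model of my own, not the paper's actual "circle sequencing" pipeline), here is roughly why taking a consensus over several linked copies suppresses the error rate so sharply: a consensus base goes wrong only when most of the copies are miscalled at that position.

```python
# Toy majority-vote model of consensus error correction (my simplification;
# the real pipeline is more sophisticated and does better than this bound).
from math import comb

def consensus_error(p, n):
    """P(a majority of n independent copies are miscalled), given per-base
    error rate p. Treats any majority of errors as a consensus error, a
    pessimistic but simple model."""
    return sum(comb(n, k) * p**k * (1 - p)**(n - k)
               for k in range(n // 2 + 1, n + 1))

p = 0.01  # ~1e-2 per-base error, the Illumina figure quoted in the abstract
print(consensus_error(p, 1))  # 0.01     -- a single read
print(consensus_error(p, 3))  # ~3e-4    -- three linked copies
print(consensus_error(p, 5))  # ~9.9e-6  -- five copies, near the quoted 7.6e-6
```

So even this crude model lands in the same ballpark as the paper's reported 7.6 × 10⁻⁶, which makes the claimed mitigation plausible.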

posted on Apr, 27 2016 @ 09:39 AM

originally posted by: Phantom423
BTW, the first paper you cited was from 2003. That's ancient history in molecular biology.

Perhaps I should remind you of this as well, from my comment to dragonridr:

...so that's why I asked just about the Sanger method now at the start of my first response to you, best focus on that and nothing else, view it as a question about the history of science

I love learning from history. The article was not brought up to "research...[current] instrumentation".

I may have found my own answer:

...deteriorating quality of sequencing traces after 700-900 bases.

Source: Sanger sequencing - Wikipedia, the free encyclopedia:

Thanks for the phrase "Sanger method" to the one giving me that clue, which ultimately led me to my answer. Unless anyone knows of higher amounts using the same Sanger method. See, this thread does serve its purpose.

Let's see if I can think of an interesting general physics question now...why did Einstein say:

Reality is merely an illusion, albeit a very persistent one.

edit on 27-4-2016 by whereislogic because: addition

posted on Apr, 29 2016 @ 05:40 PM
a reply to: Arbitrageur

Those are computer calculated probability plots, but the real pictures look very much like the computer predictions:

a probability plot is not reality...
and these calculated plots work ONLY for a single-electron atom.
Are you really thinking our universe consists of only one type of atom??

(facepalm)

posted on Apr, 29 2016 @ 09:22 PM
a reply to: Bedlam

so you deny that any two physical objects, regardless of their mass, will accelerate toward the Moon's surface at the same rate and reach it at the same time?

Why do you think they won't? You suggest the heavier object gets to the Moon's surface first?

posted on Apr, 29 2016 @ 09:23 PM
a reply to: greenreflections
Actually, Bedlam said nothing resembling that statement.

posted on Apr, 29 2016 @ 09:25 PM
a reply to: Arbitrageur

Physical objects with different values for mass will accelerate toward the center of a gravity well at an equal rate. Yes or no?

Thank you.

edit on 29-4-2016 by greenreflections because: (no reason given)
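One way to answer the yes/no question above (at a given distance from the center) is to note that the falling object's mass cancels out of Newton's equations: F = GMm/r² together with a = F/m gives a = GM/r², which contains no trace of m. A minimal sketch, with assumed lunar values:

```python
# Minimal sketch: the falling mass m cancels, so a = G*M/r^2 for any m.
G = 6.674e-11      # m^3 kg^-1 s^-2, gravitational constant (assumed value)
M_MOON = 7.342e22  # kg, lunar mass (assumed value)
R_MOON = 1.7374e6  # m, lunar radius (assumed value)

def acceleration(m):
    """Acceleration of a mass m at the lunar surface: F/m with F = G*M*m/r^2."""
    force = G * M_MOON * m / R_MOON**2
    return force / m

print(round(acceleration(0.03), 3))   # ~1.62 m/s^2 for a ~30 g feather
print(round(acceleration(1.32), 3))   # ~1.62 m/s^2 for a ~1.32 kg hammer
print(abs(acceleration(0.03) - acceleration(1.32)) < 1e-9)  # True -- same rate
```

This is why, ignoring air resistance, feather and hammer hit the lunar surface together; the Apollo 15 hammer-and-feather drop is the famous demonstration.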

posted on Apr, 29 2016 @ 09:27 PM

originally posted by: Phage
a reply to: greenreflections
Actually, Bedlam said nothing resembling that statement.

then he simply misunderstood the point I was trying to make. My question though remains: why do physical objects accelerate toward the center of a gravity well at the same rate regardless of their mass?

that's all.

cheers bud)

posted on Apr, 29 2016 @ 09:31 PM
a reply to: greenreflections
Maybe, or not.
He said gravity causes acceleration and that nothing falls at a fixed rate. Both statements are true.

posted on Apr, 29 2016 @ 09:58 PM
I am wondering how indeed a photon, when released, acquires its speed? What serves as propulsion for the emitted quanta?

I know it is crazy, but the easiest way I can explain it to myself is that the photon is not exactly being emitted. It is being pulled out. Pulled out in the way that an atom uses force to maintain its composure. Force in my vocabulary means something working against a uniform environment, sort of. By 'emitting' I would visualize an atom 'giving up', failing to hold the quanta any further when it can no longer keep them due to, say, changed demands of the positive core.

In a way, the photon is being pulled out. No propulsion is needed to explain the photon instantaneously gaining speed.

If I assume the positive to be a 'trap' for the negative, where it uses force to acquire (borrow) what it needs to stay whole, then if the "positive" for any reason is not able to keep that negative piece, the quanta will be shed into the negative sea (sucked in).

There is my photon propulsion mechanism))))
