
A tiny switch for a few particles of light


posted on May, 8 2016 @ 02:30 AM

28 APR 2016, Nanowerk
www.nanowerk.com...=43265.php?utm_source=feedblitz&utm_medium=FeedBlitzRss&utm_campaign=nanowerkemergingtechnologiesnews

A team at the Max Planck Institute for the Science of Light has demonstrated for the first time a mediation process with only a single organic molecule and just a handful of photons (Nature Photonics, "Few-photon coherent nonlinear optics with a single molecule"). The researchers influence and switch another light beam with these particles of light. This basic experiment not only promises a place in physics textbooks, but it may also help in the development of nano-optical transistors for a photonic computer.
. . .
That is a major objective of photonics. However, there is a fundamental problem in the attempt to develop a purely optical transistor: “Light cannot simply be switched by other light in the way that electric current is switched with current in a conventional transistor”, explains Vahid Sandoghdar, Director of the Nano-optics Division at the Max Planck Institute for the Science of Light and Alexander von Humboldt Professor of the Friedrich-Alexander University, Erlangen-Nürnberg. How shy particles of light are becomes obvious when one crosses the beams of two torches or two lasers. What happens is: nothing. “A medium is required to mediate the light-light interaction”.
. . .
“Therefore, just a few photons from the laser beam are enough to alter the optical properties of the molecule.” The researchers are even convinced that the control pulse can be weakened still further. “In principle, a single photon should be enough to alter the fate of a second photon”, says Vahid Sandoghdar.
. . .
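The quoted effect can be illustrated with a toy saturation model (my own sketch for intuition, not the researchers' actual model): assume each control photon has some probability q of exciting the molecule, and an excited (saturated) molecule stops blocking the probe beam. Then even a handful of control photons shifts the probe transmission dramatically.

```python
# Toy saturation model of a single-molecule optical switch.
# q, t_blocked, and t_open are assumed illustrative values,
# not figures from the Nature Photonics paper.

def excitation_probability(n_photons, q=0.3):
    """Chance that at least one of n control photons excites the molecule."""
    return 1.0 - (1.0 - q) ** n_photons

def probe_transmission(n_photons, t_blocked=0.1, t_open=0.95, q=0.3):
    """Average probe transmission: low while the molecule sits in its
    ground state, nearly transparent once it has been saturated."""
    p = excitation_probability(n_photons, q)
    return (1.0 - p) * t_blocked + p * t_open

for n in (0, 1, 3, 10):
    print(f"{n:2d} control photons -> probe transmission {probe_transmission(n):.2f}")
```

With these assumed numbers the probe goes from mostly blocked (no control photons) to mostly transmitted (ten control photons), which is the "few photons switch another beam" behaviour the article describes.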


SOUNDS exciting, to this layman's mind.
.
Of course . . . it remains to be seen IF and WHEN such a tech could reach the laps of the average consumer in any volume at reasonable prices.
.
And, I still wonder . . . has this been done decades ago in black projects? Is this a way of slowly bringing such tech to the surface?
.
What do I know.
.
If it ends up putting super-computing power in a phone sized unit . . . with the display in one's glasses or contacts . . . I'm sure many would find that cool and maybe exciting.
.
I'd rather have a screen no smaller than a Kindle. And for most of my computing, I'd rather have a desktop with 3 large screens. LOL.
.




posted on May, 8 2016 @ 03:11 AM
a reply to: BO XIAN

Of course . . . it remains to be seen IF and WHEN such a tech could reach the laps of the average consumer in any volume at reasonable prices.
What tech?


I'd rather have a screen no smaller than a Kindle. And for most of my computing, I'd rather have a desktop with 3 large screens. LOL.
Yeah, really, LOL. What's funny and what does what you said have to do with the article you linked?



posted on May, 8 2016 @ 03:26 AM
a reply to: Phage

AS I understood it, the article was pointing to . . . futuristic computer designs utilizing the phenomena the scientists observed as described in the article.

I realize, Phage, that such a technology is quite some distance in time off from this experiment.

Sigh.

2nd question . . . my conjectures, Phage. Conjectures. Last I knew, they were not against the law.

Sigh.

What was funny to me was that even after reflecting on the possible future technology products as a result of the experiment etc. etc.

I still prefer my 2 large screens (I want 3) and hefty desk top computer. It was a slightly amusing paradox, to ME, Phage, to me.

I was not insisting that you join me in my amusement.

I was not insisting that you join me in my conjectures.

Nor do I understand what can come across as an absolute compulsion to p*ss in others' breakfast cereal over inconsequential minor bits of "reality."

edit on 8/5/2016 by BO XIAN because: added




posted on May, 8 2016 @ 03:28 AM
a reply to: BO XIAN




2nd question . . . my conjectures, Phage. Conjectures. Last I knew, they were not against the law. Sigh.

And a good thing that is.

However, what's funny about them? (LOL)



posted on May, 8 2016 @ 03:31 AM
a reply to: Phage

Please see my addition above to answer your question about LOL.

Is Mother's Day leaving you extra prickly, for some reason?

Sigh.

edit on 8/5/2016 by BO XIAN because: (no reason given)



posted on May, 8 2016 @ 03:33 AM
a reply to: BO XIAN

Ok...


Your sense of humor would seem to vary from mine.
As would a desire to broadly express it by LOLing.


edit on 5/8/2016 by Phage because: (no reason given)



posted on May, 8 2016 @ 03:33 AM

originally posted by: Phage
a reply to: BO XIAN

Ok...


Your sense of humor would seem to vary from mine.


I would not think that

that

would be a shock, to you, of all people!



posted on May, 8 2016 @ 03:34 AM
a reply to: BO XIAN

Who expressed shock?



posted on May, 8 2016 @ 03:37 AM
a reply to: Phage

Now I'm laughing--a healthy chuckle--again at Phage being Phage.

Good. The world has returned to what passes for normal around here.

BTW, are you at all aware that

sometimes a bit too often . . . pushy, peremptory precision can come across as haughty and prissy?



posted on May, 8 2016 @ 03:38 AM
a reply to: BO XIAN

Impressions don't impress me very much.



posted on May, 8 2016 @ 03:45 AM
a reply to: Phage

I think we are keenly aware that precious little impresses you much at all.

Sometimes I'm concerned about what appears--from a distance--to be a rather bleak, sterile, stern window on the world through Phage's eyes.

Nevertheless, I do prefer to accept you and enjoy you as you are . . . with only occasional observations about such on my part. We all have our compulsions, it seems. You with your compulsion to lean so far from the perpendicular in favor of risking a false negative . . . and me, decidedly the other direction.

Thankfully, for now, the world seems big enough to hold both of us.

As to the OP . . . I thought it was interesting and potentially exciting. Yet, seemingly, instead of being able to share in such wonder and interest . . . you chose to . . . carp about petty issues of my posting. That was not, per se, an impression. That was an observation.



posted on May, 8 2016 @ 03:47 AM
a reply to: BO XIAN



You with your compulsion to lean so far from the perpendicular in favor of risking a false negative . . . and me, decidedly the other direction.

A false negative? Is that redundant or an oxymoron?



posted on May, 8 2016 @ 03:49 AM
a reply to: Phage

I KNOW you are bright enough

and I THINK you are well educated enough

to know about a TYPE I vs a TYPE II error.

Sigh.

BTW, I'm heading to bed shortly. Probably won't have any more replies from me tonight . . . at about 0200-0400 my time.

edit on 8/5/2016 by BO XIAN because: (no reason given)




posted on May, 8 2016 @ 04:04 AM
A tiny switch for a few particles of light

Link fixed. Interesting stuff all over that site.



posted on May, 8 2016 @ 05:51 AM
a reply to: Phage



What tech?


I assume the OP was referring to the tech they discuss in the article.....did you read it?


The researchers will now continue to work on controlling a light signal with individual photons. Simultaneously, the team in Erlangen is focussing rather on the practical side of things: the researchers would like to embed the molecule as a nano-optical transistor in a photonic wave-guide structure that should serve to wire up many molecules as is common in electrical circuitry. This would be an important step towards the future perspective of processing information in a photonic computer. Read more: A tiny switch for a few particles of light






Yeah, really, LOL. What's funny and what does what you said have to do with the article you linked?


Did you not read the part before that,




If it ends up putting super-computing power in a phone sized unit . . . with the display in one's glasses or contacts . . . I'm sure many would find that cool and maybe exciting.




At this point you are just trolling.



posted on May, 8 2016 @ 06:37 AM
a reply to: Phage
a reply to: BO XIAN
 


Anyway, back to topic, I find this idea genius. I was sceptical at first but I love the way they circumvented the problem (beams of electromagnetic radiation cannot interact with one another) by using a molecule which acts as a mediator and does the job light alone can't.

A computer running on light instead of electrons would in theory work more efficiently, since it eliminates hassles created by electrical resistance (and dissipation of energy) in wires.
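The switch-as-transistor idea above can be sketched in toy logic form. Assuming a polarity where the control beam blocks the probe (my assumption; the article doesn't specify which way the switch acts), one such switch behaves like a NOT gate, and two of them combined give NAND, which is enough to build any digital circuit:

```python
# Hypothetical logic from an "optical transistor": a control beam
# decides whether a probe beam passes. The blocking polarity is an
# assumption for illustration, not a detail from the article.

def optical_switch(control: bool, probe: bool = True) -> bool:
    """Probe passes only while no control light is present."""
    return probe and not control

def nand(a: bool, b: bool) -> bool:
    # Merge two switched probe paths: the output is dark only when
    # both control inputs are on, i.e. output = NOT (a AND b).
    return optical_switch(a) or optical_switch(b)

for a in (False, True):
    for b in (False, True):
        print(a, b, "->", nand(a, b))
```

Since NAND is universal, a working few-photon switch would in principle suffice for general-purpose photonic logic, which is why the result matters beyond the lab curiosity.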



posted on May, 8 2016 @ 12:44 PM
a reply to: swanne

Another challenge is to get the light to the appropriate junction to create a change in state (assuming we're still dealing with 1s and 0s at that point). Current circuit cards use copper (with some anti-corrosive finish a la ENIG, HASL, or immersion silver/gold) to transfer current/hold potential. This new method would require something to the effect of fiber optics on a board, which would be amazing. No leakage capacitance or inductance. High-speed traces could be routed without concern for the material dielectric at frequency. However, I could become obsolete.....should have gone into software...

I'm also unclear how states would be stored (memory) in an effective, practical way.
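The interconnect advantage mentioned above can be put in back-of-envelope numbers. The per-length loss figures below are rough, assumed ballpark values (copper trace loss at ~10 GHz vs telecom fiber), not measurements from any source:

```python
# Back-of-envelope signal loss over a board-scale run.
# Both loss coefficients are assumed ballpark figures for illustration.

COPPER_DB_PER_M = 40.0    # assumed: roughly 1 dB/inch at ~10 GHz on FR-4
FIBER_DB_PER_M = 0.0002   # assumed: ~0.2 dB/km for telecom optical fiber

def remaining_power_fraction(db_per_m: float, length_m: float) -> float:
    """Fraction of signal power left after a run of the given length."""
    loss_db = db_per_m * length_m
    return 10 ** (-loss_db / 10)

length = 0.3  # a 30 cm board-scale run
print(f"copper: {remaining_power_fraction(COPPER_DB_PER_M, length):.3%} of power remains")
print(f"fiber:  {remaining_power_fraction(FIBER_DB_PER_M, length):.4%} of power remains")
```

Under these assumptions a high-frequency copper run loses most of its power over 30 cm while an optical guide loses essentially none, which is the "no leakage, no dielectric worries" point in concrete terms.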



posted on May, 8 2016 @ 03:01 PM

originally posted by: Gothmog
A tiny switch for a few particles of light

Link fixed. Interesting stuff all over that site.


I thought so, too.

Wasn't aware the link was unfixed. Glad it's ok now.



posted on May, 8 2016 @ 03:07 PM
a reply to: swanne

It sure sounded exciting to this layman.

And I thought they were very clever about how they did it.

Thx for your kind post.



posted on May, 8 2016 @ 03:08 PM
a reply to: AntiDoppleganger

Yeah. It sounds like there's a ton of stuff remaining to be done, and achieved, well before such computers could be marketed.

Unless . . . of course . . . the 'critters' and black ops folks have achieved all this years ago.



