US team’s battery ‘breakthrough’ http://www.bbc.co.uk/news/technology-22191650
Category Archive: AQA Unit 1 Particles/ Quantum/ Elec
Permanent link to this article: http://www.animatedscience.co.uk/2013/us-teams-battery-breakthrough
Physicist’s atom struggles revealed http://www.bbc.co.uk/news/science-environment-22174013
Permanent link to this article: http://www.animatedscience.co.uk/2013/physicists-atom-struggles-revealed
New Scientist: Drone-wrecking laser gun to sail on US warship. http://goo.gl/mag/zHCXYPj
Permanent link to this article: http://www.animatedscience.co.uk/2013/new-scientist-drone-wrecking-laser-gun-to-sail-on-us-warship
Donald Glaser Obituary
Scientist who won the Nobel prize for physics in 1960 for the invention of the bubble chamber…
Donald Glaser, who has died aged 86, won the Nobel prize for physics in 1960 for his invention of the bubble chamber, which made the world of subatomic particles visible and led to many further discoveries. In the 1950s and 60s, before the advent of modern electronics, which dominate high-energy physics today, Glaser’s bubble chamber was one of the most powerful tools for revealing the ephemeral existence of a plethora of subatomic particles. The discovery of hordes of novel particles, whose behaviours showed them to be cousins of the more familiar proton, neutron or pion, revealed that these families are made of more fundamental constituents – the quarks. The quark model has become a foundation of the current “standard model” of the fundamental particles and forces.
Glaser was a 25-year-old faculty member at the University of Michigan when he conceived of the bubble chamber. A homely example of the effect that Glaser developed is that of opening a bottle of beer. Releasing the bottle’s cap causes a sudden drop in pressure, whereby bubbles start to rise through the liquid. Glaser’s idea was to keep a liquid at high pressure, near to its boiling point. In such circumstances, a gentle drop in pressure will cause the liquid to start boiling, an effect well known to mountaineers who, at altitude, can brew a cup of tea at lower temperatures than at sea level.
However, if the pressure drop is sudden, the liquid remains liquid even though it is above its boiling point. This “superheated liquid” is unstable and can be maintained only if left undisturbed.
Glaser’s genius was to realise that if electrically charged particles shoot through a superheated liquid, a trail of bubbles forms as they ionise atoms along their paths. Initially too small to see, they rise up, growing to be large enough to be photographed. The process is very delicate; wait too long and the whole liquid will boil, so Glaser’s idea was to release the pressure and then restore it quickly. Particles entering the liquid during the critical moments of lowered pressure could be photographed.
Initially he made a minute demonstration device, a small glass phial containing a mere 3cl of diethyl ether. This delicate apparatus was able to show the trails left when cosmic rays or particles emitted by a radioactive source passed through.
His idea was, at first, regarded with less than enthusiasm. The US Atomic Energy Commission and the National Science Foundation both refused financial support, regarding his scheme as too speculative. His first paper on the subject was apparently rejected because it used the word “bubblet”, which was not in the dictionary. When he asked to speak about his invention at a meeting of the American Physical Society in Washington in April 1953, he met with a similar lack of enthusiasm, but then had a slice of good fortune.
The organisers had assigned Glaser a slot at the end of the meeting’s final day – a Saturday – when many participants would already have left. On the first day, however, his luck turned by a chance meeting over lunch with Luis Alvarez, a leading nuclear physicist from Berkeley.
Alvarez asked Glaser if he was speaking at the meeting, and Glaser explained that his 10-minute talk occupied the final slot, when many would have gone home. Alvarez admitted that he too would be unable to be present, and asked Glaser what he was going to report on. Alvarez was immediately impressed, realised that here was a breakthrough, and arranged for a colleague to hear the talk.
In 1959, Glaser moved to Berkeley and the bubble chamber became a practical device in high energy particle physics. It was here that Alvarez’s team developed large versions of Glaser’s device, eventually 2m long, filled with liquid hydrogen, constructed of metal and with glass windows, through which trails of subatomic particles could be photographed. The iconic images adorned the walls of physicists’ offices during the latter half of the 20th century, and the discoveries of particles using this device led Alvarez himself to a Nobel prize.
Glaser was born in Cleveland, Ohio, the son of William, a businessman, and his wife Lena. He received his early education in the public schools of Cleveland Heights, Ohio, and took his BSc in physics and mathematics at the Case Institute of Technology in 1946. After completing his PhD at the California Institute of Technology in 1949, he joined the faculty at Michigan.
After winning the Nobel prize, Glaser shifted his interests to molecular biology and to applying biotechnology to medicine and agriculture. He also set straight a popular misconception about his discovery. An oft-told story is that Glaser had his inspiration for the bubble chamber while watching the bubbles rise in a beer glass at the student union. The reality was subtly different. Having made the discovery, and become famous, he would be asked over drinks by colleagues, feigning puzzlement, what was so profound about such a trivial phenomenon.
He is survived by his second wife, Lynn, whom he married in 1975; and a son, William, and daughter, Louise, from his first marriage, which ended in divorce.
• Donald Arthur Glaser, physicist, born 21 September 1926; died 28 February 2013
• This article was amended on 11 March 2013. The original gave the date of Glaser’s second marriage as 1960. Amendments have also been made to a section on the development of the bubble chamber at Berkeley.
Permanent link to this article: http://www.animatedscience.co.uk/2013/donald-glaser-obituary
Dark matter as elusive as ever – despite space station results | Stuart Clark
Permanent link to this article: http://www.animatedscience.co.uk/2013/dark-matter-as-elusive-as-ever-despite-space-station-results-stuart-clark
LHC to enter ‘new realm of physics’ http://www.bbc.co.uk/news/science-environment-21941666
Permanent link to this article: http://www.animatedscience.co.uk/2013/lhc-to-enter-new-realm-of-physics
Synchrotron yields ‘safer’ vaccine http://www.bbc.co.uk/news/health-21958361
Producing vaccines against viral threats is a potentially hazardous business and that’s why manufacturers have to operate strict controls to ensure that no pathogens escape.
British scientists have developed a new method to create an entirely synthetic vaccine which doesn’t rely on using live infectious virus, meaning it is much safer.
What’s more the prototype vaccine they have created, for the animal disease foot-and-mouth, has been engineered to make it more stable.
That means it can be kept out of the fridge for many hours before returning to the cold chain – overcoming one of the major hurdles in administering vaccines in the developing world.
The research, published in the journal PLOS Pathogens, was a collaboration between scientists at Oxford and Reading Universities, the Pirbright Institute, and the UK’s national synchrotron facility, the Diamond Light Source near Oxford.
Diamond is a particle accelerator which sends electrons round a giant magnetic ring at near-light speed.
The electrons emit energy in the form of intense X-rays which are channelled along “beamlines” – into laboratories where they are used to analyse structures in extraordinary detail.
Synchrotrons have been used before to analyse viruses at the atomic level, but the technology has advanced considerably to enable scientists to create a stable synthetic vaccine.
“What we have achieved here is close to the holy grail of foot-and-mouth vaccines.
Unlike traditional vaccines, there is no chance that the empty shell vaccine could revert to an infectious form,” said Dave Stuart, Life Sciences Director at Diamond, and MRC Professor of Structural Biology at the University of Oxford.
“This work will have a broad and enduring impact on vaccine development, and the technology should be transferable to other viruses from the same family, such as poliovirus and hand-foot-and-mouth disease, a human virus which is currently endemic in South-East Asia.”
These human disease threats, like foot-and-mouth, are all picornaviruses.
Viruses are inherently unstable and fragile, but picornaviruses can be studied using X-ray crystallography.
This enables the protein shell of the virus to be analysed at the atomic level – something a billion times smaller than a pinhead.
As with any vaccine, the aim is to prompt the immune system to recognise this outer shell and destroy the pathogen before it has time to lock onto cells and infect them with its genetic material.
In this research the scientists created a synthetic viral shell, but lacking its pathogenic RNA interior – the genetic material the virus uses to replicate itself.
Crucially they were able to reinforce the structure of the viral shell to make it stronger, to improve the stability of the vaccine.
Pre-clinical trials have shown it to be stable at temperatures up to 56C for at least two hours. Foot-and-mouth is endemic in central Africa, parts of the Middle East and Asia, so this would be a significant improvement over existing vaccines.
With current foot-and-mouth vaccines it is difficult to distinguish between immunised livestock and those which have been infected.
That proved to be a major hurdle in controlling the foot-and-mouth outbreak in the UK in 2001 because it would have prevented the export of livestock.
But the synthetic vaccine should allow scientists to show the absence of infection in vaccinated animals.
“The foot-and-mouth-disease virus epidemic in the UK in 2001 was disastrous and cost the economy billions of pounds in control measures and compensation,” explained Dr Bryan Charleston, Head of Livestock Viral Diseases Programme at the Pirbright Institute.
“This important work has been a direct result of the additional funding that was provided as a result of the 2001 outbreak to research this highly contagious disease.”
The potential hazards of working with viruses were underlined in 2007, when the Pirbright laboratory site was identified as the source of a leak which led to an outbreak of foot-and-mouth disease.
Polio, another picornavirus, which exclusively affects humans, has been eliminated from nearly every country in the world, although it stubbornly persists in Nigeria, Pakistan and Afghanistan.
The need for secure vaccine production will become even more vital should polio be wiped out.
“Current polio vaccines, which use live virus for their production, pose a potential threat to the long-term success of eradication if they were to re-establish themselves in the population.
“Non-infectious vaccines would clearly provide a safeguard against this risk”, said Dr Andrew Macadam, a virologist specialising in polio at the National Institute for Biological Standards and Control in Hertfordshire.
“This technology has great potential in terms of cost and biosafety.
“Any design strategy that minimises the chances of accidental virus release would not only make the world a safer place but would lower the bio-containment barriers to production allowing vaccines to be made more cheaply all over the world.”
Permanent link to this article: http://www.animatedscience.co.uk/2013/synchrotron-yields-safer-vaccine
Planck telescope maps light of the big bang scattered across the universe
Permanent link to this article: http://www.animatedscience.co.uk/2013/planck-telescope-maps-light-of-the-big-bang-scattered-across-the-universe
LHC wraps up antimatter ‘flip’ story http://www.bbc.co.uk/news/science-environment-21594357
Permanent link to this article: http://www.animatedscience.co.uk/2013/lhc-wraps-up-antimatter-flip-story
Diamond to shine light on infections http://www.bbc.co.uk/news/science-environment-21481223
The UK’s national synchrotron facility – the Diamond Light Source near Oxford – is to become a world centre for studying the structure of viruses and bacteria that cause serious disease.
Diamond uses intense X-rays to reveal the molecular and atomic make-up of objects and materials.
It will now use this capability to image Containment Level 3 pathogens.
These are responsible for illnesses such as Aids, hepatitis and some types of flu.
Level 3 is one step down from the most dangerous types of infectious agent, such as Ebola, which can only be handled in the most secure government facilities.
“Viruses, as you know, are sort of tiny nanomachines and you can’t see them in a normal microscope.
“But with the crystallography and X-ray techniques we use, we are able to get about 10,000 times the resolution of the normal light microscope,” explained Dave Stuart, the life sciences director at Diamond and a professor of structural biology at Oxford University.
“This takes us from the regime of not being able to see them to being able to see individual atoms.
“And if we can look at ‘live’ viruses and get an atomic-level description of them, it opens up the possibility of using modern drug-design techniques to produce new pharmaceuticals.”
Prof Stuart was speaking in Boston at the annual meeting for the American Association for the Advancement of Science (AAAS).
Permanent link to this article: http://www.animatedscience.co.uk/2013/diamond-to-shine-light-on-infections
Star is caught devouring planet (from the BBC)
Astronomers have found evidence for a planet being devoured by its star, yielding insights into the fate that will befall Earth in billions of years.
The team uncovered the signature of a planet that had been “eaten” by looking at the chemistry of the host star.
They also think a surviving planet around this star may have been kicked into its unusual orbit by the destruction of a neighbouring world.
Details of the work have been published in Astrophysical Journal Letters.
The US-Polish-Spanish team made the discovery when they were studying the star BD+48 740 – which is one of a stellar class known as red giants. Their observations were made with the Hobby-Eberly Telescope, based at the McDonald Observatory in Texas.
Rising temperatures near the cores of red giants cause these elderly stars to expand in size, a process which will cause any nearby planets to be destroyed.
“A similar fate may await the inner planets in our solar system, when the Sun becomes a red giant and expands all the way out to Earth’s orbit some five billion years from now,” said co-author Prof Alexander Wolszczan from Pennsylvania State University in the US.
The first piece of evidence for the missing planet comes from the star’s peculiar chemical composition.
Spectroscopic analysis of BD+48 740 revealed that it contained an abnormally high amount of lithium, a rare element created primarily during the Big Bang 14 billion years ago.
Lithium is easily destroyed in stars, so its high abundance in this ageing star is very unusual.
“Theorists have identified only a few, very specific circumstances, other than the Big Bang, under which lithium can be created in stars,” Prof Wolszczan explained.
“In the case of BD+48 740, it is probable that the lithium production was triggered by a mass the size of a planet that spiralled into the star and heated it up while the star was digesting it.”
The second piece of evidence discovered by the astronomers is the highly elliptical orbit of a newly discovered planet around the red giant star. The previously undetected world is at least 1.6 times as massive as Jupiter.
Co-author Andrzej Niedzielski of Nicolaus Copernicus University in Torun, Poland, said that orbits as eccentric as this one are uncommon in planetary systems around evolved stars.
“In fact, the BD+48 740 planet’s orbit is the most elliptical one detected so far,” he added.
Because gravitational interactions between planets are often responsible for such peculiar orbits, the astronomers suspect that the missing planet’s dive toward its host star, before the star swelled into a giant, could have given the surviving massive planet a burst of energy.
This boost would have propelled it into its present unusual orbit.
Team member Eva Villaver of the Universidad Autonoma de Madrid in Spain commented: “Catching a planet in the act of being devoured by a star is an almost improbable feat to accomplish because of the comparative swiftness of the process, but the occurrence of such a collision can be deduced from the way it affects the stellar chemistry.
“The highly elongated orbit of the massive planet we discovered around this lithium-polluted red giant star is exactly the kind of evidence that would point to the star’s recent destruction of its now-missing planet.”
Permanent link to this article: http://www.animatedscience.co.uk/2012/star-is-caught-devouring-a-planet
This is a fantastic article I wanted to share…
Bose-Einstein condensate (BEC), a state of matter in which separate atoms or subatomic particles, cooled to near absolute zero (0 K, −273.15 °C, or −459.67 °F; K = kelvin), coalesce into a single quantum mechanical entity—that is, one that can be described by a wave function—on a near-macroscopic scale. This form of matter was predicted in 1924 by Albert Einstein on the basis of the quantum formulations of the Indian physicist Satyendra Nath Bose.
Although it had been predicted for decades, the first atomic BEC was made only in 1995, when Eric Cornell and Carl Wieman of JILA, a research institution jointly operated by the National Institute of Standards and Technology (NIST) and the University of Colorado at Boulder, cooled a gas of rubidium atoms to 1.7 × 10−7 K above absolute zero. Along with Wolfgang Ketterle of the Massachusetts Institute of Technology (MIT), who created a BEC with sodium atoms, these researchers received the 2001 Nobel Prize for Physics. Research on BECs has expanded the understanding of quantum physics and has led to the discovery of new physical effects.
BEC theory traces back to 1924, when Bose considered how groups of photons behave. Photons belong to one of the two great classes of elementary or submicroscopic particles defined by whether their quantum spin is a nonnegative integer (0, 1, 2, …) or an odd half integer (1/2, 3/2, …). The former type, called bosons, includes photons, whose spin is 1. The latter type, called fermions, includes electrons, whose spin is 1/2.
As Bose noted, the two classes behave differently (see Bose-Einstein and Fermi-Dirac statistics). According to the Pauli exclusion principle, fermions tend to avoid each other, for which reason each electron in a group occupies a separate quantum state (indicated by different quantum numbers, such as the electron’s energy). In contrast, an unlimited number of bosons can have the same energy state and share a single quantum state.
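The difference between the two statistics can be made concrete with their occupancy formulas. Below is a minimal Python sketch; the energy, chemical potential and temperature are arbitrary illustrative values chosen for the demonstration, not figures from the article:

```python
import math

K_B = 1.380649e-23  # Boltzmann constant, J/K


def bose_einstein(energy, mu, temp):
    """Mean number of bosons in a state at this energy (requires energy > mu)."""
    return 1.0 / (math.exp((energy - mu) / (K_B * temp)) - 1.0)


def fermi_dirac(energy, mu, temp):
    """Mean number of fermions in a state at this energy; never exceeds 1."""
    return 1.0 / (math.exp((energy - mu) / (K_B * temp)) + 1.0)


temp = 1e-7       # ~100 nK, the regime of the first atomic BECs
energy = 1e-30    # illustrative low-lying state energy, joules
mu = 0.999e-30    # chemical potential just below the state energy

n_bosons = bose_einstein(energy, mu, temp)   # large: bosons pile into one state
n_fermions = fermi_dirac(energy, mu, temp)   # stays below 1: exclusion principle
```

Near the lowest available state the Bose-Einstein occupancy grows without bound – the pile-up that becomes a condensate – while the Fermi-Dirac occupancy can never exceed one, reflecting the Pauli exclusion principle described above.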
Einstein soon extended Bose’s work to show that at extremely low temperatures “bosonic atoms” with integer spin would coalesce into a shared quantum state at the lowest available energy. The methods required to produce temperatures low enough to test Einstein’s prediction did not become available, however, until the 1990s. One of the breakthroughs depended on the novel technique of laser cooling and trapping, in which the radiation pressure of a laser beam cools and localizes atoms by slowing them down. (For this work, French physicist Claude Cohen-Tannoudji and American physicists Steven Chu and William D. Phillips shared the 1997 Nobel Prize for Physics.) The second breakthrough depended on improvements in magnetic confinement in order to hold the atoms in place without a material container. Using these techniques, Cornell and Wieman succeeded in merging about 2,000 individual atoms into a “superatom”, a condensate large enough to observe with a microscope, that displayed distinct quantum properties. As Wieman described the achievement, “We brought it to an almost human scale. We can poke it and prod it and look at this stuff in a way no one has been able to before.”
BECs are related to two remarkable low-temperature phenomena: superfluidity, in which each of the helium isotopes 3He and 4He forms a liquid that flows with zero friction; and superconductivity, in which electrons move through a material with zero electrical resistance. 4He atoms are bosons, and although 3He atoms and electrons are fermions, they can also undergo Bose condensation if they pair up with opposite spins to form bosonlike states with zero net spin. In 2003 Deborah Jin and her colleagues at JILA used paired fermions to create the first atomic fermionic condensate.
BEC research has yielded new atomic and optical physics, such as the atom laser Ketterle demonstrated in 1996. A conventional light laser emits a beam of coherent photons; they are all exactly in phase and can be focused to an extremely small, bright spot. Similarly, an atom laser produces a coherent beam of atoms that can be focused at high intensity. Potential applications include more-accurate atomic clocks and enhanced techniques to make electronic chips, or integrated circuits.
The most intriguing property of BECs is that they can slow down light. In 1998 Lene Hau of Harvard University and her colleagues slowed light traveling through a BEC from its speed in vacuum of 3 × 108 metres per second to a mere 17 metres per second, or about 38 miles per hour. Since then, Hau and others have completely halted and stored a light pulse within a BEC, later releasing the light unchanged or sending it to a second BEC. These manipulations hold promise for new types of light-based telecommunications, optical storage of data, and quantum computing, though the low-temperature requirements of BECs offer practical difficulties.
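The slowdown figure quoted above is easy to check with the numbers given in the text:

```python
C_VACUUM = 2.998e8   # speed of light in vacuum, m/s
v_bec = 17.0         # reported speed of light through the BEC, m/s

# Light travels roughly 18 million times more slowly inside the condensate:
slowdown = C_VACUUM / v_bec

# Converting 17 m/s to miles per hour reproduces the ~38 mph quoted:
mph = v_bec * 3600 / 1609.344
```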
Permanent link to this article: http://www.animatedscience.co.uk/2012/bose-einstein-condensate-bec
Researchers have come up with a way to glimpse the infant Universe by decoding the earliest ripples in its light.
They say this can be achieved by capturing the specific radio wavelength of 21cm from the heavens.
The trick is to tell the difference between 21cm waves from our galaxy and those from distant, ancient sources.
The fact that “dark matter” moved faster than normal matter in the early Universe should help amplify the distant signal, they report in Nature.
That could yield a look at the Universe when it was just 1% of its current age.
The scientists first revealed their 3-D computer simulations on Monday at the Gamma Ray Bursts in the Era of Rapid Follow-up conference, hosted by Liverpool John Moores University.
The current record-holder for the oldest object ever spotted is a galaxy named UDFy-38135539, seen in an optical image captured by the Hubble telescope. Its light escaped more than 13 billion years ago, when the Universe was already a youth of less than 700 million years.
What is redshift?
- The term “redshift” arises from the fact that light from more distant objects shows up on Earth more red than when it left its source
- The colour shift comes about because of the Doppler effect, which acts to “stretch” or “compress” waves from moving objects
- It is at work in the sound of a moving siren: an approaching siren sounds higher-pitched and a receding one sounds lower-pitched
- In the case of light, approaching objects appear more blue and receding objects appear more red
- The Universe is expanding, so in general, more distant objects are moving away from us (and each other, and everything else) more quickly than nearer ones
- At cosmic distances, the shift can profoundly affect the colour – the factor by which the wavelength is “stretched” is called the redshift
Scientists measure these literally astronomical distances with the “redshift” of a given light source; it is a measure of how much the source’s light is stretched as it races away from us in the ever-expanding Universe.
UDFy-38135539 has a redshift of 8.55, but the new work shows promise for looking at stars and galaxies at a redshift of 20.
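The stretch factor translates directly into numbers, and shows why a redshift-20 search needs its own wavelength range. A short sketch (the 21cm rest wavelength is the standard hydrogen-line value, not a figure from the article):

```python
REST_21CM = 0.21106  # rest wavelength of the hydrogen line, metres


def observed_wavelength(rest_m, z):
    """Cosmological redshift stretches a wavelength by a factor (1 + z)."""
    return rest_m * (1.0 + z)


def redshift(rest_m, observed_m):
    """z = (lambda_observed - lambda_rest) / lambda_rest."""
    return observed_m / rest_m - 1.0


# At redshift 20, the 21cm line arrives stretched to roughly 4.4 m,
# far outside the band an array tuned to redshift 10 would observe:
lam_z20 = observed_wavelength(REST_21CM, 20)
```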
However, if it works, the view will be a statistical one – astronomers will not actually see individual stars and galaxies, but rather be able to estimate how many objects of what sizes were around in those early days.
But instead of seeing only the largest and brightest objects, as studies with telescopes such as Hubble typically do, it should work down to galactic haloes as small as a millionth the mass of the Milky Way’s halo.
“It’s very small galaxies from very far away; it’s completely hopeless to see them individually with any telescope in the next few decades,” said Rennan Barkana of Tel Aviv University, a co-author on the study.
“That’s why this is so interesting – it’s an indirect detection of the whole population of these galaxies, but it would be a very clear confirmation that these galaxies are there,” Prof Barkana told BBC News.
The 21cm wavelength arises from changes within the atoms of hydrogen, the Universe’s most abundant element, and one that can tell us much about the early Universe before heavier elements were formed.
A key insight lies in the different speed limits for dark matter and normal matter in the early Universe, first pointed out in a 2010 Physical Review D paper.
The early Universe was shaped in part by pressure waves – just like sound waves – created in the wake of the Big Bang. Like air molecules shifted around by sounds, these waves carried and distributed normal matter in regular patterns we can now observe.
But dark matter, because it does not interact with normal matter, was not swayed by the waves, responding only to gravity.
The differing distributions of dark and normal matter in the early Universe changed just where the matter – mostly hydrogen – ended up, in turn changing where the 21cm emission should come from, and how intense it should be.
Averaged over the sky, there should be a greater variation in this signal than we see locally, and the new paper makes the case that heating by X-ray radiation in those early days should make this statistical fluctuation even easier to spot.
Prof Barkana said that although there are no current radio telescope arrays designed to catch these 21cm waves, several are under construction that could be put onto the task.
“This whole subject of 21cm cosmology is about to open up; there are at least four different groups building radio telescope arrays focussing on about redshift 10,” he said.
“But until now no one has had the incentive to build an array optimised for this (redshift 20) wavelength range.”
By Jason Palmer Science and technology reporter, BBC News
Permanent link to this article: http://www.animatedscience.co.uk/2012/dark-matter-tracks-could-give-earliest-view-of-universe
This is a great article which shows the process of having a theory and then trying to prove it, even if the experiment is huge!
The LHC has found no trace of the mysterious “super particles” whose existence supersymmetry predicts.
Results from the Large Hadron Collider (LHC) have all but killed the simplest version of an enticing theory of sub-atomic physics. Researchers failed to find evidence of so-called “supersymmetric” particles, which many physicists had hoped would plug holes in the current theory. Theorists working in the field have told BBC News that they may have to come up with a completely new idea. Data were presented at the Lepton Photon science meeting in Mumbai.
They come from the LHC Beauty (LHCb) experiment, one of the four main detectors situated around the collider ring at the European Organisation for Nuclear Research (Cern) on the Swiss-French border.
According to Dr Tara Shears of Liverpool University, a spokesperson for the LHCb experiment: “It does rather put supersymmetry on the spot.”
“There’s a certain amount of worry that’s creeping into our discussions” – Dr Joseph Lykken, Fermilab
The experiment looked at the decay of particles called “B-mesons” in hitherto unprecedented detail. If supersymmetric particles exist, B-mesons ought to decay far more often than if they do not exist. There also ought to be a greater difference in the way matter and antimatter versions of these particles decay.
The results had been eagerly awaited following hints from earlier results, most notably from the Tevatron particle accelerator in the US, that the decay of B-mesons was influenced by supersymmetric particles. LHCb’s more detailed analysis however has failed to find this effect.
Bitten the dust
This failure to find indirect evidence of supersymmetry, coupled with the fact that two of the collider’s other main experiments have not yet detected supersymmetric particles, means that the simplest version of the theory has in effect bitten the dust.

The theory of supersymmetry in its simplest form is that as well as the subatomic particles we know about, there are “super-particles” that are similar, but have slightly different characteristics. The theory, which was developed 20 years ago, can help to explain why there is more material in the Universe than we can detect – so-called “dark matter”.

According to Professor Jordan Nash of Imperial College London, who is working on one of the LHC’s experiments, researchers could have seen some evidence of supersymmetry by now. “The fact that we haven’t seen any evidence of it tells us that either our understanding of it is incomplete, or it’s a little different to what we thought – or maybe it doesn’t exist at all,” he said.

Disappointed

The timing of the announcement could not be worse for advocates of supersymmetry, who begin their annual international meeting at Fermilab, near Chicago, this weekend.
“Supersymmetry… has got symmetry and it’s super – but there’s no experimental data to say it is correct” – Professor George Smoot, Nobel laureate
Dr Joseph Lykken of Fermilab, who is among the conference organisers, says he and others working in the field are “disappointed” by the results – or rather, the lack of them. “There’s a certain amount of worry that’s creeping into our discussions,” he told BBC News. The worry is that the basic idea of supersymmetry might be wrong.
“It’s a beautiful idea. It explains dark matter, it explains the Higgs boson, it explains some aspects of cosmology; but that doesn’t mean it’s right. It could be that this whole framework has some fundamental flaws and we have to start over again and figure out a new direction,” he said.

Experimental physicists working at the LHC, such as Professor Nash, say the results are forcing their theoretical colleagues to think again. “For the last 20 years or so, theorists have been a step ahead in that they’ve had ideas and said ‘now you need to go and look for it’. Now we’ve done that, and they need to go scratch their heads,” he said.
That is not to say that it is all over for supersymmetry. There are many other, albeit more complex, versions of the theory that have not been ruled out by the LHC results. These more complex versions suggest that super-particles might be harder to find and could take years to detect.

Some old ideas that emerged around the same time as supersymmetry are being resurrected now there is a prospect that supersymmetry may be on the wane. One has the whimsical name of “Technicolor”.

According to Dr Lykken, some younger theoretical physicists are beginning to develop completely novel ideas because they believe supersymmetry to be “old hat”. “Young theorists especially would love to see supersymmetry go down the drain, because it means that the real thing is something they could invent – not something that was invented by the older generation,” he said.
And the new generation has the backing of an old hand – Professor George Smoot, Nobel prizewinner for his work on the cosmic microwave background and one of the world’s most respected physicists. “Supersymmetry is an extremely beautiful model,” he said. “It’s got symmetry, it’s super and it’s been taught in Europe for decades as the correct model because it is so beautiful; but there’s no experimental data to say that it is correct.”
Permanent link to this article: http://www.animatedscience.co.uk/2011/lhc-results-put-supersymmetry-theory-on-the-spot
This is an article from BBC Science – really useful for seeing how we define units!
By Jason Palmer Science and technology reporter, BBC News, Teddington
Studies of the clock’s performance, to be published in the journal Metrologia, show it is nearly twice as accurate as previously thought.
The clock would lose or gain less than a second in some 138 million years.
The UK is among the handful of nations providing a “standard second” that keeps the world on time.
However, the international race for higher accuracy is always on, meaning the record may not stand for long.
The NPL’s CsF2 clock is a “caesium fountain” atomic clock, in which the “ticking” is provided by the measurement of the energy required to change a property of caesium atoms known as “spin”.
By international definition, it is the electromagnetic waves required to accomplish this “spin flip” that are measured; when 9,192,631,770 peaks and troughs of these waves go by, one standard second passes.
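A quick numerical sketch of what that definition implies (my own illustration in Python, not part of the article): counting 9,192,631,770 wave peaks defines exactly one second, so even a miscount of a single cycle is a far larger error than the clock's quoted accuracy.

```python
# SI definition of the second (quoted in the article): 9,192,631,770 cycles
# of the caesium "spin flip" radiation pass in exactly one second.
CAESIUM_FREQ_HZ = 9_192_631_770

# Miscounting by just one cycle over one second would give this fractional error:
one_cycle_error = 1 / CAESIUM_FREQ_HZ
print(f"One-cycle error: {one_cycle_error:.2e}")  # ~1.09e-10
```

That is roughly one part in 10^10 – about five orders of magnitude coarser than the one-part-in-10^15 accuracy discussed below, which is why the researchers must control the radiation's frequency far more finely than simple cycle counting suggests.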
Inside the clock, caesium atoms are gathered into bunches of 100 million or so, and passed through a cavity where they are exposed to these electromagnetic waves.
The colour, or frequency, is adjusted until the spins are seen to flip – then the researchers know the waves are at the right frequency to define the second.
The NPL-CsF2 clock provides an “atomic pendulum” against which the UK’s and the world’s clocks can be compared, ensuring they are all ticking at the same time.
That correction is done at the International Bureau of Weights and Measures (BIPM) in the outskirts of Paris, which collates definitions of seconds from six “primary frequency standards” – CsF2 in the UK, two in France, and one each in the US, Germany and Japan.
For those six high-precision atomic pendulums, absolute accuracy is a tireless pursuit.
At the last count in 2010, the UK’s atomic clock was on a par with the best of them in terms of long-term accuracy: to about one part in 2,500,000,000,000,000.
What time is it, exactly?
- The international time standard is maintained by a network of over 300 clocks worldwide
- Their readings are sent by satellite and averaged at BIPM, a measurement institute in France
- But the “tick” of any one of them could drift out of accuracy, so BIPM corrects the average using six “primary frequency standards” in Europe, the US and Japan
- Their corrected result, “International Atomic Time”, is occasionally compared with the time-honoured measure of time by astronomical means
- Occasionally a “leap second” is added or subtracted to correct any discrepancy
But the measurements carried out by the NPL’s Krzysztof Szymaniec and colleagues at Pennsylvania State University in the US have nearly doubled the accuracy.
The strictest definition of the second requires the measurements to be made in conditions that, Dr Szymaniec said, are impossible to achieve in practice in the laboratory.
“The frequency we measure is not necessarily the one prescribed by the definition of a second, which requires that all the external fields and ‘perturbations’ would be removed,” he explained to BBC News.
“In many cases we can’t remove these perturbations; but we can measure them precisely, we can assess them, and introduce corrections for them.”
The team’s latest work addressed the errors in the measurement brought about by the “microwave cavity” that the atoms pass through (the waves used to flip spins are not so far in frequency from the ones that flip water molecules in food, heating them in a microwave oven).
A fuller understanding of how the waves are distributed within it boosted the measurement’s accuracy, as did a more detailed treatment of what happens to the measurement when the millions of caesium atoms collide.
Without touching a thing, the team boosted the known accuracy of the machine to one part in 4,300,000,000,000,000.
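As a back-of-envelope check on the headline numbers (a Python sketch of my own; the Julian-year length of 365.25 days is my assumption), a fractional accuracy of one part in 4.3 × 10^15 does correspond to roughly a second of drift over 138 million years:

```python
# Fractional accuracies quoted in the article
old_accuracy = 1 / 2.5e15   # 2010 figure: one part in 2,500,000,000,000,000
new_accuracy = 1 / 4.3e15   # revised figure: one part in 4,300,000,000,000,000

SECONDS_PER_YEAR = 365.25 * 24 * 3600   # Julian year, in seconds
elapsed = 138e6 * SECONDS_PER_YEAR      # 138 million years, in seconds

drift = elapsed * new_accuracy
print(f"Worst-case drift over 138 million years: {drift:.2f} s")  # about 1 s
```

The two figures quoted in the article are therefore consistent with each other: "less than a second in some 138 million years" is just the one-part-in-4.3 × 10^15 accuracy restated as an accumulated drift.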
But as Dr Szymaniec said, the achievement is not just about international bragging rights; better standards lead to better technology.
“Nowadays definitions for electrical units are based on accurate frequency measurements, so it’s vital for the UK as an economy to maintain a set of standards, a set of procedures, that underpin technical development,” he said.
“The fact that we can develop the most accurate standard has quite measurable economic implications.”
Permanent link to this article: http://www.animatedscience.co.uk/2011/uks-atomic-clock-is-worlds-most-accurate