How science works: the Higgs particle, and neutrinos that aren’t really faster than light

I didn’t really blog about the exciting Higgs boson verification* a couple of weeks ago. I was too busy reading about it myself, and was also on a work trip.

But it is thrilling. It’s exceedingly strong experimental support for a subatomic particle that was the missing piece of the Standard Model. It’s one more piece in the jigsaw we’re assembling about how the universe came to be and how it works. (But it’s clear we’re going to need more than just the Standard Model.)

I think the recent discovery is just as important as a public illustration of how science works: posit a theory that fits the facts, then set about testing it, trying both to confirm its predictions and to falsify them.

And the Higgs wasn’t the only newsworthy item at CERN this year to demonstrate this. Remember the big news several months ago, when experimental results appeared to show neutrinos travelling faster than the speed of light – something nothing should be able to do, according to Einstein’s theory of relativity? Those results have since been shown to be incorrect: the culprit was a loose fibre-optic cable in the experiment’s timing system. This guest post on the Freakonomics blog spells it all out really well.

Checking and re-checking: that’s how science works.

If you want more on the Higgs, I really like the way this guy explains what the discovery means.

*Pretty much. Within 5 standard deviations.
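That footnote is worth unpacking. “5 standard deviations” (5σ) is the particle-physics convention for claiming a discovery: the chance that background fluctuations alone would produce a signal at least that strong is minuscule. Here’s a quick back-of-envelope sketch of that probability, using only the Python standard library:

```python
from math import erfc, sqrt

def one_sided_p_value(sigma):
    """Probability of a standard normal fluctuation at or beyond `sigma`."""
    return 0.5 * erfc(sigma / sqrt(2))

# The 5-sigma discovery threshold: odds of roughly 1 in 3.5 million
# that the observed bump is just a background fluke.
print(f"{one_sided_p_value(5):.2e}")  # ~2.87e-07
```

So “within 5 standard deviations” isn’t a hedge at all – it’s about as sure as experimental physics ever gets before declaring a discovery.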

LHC gets scientists excited

You may have seen the press reports indicating that the Large Hadron Collider is – finally – up and running, and working.

But what’s working so far, exactly?

This article in the Guardian is written by John Ellis, one of the senior physicists at CERN. He’s very excited. But he explains what they’ve seen so far, what surprises we’ve already glimpsed, and why he’s encouraged.

Further LHC delays

Nooooo!

From the BBC:

A director at the Large Hadron Collider in Geneva has told BBC News that some mistakes were made in construction.

Dr Steve Myers said these faults will delay the machine reaching its full potential for two years.

The atom smasher will reach world record power later this month at 7 trillion electron volts (TeV).

But the machine must close at the end of 2011 for up to a year for work to make the tunnel safe for proton collisions planned at twice that level.

The machine only recently restarted after being out of action for 14 months following an accident in September 2008.
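For a sense of the numbers quoted above: the LHC collides two beams head-on, so the collision energy is simply twice the per-beam energy. A trivial sketch, just to make the arithmetic explicit:

```python
def collision_energy_tev(beam_energy_tev):
    """Centre-of-mass energy for two identical beams colliding head-on."""
    return 2 * beam_energy_tev

print(collision_energy_tev(3.5))  # 7.0  -- the record run: 3.5 TeV per beam
print(collision_energy_tev(7.0))  # 14.0 -- "twice that level": the design target
```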

LHC: Beams are back

From ScienceDaily:

Particle beams are once again zooming around the world’s most powerful particle accelerator — the Large Hadron Collider — located at the CERN laboratory near Geneva, Switzerland. On November 20 at 4:00 p.m. EST, a clockwise circulating beam was established in the LHC’s 17-mile ring.

From the BBC: pictures of the happy moment.

CERN’s Twitter feed shows that initial testing is complete and commissioning is now underway. Their tweets link to further info and animations.

Large Hadron Collider to get fired up again this weekend

Tests are going well, and it looks like CERN may fire up the Large Hadron Collider (LHC) again this weekend. They’ve shown that they can fire protons around the collider and that the detectors are working. If things continue to go well, they should soon be smashing particles.

You’ll remember that testing has been delayed because on 19 September 2008 a bad electrical connection caused a fault that damaged a bunch of the superconducting magnets. It’s taken over a year to repair the damage and put in safeguards against a repeat. I’m sure CERN are anxious to move ahead (but to avoid further delays whilst doing so).

And if you’ve forgotten what they’re trying to prove with all this particle-smashing nonsense, let me remind you:

LHC to fire up again in November

From the Guardian:

Last week, Cern announced that the LHC will finally begin firing protons around its 27km circular tunnel again in November. Initially, it will run at an energy of 3.5 tera-electronvolts (TeV) per beam – just half of what it’s meant to achieve at full blast, but still several times more than the LHC’s American competitor, the Tevatron at Fermilab, can manage. After operating at this lower level for a period, the energy will be increased to 5TeV per beam.

According to Cern spokesman James Gillies, the mood at Cern is optimistic.

“We’re looking forward to getting going,” he said. “There’s consensus that the choices that have been taken to run the machine safely at 3.5TeV per beam are good choices. They allow the machine operators to learn how to drive the machine, if you like, under what should be very easy conditions for them, and they don’t compromise the physics.”

Gillies is confident that there won’t be another serious mishap this time around.

IT throughput at the LHC data grid

A post this week from a work-related blog I read mentioned the Large Hadron Collider. This post was not really about the LHC itself, or CERN, or the experiments they’re doing – those have been heavily covered recently – but about the massive data-collection network behind the experimentation.

This page from CERN has information about the LHC Computing Grid (LCG). If you’re into data centres, have a read.

Having worked in networking for many years, I find the data throughput visualization tools really interesting. You can see, hour by hour, the data throughput for each of the experiments that make up the LHC. A couple of hours before I wrote this they hit 380 megabytes per second across all systems, with Alice (where they hope to detect quark-gluon plasma), Atlas (where they hope to spot the Higgs boson and evidence of dark matter, and to answer whether there might be more spacetime dimensions than we think), and CMS (similar goals to Atlas, but using different methods of detection) being by far the largest.

Switch to a daily view, though, and you’ll see that they’re actually in a very low data-collection mode at the moment: there was a local peak around 1250 MB/s back on 24 August. The largest average throughput this year was about 2100 MB/s in late May. Lots of data, and a really neat way of watching progress.
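Those rates add up quickly over a day. A rough conversion sketch, assuming a sustained rate and decimal units (1 TB = 10⁶ MB):

```python
def daily_terabytes(mb_per_s):
    """TB accumulated per day at a sustained throughput of `mb_per_s` MB/s."""
    seconds_per_day = 24 * 60 * 60
    return mb_per_s * seconds_per_day / 1_000_000

print(f"{daily_terabytes(380):.1f} TB/day")   # ~32.8  -- the snapshot rate above
print(f"{daily_terabytes(2100):.1f} TB/day")  # ~181.4 -- the late-May average
```

Even the current “very low” collection mode would fill a sizeable disk array every day.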

LHC data throughput year to date (click to enlarge)

A word on the similar goals of the Atlas/CMS detectors: I heard on BBC Radio 4 the other morning that there’s a pretty healthy dose of rivalry between the two scientific teams about who might make their discoveries first.
