The ozone hole and the depletion of the ozone layer are old news. We have had hard data on the layer for more than 20 years, and that data tells us, unequivocally, that the ozone layer is disappearing at a rate of about three percent per year. The process that drives this has also not been subject to much attention for some time: we thought we understood it very well. In a series of four reaction steps, an unstable chlorine monoxide radical (ClO) combines with a second ClO radical (a reaction mediated by collision with a third-party molecule) to form the dimer Cl2O2. A high-energy photon splits one chlorine atom off the dimer, and collisions with other molecules split off the second. The two free chlorine atoms then react with ozone molecules to recreate the two ClO radicals; over the whole cycle, two ozone molecules (O3) are converted into three oxygen molecules (O2).
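Since the chlorine is regenerated, the cycle is catalytic, and the net stoichiometry can be checked with a little bookkeeping. The sketch below is a minimal illustration (the step groupings and species labels are my own); it tallies what the four steps consume and produce, and everything except ozone and oxygen cancels out:

```python
from collections import Counter

# Each step as (reactants consumed, products formed); M (collision partner)
# and the photon are omitted since they don't change the tally.
steps = [
    ({"ClO": 2}, {"Cl2O2": 1}),                  # ClO + ClO + M -> Cl2O2 + M
    ({"Cl2O2": 1}, {"Cl": 1, "ClOO": 1}),        # Cl2O2 + photon -> Cl + ClOO
    ({"ClOO": 1}, {"Cl": 1, "O2": 1}),           # ClOO + M -> Cl + O2 + M
    ({"Cl": 2, "O3": 2}, {"ClO": 2, "O2": 2}),   # 2 x (Cl + O3 -> ClO + O2)
]

consumed, produced = Counter(), Counter()
for reactants, products in steps:
    consumed.update(reactants)
    produced.update(products)

net_in = consumed - produced   # species consumed overall
net_out = produced - consumed  # species produced overall
print(dict(net_in), dict(net_out))  # {'O3': 2} {'O2': 3}
```

All the chlorine species cancel, which is exactly why a small amount of chlorine can destroy a large amount of ozone.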
The rates of each step needed to be determined before it could be concluded that chlorine-based molecules are responsible for the depletion of the ozone layer, and chemists thought they had that nailed. The weak link in our knowledge of the reaction rates has always been the light-initiated cleaving of the chlorine dimer. Although various labs had attempted to measure this step, the reported reaction rates (more precisely, the absorption cross section of Cl2O2) varied by a factor of five, leaving this step poorly understood. It appears that atmospheric chemists weren't too concerned by this, because even the lowest reported value was fast enough. Models based on this reaction chain, combined with a few other, less significant paths, accounted for the observed ozone depletion. Case closed, right?
Wrong, apparently. Concerned over the wide distribution of values and the complacency of scientists, a crank working in his garage has managed to overturn 20 years of dogma in a blindingly simple experiment. Umm, no, that isn't correct either. In fact, a team of scientists from the Jet Propulsion Laboratory has put together a rather complicated experiment, one that allowed them to isolate the Cl2O2 molecule in a form much purer than had previously been obtained. They did this through a combination of laser-induced reactions, cooling, and trapping. Having obtained this purer form, the researchers were able to use UV lasers to cleave the chlorine from the dimer and measure the rate of the subsequent reactions. To their shock, they found that the reaction rate was not just at the lower end of the published results, but about an order of magnitude slower than the average of previously reported values.
Although this work needs to be replicated, it may have far-reaching consequences in both atmospheric physics and politics. First, using the new reaction rate, scientists can no longer account for 60 percent of the observed ozone depletion. Although chlorine-based catalytic reactions are still thought to be the major cause of ozone depletion, we no longer have a strong link between theory, experiment, and observation. This may give policy makers all the excuse they need to begin (or continue) ignoring the Montreal Protocol. More importantly, scientists had predicted, based on chlorofluorocarbon emissions and the chemical reaction pathways, a slow recovery of the ozone layer. We can no longer be confident about that either. Lastly, and this sounds like a bit of a stretch, we can't predict how ozone chemistry and global climate change will interact. Basically, all these reactions are temperature-dependent and probably depend on light from the sun. Previously, we thought the influence of climate change on ozone chemistry would be minor. Now the honest answer is that we don't know.
I will finish by explaining the line about a crank in his garage overturning scientific dogma. All of my life, I have been exposed to people who believe that scientists are more attached to their theories than they are to data. They think that scientists, as a group, persecute and suppress anyone who tries to demonstrate that the existing theory is in any way flawed. This is especially true when science connects with politics and/or our beliefs. I cannot think of a finer immediate example to present as a counterpoint.
Journal of Physical Chemistry, 2007, DOI: 10.1021/jp067660w
Sometimes, researchers report results that are absolutely astounding and I don't report them here because I can't find a way to convey the significance of what they have achieved. On other occasions, the press hypes the research to such an extent that I feel the only objective thing to do is to balance that with a view of why the research isn't such a huge step or is, in fact, complete rubbish. Now I find myself in unfamiliar territory, where the reported research is very interesting but there is a fatal flaw in their description of it and I am not sure if that invalidates all of the research, or just a part of their conclusions, or is completely irrelevant.
The researchers claim to have demonstrated some noncommuting operations on light that had been predicted, but not previously observed. A noncommuting set of operations is one that gives different results depending on the order in which the operations are performed. A perfect example is rotation: hold your left hand in front of you with the palm away from you, fingers up, and the thumb as horizontal as you can make it. Now rotate 90 degrees forward about the thumb (your palm should now be facing the floor). Now rotate 90 degrees anticlockwise about the fingers (your thumb should be pointing upwards). Now redo the rotations in the opposite order. The second rotation was about the axis pointing outwards from your body, so performed first it should leave your palm still facing forwards and your thumb pointing upwards. The first rotation was about the axis running parallel to the front of your body, so performed second it should leave your thumb pointing away from your body and your wrist feeling like it is dislocated. The difference between the results is an indication that the operations are noncommuting.
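The hand-contorting above can be verified with rotation matrices. In this sketch (the axis assignments and test vector are my own choices for illustration), rotating a vector 90 degrees about the x axis and then about the z axis gives a different result from doing the same two rotations in the opposite order:

```python
import numpy as np

def rot_x(deg):
    """Rotation about the x axis (think: the thumb)."""
    t = np.radians(deg)
    return np.array([[1, 0, 0],
                     [0, np.cos(t), -np.sin(t)],
                     [0, np.sin(t),  np.cos(t)]])

def rot_z(deg):
    """Rotation about the z axis (think: the fingers)."""
    t = np.radians(deg)
    return np.array([[np.cos(t), -np.sin(t), 0],
                     [np.sin(t),  np.cos(t), 0],
                     [0, 0, 1]])

v = np.array([0.0, 0.0, 1.0])  # a vector standing in for the hand

order1 = rot_z(90) @ rot_x(90) @ v  # x-rotation first, then z
order2 = rot_x(90) @ rot_z(90) @ v  # z-rotation first, then x

print(order1)  # approximately [1, 0, 0]
print(order2)  # approximately [0, -1, 0]
print(np.allclose(order1, order2))  # False: the rotations do not commute
```

The two orderings leave the vector pointing along entirely different axes, which is the matrix version of the dislocated wrist.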
A particularly weird example of noncommuting behavior is the relationship between the creation and annihilation operators of light. Basically, destroying a photon and then creating a photon does not give the same result as creating a photon and then destroying a photon. A team of researchers, mostly based in Florence, has elegantly demonstrated this. Controlled photon subtraction is achieved with a mirror of very low reflectivity and a detector in the path of the reflected photon. Statistically speaking, when the detector goes bing, a single photon has been removed from the light.
A single photon can be added back using a particular form of amplifier. A high energy photon can be split into two low energy photons. By choosing the high energy photon correctly, the generated photons are exactly the same color as those of the beam we want to add a photon to. Performing the addition in a specific crystal medium allows one photon to be added to the light while the second goes in a different direction and can be detected by a light detector—a bing from this detector means a photon has been added. After that, a second very low reflectance mirror can be used to perform a subtraction, which is detected by a third detector. Finally, the number of photons in the beam is measured.
When the light source is switched on and off rapidly, there are a number of possible results. Sometimes a photon is subtracted and nothing else occurs. Sometimes a photon is added and nothing else occurs. Occasionally a photon is added and then another photon is subtracted, or a photon is subtracted and then another photon is added. Finally, all three operations may occur: a photon is subtracted, a photon is added, and another photon is subtracted. If photon creation and annihilation were commutative, then subtracting and then adding a photon would give the same result as adding and then subtracting one. However, the researchers showed that the two orderings give different results.
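The operator algebra behind this can be checked numerically. In a truncated photon-number (Fock) basis, the annihilation operator a and the creation operator a† act as a|n> = sqrt(n)|n-1> and a†|n> = sqrt(n+1)|n+1>. Applied to a one-photon state, the two orderings both return the state to |1>, but with different amplitudes (the truncation size below is an arbitrary choice of mine, large enough for small photon numbers):

```python
import numpy as np

N = 12  # Fock-space truncation (arbitrary; ample for low photon numbers)

# Annihilation operator a in the number basis: a|n> = sqrt(n)|n-1>
a = np.diag(np.sqrt(np.arange(1, N)), k=1)
adag = a.T  # creation operator: adag|n> = sqrt(n+1)|n+1>

# Start from a one-photon state |1>
ket1 = np.zeros(N)
ket1[1] = 1.0

subtract_then_add = adag @ (a @ ket1)  # annihilate first, then create
add_then_subtract = a @ (adag @ ket1)  # create first, then annihilate

# Both end up back in |1>, but with coefficients 1 and 2 respectively,
# reflecting the commutation relation [a, adag] = 1.
print(subtract_then_add[1], add_then_subtract[1])  # 1.0 2.0
```

The mismatch in coefficients is the noncommutativity the experiment is after: order matters, even for a single photon.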
A very elegant experiment and case closed, right? Well no, I have a problem. The annihilation and creation operators are only noncommuting when applied to an incoherent light source. Think of it like this: light from a laser has phase coherence—that is, the photons have a strict spatial and temporal relationship to one another (called a mode). To add a photon to this beam, it must be in the same mode. When I subtract a photon, it can only be subtracted from the photons in the mode, because there are no other modes available. Under these circumstances, it doesn't matter which order I do addition and subtraction. However, with an incoherent light source, the photons have no fixed spatial and temporal relationship to each other (e.g., they are spread across many modes), so the adder can place a photon in any mode, provided one other photon is in that mode. In this way, the order of addition and subtraction can produce different results.
So, to do this experiment properly, the source of light must be incoherent; instead, the researchers used a pulsed laser. To make the light incoherent, they used ground glass to mess up the laser light. They refer to this as thermal light, but it ain't thermal light. After light goes through something like ground glass, it is spatially incoherent but still temporally coherent. We have seen this previously by looking at how laser light can be sent through an opaque substance. Furthermore, the light from the ground glass is then coupled into a single-mode optical fiber. This means the researchers are selecting a single spatial mode from the scattered light, so their light is coherent in every respect.
The only loophole that I can see is that the ground glass rotates, so they obtain a different scattered mode from one laser pulse to the next (the laser pulses are very short, so the glass doesn't rotate during a pulse). They don't tell us how fast their glass sample rotates or how long they average for, which means we cannot rule out statistical artifacts (e.g., a beat between the rotation rate of the ground glass and the pulse repetition frequency).
This is the second time I have seen results like this, and the researcher involved in the first case did not really answer this question, either. However, given that the session chair had just abused his power by giving an impromptu talk of his own to the effect that the results were rubbish (for different reasons), before calling for questions, I can forgive the less than coherent response to the questions that focused on coherence issues. Given this publication, however, it is time for us to see proper data that clearly demonstrates that the light is truly identical to thermal light.
Science, 2007, DOI: 10.1126/science.1148947
Nintendo has something of a surprise for us this week: two games that were never released outside of Japan. This is a departure from the usual weekly business of giving us games we've most likely already played or heard about; these games may be news to younger gamers or non-obsessive Mario fans. Let's see what we have:
Super Mario Bros.: The Lost Levels (NES, 600 points or $6)
This was Super Mario Bros. 2 in Japan, but it was deemed too hard for American audiences. In this game there is no two-player option, but the player can control either Mario or Luigi, who now control differently. This game was previously available in the Super Nintendo collection Super Mario All-Stars, but this is a great way to play a Mario Bros. variation that may be obscure to many people. A very neat addition to the Virtual Console.
Sin and Punishment (N64, 1200 points or $12)
Pretty expensive for a Virtual Console game, but when you start to talk about Treasure shooters that you have to import, downloading a game like this directly for $12 starts to sound like a steal. From the people who brought you Radiant Silvergun and Ikaruga, this is a rail-based shooter where your main goal is to dodge the incoming bullets while hopefully somehow finding the time to fire back. While not the prettiest game, this still features the hard-as-nails Treasure gameplay and should satisfy shooter fans. If nothing else, it's a game that North America has yet to have access to, and now anyone can play it. That's progress.
Not a bad Monday, even with only two games. I hope Nintendo reaches back into the vaults of its systems more often for games like these; while collectors may be annoyed by how easily these games can now be bought, the mainstream player now has more choices than ever when it comes to playing games they missed the first time around.
As expected, the reaction to the iPhone-breaking 1.1.1 firmware has been quite strong, with owners of hacked iPhones being particularly teed off. But as it turns out, Apple's policy of refusing to honor the warranty of hacked iPhones may be hurting owners of pristine iPhones, too. We're hearing that bringing any kind of bricked iPhone into an Apple Store is being taken as an admission that the phone was unlocked or otherwise hacked, leaving people with legitimate complaints out in the cold. As a result, a class action lawsuit over iPhone warranty refusals seems to be brewing on Apple's discussion boards.
Granted, a few forum posts seeking more members for the suit doesn't amount to much yet, but the posts could lead to something bigger if enough people respond. There are three proposed classes for the suit, covering the various degrees of iPhone hacking. Class one is made up of anyone who was refused service because they accessed the iPhone's flash storage but didn't install applications, including anyone who used software like iToner. Class two is made up of more adventurous iPhone owners, who installed third-party applications and were subsequently refused service for a hardware issue. Finally, class three is anyone who fully unlocked their iPhone, restored it before getting service, but was still denied.
Although many people have weighed in on how they feel about the issue, I'm not sure how many of those users actually belong to any of the classes above. All three classes require some degree of hacking, which many people realize might hurt their chances of getting service. Nonetheless, I'm sure we'll see a class action suit over the warranty refusals before long. I suspect the first will involve owners of unhacked iPhones that have had problems, followed by one from unhappy hackers.
Microsoft today unveiled another piece of its Office web strategy, and what is perhaps most interesting about it is what it isn't: entirely web-based. Rather, Microsoft's move hopes to leverage the success of their Office client software to drive users to a new online collaboration site dubbed Office Live Workspace.
Office Live Workspace is not an online office suite. The aim of OLW is simple: give web-connected users a no-cost place to store, share, and collaborate on Office documents. To that end, the company will give registered users 250 MB of storage space, which can be used to store documents "in the cloud" or even "host" them for comments by other users equipped with just a web browser (you will be able to manage the access rights of other users). However, and this is important: you cannot create new Office documents with this feature, nor can you edit documents beyond adding comments, without having a copy of Microsoft Office installed locally.
As you can see, this is not a "Google Docs killer" or even an "answer" to Google Docs. This is not an online office suite, it's "software plus service." Microsoft's move here protects the company's traditional Office business, in that it's really positioned as a value-add to Office, rather than an Office alternative. Microsoft has seen success with its business-oriented SharePoint offering, and Microsoft is taking a kind of "SharePoint Lite" approach with OLW.
Microsoft says that OLW will integrate back into the client software, which means that users will be able to manage OLW documents from within Office itself. You can save directly from Word, Excel or PowerPoint to the online storage. OLW will also allow users to store files from other office suites, although integration will not be complete.
At first blush, the 250 MB of storage space will be more than enough for casual users, but hardcore Office junkies will be quickly looking for more storage space. Indeed, the idea of "anywhere access" that Microsoft is pushing here is really only complete once users can store and search all of their documents online. Of course, in the Microsoft ecosystem, there are other tools that let you access files that are stored on a remote Windows PC, but complete integration would be better.
OLW will officially support Office 2003 and 2007 when it opens later this year. Currently users can pre-register for the beta here. We experienced problems with the pre-reg site earlier this morning, but eventually got through.
Microsoft's announcement called Office Live Workspace "among the first entries in the new wave of online services." Microsoft has plenty more up its sleeve, and for those of you looking for clues, continue to think about software plus service approaches to integration. Will we ever see a full-blown online office suite from Microsoft? Not any time soon, I'm afraid, as Microsoft's "software plus service" vision is just picking up steam. We'll see quite a bit more from Microsoft in this area over the coming year.
In related news, Microsoft Office Live, which has been aimed at businesses looking for web hosting, is going to be renamed "Office Live for Small Business." The company is really confusing users with its constant toying with the "Live" branding, and I think it's a mistake to bring the Office brand into something that has little to do with Office, as in the case of Office Live for Small Business. Contrary to what their names imply, Office Live Workspace and Office Live for Small Business have nothing to do with one another.