Platform: Xbox 360
Price: $59.99 (shop for this title)
Rating: M (Mature)
We've already reviewed the single-player campaign of Halo 3, and our thoughts on that experience speak for themselves. But the single-player game is only a small part of why people buy Halo 3, and the online features of the title are almost insanely ambitious. In this review we're going to look at what Bungie wanted to accomplish with the online portion of Halo 3 and how well they achieved those goals. People are still playing Halo 2 online in large numbers after almost three years, so Halo 3 has some big shoes to fill.
This review isn't an easy thing to write: Halo 3 features many, many multiplayer options spread across 11 maps, a co-op game with its own built-in scoring system, a new level editor called the Forge, and, of course, the countless game types and downloadable tweaks that the game supports (not to mention the social aspect of playing with the unwashed masses of Xbox Live). For the past week I've been playing at home, at friends' houses, online, offline… I've been living Halo 3 with other players. It's not a bad assignment to be given.
So let's see what's added when you throw more than one player into the mix. I'll drive, you grab the gun, and we'll see who we can shoot.
Load up on guns, bring your friends….
When you play through the campaign of Halo 3 in single-player mode, you start the scene, move forward, kill everything, move forward, see a cut-scene, move forward, repeat. The gameplay can feel a little monotonous, and the AI-controlled soldiers who are supposed to be fighting with you don't do much. God help you if you grab the Warthog’s turret and let the AI drive.
The enemy AI gets better on Heroic and Legendary, but it mainly makes you a more conservative player. I played through the entire game in two sittings, thinking I would never again play the campaign mode. For me, it was competition play and the Forge from now on.
How wrong I was.
Co-op makes the Warthog a very efficient killing machine
During my first co-op game I was riding on the back seat of the Mongoose, Halo 3's ATV-like transport, zipping between the Covenant Wraiths (large tanks) as they battered our forces. We were outgunned, surrounded by enemy turrets and vehicles, and didn't know what to do. So we kept moving, dodging fire where we could. "Get me close to that Wraith; I have an idea," I said, and my partner did so. I leapt off the Mongoose behind the Wraith, jumped on, and slammed a sticky-grenade onto the side of the vehicle. Of course, then everyone knew where I was, and opened fire. My shield was almost gone when I heard…
"I have ya!" my cohort said and pulled a quick U-turn, allowing me to again jump onto the back of the Mongoose and zip away, still alive. It was, if I say so myself, a daring maneuver on the difficulty we were playing at, and I was impressed that we both survived. This never could have happened in single-player; the coordinated attacks you can pull off with another good player are amazing to behold. The co-op in Halo 3 allows up to four players at once, and in the games I've played so far, I haven’t seen a hint of lag. You haven't lived until you've attacked an enemy's fortified position with a Scorpion tank providing covering fire while three of your teammates assault the base on foot. The co-op play makes the campaign feel alive, and you can do things working together that you wouldn't dream of alone. While playing the campaign by yourself can feel clunky and artificial, playing with other skilled players is like living the best buddy-action movie you've ever seen.
Playing through the first two Halos on co-op was fun, but nothing matches playing four-player co-op on Xbox Live, with each player having a full screen, voice chat, and no lag. This is the best possible way to play the game, and it is incredibly addicting.
The other aspect that makes co-op so enjoyable is the co-op scoring feature of the game. If you turn this on, the game begins to track who kills what and awards points for your performance. What you kill, how you kill it, how quickly you finish the mission, and how many times you die can boost or diminish your score, and at the end of each mission you get a nice little tally that lets you know how well you did in comparison with the other players.
This adds a whole new level to co-op play, with players either competing for the high score or choosing to work as a team, and you'll learn more about your style of play from your totals. For instance, I tend to get many more kills than the people I play with, but I also die a whole helluva lot. Learning this, I held back instead of charging in for melee kills, which helped the entire team as I stayed alive much longer. Replayability also goes through the roof with this feature enabled. Can you beat each mission while beating your old scores? Can you beat your times? Can you try to get double the kills of someone else?
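Bungie hasn't published the actual scoring formula, but as a rough illustration of how a tally like this might weight kills, speed, and deaths (every name and number below is made up for the example), picture something like:

```python
def coop_score(kills, style_bonus, mission_seconds, deaths,
               par_seconds=600):
    """Hypothetical co-op tally: kills and stylish kills add points,
    finishing under a par time adds a bonus, and deaths subtract."""
    base = kills * 10 + style_bonus
    time_bonus = max(0, par_seconds - mission_seconds)  # faster = more
    return base + time_bonus - deaths * 25

# An aggressive player: lots of kills, but lots of deaths
print(coop_score(kills=40, style_bonus=50, mission_seconds=540, deaths=6))
# A cautious player: fewer kills, no deaths
print(coop_score(kills=25, style_bonus=20, mission_seconds=540, deaths=0))
```

The point of a weighting like this is exactly what I found in practice: the death penalty means a charge-in-swinging style can score barely better than careful play, which is why hanging back helped my totals.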
You can also help your scoring by scouring the game for a series of skulls hidden in each mission. These skulls give you point multipliers and other effects as you carry them around. While there are guides to finding these, you'll have much more fun finding them for yourself.
Keep in mind that the best way to play co-op is online. The game offers splitscreen, and it's fun, but for some reason the game becomes letterboxed, eating up some of your screen real estate. No fun at all.
Co-op, along with the optional scoring system, adds literally hundreds of hours of play time to the campaign. There is always something new to try, new tactics to perfect with new players, and higher scores to go after. It's a thrill that playing alone just can't match. Time to make some friends.
Some of my recent articles on quantum computation have generated a bit of criticism for being negative. That's because the computational systems involved have been either light-based systems or trapped atom/ion systems, neither of which can be scaled up to make a useful quantum computer. Although quantum computers based on standard optical networking techniques have been talked about, things haven't gone beyond talk because optical components are big and can't get much smaller. The researchers who play with ions are usually willing to admit that their systems have no hope of scalability. Neutral atom gases on a chip have some potential, but only if quantum states can be communicated without wafting atoms around the place, not to mention that users must put up with state-of-the-art vacuum technology.
Enter superconducting quantum interference devices (SQUIDs), which are a promising candidate for scalable quantum computers. SQUIDs are made from a ring of superconducting material that is interrupted by a small break. The current circulating in the ring acquires a phase shift that depends on the properties of the break and a bias voltage. However, the current and the phase shift are quantum objects, meaning that they can be in superpositions of different states and can play the role of qubits. Furthermore, multiple-qubit operations (e.g., quantum logic) can be performed by controlling the inductive coupling (think of the SQUIDs as electrical transformers) between different SQUIDs. Unfortunately, SQUIDs are not perfect: the qubits never stay in their superposition states for very long; entanglement between two SQUIDs—the correlation between the states of the currents in different SQUIDs—is difficult to achieve and lasts for even less time; and the equipment needs to be operated at near absolute zero. Furthermore, as with most quantum computer implementations, there has been no way to easily transfer a qubit from one SQUID to a more distant one.
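To make the superposition talk concrete: a SQUID qubit, like any qubit, is a weighted combination of two basis states (here, schematically, the two circulating-current states; the notation is standard and not specific to this paper):

```latex
% Schematic qubit state: a superposition of the two basis states,
% with complex amplitudes alpha and beta normalized to one.
\[
  \lvert \psi \rangle = \alpha \lvert 0 \rangle + \beta \lvert 1 \rangle ,
  \qquad
  \lvert \alpha \rvert^{2} + \lvert \beta \rvert^{2} = 1
\]
```

Measuring the qubit yields state $\lvert 0 \rangle$ with probability $\lvert\alpha\rvert^{2}$ and $\lvert 1 \rangle$ with probability $\lvert\beta\rvert^{2}$; decoherence is what destroys the fixed relationship between $\alpha$ and $\beta$ over time.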
Now a multinational team of researchers has created a quantum bus to transfer states between SQUIDs. To do this, they have taken advantage of something called Rabi oscillations. Rabi oscillations are best described by optical experiments. Imagine one excited atom between two mirrors that form an optical cavity. The atom emits a photon that reflects back and forth between the mirrors; eventually it is reabsorbed by the atom, allowing the cycle to repeat. If we perform a measurement, we will either find the photon in the cavity or "in" the atom. If we perform the experiment lots and lots of times, we find that the probability of finding the photon in the cavity varies in a cyclic fashion. Practically, this means that if we excite the atom and wait for a particular period of time, we will find the photon in the cavity with a probability of one. If we wait twice as long, the probability of finding the photon in the cavity is zero.
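For an ideal, lossless cavity, that cyclic probability has a simple closed form: P(t) = sin²(Ωt/2), where Ω is the (vacuum) Rabi frequency set by the atom-cavity coupling. A quick sketch, with an arbitrary illustrative coupling rate, shows the behavior described above: wait half a Rabi cycle and the photon is in the cavity with probability one; wait a full cycle and it's back in the atom.

```python
import math

def p_photon_in_cavity(t, omega):
    """Ideal vacuum Rabi oscillation: probability the excitation is in
    the cavity (rather than the atom) after time t, coupling rate omega."""
    return math.sin(omega * t / 2) ** 2

omega = 2 * math.pi * 1e6   # 1 MHz coupling rate, purely illustrative
t_swap = math.pi / omega    # half a Rabi cycle

print(p_photon_in_cavity(t_swap, omega))      # ~1.0: photon in cavity
print(p_photon_in_cavity(2 * t_swap, omega))  # ~0.0: back in the atom
```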
The researchers manufactured a microwave cavity from a strip line (a thin conductor separated from a large conductive surface by an insulating material). The strip line acts just like an optical cavity but for microwave radiation, where the ends of the strip line form the mirrors. Inside the strip line, they manufactured two SQUIDs—these will play the role previously occupied by atoms. The key thing here is that the frequency of light emitted and absorbed by the SQUIDs can be tuned by adjusting the voltage across the junction. In this way, the SQUID can be placed in resonance with the microwave cavity. When it is in resonance with the cavity, photons in the cavity can excite the SQUID to a higher level state or, if the SQUID is already excited, it can emit a photon into the cavity. That photon will then cycle back and forth between the cavity and the SQUID in exactly the same way that the photon cycled between the atom and optical cavity. To transfer a state from one SQUID to another, the researchers tune both SQUIDs off resonance and prepare one SQUID in an arbitrary quantum state. The SQUID is then placed in resonance with the cavity and, at the appropriate moment, returned to a non-resonant state. This transfers the state of the SQUID into the cavity with a probability of one. The second SQUID is then tuned to the cavity resonance and again the researchers wait for a short time before detuning the SQUID from resonance. The quantum state now resides in the second SQUID with a probability of one.
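If you treat each on-resonance half Rabi cycle as a perfect swap of the excitation, the transfer sequence just described reduces to two swaps in a row: SQUID 1 to cavity, then cavity to SQUID 2. A minimal sketch of that bookkeeping (names are illustrative, and real devices lose fidelity at every step):

```python
# Hedged sketch of the quantum-bus transfer sequence: each tuning of a
# SQUID into resonance for half a Rabi cycle is modeled as a perfect SWAP.

def half_rabi_swap(a, b):
    """On resonance for half a Rabi cycle, the excitation is exchanged."""
    return b, a

squid1, cavity, squid2 = "psi", None, None   # prepare SQUID 1 in state psi

# Step 1: tune SQUID 1 into resonance, wait half a Rabi cycle, detune.
squid1, cavity = half_rabi_swap(squid1, cavity)   # state now in the cavity
# Step 2: tune SQUID 2 into resonance, wait half a Rabi cycle, detune.
squid2, cavity = half_rabi_swap(squid2, cavity)   # state now in SQUID 2

print(squid1, cavity, squid2)  # None None psi
```

The timing is the whole trick: detune too early or too late and the excitation is left partly in the cavity, which is one reason the real-world transfer probability falls short of one.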
Although the researchers only demonstrated the technique with two SQUIDs, there is no reason for it not to scale. The strip line can be made much longer and SQUIDs placed at intervals along the cavity. Arbitrary transfers between two SQUIDs can then be achieved by adjusting voltages on the correct sequence of SQUIDs. Unfortunately, the cavities aren't very high quality and the SQUIDs aren't perfect either, which means that the probabilities don't cycle between one and zero as they would in the ideal case. In the real world, the peak is about 40 percent, and I don't know if that is good enough for a quantum bus. On the other hand, error correction is a built-in necessity for quantum computing, so this may be good enough.
This is probably where the future lies in quantum computing; small scalable devices with short coherence times. In the meantime, we can expect further complicated trapped ion work and optical tables covered in fiber optic cabling, mirrors, and lenses.
Nature, 2007, DOI: 10.1038/nature06124
Nature, 2007, DOI: 10.1038/nature06184
It's become clear over the last few decades that life, from birds to bacteria, can sense the orientation of the Earth's magnetic fields. Some of the basic proteins involved in performing this sensing have even been identified. But, in birds at least, merely recognizing the direction of the field is only a small part of the problem; the sensory input must be interpreted by the brain and integrated with other information to help guide the animal's migratory behavior. A new, Open Access publication in PLoS ONE builds on past data to strengthen the case that birds perform this sensory integration via a specialized adaptation of the visual system. In other words, they perceive magnetic fields using part of the visual system, so in a very real sense birds "see" the magnetic field.
The work takes advantage of the previous identification of a gene, called ZENK, that is expressed in neurons after they have been active. Using ZENK activity as a marker, a group of cells in the forebrain of birds—called "Cluster N"—was shown to be active when migratory birds performed magnetic orientation in the dark. Cluster N was located adjacent to an area of the forebrain that helps process visual information. Other research has shown that the proteins believed to detect magnetic changes are present in the retina of birds. What was lacking was a connection between the two.
The new work makes that connection. The researchers injected a green dye into Cluster N and a red dye into the retina of birds that perform magnetic orientation. Those dyes diffused down the length of the axons that connect these structures and wound up in the same place: a structure called the dorsolateral geniculate complex, which is involved in visual perception. Other experiments showed that the connections between these structures closely parallel the wiring of the visual system. The authors refer to it as involving "restricted subregions" of the visual pathway.
The authors conclude by saying, "our findings strongly support the hypothesis that migratory birds perceive the magnetic field as a visual pattern and that they are thus likely to 'see' the magnetic field." This seems to be how things typically happen in biology: rather than evolving something entirely new, natural selection tends to take advantage of an existing system, modifying it just enough to do something new. In this case, the ability to interpret spatial information that's present in the visual system makes a nice fit for interpreting the magnetic field.
PLoS ONE, 2007. DOI: 10.1371/journal.pone.0000937
The Linux Foundation has announced a new collaborative agreement with Japan's Information Technology Promotion Agency (IPA), a government research institute that promotes information technology development and broadly supports the use of open standards and open-source software.
The collaborative agreement is part of a plan to mutually assist in promoting open standards and the acceleration of open-source software adoption in Asia. The Linux Foundation will be participating in the upcoming IPA Forum 2007 User Conference in Tokyo.
"Our two organizations are leading the adoption and use of Linux and open-source software, and by working together on joint summits, technology developments and legal activities, we can help Japanese companies promote the use of Linux," said IPA chairman Buheita Fujiwara in a statement. "Japanese open-source software will continue to play a very important role in the worldwide open source revolution."
The Linux Foundation was formed earlier this year when the Open Source Development Labs merged with the Free Standards Group. The continuing evolution of the commercial Linux ecosystem had steadily diminished the need for two separate organizations, and the merger consolidated their efforts to better reflect the present-day market. The Linux Foundation has since started several new projects of value to the Linux community, including the creation of the Linux Weather Forecast and hosting upcoming legal summits.
The IPA has prior ties with the Open Source Development Labs, most notably as a contributing member of the Data Center Linux working group and as the primary source of funding for the Data Center Linux Open Printing project. The IPA is also heavily involved with open-source advocacy efforts in Asia as a facilitator of the Japan OSS Promotion Forum, which aims to broaden software choice in Japan and is a member of the Northeast Asia OSS Promotion Forum.
Open-source software is rapidly becoming an essential part of Asia's IT infrastructure. According to analyst firm Gartner, over 60 percent of large and mid-size government agencies in Asia will use open-source software for critical components of their operations by 2010.
Last month, the Linux Foundation announced that the Asianux 3.0 Linux distribution conforms to the Carrier Grade Linux specification, a set of standards for security, performance, and availability devised by the Linux Foundation.
Governments are beginning to acknowledge that open technologies will play a crucial role in bringing lower costs and greater empowerment to IT in coming years. This collaborative agreement will hopefully provide insight into how governments can work with the open-source software community to ensure that the tangible benefits of leveraging open-source software are realized.
Here's one for you rumor-mongers early in the morning. It's long been speculated that Apple is working on a second version of the iPhone that will be stripped down a bit from the current iteration (often referred to as the "iPhone nano"). September has unsurprisingly come and gone without its introduction, and so rumors about the second iPhone have quieted down for now. But what if there was some evidence that Apple is, in fact, planning to introduce a new iPhone?
This can be considered a little bit of a stretch, but the iPhone Feedback page on Apple's website lends us one clue. The page looks innocent enough, until you view the HTML to the page. There, you will find a hidden field with the product name that will be submitted along with the entered information. That product name is "iPhone Extreme."
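For those who don't make a habit of viewing source: a hidden field is just an `<input type="hidden">` element that ships a value with the form without displaying it. The snippet below reconstructs the kind of markup the article describes and pulls out the hidden fields with Python's standard-library parser; the surrounding form markup is invented for illustration, not copied from Apple's page.

```python
from html.parser import HTMLParser

# Illustrative markup only; the product value is the one reported above.
PAGE = '<form><input type="hidden" name="product" value="iPhone Extreme"></form>'

class HiddenFieldFinder(HTMLParser):
    """Collects name/value pairs from <input type="hidden"> tags."""
    def __init__(self):
        super().__init__()
        self.hidden = {}

    def handle_starttag(self, tag, attrs):
        a = dict(attrs)
        if tag == "input" and a.get("type") == "hidden":
            self.hidden[a.get("name")] = a.get("value")

finder = HiddenFieldFinder()
finder.feed(PAGE)
print(finder.hidden)  # {'product': 'iPhone Extreme'}
```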
As Tom from iPhoneBugList.com points out, the iPhone Extreme moniker doesn't necessarily indicate a more "extreeeeeeme" version of the iPhone, but rather a different naming convention than what we're used to at the moment. It makes sense that once a less-expensive, less-feature-rich version of the iPhone is released, it will be dubbed the iPhone (or iPhone mini, or iPhone nano, or iPhone somethingelse) while the current iPhone may gain an "Extreme."
That, or this code snippet means absolutely nothing. If that's the case, given what Apple knows about crazed zealots who check the code on various web pages regularly (*cough*), it either seeded this for our extreme amusement or is just plain dumb. I'm going to bet on the former. In the case of the latter, however, we can probably expect this code to mysteriously change to "iPhone" later in the day.