
Some Thoughts on the Future of Computing

December 2, 2011


An article in The Economist states, “We are entering what some in the technology industry refer to as a post-PC era.” [“Beyond the PC,” 7 October 2011] The article clarifies that the post-PC era “does not mean that the personal computer is about to disappear”; rather, the post-PC era will be characterized by more mobile computing devices. The article explains:

“According to estimates from Gartner, a research firm, combined shipments of web-connected smartphones and tablet computers are likely to exceed those of desktop and laptop computers for the first time this year, putting PCs in the shade. According to Morgan Stanley, an investment bank, there could be 10 billion mobile devices in circulation by 2020. Many of these will use bite-size chunks of software known as ‘apps’, some 18 billion of which are likely to be downloaded this year. As mobile, web-connected devices become ubiquitous, the volume of data they produce will soar. Cisco, a technology company, reckons that by 2015 some 6.3 exabytes of mobile data will be flowing each month, or the equivalent of 63 billion copies of The Economist.”
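Those figures are easier to grasp with a quick back-of-the-envelope check. The sketch below (in TypeScript) simply divides Cisco's monthly estimate by the number of Economist copies; the byte conversion and the implied size of a single digital issue are my own working assumptions, not numbers from the article.

    // Rough sanity check of the Cisco comparison quoted above.
    // Assumption (mine, not the article's): 1 exabyte = 10^18 bytes.
    const monthlyMobileData = 6.3e18; // bytes per month, per Cisco's 2015 estimate
    const economistCopies = 63e9;     // 63 billion copies of The Economist
    const bytesPerCopy = monthlyMobileData / economistCopies;
    console.log(`${bytesPerCopy / 1e6} MB per copy`); // prints "100 MB per copy"

In other words, the comparison treats one digital copy of the magazine as roughly 100 megabytes.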

Obviously, the post-PC era will contribute to another so-called era — the era of big data. The article started me thinking about how computing might change in other ways in the future. Probably the best place to start such a discussion is with the heart of a computer — the microchip. Some of the more interesting research in this area is being done using light instead of electronic signals. IBM, for example, claims to have “made important progress toward creating silicon circuits that communicate using pulses of light rather than electrical signals.” [“IBM develops speed of light chip to chip communication device,” by Dario Borghino, Gizmag, 10 March 2010] IBM researchers are working on “a device called [a] nanophotonic avalanche photodetector (NAP), which, as detailed in the journal Nature, is the fastest of its kind and is a major step toward achieving energy-efficient computing that will have significant implications for the future of electronics.” Borghino continues:

“Working on the so-called avalanche effect, which occurs when a photon starts a chain reaction that involves more and more electrons to build up a significant electrical current, the device is part of an ongoing effort by IBM to develop photon-based computing and communication. ‘This invention brings the vision of on-chip optical interconnections much closer to reality,’ T.C. Chen, vice president of science and technology at IBM Research, commented. ‘With optical communications embedded into the processor chips, the prospect of building power-efficient computer systems with performance at the exaflop [billions of billions of floating point operations per second] level might not be very distant.’ The component engineered by IBM is not only by far the fastest of its kind, but also the most efficient on the energy front. It can receive optical information signals at 40 Gb/sec and multiply them tenfold using a mere 1.5V voltage supply — which can be provided by a regular AA-size battery — compared to the 20-30V supplies required by standard photodetectors.”

IBM researchers are not the only ones using light to create ever-faster computer chips. Researchers at the University of California at Berkeley are working on lasers grown directly on silicon to do the job. [“Berkeley Researchers Hope to Light Way with Tiny Laser,” by Don Clark, Wall Street Journal, 9 February 2011] Clark reports:

“Many tiny building blocks have already been developed, but lasers themselves remain a big hurdle; silicon doesn’t generate light like some other semiconductors. That has led to other approaches. Researchers at Intel, for example, developed what they call a hybrid silicon laser that exploits indium phosphide, bonding lasers made with that material to silicon chips that guide the laser beams to their destinations. But a Berkeley research team is claiming a more radical feat, essentially growing lasers on a silicon surface. The lasers, created from indium gallium arsenide, appear like tiny, hexagonal pillars; light circulates up and down the pillars and is amplified by a sort of optical feedback mechanism. ‘It’s like a spiral staircase,’ says Connie Chang-Hasnain, a UC Berkeley professor of electrical engineering and computer sciences who is principal investigator of a study reported in the journal Nature Photonics. ‘The light goes around the pillar.’ … The potential implications? One day, for example, microprocessor chips could more efficiently use many more electronic brains, all able to pass data to each other at ultra-high speeds. Such chips, in theory, could turn PCs into tools that could carry out chores now reserved for room-sized supercomputers at national laboratories.”

When it comes to analyzing big data, such computers would be godsends. The Holy Grail of computing, however, may be quantum computing. Quantum computing holds great promise, but the challenges that must be overcome to achieve it are equally great. An article in The Economist explains why researchers are so eager to overcome those challenges:

“In the bizarre world of quantum mechanics, … subatomic particles can exist in several states at once. Such ‘superposition’ means, for instance, that the property of an electron known as its spin can be not only ‘up’ (representing, say, one) or ‘down’ (representing zero) but also some combination of the two. In quantum computing, such superposed values are named qubits. Additional qubits can be added by a process called entanglement. Each extra entangled qubit doubles the number of parallel operations that can be carried out. A three-qubit device could run eight operations, a four-qubit system 16, and so on. At least in theory.” [“A quantum hop,” 24 June 2010]
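The doubling the article describes is easiest to see if you picture an n-qubit register as a list of 2^n amplitudes, one for every possible bit pattern, all of which a quantum algorithm can work on at once. Here is a minimal sketch in TypeScript; the helper function and sample sizes are mine, purely to illustrate the scaling the article cites.

    // An n-qubit register is described by 2^n amplitudes, one per bit pattern.
    // Each additional entangled qubit doubles the size of that list.
    function parallelAmplitudes(qubits: number): number {
      return Math.pow(2, qubits);
    }

    for (const n of [1, 2, 3, 4, 10]) {
      console.log(`${n} qubit(s): ${parallelAmplitudes(n)} amplitudes`);
    }
    // 3 qubits -> 8 and 4 qubits -> 16, matching the article's figures;
    // 10 qubits already gives 1,024.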

Although that sounds very promising, the article notes that “experimental quantum computers require exotic materials and work only at a tad above absolute zero.” The latest breakthrough, it reports, is silicon-based. It explains:

“Thornton Greenland of University College, London, and his colleagues, … first doped the silicon with phosphorus atoms. They then used a short, high-intensity pulse from a fancy laser to jolt electrons orbiting those phosphorus atoms into what is known as a Rydberg state. In this state, each atom’s quantum wave-function—its effective size—extends several billionths of a metre from its nucleus. That is huge in atomic terms, and permits the entanglement of atoms that would normally be too far apart to interact. This, in turn, means they can become collaborative qubits. Once their wave-functions had been expanded, the atoms were zapped again, inducing them to emit a controlled burst of light called a photon echo. This would be used to read out the result of any calculation the system had performed. … The result is something that could, in principle, do calculations and spit out the answers—and which could be made industrially using the sort of equipment that turns out conventional chips.”

Besides the fact that their system “does its computing in silicon,” it also operates at “the balmy temperature of -269˚C, or four degrees above absolute zero.” The article reports, “It might sound pretty chilly, but it is a temperature that can be maintained fairly easily using liquid helium, which boils at 4.22 degrees above absolute zero.”

 

Early last year, Dario Borghino reported that an assistant professor of physics at Princeton, Jason Petta, had “demonstrated a technique to isolate one or two electrons at a time and managed to control their behavior by purely electrical means.” [“Quantum computing researchers achieve control over individual electrons,” Gizmag, 9 February 2011] Although Borghino calls this “a very significant step forward that opens the door to high-performance quantum computing,” Petta remarked, “We are still at the level of just manipulating one or two quantum bits, and you really need hundreds to do something useful.” A month after Petta’s advancement was announced, Tannith Cattermole reported, “A new device developed by Harvard scientists which uses nanostructured diamond wire to provide a bright, stable source of single photons at room temperature represents a breakthrough in making this quantum technology a reality.” [“Quantum computing breakthrough uses diamond nanowires,” Gizmag, 12 March 2010] Each of these “quantum hops” brings quantum computing a step closer to reality.

 

While many of the advances discussed above will change the face of computing over the long term, a new web standard will likely change your computing experience in the near term. According to Don Clark, “A set of programming techniques called HTML5 is rapidly winning over the Web.” [“HTML5: A Look Behind the Technology Changing the Web,” Wall Street Journal, 11 November 2011] Clark explains:

“The technology allows Internet browsers to display jazzed-up images and effects that react to users’ actions, delivering game-like interactivity without installing additional software. Developers can use HTML5 to get their creations on a variety of smartphones, tablets and PCs without tailoring apps for specific hardware or the online stores that have become gatekeepers to mobile commerce. … ‘HTML5 is a major step forward,’ declares venture capitalist Marc Andreessen, who helped invent the first successful browser, Netscape, in the 1990s. Another Silicon Valley investor, Roger McNamee, predicts the technology will let artists, media companies and advertisers differentiate their Web offerings in ways that weren’t practical before. ‘HTML5 is going to put power back in the hands of creative people,’ he says. … Some 34% of the 100 most popular websites used HTML5 in the quarter ended in September, according to binvisions.com, a blog that tracks Web technologies. Resume searches by hiring managers looking for HTML5 expertise more than doubled between the first quarter and the third quarter, according to the tech job site Dice.com. The excitement has spread despite the fact that HTML5 is missing some key features. Many users, moreover, won’t notice striking differences from websites that use Flash.”

Clark explains that “HTML5 takes its name from hypertext markup language, the standard commands used to create Web pages. But the term is a catchall for multiple techniques to handle elements like typography, graphics and video, creating an app-like experience.” If you are only a user of the Internet, the magic behind the screen needn’t bother you. Your web surfing, however, should be a lot more entertaining.
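To make that “app-like experience” concrete, here is a minimal sketch of the sort of interactivity Clark describes: a few lines of TypeScript drawing on an HTML5 canvas element in response to mouse movement, with no plug-in installed. The element id and page setup are hypothetical, chosen only for illustration.

    // Assumes the page contains: <canvas id="demo" width="400" height="300"></canvas>
    const canvas = document.getElementById("demo") as HTMLCanvasElement;
    const ctx = canvas.getContext("2d")!;

    // Draw a small dot wherever the mouse moves: interactivity with no plug-in.
    canvas.addEventListener("mousemove", (event: MouseEvent) => {
      const rect = canvas.getBoundingClientRect();
      ctx.fillStyle = "steelblue";
      ctx.beginPath();
      ctx.arc(event.clientX - rect.left, event.clientY - rect.top, 4, 0, 2 * Math.PI);
      ctx.fill();
    });

The same script, served as part of an ordinary web page, runs on any modern smartphone, tablet or PC browser, which is precisely the cross-device appeal Clark’s sources point to.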

 

It wasn’t all that long ago that a useful, if not particularly powerful, computer would fill a large room. Each new breakthrough has resulted in smaller, more powerful computers. Sometimes we fail to see the wonder in devices like smartphones, but we shouldn’t. The day will surely come when quantum computing becomes a reality. Who knows what wonders the computing power of those machines will help us discover?
