Silicon Week: 10 biggest moments in computing history

Then and now

There have been huge advancements in computing technology since the early 1900s, when calculators and typewriters were newfangled contraptions. And in the 70-odd years since the true birth of the computing era, there has been an exponential explosion in technology and capabilities.

No longer are we in the age of room-sized computers. No longer are we in the age of analog. Still, it's worth looking back at how far we've come in order to imagine where we're going. With that in mind, here are our top 10 biggest moments in computing history.

This article is part of TechRadar's Silicon Week. The world inside our machines is changing more rapidly than ever, so we're exploring everything to do with CPUs, GPUs and all the other forms of the most precious metal in computing.

1939: Alan Turing's Mad Genius

Born in 1912 in Paddington, London, Alan Turing left a remarkable legacy in computing. By the mid-1930s, he was writing mathematical papers on computability and conceptualizing Turing machines, the theoretical forerunners of modern computers. During the Second World War, he led a group of cryptologists and scientists at Bletchley Park attempting to decipher German messages, though, unlike in the movie retelling, it was actually a team of Polish cryptologists who first cracked the Enigma code.

In 1939, Turing, fellow codebreaker Gordon Welchman and their team designed the giant machine that made decrypting the daily traffic practical. Named the Bombe (or 'Christopher' in the movie), it was a complicated electrically powered machine built to recover the daily key settings used to encrypt German messages.

1941: First Mouse

The concept of the 'mouse' dates back to 1941, though the first publicly demonstrated mouse didn't come into being until 1968.

In 2003, the retired Professor Ralph Benjamin of Bristol, UK, stated that he was the original creator of a mouse-like device. Secretly patented in 1946 by the British Royal Navy, his 'ball tracker' used a metal ball rolling between wheels to send coordinates to a radar system; however, it wasn't public knowledge until decades later.

Doug Engelbart and his team at the Stanford Research Institute (SRI) created a wooden shell on two wheels that sent movement information back to the computer. Engelbart and his team publicly demonstrated the wooden mouse in 1968, and by 1973 point-and-click computers were already in development.

1971: Microprocessor Computing

In the early days, computers were huge. They were impressive, to be sure, but devoting entire buildings to a single machine wasn't exactly efficient. Computing components grew smaller and smaller, and one of the key inventions behind that shrinking was the microprocessor.

Intel developed the first micro-sized central processing unit (CPU), the 4004, in 1971. It could process a whole 4 bits at any given time. The 8-bit Intel 8008 followed a year later, and its descendant, the 8088, went on to power the first IBM PC. That lineage kicked off the Pentium series in only five model revisions, and by the end of the run the chips had roughly 20,000 times the transistors and a nearly 3,000-times-quicker clock speed.

1981: Portable Computers

The lightweight laptops and tablets of today bear little resemblance to the original 'portable computer'. The first commercially successful portable, the Osborne 1, boasted a 4MHz CPU, 64 kilobytes of memory, a detachable (but wired) keyboard that doubled as part of the protective casing for transport, and a built-in 5-inch black-and-white screen. Back in 1981, that was the pinnacle of computing technology.

Taking inflation into account, the Osborne 1 came in at around $4,500 (about $1,800 at the time). Even so, it spawned a mobile-computing revolution. Even 35 years ago, the first portable computer came with some of the first computer games: a text-based game and, later, Pong, installed via 5.25-inch floppy disks.

1984: Birth of the Macintosh

During the rise of personal (desktop) computing, there were the Commodore 64, the IBM Personal Computer and the Macintosh 128K. Before the Macintosh hit the market, though, most machines were black-and-white-screened computers, many with Motorola processors (Apple's original supplier of Mac CPUs), running mostly DOS-style, text-based interfaces. In the early days, Apple and Microsoft were even friendly, with some Apple machines running early Microsoft software.

After Apple and Microsoft split on not-so-friendly terms, Steve Jobs accused Microsoft of stealing Apple's graphical user interface. That spat ignited the Apple versus Microsoft war that rages on today. Arguably, both got a huge boost from the user interface created in the Xerox PARC labs that their leaders admired. Apple's Macintosh debuted in 1984 with a Super Bowl ad directed by Ridley Scott.

2005: Apple and Intel

Since the early 1990s, Apple had been using PowerPC processors, developed with IBM, in machines like the PowerBook (the MacBook's precursor), but it eventually dropped IBM's chips over a lack of significant gains in computing power. In 2005, after nearly 15 years of PowerPC, Apple announced it was moving its popular line of computers to Intel x86 processors. Only three years after transitioning to Intel CPUs, Apple released the Intel-only OS X 'Snow Leopard', effectively rendering PowerPC Macs obsolete.

The switch came around the same time as the MacBook, allowing Apple to expand its mobile lineup with the MacBook Air and, eventually, the revived MacBook. Apple supported both PowerPC and Intel processors for quite a while, until finally doing away with PowerPC support in late 2013.

2002: The dawn of hyperthreading

In 2002, Intel revolutionized another aspect of computer processing with hyperthreading. Within each core of a processor, hyperthreading parallelizes work by presenting the core as two virtual (logical) processors, allowing two streams of tasks to be processed simultaneously. Intel's own analogy was turning a one-lane highway into a two-lane highway, speeding up traffic in all lanes (or, in this case, all streams of work). In multi-core processors, multiple tasks run through each core at the same time, ensuring the quickest processing possible.

While it was left on the back burner for a stretch, hyperthreading is back in full swing in Intel's 6th-generation 'Skylake' processors. Many computers running Intel Core i-series chips use hyperthreading, including Apple and Microsoft computers, tablets and more.
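
One practical way to see hyperthreading from software is to compare the number of physical cores with the number of logical processors the operating system reports. The short Python sketch below is only a rough illustration of that idea; it assumes the third-party psutil package is installed, and it simply flags when the logical count exceeds the physical one.

# Rough illustration: a hyperthreaded CPU exposes more logical
# processors (hardware threads) than physical cores.
import os

import psutil  # third-party package, assumed installed: pip install psutil

logical = os.cpu_count()                    # logical processors seen by the OS
physical = psutil.cpu_count(logical=False)  # physical cores reported by psutil

print(f"Physical cores:     {physical}")
print(f"Logical processors: {logical}")

if physical and logical and logical > physical:
    print("Hyperthreading (or another form of SMT) appears to be enabled.")
else:
    print("No sign of hyperthreading on this machine.")

On a dual-core laptop with hyperthreading enabled, for example, this would typically report two physical cores and four logical processors.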

2006: AMD Buys ATI

AMD (Advanced Micro Devices) was founded in 1969 and created microchips and circuits for early computer components, later expanding into many kinds of processors and manufacturing parts for Intel chips. Eventually, it created its own spin-offs of the Intel lineup to offer competitive products against the computing behemoth.

ATI started making integrated graphics processors for computers in 1985 and created stand-alone graphics processors by the early 90s.

But it was a big shakeup in the computing world when the two merged in 2006. Both already huge names in the business, AMD bought ATI for $4.3 billion plus 58 million shares of stock (around $5.4 billion total). The big name in chipsets buying the big name in graphics processing to create one mega computing dynamo was a big deal. Though the ATI name only lasted a few more years before it was retired, the Radeon graphics line it created has endured to this day.

2013: Console crazy

While the first console was arguably the Atari 2600, with the Nintendo Entertainment System soon to follow, it was the introduction of the Sega Genesis that truly sparked the first great console war.

However, over time even Nintendo's popularity waned, leaving Sony and Microsoft to duke it out for the top spot in console computing since 2001. The heated competition ushered in a new age of entertainment and versatile computing.

The Xbox had better specs, but the PlayStation 2 had the longevity and the following, Sony having released the first PlayStation back in 1994. In an upset, the Xbox 360 dethroned the PS3 as the top console of the last generation. Now the tables have turned once again in the PS4's favor over the Xbox One, though the recent introduction of the Xbox One S is turning the tide for Microsoft.

Even fifteen years later, the console wars haven't changed much. What has changed, though, is that the two systems are closer than ever, with both essentially running PC-class x86 processors. Both consoles have essentially become little PCs, and Microsoft only means to merge the two platforms more closely with Xbox Play Anywhere and Windows 10.

20xx: Artificial intelligence

And so we end back where we began: with Alan Turing.

The main identifier of artificial intelligence is that the machine is a freely and critically thinking entity, not just a set of programmed responses to standardized inputs. Turing devised his three-participant test for genuine machine intelligence some 50 years before any machine had even the remotest AI capabilities.

Described as 'the imitation game' (yes, like the movie), the test's goal is for the computer to successfully imitate a human. A computer (A) and a human (B) each answer questions from an interrogator (C), and the computer passes if it convinces the interrogator that it is the human.
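
As a toy illustration of that setup (not anything Turing himself wrote), the Python sketch below wires a pretend machine and a pretend human to an interrogator who has to guess which label hides the computer; both respond functions are hypothetical placeholders.

# Toy sketch of the imitation game described above. The two respond_*
# functions are placeholders, not a real contestant program or person;
# the point is only the structure of the test.
import random


def respond_as_machine(question: str) -> str:
    # Placeholder: a real entrant would try to generate a human-like reply.
    return "I suppose it depends on how you look at it."


def respond_as_human(question: str) -> str:
    # Placeholder: in the real test this is a person typing an answer.
    return "Honestly, I'd have to think about that one."


def imitation_game(questions):
    # Hide which label is the machine so the judge can't rely on position.
    labels = ["A", "B"]
    random.shuffle(labels)
    machine_label = labels[0]
    players = {labels[0]: respond_as_machine, labels[1]: respond_as_human}

    print("Interrogator's transcript:")
    for question in questions:
        for label in sorted(players):
            print(f"  C asks {label}: {question}")
            print(f"  {label} answers: {players[label](question)}")

    guess = input("Which one is the machine, A or B? ").strip().upper()
    # The machine 'passes' this round if the interrogator guesses wrong.
    print("Correct guess." if guess == machine_label else "Fooled: the machine passed this round.")


if __name__ == "__main__":
    imitation_game(["Do you enjoy poetry?", "What is 2,364 multiplied by 17?"])

With canned one-line answers like these the judge would not be fooled for long, which is exactly the gap between scripted responses and the free, critical thinking the test is meant to detect.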

Nothing has managed to pass Turing's test yet. Even one of the most advanced robots in the world, Honda's running and dancing bipedal ASIMO (Advanced Step in Innovative Mobility), doesn't have the intelligence to qualify as artificial intelligence. It may have a humanoid look and speech pattern, but it wouldn't be able to live up to Turing's test.

In Halo, Cortana is a true artificial intelligence, a glimpse of what may be ahead for us. Yet Microsoft's current version of Cortana and Apple's Siri aren't nearly there, as both still rely on canned responses to anticipated questions. While true artificial intelligence hasn't yet come to fruition, that hasn't stopped us from trying.


August 22, 2016 at 07:00PM
Deanna Issacs
