Apple Watch? Commodore did it first: Best of the Web

Andrew Sadauskas

After many years of research, computer technology had finally been combined with the wristwatch. The question was how the world’s tech giants were going to fare against jewellery makers.

The year? 1976.

Over at Technologizer, Harry McCracken has an interesting look back at what happened when Commodore (the maker of the Commodore 64 and Amiga) became a watchmaker:

The company entered the watch field when digital watches were still a newfangled wonder, having debuted when Hamilton released its first Pulsar in 1972. That model sold for $2100, which was more than a new Ford Pinto went for at the time. By the time Commodore released its first models three years later, digitals had become mass-market items. The company was part of a great price plunge, much as it would be in the 1980s as the Commodore 64 went from its starting price of $595 to selling for under a hundred bucks.

Intel, for instance, had bought watchmaker Microma in 1972, when digital models were still high-ticket items and the market for its microprocessors barely existed; Micromas were some of the first consumer products with Intel Inside. (They also had LCD displays that didn’t make you press a button to see the time.) At the high end, Hewlett-Packard introduced the HP-01, an amazing $650 calculator watch with 28 minuscule buttons you had to press with a tiny stylus. On the low end, Texas Instruments helped to knock the price of digital watches down to $10 by flooding the market with plasticky models.

It will be interesting to see if history repeats.

Is Mozilla the new Sun Microsystems?

In a past Control Shift, I've taken a look at Mozilla and how it's diversifying into a range of different products, largely to cut its reliance on search revenue from Google.

However, Paul Krill at InfoWorld notes that this has Mozilla juggling a growing collection of side projects while market share for its core product – the Firefox web browser – atrophies.

He likens the situation to another fallen tech giant, Sun Microsystems, which developed innovative new technologies such as Java at the expense of its core product line of SPARC workstations running the Solaris operating system:

Mozilla has become the modern-day Sun Microsystems: While known for churning out showstopping innovation, its bread-and-butter technology now struggles.

In its glory days, Sun had on staff prominent people like Java founder James Gosling, Unix whiz Bill Joy, and XML co-inventor Tim Bray. It produced groundbreaking technologies, such as Network File System and, of course, Java. But the company’s once-high-flying, principal source of revenue — its SPARC hardware paired with its Solaris Unix OS — got trampled in the stampede to commodity Intel hardware and Linux. This led to Sun being acquired by Oracle in 2010 after an extended period of losing many millions of dollars.

Classic Apple icons now in New York's Museum of Modern Art

While computer icons are an essential part of our digital lives, few of us stop to think about the process behind these designs. Over at I Programmer, David Conrad discusses how the work of Susan Kare, the Apple designer responsible for the icons used in the original Macintosh, is now being recognised by New York's Museum of Modern Art:

Susan Kare is the artist responsible for many of the classic Mac icons that are universally recognized. Now her impact as a pioneering and influential computer iconographer has been recognized by the Museum of Modern Art in New York.

Susan Kare designed all of her early icons on graph paper, with one square representing each pixel. Now this archive of sketches has been acquired by MoMA, jointly with San Francisco’s Museum of Modern Art, and has gone on show as part of a new exhibition, This is for Everyone: Design Experiments For The Common Good.

A short history of AI

Finally, big data, machine learning and artificial intelligence are all the rage at the moment. However, as Or Shani explains, artificial intelligence is far from a new concept:

AI isn’t a new concept; its storytelling roots go as far back as Greek antiquity. However, it was less than a century ago that the technological revolution took off and AI went from fiction to very plausible reality. Alan Turing, British mathematician and WWII code-breaker, is widely credited as being one of the first people to come up with the idea of machines that think in 1950. He even created the Turing test, which is still used today as a benchmark to determine a machine’s ability to “think” like a human. Though his ideas were ridiculed at the time, they set the wheels in motion, and the term “artificial intelligence” entered popular awareness in the mid-1950s, after Turing died.

Andrew Sadauskas is a former journalist at SmartCompany and a former editor of TechCompany.
