A fast car, lots of music, and a really good corned-beef sandwich: these are a few of my favorite things, and of those, music is the most prevalent. It's part of my life nearly 24×7. At home, the radio is always playing. In the office, unless I'm on the phone, digital music is streaming from my Mac's iTunes into a big Pioneer receiver. In the fast car, digital songs come out of an iPod connected to an Alpine head unit.

Where does that digital music come from? CD-Audio discs – quite an extensive library, I’m proud to say, amassed over many years, and now fully ripped into iTunes. How much of my music was purchased from the iTunes Store? None whatsoever. And there’s none from any other digital music store either.

Why not? Because of digital rights management (DRM) software. As someone in the intellectual property business myself (as a writer and publisher), I respect the rights of IP creators to protect their work. But as the saying goes, your freedom to swing your arm ends when it impacts my nose.

As a consumer, I don’t trust DRM software to protect my purchases. I don’t trust any DRM system: not Apple’s FairPlay, not Microsoft DRM, not any number of open-source or commercial efforts.

I know that my physical CDs will be usable for years. How long will I be able to use music that’s encoded using a DRM system? One year? Five years? 20 years?

There are stories out right now about people who are having problems running iTunes after upgrading from Windows XP to Windows Vista: it seems that some iTunes users can’t see their purchased music without dancing through some workaround. Apple suggests that iTunes users wait for updated software before installing Windows Vista. Microsoft disagrees with that advice.

But this isn’t about Apple or Microsoft. What about Musicmatch and Yahoo? My colleague Larry O’Brien lost all the music he purchased from Musicmatch Universal after Yahoo bought it.

I can buy a 10-song album in digital form from the iTunes Store, MSN Music Store, AOL Music Store or another service for US$9.99, and have DRM worries. Or I can buy the CD for a few dollars more… and not have DRM worries. Guess which option I prefer?

Guess which option Apple’s Steve Jobs prefers? Surprisingly, it’s the same one I do: the option that doesn’t have DRM. In a missive posted today, he explains that the iTunes Store uses DRM only because Universal, Sony BMG, Warner and EMI insist on DRM as a condition for his selling their music.

Jobs insists that because the only DRM systems acceptable to those companies must contain secret data (crypto keys), Apple's ability to license its FairPlay DRM system to other hardware or software makers is limited:

“Apple has concluded that if it licenses FairPlay to others, it can no longer guarantee to protect the music it licenses from the big four music companies. Perhaps this same conclusion contributed to Microsoft’s recent decision to switch their emphasis from an “open” model of licensing their DRM to others to a “closed” model of offering a proprietary music store, proprietary jukebox software and proprietary players.”

Jobs presents a compelling alternative:

“Imagine a world where every online store sells DRM-free music encoded in open licensable formats. In such a world, any player can play music purchased from any store, and any store can sell music which is playable on all players. This is clearly the best alternative for consumers, and Apple would embrace it in a heartbeat. If the big four music companies would license Apple their music without the requirement that it be protected with a DRM, we would switch to selling only DRM-free music on our iTunes store.”

And I’d stop buying CDs.

Z Trek Copyright (c) Alan Zeichick

The floodgates have opened… with many of the spam comments coming from false “registered users” of the Blogspot/Blogger system. Therefore, commenting on this blog is now disabled.

What a shame. My apologies to those who have legitimate comments.

>> Update 2/6: Well, I feel foolish. A good friend pointed out that there’s a Blogspot/Blogger feature that requires that people leaving comments type a graphically displayed word to validate that it’s not spam. With luck, that’ll solve the problem. Guess I should have read the fine manual. Commenting is now re-enabled.

Z Trek Copyright (c) Alan Zeichick

Last week, Jean Ichbiah, the lead designer of the Ada programming language, passed away.

Dr. Ichbiah led the design of the language in the late 1970s and early 1980s while at Cii Honeywell Bull in France. He later left to found Alsys Corp., which continued the development of Ada and built commercial Ada 83 compilers. Here's a paper he co-authored about the need for the language, which was created for the U.S. Dept. of Defense.

Though I did a small amount of Ada work “back in the day,” I never had the pleasure of meeting Dr. Ichbiah, and so I will refer you to this obituary by Bertrand Meyer, and to the official notice in the Boston Globe.

(I found Dr. Ichbiah’s photo on this neat site, which shows the authors of various programming languages.)

Z Trek Copyright (c) Alan Zeichick

Z Trek reached a dubious milestone today. There were more than 50 pieces of blog spam when I checked in this morning.

This blog is set up so that all comments are moderated, and must be approved by yours truly before they can become visible on the site. However, the “leave a comment” link is easily detectable by robots, which are flooding the site with bogus comments. Some of them are simple “This is a great site! For more on this topic, see such-and-such site.” Others are packed with keywords and multiple links for things like mortgages, phishing schemes and pharmaceuticals.

Do spammers really think that people will read and click on those often-nonsensical postings? Maybe, maybe not. But even if nobody ever sees the spam comment, the robot will have succeeded in gaming search engines like Google and Yahoo. That's because search-ranking algorithms take into account the number of other sites that link to a page: if my site links to a spam site, the spammer's search-engine ranking goes up. Or at least it used to; top search engines have adapted their algorithms and spiders to ignore these types of bogus links.
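To see the incentive at work, consider a toy model of link counting. This is a gross simplification (Google's PageRank, for instance, weights each link by the linking page's own score), but even this crude sketch shows why every planted comment is worth something to a spammer:

```python
from collections import defaultdict

# Toy link-based ranking: score each page by its count of inbound links.
links = [
    ("myblog.example", "spamsite.example"),   # a spam comment that slipped through
    ("bignews.example", "myblog.example"),
    ("friend.example", "myblog.example"),
]

score = defaultdict(int)
for source, target in links:
    score[target] += 1   # every inbound link bumps the target's ranking

print(sorted(score.items(), key=lambda kv: -kv[1]))
# [('myblog.example', 2), ('spamsite.example', 1)]
```

Every spam comment that sticks is one more inbound link for spamsite.example.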

Fifty. And it was only about 20 a couple of weeks ago. Fortunately, it only takes a moment to delete the spam comments using Blogger’s control panel. However, I feel the pain of owners of very heavily trafficked blogs, as the number of spam comments must be overwhelming. No wonder so many of my friends and colleagues have simply turned off commenting.

>> Update 2/4: Literally within 5 minutes of posting the above, its first spam comment showed up. It read (links removed): "Hello! You have a very nice blog. I'm here to share valuable info with you visit my blog ,about Mozilla Firefox web browser." The post linked to a site offering a hacked version of Firefox with spyware installed.

Z Trek Copyright (c) Alan Zeichick

It’s been five years — and the nominations for the 2007 SD Times 100 are now open. You can learn all the rules and access the nomination forms online. There’s no cost to nominate.

Unlike some other awards in our industry, the SD Times 100 isn’t a product award. It’s an award for a company, a non-commercial organization, or even an individual who showed leadership and innovation in the software development community during the past year.

You can read last year’s winners on the sdtimes.com site (and see great artwork from Erin Broadhurst). Nominations are open through March 1, 2007.

Z Trek Copyright (c) Alan Zeichick

Here comes the new boss — literally, the same as the old boss. Michael Dell, founder of Dell Inc., reclaimed the CEO position, as his longtime #2 guy, Kevin Rollins, left the company. Rollins took over the command chair in 2004, and led Dell Inc. through some tough times.

What does this mean? Well, a couple of years ago, Dell looked like a miracle company. Today, it’s still a top computer provider, but it’s no longer evident that its direct-sales model will drive the company’s growth. Commodity desktops, notebooks and servers, yes, people will buy them, sight unseen, based on the specs and price. Televisions and music players? That’s a different story.

A couple of years ago, Dell was the top PC maker. Today, HP is. Meanwhile, Dell is plagued by concerns about its customer service, after a disastrous experiment with outsourcing and a number of battery recalls. And the SEC has some questions about Dell’s accounting, and NASDAQ isn’t happy.

The most recent numbers weren’t great, and Dell’s stock (as of the time of writing) is at $24.22 per share, down from its 52-week high of $32.24.

The most recent quarterly earnings report, from November 2006, had this statement:

Company Outlook: The company said that the actions it has taken to drive improved operating and financial performance long-term with a better balance of liquidity, profitability and growth are starting to take hold. However, in the near term, improvement in growth and profitability may not be linear due to a variety of factors, including the timing of continued investments in Customer Experience, global expansion, and new product introductions, as well as a muted seasonal uplift due to changes in the mix of product and regional profit. In addition, the fourth quarter of fiscal year 2006 included one extra week.

Were all the problems caused by Kevin Rollins? Will Michael Dell (pictured) turn around the beleaguered computer giant? We’ve seen that a shift in CEO can work miracles — see the astonishing reinvention of HP, for example, after Carly Fiorina’s departure and Mark Hurd’s ascension. Dell needs that kind of miracle.

Z Trek Copyright (c) Alan Zeichick

Microsoft Vista Has Gates Saying “Wow”

That’s the headline of a CBS News story about the launch of Windows Vista today. Oooh, the headline writers at The Associated Press must have worked overtime to come up with that one.

No, this isn’t a knock on Microsoft. This is a knock on the AP (this is their picture, too), who have apparently forgotten the difference between reporting and shilling, and on CBS News, which is happy to reprint fluff from the AP without thinking about it.

The second paragraph of the story revealed the depth of this puff piece:

“This ‘wow’ thing is a great way of describing what we’ve got here,” Microsoft’s chairman told The Associated Press on Monday as the software maker began a slate of splashy events in New York. “There are chances for ‘wows’ all over the product.”

Do I smell a Pulitzer? Sheeesh.

>> Update 1/30: CBS News changed the story’s headline to “Vista Makes Its Debut,” and added a lot more detail and commentary. Yesterday, it was purely an AP story, but now CBS News says that the AP “contributed to this report.”

Z Trek Copyright (c) Alan Zeichick

1. Windows Presentation Foundation and Aero user interface. The new GUI is clean and slick, and when run on the right hardware so that it can use its “glass” effects, is much more visually appealing than Aqua, the GUI in Mac OS X 10.4… though, frankly, Aero is not as intuitive to use as Aqua.

2. Windows Communication Foundation. This new messaging subsystem makes it easier for developers to create .NET applications that use Web services. The WCF plumbing takes care of a lot of middleware-dependent issues.

3. Windows Workflow Foundation. This subsystem, which includes a workflow runtime engine, offers tremendous potential for creating complex applications and mashups. To be honest, I’m not sure how useful this will be on the desktop, but it’s an impressive concept and a well-considered implementation.

4. Windows Mail. Outlook Express was tired, and Windows Mail is wired. However, given the ever-improving quality of Web-based mail systems, it's unclear how useful this will be for the average consumer or even office worker.

5. Built-in support for x64. New computers, such as those running Intel’s Core 2 Duo or Xeon processors, or AMD’s Athlon 64 or Opteron processors, can run 64-bit operating systems, and most versions of Windows Vista come with both 32-bit and 64-bit kernels in the box.

6. ReadyBoost. This feature uses Flash memory (like in a USB flash drive) as a memory cache. Remember, Flash memory is faster than a rotating disk – and there's virtually no seek time. This is the one feature that I wish I had on the Mac right now. (There's a sketch of the caching idea after this list.)

7. Built-in support for ink. Forget about the separate Windows XP Tablet PC Edition 2005… Windows Vista has all the tablet bits built in, and actually goes much farther by supporting new gestures for command-and-control. (One of my pending projects is to install Windows Vista onto my Fujitsu T4010 tablet.)
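About that ReadyBoost item: since the feature is, at heart, a cache with no seek penalty sitting in front of a slow disk, the idea fits in a few lines of Python. To be clear, this is not how ReadyBoost is implemented (the real thing adds compression, encryption and much smarter eviction); it's a minimal sketch of why caching pays off on repeated reads:

```python
import time

class SlowDisk:
    """Stand-in for a rotating disk: every read pays a simulated seek penalty."""
    def __init__(self, blocks):
        self.blocks = blocks

    def read(self, block):
        time.sleep(0.01)               # pretend seek plus rotational latency
        return self.blocks[block]

class FlashCache:
    """Stand-in for a flash cache: no seek time, bounded capacity."""
    def __init__(self, disk, capacity=128):
        self.disk = disk
        self.capacity = capacity
        self.cache = {}

    def read(self, block):
        if block in self.cache:        # hit: no seek penalty at all
            return self.cache[block]
        data = self.disk.read(block)   # miss: pay the disk's price once
        if len(self.cache) >= self.capacity:
            self.cache.pop(next(iter(self.cache)))  # crude FIFO eviction
        self.cache[block] = data
        return data

cache = FlashCache(SlowDisk({n: f"block-{n}" for n in range(256)}))
start = time.time()
for _ in range(3):                     # the second and third passes are nearly free
    for block in range(64):
        cache.read(block)
print(f"elapsed: {time.time() - start:.2f}s")   # ~0.64s, vs. ~1.92s with no cache
```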

Z Trek Copyright (c) Alan Zeichick

For the past week or so, Microsoft has been dribbling out press releases touting a benefit of Windows Vista: The upgrade will drive IT jobs and spending.

Today’s release brags that Windows Vista will generate US$10 billion in new revenue for the California IT industry in 2007, and will drive 16,000 new jobs in the state. This is based on research from IDC, which was sponsored (read: paid for) by Microsoft. I wrote about one part of that research on Dec. 12.

For someone’s revenue to increase, of course, someone else’s costs have to go up.

Microsoft and IDC have previously released proportionally similar figures for Massachusetts, New York, New Jersey and Florida. It’s fair to expect more states to be “researched,” and their increased IT costs touted as a Windows Vista benefit.

Now, is this increased IT cost a good thing? I would argue that while it’s a huge plus for California’s IT industry, it’s a huge minus for the people of California, who are expected to fork out $10 billion and hire 16,000 people to handle the migration from one version of an operating system to another. (I’m focusing on California here, but my comments apply elsewhere.)

IDC's figures include the cost not only of migrating the operating system itself, but also of developing and deploying software updates required for Windows Vista, upgrading or replacing hardware, and so on. Firms providing those services include (to quote from the IDC report):

Microsoft partners and OEMs sell PCs and servers running Windows; software vendors write applications that run on Windows using Microsoft application development tools; retail outlets and resellers employ people to sell and distribute these products; and service firms install and manage Microsoft-based solutions, train consumers and businesses on Microsoft products, and service customers for their own applications.

According to IDC, this is terrific! To quote again,

This is good news for California. Over the next four years, software-related IT employment should grow by almost 150,000 jobs and be the sole reason IT-related jobs increase at all.

IDC says that in California alone, Microsoft will ship nearly 5 million copies of Windows Vista in 2007, earning the company $500 million in this one state. Quoting IDC:

For every dollar of Microsoft revenue from Windows Vista in 2007 in California, the ecosystem beyond Microsoft will reap more than $19 in revenues. In 2007 this ecosystem should sell more than $10 billion in products and services revolving around Windows Vista.

To put it another way, it’s going to cost California’s businesses and consumers $500 million in Microsoft license fees to adopt Windows Vista. And it’s going to cost the state’s consumers $10 billion overall. And that’s good news?

This is good news for the IT industry, of course, and for IT professionals seeking employment. But it's bad news for the people of California who aren't in the IT industry: the dentist's office, the clothing manufacturer, the school, the restaurant chain, all of whom will be footing the bill.

It’s hard to celebrate a software update when its creator boasts that it will increase the cost of IT. Isn’t newer technology supposed to save us money?

Z Trek Copyright (c) Alan Zeichick

Two important pieces of server-side development technology were finalized this week.

ASP.NET AJAX, which is Microsoft’s tooling to support rich Web applications on its ASP.NET application server, reached 1.0 status on Tuesday. ASP.NET is a well-respected app server, and from what I’ve seen, Microsoft has done an excellent job with its AJAX implementation. See my comments from last October about ASP.NET AJAX.

Also yesterday, the W3C formally released its XQuery 1.0 spec, along with XSLT 2.0. XQuery defines a platform-neutral query language for XML data. While there's a lot of work still to be done with XQuery (see my earlier comments), this is a key milestone for the project.
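If you haven't seen XQuery, here's its flavor. A FLWOR expression (for-let-where-order-return) selects and reshapes XML the way SQL does rows. The query below is my own illustrative example, not from the spec; and since XQuery engines are separate downloads, the accompanying Python approximates the same query with the standard library's ElementTree:

```python
# The XQuery version (illustrative), as you'd run it in an XQuery engine:
#
#   for $b in doc("catalog.xml")//book
#   where $b/price < 30.00
#   return $b/title
#
# A rough Python/ElementTree equivalent of the same query:
import xml.etree.ElementTree as ET

catalog = ET.fromstring("""
<catalog>
  <book><title>XML Mastery</title><price>24.95</price></book>
  <book><title>Enterprise SOA</title><price>49.95</price></book>
  <book><title>XQuery Basics</title><price>19.95</price></book>
</catalog>
""")

for book in catalog.findall(".//book"):
    if float(book.findtext("price")) < 30.00:
        print(book.findtext("title"))   # XML Mastery, XQuery Basics
```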

Z Trek Copyright (c) Alan Zeichick

Bruce Schneier, in a blog posting today, argues (convincingly) that it’s important for researchers and white hats to publicly release details about security vulnerabilities in hardware, software and Web sites. He writes,

Before full disclosure was the norm, researchers would discover vulnerabilities in software and send details to the software companies — who would ignore them, trusting in the security of secrecy. Some would go so far as to threaten the researchers with legal action if they disclosed the vulnerabilities…. It wasn’t until researchers published complete details of the vulnerabilities that the software companies started fixing them.

I agree with Bruce. I think that public scrutiny, which can lead to PR fiascoes, and from there to lost sales and lawsuits, is arguably the only factor driving big companies to fix their products in a timely fashion. Without the pressure of public disclosure, problems — and not just those limited to security vulnerabilities — would be addressed more slowly, or not at all.

Not everyone shares that viewpoint, of course. Big companies abhor the practice, and many believe that security flaws should be kept strictly confidential. There’s also a debate about whether software companies should be given advance notice of vulnerability discoveries, so they can issue patches before the vulnerability is publicly unveiled.

Bruce (pictured) points us to an excellent article just published on CSO Online, “The Chilling Effect,” by Scott Berinato. You should read it, and also its two sidebars by Microsoft’s Mark Miller and Tenable Network Security’s Marcus Ranum.

Now think for a bit. It's one thing for you to argue for (or against) security vulnerability disclosure for the products you consume, say, from Microsoft or Sun or IBM or Oracle or Novell. Is it another thing for you to argue for (or against) security vulnerability disclosure for the products your development team creates? Often, there's a double standard: disclosure is good for the other guy.

Why should that be?

Separately: If you don’t read Bruce Schneier’s blog, you should. It’s always informative, and sometimes scary.

Z Trek Copyright (c) Alan Zeichick

Where have all the great tech journalists gone? They’ve gone to vendors, every one.

First, last month, InfoWorld’s Jon Udell went to work for Microsoft. See his auto-interview about the new job. He’ll be working as an evangelist on MSDN’s Channel 9. This is a savvy move by Microsoft, and I’m sure it’s good for Jon. It’s a bummer for InfoWorld, of course.

And now, Peter Coffee (pictured), arguably both the tallest and wisest technology pundit in the business, has left eWeek to work for Salesforce.com as director of platform research. Wow. Salesforce shoots, Salesforce scores. Good for Peter; he’ll be fantastic there. eWeek is significantly lessened by his departure.

Z Trek Copyright (c) Alan Zeichick

According to a CBS News report, Sun will soon be using Intel processors in some of its servers. In return, Intel will endorse the Solaris operating system.

If true, this is a good move for both companies, and potentially, for everyone except AMD, which has enjoyed an exclusive deal with Sun since November 2003.

Sun doesn’t win when it’s dogmatic. For years, it fought against Wintel with its SPARC processors. While there were sound technical reasons why the company chose AMD’s Opteron processors instead of Intel’s Xeon processors for its move to x86, there was some classic Sun attitude in there as well. Microsoft, despite its support for AMD’s processors, has long been close to Intel, and Sun’s arch-competitor, Dell, was always an Intel-only shop. Thus, one way for Sun to differentiate itself when selling commodity hardware was to snuggle up to AMD.

Times change. Dell ships a few AMD-based servers now, and Microsoft does a lot with AMD. In fact, in its public communications and technical documentation, Microsoft treats both companies' processors equally, and it does marketing with both. For example, AMD and Microsoft are jointly sponsoring a contest, called Vanishing Point.

AMD makes great processors. So does Intel. Sun, and Sun’s customers, win when the company is pragmatic, not dogmatic. Similarly, Intel gains no benefit from shunning Solaris; it should be an OS-neutral platform provider.

On the computer front, that leaves Apple as the obvious holdout. Apple uses Intel processors exclusively… something that AMD execs have expressed frustration about. I'd love to see an iMac or Mac Pro running with Opteron processors.

>> Update 8:35am: Just received an invitation to a press conference with Sun's Jonathan Schwartz and Intel's Paul Otellini, at 10:00am today in San Francisco. Guess the deal's real.

Z Trek Copyright (c) Alan Zeichick

Larry O’Brien tagged me. Larry, that prankster, is playing Blog Tag, where you dare people to reveal things that most people won’t know. (I traced Larry’s tag back and found this entry on “Storm & Wind” from August 2006. After that, the trail went dark.)

I’m cool. I’m hip. But I also want to stay somewhat on topic; this is a professional blog, not a personal diary. That said, here are things that you may not know about me:

1. One of my first published papers presented a recursive solution to the Tower of Hanoi problem – for the HP-67 programmable calculator. Rather innovative, I still think. (The recursion itself is sketched after this list.) Another paper formally presented a "Tiny BASIC" implementation for the HP-41C calculator.

2. The first magazine I worked for, as a staff editor, was called Portable 100, for users of the TRS-80 Model 100 computer (pictured). The first article I wrote for P.100 was a review of a Votrax speech synthesizer. The magazine was so successful that many people referred to the Model 100 computer as a Portable 100.

3. My consulting company, Camden Associates, was launched when I lived in Camden, Maine, more than two decades ago. It was initially a VAR/systems integration/custom programming business called Camden Computer Services.

4. When studying computer science, my favorite course was in compiler design. It was a great class. I also enjoyed taking EE classes, and wished that I could take more.

5. The first personal computer I owned was a TRS-80 Model I, but unfortunately, I couldn’t afford many mods for it. Later I upgraded to a TRS-80 Model III. I never owned a Commodore or an Apple, until acquiring a Fat Mac.

6. I worked my way through college typing term papers and dissertations (I was a wicked good typist) and tutoring. The best money and most gratification was in tutoring non-traditional students who were receiving veteran’s benefits.

7. While Larry is a great puzzle/game solver (see his backgammon comment), I’m not good at it. I’m lousy at backgammon and chess, for example, and can’t solve the Rubik’s Cube – despite Larry’s attempts to teach me the algorithm. My brain just doesn’t work that way.

8. I love doing what I do. But if I had to do something else, I’d become a professional photographer.
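A footnote on item 1, for the curious: the Tower of Hanoi is tailor-made for recursion, which is what made squeezing it onto a programmable calculator fun. Here's the textbook recursion in Python; the HP-67 version, with its tiny program memory and no real support for recursive calls, was considerably less tidy:

```python
def hanoi(n, source, target, spare):
    """Move n disks from source to target, using spare as the work peg."""
    if n == 0:
        return
    hanoi(n - 1, source, spare, target)   # move the n-1 smaller disks out of the way
    print(f"move disk {n}: {source} -> {target}")
    hanoi(n - 1, spare, target, source)   # stack the smaller disks back on top

hanoi(3, "A", "C", "B")   # prints the optimal 2**3 - 1 = 7 moves
```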

Part of the game is that I must tag other people. Okay: Alexa Weber Morales, Bruce Schneier, Esther Schindler, Bernard Golden and Joel Spolsky.

Z Trek Copyright (c) Alan Zeichick

The SCO Group released its earnings for the quarter that ended Oct. 31, 2006. To no one's surprise, they're worse than ever, as CEO Darl McBride (pictured) continues to drive the company into the ground.

Some highlights:

* Revenue for their fiscal 4th quarter dropped to $7,349,000, vs. $8,528,000 for the same quarter last year.

* Net loss for the quarter increased to $3,743,000; in the same quarter last year, it was $3,431,000.

* Revenue for the fiscal year dropped to $29,239,000, down from $36,004,000.

* Net loss for the fiscal year surged to $16,598,000, a huge increase from the previous year’s $10,726,000.

Now, you might think that this was due to the brain-dead IBM/Novell lawsuit. You’d be wrong. According to the financials, the costs of the lawsuit were only $2,220,000 for the fourth quarter, down from $3,380,000 for the same quarter in the previous year. That’s right — the losses from bad operations swamp the losses from bad lawsuits. Even without the lawsuit, they’d still be hemorrhaging money.

Is there hope that the company’s fortunes will improve? Not if McBride’s comment is any indication. “We remain committed to our UNIX business, introducing new mobile services to the marketplace and defending our intellectual property through the legal system,” he wrote.

How much longer will SCO survive? The best guess: Not long.

PS: Did you know that Darl McBride has his own Web site? It hasn’t been updated for a while, though.

Z Trek Copyright (c) Alan Zeichick

I’ve seen relatively little coverage about yesterday’s approval of the PCI Express 2.0 specification by PCI-SIG, the industry consortium that, well, defines the PCI Express specification.

With the introduction of PCI Express (aka, PCIe) a couple of years ago, server expansion cards got a huge performance boost, with the biggest impact on RAID and other disk controllers, and also on high-speed, low-latency interconnects, like you’d see in high-performance clusters. (PCI Express is found on desktop systems, too, with the biggest payoff in graphics cards. There, PCIe replaced older standards like AGP.)

The beauty of the PCI Express system is that it’s transaction-based. Yes, this makes it considerably more complex than many previous bus designs. In fact, PCIe incorporates a full networking stack. However, that gives it much greater flexibility and scalability.

Each PCI Express interface and peripheral engages in transactions using some number of asynchronous one-bit-wide serial lanes, which are aggregated for bandwidth. Devices can use links that are between one and 32 lanes wide. (That's in contrast to older buses, which were synchronous and parallel at 16, 32 or 64 bits, and which were therefore hard to sync at faster clock speeds.)

The PCI Express 2.0 spec doubles the signaling rate from 2.5GT/s (that's 2.5 billion transfers per second) to 5.0GT/s. Each lane's effective throughput jumps from roughly 250MB/s to 500MB/s.
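Here's the arithmetic behind those per-lane figures, in case you want to check my math. PCIe 1.x and 2.0 both use 8b/10b encoding, meaning 10 bits on the wire carry 8 bits of data:

```python
def lane_mb_per_s(gigatransfers_per_s):
    """Effective per-lane throughput for PCIe 1.x/2.0 (8b/10b encoding)."""
    wire_bits = gigatransfers_per_s * 1e9   # one bit per transfer, per lane
    data_bits = wire_bits * 8 / 10          # 8b/10b costs 20% of the raw rate
    return data_bits / 8 / 1e6              # bits -> bytes -> MB/s

for name, rate in (("PCIe 1.x", 2.5), ("PCIe 2.0", 5.0)):
    per_lane = lane_mb_per_s(rate)
    print(f"{name}: {per_lane:.0f} MB/s per lane; x16 link: {per_lane * 16 / 1000:.0f} GB/s")
# PCIe 1.x: 250 MB/s per lane; x16 link: 4 GB/s
# PCIe 2.0: 500 MB/s per lane; x16 link: 8 GB/s
```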

That extra bandwidth opens up great possibilities. Think of what it can mean when linking CPUs and GPUs, for example. It also raises the ceiling for low-latency interconnect performance for clusters.

According to PCI-SIG, the PCI Express 2.0 spec also adds new features: device drivers can dynamically control link speed, for example, and the bus supports devices that consume higher power.

As servers and desktops get faster (think about two quad-core processors in a desktop, or four in a typical server), the bottleneck isn't the processor or the peripheral. It's the bus. PCI Express 2.0 is a solid step forward.

Z Trek Copyright (c) Alan Zeichick

Best e-mail of the week so far: A press release from Pantone, a company that sets color standards, telling us that “they” have selected Chili Pepper, which they catalog as color #19-1557, as the Color of the Year for 2007. So, if you go to a paint store, or to a commercial printer, you can specify this color for your house, your office, or your business cards. (Professional graphics applications like QuarkXPress, Photoshop and InDesign can spec color using Pantone codes.)

To quote from their official news release:

“Whether expressing danger, celebration, love or passion, red will not be ignored,” explains Leatrice Eiseman, executive director of the Pantone Color Institute. “In 2007, there is an awareness of the melding of diverse cultural influences, and Chili Pepper is a reflection of exotic tastes both on the tongue and to the eye. Nothing reflects the spirit of adventure more than the color red. At the same time, Chili Pepper speaks to a certain level of confidence and taste. Incorporating this color into your wardrobe and living space adds drama and excitement, as it stimulates the senses.”

“In 2007, we’re going to see people making greater strides toward expressing their individuality,” says Lisa Herbert, executive vice president of the fashion, home and interiors division at Pantone. “The color red makes a bold statement. We’re seeing shifts in people’s opinions on current events and major changes in the way they are expressing themselves through new technology. People are open to the possibilities of the future and Chili Pepper celebrates that.”

I’ve tried to insert Chili Pepper as the graphic for this entry. Do you think the color is as they describe? “Chili Pepper is an engaging, deep, spicy red, which provides a jolt of energy and inspiration to fashion, beauty, interiors and design.”

Z Trek Copyright (c) Alan Zeichick

Before 2006, only one company – IBM – managed to gain more than 2,000 U.S. patents in a single year. But in 2006, five companies broke that barrier: IBM with 3,651 patents, Samsung with 2,453, Canon with 2,378, Matsushita with 2,273, and Hewlett-Packard with 2,113.

That's a huge amount of patent activity. In fact, according to IFI Patent Intelligence, my source for all these numbers, there were 173,772 U.S. patents issued in 2006, a record number that was 20.8 percent higher than in 2005.

To quote from IFI’s analysis,

“2006 panned out as a banner year in terms of the number of individual patents granted and leads us to believe that the USPTO is making headway in addressing its backlog of patent applications,” said Darlene Slaughter, general manager of IFI Patent Intelligence.

“Although the number of patents being granted is not the only gauge of technological advancement, the relative increase in number of patents being generated indicates a growing emphasis on the value of intellectual property.”

The USPTO is the U.S. Patent and Trademark Office.

Some other big patent winners in 2006: Intel, at 1,962, Microsoft, at 1,463, and Sun Microsystems, at 776. You can see the same list I'm looking at on the IFI site.

When you factor in the power of patent cross-licensing, patents are an incredibly potent weapon. Patents not only help a company gain market share, but also help it successfully defend that market share against competitors – particularly upstarts. Big competitors share their patents. Smaller competitors don't have the IP to play that game, so they either have to pay through the nose or stay out of the game. Novell, in its lopsided IP licensing deal with Microsoft, may be a casualty of the IP wars: it was simply out-gunned.

Z Trek Copyright (c) Alan Zeichick

Last week, I visited CodeGear — the tools business from Borland, which has been spun off into a wholly owned subsidiary. CodeGear will be evolving JBuilder, Delphi and C#Builder, and will also be introducing some new tools for dynamic languages and for otherwise enhancing developer productivity.

You can read some notes about my visit with David Intersimone (pictured) and Michael Swindell in today’s Zeichick’s Take, in SD Times News on Thursday.

Z Trek Copyright (c) Alan Zeichick

Threads are breaking out of the server and into desktop and notebook computers – and even in servers, the advent of dual-core and quad-core processors is drastically changing the landscape for applications. To put it bluntly, applications need to be designed, coded and tested to run optimally in a multithreaded environment – not just to get the best possible performance or throughput, but also to ensure that applications will run correctly when there are four, eight or even 16 cores in the machine.

The problem is that most developers, and development teams, don’t think about threading on the desktop. Even when it comes to servers, which have long supported multiprocessing, a lot of the thinking is that “the operating system will handle it,” or “my libraries will handle it,” or “my runtime will handle it.” Well, no. Most of the time, if the architect, coder or tester doesn’t put in the time, you won’t see the results.
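A trivial example shows why "my runtime will handle it" is wishful thinking. This Python sketch demonstrates the classic lost-update race: several threads increment a shared counter, and unless you add the lock yourself, the total comes up short. (Python is just the illustration language here; the same hazard bites C++, Java and .NET code at least as hard.)

```python
import threading

counter = 0
lock = threading.Lock()

def unsafe_increment(n):
    global counter
    for _ in range(n):
        value = counter        # read...
        counter = value + 1    # ...then write; another thread may have written in between

def safe_increment(n):
    global counter
    for _ in range(n):
        with lock:             # the read-modify-write is now atomic
            counter += 1

def run(worker):
    global counter
    counter = 0
    threads = [threading.Thread(target=worker, args=(100_000,)) for _ in range(4)]
    for t in threads:
        t.start()
    for t in threads:
        t.join()
    return counter

print("without lock:", run(unsafe_increment))   # typically less than 400000
print("with lock:   ", run(safe_increment))     # always 400000
```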

When I talk to companies like AMD or Intel, that’s the frustration I hear all the time: Threading is still in the earliest stages of adoption among developers. That’s true on the desktop and the server. That’s true of ISVs and enterprise developers. That’s true for those building libraries and runtimes, and for those building complete applications.

But how do you measure how far along a company, a development team or a developer is with regard to threading? During lunch this week with James Reinders, director of marketing and business for Intel's Software Developer Products Group (that is, the compiler folks), I suggested a Threading Maturity Model (ThMM), analogous to the Capability Maturity Model or the SOA Maturity Model. James liked the idea, and I'm presenting it here for comment. As far as I know, this is an original concept.

Note that companies would likely be at different stages in the Threading Maturity Model for server applications versus for desktop applications.

The Threading Maturity Model (ThMM)

5. Adoption. All developers trained to use threading. Threading is addressed at the design, requirements, and architectural stages of development, in addition to coding and testing. Broad incorporation of threading tools into the toolchain. Newly adopted code, such as libraries and components, must demonstrate support for threading. Funded efforts to eliminate all non-threaded libraries and runtimes. All threaded applications are tested against platforms with different cores/processors to identify runtime issues. Formal source-code validation techniques are used to identify potential failures.

4. Utilization. A few developers receive formal training on programming with threads, and on how to avoid hazards such as deadlocks and race conditions. Threading considered a programming issue, not a design, requirements or architectural issue. Use of threading encouraged, but not mandated, except in performance-intensive routines. Some libraries and runtimes required to support threading. Specific tools to test threaded code are purchased and used from time to time. Some, but not all, threaded applications are tested against platforms with different cores/processors to identify runtime issues. No formal validation of threaded code for correctness.

3. Hotspots. Developers are largely self-taught about threading. Threading is used to address known performance trouble spots within applications by applying simple techniques: OpenMP to optimize loops in a desktop application, for example, or thread pools to manage user connections in a server application. Results measured with code profilers and casual benchmarks. Use of threading is allowed, but not particularly encouraged or required. Minimal testing of applications against platforms with different cores/processors to identify runtime issues.

2. Experimentation. Some developers have studied threading, such as by reading articles, but are not trained. Simple tests of threading conducted with trivial or ad-hoc projects; some tests will appear to indicate failure, because the tests weren’t properly designed or executed. Some allocation of enthusiast team members’ time is channeled into exploring threading, often as a “skunkworks” project. Some understanding of the different threading models. Minimal incorporation of threading into production code. No testing of applications against platforms with different cores/processors to identify runtime issues.

1. Awareness. General awareness of the potential benefits of threading desktop or server applications, but unsure of the specific benefits of different techniques for threading. Threading not incorporated into the development process. Developers trust that compilers, libraries and runtimes are handling threading automatically. No serious consideration of threading as a way to solve performance problems. No testing of applications against platforms with different cores/processors to identify runtime issues.

0. Unawareness. They don't know and they don't care: developers, architects and managers are either unaware of the existence of threading, or don't believe that it applies to their software development practices. Complete trust that the operating system handles everything automatically when more cores or processors are added. Nobody is following technical discussions regarding threading.

(My thanks to James Reinders for his enthusiastic response, and to my colleague Andrew Binstock, who reviewed an early version of the model. Andrew offered ways to make it more generally applicable to both desktop and server development, and suggested adding ThMM stage 0.)

Z Trek Copyright (c) Alan Zeichick

Today, Cisco Systems Inc. sued Apple Inc. over unauthorized use of the iPhone trademark.

You see, Cisco’s Linksys division already has a line of wireless VoIP telephones called the iPhone, introduced last month. The press has been speculating about how Apple would handle this situation — rumors about Apple’s iPhone have been swirling around for weeks. When Apple made its big announcement yesterday, many of us assumed that Apple and Cisco had worked out the trademark details.

Guess we were wrong. Quoting from Cisco’s news release:

"Cisco entered into negotiations with Apple in good faith after Apple repeatedly asked permission to use Cisco's iPhone name," said Mark Chandler, senior vice president and general counsel, Cisco. "There is no doubt that Apple's new phone is very exciting, but they should not be using our trademark without our permission."

Stupid, stupid. Steve Jobs is losing his touch.

>> Update 1/11: I’ve read speculation that this is all part of a marketing gambit by Apple, and the company intends to relaunch the iPhone as the Apple Phone. It’s not out of the question: the device rumored to be called the iTV was launched as Apple TV. However, this seems extreme for a publicity stunt.

Z Trek Copyright (c) Alan Zeichick

This is my 100th blog entry on Z Trek. Profound, ain’t it?

Z Trek Copyright (c) Alan Zeichick

Last Thursday, I wrote my SD Times newsletter column on CUA compliance in general, and Microsoft Office 2007’s lack of CUA compliance specifically. I also mentioned it briefly here in the blog.

Some nice comments came in on the column, and I followed up with another Zeichick's Take on Monday that shared 'em with the broader community.

What do you think about Microsoft’s Office 2007 user interface, and the relevance of the Common User Access specifications? Keep the cards and letters coming!

Z Trek Copyright (c) Alan Zeichick

Apple’s decision to change its name from Apple Computer Inc. to Apple Inc. reinforced my nascent feelings about the company, and by extension, IDG’s independent conference for the Macintosh industry. Apple and Macworld Expo aren’t about computers any more. They’re all about consumer electronics.

Sure, in the Macworld North Hall there were a few enterprise exhibitors. Behind the main Apple pavilion in the South Hall (which was mainly showing off the iPhone) there were a handful of software development tools companies. However, they were lost in the maze of iPhones, Apple TVs, and iPod cases. Macworld is a consumer event, and Apple has declared itself a consumer electronics company.

I wouldn’t go to Nokiaworld to see their new phones, so why should I go to a conference to see the Apple iPhone? I’ll wait and see it in my local cell-phone store.

I wouldn’t go to TiVoworld to see their latest digital video recorder, so why should I go to a conference to see the Apple TV device? I’ll wait and see it in Best Buy.

Sure, there was some eye candy for people like me — you know, the ones who care more about computers than music players. Apple did demonstrate some neat features of “Leopard,” the next version of Mac OS X. But those were basically the same bits that Apple’s been showing for the past few months.

Apple’s announcement that two billion songs have been downloaded from its iTunes Store is impressive. But that’s not a reason to attend a conference. Neither is the opportunity to view yet another “I’m a Mac” advertisement.

One can only see so many MacBook carrying cases, iPod skins and people drooling over the iPhone before realizing that Macworld has become simply irrelevant. I'm a big Apple customer myself: I'm typing this on an iMac, I carry a PowerBook, both are connected to an AirPort WiFi hub, and there's an iPod hooked into my car stereo. But there's nothing for me at Macworld.

Z Trek Copyright (c) Alan Zeichick

At the Consumer Electronics Show, Bill Gates announced the Microsoft Windows Home Server, designed to serve as the digital hub of the modern house. It’s a great idea, and frankly, it’s about time someone addressed this need.

Consider digital photos. My iMac has my camera's photos. My wife's Dell has her photos. How do we access the full library of family photos? Either by setting up a temporary file share, or by sitting at each other's machines. Suboptimal.

Consider digital music. My iMac has my music. My son's Mac Mini has his music. My wife's machine has her music. We all have iPods, and use iTunes. Popular songs are replicated on all three machines, and the maintenance that I do to ensure that all my songs have cover art embedded in the MP3 files isn't reflected in my family's libraries.

Consider data backups. Shared work files. E-mail services. Faxing. There's nothing in either Windows XP or Mac OS X that helps turn our house into an efficient digital household, beyond support for Internet connection sharing (which we don't use, since we have a router) and print sharing (which we don't use, since our printer is directly connected to the LAN).

We've long considered setting up a separate machine (probably an iMac) to use as a family server. But frankly, none of the software it includes would make the job easy. Setting up a centralized instance of iTunes, for example, to support all three of us would not be practical, since we'd need to go to that server to update our iPods. We'd also have to leave the machine logged in all the time, which is not good security for a server.

The same goes for backup. It would be nice to buy a single tape drive or external storage device (like a LaCie Big Disk) to back up all three of our machines securely and automatically. But that's not really possible today, and so we all have our own external hard drives. (I could buy a NAS device and set up three different volumes, but that's not the point. I can cobble together solutions. The typical family can't.)

What about Microsoft Windows Home Server? Clearly, it's too soon to know how good a job it will do. Certainly, my assumption is that it will only support Windows Vista clients, and so it won't be useful for my multi-platform family. (Microsoft's video of the announcement wouldn't play on my Mac; I had to use my wife's Windows PC to watch it. I'm not surprised.) But it's a good idea. Frankly, I can't wait to see it.

Z Trek Copyright (c) Alan Zeichick

Jon Bosak informed me this morning that UBL 2.0 has been approved as an OASIS standard. UBL, or Universal Business Language, is designed to provide a consistent way for businesses to share common documents using XML; that is, documents like invoices, purchase orders, catalog updates, transportation manifests, things like that. UBL is an alternative to business groups creating ad-hoc schemas for documents that are truly universal to all businesses. Why reinvent the wheel — or the billing form?

While UBL 1.0 made a good start, and laid some fundamental groundwork, its scope was pretty much limited to order processing. Version 2.0 adds 23 new document types to the previous version of the spec. Those include catalog requests and updates; forwarding instructions, packing lists, bills of lading and waybills; credit and debit notes, freight invoices, payment reminders; and account statements and remittance advice notes.

Sexy? Heck no. But efforts like UBL (you can download the spec here) are essential to helping businesses conduct business online, and can significantly reduce software development time. There's also a SOA perspective here: the UBL documents and schemas can be used both internally and externally, serving as the foundation for a company's internal workings as well as its external transactions.
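To make that concrete, here's what a drastically pruned, UBL-flavored invoice might look like, and how little code it takes to consume one once everybody agrees on the element names. The elements below follow UBL's naming style but are simplified from the real schemas, which are namespace-qualified and far richer; treat this purely as a sketch:

```python
import xml.etree.ElementTree as ET

# A simplified, UBL-flavored invoice (the real UBL 2.0 Invoice schema is
# namespace-qualified and has many more required elements).
invoice_xml = """
<Invoice>
  <ID>INV-2007-0042</ID>
  <IssueDate>2007-01-02</IssueDate>
  <InvoiceLine>
    <Quantity>10</Quantity>
    <LineTotal>150.00</LineTotal>
  </InvoiceLine>
  <InvoiceLine>
    <Quantity>2</Quantity>
    <LineTotal>90.00</LineTotal>
  </InvoiceLine>
</Invoice>
"""

invoice = ET.fromstring(invoice_xml)
total = sum(float(line.findtext("LineTotal"))
            for line in invoice.findall("InvoiceLine"))
print(invoice.findtext("ID"), "total:", total)   # INV-2007-0042 total: 240.0
```

When every trading partner's invoice looks like this, parsing code gets written once, not once per partner. That's the wheel UBL keeps you from reinventing.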

Z Trek Copyright (c) Alan Zeichick

While we’re on the subject of storage: don’t forget solid-state drive technology. This week, SanDisk released a 32GB SSD that’s going to cost about US$600, and fits into a standard 1.8-inch form factor.

Sure, that's a lot of money for 32GB of storage today. But given the incredible rate of change in flash memory, the capacity will go up and the price will come down fast.

Let’s predict that the SSD doubles in capacity in the next 12 months, and the price comes down by 1/3. Would you pay an extra $400 to have a 64GB SSD in your notebook PC? Silent, less power consumption, no moving parts, super fast?

According to SanDisk, their solid-state drive consumes only 0.4 watts, compared to 1.0 watts or more for a standard rotating notebook hard drive. They also say the SSD has a sustained read rate of 67 megabytes/second, with random access more than 100x faster than a standard hard drive's.

Z Trek Copyright (c) Alan Zeichick

Andrew Binstock pointed me to this CNet article, regarding the forthcoming introduction of terabyte hard drives from Hitachi Global Storage Technologies. Too bad they didn’t ship in December. My, that burger tastes good.

Hitachi GST, by the way, is IBM's old storage division, which Hitachi bought in 2003 and absorbed into its own storage group. I visited the IBM hard-disk plant in San Jose about a decade ago. It had very impressive displays of old washing-machine-sized 5MB hard drives in its museum.

Z Trek Copyright (c) Alan Zeichick

I love the ability to generate gorgeous graphics using Excel 2007 and PowerPoint 2007. But otherwise I'm unimpressed by the software update, and have no plans to upgrade our company to Office 2007 any time soon.

One particular issue I have is the new user interface, which replaces the familiar File-Edit menu system (which was Common User Access compliant) with a new “ribbon” that combines the menu bar with the toolbar — and which moves things around dynamically, depending on how often they’re used.

The last thing PC users need is a new GUI paradigm that attempts to gloss over the software’s incredible complexity, vs. addressing the underlying causes of that complexity. See my comments in this week’s Zeichick’s Take.

Z Trek Copyright (c) Alan Zeichick

Standards are important. Not only for interoperability, but also for sanity.

Take, for example, our two family cars, a Mazda3 hatchback and an Acura TSX sedan. Both are great, peppy cars, and both have five-speed automatic transmissions with a manual shift override feature.

There are odd differences between the cars. For example, on the Mazda3, the controls for the sunroof are above the windshield. On the TSX, they’re on the left side of the cockpit. I think that Mazda’s design is more intuitive, but both are reasonable.

Another difference is in the naming for the stability/traction control feature. Mazda calls it DSC, for Dynamic Stability Control. Acura calls it VSA, for Vehicle Stability Assist. Same feature, though.

Of course, the Mazda3’s gas cap is on the right, and the TSX’s is on the left.

The really annoying difference, though, is in the gate for the manual shift override (which Mazda calls Manual Shift Mode, and Acura calls Sequential Sport Shift). I use this mode on the highway to keep the engine in a high RPM range for decent torque.

With both cars, with the gear selector in “D,” you shove the gearshift over to the left to activate the manual mode.

On the Mazda, to downshift to a lower gear, you push the gearshift FORWARD. To upshift to a higher gear, you pull the gearshift BACKWARD.

On the Acura, to downshift to a lower gear, you pull the gearshift BACKWARD. To upshift to a higher gear, you push the gearshift FORWARD.

This, as you can imagine, drives me nuts. Where are standards when you need them?

Z Trek Copyright (c) Alan Zeichick