My wife, Carole, had been using an eFax account for a few years, but decided it just wasn’t worth continuing. She asked me to cancel it for her, since I was the one who set it up in the first place. Sounds easy, right? Of course not.

Step 1: Try to go through the Web site. Result: Failure.

After browsing to their Web site, I logged into the account with her e-mail address and PIN, and looked all around for “cancel” options. There were none. So, I searched their online help for the word “cancel,” and here’s what it said:

How to Cancel your eFax Account

If you are considering cancelling your eFax account because you are having a problem using the service, keep in mind that the solutions to many common problems can be found in this “Help” section.

If our online help is insufficient or you wish to cancel your eFax account for another reason, please click the blue “Chat Now” button below or click HERE and a Customer Service representative will assist you. Please note that your account should not be considered cancelled until so confirmed by Customer Service.

Step 2: Contact via e-mail. Result: Failure.

Not wanting to telephone or chat, I decided to send them an e-mail, which I did from my wife’s account. It was hard to find an e-mail address, by the way; there was none listed under their “contact us” tab. Finally, I found one, and wrote a simple message: “Please cancel my account and send a confirmation.”

I received a response from a guy who signed himself Alvin F., but whose return address identified him only as “CS – Tier 1 – India.” He replied:

In order to cancel your account, please contact us at 1-323-817-3205,
or by visiting our Live Chat service, at
https://www.efax.com/en/efax/twa/page/chat where a Customer Service
representative will assist you in the cancellation process. We are
available 24 hours a day, 7 days a week.

Not interested in making a phone call, and feeling slightly annoyed, I decided to try their chat line. See the transcript in Part 2.

Z Trek Copyright (c) Alan Zeichick

Last week was a big one for technology enthusiasts and software developers.

The Eclipse Foundation shipped Europa, the huge simultaneous upgrade/release of 21 open-source projects. Apple shipped the iPhone, its groundbreaking multi-purpose mobile device. And the Free Software Foundation updated its General Public License for open source software to v3.

What do these seemingly unrelated events have in common, other than their appearance during the last week in June? Plenty.

First: Each of these new releases was planned very visibly, and each was eagerly anticipated.

• There have been myriad betas of each Eclipse project, and many developers were already working with early code.

• High-paid corporate lawyers, armchair lawyers, software experts, entrepreneurs and technology pundits have publicly debated every inch of each GPLv3 draft.

• The hype around the iPhone reminded me of nothing so much as Windows 95, when people stood in line at midnight to pick up their copies.

Second: The actual impact of each of these releases will be big – but we don’t know how big.

• The Eclipse Europa release is part of a consistent annual cycle of software updates. It’s anchored by version 3.3 of the IDE, which has strong incremental improvements. Changes to other projects vary from significant to esoteric. The Eclipse Foundation, in its press release promoting Europa, highlighted nine of the projects with such language as “The Eclipse Modeling project has updated the Eclipse Modeling Framework (EMF) to support Java 5 generics, allowing for creation and management of more complex and flexible data models.” Despite this bland description, the Europa release demonstrates that Eclipse has legs.

• While not perfect, the iPhone is technologically impressive. I’m not going to buy one until my carrier offers it (I refuse to switch mobile carriers in order to use a specific handset). Hype aside, what’s significant about the iPhone is that it’s the first truly user-friendly device to have a real browser, and real full-time connectivity. Expect to see many other similar “no compromises” mobile devices, not only from Apple, but also from Microsoft, Symbian, Motorola, Nokia, Samsung, RIM and other players. What the iPhone enthusiasm means, beyond that Apple is unparalleled at marketing, is that people want those “no compromises” devices, and they want them to be easy to use.

• GPLv3’s potential impact on the open source movement does not depend on how many projects decide to migrate to it. The license offers enterprise customers and major software companies more confidence in the intellectual-property foundations behind open source software in general, and Linux in particular (if Linux goes to GPLv3, which is uncertain). This reinforces the message that open source is serious business.

Third: Microsoft is behind none of these technology breakthroughs.

This isn’t Windows 95, this isn’t the Xbox, this isn’t Windows Vista, this isn’t MSN. Also, note that today’s other big-hype company, Google, isn’t there either. That means that companies other than Microsoft and Google can innovate. But there is going to be a Microsoft response to each of these; you can count on it.

• The primary motivation behind Eclipse is to help its member companies (such as IBM, Oracle and BEA) save money by collaborating on their toolchains. However, you can’t look at the cross-platform Eclipse without viewing it as the primary competitor to Microsoft’s Visual Studio. While Microsoft has never challenged Eclipse directly, it’s clear that Bill & Co. must view it as a threat. After all, Microsoft knows that developers are incredibly influential when it comes to deployment platforms, particularly on the back end. Every developer who chooses to build server apps using Eclipse instead of VS is potentially denying Microsoft a place in the corporate data center for Windows and Windows Server apps. Expect Microsoft to respond sooner or later. It may not be soon, but Microsoft can’t keep ignoring Eclipse.

• The iPhone dovetails nicely with Apple’s iTunes service and iPod music player. On the surface, it doesn’t compete with anything strategic from Microsoft, whose MSN Music service and Zune music players are minor players. However, Microsoft is trying to be a big kahuna in the phone market, with Windows CE, Windows Mobile and Windows Smartphone. It’s not just about phones: Microsoft wants to supply clients end-to-end, from desktop to tablet to phone, tying them together with Windows servers and MSN services, all programmed through Visual Studio. The iPhone is a threat on Microsoft’s own turf. Expect Microsoft to respond as soon as the hype has calmed down. I expect to see a Microsoft announcement this summer.

• GPLv3 was inspired, in part, by some longstanding holes in GPLv2. But it was also inspired by Microsoft’s threats against the open source movement, its attacks against Linux, and most recently, its IP-sharing agreements with major Linux distributors, namely Novell. In other words, Microsoft is attempting to compete against open source software by waving around its intellectual property portfolio, instead of competing on price, features and security. Microsoft has a huge number of well-paid lawyers. The open source movement does not. This is a problem for the open source movement; GPLv3 is an attempt to solve that problem. Since IANAL, I don’t know if GPLv3 will accomplish that goal. Expect Microsoft to respond obliquely, by spreading FUD about GPLv3 and open source in general.

Z Trek Copyright (c) Alan Zeichick

The Traveling Wilburys released their two official albums in 1988 and 1990. Ever since then, my wife and I have listened to the cassettes we bought nearly two decades ago, later supplanted by a “rip” of those cassettes onto CD-R discs.

But now, our lives have changed, with the newly released CD set comprising both of their albums, with some bonus tracks as well as a DVD. The CD set came in today’s mail (thank you, Amazon!), and we’ve been enraptured. The sound is crisp and clean… but then again, we’ve been listening to a cassette since 1988, for heaven’s sake.

The best are the first-disc tracks featuring the voice of the late Roy Orbison, but there’s not a bad song in the bunch. If you’re a fan of Bob Dylan, Jeff Lynne, Tom Petty, George Harrison and Orbison, you need this.

Z Trek Copyright (c) Alan Zeichick

To many people, it seems that the open source software movement begins and ends with Linux. Take, for example, the popular news Web site NewsForge, which has done a good job of covering the entire open source universe. Last week, its owner, SourceForge Inc. (formerly known as VA Software), announced that it was folding NewsForge into its Linux.com site.

Does that imply that everyone who is interested in news about open source software is focused on Linux? You’d think that SourceForge would know better.

As a fan of open source software, but as someone who doesn’t use Linux on my desktop (I use Mac OS X 90% of the time, Solaris 5% and Windows XP 5%), I guess I find the lofty position of Linux atop the entire open source software movement troubling.

There are many, many open source projects other than Linux, folks. And not all open source projects see themselves as locked in a religious life-and-death struggle against Microsoft Windows.

For example, how about the myriad Eclipse projects? Most people that I’ve seen coding with Eclipse are running it on Windows workstations, and many are targeting Windows servers as a runtime for Java apps built on Eclipse. How about OpenOffice? Sure, it’s a way to bring a first-class Office suite to Linux – but it’s also a way to bring a cross-platform Office suite to Windows and the Mac as well.

Mozilla? Lots of people run Firefox on Windows and Mac. Apache? Many of Apache’s projects are centered around Linux, but many of them aren’t. NetBeans? It’s cross-platform. OpenMake runs on Linux, Windows and Solaris. Speaking of which, you can’t tell me that OpenSolaris is part of the Linux agenda.

Many people like using the gcc/gdb tools from the Free Software Foundation precisely because they’re excellent and cross-platform, not just because they’re available for Linux.

Yet, in many open source projects, there is nothing but contempt for anyone who would use an operating system other than Linux, even if the project itself supports other platforms. I remember, not very fondly, posting a message on a project newsgroup about a problem we had running an open-source package on a Windows server. The sole response advised me to run the package on a Linux server instead. Thanks for nothing.

Yes, Linux is arguably the highest-profile open source software project. I won’t dispute that. However, the common belief that it’s at the center of the open source universe (as in NewsForge’s mission to be “the Online Newspaper for Linux and Open Source”) does credit to nobody.

The open source community should realize that not every supporter of open source software uses Linux, or even cares about Linux. The open source community also needs to be less contemptuous of open source projects that don’t focus on Linux, or that may not even run on Linux.

Equally, of course, enterprise IT managers need to realize that Linux is merely one of many open source projects. You can choose to run Linux, or choose not to run Linux. You can also choose to use other open-source platforms and tools, or choose not to. But your support of open source in general should be distinct from your interest in Linux. Sadly, this is difficult, since so many open source zealots focus on promoting Linux to the exclusion of the movement’s other success stories.

By pretending that Linux is the epitome of the open source movement, the open source community is acting against its own best interest.

Z Trek Copyright (c) Alan Zeichick

A highlight of the Linux Foundation Collaboration Summit (which I attended on June 13) was an afternoon panel entitled “How do we get more applications on Linux?”

Moderated by Dan Kohn, COO of the Linux Foundation, the panelists were Mike Milinkovich of the Eclipse Foundation, Darren Davis from Novell, Kay Tate from IBM, Scott Nelson from RealNetworks, Ed Costello from Adobe, and Brian Aker from MySQL. (A word of caution: These quotes are based on my real-time notes at the conference, and I can’t guarantee that they’re word-for-word exact.)

I hadn’t expected that the panel would be dominated by a discussion of the Linux Standard Base, but it became clear about eight seconds into the discussion that the LSB is the be-all and end-all of getting third-party developers to target generic Linux.

Think about it: If you want to target Windows, you know which APIs to write to; if you want to target Mac OS X, you know which APIs to write to; if you want to target Solaris, you know which APIs to write to. Linux? You shouldn’t have to choose to write to Red Hat Linux, or SUSE Linux, or Debian Linux. You should be able to simply write to Linux.

That’s why the Linux Standard Base was envisioned as a set of consistent programmatic interfaces for applications developers. All Linux distros would implement the Linux Standard Base, and therefore, app developers could just target the LSB, secure in the knowledge that their source code should compile to run everywhere without modification. Customers, by looking for apps certified to run against the LSB, could be secure in the knowledge that apps would run on their own preferred distro.

Unfortunately, said Kohn, “There are a few hundred LSB certified applications, and a couple of dozen commercial ones. Why is Linux not getting thousands of applications?” Compare that against thousands upon thousands of certified Windows applications.

MySQL’s Aker said, “If the question is, how do you get [Windows applications] ported to Linux, well, you don’t! You never win there. You win in getting Linux used in new development. If the LSB can push new development onto Linux, you’ll win. If you find folks who are doing Windows applications today and get them to write in the first place for Linux tomorrow, that’s how you win.”

Novell’s Davis agreed. “The low-hanging fruit’s already done. I agree, the next step is new application development. We have to get ISVs involved in targeting Linux as a platform. What comes through Novell are customer requests, and that’s all customers care about. Sure, we want to promote Linux, but customers don’t look at it that way. They’re looking at a business problem, and then finding an application to solve that problem. You have to have those on Linux.”

IBM’s Tate said, “We have great Linux certification by distro, and that’s the industry expectation. But being able to do certification at the application level, and teaching customers to look for that, is something that the Linux industry isn’t used to doing.” And indeed, the various commercial Linux distros push their own specific certification programs.

Eclipse’s Milinkovich expanded on this: “What you want is applications running on Linux. It’s a virtuous cycle. It’s not about getting products on top of Linux — it’s about getting the bank teller using Linux at the counter. Make Linux more ubiquitous, so you have an installed base of machines that ISVs will want to sell.”

He continued, “You’ve already eaten all the early adopters, now you have to get to the mainstream. Developers aren’t looking at the [software development] technology you’re including, like gcc and gdb — they think you’re nuts! You need a culture change. You have to give them application frameworks that they like. You can knock Microsoft all you want, but many developers like what Microsoft gives them to write apps. There’s nothing like that in Linux. Sure, I have an agenda, the Eclipse Rich Client Platform, which does that, but you have to recognize that if you want app developers, the status quo won’t get you there.”

A challenge, agreed the panel, is that the LSB, as a platform, doesn’t appeal to developers who aren’t hard-core Linux enthusiasts, but rather are considering it merely as another target platform.

Novell’s Davis said, “I have sent ISVs out to the LSB Web site, and they’ve come back screaming that they don’t know how to use it. You need tools to make using LSB easier. ISVs do the math and look at the market size, and can pick one or two distros — and that leaves the other distros out in the cold. We need LSB to work to stop it all from being dominated by Novell and Red Hat.”

MySQL’s Aker amplified that point: “I think it’s gotten worse. Different kernels on different distros means lots of different bugs that we have to support. That’s hard to describe to a customer, when there’s a problem that’s specific to a specific vendor build. This is getting worse.” He said, “There’s no money to be made in making tools. Getting vendors to implement new tools is really hard. Eclipse is the only forward option for tools.”

The moderator, the Linux Foundation’s Kohn, admitted, “The biggest competitor to the LSB is the makefile. If you have an open source app that you’re releasing, you create a makefile for each of the distros, BSD, Solaris, and so on. The user compiles and it just works. The LSB project is trying to make the world safe for binaries, instead of source code, which isn’t the sexiest rallying cry. But Linux has to be that way to become a modern platform.”

That’s true. If your platform requires end users to compile applications, you’re not even in the game. As MySQL’s Aker said, “People use the distributions, and don’t want to compile their own applications.”

That’s a problem that this panel didn’t even come close to addressing.

Another problem that wasn’t addressed is that, for the most part, the majority of desktop application developers don’t want to target Linux specifically – they just want to sell or give away lots of software. The economics aren’t there – unless they have a specific business reason for targeting Linux directly, they’d be better off focusing on platforms with larger installed bases, and which therefore present a bigger business opportunity.

As the Eclipse Foundation’s Milinkovich said (and this was easily the best quote of the entire conference), “If you’re a startup, and the only platform you’re supporting is Linux, it sucks to be you.”

Z Trek Copyright (c) Alan Zeichick

If the Linux community has a hero other than Linus Torvalds, it’s Mark Shuttleworth, a dot-com gazillionaire who started the Ubuntu Project, and who funds it out of his own pocket. If you’re not familiar with Ubuntu, it’s a distro based on Debian Linux, which is designed to offer a one-disc install of everything you need: operating system, device drivers, and lots of free applications.

Unlike many Linux distros, Ubuntu was intended to be used by civilians. Shuttleworth has pledged that Ubuntu will always be free.

I’m a fan of Ubuntu Linux, and think that Mark Shuttleworth is a good guy. However, to many in the Linux community, the South African entrepreneur is a rock star, a god. What’s more, Ubuntu Linux is seen by many as the first, and only, Linux distribution that might have a shot at gaining market share on the desktop. That makes it the only viable open-source alternative to Windows.

No matter whether you like Shuttleworth, or worship him, or hate him, or are fairly ambivalent, there’s no doubt that he’s very influential in the Linux community. So, his keynote address at the Linux Foundation Collaboration Summit (which I attended on June 13) was worth listening to, particularly because it presented a frank window into the state of the Linux platform.

Shuttleworth, naturally enough, praised the concept of the Collaboration Summit, the first time that such a wide swath of the Linux community came together ostensibly to find better ways to work together to improve the platform, both technologically and in the marketplace.

“It’s a chance for us to build relationships, and accelerate collaboration,” he said. “I enjoy seeing the extent to which people who aren’t generally ‘free software folks’ are hearing about free software, and are starting to see that this world is more interesting than they thought it was.” (A word of caution: These quotes are based on my real-time notes at the conference, and I can’t guarantee that they’re word-for-word exact.)

Speaking to the non-Linux attendees (such as press and analysts), he said, “It’s not just Windows vs. Mac, there’s a whole other world out there. Many people want it to be Red Hat vs. Microsoft, or Ubuntu vs. Microsoft. But that’s not what it is.”

What separates Linux from Windows and the Mac, he said, is the large number of different parties, and different vested interests, who have come together to make the platform work. “Innovation works best when you have people who are inspired, and you let them work. However, you need a framework for innovation that brings them together.”

Enthusiasm isn’t enough, though, Shuttleworth said. “We have some disadvantages. The other guys can spend more, in hard dollars, than our guys. On the other hand, we have an extraordinarily rich pool of talent, and have the freedom to innovate. Collaboration sounds beautiful, but it’s hard. Collaboration means working together, but that’s labor — that’s real work.”

Shuttleworth warned against pitfalls caused by biases, by lack of knowledge, or by the kind of unhealthy group dynamic that arises when projects are run by close-knit meritocracies with little patience for people who are less technical, or who have a different perspective.

“Too often, people say ‘I don’t want to work with that distro,’ or ‘I didn’t know where to send my patch.’ We have lots of tools, like mailing lists. Unfortunately, they shake things up so that the loudest ideas bubble up. Think about poisonous people! How can you run a project so that the best ideas win, vs. the loudest ideas win?”

But an even harder problem than collaboration within a project, he said, is broader collaboration between the myriad projects within the open source community. “We also have version control, wikis, Web sites, lots of tools, but most of those tools focus on collaboration within a project. If you’re in a project, you know where to go, and who to talk to within your project. The big problem is collaboration between projects.”

Shuttleworth pointed out several areas where things fall down, like translations into other languages, bug management and patch distribution.

“One clear thing is that translations fail to move upstream. People translate what they’re using, but it doesn’t move up. It’s not intentional, but we don’t have standard processes or conduits so they can send their translations upstream. The same thing with bugs: You may have people who reported a bug to Red Hat, or SUSE, or Debian, or Ubuntu — but they’re like silos. We need a way to aggregate those eyeballs.”

He added, “It’s the same with patches. A patch solves one person’s need. How can we accelerate the process of moving patches upstream? We need something that’s more of a star topology, to move patches up and around.”

A bigger question Shuttleworth asked: “How do we minimize the barriers to participation, and include the v3 guys, and BSD, etc.? A central database is not the answer.”

A minor question, which clearly irks Shuttleworth: “Why can’t I programmatically access the kernel development team’s database through APIs?”

Shuttleworth then jumped back to translations. “Here’s the point of friction: The tools are difficult. In order to contribute translations, you have to have Committer access [to a project], and that’s not good. How can we get people to contribute translations without requiring they be developers, or that they use development tools? There are also different frameworks [for localization]. Mozilla’s framework is different than OpenOffice, for example. It’s worth the effort to break down the barriers. We need to agree that the inability for work to flow across projects is a problem.”

On bug tracking, he added, “The biggest problem is the friction to file the bug! You don’t understand the bug tracker, or the project’s conventions, or even know if you would get a response. If you have to have a personal relationship [with someone on a project] in order to get a bug into the project, that’s a problem. We should think about federation, and find standard ways to describe bugs, and common APIs for submitting bugs.”

Shuttleworth commented, “In Ubuntu, we took as role models Mozilla because we like their roadmaps, Gnome because of their commitment to release cycles. We like blueprints of specifications, since they let micro communities form around a particular piece of work.”

He pointed out a real problem, that the meritocracies found in many open source projects may help development, but may hinder deployment and adoption. “How do we remove the hard lines between committers and everyone else, i.e., the line between first-class citizens and everyone else? Why is someone who receives a tarball [instead of accessing the source code repository] a second-class citizen? Getting everyone to participate is important.”

Shuttleworth concluded by speaking to people who run projects, and the lack of tolerance found within many projects: “Ultimately, we need to remember that while tools are important, it’s people that collaborate. People, not robots. The decision-making process is important. Unfortunately, meanness spreads: I get disappointed when I hear that if you disagree with me, you’re stupid. We agree on an awful lot. I would appeal to all of you to remember that leadership is important, and collaboration is more important than our differences.”

Z Trek Copyright (c) Alan Zeichick

If you talk to anyone about open source software these days, the topic of the Free Software Foundation’s General Public License v3 is sure to come up. People – or rather, those people who think about open source licenses – generally fall into three categories:

• GPL v3 is the worst thing since Rome was sacked by Alaric, King of the Visigoths, and will result in the destruction of the open source movement. This seems to be the majority opinion, and is argued very passionately.

• GPL v3 is an improvement over GPL v2, primarily because it might do a better job of reining in Microsoft. This seems to be the minority opinion, and is expressed without much passion.

• GPL v3 is just another open source license, use it if you want, and don’t if you don’t. Not too many people express this opinion, but it’s the one that I hold.

My own preference would have been that the Free Software Foundation, which “owns” the General Public License, had spawned GPL v3 as a brand new license, rather than presenting it as a simple update of the very successful GPL v2. For example, there is already the LGPL, the Lesser General Public License. Perhaps what we’re seeing as GPL v3 could have been called the GGPL v1, for the Greater General Public License, or SGPL v1, for the Strong General Public License.

Why does this matter? Well, some software (I don’t know how much) is written to say that it conforms to “GPL v2 or later.” Thus, such software will automatically fall under the provisions of GPL v3 when that license is finalized, which I doubt is what the people who wrote “GPL v2 or later” intended.
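
To make the mechanism concrete, here is the kind of notice I’m talking about: a hypothetical source-file header, sketched here as Python comments, using the FSF’s standard suggested wording for GPL v2. The “any later version” clause is what pulls such code under GPL v3 automatically.

# example_module.py -- hypothetical file header using the "GPL v2 or later" wording
#
# This program is free software; you can redistribute it and/or modify
# it under the terms of the GNU General Public License as published by
# the Free Software Foundation; either version 2 of the License, or
# (at your option) any later version.
#
# The "(at your option) any later version" phrase means a recipient may
# choose to take this code under GPL v3 once that license is finalized.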

Note: I think that Richard Stallman and the FSF knew exactly what they were doing, and see the automatic migration of all “GPL v2 or later” projects to GPL v3 as essential to swiftly establishing a base of GPL v3 intellectual property.

I was pleased that at the Linux Foundation Collaboration Summit (which I attended on June 13), many presenters held a level-headed view of the GPL v3 issue. One speaker, Dan Frye of IBM, seemed to speak for many there when he said, “Just chill when v3 comes out. Let’s just see how things go.”

In all fairness to those who were appalled by the first drafts of GPL v3, the initial versions were terrible, and would have done tremendous harm. Richard Stallman deserves kudos for his uncharacteristic flexibility. A kinder, gentler Stallman – who’d have thought it?

So, chill, everyone.

However, IANAL: I am not a lawyer.

There was a panel of attorneys at the Summit discussing intellectual property and license issues, and it was fascinating. McCoy Smith, a senior attorney at Intel, made the point that while individuals and even companies can try to adhere to what a community generally believes a license says, that’s only valid as long as people are working together directly. As soon as a dispute goes to court, he said, decisions are made based on the specific language in the license – and not on what everyone “knows” the license is supposed to mean.

Smith, who worked on the GPL v3 project, said, “This came up a lot with GPL v3. The FSF said we had this v2 license for a number of years, but there are potential issues with the way that it was written that could be exploited by its enemies. So, there’s a problem where amateur lawyers might not capture their intentions properly. However, this largely polices itself. If everyone in the community abides by the intention, and it’s not going to court, you’re okay.”

But if Linux does go to court – perhaps in cases involving Microsoft or SCO – licensing terms like the GPL v2, which were written by amateur lawyers, might not be good enough.

As Jason Wacha, general counsel for MontaVista Software, said, “Sometimes amateurs are better at describing what they want to do. The five most dangerous letters in the English language are IANAL, especially when you try to interpret copyright or patent law. If you’re not a lawyer, don’t try to interpret the law. You have to go by the actual words in the statutes, not by how people have been interpreting that statute.”

So, chill – carefully.

Z Trek Copyright (c) Alan Zeichick

At the Linux Foundation Collaboration Summit (which I attended on June 13), Jim Zemlin, executive director of the Linux Foundation, observed, accurately, that the Linux movement has changed. He stated that, from the enterprise perspective at least, the days of having to build awareness for Linux, and for open source in general, are long since over. He’s right. Within most organizations, in my experience, Linux is seen as just as viable an option for servers as Windows and Unix (specifically, Solaris). That’s both good – and bad.

We’re past the days of religious wars, and we’re also past the days when Linux is chosen merely because it’s free or because it’s open source. The costs of using and deploying Linux aren’t significantly different from those of deploying Windows or Unix on servers. To be honest, the licensing cost of software is only a small part of the long-term total cost of ownership.

However, Linux itself has challenges, which I was pleased that the Linux Foundation meeting was honest enough to admit.

For example, because Linux is developed primarily by individuals working on things that they find interesting, Linux lacks the directed evolution of Windows, Unix, Solaris or Mac OS X. Thus, there were many people at the conference talking about the inconsistent state of power management within the Linux kernel and kernel-level device drivers. Everyone acknowledged that it is a problem – but nobody could do anything about it.

Similarly, there are many views as to the best way to prevent fracturing of commercial Linux distributions around kernel levels, but no agreed-upon way to solve that problem. While the Linux kernel team itself is run as a benevolent dictatorship, most other decisions are left up to the individual commercial distributions, who pointedly do not coordinate with the Linux Foundation or with each other.

Of course, not all the issues facing Linux have to do with process. There’s a lot of dissent within the community regarding licensing. The Free Software Foundation’s General Public License v3 is a huge polarizing factor, and as the Linux Foundation explains, even if the bulk of the community wished to adopt the new license (which is uncertain), the process of moving code to the GPLv3 would be incredibly time consuming. It just ain’t gonna happen, folks.

For now, and for the next several years at least, it seems clear that there will be three separate Linux worlds:

• Linux on servers: Hugely successful. Because servers typically run on a fairly limited set of hardware, because most enterprises choose an operating system when they buy server hardware, and because a particular server runs only a small number of applications at one time, Linux’s limitations in terms of device drivers and applications are not a significant factor.

• Linux on mobile devices: Hugely successful. As the representative from Motorola, Christy Watt, said during the Linux Foundation meeting, “We believe that about 60% of our base will be on Linux soon. We have shipped 6 million devices on Linux already.” The recompilable kernel, ability to create custom drivers, open-source licensing and cost factors are excellent for phones, PDAs and other devices.

• Linux on the desktop: Not even close. There have been tremendous strides in this area, but device drivers remain a challenge, particularly for top-end graphics cards. Another challenge is the proliferation of user interfaces. Despite the amazing success of Ubuntu Linux, Linux desktop and notebook PCs will be found mainly in three places: task-specific desktops (such as cash registers or point-of-sale systems); machines used by True Believers; and low-cost desktops, such as those deployed into the third world. For the mainstream home and office market, the world belongs to Windows (and the Mac, as a distant runner-up), and it’s going to stay that way for a long, long time.

Z Trek Copyright (c) Alan Zeichick

According to Managed Objects, a company that sells software service management software, software is at the root of all evil.

Managed Objects conducted a survey of 200 IT managers in the United States, and reports that 61% of those IT managers say that software is generally the culprit for IT downtime, compared to 21% who blame hardware. (The rest don’t know if it’s hardware or software.)

Further, according to Managed Objects, IT managers are more likely to blame software for IT failures “if their organization relies heavily on home-grown applications.”

To quote:

Complicating the battle to reduce downtime is the pervasiveness of revenue-driving homegrown or custom applications within large companies’ infrastructure, which according to survey results can represent up to 90 percent of some organizations’ application mix. Within organizations relying on more homegrown applications than off-the-shelf offerings, more than 80 percent of respondents blamed software as the primary cause of most outages.

These findings are particularly significant when placed against the backdrop of results that showed just how often application downtime occurs. 82 percent of surveyed organizations reported application outages in the last year significant enough to impact their businesses, at an average cost of more than $10,000 per hour and an average duration of between three and four hours.

Bear in mind, of course, that it’s in Managed Objects’ best interest to make this issue look dire. The company wants to sell you tools designed to measure application uptime, enforce software service level agreements and implement end-to-end software management. This study was done as part of their marketing push around a new product called Application Configuration Study, launched last week. Plus, a survey of only 200 IT managers constitutes a pretty small sample, and it’s unclear how the study defines an IT failure, if it does at all.

Even so, the results are interesting, and accord with my own observations. Hardware failures in a modern data center are fairly rare. If proper precautions are taken (such as using network load balancers, fault-tolerant devices and redundant systems), a device fault should not result in an “IT failure” at all.

Z Trek Copyright (c) Alan Zeichick

The software security tools market has been ripe for the picking, for two reasons:

• We had a lot of small, privately held companies developing exciting but often overlapping technology, and those companies had trouble finding customers and going to market. Their exit strategy is to be gobbled up by a big fish.

• The big fish were significantly behind the times when it came to integrating security into their developer tools. Their business models favor buy-it instead of build-it for this type of technology, but they weren’t buying.

Until recently, that is. Until this month, none of the big fish, like IBM Rational, Mercury (now part of HP), Microsoft, Oracle, Sun and Borland/CodeGear, had integrated security as part of their IDEs, test/QA tools, or application life cycle suites. The notable exception has long been Compuware, with its DevPartner SecurityChecker. It was only a matter of time before the fish started munching, and frankly, I expected the feeding frenzy to start quite a while ago.

The dam broke a couple of weeks ago, when IBM announced the acquisition of Watchfire. Today, we have the second big move, as Hewlett-Packard said it will buy SPI Dynamics.

This is only the start.

Microsoft needs to incorporate a top-quality security solution into Visual Studio Team System, encompassing not only the edition for software testers, but also the editions for architects and developers. This is a glaring weakness. While Microsoft could build the functionality itself (as it did with the other ALM tools in Team System), it would do better buying one of the existing players, and that’s what I think the company will do.

Oracle is in a similar situation: It doesn’t have software security functionality, either baked into its tools or as stand-alone offerings. It’s hard to predict what Oracle will do, whether they’ll build it, buy it or ignore it.

Borland and its CodeGear subsidiary are also behind the times. Software security is a core part of the application life cycle, but there’s no specific security offering in the Borland pantheon, not even in its Silk and Gauntlet tools. As with Microsoft, Borland could build or buy, but I expect it to buy. I hope whatever it does gets baked back into the CodeGear IDEs; that’s where software security belongs, as tightly integrated into the developer desktop as spellcheck is integrated into a word processor.

The Eclipse and NetBeans projects need to start software security initiatives. The lack of a security project, either as a top-level element or as part of the Eclipse Test & Performance Tools Project, is a huge oversight, but given how the Eclipse Foundation works, it takes a commercial vendor to initiate a project. Now that IBM is buying Watchfire, it’s less likely that Big Blue will push Eclipse in this direction. NetBeans, by contrast, is driven by one company’s strategic vision. However, Sun needs to be more strategic and visionary here, and get on the ball.

Fortunately, there are more software security companies available for purchase. It won’t be long until we see the big fish gobble up Agitar, Armorize, Cigital, Fortify, Klocwork and Ounce Labs. Who do you think will go first?

Z Trek Copyright (c) Alan Zeichick

A great article, for all you methodology fans, is a cover story in today’s SD Times: “It’s Lean, But Is It Agile?,” written by Jennifer deJong.

As Jenn writes,

Are lean software development and agile software development—of which XP is the most prominent example— one and the same? They both are iterative approaches to developing software, and in some respects lean and agile are closely aligned, said Beck. Several other agile experts interviewed by SD Times agreed.

But when asked whether lean software development is an agile methodology, alongside the others that fall under the agile umbrella—Adaptive, Crystal, Dynamic Systems Development Method, Feature-Driven Development, Scrum and XP—only one of the experts said yes.

The story continues,

Forrester analyst Carey Schwaber agreed that lean software development and agile methodologies are two complementary schools of thought. “There are quite a lot of people out there using the terms ‘agile’ and ‘lean’ interchangeably, but they are not the same thing,” she said. She has also seen agile consultants attempt to differentiate themselves by citing expertise in lean. “But often that expertise isn’t much to speak of.”

It’s my experience that many developers are unfamiliar with the concept of lean software development. Take a few minutes, read Jenn’s story, and tell me what you think about lean software development. If you find the concept intriguing, check out “Implementing Lean Software Development,” by Mary and Tom Poppendieck.

Z Trek Copyright (c) Alan Zeichick

This morning, CodeGear sent me a new picture of David Intersimone, to replace the several-years-old file photo that I used in my previous blog posting and on our conference Web site. David I, as he’s widely known, is an incredibly well-respected developer evangelist at CodeGear, and is a keynote speaker at EclipseWorld 2007.

So, cast your votes. Which do you like better: New David I (pictured), or David I Classic?

Z Trek Copyright (c) Alan Zeichick

“Life’s too short to run beta software.” That’s long been my philosophy, specifically in regard to my own personal workstation.

Back when I did a lot of hardware and software testing, there were servers and workstations designated as testbeds, while my personal workstation (sometimes a Mac, sometimes a PC) was sacrosanct.

Every so often, however, I’m tempted to use the latest-and-greatest beta code, and this generally proves to be a disaster. It sure did with the Safari 3 beta for the Mac (announced at the Apple Worldwide Developer Conference), which I installed Monday afternoon, and uninstalled this morning.

There were three significant problems with this beta of Safari 3 on Mac OS X 10.4.9 — even when the new browser itself wasn’t running:

1. It devastated iChat, Apple’s instant messaging application. The application itself ran slower than molasses: even typing text into the entry field appeared o n e c h a r a c t e r a t a t i m e s o m e t i m e s s e c o n d s a p a r t. Pressing Enter to send a message caused a delay of between 10 and 30 seconds. Invitations to chats were painful: I’d hear the sound effect, indicating that someone wanted to chat, and then iChat would hang for up to two minutes before an error message dialog would pop up, saying that there was a problem joining the chat.

2. It wounded Skype, which kept shutting down. Sometimes Skype would close immediately after launch, and sometimes it would stay up for a few hours… but when I went to use it, I’d find that it had quietly closed.

3. It hurt the overall system, in that many apps seemed to be running slower. I have a menubar CPU meter, and also kept an eye on the Activity Monitor, and couldn’t see an obvious cause for the slowdown. The biggest issues were with apps that accessed the Internet, such as NetNewsWire, my RSS reader.

This is all a pity, in that Safari 3 itself seemed to be a real improvement over Safari 2.0.4, the current “stable” version. During this test period, I’d switched from Firefox 2.0.0.4 to Safari 3 as a primary browser, and it was doing fine… other than those unbearable problems.

Fortunately, Apple included an uninstaller with the Safari 3 beta, which brought back the Safari 2.0.4 code. Everything’s back to normal. And next time, I’ll wait for the finished code before putting it on my personal workstation. Frankly, I knew better.

The unfortunate aspect is that if Apple had followed safe programming practices, and treated Safari as “just another application,” these compatibility issues would not have occurred, especially when the application itself wasn’t running.

I’ve not done the forensics, but it seems a reasonable guess that the Safari 3 beta installer replaced some shared libraries or altered the network stack in some way… which is bad, bad, bad. Applications shouldn’t be able to affect the operating system and its shared libraries. That type of naughty behavior is one of my biggest beefs with Microsoft (whose installers for products like Windows Media Player, Internet Explorer and Office replace key Windows DLLs), and I’m disappointed to think that Apple may have fallen into that trap.

Z Trek Copyright (c) Alan Zeichick

It’s an incredible coincidence. The same day that BZ Media put out a press release about the record-setting June 1 issue of SD Times, CMP Media put out a press release about its continuing shift to an online media company. That involves laying off 200 people and shuttering three publications: Network Computing and Optimize, both of which will be folded into InformationWeek, CMP’s newsweekly, and SysAdmin, which is being closed outright.

To quote from the release:

“We found last year for the first time, our non-print revenues outstripped our print revenues,” CEO Steve Weitzner said. “This year that trend is continuing and the gap is actually growing. We want to realign internal resources around these growth areas and look at opportunities in the marketplace and really go after them.”

Consequently, Weitzner said, the company is putting its online businesses “at the center of everything we do and changing how we do print.”

SysAdmin and Network Computing will be missed. (Frankly, I don’t think that Optimize will be: I never quite understood who that magazine was for, and what value it provided.) The inclusion of Network Computing into InformationWeek seems like a terrible mistake – I’m not sure what value it offers either magazine’s subscribers or advertisers.

My own relationship with Network Computing goes back about a decade. That’s before CMP (which published Network Computing) merged with Miller Freeman (which published Network Magazine, of which I was the editor-in-chief).

Network Magazine was, in my opinion, the better-written, and more thoughtful publication discussing the emerging trends and high-level issues around LANs and WANs, SANs and NAS. However, Network Computing pounded us into pudding, with its heavy emphasis on labs-based product reviews. Written very pragmatically, talking to LAN and WAN managers about the stuff they needed to buy, Network Computing was unstoppable.

In September 2005, after CMP and Miller Freeman merged, Network Magazine was renamed IT Architect. The publication shut down after its March 2006 issue. CMP did the right thing in trying to differentiate Network Magazine and Network Computing.

The decision to merge Network Computing into InformationWeek makes no sense. Network Computing is/was a magazine for network managers and senior network administrators, who are solving specific technical problems, primarily around network and infrastructure software, as well as wide-area connectivity. InformationWeek is a news-and-analysis weekly written for CIOs and senior IT managers, discussing the “vital issues of the day” and providing guidance for making strategic IT decisions.

It’s not a fit. InformationWeek’s readers aren’t network managers and network admins. Frankly, it looks like a naked ploy to keep some of Network Computing’s advertising revenue. CMP will doubtlessly spin a good tale, of course, as to why this IS a good fit. But if it’s such a good fit, why weren’t those advertisers already advertising in InformationWeek before this?

Z Trek Copyright (c) Alan Zeichick

We just announced a second keynote for the EclipseWorld 2007 conference: Robert C. Martin, founder and CEO of Object Mentor. “Uncle Bob” is a top expert in object-oriented design, with tremendous expertise in C++ and Java, and also speaks and writes brilliantly on agile methodologies and software craftsmanship.

Also, at EclipseWorld, “Uncle Bob” is teaching one of our full-day tutorials, on test-driven development.

It’s going to be a great conference!

Z Trek Copyright (c) Alan Zeichick

Steve Jobs’ keynote at the Apple World Wide Developers Conference is sometimes huge with news, sometimes less so. This year’s news was weaker than most. Jobs didn’t unveil new hardware or new developer tools. He didn’t announce new software, but did distribute betas of Mac OS X 10.5 “Leopard” to paid attendees (not to press/analysts).

The announcement that won the biggest applause from WWDC’s 5000+ developers was that two major game companies – Electronic Arts and id Software – will be increasing their support of the Mac platform. In EA’s case, a number of best-selling games will be released for Mac OS X, and in the future, there will be some simultaneous releases for Windows, Mac and game platforms. id’s game engine, which normally targets the Xbox and PlayStation, will soon be available for the Mac.

There was less frequent cheering during Jobs’ demonstration of 10 features out of 300+ planned for Leopard, scheduled to ship in October. The biggest crowd pleasers were a new graphical desktop, an improved Finder and Core Animation.

The audience also liked that there will be more robust 64-bit support in Leopard, and that the new OS will come in a single SKU, priced at US$129, instead of the bewildering number of versions of Windows Vista. The crowd reacted favorably to the surprise announcement of Safari for Windows (a public beta of Safari 3 for Windows and Mac OS X 10.4 is now available for download). The new backup application, Time Machine, also was popular.

The biggest yawner was a tool to let consumers make their own dashboard gadgets by making clippings from Web pages. Snooze.

The biggest disappointment was Jobs’ “one final thing” presentation of Apple’s official way for third-party developers to create apps for the iPhone: write and host Web apps for Safari using AJAX. That was not what this audience had hoped for.

Three elements in the keynote are worthy of additional discussion: 64-bit computing, Safari for Windows, and the iPhone development news.

64-bit computing. Apple has flirted with 64-bit software since the release of the PowerMac G5, but went back to 32-bit with the first Macs running the 32-bit Intel Core processor. Today, however, most shipping Macs use the 32/64-bit Intel Core 2 processor. Since Mac OS X 10.4 (and earlier) is 32-bit when running on Intel, the news that Leopard will include both 32-bit and 64-bit kernels is welcome.

What about 32-bit/64-bit interoperability? It’s common for 64-bit operating systems using the x64 platform (like the Intel Core 2 family and the AMD Opteron family) to be able to run 32-bit software. 64-bit Linux and Unix can generally run 32-bit binaries directly; Windows uses a software abstraction layer called WoW64 (which means Windows 32 on Windows 64). Mac OS X, remember, is based on Unix.

That’s the good news. The bad news is that the inverse is usually not possible: You can’t generally run 64-bit binaries on a 32-bit version of the operating system, even if the chip is nominally a 32/64-bit chip. The hardware architecture generally can’t handle it, and software emulation would be extremely computationally complex (and would have terrible performance). But that’s what Jobs seemed to imply: that you could run both 32-bit and 64-bit applications on both 32-bit and 64-bit Macs running Leopard.

Jobs wasn’t explicit in stating that a 32-bit hardware platform (like my first-generation 32-bit iMac Core Duo) would run 64-bit apps under Leopard — but again, he implied it. This is a big question mark, and I have queries into Apple about it.
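
As an aside, if you want to see what you’re actually running today, you can ask a running process directly. Here’s a minimal sketch in Python (assuming a reasonably modern Python 3 interpreter is installed); it simply reports whether that interpreter is a 32-bit or 64-bit binary, and what machine type the OS reports:

import platform
import sys

# sys.maxsize is larger than 2**32 only in a 64-bit build of the interpreter
is_64bit_process = sys.maxsize > 2**32

print("Interpreter build: " + ("64-bit" if is_64bit_process else "32-bit"))
print("Machine type reported by the OS: " + platform.machine())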

Safari on Windows. Jobs talked about the market share of Safari at 5% of all browsers, vs. 15% for Firefox and about 76% for Internet Explorer. To boost market share, he announced Safari for Windows, and the company released the public beta for Windows and Mac OS X 10.4.

The only benefit that Jobs presented for running Safari on Windows was speed: HTML rendering with Safari for Windows was about 2x faster than IE 7, and 1.6x faster than Firefox 2, he said. That’s not a reason to change browsers! Security, features and interoperability are ‘way more important than rendering speed.

Sadly, Jobs didn’t mention security, features or interoperability. Specifically, he didn’t discuss the reason why so many Mac users abandoned Safari 2 and embraced Firefox 1.5 (and later, of course, Firefox 2): There are too many Web sites, especially those optimized for Windows or using extensive Web 2.0 functionality, that simply don’t work right with Safari 2, but which work great on the Mac with Firefox.

I would have liked to have heard Jobs pledge to fix those issues instead of implying that the only reason to choose a browser is its rendering speed.

iPhone development. Developers have been frustrated by the on-again, off-again rumoring about third-party apps on the iPhone. Looks like “off-again” won, and that’s a shame. Third-party apps are what define a platform and ensure its success. Jobs’ recommendation that developers should just build AJAX applications, and let customers access them via the iPhone’s embedded Safari browser, is a solution that will satisfy nearly nobody.

Sure, Apple has created hooks that would let AJAX apps use built-in iPhone services, like making phone calls or sending e-mail – but that’s not what developers want. They want an SDK. And if Jobs really thinks that customers and developers would rather run third-party apps as AJAX in a browser, why bother encouraging the development of native Universal Binary software for Mac OS X desktops and notebooks? Running AJAX apps in Safari does not compute.

Not heard at the WWDC. Completely not addressed by Jobs: Application development for Leopard. Nothing about APIs, SDKs or an update of the Xcode development suite. Disappointing. Also not mentioned at the keynote was last week’s leak, by Sun’s Jonathan Schwartz, that Leopard would use Sun’s ZFS file system.

I went home from the Apple WWDC empty handed, both literally and figuratively. Beyond the preview of Leopard, and beta of Safari for Windows, there wasn’t much to get excited about – and neither of those really spoke to developers.

Z Trek Copyright (c) Alan Zeichick

Today, IBM announced that it’s buying Telelogic, a leading company in the modeling space, specifically a leader in model-driven development. Last week, IBM announced that it’s buying Watchfire, a mid-sized innovator in security and software testing. These are both savvy moves by IBM. But while the Watchfire deal is good for everyone, the Telelogic deal is less so.

With Watchfire, IBM shores up a big hole in its lineup: its IBM Rational tools are popular, but the 800-pound gorilla didn’t have the right offerings for helping companies focus on writing secure code. Buying Watchfire’s technology makes sense.

The days, I believe, of the stand-alone software security suite are soon going to be over: the IDE vendors, from IBM to Microsoft, are going to have to address security as a core feature of their software development products.

It’s analogous to spell checkers in word processors: once upon a time, spell checkers were external programs that people bought and used during a special “proofreading” phase of document creation. Today, continuous spellcheck is integral to not only word processors, but just about every application that has a text input field. That’s how it should be for spelling – and that’s how it should be for software security as well.

I see the Watchfire move as positive for the industry, as it brings software security to a higher level. The other giants, like Microsoft, Oracle, Sun, BEA and Borland, are going to respond by bringing secure coding front-and-center as well. Look for more acquisitions, more innovation, and more spending, as developers and end-users win.

With Telelogic, the motivation is consolidation. There are only a handful of companies that make top-shelf modeling software. IBM already owns the biggest fish in that pond, Rational Rose.

Adding Telelogic’s offerings to IBM’s extensive testing and modeling portfolio is a bid to buy market share, as well as to fill in some technical holes in IBM’s lineup. I don’t see this as being necessarily good for the industry, or for customers (except for IBM’s customers); it takes a powerful innovator and strong IBM competitor out of the market.

While the Telelogic move may spur further acquisition in the modeling and testing space, I don’t see it driving innovation, but merely reducing competition and consolidating market share.

Z Trek Copyright (c) Alan Zeichick

If your house or office is anything like mine, there are dead computer bits lying around everywhere. Desktop PCs, old notebook PCs, the occasional server, monitors, keyboards, even a pile of 36GB Ultra2 SCSI hard drives. I’m buried in computer detritus.

What can you do with it?

• Some stuff gets sold, though it’s hard to get enough money for an old computer to be worth the effort, plus it’s a hassle making sure that the hard drive(s) are sufficiently wiped clean of personal data and commercially licensed software.

• Some stuff gets donated, but that’s not always as easy as it sounds.

• Some stuff gets dismantled by an inquisitive teenager.

• But most sits around, waiting for me to recycle or trash it.

In the town where I live, the garbage company has an E-Waste Recycling Program. That means they’ll take the stuff if you bring it to the transfer station, but they’ll charge fees like $25 for a computer monitor or $10 for a computer printer. It’s a good service, but what a nuisance. (Not all local towns are even that accommodating.)

What about computer manufacturers? They’re starting to help.

For example, if you buy a new consumer PC from Dell, they’ll take back and recycle your old PC and monitor for free. (For $5, they’ll also plant a tree for you.) They’ll also take back and recycle any Dell product any time, even without a new product purchase. Dell even pays the shipping.

Similarly, Apple provides free recycling for as many as two devices when you purchase a new computer. A couple of weeks ago, I purchased a new MacBook Pro for one of our BZ Media employees at the local Apple store, and when I got home, there was an email from Apple (pictured) containing two FedEx RMA Shipping Authorization bar codes, ready to be slapped onto whatever products were ready to be disposed of.

I’m delighted to see major computer makers taking a leadership role in computer recycling. My job is easy: stop procrastinating, and use those FedEx shipping codes.

Z Trek Copyright (c) Alan Zeichick

I’m delighted to report that registration is now open for EclipseWorld 2007, coming in early November to Reston, Va. This is the third annual EclipseWorld, an independent conference produced by BZ Media.

This year’s conference, scheduled for Nov. 6-8, will be the biggest ever, with more than 70 full-day tutorials and technical classes. Our opening keynote will be from David Intersimone, developer evangelist (and VP of Developer Relations) for CodeGear, the tools subsidiary of Borland.

David I (pictured) is well known, and well respected, throughout the development industry for his real-world view of how programmers actually work. He’s talked to more dev teams, in more companies around the world, than anyone I’ve ever met. Nobody tells stories about “life in the trenches” like David I. His keynote is going to be a real treat.

Another huge highlight is the full-day tutorial T-7, The Europa World Tour, taught by Wayne Beaton. Wayne is a developer evangelist with the Eclipse Foundation, and in this full-day class he’ll take you on a whirlwind tour of the myriad projects that make up this summer’s “Europa” simultaneous release of the Eclipse platform and associated projects.

If you, or your development team, uses Eclipse or Eclipse-based technologies like the Rich Client Platform or IBM Software Development Platform, then EclipseWorld 2007 is where you want to be this November.

Z Trek Copyright (c) Alan Zeichick

It’s time for our regular quarterly look at the SCO Group, which released its fiscal second-quarter financials today. The quarter ended on April 30.

SCO’s revenue continues to fall. Total top-line revenue for this quarter was US$6,014,000, compared to $7,126,000 in the comparable quarter last year. That’s a 15.6% drop, if I did the math right.
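For the record, the arithmetic checks out; here’s a trivial sanity check using the reported figures:

# Year-over-year revenue decline, using the figures SCO reported
prior_quarter = 7_126_000   # fiscal Q2 a year ago, in US dollars
this_quarter = 6_014_000    # fiscal Q2 this year
drop = (prior_quarter - this_quarter) / prior_quarter
print(f"{drop:.1%}")        # prints 15.6%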

But while SCO continues to lose money, the rate of loss is slowing. For this quarter, it lost $(1,143,000), which is a tremendous improvement over the $(4,694,000) loss in the same quarter last year.

According to SCO, “The decrease in revenue was primarily attributable to continued competitive pressures on the Company’s UNIX products and services and the improvement in net loss was primarily attributable to reduced legal costs and operating expenses.”

The company continues to run through its cash reserves, which have fallen to $11,181,000, compared to $12,664,000 six months ago.

The big question remains: Why won’t SCO’s directors exercise their fiduciary responsibility to their shareholders by abandoning the lawsuit and firing Darl McBride? SCO has significant intellectual property assets (translation: cool technology), which will never flourish until both the lawsuit and McBride are gone.

Z Trek Copyright (c) Alan Zeichick

Computers, yes. PDAs, yes. Cell phones, yes. But programmable calculators as virus targets? Amazingly, yes. According to Symantec, the popular Texas Instruments TI-89 calculator can be infected by a virus named TIOS.Tigraa. (Credit to eWeek’s Brian Prince for reporting this story yesterday.)

According to Symantec, “TIOS.Tigraa is a memory-resident entry point-obscuring infector of ASM files on Texas Instruments TI89-compatible calculators (TI89, TI92, TI92+, Voyage 200).”

The company further states, “The virus cannot leave the calculator on its own, it requires that a user shares an infected file (either accidentally or intentionally) with another user.”

Fascinating, and scary. What will malware creators think of next?

Z Trek Copyright (c) Alan Zeichick

It’s a good week for developers: Sun pushed its compilers to do more with Linux and multi-core systems, while Microsoft unveiled more about the next version of Visual Studio.

Sun’s developer toolchain for native (C/C++ and FORTRAN) code is called Sun Studio. It’s distinct from Java Studio, an entirely separate toolchain for Java development. Both are based on the NetBeans open-source IDE framework, but that’s just about all they have in common. The Sun Studio tools evolved from the old Forte tools. Monday’s announcement was that Sun Studio 12 is now available.

For a while now, Sun Studio has focused primarily on Solaris on SPARC and x64, but recently also embraced Linux. The new version, Sun Studio 12, improves the support for Linux on x86/x64. But the big changes have to do with tools for debugging, profiling and optimization for multi-core processors.

This is important because, when you’re tuning for top performance, there are subtle differences in the behavior of, say, a server with eight single-core processors, four dual-core processors or two quad-core processors. Yes, all have eight hardware threads, but each presents different issues for cache sharing, memory access, deadlocks/races and so on.
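Races and deadlocks, in particular, are exactly the bugs these analysis tools exist to catch. Sun Studio targets C/C++ and FORTRAN, but here’s a minimal, language-neutral sketch (in Python, purely for illustration) of the classic lock-ordering deadlock such a tool would flag:

import threading
import time

lock_a, lock_b = threading.Lock(), threading.Lock()

def worker_1():
    with lock_a:
        time.sleep(0.1)    # give worker_2 time to grab lock_b
        with lock_b:       # blocks forever: worker_2 holds lock_b
            pass

def worker_2():
    with lock_b:
        time.sleep(0.1)    # give worker_1 time to grab lock_a
        with lock_a:       # blocks forever: worker_1 holds lock_a
            pass

threads = [threading.Thread(target=worker_1, daemon=True),
           threading.Thread(target=worker_2, daemon=True)]
for t in threads:
    t.start()
for t in threads:
    t.join(timeout=2)      # neither thread ever finishes
print("deadlocked:", any(t.is_alive() for t in threads))

The fix is to acquire locks in a consistent order everywhere; spotting that kind of inconsistency across a large codebase is where the tooling earns its keep.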

Sun has tweaked the tools to work with the Intel and AMD microprocessor architectures. Both the Opteron and Xeon chips have essentially the same instruction sets, but Intel uses an external memory controller with a uniform memory access architecture, while AMD has on-chip memory controllers and uses a NUMA architecture.

Sun’s tools are all free to use. Support contracts range from US$864 to $1,200 per year per developer seat.

Meanwhile, at Microsoft TechEd, Microsoft formally named Visual Studio “Orcas” as Visual Studio 2008. A newly announced feature is the “Visual Studio Shell,” which sounds like elements of the Eclipse platform, as it lets software companies ship their own custom IDEs based on Visual Studio. A second beta of VS2008, with the shell, is supposed to come out this summer.

Z Trek Copyright (c) Alan Zeichick

I’m delighted that despite all predictions to the contrary, print publications are doing just great. The June 1, 2007, issue of SD Times is the biggest in our seven-year history. The 60-page issue has more articles, and more advertisements, than ever before.

This issue includes the 5th annual SD Times 100, as well as three cover stories: how Sun is challenging Silverlight with JavaFX; how VMware adds the ability to replay crashes; and how the future of software will embrace both services and collaboration. There’s also a special report on taking a disciplined approach to healthy software.

It’s a great issue, with outstanding articles. Tons to read. If you’re a subscriber to our print edition, watch your mailbox. If you don’t subscribe, you can read the articles at our Web site, or download a 5.1MB PDF of the entire issue.

In the funny coincidence department, the special report in this issue of SD Times uses a very similar image to the cover art for the June 2007 issue of another BZ Media publication, Software Test & Performance: Both show flexible female athletes bending over backwards. (The ST&P cover line is “Limber up your team with agile methods.”)

I’d like to recognize the tremendous efforts by all the editorial, ad sales and production teams of both magazines for their great work this month, with special kudos to the art directors: Mara Leonardi on SD Times, LuAnn Palazzo on ST&P, and Erin Broadhurst on both publications (she worked on the SD Times 100 “Bam! Pow!” artwork).

Z Trek Copyright (c) Alan Zeichick

The winners are…. YOU! If you’re a software developer, or development manager, you’ll want to read SD Times’ fifth annual SD Times 100, a listing of the most important companies, people and projects in the software development industry.

The 2007 SD Times 100 came out today, in the June 1, 2007, issue of SD Times, and of course on our Web site, where we write:

Foiling malicious attacks where they may strike! Stomping out evil, site-bleeding bugs from the netherworld! Single-handedly lifting development teams out of harm’s way! Look! Up and down the list! It’s a leader! It’s an influencer! It’s the SD Times 100 (with apologies to DC Comics)!

This marks the fifth year we have given a nod to the organizations and individuals who have demonstrated leadership in their markets, based either on their share of those markets or on their technological innovation. This year’s SD Times 100 shines a somewhat brighter light on the industry “Influencers” — those companies that we’ve identified as having the greatest impact on the software development landscape. Our lists of winners, by category, remain the same. These, then, are the newsmakers and noisemakers of 2006.

Z Trek Copyright (c) Alan Zeichick

Ever since Microsoft shipped Office 2007 for Windows, Mac users have been at a disadvantage. Office 2004 for the Mac (the current version) can’t read or write the new file format used by Office 2007. Microsoft didn’t place the creation of file-format converters for other platforms (and for older versions of Office) on the critical path for the software release.

We’ll leave it as an exercise for the reader to decide if this was intentional or accidental. (I first wrote about this in February, in Singing the .docx blues.)

A week or so ago, Microsoft released its first beta of the Microsoft Office Open XML File Converter for the Mac. As far as I can tell, this was a fairly stealthy release; I only happened across it a couple of days ago on a Microsoft blog.

This release, labeled version 0.1b, is only for Word documents (.docx) and Word macro-enabled documents (.docm). What does it do? It converts .docx and .docm files into .rtf (rich text format) documents.

Sigh.

As the Microsoft blog says,

“We do not, however, want to see you inadvertently mess up any critical documents you are working with. For that reason, only one-way (read only) conversion is supported in this beta. When sending documents back to colleagues and contacts, we recommend saving to the default .doc format from Mac Word (listed as “Word document” in the save dialog). Similarly, we continue to recommend that you advise friends and colleagues who use Office 2007 and collaborate regularly with Mac users to save their documents as a “Word/Excel/PowerPoint 97-2003 Document” (.doc, .xls, .ppt) to ensure that the files can be robustly shared across platforms while waiting for final availability of Office 2008 for Mac.”

What about a real converter? Microsoft reiterates that Office 2004 users will have to wait until months after Office 2008 for the Mac (pictured) comes out:

“We plan to release a final integrated converter for Office 2004, which will appear as an update that allows you to simply open and save the new file formats as if they’d always been there (though, some of the newer functionality expressed in the formats will naturally only be available in Office 2008). We are on track to deliver this final integrated converter for Office 2004 six to eight weeks after Office 2008 for Mac is available.”

Without casting aspersions on the skill and dedication of Microsoft’s Mac developers (who must feel like fish out of water in Redmond), this is darned disappointing.

It’s also a telling indictment of Office Open XML if file-format converters are this hard to create. But then, anyone who has actually looked at the 6,039-page spec for Office Open XML knows that Microsoft’s intention was to create a format that would be impossible for anyone but Microsoft’s Office for Windows team to implement.
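To be clear, basic extraction isn’t the hard part. A .docx file is just a ZIP archive of XML parts, and pulling the raw text out of one takes a few lines of code (a minimal sketch follows, with a hypothetical file name). It’s the full-fidelity, round-trip conversion across those 6,039 pages of spec that’s brutally hard.

import zipfile
import xml.etree.ElementTree as ET

# WordprocessingML namespace used inside .docx files
W = "{http://schemas.openxmlformats.org/wordprocessingml/2006/main}"

def extract_text(path):
    """Return the raw body text of a .docx, ignoring all formatting."""
    with zipfile.ZipFile(path) as z:
        body = z.read("word/document.xml")   # the main document part
    root = ET.fromstring(body)
    return "".join(node.text or "" for node in root.iter(W + "t"))

print(extract_text("report.docx"))   # hypothetical sample file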

If Microsoft’s own Mac development team can’t get it right with a reasonable amount of resources in a reasonable period of time, what chance does anyone else have?

Z Trek Copyright (c) Alan Zeichick

Microsoft TechEd, the company’s top training event for developers and systems administrators, is coming up in a few weeks: June 4-8, in Orlando. TechEd is focused tightly on currently shipping Microsoft tools, platforms and applications, so you’ll see lots of classes on things like Windows Vista, Visual Studio 2005, SQL Server 2005, Office 2007 and so on.

TechEd is great education, but it doesn’t show you what’s coming down the road. That’s where Microsoft PDC comes in. The Professional Developer Conference focuses on next-generation platforms: think Windows Server 2008, the “Katmai” version of SQL Server, Visual Studio “Orcas,” Silverlight, and so on. PDC had been scheduled for October 2-5, 2007, in Los Angeles — dates that I had confirmed just last week.

However, sometime in the last 24 hours Microsoft quietly canceled or postponed PDC 2007, saying,

We are currently in the process of rescheduling this fall’s Professional Developer Conference. As the PDC is the definitive developer event focused on the future of the Microsoft platform, we try to align it to be in front of major platform milestones.

By this fall, however, upcoming platform technologies including Windows Server 2008, SQL Server codenamed “Katmai,” Visual Studio codenamed “Orcas” and Silverlight will already be in developers’ hands and approaching launch, which is where we’ll focus our developer engagement in the near term.

We will update this site when we have a new date for the PDC that is better timed with the next wave of platform technologies.

I’d been looking forward to attending PDC, even though it conflicts with our own Software Test & Performance Conference Fall 2007 in Boston. (Creative flight arrangements would allow me to attend both events.) Thus, from a personal standpoint, this date change is good news. Still, this move by Microsoft is surprising: When do they intend to teach developers in depth about these new technologies?

Z Trek Copyright (c) Alan Zeichick

I have nothing against gambling. I enjoy playing blackjack from time to time in places like Las Vegas and Lake Tahoe, and even turned a nice profit once at the Monte Carlo Casino in Monaco playing craps and roulette.

However, we should all be outraged at this report saying that the U.S. military makes money from slot machines on its overseas military bases. The military profits from its hard-working officers and enlisted personnel, to the tune of US$130 million last year from the Army and Marine Corps alone. (The CBS story didn’t provide figures for the Air Force and Navy.)

The military, according to the story, claims that this revenue funds recreation facilities at those bases. That’s no excuse. Such facilities should be funded by taxpayer dollars, not by exploiting the soldiers. We should support our troops, both in war zones and in peaceful postings overseas, as much as possible. We should not take advantage of our troops’ loneliness in this despicable manner.

Z Trek Copyright (c) Alan Zeichick

Renee Bader Niemi, I owe you dinner!

I’ve known Renee for nearly two decades, starting from when she worked on the launch of the Poqet PC, the first MS-DOS palmtop around 1989. At the time, I was executive editor of IDG’s Portable Computing magazine.

After Poqet folded, we worked together again when she was at NEC Technologies’ laptop division, and then through the late 1990s, when she served as a vice president of Xircom, which Intel purchased in 2001.

You can see coverage of a panel discussion that I led on “Palm-Sized, Hand-Held Devices,” at Spring Comdex 1999, with Renee, Bill Witte of 3Com/Palm, and Richard Hall of Handheld PC Magazine.

The last time we talked was mid-2000, when she worked at MobileSys, a wireless messaging startup. We ran a story about MobileSys in the Aug. 1 issue of SD Times. Today, Renee is vice president and general manager of the mobile and entertainment group at Plantronics, which falls outside SD Times’ coverage area.

I remember a conversation that Renee and I had in 1998 or 1999, when Xircom introduced the RealPort 10/100 Network Adapter, an early Fast Ethernet adapter for notebook PCs. That was in the era, hard to believe, when many notebooks still didn’t include Ethernet ports. Renee was the product manager for the RealPort series.

We discussed the technical and consumer implications for Fast Ethernet on a notebook, in an age when there wasn’t much demand for bandwidth to the desktop, other than for large file copies. The next speed bump, Gigabit Ethernet, was super-expensive, and was only found in switch uplinks and a few high-end cards for servers.
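In hindsight, even a plain file copy shows how big the jump from Fast Ethernet to Gigabit really is. A rough, idealized calculation (the 4GB file size is a hypothetical example, and real-world throughput is lower):

# Idealized wire time for copying a 4GB file, ignoring protocol
# overhead, disk speed and everything else that slows real copies down.
file_bits = 4 * 8 * 10**9    # 4GB expressed in bits
for name, bits_per_second in [("Fast Ethernet (100 Mb/s)", 100e6),
                              ("Gigabit Ethernet (1 Gb/s)", 1e9)]:
    print(f"{name}: about {file_bits / bits_per_second:.0f} seconds")
# Fast Ethernet: about 320 seconds; Gigabit Ethernet: about 32 seconds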

During the conversation, I light-heartedly bet Renee dinner that we’d never, ever have Gigabit Ethernet running into a notebook PC. “Why would we ever need it?” I said, or something to that effect. Renee insisted that while it would take a few years, GigE to the notebook would come to pass in less than a decade.

Well, Renee was right. GigE is everywhere today. I realized this morning, when looking at my Gigabit Ethernet workgroup switch, that every computer on the office LAN – including my Apple MacBook Pro – is running Gigabit Ethernet. The only slower devices are things like my laser printer and DSL modem, which use Fast Ethernet.

So, Renee, give me a call. Dinner’s on me!

Z Trek Copyright (c) Alan Zeichick

When Guy Kawasaki became a technology evangelist for Apple, the world lost a wonderful stand-up comic. To wit: The audience for his half-hour keynote address at the Salesforce Developer Conference was in stitches for about, oh, 30 minutes.

Guy has a wonderfully casual, self-deprecating style. He bemoaned that he potentially lost a couple of billion dollars by deciding not to interview for CEO of Yahoo, years ago, because he wanted to stay home with his young children. In retrospect, he knows he made the right choice. It was worth giving up a billion dollars to be with his family. But that second billion… that pisses him off.

No, I’m not going to transcribe Guy Kawasaki’s rapid-fire bon mots; I couldn’t do them justice. Instead, I’m going to summarize the ten rules that he (in his role as a venture capitalist) says are “the art of the start.” (That’s also the theme of his 2004 book of the same name.)

These are his headings, but my interpretation of his words; these aren’t literal quotes.

1. Make meaning.

Great entrepreneurs are motivated to change the world, make the world a better place, perpetuate good things, end bad things. Don’t be focused on just making money. If you tell people that you’re in business to make money, you’ll attract MBAs and consultants. If you tell people that you’re going to change the world, you’ll attract true believers and enthusiasts who can help you.

Also, think big. Think about jumping curves. Don’t set out to make something that’s 10% better than what’s available; strive to make something that’s 10x better. Change the game, don’t just refine it.

2. Roll the DICEE.

What are the tactical qualities of great products?

D = Deep, with a lot of rich functionality
I = Intelligent, anticipating the user’s pain
C = Complete, it’s the totality of the experience
E = Elegant and simple
E = Emotive, you love them or hate them, but you feel something strong

3. Make a mantra.

Most companies make mission statements. Those are worthless. Instead, come up with three or four words that explain, to the world, why your company or product exists.

4. Get going.

Too many entrepreneurs wait for the perfect tool, the perfect moment, the perfect situation, the perfect whatever. Don’t wait. Go ahead and:

• Think different. Create the product or service that you want to use yourself.
• Polarize people. Don’t intentionally piss people off, but don’t be afraid to piss people off. Great products polarize people.
• Find a few soul mates. Don’t do it alone. The optimal number in a startup is three: one who can make it, one who can sell it, and one who can pull it together. Don’t find people like you; you want a team with a balance of skills.

5. Define a business model.

• Be specific. Know exactly who your customer is, and exactly how you’re going to get your money out of her purse. (Your customers’ money is really your money; they’re just temporarily holding it for you.)
• Keep the business model simple. Innovate in your software and your architecture, but not in your business model. The trick to success is to make something for $1 and sell it for $5.
• Ask women about your business model. Men will be focused on what you’ll “kill,” and will almost always say “yes, go for it.” Women will give more balanced, and practical, advice.

6. Niche thyself.

Imagine a simple four-quadrant chart. The vertical axis defines your ability to provide a unique product or service. The horizontal axis defines the value of that product/service to the customer. You want to go high and to the right, offering the most unique product/service, with the best value to the customer.

Here’s what the corners mean:

Lower left: The dot-com corner. You’re selling a common product with little value to the customer.
Lower right: The price corner. You’ve got a common product that the customer wants, but so does every other vendor. You’ll always be competing on price.
Upper left: The stupid corner. Nobody else makes it, but nobody wants it either.
Upper right: The value corner. You’ve got the only thing that solves a big problem. Ka-ching!

7. Follow the 10/20/30 rule of PowerPoint.

Everyone does 60 slides for a 60-minute presentation. Don’t do that; you’ll be talking too fast for too long. Instead, spend 20 minutes giving a ten-slide presentation that clearly defines the problem, your business model, and how you’ll make it work.

The “30” is the optimum font size.

8. Hire infected people.

Look for people who are infected by the love of what you’re doing. Sure, it’s nice to get people with the right educational background and the right work experience. But it’s more important to find people who love what they do.

• Ignore the irrelevant when you’re interviewing and hiring.
• Hire better than yourself. If you hire B-level players, soon you’ll be surrounded by bozos.
• Apply the “Shopping Center Test.” Say you interviewed a person in the morning, and that evening you see her across the mall. If you want to run over and talk to her, hire her. If you don’t want to run over and talk, don’t hire her. Hire the people that you want to work with.

9. Lower barriers to adoption.

You’d be amazed at how many barriers there are to adoption, and you won’t always know where they are.

• Let a hundred flowers blossom: Your customers will find better uses for your product than you’d ever imagine. Create an environment where that can happen.
• Find the true influencers. While everyone says they want to sell to CEOs or CTOs or CIOs, the true influencers are the people who get the work done, and who know that your product/service can get that work done better.
• Embrace your evangelists. These are customers who aren’t motivated by options or salaries; they’re the early adopters who love what you do and want to help. Don’t be scared of them, or of their ideas. Embrace them.

10. Don’t let the bozos grind you down.

There are two kinds of bozos:

• Slovenly bozos who drive rusty cars. You can see that they’re bozos, and you know to ignore them.
• Svelte rich bozos who dress in all black, drive expensive cars and wear fine watches. You think they must be smart and successful. Well, they might be bozos in disguise. If they tell you that you’ll fail, and you listen, then they’re right: you will fail.

Guy’s summary:

• If you want to change the world, make meaning. That doesn’t mean doing things 15% better; you have to jump to the next curve — 10x better, not 10% better.
• Go high and to the right.
• Let a hundred flowers blossom.
• Don’t be stubborn, thinking that you know who the customer is, and that’s the only person you need to talk to.

Z Trek Copyright (c) Alan Zeichick

About 700 people crowded into the Santa Clara Marriott last Monday, for the Salesforce Developer Conference. I stayed for the first half of the event, enjoying the opening and keynote speeches, but then bailed after lunch. (The afternoon program consisted of technical sessions on Apex programming, enterprise mashups, and launching a business based on Salesforce.com’s hosted platform.)

What Salesforce has done lately is impressive, no doubt about it: It has turned the platform for its hosted CRM software into a multitenant platform for your own enterprise programs. It’s more than just a hosted service, like Google, eBay or Amazon, in that you can truly write your own programs and run them in the hosted environment. It’s also more than a reusable grid, such as the Sun Grid Compute Utility, due to the rich set of services that it provides. (The Sun utility resembles a timeshare batch-processing mainframe, complete with billing by the CPU/hour.)

No, what Salesforce.com has created is unique; at least, I’ve not seen anything like it. Not only that, but it’s also profitable; the company is making money hand over fist. It’s becoming quite aggressive, in fact, in pushing its hosted platform (and its Apex language) to ISVs. Good call. One reason why Microsoft destroyed OS/2 is that Microsoft embraced third-party developers, showering them with love and free goodies, while IBM lurched between exploiting and ignoring them. Salesforce CEO Marc Benioff has learned that lesson well: Third-party developers can be your most fervent evangelists. Every dollar you spend on them will be returned a thousandfold.

There wasn’t much news at the Salesforce Developer Conference, although Benioff kept hinting at a big announcement with Google, which was covered in the business section of the San Francisco Chronicle; in fact, several speakers made in-your-face references to the fact that they couldn’t talk about Google. It got fairly obnoxious. (Benioff also disclosed that a significant rebranding of Salesforce.com is coming soon, as the company seeks to move beyond its CRM origins.)

However, there were some really cool demos, such as one by Kevin Lynch, senior vice president and chief software architect at Adobe, showing how to integrate Salesforce.com’s online services into a locally executing online/offline application. This was all based on the forthcoming Apollo platform, which merges Flash and Flex. Very cool, very impressive.

Even more significant, however, was the demonstration of Salesforce SOA, which allows custom-written Apex applications to push and pull Web services, based not only on an enterprise’s publicly exposed services, but also commercial services like those from Amazon, FedEx, Google, and others. (Whoops, we’re not supposed to talk about Google.) Impressive.

While the Salesforce.com platform is technologically impressive, bear in mind that many of the third-party applications are simply single-function add-ons to the core CRM software itself. (Not all are, but many are.) In fact, when I asked one successful Salesforce.com ISV whether it would be fair to characterize third-party apps as being akin to Excel macros, he agreed.

Remember, one generally has to be a subscriber to the Salesforce.com service in order to benefit from third party applications hosted on the Salesforce.com platform — just as you need to buy a copy of Excel from Microsoft in order to run an Excel macro.

In April, Salesforce came up with a model where users who don’t use the CRM application could have access to third-party applications. The “Platform Edition” subscription costs only $50 per user per month (with limited access to applications and paid tech support), or $100 per user per month (with unlimited access and free tech support). Plus, of course, the cost for the third-party application. So, while Salesforce.com is innovative, it’s not cheap; those fees add up. But there’s no doubt that it’s innovating — and it’s innovating very quickly.
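To put “those fees add up” in perspective, here’s a quick, purely hypothetical example (the 200-seat head count is my assumption, and it excludes whatever the ISV charges for the application itself):

# Hypothetical: 200 users on the $50/user/month Platform Edition tier
seats = 200
per_seat_per_month = 50
print(seats * per_seat_per_month * 12)   # 120000 dollars per year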

The highlight of my visit to the Salesforce Developer Conference was a half-hour talk by Guy Kawasaki. I’ll blog about that separately.

Z Trek Copyright (c) Alan Zeichick