The Traveling Wilburys released their two official albums in 1988 and 1990. Ever since then, my wife and I have listened to the cassettes we bought nearly two decades ago, later supplanted by a “rip” of those cassettes onto CD-R discs.

But now our lives have changed: the newly released CD set comprises both albums, plus some bonus tracks and a DVD. The CD set came in today’s mail (thank you, Amazon!), and we’ve been enraptured. The sound is crisp and clean… but then again, we’ve been listening to cassettes since 1988, for heaven’s sake.

The best are the first-disc tracks featuring the voice of the late Roy Orbison, but there’s not a bad song in the bunch. If you’re a fan of Bob Dylan, Jeff Lynne, Tom Petty, George Harrison and Orbison, you need this.

Z Trek Copyright (c) Alan Zeichick

To many people, it seems that the open source software movement begins and ends with Linux. Take, for example, the popular news Web site NewsForge, which has done a good job of covering the entire open source universe. Last week, its owner, SourceForge Inc. (formerly known as VA Software), announced that it was folding NewsForge into its Linux.com site.

Does that imply that everyone who is interested in news about open source software is focused on Linux? You’d think that SourceForge would know better.

As a fan of open source software, but as someone who doesn’t use Linux on my desktop (I use Mac OS X 90% of the time, Solaris 5% and Windows XP 5%), I guess I find the lofty position of Linux atop the entire open source software movement troubling.

There are many, many open source projects other than Linux, folks. And not all open source projects see themselves as locked in a religious life-and-death struggle against Microsoft Windows.

For example, how about the myriad Eclipse projects? Most people that I’ve seen coding with Eclipse are running it on Windows workstations, and many are targeting Windows servers as a runtime for Java apps built on Eclipse. How about OpenOffice? Sure, it’s a way to bring a first-class office suite to Linux – but it’s also a way to bring a cross-platform office suite to Windows and the Mac.

Mozilla? Lots of people run Firefox on Windows and Mac. Apache? Many of Apache’s projects are centered around Linux, but many of them aren’t. NetBeans? It’s cross-platform. OpenMake runs on Linux, Windows and Solaris. Speaking of which, you can’t tell me that OpenSolaris is part of the Linux agenda.

Many people like using the gcc/gdb tools from the Free Software Foundation precisely because they’re excellent and cross-platform, not just because they’re available for Linux.

Yet, in many open source projects, there is nothing but contempt for anyone who would use an operating system other than Linux, even if the project itself supports other platforms. I remember, not very fondly, posting a message on a project newsgroup about a problem we had running an open-source package on a Windows server. The sole response advised me to run the package on a Linux server instead. Thanks for nothing.

Yes, Linux is arguably the highest-profile open source software project. I won’t dispute that. However, the common belief that it’s at the center of the open source universe (as in NewsForge’s mission to be “the Online Newspaper for Linux and Open Source”) does credit to nobody.

The open source community should realize that not every supporter of open source software uses Linux, or even cares about Linux. The open source community also needs to be less contemptuous of open source projects that don’t focus on Linux, or that may not even run on Linux.

Equally, of course, enterprise IT managers need to realize that Linux is merely one of many open source projects. You can choose to run Linux, or choose not to run Linux. You can also choose to use other open-source platforms and tools, or choose not to. But your support of open source in general should be distinct from your interest in Linux. Sadly, this is difficult, since so many open source zealots focus on promoting Linux to the exclusion of the movement’s other success stories.

By pretending that Linux is the epitome of the open source movement, the open source community is acting against its own best interest.

Z Trek Copyright (c) Alan Zeichick

A highlight of the Linux Foundation Collaboration Summit (which I attended on June 13) was an afternoon panel entitled “How do we get more applications on Linux?”

Moderated by Dan Kohn, COO of the Linux Foundation, the panelists were Mike Milinkovich of the Eclipse Foundation, Darren Davis from Novell, Kay Tate from IBM, Scott Nelson from RealNetworks, Ed Costello from Adobe, and Brian Aker from MySQL. (A word of caution: These quotes are based on my real-time notes at the conference, and I can’t guarantee that they’re word-for-word exact.)

I hadn’t expected that the panel would be dominated by a discussion of the Linux Standard Base, but it became clear about eight seconds into the discussion that the LSB is the be-all and end-all of getting third-party developers to target generic Linux.

Think about it: If you want to target Windows, you know which APIs to write to; if you want to target Mac OS X, you know which APIs to write to; if you want to target Solaris, you know which APIs to write to. Linux? You shouldn’t have to choose to write to Red Hat Linux, or SUSE Linux, or Debian Linux. You should be able to simply write to Linux.

That’s why the Linux Standard Base was envisioned as a set of consistent programmatic interfaces for applications developers. All Linux distros would implement the Linux Standard Base, and therefore, app developers could just target the LSB, secure in the knowledge that their source code should compile to run everywhere without modification. Customers, by looking for apps certified to run against the LSB, could be secure in the knowledge that apps would run on their own preferred distro.
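To make that concrete, here’s a minimal sketch of what “writing to the LSB” looks like in practice. It’s my own illustration, not something from the summit: the program sticks to interfaces the LSB specifies (here, just the standard C library), and the build step assumes the LSB SDK’s lsbcc compiler wrapper, which links against the LSB stub libraries so the resulting binary should run on any certified distro.

    /* hello-lsb.c: a trivial program that uses only LSB-specified
       interfaces (in this case, just printf from the C library). */
    #include <stdio.h>

    int main(void) {
        printf("Hello from an LSB-conforming binary\n");
        return 0;
    }

    /* Assumed build step, via the LSB SDK's compiler wrapper:
           lsbcc -o hello-lsb hello-lsb.c
       The wrapper is meant to flag calls to interfaces that fall
       outside the LSB specification. */

The point is that the certification burden shifts from “test on every distro” to “stay within the specified interfaces.”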

Unfortunately, said Kohn, “There are a few hundred LSB certified applications, and a couple of dozen commercial ones. Why is Linux not getting thousands of applications?” Compare that against thousands upon thousands of certified Windows applications.

MySQL’s Aker said, “If the question is, how do you get [Windows applications] ported to Linux, well, you don’t! You never win there. You win in getting Linux used in new development. If the LSB can push new development onto Linux, you’ll win. If you find folks who are doing Windows applications today and get them to write in the first place for Linux tomorrow, that’s how you win.”

Novell’s Davis agreed. “The low-hanging fruit’s already done. I agree, the next step is new application development. We have to get ISVs involved in targeting Linux as a platform. What comes through Novell are customer requests, and that’s all customers care about. Sure, we want to promote Linux, but customers don’t look at it that way. They’re looking at a business problem, and then finding an application to solve that problem. You have to have those on Linux.”

IBM’s Tate said, “We have great Linux certification by distro, and that’s the industry expectation. But being able to do certification at the application level, and teaching customers to look for that, is something that the Linux industry isn’t used to doing.” And indeed, the various commercial Linux distros push their own specific certification programs.

Eclipse’s Milinkovich expanded on this: “What you want is applications running on Linux. It’s a virtuous cycle. It’s not about getting products on top of Linux — it’s about getting the bank teller using Linux at the counter. Make Linux more ubiquitous, so you have an installed base of machines that ISVs will want to sell.”

He continued, “You’ve already eaten all the early adopters, now you have to get to the mainstream. Developers aren’t looking at the [software development] technology you’re including, like gcc and gdb — they think you’re nuts! You need a culture change. You have to give them application frameworks that they like. You can knock Microsoft all you want, but many developers like what Microsoft gives them to write apps. There’s nothing like that in Linux. Sure, I have an agenda, the Eclipse Rich Client Platform, which does that, but you have to recognize that if you want app developers, the status quo won’t get you there.”

A challenge, agreed the panel, is that the LSB, as a platform, doesn’t appeal to developers who aren’t hard-core Linux enthusiasts, but rather are considering it merely as another target platform.

Novell’s Davis said, “I have sent ISVs out to the LSB Web site, and they’ve come back screaming that they don’t know how to use it. You need tools to make using LSB easier. ISVs do the math and look at the market size, and can pick one or two distros — and that leaves the other distros out in the cold. We need LSB to work to stop it all from being dominated by Novell and Red Hat.”

MySQL’s Aker amplified that point: “I think it’s gotten worse. Different kernels on different distros means lots of different bugs that we have to support. That’s hard to describe to a customer, when there’s a problem that’s specific to a specific vendor build. This is getting worse.” He said, “There’s no money to be made in making tools. Getting vendors to implement new tools is really hard. Eclipse is the only forward option for tools.”

The moderator, the Linux Foundation’s Kohn, admitted, “The biggest competitor to the LSB is the makefile. If you have an open source app that you’re releasing, you create a makefile for each of the distros, BSD, Solaris, and so on. The user compiles and it just works. The LSB project is trying to make the world safe for binaries, instead of source code, which isn’t the sexiest rallying cry. But Linux has to be that way to become a modern platform.”

That’s true. If your platform requires end users to compile applications, you’re not even in the game. As MySQL’s Aker said, “People use the distributions, and don’t want to compile their own applications.”

That’s a problem that this panel didn’t even come close to addressing.

Another problem that wasn’t addressed is that, for the most part, the majority of desktop application developers don’t want to target Linux specifically – they just want to sell or give away lots of software. The economics aren’t there – unless they have a specific business reason for targeting Linux directly, they’d be better off focusing on platforms with larger installed bases, and which therefore present a bigger business opportunity.

As the Eclipse Foundation’s Milinkovich said (and this was easily the best quote of the entire conference), “If you’re a startup, and the only platform you’re supporting is Linux, it sucks to be you.”

Z Trek Copyright (c) Alan Zeichick

If the Linux community has a hero other than Linus Torvalds, it’s Mark Shuttleworth, a dot-com gazillionaire who started the Ubuntu Project, and who funds it out of his own pocket. If you’re not familiar with Ubuntu, it’s a distro based on Debian Linux, which is designed to offer a one-disc install of everything you need: operating system, device drivers, and lots of free applications.

Unlike many Linux distros, Ubuntu was intended to be used by civilians. Shuttleworth has pledged that Ubuntu will always be free.

I’m a fan of Ubuntu Linux, and think that Mark Shuttleworth is a good guy. However, to many in the Linux community, the South African entrepreneur is a rock star, a god. What’s more, Ubuntu Linux is seen by many as the first, and only, Linux distribution that might have a shot at gaining market share on the desktop. That makes it the only viable open-source alternative to Windows.

No matter whether you like Shuttleworth, or worship him, or hate him, or are fairly ambivalent, there’s no doubt that he’s very influential in the Linux community. So, his keynote address at the Linux Foundation Collaboration Summit (which I attended on June 13) was worth listening to, particularly because it presented a frank window into the state of the Linux platform.

Shuttleworth, naturally enough, praised the concept of the Collaboration Summit, the first time that such a wide swath of the Linux community came together ostensibly to find better ways to work together to improve the platform, both technologically and in the marketplace.

“It’s a chance for us to build relationships, and accelerate collaboration,” he said. “I enjoy seeing the extent to which people who aren’t generally ‘free software folks’ are hearing about free software, and are starting to see that this world is more interesting than they thought it was.” (A word of caution: These quotes are based on my real-time notes at the conference, and I can’t guarantee that they’re word-for-word exact.)

Speaking to the non-Linux attendees (such as press and analysts), he said, “It’s not just Windows vs. Mac, there’s a whole other world out there. Many people want it to be Red Hat vs. Microsoft, or Ubuntu vs. Microsoft. But that’s not what it is.”

What separates Linux from Windows and the Mac, he said, is the large number of different parties, and different vested interests, who have come together to make the platform work. “Innovation works best when you have people who are inspired, and you let them work. However, you need a framework for innovation that brings them together.”

Enthusiasm isn’t enough, though, Shuttleworth said. “We have some disadvantages. The other guys can spend more, in hard dollars, than our guys. On the other hand, we have an extraordinarily rich pool of talent, and have the freedom to innovate. Collaboration sounds beautiful, but it’s hard. Collaboration means working together, but that’s labor — that’s real work.”

Shuttleworth warned against pitfalls caused by biases, or lack of knowledge, or the type of unhealthy group dynamic that arises when projects are run by close-knit meritocracies with little patience for people who are less technical, or who have a different perspective.

“Too often, people say ‘I don’t want to work with that distro,’ or ‘I didn’t know where to send my patch.’ We have lots of tools, like mailing lists. Unfortunately, they shake things up so that the loudest ideas bubble up. Think about poisonous people! How can you run a project so that the best ideas win, vs. the loudest ideas win?”

Even harder than collaboration within a project, he said, is broader collaboration among the myriad projects within the open source community. “We also have version control, wikis, Web sites, lots of tools, but most of those tools focus on collaboration within a project. If you’re in a project, you know where to go, and who to talk to within your project. The big problem is collaboration between projects.”

Shuttleworth pointed out several areas where things fall down: translations into other languages, bug management and patch distribution.

“One clear thing is that translations fail to move upstream. People translate what they’re using, but it doesn’t move up. It’s not intentional, but we don’t have standard processes or conduits so they can send their translations upstream. The same thing with bugs: You may have people who reported a bug to Red Hat, or SUSE, or Debian, or Ubuntu — but they’re like silos. We need a way to aggregate those eyeballs.”

He added, “It’s the same with patches. A patch solves one person’s need. How can we accelerate the process of moving patches upstream? We need something that’s more of a star topology, to move patches up and around.”

A bigger question, Shuttleworth asked, is: “How do we minimize the barriers to participation, and include the v3 guys, and BSD, etc.? A central database is not the answer.”

A minor question, which clearly irks Shuttleworth: “Why can’t I programmatically access the kernel development team’s database through APIs?”

Shuttleworth then jumped back to translations. “Here’s the point of friction: The tools are difficult. In order to contribute translations, you have to have Committer access [to a project], and that’s not good. How can we get people to contribute translations without requiring they be developers, or that they use development tools? There are also different frameworks [for localization]. Mozilla’s framework is different than OpenOffice, for example. It’s worth the effort to break down the barriers. We need to agree that the inability for work to flow across projects is a problem.”

On bug tracking, he added, “The biggest problem is the friction to file the bug! You don’t understand the bug tracker, or the project’s conventions, or even know if you would get a response. If you have to have a personal relationship [with someone on a project] in order to get a bug into the project, that’s a problem. We should think about federation, and find standard ways to describe bugs, and common APIs for submitting bugs.”

Shuttleworth commented, “In Ubuntu, we took as role models Mozilla because we like their roadmaps, Gnome because of their commitment to release cycles. We like blueprints of specifications, since they let micro communities form around a particular piece of work.”

He pointed out a real problem, that the meritocracies found in many open source projects may help development, but may hinder deployment and adoption. “How do we remove the hard lines between committers and everyone else, i.e., the line between first-class citizens and everyone else? Why is someone who receives a tarball [instead of accessing the source code repository] a second-class citizen? Getting everyone to participate is important.”

Shuttleworth concluded by speaking to people who run projects, and the lack of tolerance found within many projects: “Ultimately, we need to remember that while tools are important, it’s people that collaborate. People, not robots. The decision-making process is important. Unfortunately, meanness spreads: I get disappointed when I hear that if you disagree with me, you’re stupid. We agree on an awful lot. I would appeal to all of you to remember that leadership is important, and collaboration is more important than our differences.”

Z Trek Copyright (c) Alan Zeichick

If you talk to anyone about open source software these days, the topic of the Free Software Foundation’s General Public License v3 is sure to come up. People – or rather, those people who think about open source licenses – generally fall into three categories:

• GPL v3 is the worst thing since Rome was sacked by Alaric, King of the Visigoths, and will result in the destruction of the open source movement. This seems to be the majority opinion, and is argued very passionately.

• GPL v3 is an improvement over GPL v2, primarily because it might do a better job of reining in Microsoft. This seems to be the minority opinion, and is expressed without much passion.

• GPL v3 is just another open source license, use it if you want, and don’t if you don’t. Not too many people express this opinion, but it’s the one that I hold.

My own preference would have been that the Free Software Foundation, which “owns” the General Public License, had spawned GPL v3 as a brand new license, rather than presenting it as a simple update of the very successful GPL v2. For example, there is already the LGPL, the Lesser General Public License. Perhaps what we’re seeing as GPL v3 could have been called the GGPL v1, for the Greater General Public License, or SGPL v1, for the Strong General Public License.

Why does this matter? Well, some software (I don’t know how much) is written to say that it conforms to “GPL v2 or later.” Thus, such software will automatically fall under the provisions of GPL v3 when that license is finalized, which I doubt is what the people who wrote “GPL v2 or later” intended.
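To see why, look at the boilerplate itself. The FSF’s suggested license notice for GPL v2 programs, which countless projects copied verbatim into their source files, contains the “or later” clause. It typically appears at the top of each source file as a comment along these lines:

    /* This program is free software; you can redistribute it and/or
       modify it under the terms of the GNU General Public License as
       published by the Free Software Foundation; either version 2 of
       the License, or (at your option) any later version. */

Any project that pasted that standard wording without trimming the “(at your option) any later version” phrase opted in, knowingly or not, to whatever GPL v3 turns out to say.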

Note: I think that Richard Stallman and the FSF knew exactly what they were doing, and see the automatic migration of all “GPL v2 or later” projects to GPL v3 as essential to swiftly establishing a base of GPL v3 intellectual property.

I was pleased that at the Linux Foundation Collaboration Summit (which I attended on June 13), many presenters held a level-headed view of the GPL v3 issue. One speaker, Dan Frye of IBM, seemed to speak for many there when he said, “Just chill when v3 comes out. Let’s just see how things go.”

In all fairness to those who were appalled by the first drafts of GPL v3, the initial versions were terrible, and would have done tremendous harm. Richard Stallman deserves kudos for his uncharacteristic flexibility. A kinder, gentler Stallman – who’d have thought it?

So, chill, everyone.

However, IANAL: I am not a lawyer.

There was a panel of attorneys at the Summit discussing intellectual property and license issues, and it was fascinating. McCoy Smith, a senior attorney at Intel, made the point that while individuals and even companies can try to adhere to what the community generally believes a license says, that’s only valid as long as people are working together directly. As soon as a dispute goes to court, he said, decisions are made by the specific language in the license – and not by what everyone “knows” the license is supposed to mean.

Smith, who worked on the GPL v3 project, said, “This came up a lot with GPL v3. The FSF said we had this v2 license for a number of years, but there are potential issues with the way that it was written that could be exploited by its enemies. So, there’s a problem where amateur lawyers might not capture their intentions properly. However, this largely polices itself. If everyone in the community abides by the intention, and it’s not going to court, you’re okay.”

But if Linux does go to court – perhaps in cases involving Microsoft or SCO – licensing terms like the GPL v2, which were written by amateur lawyers, might not be good enough.

As Jason Wacha, general counsel for MontaVista Software, said, “Sometimes amateurs are better at describing what they want to do. The five most dangerous letters in the English language are IANAL, especially when you try to interpret copyright or patent law. If you’re not a lawyer, don’t try to interpret the law. You have to go by the actual words in the statutes, not by how people have been interpreting that statute.”

So, chill – carefully.

Z Trek Copyright (c) Alan Zeichick

At the Linux Foundation Collaboration Summit (which I attended on June 13), Jim Zemlin, executive director of the Linux Foundation, accurately observed that the Linux movement has changed. He stated that, from the enterprise perspective at least, the days of having to build awareness for Linux, and for open source in general, are long since over. He’s right. Within most organizations, in my experience, Linux is seen as just as viable an option for servers as Windows and Unix (specifically, Solaris). That’s both good – and bad.

We’re past the days of religious wars, and we’re also past the days when Linux is chosen merely because it’s free or because it’s open source. The costs of using and deploying Linux aren’t significantly different from those of deploying Windows or Unix on servers. To be honest, the licensing cost of software is only a small part of the long-term total cost of ownership.

However, Linux itself has challenges, which I was pleased that the Linux Foundation meeting was honest enough to admit.

For example, because Linux is developed primarily by individuals working on things that they find interesting, Linux lacks the directed evolution of Windows, Unix, Solaris or Mac OS X. Thus, there were many people at the conference talking about the inconsistent state of power management within the Linux kernel and kernel-level device drivers. Everyone acknowledged that it is a problem – but nobody could do anything about it.

Similarly, there are many views as to the best way to prevent fracturing of commercial Linux distributions around kernel levels, but no agreed-upon way to solve that problem. While the Linux kernel team itself is run as a benevolent dictatorship, most other decisions are left up to the individual commercial distributions, who pointedly do not coordinate with the Linux Foundation or with each other.

Of course, not all the issues facing Linux have to do with process. There’s a lot of dissent within the community regarding licensing. The Free Software Foundation’s General Public License v3 is a huge polarizing factor, and as the Linux Foundation explains, even if the bulk of the community wished to adopt the new license (which is uncertain), the process of moving code to the GPLv3 would be incredibly time consuming. It just ain’t gonna happen, folks.

For now, and for the next several years at least, it seems clear that there will be three separate Linux worlds:

• Linux on servers: Hugely successful. Because servers typically run on a fairly limited set of hardware, because most enterprises choose an operating system when they buy server hardware, and because a particular server runs only a small number of applications at one time, Linux’s limitations in terms of device drivers and applications are not a significant factor.

• Linux on mobile devices: Hugely successful. As the representative from Motorola, Christy Watt, said during the Linux Foundation meeting, “We believe that about 60% of our base will be on Linux soon. We have shipped 6 million devices on Linux already.” The recompilable kernel, ability to create custom drivers, open-source licensing and cost factors are excellent for phones, PDAs and other devices.

• Linux on the desktop: Not even close. There have been tremendous strides in this area, but device drivers remain a challenge, particularly for top-end graphics cards. Another challenge is the proliferation of user interfaces. Despite the amazing success of Ubuntu Linux, Linux desktop and notebook PCs will be found mainly in three places: task-specific desktops (such as cash registers or point-of-sale systems); machines used by True Believers; and low-cost desktops, such as those deployed into the third world. For the mainstream home and office market, the world belongs to Windows (and the Mac, as a distant runner-up), and it’s going to stay that way for a long, long time.

Z Trek Copyright (c) Alan Zeichick

According to Managed Objects, a company that sells software service management tools, software is at the root of all evil.

Managed Objects conducted a survey of 200 IT managers in the United States, and reports that 61% of those IT managers say that software is generally the culprit for IT downtime, compared to 21% who blame hardware. (The rest don’t know whether it’s hardware or software.)

Further, according to Managed Objects, IT managers are more likely to blame software for IT failures “if their organization relies heavily on home-grown applications.”

To quote:

Complicating the battle to reduce downtime is the pervasiveness of revenue-driving homegrown or custom applications within large companies’ infrastructure, which according to survey results can represent up to 90 percent of some organizations’ application mix. Within organizations relying on more homegrown applications than off-the-shelf offerings, more than 80 percent of respondents blamed software as the primary cause of most outages.

These findings are particularly significant when placed against the backdrop of results that showed just how often application downtime occurs. 82 percent of surveyed organizations reported application outages in the last year significant enough to impact their businesses, at an average cost of more than $10,000 per hour and an average duration of between three and four hours.

Bear in mind, of course, that it’s in Managed Objects’ best interest to make this issue look dire. The company wants to sell you tools designed to measure application uptime, enforce software service level agreements and implement end-to-end software management. This study was done as part of their marketing push around a new product called Application Configuration Study, launched last week. Plus, a survey of only 200 IT managers constitutes a pretty small sample, and it’s unclear how the study defines an IT failure, if it does at all.

Even so, the results are interesting, and accord with my own observations. Hardware failures in a modern data center are fairly rare. If proper precautions are taken (such as using network load balancers, fault-tolerant devices and redundant systems), a device fault should not result in an “IT failure” at all.

Z Trek Copyright (c) Alan Zeichick

The software security tools market has been ripe for picking, for two reasons:

• We had a lot of small, privately held companies developing exciting, but in many cases, overlapping, technology, but those companies had trouble finding customers and going to market. Their exit strategy is to be gobbled up by a big fish.

• The big fish were significantly behind the times when it came to integrating security into their developer tools. Their business models favor buy-it instead of build-it for this type of technology, but they weren’t buying.

Until recently, that is. Until this month, none of the big fish, like IBM Rational, Mercury (now part of HP), Microsoft, Oracle, Sun and Borland/CodeGear, had integrated security as part of their IDEs, test/QA tools, or application life cycle suites. The notable exception has long been Compuware, with its DevPartner SecurityChecker. It was only a matter of time before the fish started munching, and frankly, I expected the feeding frenzy to start quite a while ago.

The dam broke a couple of weeks ago, when IBM announced the acquisition of Watchfire. Today, we have the second big move: Hewlett-Packard said it will buy SPI Dynamics.

This is only the start.

Microsoft needs to incorporate a top-quality security solution into Visual Studio Team System, encompassing not only the edition for software testers, but also the editions for architects and developers. This is a glaring weakness. While Microsoft could build the functionality itself (as it did with the other ALM tools in Team System), it would do better buying one of the existing players, and that’s what I think the company will do.

Oracle is in a similar situation: It doesn’t have software security functionality, either baked into its tools or as stand-alone offerings. It’s hard to predict what Oracle will do, whether they’ll build it, buy it or ignore it.

Borland and its CodeGear subsidiary are also behind the times. Security is a core part of the application life cycle, but there’s no specific security offering in the Borland pantheon, even in its Silk and Gauntlet tools. As with Microsoft, Borland could build or buy, but I expect them to buy. I hope whatever they do gets baked back into the CodeGear IDEs; that’s where software security belongs, as tightly integrated into the developer desktop as spellcheck is integrated into a word processor.

The Eclipse and NetBeans projects need to start software security initiatives. The lack of a security project, either as a top-level element or as part of the Eclipse Test & Performance Tools Project, is a huge oversight, but given how the Eclipse Foundation works, it takes a commercial vendor to initiate a project. Now that IBM is buying Watchfire, it’s less likely that Big Blue will push Eclipse in this direction. NetBeans, by contrast, is driven by one company’s strategic vision. However, Sun needs to be more strategic and visionary here, and get on the ball.

Fortunately, there are more software security companies available for purchase. It won’t be long until we see the big fish gobble up Agitar, Armorize, Cigital, Fortify, Klocwork and Ounce Labs. Who do you think will go first?

Z Trek Copyright (c) Alan Zeichick

A great article, for all you methodology fans, is a cover story in today’s SD Times: “It’s Lean, But Is It Agile?,” written by Jennifer deJong.

As Jenn writes,

Are lean software development and agile software development—of which XP is the most prominent example— one and the same? They both are iterative approaches to developing software, and in some respects lean and agile are closely aligned, said Beck. Several other agile experts interviewed by SD Times agreed.

But when asked whether lean software development is an agile methodology, alongside the others that fall under the agile umbrella—Adaptive, Crystal, Dynamic Systems Development Method, Feature-Driven Development, Scrum and XP—only one of the experts said yes.

The story continues:

Forrester analyst Carey Schwaber agreed that lean software development and agile methodologies are two complementary schools of thought. “There are quite a lot of people out there using the terms ‘agile’ and ‘lean’ interchangeably, but they are not the same thing,” she said. She has also seen agile consultants attempt to differentiate themselves by citing expertise in lean. “But often that expertise isn’t much to speak of.”

It’s my experience that many developers are unfamiliar with the concept of lean software development. Take a few minutes, read Jenn’s story, and tell me what you think about lean software development. If you find the concept intriguing, check out “Implementing Lean Software Development,” by Mary and Tom Poppendieck.

Z Trek Copyright (c) Alan Zeichick

This morning, CodeGear sent me a new picture of David Intersimone, to replace the several-years-old file photo that I used in my previous blog posting and on our conference Web site. David I, as he’s widely known, is an incredibly well-respected developer evangelist at CodeGear, and is a keynote speaker at EclipseWorld 2007.

So, cast your votes: Which do you like better: New David I (pictured), or David I Classic?

Z Trek Copyright (c) Alan Zeichick

“Life’s too short to run beta software.” That’s long been my philosophy, specifically in regard to my own personal workstation.

Back when I did a lot of hardware and software testing, there were servers and workstations designated as testbeds, while my personal workstation (sometimes a Mac, sometimes a PC) was sacrosanct.

Every so often, however, I’m tempted to use the latest-and-greatest beta code, and this generally proves to be a disaster. It sure did with the Safari 3 beta for the Mac (announced at the Apple Worldwide Developer Conference), which I installed Monday afternoon, and uninstalled this morning.

There were three significant problems with this beta of Safari 3 on Mac OS X 10.4.9 — even when the new browser itself wasn’t running:

1. It devastated iChat, Apple’s instant messaging application. The application itself ran slower than molasses: even typing text into the entry field appeared o n e c h a r a c t e r a t a t i m e s o m e t i m e s s e c o n d s a p a r t. Pressing Enter to send a message caused a delay of between 10 and 30 seconds. Invitations to chats were painful: I’d hear the sound effect, indicating that someone wanted to chat, and then iChat would hang for up to two minutes before an error message dialog would pop up, saying that there was a problem joining the chat.

2. It wounded Skype, which kept shutting down. Sometimes Skype would close immediately after launch, and sometimes it would stay up for a few hours… but when I went to use it, I’d find that it had quietly closed.

3. It hurt the overall system, in that many apps seemed to be running slower. I have a menubar CPU meter, and also kept an eye on the Activity Monitor, and couldn’t see an obvious cause for the slowdown. The biggest issues were with apps that accessed the Internet, such as NetNewsWire, my RSS reader.

This is all a pity, in that Safari 3 itself seemed to be a real improvement over Safari 2.0.4, the current “stable” version. During this test period, I’d switched from Firefox 2.0.0.4 to Safari 3 as a primary browser, and it was doing fine… other than those unbearable problems.

Fortunately, Apple included an uninstaller with the Safari 3 beta, which brought back the Safari 2.0.4 code. Everything’s back to normal. And next time, I’ll wait for the finished code before putting it on my personal workstation. Frankly, I knew better.

The unfortunate aspect is that if Apple had followed safe programming practices, and treated Safari as “just another application,” these compatibility issues should not have occurred, especially when the application itself wasn’t running.

I’ve not done the forensics, but it seems a reasonable guess that the Safari 3 beta installer replaced some shared libraries or altered the network stack in some way… which is bad, bad, bad. Applications shouldn’t be able to affect the operating system and its shared libraries. That type of naughty behavior is one of my biggest beefs with Microsoft (whose installers for products like Windows Media Player, Internet Explorer and Office replace key Windows DLLs), and I’m disappointed to think that Apple may have fallen into that trap.

Z Trek Copyright (c) Alan Zeichick

It’s an incredible coincidence. The same day that BZ Media put out a press release about the record-setting June 1 issue of SD Times, CMP Media put out a press release about its continuing shift toward becoming an online media company. That involves laying off 200 people, and shuttering three publications: Network Computing, Optimize and SysAdmin. Network Computing and Optimize will be folded into InformationWeek, CMP’s newsweekly; SysAdmin is being closed outright.

To quote from the release:

“We found last year for the first time, our non-print revenues outstripped our print revenues,” CEO Steve Weitzner said. “This year that trend is continuing and the gap is actually growing. We want to realign internal resources around these growth areas and look at opportunities in the marketplace and really go after them.”

Consequently, Weitzner said, the company is putting its online businesses “at the center of everything we do and changing how we do print.”

SysAdmin and Network Computing will be missed. (Frankly, I don’t think that Optimize will be: I never quite understood who that magazine was for, and what value it provided.) The inclusion of Network Computing into InformationWeek seems like a terrible mistake – I’m not sure what value it offers either magazine’s subscribers or advertisers.

My own relationship with Network Computing goes back about a decade. That’s before CMP (which published Network Computing) merged with Miller Freeman (which published Network Magazine, of which I was the editor-in-chief).

Network Magazine was, in my opinion, the better-written and more thoughtful publication, discussing the emerging trends and high-level issues around LANs and WANs, SANs and NAS. However, Network Computing pounded us into pudding, with its heavy emphasis on labs-based product reviews. Written very pragmatically, talking to LAN and WAN managers about the stuff they needed to buy, Network Computing was unstoppable.

In September 2005, after CMP and Miller Freeman merged, Network Magazine was renamed IT Architect. The publication shut down after its March 2006 issue. CMP did the right thing in trying to differentiate Network Magazine and Network Computing.

The decision to merge Network Computing into InformationWeek makes no sense. Network Computing is/was a magazine for network managers and senior network administrators, who are solving specific technical problems, primarily around network and infrastructure software, as well as wide-area connectivity. InformationWeek is a news-and-analysis weekly written for CIOs and senior IT managers, discussing the “vital issues of the day” and providing guidance for making strategic IT decisions.

It’s not a fit. InformationWeek’s readers aren’t network managers and network admins. Frankly, it looks like a naked ploy to keep some of Network Computing’s advertising revenue. CMP will doubtless spin a good tale, of course, as to why this IS a good fit. But if it’s such a good fit, why weren’t those advertisers already advertising in InformationWeek before this?

Z Trek Copyright (c) Alan Zeichick

We just announced a second keynote for the EclipseWorld 2007 conference: Robert C. Martin, founder and CEO of Object Mentor. “Uncle Bob” is a top expert in object-oriented design, with tremendous expertise in C++ and Java, and also speaks and writes brilliantly on agile methodologies and software craftsmanship.

Also, at EclipseWorld, “Uncle Bob” is teaching one of our full-day tutorials, on test-driven development.

It’s going to be a great conference!

Z Trek Copyright (c) Alan Zeichick

Steve Jobs’ keynote at the Apple Worldwide Developers Conference is sometimes huge with news, sometimes less so. This year’s news was weaker than most. Jobs didn’t unveil new hardware or new developer tools. He didn’t announce new software, but he did distribute betas of Mac OS X 10.5 “Leopard” to paid attendees (not to press/analysts).

The announcement that won the biggest applause from WWDC’s 5000+ developers was that two major game companies – Electronic Arts and id Software – will be increasing their support of the Mac platform. In EA’s case, a number of best-selling games will be released for Mac OS X, and in the future, there will be some simultaneous releases for Windows, Mac and game platforms. id’s game engine, which normally targets the Xbox and PlayStation, will soon be available for the Mac.

There was less frequent cheering during Jobs’ demonstration of 10 features out of 300+ planned for Leopard, scheduled to ship in October. The biggest crowd pleasers were a new graphical desktop, an improved Finder and Core Animation.

The audience also liked that there will be more robust 64-bit support in Leopard – and that the new OS will come in a single SKU, priced at US$129, instead of the bewildering number of versions of Windows Vista. Attendees also reacted favorably to the surprise announcement of Safari for Windows (a public beta of Safari 3 for Windows and Mac OS X 10.4 is now available for download). The new backup application, Time Machine, also was popular.

The biggest yawner was a tool to let consumers make their own dashboard gadgets by making clippings from Web pages. Snooze.

The biggest disappointment was Jobs’ “one final thing” presentation of Apple’s official way for third-party developers to create apps for the iPhone: write and host Web apps for Safari using AJAX. That was not what this audience had hoped for.

Three elements in the keynote are worthy of additional discussion: 64-bit computing, Safari for Windows, and the iPhone development news.

64-bit computing. Apple has flirted with 64-bit software since the release of the PowerMac G5, but went back to 32-bit with the first Macs running the 32-bit Intel Core processor. Today, however, most shipping Macs use the 32/64-bit Intel Core 2 processor. Since Mac OS X 10.4 (and earlier) is 32-bit when running on Intel, the news that Leopard will include both 32-bit and 64-bit kernels is welcome.

What about 32-bit/64-bit interoperability? It’s common for 64-bit operating systems using the x64 platform (like the Intel Core 2 family and the AMD Opteron family) to be able to run 32-bit software. 64-bit Linux and Unix can generally run 32-bit binaries directly; Windows uses a software abstraction layer called WoW64 (which means Windows 32 on Windows 64). Mac OS X, remember, is based on Unix.

That’s the good news. The bad news is that the inverse is usually not possible: You can’t generally run 64-bit binaries on a 32-bit version of the operating system, even if the chip is nominally a 32/64-bit chip. The hardware architecture generally can’t handle it, and software emulation would be extremely computationally complex (and would have terrible performance). But that’s what Jobs seemed to imply: that you could run both 32-bit and 64-bit applications on both 32-bit and 64-bit Macs running Leopard.

Jobs wasn’t explicit in stating that a 32-bit hardware platform (like my first-generation 32-bit iMac Core Duo) would run 64-bit apps under Leopard — but again, he implied it. This is a big question mark, and I have queries in to Apple about it.
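In the meantime, here’s a trivial way to check which flavor of a binary you’re actually running. This is my own sketch, not anything Apple showed: a universal binary built for both architectures (with Apple’s gcc, something like “gcc -arch i386 -arch x86_64”, if I have the flags right) will print different answers depending on which image the kernel chose to execute.

    /* bits.c: report whether this process is running as a 32-bit or
       64-bit image, inferred from the pointer width. */
    #include <stdio.h>

    int main(void) {
        printf("Running as a %u-bit process\n",
               (unsigned)(sizeof(void *) * 8));
        return 0;
    }

Run it on a 32-bit iMac Core Duo and on a 64-bit Mac under Leopard, and you’d learn quickly whether Jobs’ implication holds.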

Safari on Windows. Jobs talked about the market share of Safari at 5% of all browsers, vs. 15% for Firefox and about 76% for Internet Explorer. To boost market share, he announced Safari for Windows, and the company released the public beta for Windows and Mac OS X 10.4.

The only benefit that Jobs presented for running Safari on Windows was speed: HTML rendering with Safari for Windows was about 2x faster than IE 7, and 1.6x faster than Firefox 2, he said. That’s not a reason to change browsers! Security, features and interoperability are way more important than rendering speed.

Sadly, Jobs didn’t mention security, features or interoperability. Specifically, he didn’t discuss the reason why so many Mac users abandoned Safari 2 and embraced Firefox 1.5 (and later, of course, Firefox 2): There are too many Web sites, especially those optimized for Windows or using extensive Web 2.0 functionality, that simply don’t work right with Safari 2, but which work great on the Mac with Firefox.

I would have liked to hear Jobs pledge to fix those issues, instead of implying that the only reason to choose a browser is its rendering speed.

iPhone development. Developers have been frustrated by the on-again, off-again rumoring about third-party apps on the iPhone. Looks like “off-again” won, and that’s a shame. Third-party apps are what define a platform and ensure its success. Jobs’ recommendation that developers should just build AJAX applications, and let customers access them via the iPhone’s embedded Safari browser, is a solution that will satisfy nearly nobody.

Sure, Apple has created hooks that would let AJAX apps use built-in iPhone services, like making phone calls or sending e-mail – but that’s not what developers want. They want an SDK. And if Jobs really thinks that customers and developers would rather run third-party apps as AJAX in a browser, why bother encouraging the development of native Universal Binary software for Mac OS X desktops and notebooks? Running AJAX apps in Safari does not compute.

Not heard at the WWDC. Completely unaddressed by Jobs: application development for Leopard. Nothing about APIs, SDKs or an update of the Xcode development suite. Disappointing. Also not mentioned at the keynote was last week’s leak, by Sun’s Jonathan Schwartz, that Leopard would use Sun’s ZFS file system.

I went home from the Apple WWDC empty handed, both literally and figuratively. Beyond the preview of Leopard, and beta of Safari for Windows, there wasn’t much to get excited about – and neither of those really spoke to developers.

Z Trek Copyright (c) Alan Zeichick

Today, IBM announced that it’s buying Telelogic, a leader in the modeling space, specifically in model-driven development. Last week, IBM announced that it’s buying Watchfire, a mid-sized innovator in security and software testing. These are both savvy moves by IBM. But while the Watchfire deal is good for everyone, the Telelogic deal is less so.

With Watchfire, IBM shores up a big hole in its lineup: its IBM Rational tools are popular, but the 800-pound gorilla didn’t have the right offerings for helping companies focus on writing secure code. Buying Watchfire’s technology makes sense.

The days, I believe, of the stand-alone software security suite are soon going to be over: the IDE vendors, from IBM to Microsoft, are going to have to address security as a core feature of their software development products.

It’s analogous to spell checkers in word processors: once upon a time, spell checkers were external programs that people bought and used during a special “proofreading” phase of document creation. Today, continuous spellcheck is integral to not only word processors, but just about every application that has a text input field. That’s how it should be for spelling – and that’s how it should be for software security as well.

I see the Watchfire move as positive for the industry, as it brings software security to a higher level. The other giants, like Microsoft, Oracle, Sun, BEA and Borland, are going to respond by bringing secure coding front-and-center as well. Look for more acquisitions, more innovation, and more spending, as developers and end-users win.

With Telelogic, the motivation is consolidation. There are only a handful of companies that make top-shelf modeling software. IBM already owns the biggest fish in that pond, Rational Rose.

Telelogic’s offerings, added to IBM’s extensive testing and modeling portfolio, are a bid to buy market share, as well as to fill in some technical holes in IBM’s offerings. I don’t see this as necessarily good for the industry, or for customers (except for IBM’s customers); we’ve taken a powerful innovator and strong IBM competitor out of the market.

While the Telelogic move may spur further acquisition in the modeling and testing space, I don’t see it driving innovation, but merely reducing competition and consolidating market share.

Z Trek Copyright (c) Alan Zeichick

If your house or office is anything like mine, there are dead computer bits lying around everywhere. Desktop PCs, old notebook PCs, the occasional server, monitors, keyboards, even a pile of 36GB Ultra2 SCSI hard drives. I’m buried in computer detritus.

What can you do with it?

• Some stuff gets sold, though it’s hard to get enough money for an old computer to be worth the effort, plus it’s a hassle making sure that the hard drive(s) are sufficiently wiped clean of personal data and commercially licensed software.

• Some stuff gets donated, but that’s not always as easy as it sounds.

• Some stuff gets dismantled by an inquisitive teenager.

• But most sits around, waiting for me to recycle or trash it.

In the town where I live, the garbage company has an E-Waste Recycling Program. That means they’ll take the stuff if you bring it to the transfer station, but they’ll charge fees like $25 for a computer monitor or $10 for a computer printer. It’s a good service, but what a nuisance. (Not all local towns are even that accommodating.)

What about computer manufacturers? They’re starting to help.

For example, if you buy a new consumer PC from Dell, they’ll take back and recycle your old PC and monitor for free. (For $5, they’ll also plant a tree for you.) They’ll also take back and recycle any Dell product any time, even without a new product purchase. Dell even pays the shipping.

Similarly, Apple provides free recycling for as many as two devices when you purchase a new computer. A couple of weeks ago, I purchased a new MacBook Pro for one of our BZ Media employees at the local Apple store, and when I got home, there was an email from Apple (pictured) containing two FedEx RMA Shipping Authorization bar codes, ready to be slapped onto whatever products were ready to be disposed of.

I’m delighted to see major computer providers taking a leadership role in computer recycling. My job is easy: Stop procrastinating, and use those FedEx shipping codes.

Z Trek Copyright (c) Alan Zeichick

I’m delighted to report that registration is now open for EclipseWorld 2007, coming in early November to Reston, Va. This is the third annual EclipseWorld, an independent conference produced by BZ Media.

This year’s conference, scheduled for Nov. 6-8, will be the biggest ever, with more than 70 full-day tutorials and technical classes. Our opening keynote will be from David Intersimone, developer evangelist (and VP of Developer Relations) for CodeGear, the tools subsidiary of Borland.

David I (pictured) is well known, and well respected, throughout the development industry for his real-world view of how programmers actually work. He’s talked to more dev teams, in more companies around the world, than anyone I’ve ever met. Nobody tells stories about “life in the trenches” like David I. His keynote is going to be a real treat.

Another huge highlight is the full-day tutorial T-7, The Europa World Tour, taught by Wayne Beaton. Wayne is a developer evangelist with the Eclipse Foundation, and in this full-day class he’ll take you on a whirlwind tour of the myriad projects that make up this summer’s “Europa” simultaneous release of the Eclipse platform and associated projects.

If you, or your development team, uses Eclipse or Eclipse-based technologies like the Rich Client Platform or IBM Software Development Platform, then EclipseWorld 2007 is where you want to be this November.

Z Trek Copyright (c) Alan Zeichick

It’s time for our regular quarterly look at the SCO Group, which released its fiscal second-quarter financials today. The quarter ended on April 30.

SCO’s revenue continues to fall. Total top-line revenue for this quarter was US$6,014,000, compared to $7,126,000 in the comparable quarter last year. That’s a 15.6% drop, if I did the math right.
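(Checking my work: $7,126,000 minus $6,014,000 is $1,112,000, and $1,112,000 divided by $7,126,000 is about 0.156. So yes, 15.6%.)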

But while SCO continues to lose money, the rate of loss is slowing. For this quarter, it lost $(1,143,000), which is a tremendous improvement over the $(4,694,000) loss in the same quarter last year.

According to SCO, “The decrease in revenue was primarily attributable to continued competitive pressures on the Company’s UNIX products and services and the improvement in net loss was primarily attributable to reduced legal costs and operating expenses.”

The company continues to run through its cash reserves, which have fallen to $11,181,000, compared to $12,664,000 six months ago.

The big question remains: Why won’t SCO’s directors exercise their fiduciary responsibility to their shareholders by abandoning the lawsuit and firing Darl McBride? SCO has significant intellectual property assets (translation: cool technology), which will never flourish until both the lawsuit and McBride are gone.

Z Trek Copyright (c) Alan Zeichick

Computers, yes. PDAs, yes. Cell phones, yes. But programmable calculators as virus targets? Amazingly, yes. According to Symantec, the popular Texas Instruments TI-89 calculator can be infected by a virus named TIOS.Tigraa. (Credit to eWeek’s Brian Prince for reporting this story yesterday.)

According to Symantec, “TIOS.Tigraa is a memory-resident entry point-obscuring infector of ASM files on Texas Instruments TI89-compatible calculators (TI89, TI92, TI92+, Voyage 200).”

The company further states, “The virus cannot leave the calculator on its own, it requires that a user shares an infected file (either accidentally or intentionally) with another user.”

Fascinating, and scary. What will malware creators think of next?

Z Trek Copyright (c) Alan Zeichick

It’s a good week for developers: Sun pushed its compilers to do more with Linux and multi-core systems, while Microsoft unveiled more about the next version of Visual Studio.

Sun’s developer toolchain for native (C/C++ and FORTRAN) code is called Sun Studio. This is distinct from Java Studio, an entirely different toolchain for Java development. Both are based on the NetBeans open-source IDE framework, but that’s just about all they have in common. The Sun Studio tools evolved from the old Forte tools. The announcement on Monday was that Sun Studio 12 is now available.

For a while now, Sun Studio has focused primarily on Solaris on SPARC and x64, but recently also embraced Linux. The new version, Sun Studio 12, improves the support for Linux on x86/x64. But the big changes have to do with tools for debugging, profiling and optimization for multi-core processors.

This is important because, when you’re tuning for top performance, there are subtle differences in the behavior of, say, a server with eight unicore processors, four dual-core processors or two quad-core processors. Yes, all have eight hardware threads, but you’ll have different issues for cache sharing, memory access, deadlocks/races and so-on.
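To see why the tooling matters, consider that a flat processor count tells you nothing about topology. This quick sketch is mine, not Sun’s: it uses the standard POSIX call to count online hardware threads, and it reports eight for all three of those server configurations, even though their cache-sharing and memory-access behavior differ dramatically. That gap is exactly what the profiling and optimization tools have to fill.

    /* threads.c: count online hardware threads the portable way.
       All three eight-thread configurations described above look
       identical from here; cache and memory topology are invisible
       at this level. */
    #include <stdio.h>
    #include <unistd.h>

    int main(void) {
        long n = sysconf(_SC_NPROCESSORS_ONLN);
        printf("Online hardware threads: %ld\n", n);
        return 0;
    }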

Sun has tweaked the tools to work with the Intel and AMD microprocessor architectures. The Opteron and Xeon chips have essentially the same instruction sets, but Intel uses an external memory controller with a uniform memory access architecture, while AMD has on-chip memory controllers and uses a NUMA architecture.

Sun’s tools are all free to use. Support contracts range from US$864 to $1,200 per year per developer seat.

Meanwhile, at Microsoft TechEd, Microsoft formally named Visual Studio “Orcas” as Visual Studio 2008. A newly announced feature is the “Visual Studio Shell,” which sounds like elements of the Eclipse platform, as it lets software companies ship their own custom IDEs based on Visual Studio. A second beta of VS2008, with the shell, is supposed to come out this summer.

Z Trek Copyright (c) Alan Zeichick

I’m delighted that despite all predictions to the contrary, print publications are doing just great. The June 1, 2007, issue of SD Times is the biggest in our seven-year history. The 60-page issue has more articles, and more advertisements, than ever before.

This issue includes the 5th annual SD Times 100, as well as three cover stories: how Sun is challenging Silverlight with JavaFX; how VMware adds the ability to replay crashes; and how the future of software will embrace both services and collaboration. There’s also a special report on taking a disciplined approach to healthy software.

It’s a great issue, with outstanding articles. Tons to read. If you’re a subscriber to our print edition, watch your mailbox. If you don’t subscribe, you can read the articles at our Web site, or download a 5.1MB PDF of the entire issue.

In the funny coincidence department, the special report in this issue of SD Times uses a very similar image to the cover art for the June 2007 issue of another BZ Media publication, Software Test & Performance: Both show flexible female athletes bending over backwards. (The ST&P cover line is “Limber up your team with agile methods.”)

I’d like to recognize the tremendous efforts by all the editorial, ad sales and production teams of both magazines for their great work this month, with special kudos to the art directors: Mara Leonardi on SD Times, LuAnn Palazzo on ST&P, and Erin Broadhurst on both publications (she worked on the SD Times 100 “Bam! Pow!” artwork).

Z Trek Copyright (c) Alan Zeichick

The winners are…. YOU! If you’re a software developer, or development manager, you’ll want to read SD Times’ fifth annual SD Times 100, a listing of the most important companies, people and projects in the software development industry.

The 2007 SD Times 100 came out today, in the June 1, 2007, issue of SD Times, and of course on our Web site, where we write:

Foiling malicious attacks where they may strike! Stomping out evil, site-bleeding bugs from the netherworld! Single-handedly lifting development teams out of harm’s way! Look! Up and down the list! It’s a leader! It’s an influencer! It’s the SD Times 100 (with apologies to DC Comics)!

This marks the fifth year we have given a nod to the organizations and individuals who have demonstrated leadership in their markets, based either on their share of those markets or on their technological innovation. This year’s SD Times 100 shines a somewhat brighter light on the industry “Influencers” — those companies that we’ve identified as having the greatest impact on the software development landscape. Our lists of winners, by category, remain the same. These, then, are the newsmakers and noisemakers of 2006.

Z Trek Copyright (c) Alan Zeichick