
Power down… or airplane mode?

Like many of you, I travel with a vast array of personal electronic devices – so much so that my briefcase bulges with screens, batteries, cables and charging bricks. Some devices are turned off when I’m on an airplane – and some aren’t, often because I forget.

Take this week, for example. I am working out of SD Times’ New York headquarters, instead of my usual office near San Francisco. What did I bring? A 13-inch mid-2011 MacBook Air notebook, an iPad Mini with Logitech Ultrathin Keyboard, a Google Nexus 7 tablet, a Galaxy Nexus phone, a Virgin Mobile MiFi access point, Bose QuietComfort 15 noise-cancelling headphones, RocketFish RF-MAB2 Bluetooth stereo headset, a Microsoft Notebook Optical Mouse 3000, a USB hub, and an HP-15C calculator. Oh, let’s not forget the Canon PowerShot S100 digital camera. And my Pebble watch.

All that for a five-day trip. A bit excessive? Maybe.

I can guarantee that not every device is powered down during a flight. Yes, the flight attendants ask passengers to turn devices all the way off, and I have good intentions. But there’s a good chance that the laptop is sleeping, that some tablets and the phone might be in airplane mode instead of off, that I might have forgotten to slide the switch on the Logitech keyboard, and so on.

Think about all the electronic noise from those devices. Think about all the potential interference from the WiFi, cellular and Bluetooth radios, and the GPSes in the phone and Google tablet… yet it doesn’t seem to make a tangible difference.

I’m not alone in failing to turn off every personal electronic device. According to a new study by the Consumer Electronics Association,

Almost one-third (30 percent) of passengers report they have accidently left a PED turned on during a flight. The study found that when asked to turn off their electronic devices, 59 percent of passengers say they always turn their devices completely off, 21 percent of passengers say they switch their devices to “airplane mode,” and five percent say they sometimes turn their devices completely off. Of those passengers who accidently left their PED turned on in-flight, 61 percent said the device was a smartphone.

At least I have good intentions. Many travelers intentionally keep playing games with their phones, hiding them when the flight attendant walks by and taking them out as soon as the uniformed crewmember stops looking.

That doesn’t change the reality that devices are left turned on — and the flights appear to be perfectly safe. It’s time for the U.S. Federal Aviation Administration, and the U.S. Federal Communications Commission, to lift the ban on using electronic devices during takeoff, landing, and flight at altitudes under 10,000 feet.


Let’s boost developer velocity by 30x

Not long ago, if the corporate brass wanted to change major functionality in a big piece of software, the IT delivery time might be six to 12 months, maybe longer. Once upon a time, that was acceptable. Not today.

Thanks to agile, many software changes can be delivered in, say, six to 12 weeks. That’s a huge improvement — but not huge enough. Business imperatives might require that IT deploy new application functionality in six to 12 days.

Sounds impossible, right? Maybe. Maybe not. I had dinner a few days ago with S. “Soma” Somasegar, the corporate vice president of Microsoft’s Developer Division. He laughed – and nodded – when I mentioned the need for a 30x shift in software delivery from months to days.

After all, as Soma pointed out, Microsoft is deploying new versions of its cloud-based Team Foundation Service every three weeks. The company has also realized that revving Visual Studio itself every two or three years isn’t serving the needs of developers. That’s why his team has begun rolling out regular updates that include not only bug fixes but also new features. The latest is Update 2 to Visual Studio 2012, released in late April, which added new features for agile planning, quality assurance and line-of-business app development, along with improvements to the developer experience.

I like what I’m hearing from Soma and Microsoft about their developer tools, and about their direction. For example, the company appears sincere in its engagement of the open source community through Microsoft Open Technologies — but I’ll confess to still being a skeptic, based on Microsoft’s historical hostility toward open source.

Soma said that it’s vital not only for Microsoft to contribute to open source, but also to let open source communities engage with Microsoft. It’s about time!

Soma also cited the company’s new-found dedication to DevOps. He said that future versions of both on-premises and cloud-based tools will help tear down the walls between development and deployment. That’s where the 30x velocity improvement might come from.

Another positive shift is that Microsoft appears to truly accept that other platforms are important to developers and customers. Soma acknowledged that the answer to every problem cannot be to use Microsoft technologies exclusively.

Case in point: Soma said that fully 60% of Microsoft developers are building applications that touch at least three different platforms. He acknowledged that Microsoft still believes that it has the best platforms and tools, but said, “We now know that developers make other choices for valid reasons. We want to meet developers where they are” – that is, engaging with other platforms.

Soma’s words may seem modest and obvious, but they mark a huge step forward for Microsoft.


Mobile developer mojo

Tickets for the Apple Worldwide Developer Conference went on sale on Thursday, April 25. They sold out in two minutes.

Who says that the iPhone has lost its allure? Not developers. Sure, Apple’s stock price is down, but at least Apple Maps on iOS doesn’t show the bridge over Hoover Dam dropping into Black Canyon any more.

Two minutes.

To quote from a story on TechCrunch,

Tickets for the developer-focused event at San Francisco’s Moscone West, which features presentations and one-on-one time with Apple’s own in-house engineers, sold out in just two hours in 2012, in under 12 hours in 2011, and in eight days in 2010.

Who attends the Apple WWDC? Independent software developers, enterprise developers and partners. Thousands of them. Many are building for iOS, but there are also developers creating software or services for other aspects of Apple’s huge ecosystem, from e-books to Mac applications.

Two minutes.

Mobile developers love tech conferences. Take Google’s I/O developer conference, scheduled for May 15-17. Tickets sold out super-fast there as well.

The audience for Google I/O is potentially more diverse, mainly because Google offers a wider array of platforms. You’ve got Android, of course, but also Chrome, Maps, Play, AppEngine, Google+, Glass and others besides. My suspicion, though, is that enterprise and entrepreneurial interest in Android is filling the seats.

Mobile. That’s where the money is. I’m looking forward to seeing exactly what Apple will introduce at WWDC, and Google at Google I/O.

Meanwhile, if you are an Android developer and didn’t get into Google I/O before it sold out – or if you are looking for a technical conference 100% dedicated to Android development – let me invite you to register for AnDevCon Boston, May 28-31. We still have a few seats left. Hope to see you there.


Big Data and PC Sales Data

Last week, we held the debut Big Data TechCon in Cambridge, Mass. It was a huge success – more attendees than we expected, which is great. (With a debut event, you never really know.)

We had lots of sessions, many of which were like trying to drink from a fire hose. That’s a good thing.

A common theme was that there is no single thing called Big Data. There are oodles of problems that have to do with capturing, processing and storing large quantities of structured and unstructured data. Some of those problems are called Big Data today, but some have evolved out of diverse disciplines like data management, data warehousing, business intelligence and matrix-based statistics.

Problems that seemed simple to solve when you were talking about megabytes or terabytes are not simple when you’re talking about petabytes.

You may have heard about the “Four V’s of Big Data” – Volume, Velocity, Variety and Veracity. Some Big Data problems are impacted by some of these V’s. Other Big Data problems are impacted by other V’s.

Think about problem domains where you have very large multidimensional data sets to be analyzed, like insurance or protein folding. Those petabytes are static or updated somewhat slowly. However, you’d like to be able to run a broad range of queries. That’s an intersection of data warehousing and business intelligence. You’ve got volume and veracity. Not much variety. Velocity matters for reporting, not for data management.

Or you might have a huge mass of real-time data. Imagine a wide variety of people, like in a social network, constantly creating all different types of data, from text to links to audio to video to photos to chats to comments. You not only have to store all of that, but also quickly decide what to present to whom, through relationships, permissions and filters, and implement a behind-the-scenes recommendation engine to prioritize the flow. Oh, and you have to do it all sub-second. That’s all four V’s coming into play.

Much in Big Data has to do with how you model the data or how you visualize it. In non-trivial cases, there are many ways of implementing a solution. Some run faster, some are slower; some scale more, others scale less; some can be done by coding into your existing data infrastructure, and others require drastic actions that bolt on new systems or invite rip-and-replace.

Big Data is fascinating. Please join us for the second Big Data TechCon, coming to the San Francisco Bay Area in October. See www.bigdatatechcon.com.

While in Cambridge wrapping up the conference, I received a press release from IDC: “PC Shipments Post the Steepest Decline Ever in a Single Quarter, According to IDC.”

To selectively quote:

Worldwide PC shipments totaled 76.3 million units in the first quarter of 2013 (1Q13), down -13.9% compared to the same quarter in 2012 and worse than the forecast decline of -7.7%.

Despite some mild improvement in the economic environment and some new PC models offering Windows 8, PC shipments were down significantly across all regions compared to a year ago. Fading Mini Notebook shipments have taken a big chunk out of the low-end market while tablets and smartphones continue to divert consumer spending. PC industry efforts to offer touch capabilities and ultraslim systems have been hampered by traditional barriers of price and component supply, as well as a weak reception for Windows 8. The PC industry is struggling to identify innovations that differentiate PCs from other products and inspire consumers to buy, and instead is meeting significant resistance to changes perceived as cumbersome or costly.

The industry is going through a critical crossroads, and strategic choices will have to be made as to how to compete with the proliferation of alternative devices and remain relevant to the consumer. 

It’s all about the tablets, folks. That’s right: iPads and Android-based devices like the Samsung Galaxy, Kindle Fire, Barnes & Noble Nook and Google Nexus. Attempts to make standard PCs more tablet-like (such as the Microsoft Surface devices) just aren’t cutting it. Just as we moved from minicomputers to desktops, and from desktops to notebooks, we are moving from notebooks to tablets.

(I spent most of the time at the Big Data TechCon working on a 7-inch tablet with a Bluetooth keyboard. I barely used my notebook at all. The tablet/keyboard had a screen big enough to write stories with, a real keyboard with keys, and best of all, it fit into my pocket.)

Just as desktops/notebooks have different operating systems, applications, data storage models and user experiences than minicomputers (and minicomputer terminals), so too the successful tablet devices aren’t going to look like a notebook with a touchscreen. Apps, not applications; cloud-based storage; massively interconnected networks; inherently social. We are at an inflection point. There’s no going back.


Looking for Girls Who Code

I know many female IT professionals. In some parts of the tech field, there are lots of women. In others — including software development — females are fairly rare.

Is this a problem? If so, why? Those are legitimate questions. Do companies have compelling reasons to recruit more female developers? Do universities have compelling reasons to seek more female computer science students – or more female computer science faculty and researchers? Do open source projects and other peer-driven collaborative ventures have compelling reasons to welcome female contributors?

I say yes to all the above. The reasons are difficult to articulate, but it’s clear to me that a programming culture that pushes women away is cutting off access to half the pool of available talent. I also believe (at a gut level) that gender-balanced departments and teams are more collaborative, more creative, and more welcoming to those females who work there – and to many men as well.

Let’s be clear. This is a problem of culture, not one of intelligence, talent, drive or initiative. The macho attitude pervading many coding shops creates a hostile environment for many women. Not just hostile. Sometimes the project teams are quite literally abusive in ways both subtle and overt.

In that sort of toxic environment, everyone, men and women alike, is justified in finding someplace more welcoming to work or study or contribute. When women choose a different department, a different company, a different career, a different academic major, or a different online community, everyone loses.

What are the solutions? I truly don’t know. I don’t believe that books like Facebook COO Sheryl Sandberg’s “Lean In” have the answer. Similarly, I don’t believe that Yahoo CEO Marissa Mayer can serve as a reasonable role model for female rank-and-file programmers.

The life of a huge company’s CEO or top executive is worlds away, no matter the gender, from the workers in the cubicles. Yes, it’s fun and informative to learn from standout performers like Sandberg, Mayer, Carol Bartz, Meg Whitman, Ursula Burns or Virginia Rometty. However, their example does not clearly illustrate a career path that other women can follow, any more than the typical male programmer can advance by copying Steve Jobs, Bill Gates, Larry Ellison or Mark Zuckerberg.

Let me point out a few resources.

“Open a Gateway for Girls to Enter the Computer Field,” a great story last week in the New York Times.

The Anita Borg Foundation, which works to increase the impact of women in technology.

Girls Who Code, a nonprofit that works to educate, inspire, and equip young women with the skills and resources to pursue academic and career opportunities in computing fields.



Moving into Big Data mode

Packing lists – check. Supplies ordered – check. Show bags on schedule – check. Speakers all confirmed – check. Missing laptop power cord located – check. Airline tickets verified – check. Candy purchased for reservation desk – check.

Our team is getting excited for the debut Big Data TechCon. It’s coming up very shortly: April 8-10 in Boston.

What drove us to launch this technical conference? Frustration, really, that there were mainly two types of face-to-face conferences surrounding Big Data.

The first were executive-level meetings that could be summarized as “Here’s WHY you should be jumping on the Big Data bandwagon.” Thought leadership, perhaps, but little that someone could walk away with.

The second were training sessions or user meetings focused on specific technologies or products. Those are great if you are already using those products and need to train your staff on specific tools.

What was missing? A practical, technical conference focused on HOW TO do Big Data. How to choose among a wide variety of tools and technologies, without bias toward a particular platform. How to kick off a Big Data project, or scale existing projects. How to avoid pitfalls. How to define and measure success. How to leverage emerging best practices.

All that with dozens of tutorials and technical classes, plus inspiring keynotes and lots and lots of networking opportunities with the expert speakers and fellow attendees. After all, folks learn in both the formal classroom and the informal hallway and lunch table.

The result – Big Data TechCon, April 8-10 in Boston. If you are thinking about attending, now’s the time to sign up. Learn more at www.bigdatatechcon.com.

See you in Boston!


Android + Chrome = Confusion

What is going on at Google? I’m not sure, and neither are the usual pundits.

Last week, Google announced that Andy Rubin, the long-time head of the Android team, is moving to another role within the company, and will be replaced by Sundar Pichai — the current head of the company’s Chrome efforts.

To quote from Larry Page’s post,

Having exceeded even the crazy ambitious goals we dreamed of for Android—and with a really strong leadership team in place—Andy’s decided it’s time to hand over the reins and start a new chapter at Google. Andy, more moonshots please!

Going forward, Sundar Pichai will lead Android, in addition to his existing work with Chrome and Apps. Sundar has a talent for creating products that are technically excellent yet easy to use—and he loves a big bet. Take Chrome, for example. In 2008, people asked whether the world really needed another browser. Today Chrome has hundreds of millions of happy users and is growing fast thanks to its speed, simplicity and security. So while Andy’s a really hard act to follow, I know Sundar will do a tremendous job doubling down on Android as we work to push the ecosystem forward. 

What is the real story? The obvious speculation is that Google may have too many mobile platforms, and may look to merge the Android and Chrome OS operating systems.

Ryan Tate of Wired wrote, in “Andy Rubin and the Great Narrowing of Google,”

The two operating system chiefs have long clashed as part of a political struggle between Rubin’s Android and Pichai’s Chrome OS, and the very different views of the future each man espouses. The two operating systems, both based on Linux, are converging, with Android growing into tablets and Chrome OS shrinking into smaller and smaller laptops, including some powered by chips using the ARM architecture popular in smartphones.

Tate continues,

There’s a certain logic to consolidating the two operating systems, but it does seem odd that the man in charge of Android – far and away the more successful and promising of the two systems – did not end up on top. And there are hints that the move came as something of a surprise even inside the company; Rubin’s name was dropped from a SXSW keynote just a few days before the Austin, Texas conference began.

Other pundits seem equally confused. Hopefully, we’ll know what’s going on soon. Registration for Google’s I/O conference opened – and closed – on March 13. If you blinked, you missed it. We’ll obviously be covering the Android side of this at our own AnDevCon conference, coming to Boston on May 28-31.


Is Big Data a fancy way of saying Big Social?

What do companies use Big Data technologies to analyze? Sales transactions. Social media trends. Scientific data. Social media trends. Weather readings. Social media trends. Prices for raw materials. Social media trends. Stock values. Social media trends. Web logs. And social media trends.

Sometimes I wonder if the entire point of Big Data is to sort through tweets. And Pinterest, Facebook and Tumblr – as well as closed social media networks like Salesforce.com’s Chatter and Microsoft’s recently acquired Yammer.

Perhaps this is a reflection that “social” is more than a way for businesses to disintermediate and reach customers directly. (Remember “disintermediation”? It was the go-to word during the early dot-com era of B-to-B and B-to-C e-commerce, and implied unlimited profits.)

Social media – nowadays referred to simply as “social” – is proving to be very effective in helping organizations improve communications. Document repositories and databases are essential, of course. Portal systems are vital. But traditional ways of communication, namely e-mail and standard one-to-one instant messaging, aren’t getting the job done, not in big organizations. Employees drown in their overflowing inboxes, and don’t know whom to message for information or input or workflow.

Enter a new Big Data angle on social. It’s one that goes beyond sifting through public messages to identifying what’s trending so you can sell more products or get on top of customer dissatisfaction before it goes viral. (Not to say those aren’t important, but that’s only the tip of the iceberg.)

What Big Data analysis can show you is not just what is going on and what the trends are, but who is driving them – or who is at least ahead of the curve.

Use analytics to find out which of your customers are tastemakers – and cultivate them. Find out which of your partners are generating the most traction – and deepen those ties. And find out which of your employees, through in-house social tools like instant messaging, blogs, wikis and forums, are posting the best information, are attracting followers and comments, and are otherwise leading the pack.

Treasure those people, especially those who are in your IT and development departments.

Big Social is the key to your organization’s future. Big Data helps you find and turn that key. We’ll cover both those trends at Big Data TechCon, coming to Boston from April 8-10. Hope to see you there.


Bug Invaders! Angry Code! World of Compilecraft!

Everything, it seems, is a game. When I use the Waze navigation app on my smartphone, I earn status for reporting red-light cameras. What’s next: If I check code into the version-control system early, do I win a prize? Get points? Become a Code Warrior Level IV?

Turning software development into a game is certainly not entirely new. Some people live for “winning,” and like getting points – or status – by committing code to open-source projects or by reporting bugs as a beta tester. For the most part, however, that was minor. The main reason to commit the code or document the defect was to make the product better. Gaining status should be a secondary consideration – a reward, if you will, not a motivator.

For some enterprise workers, however, gamification of the job can be more than a perk or added bonus. It may be the primary motivator for a generation reared on computer games. Yes, you’ll get paid if you get your job done (and fired if you don’t). But you’ll work harder if you are encouraged to compete against other colleagues, against other teams, against your own previous high score.
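What might that look like in code? Here is a minimal sketch – my own toy illustration with hypothetical names, not any real product – of a points system for version-control commits:

    import java.util.HashMap;
    import java.util.Map;

    // Toy leaderboard: award points per commit, with a bonus for early check-ins.
    public class CodeWarriorBoard {
        private final Map<String, Integer> points = new HashMap<String, Integer>();

        public void recordCommit(String developer, boolean early) {
            int award = early ? 15 : 10; // arbitrary point values for illustration
            Integer current = points.get(developer);
            points.put(developer, (current == null ? 0 : current) + award);
        }

        public int pointsFor(String developer) {
            Integer p = points.get(developer);
            return p == null ? 0 : p;
        }

        public static void main(String[] args) {
            CodeWarriorBoard board = new CodeWarriorBoard();
            board.recordCommit("alice", true);   // early commit: 15 points
            board.recordCommit("alice", false);  // regular commit: 10 points
            System.out.println("alice has " + board.pointsFor("alice") + " points");
        }
    }

Hook something like that into a repository’s post-commit events, add a visible leaderboard, and you have the rudiments of gamified development.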

Would gamification work with, say, me? I don’t think so. But from what I gather, it’s truly a generational divide. I’m a Baby Boomer; when I was a programmer, Back in the Day, I put in my hours for a paycheck and promotions. What I cared about most: What my boss thought about my work.

For Generation Y / Millennials (in the U.S., generally considered to be those born between 1982 and 2000), it’s a different game.

Here are some resources that I’ve found about gamification in the software development profession. What do you think about them? Do you use gamification techniques in your organization to motivate your workers?

Gamification in Software Development and Agile

Gamifying Software Engineering and Maintenance

Gamifying software still in its infancy, but useful for some

Some Thoughts on Gamification and Software

TED Talk: Gaming can make a better world 


Big challenges with data and Big Data

Just about everyone is talking about Big Data, and I’m not only saying that because I’m conference chair for Big Data TechCon, coming up in April in Boston.

Take Microsoft, for example. On Feb. 13, the company released survey results that talked about its big customers’ biggest data challenges, and how those relate to Big Data.

In its “Big Data Trends: 2013” study, Microsoft talked to 282 U.S. IT decision-makers who are responsible for business intelligence, and presumably, other data-related issues. To quote some findings from Microsoft’s summary of that study:

• 32% expect the amount of data they store to double in the next two to three years.

• 62% of respondents currently store at least 100 TB of data. 

• Respondents reported an average of 38% of their current data as unstructured.

• 89% already have a dedicated budget for a Big Data solution.

• 51% of companies surveyed are in the middle stages of planning a Big Data solution.

• 13% have fully deployed a Big Data solution.

• 72% have begun the planning process but have not yet tested or deployed a solution; of those currently planning, 76% expect to have a solution implemented in less than one year.

• 62% said developing near-real-time predictive analytics or data-mining capabilities during the next 24 months is extremely important.

• 58% rated expanding data storage infrastructure and resources as extremely important.

• 53% rated increased amounts of unstructured data to analyze as extremely important.

• Respondents expect an average of 37% growth in data during the next two to three years.

I can’t help but be delighted by the final bullet point from Microsoft’s study. “Most respondents (54 percent) listed industry conferences as one of the two most strategic and reliable sources of information on big data.”

Hope to see you at Big Data TechCon.


From Apple to Microsoft to Tesla, rumors abound

If there’s no news… well, let’s make some up. That’s my thought upon reading all the stories about Apple’s forthcoming iWatch – a product that, as far as anyone knows, doesn’t exist.

That hasn’t stopped everyone from Forbes to CNN to the New York Times from jumping in with breathless analysis of the rumor.

Turn the page.

More breathless analysis focused on why Microsoft’s stores and retail partners didn’t have enough stock of the Surface Pro tablet. Was this intentional, some wondered, part of a scheme to make the device appear more popular?

My friend John P. Mello Jr. had solid analysis in his article for PC World, “Microsoft Surface Pro sell-out flap: Is the tablet really that popular?”

I think the real reason is that Microsoft isn’t very good at sales estimation or manufacturing logistics. Companies like Apple and HP have dominated, in large part, because of their mastery of the supply chain. Despite its success with the Xbox consoles, Microsoft is a hardware newbie. I think the inventory shortfall was a screw-up, but an honest one.

After all, when Apple or Samsung run out of hot items, nobody says “It’s a trick.”

Can’t leave the conversation about rumors without mentioning the kerfuffle with the New York Times’s story, “Stalled Out on Tesla’s Electric Highway.” In short: Times columnist John M. Broder claims that the Tesla Model S electric car doesn’t live up to its claimed 265-mile estimated range. Tesla founder Elon Musk tweeted “NYTimes article about Tesla range in cold is fake.”

Everyone loves a good Twitter fight. Dozens of pundits, and gazillions of clicks, are keeping this story in the news.


The complications of cloud adoption

Cloud computing is seductive. Incredibly so. Reduced capital costs. No more power and cooling of a server closet or data center. High-speed Internet backbones. Outsourced disaster recovery. Advanced edge caching. Deployments are lightning fast, with capacity ramp-ups only a mouse-click away – making the cloud a panacea for Big Data applications.

Cloud computing is scary. Vendors come and vendors go. Failures happen, and they are out of your control. Software is updated, sometimes with your knowledge, sometimes not. You have to take their word for security. And the costs aren’t always lower.

An interesting new study from KPMG, “The Cloud Takes Shape,” digs into the expectations of cloud deployment – and the realities.

According to the study, cloud migration was generally a success. Even so, 33% of senior executives using the cloud said that the implementation, transition and integration costs were too high; 30% cited challenges with data loss and privacy risks; and 30% were worried about the loss of control. Also, 26% were worried about the lack of visibility into future demand and associated costs; 26% fretted about the lack of interoperability standards between cloud providers; and 21% were challenged by the risk of intellectual property theft.

There’s a lot more depth in the study, and I encourage you to download and browse through it. (Given that KPMG is a big financial and tax consulting firm, there’s a lot in the report about the tax challenges and opportunities in cloud computing.)

The study concludes,

Our survey finds that the majority of organizations around the world have already begun to adopt some form of cloud (or ‘as-a-service’) technology within their enterprise, and all signs indicate that this is just the beginning; respondents expect to move more business processes to the cloud in the next 18 months, gain more budget for cloud implementation and spend less time building and defending the cloud business case to their leadership. Clearly, the business is becoming more comfortable with the benefits and associated risks that cloud brings.

With experience comes insight. It is not surprising, therefore, that the top cloud-related challenges facing business and IT leaders has evolved from concerns about security and performance capability to instead focus on some of the ‘nuts and bolts’ of cloud implementation. Tactical challenges such as higher than expected implementation costs, integration challenges and loss of control now loom large on the cloud business agenda, demonstrating that – as organizations expand their usage and gain more experience in the cloud – focus tends to turn towards implementation, operational and governance challenges.


You can’t analyze what you don’t capture

Big Data can sometimes mean Big Obstacles. And often those obstacles are simply that the Big Data isn’t there.

That’s what more than 1,400 CIOs told Robert Half Technology, a staffing agency. According to the study, whose data was released in January, only 23% of CIOs said their companies collected customer data about demographics or buying habits. Of those that did collect this type of data, 53% of the CIOs said they had insufficient staff to access or analyze that data.

Ouch. 

The report was part of Robert Half Technology’s 2013 Salary Guide. There is a page about Big Data, which says,

When you consider that more than 2.7 billion likes and comments are generated on Facebook every day — and that 15 out of 17 U.S. business sectors have more data stored per company than the U.S. Library of Congress — it’s easy to understand why companies are seeking technology professionals who can crack the big data “code.”

Until recently, information collected and stored by companies was a mishmash waiting to be synthesized. This was because most companies didn’t have an effective way to aggregate it.

Now, more powerful and cost-effective computing solutions are allowing companies of all sizes to extract the value of their data quickly and efficiently. And when companies have the ability to tap a gold mine of knowledge locked in data warehouses, or quickly uncover relevant patterns in data coming from dynamic sources such as the Web, it helps them create more personalized online experiences for customers, develop highly targeted marketing campaigns, optimize business processes and more.


Honors for the father of fuzzy logic, Lotfi Zadeh

“In contrast to classical logical systems, fuzzy logic is aimed at a formalization of modes of reasoning that are approximate rather than exact. Basically, a fuzzy logical system may be viewed as a result of fuzzifying a standard logical system. Thus, one may speak of fuzzy predicate logic, fuzzy modal logic, fuzzy default logic, fuzzy multivalued logic, fuzzy epistemic logic, and so on. In this perspective, fuzzy logic is essentially a union of fuzzified logical systems in which precise reasoning is viewed as a limiting case of approximate reasoning.”

So began one of the most important technical articles published by AI Expert Magazine during my tenure as its editor: “The Calculus of Fuzzy If/Then Rules,” by Lotfi A. Zadeh, in March 1992.

Even then, more than 20 years ago, Dr. Zadeh was revered as the father of fuzzy logic. I recall my interactions with him on that article very fondly.
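For a taste of what approximate reasoning looks like in practice, here is a minimal sketch – my own illustration, not code from Dr. Zadeh’s article – of a single fuzzy membership function and the rule “if the temperature is hot, run the fan fast”:

    // Illustrative fuzzy if/then rule. Membership is a degree in [0,1]
    // rather than a crisp true/false value.
    public class FuzzyRule {
        // How "hot" a temperature is: 0 below 20 C, 1 above 35 C, a ramp between.
        static double hot(double celsius) {
            if (celsius <= 20.0) return 0.0;
            if (celsius >= 35.0) return 1.0;
            return (celsius - 20.0) / 15.0;
        }

        public static void main(String[] args) {
            double t = 28.0;
            double degree = hot(t);            // the rule fires to degree 0.53
            double fanSpeed = degree * 100.0;  // crude defuzzification to a percentage
            System.out.printf("hot(%.0f C) = %.2f -> fan at %.0f%%%n", t, degree, fanSpeed);
        }
    }

The point is that the rule’s conclusion is applied partially, in proportion to how true its premise is – precise reasoning being the limiting case where membership is exactly 0 or 1.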

I was delighted to learn that Fundación BBVA, the philanthropic foundation of the Spanish bank BBVA, has recognized Dr. Zadeh with their 2012 Frontiers of Knowledge Award.

To quote from the Web page for the award,

The BBVA Foundation Frontiers of Knowledge Award in the Information and Communication Technologies (ICT) category has been granted in this fifth edition to the electrical engineer Lotfi A. Zadeh, “for the invention and development of fuzzy logic.” This “revolutionary” breakthrough, affirms the jury in its citation, has enabled machines to work with imprecise concepts, in the same way humans do, and thus secure more efficient results more aligned with reality. In the last fifty years, this methodology has generated over 50,000 patents in Japan and the U.S. alone. 

The key paper, the one that started it all, was “Fuzzy Sets,” published by Dr. Zadeh in June 1965 in the journal “Information and Control.” You can read the paper here as a PDF. I would not call it light reading.

Congratulations, Dr. Zadeh, for your many contributions to computer science and software engineering – and to the modern world.


Big Data, by any other name, would smell as sweet

Modern companies thrive by harnessing and interpreting data. The more data we have, and the more we focus on analyzing it, the better we can make decisions. Data about our customers, data about purchasing patterns, data about network throughput, data in server logs, data in sales receipts. When we crunch our internal data, and cross-reference it against external data sources, we get goodness. That’s what Big Data is all about.

Data crunching and data correlation isn’t new, of course. That’s what business intelligence is all about. Spotting trends and making predictions is what business analysts have been doing for 40 years or more. From weather forecasters to the World Bank, from particle physicists to political pollsters, all that’s new is that our technology has gotten better. Our hardware, our software and our algorithms are a lot better.

Admittedly, some political pollsters in the recent United States presidential election didn’t seem to have better data analytics. That’s another story for another day.

Is “Big Data” the best term for talking about data acquisition and predictive analytics using Hadoop, Map/Reduce, Cassandra, Avro, HBase, NoSQL databases and so on? Maybe. Folks like Strata conference chair Edd Dumbill and TechCrunch editor Leena Rao think not.

Indeed, Rao suggests, “Let’s banish the term ‘big data’ with pivot, cloud and all the other meaningless buzzwords we have grown to hate.” She continues, “the term itself is outdated, and consists of an overly general set of words that don’t reflect what is actually happening now with data. It’s no longer about big data, it’s about what you can do with the data.”

Yes, “Big Data” is a fairly generic phrase, and our focus should rightfully be on benefits, not on the 1s and 0s themselves. However, the phrase neatly fronts a broad concept that plenty of people seem to understand very well, thank you very much. Language is a tool; if the phrase Big Data gets the job done, we’ll stick with it, both as a term to use in SD Times and as the name of our technical training conference focused on data acquisition, predictive analytics, etc., Big Data TechCon.

The name doesn’t matter. Big Data. Business Intelligence. Predictive Analytics. Decision Support. Whatever. What matters is that we’re doing it.


Movable walls in the garden

Today’s word is “open.” What does open mean in terms of open platforms and open standards? It’s a tricky concept. Is Windows more open than Mac OS X? Is Linux more open than Solaris? Is Android more open than iOS? Is the Java language more open than C#? Is Firefox more open than Chrome? Is SQL Server more open than DB2?

The answer in all these cases can be summarized in two more words: “That depends.” To some purists, anything that is owned by a non-commercial project or standards body is open. By contrast, anything that is owned by a company, or controlled by a company, is by definition not open.

There are infinite shades of gray. Openness isn’t a line or a spectrum, and it’s not a two-dimensional matrix either. There are countless dimensions.

Take iOS. The language used to program iPhone/iPad apps is Objective-C. It’s pretty open – certainly, some would say that Objective-C is more open than Java, which is owned and controlled by Oracle. Since iOS uses Objective-C, and Android uses Java, doesn’t that make iOS open, and Android not open?

But wait – perhaps when people talk about the openness of the mobile platforms, they mean whether there is a walled garden around its primary app store. If you want to distribute native apps through Apple’s store, you must meet Apple’s criteria in lots of ways, from the use of APIs to revenue sharing for in-app purchases. That’s not very open. If you want to distribute native apps to Android devices, you can choose Google Play, where the standards for app acceptance are fairly low, or another app store (like Amazon’s), or even set up your own. That’s more open.

If you want to build apps that are distributed and use Microsoft’s new tiled user experience, you have to put them into the Windows Store. In fact, such applications are called Windows Store Apps. Microsoft keeps a 30% cut of sales, and reserves the right to not only kick your app out of the Windows Store, but also remove your app from customers’ devices. That’s not very open.

The trend these days is for everyone to set up their own app store – whether it’s the Windows Store, Google Play, the Raspberry Pi Store, Salesforce.com AppExchange, Firefox Marketplace, Chrome Web Store, BlackBerry App World, Facebook Apps Center or the Apple App Store. There are lots more. Dozens. Hundreds perhaps.

Every one of these stores affects the openness of the platform – whether the platform is a mobile or desktop device, browser, operating system or cloud-based app. Forget programming language. Forget APIs. The true test of openness is becoming the character of the app store: whether consumers are locked into using only “approved” stores, what restrictions are placed on what may be placed in that app store, and whether developers have the freedom to fully utilize everything the platform can offer. (If the platform vendor’s own apps, or those from preferred partners, can access APIs that are not allowed in the app store, that’s not a good sign.)

Nearly every platform is a walled garden. The walls aren’t simple; they make Calabi-Yau manifolds look like child’s play. The walls twist. They turn. They move.

Forget standards bodies. Today’s openness is the openness of the walled garden.


Write once run everywhere, version 2.0

In 1996, according to Wikipedia, Sun Microsystems promised

Java’s write-once-run-everywhere capability along with its easy accessibility have propelled the software and Internet communities to embrace it as the de facto standard for writing applications for complex networks

That was version 1.0. Version 2.0 of the write-once-run-everywhere promise goes to HTML5. There are four real challenges with pure HTML5 apps, though, especially on mobile devices:

  • The specification isn’t finished, and devices and browsers don’t always support the full draft spec.
  • Run-time performance can be slow, especially on older mobile devices – and HTML5 app developers can’t always manage or predict client performance.
  • Network latency can adversely affect the user experience, especially compared to native apps.
  • HTML5 apps can’t always access native device features – and what they can access may depend on the client operating system, browser design and sandbox constraints.

What should you do about it? According to Ethan Evans, Director of App Developer Services at Amazon.com, the answer is to build hybrid apps that combine HTML5 with native code.

In his keynote address at AnDevCon earlier this month, Evans said that there are three essential elements to building hybrid apps. First, architect the correct division between native code and HTML5 code. Second, make sure the native code is blindingly fast. Third, make sure the HTML5/JavaScript is blindingly fast.

Performance is the key to giving a good user experience, he said, with the goal that a native app and a hybrid app should be indistinguishable. That’s not easy, especially on older devices with underpowered CPUs and GPUs, small amounts of memory, and of course, poor support for HTML5 in the stack.

“Old versions of Android live forever,” Evans said, along with old versions of Webkit. Hardware acceleration varies wildly, as does the browser’s use of hardware acceleration. A real problem is flinging – that is, rapidly trying to scroll data that’s being fed from the Internet. Native code can handle that well; HTML5 can fall flat.

Thus, Evans said, you need to go native. His heuristic is:

  • HTML5 is good for parts of the user experience that involve relatively low interactivity. For example, text and static display, video playback, showing basic online content, handling basic actions like payment portals.
  • HTML5 is less good when there is more user interactivity. For example, scrolling, complex physics that use native APIs, multiple concurrent sounds, sustained high frame rates, multi-touch or gesture recognition.
  • HTML5 is also a challenge when you need access to hardware features or other applications on the device, such as the camera, calendar or contacts.
  • Cross-platform HTML5 is difficult to optimize for different CPUs, GPUs and operating system versions, or even to accommodate single-core vs. multi-core devices.
  • Native code, by contrast, is good at handling the performance issues, assuming that you can build and test on all the key platforms. That means that you’ll have to port.
  • With HTML5, code updates are handled on the server. When building native apps, code updates will require app upgrades. That’s fast and easy on Android, but slow and hard on iOS due to Apple’s review process.
  • Building a good user interface is relatively easy using HTML5 and CSS, but is harder using native code. Testing that user interface is much harder with native code due to the variations you will encounter.

Bottom line, says Amazon’s Ethan Evans: HTML5 + CSS + JavaScript + Native = Good.
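To make that division concrete, here is a minimal Android sketch – my own illustration, not code from Evans’ keynote; the class names and the index.html path are hypothetical – in which the bundled HTML5 page handles the low-interactivity display while a small native bridge exposes a device feature:

    import android.app.Activity;
    import android.os.Bundle;
    import android.os.Vibrator;
    import android.webkit.JavascriptInterface;
    import android.webkit.WebView;

    public class HybridActivity extends Activity {
        @Override
        protected void onCreate(Bundle savedInstanceState) {
            super.onCreate(savedInstanceState);
            WebView webView = new WebView(this);
            webView.getSettings().setJavaScriptEnabled(true);
            // Expose a small native API to the page under the name "Native".
            webView.addJavascriptInterface(new NativeBridge(), "Native");
            // The HTML5/CSS/JavaScript user interface ships in the app's assets.
            webView.loadUrl("file:///android_asset/index.html");
            setContentView(webView);
        }

        private class NativeBridge {
            // Callable from the page's JavaScript as Native.vibrate(50);
            // requires the VIBRATE permission in the manifest.
            @JavascriptInterface
            public void vibrate(long ms) {
                ((Vibrator) getSystemService(VIBRATOR_SERVICE)).vibrate(ms);
            }
        }
    }

The page’s JavaScript calls into the bridge for hardware features, while anything scrolling-heavy or frame-rate-sensitive stays in native code.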


When Big Data becomes Bad Data

The subject line in today’s email from United Airlines was friendly. “Alan, it’s been a while since your last trip from Austin.”

Friendly, yes. Effective? Not even close.

Alan, you see, lives in northern California, not in central Texas. Alan rarely goes to Austin. Alan has never originated a round trip from Austin.

My most recent trip to Austin was from SFO to AUS on Feb. 13, 2011, returning on Feb. 15, 2011. The trip before that? In 2007.

Technically, United is correct. It indeed has been a while since my last trip from Austin. Who cares? Why in the world would United News & Deals – the “from” name on that marketing email – think that I would be looking for discounted round-trip flights from Austin?

It is Big Data gone bad.

We see examples of this all the time. A friend loves to post snarky screen shots of totally off-base Facebook ads, like the one that offered him ways to “meet big and beautiful women now,” or non-stop ads for luxury vehicles. For some reason, Lexus finds his demographic irresistible. However: My friend and his wife live in Manhattan. They don’t own or want a car.

Behavioral ad targeting relies upon Big Data techniques. Clearly, those techniques are not always effective, as the dating, car-sales and air travel messages demonstrate. There is both art and science to Big Data – gathering the vast quantities of data, processing it quickly and intelligently, and of course, using the information effectively to drive a business purpose like behavioral marketing.

Sometimes it works. Oops, sometimes it doesn’t. Being accurate isn’t the same as being useful.

Where to learn that art and science? Let me suggest Big Data TechCon. Three days, dozens of practical how-to classes that will teach you and your team how to get Big Data right. No, it’s not in Austin – it’s near Boston, from April 8-10, 2013. Hope to see you there – especially if you work for United Airlines or Lexus.


Happy Thanksgiving

Tomorrow Americans will celebrate Thanksgiving. This is an odd holiday. It’s partly religious, but also partly secular, dating back to the English colonization of eastern North America. A recent tradition is for people to share what they are thankful for. In a lighthearted way, let me share some of my tech-related joys.

• I am thankful for PDF files. Websites that share documents in other formats (such as Microsoft Word) are kludgy, and the document never looks quite right.

• I am thankful for native non-PDF files. Extracting content from PDF files to use in other applications is a time-consuming process that often requires significant post-processing.

• I am thankful that Hewlett-Packard is still in business – for now at least. It’s astonishing how HP bungles acquisition after acquisition after acquisition.

• I am thankful for consistent language specifications, such as C++, Java, HTML4 and JavaScript, which give us a fighting chance at cross-platform compatibility. A world with only proprietary languages would be horrible.

• I am thankful for HTML5 and CSS3, which solve many important problems for application development and deployment.

• I am thankful that most modern operating systems and applications can be updated via the Internet. No more floppies, CDs or DVDs.

• I am thankful that floppies are dead, dead, dead, dead, dead.

• I am thankful that Apple and Microsoft don’t force consumers to purchase applications for their latest desktop operating systems from their app stores. It’s my computer, and I should be able to run any bits that I want.

• I am thankful for Hadoop and its companion Apache projects like Avro, Cassandra, HBase and Pig, which in only a couple of years became the de facto platform for Big Data and a must-know technology for developers.

• I am thankful that Linux exists as a compelling server operating system, as the foundation of Android, and as a driver of innovation.

• I am thankful for RAW photo image files and for Adobe Lightroom to process those RAW files.

• I am thankful for the Microsoft Surface, which is the most exciting new hardware platform since Apple’s iPad and MacBook Air.

• I am thankful to still get a laugh by making the comment, “There’s an app for that!” in random non-tech-related conversations.

• I am thankful for the agile software movement, which has refocused our attention to efficiently creating excellent software, and which has created a new vocabulary for sharing best practices.

• I am thankful for RFID technology, especially as implemented in the East Coast’s E-ZPass and California’s FasTrak toll readers.

• I am thankful that despite the proliferation of e-book readers, technology books are still published on paper. E-books are great for novels and documents meant to be read linearly, but are not so great for learning a new language or studying a platform.

• I am thankful that nobody has figured out how to remotely hack into my car’s telematics systems yet – as far as I know.

• I am thankful for XKCD.

• I am thankful that Oracle seems to be committed to evolving Java and keeping it open.

• I am thankful for the wonderful work done by open-source communities like Apache, Eclipse and Mozilla.

• I am thankful that my Android phone uses an industry-standard Micro-USB connector.

• I am thankful for readers like you, who have made SD Times the leading news source in the software development community.

Happy Thanksgiving to you and yours.


The joy of being a geek: 60-core chips, self-driving cars

So much I could write about today. The U.S. presidential elections. Intel’s new 60-core PCI Express-based coprocessor chip. The sudden departure of Steven Sinofsky from Microsoft, after three years as president of the Windows Division. The Android 4.2 upgrade that unexpectedly changed the user experience on my Nexus phone. All were candidates.

Nah. All those ideas are off the table. Today, let’s bask in the warm geekiness of the Google Self-Driving Car. The vehicle, an extensively modified Lexus RX450h hybrid sport utility, lives here in Silicon Valley. The cars are frequently sighted on the highways around here, and in fact my wife Carole saw one in Mountain View last week.

Until today, I had never seen one in action, but at lunchtime, the Self-Driving Car played with me on I-280. If you’re not familiar with the Google Self-Driving Car, here’s a great story in the New York Times about one of the small fleet, “Yes, Driverless Cars Know the Way to San Jose.”

I encountered the Google car going northbound on I-280, and passed it carefully. Many car lengths ahead, I carefully changed into its lane and slowed down slightly — and waited to see what the self-driving car would do.

The Google car approached slowly, signaled, moved into the next lane, and passed me. I was taking pictures out the window — and the Google engineer sitting in the passenger seat smiled and waved. It was just another day for the experimental hardware, software and cloud-based services.

Yet, why do I have the feeling of having a Star Trek-style First Contact with an alien artificial life form? It is wonderful living in Silicon Valley and being a participant in the evolution of modern technology – both at the IDE and behind the wheel.


Apple’s victory over Samsung should drive innovation

The jury is in: Samsung was found to have infringed upon numerous Apple mobile patents. The jury’s verdict form, handed down in the United States District Court in San Jose, Calif., found in many cases that the “Samsung entity has diluted any Apple trade dress(es).” What’s more, Apple proved “by a preponderance of the evidence that the Samsung entity’s dilution was willful.”

Ouch. This is the worst-case scenario for Samsung. Forget about the US$1.049 billion in damages that Samsung is supposed to pay Apple. What this means is that the jury agreed with what everyone knew simply by looking at the hardware and playing with the software: the Samsung Galaxy Tab 10.1 is just like the iPad.

In the short term, this ruling is going to have a chilling effect not only on Samsung, but on every maker of Android devices. The more similar the devices are to Apple’s iOS phones and tablets, the more scared the hardware manufacturers are going to be. (That is, if the verdict stands and isn’t overturned on appeal.)

We can expect to see a lot of introspection within the Android ecosystem. Google, Samsung and the other device manufacturers will look closely, really closely, to make sure they stay away from the specific patents cited in this case.

We can expect to see software updates and hardware guidelines that will take Android devices farther from Apple’s devices.

In the short term, this will depress sales of Android devices. In the longer term, we will see a ton of innovation that will truly differentiate Android from iOS.

For too long, Android handset- and tablet-makers have been trying to get as close to the iPhone and iPad design as possible. It’s not laziness or a lack of technical savvy, in my opinion. It’s just that Apple has done such a good job of defining the smartphone and tablet that consumers expect that, well, that’s just how the platforms should work.

Salespeople want to sell Android devices that are identical to Apple devices, only less expensive.

Consumers who choose Android are sometimes making those selections based on technical merit, but are sometimes looking for something that’s just like an iPhone/iPad, only different. Perhaps they want more memory, perhaps a bigger phone screen, perhaps a smaller tablet screen, perhaps a slide-out keyboard, sometimes a removable battery, sometimes simply a brand that isn’t spelled “Apple.”

Of course, with rumors that Apple is about to release a 7-inch iPad, the job of Android tablet companies is only going to get harder. In my own informal polling, folks who have purchased 7-inch tablets have done so mainly because Apple doesn’t sell one.

For the next year or so, Samsung and the whole Android community will fall back and retrench. That will involve unleashing innovation that may have been stifled, as they preferred to imitate the iOS designs instead of pushing their own ideas.

Imitation may be the most sincere form of flattery – but in the smartphone and tablet markets, imitation is off the table. For good.


The new Microsoft logo

The temptation to write about Microsoft’s brand-new logo is almost unbearable. I’ve been trying to resist but… okay. I can’t resist any longer.

Microsoft has a new logo. It has color squares reminiscent of the four color blocks in Office, SharePoint, Visual Studio and so on, with the word “Microsoft” spelled out in type. The Pac-Man-like bite out of the letter “o” is gone.

You can see the new logo in this blog post from Jeff Hansen, General Manager, Brand Strategy, Microsoft. Hansen writes,

The Microsoft brand is about much more than logos or product names. We are lucky to play a role in the lives of more than a billion people every day. The ways people experience our products are our most important “brand impressions”. That’s why the new Microsoft logo takes its inspiration from our product design principles while drawing upon the heritage of our brand values, fonts and colors.

Ahhh. When I see companies redrawing their logos, I’m reminded of ship stewards rearranging the deck chairs. Don’t they have something better to spend their time on, their money on, than redrawing a well-recognized, 25-year-old logo? Think about the signs that must be remade, documents that must be reprinted, business cards, brand identity handbooks, and so on. The ROI for this is what?

The same was true, by the way, for the last several movies based on the Star Trek: The Next Generation crew. Why was the Federation constantly redesigning its Star Fleet uniforms? But I digress.

Let’s not forget the 2010 logo redesign for the Gap, a chain of clothing stores. The social-media outrage about this logo change was so swift that the Gap reversed itself a week later. Amazing. You can read the whole sordid story here in Vanity Fair.

The new Microsoft logo isn’t terrible. But it’s not wonderful either. Yes, the colors tie the corporate logo to flagship product identities, but other tech companies like Google use similar colors with Chrome and other product lines. The new Microsoft logo seems utterly unnecessary – and the timing isn’t great.


Preying on human weakness with well-designed faux emails

This past week, I’ve started receiving messages from eFax telling me that I’ve received a fax, and to click on a link to download my document. As a heavy eFax user, this seemed perfectly normal… until I clicked one of the links. It took me to a malware site. Fortunately, the site was designed to target Windows computers, and simply froze my Mac’s browser.
The faux eFax messages were well designed. They had clean headers and made it through my email service provider’s malware filters.
Since then, six of those malicious messages have appeared. I have to look carefully at the embedded link to distinguish those from genuine eFax messages with links to genuine faxes.
The cybercrime wars continue unabated, with no end in sight. I’ve also received fake emails from UPS, asking me to print out a shipping label… which of course leads me to a phishing site.
Malicious email – whether it’s phishing, a “419”-style confidence scam, or an attempt to add your computers to someone’s botnet – is only one type of cybercrime. Most of the time, as software developers, we’re not focusing on bad emails, unless we’re trying to protect our own email account, or worrying about the design of emails sent into automated systems. SQL Injection delivered by email? That’s nothing I want to see.
Most of the attacks that we have to contend with are aimed more directly at our software – or at the platforms it is built upon. Some of those attacks come from outside; some from inside.

Some attacks succeed because of our carelessness in coding, testing, installing or configuring our systems. Others succeed despite everything we try to do, because there are vulnerabilities we don’t know about, or don’t know how to defend against. And sometimes we don’t even know that a successful attack occurred, and that data or intellectual property has been stolen.

We need to think longer and harder about software security. SD Times has run numerous articles about the need to train developers and testers in secure coding techniques. We’ve written about tools that provide automated scanning of both source code and binaries. We’ve talked about fuzz testers, penetration tests, you name it.

What we generally don’t talk about is the backstory – the who and the why. Frankly, we generally don’t care why someone is trying to hack our systems; it’s our job to protect our systems, not to sleuth out perpetrators.

We are all soldiers in the cybercrime war – whether we like it or not. Please read the story by SD Times editor Suzanne Kattau, “Cybercrime: How organizations can protect themselves,” in which she interviewed Steve Durbin of the Information Security Forum. It’s an interesting perspective on the broader problem.

The handheld and the tablet, circa 1976

Let’s talk about the HP-67 and HP-97 programmable calculators.

Introduced in 1976, both models hold a place of pride in my collection of vintage computation devices – which consists of a tremendous number of older Hewlett-Packard and Texas Instruments calculators, as well as dozens of slide rules going back to the late 1800s.

The four-function pocket calculator was the feature phone of its era. Arriving in the early 1970s, those calculators swiftly replaced adding machines. The HP-35 (1972), with its trig, log and exponential functions, singlehandedly killed the slide rule industry.

Programmable calculators with persistent removable storage – specifically Hewlett-Packard’s HP-65 (1974) and Texas Instruments’ SR-52 (1975) – were the equivalent of the first smartphones. Why? Because you could store and load programs on little magnetic cards. You could buy pre-written packs of programs on those cards from HP and TI. There were user groups where calculator owners could publish and share programs. There were even a few commercial developers who sold programs on cards.

Some of my earliest published programs were written for HP and TI calculators in the mid-1970s. A foundational part of my own history as a computer scientist was learning how to do some pretty sophisticated work with only a few hundred bytes of addressable memory. Not megabytes. Not kilobytes. Bytes.

In modern terms, we would call calculator programs distributed on mag cards “apps.” The HP-65 Users Library and the TI PPX-52 (Personal Program Exchange) were among the first app stores.

This brings me to the HP-67 and HP-97, which were introduced simultaneously at prices of US$450 and $750, respectively. They were essentially the same device – except that the HP-67 was a 0.7-pound pocket calculator and the HP-97 was a 2.5-pound battery-powered desktop model with a built-in thermal printer.

“Calculator” is probably the wrong word for these devices. They were portable computers – in fact, they were truly personal computers, albeit with a custom microprocessor, one-line numeric display and only 224 bytes of programmable memory.

Although the form factors and key placement were different – and the HP-97 had the printer – both used the same programming language. Both models had a mag-card reader – and a program written on one could be used on the other without modification. This was unique.

In modern terms, the HP-67 and HP-97 were like handhelds and tablets sharing the same apps, like the iPhone and iPad, or Android phones and tablets.

No matter how far we’ve come, we’ve been here before.


Fight back against the ugly ‘brogrammer’ trend

I don’t like the trend toward ‘brogrammers’ – that is, a very chauvinistic, juvenile attitude that seems to be creating a male-centric, female-exclusionary culture in software development departments – and across IT. It’s time to put an end to the put-downs, pin-ups, constant sports in-jokes and warfare metaphors, management by belittlement, and insulting locker-room attitude.

When I was a student studying math and computer science, nearly all of my fellow students, and nearly all of the faculty, were male. Although my idol was Admiral Grace Hopper, there were few Grace Hoppers in our profession to serve as role models for young women — or men.

Change came slowly. In the 1980s, nearly all writers of technical articles in computer magazines were male. Nearly all readers were male. Nearly all attendees of technology conferences were male; the females at those shows were almost exclusively marketers or booth babes.

Much has changed in the past few decades. For example, while demographic research shows that most SD Times readers are male, the percentage of female readers is rising. The same is true of the technical conferences that our company produces. While female faces are still a minority, that is becoming less true every year, thanks in part to organizations like the Anita Borg Institute.

That’s a good thing. A very good thing. Our fast-growing, demanding profession needs all the brainpower we can get. Women, we need you. Having female programmers on your team doesn’t mean that you need to buy pink mice and purple IDEs. It means that you have more top-notch architects, coders and testers, and you will create better software faster.

That’s why the so-called brogrammer trend is so infuriating. Why don’t managers and executives understand?

A few days ago, a female techie friend wrote to me in anger about a new website called Hot Tech Today, which features short technology stories allegedly written by attractive young women posing in bikinis.

Disgusting.

We are better than this. We must be better than this.

Let’s put our resources into changing the brogrammer culture. Let’s make our profession not only safe for females, but also inviting and friendly. That means ditching the inappropriate language, curbing the stupid jokes, stopping the subtle put-downs of the women in your organization, and having a zero-tolerance rule to anyone who creates a hostile work environment for anyone, regardless of gender, race, national origin or anything.

Brogrammers. Just say no.

For more on this nasty trend, see:

The Rise of the Brogrammer, by SD Times’ Victoria Reitano

Oh Hai Sexism, by Charles Arthur

In tech, some bemoan the rise of the ‘brogrammer’ culture, by Doug Gross

In war for talent, ‘brogrammers’ will be losers, by Gina Trapani


Celestial navigation, driving by GPS and agile development

Going agile makes sense; navigating with traditional methodologies doesn’t. I don’t know about you, but nothing sucks the life out of a software development project faster than having to fully flesh out all the requirements before starting to build the solution.

Perhaps it’s a failure of imagination. Perhaps it’s incomplete vision. But speaking as both a business owner and an IT professional, I find it rare that a successfully completed application-development project comes even close to matching our original ideas.

Forget about cosmetic issues like the user interface, or unforeseen technical hurdles that must be overcome. No, I’m talking about the reality that my business – and yours, perhaps – moves fast and changes fast. We perceive the need for new applications or for feature changes long before we understand all the details, dependencies and ramifications.

But we know enough to get started on our journey. We know enough to see whether our first steps are in the right direction. We know enough to steer back onto the correct heading when we wander off course. Perhaps agile is the modern equivalent of celestial navigation, where we keep tacking closer and closer to our destination. In the words of John Masefield, “Give me a tall ship and a star to steer her by.”

Contrast that to the classic method of determining a complete set of requirements up front. That’s when teams create project plans that are followed meticulously until someone stands up and says, “Hey, the requirements changed!” At that point, you stop, revise the requirements, update the project plan and redo work that must be redone.

Of course, if the cost of creating and revising the requirements and project plan is low, sure, go for it. My automobile GPS does exactly that. If I tell it that I want to drive from San Francisco to New York City (my requirements), it will compute the entire 2,907-mile journey (my project plan) with incredible accuracy, from highway to byway, from interchange to intersection. And every time the GPS detects that I missed an exit or pulled off the highway to get fuel, the device calculates the entire journey again. But that’s okay, as the cost of having the device recreate the project plan when it detects a requirements change is trivial.

In the world of software development, the costs of determining, documenting and getting approvals for a project’s requirements and project plans are extremely high, in terms of both time and money. Worse, there is no automated way of knowing when business needs have changed, and therefore when the project plan must change as well. Thus, we can spend a lot of time sailing in the wrong direction. That’s where agile makes a difference – by design, it can detect that something is going wrong faster than classic methodologies can.

In a perfect world, if it were easy to create requirements and project plans, there would be no substantive difference between agile and classic methodologies. But in the messy, ever-changing real world of software development that I live in, agile is the navigation method for me.


Oracle, Sun, Winners, Losers

It looks like Oracle is going to buy Sun Microsystems for $5.6 billion (net of Sun’s cash cache). Maybe the deal won’t happen. Maybe IBM will swoop in with a counteroffer. At this point, though, the odds are good that Oracle is going to end up owning Java and all the other Sun technologies.

Oracle is getting a lot of very nice intellectual property. Whether that IP — as well as Sun’s product lines, maintenance agreements, licenses, consulting gigs and sales contracts — is worth $5.6 billion is hard to say.

Overall, though, Oracle is clearly the biggest winner in this deal. It’s getting core technology that will cement its position in the application server market, and also give it obvious control over key industry specifications like the Java language, the enterprise Java EE platform, and the very important Java ME platform. Expect Oracle to exercise that control.

Let’s see who else wins and loses.

Loser: IBM. For years, I’ve speculated that IBM would purchase Sun just to secure tight control over Java – a core technology that IBM depends upon. Now that technology, as well as the Java Community Process, is going to fall into enemy hands. Bummer, Big Blue.

Winner: Java. Java is very important to Oracle. Expect a lot of investment — in the areas that are important to Oracle.

Loser: The Java Community Process. Oracle is not known for openness. Oracle is not known for embracing competitors, or for collaborating with them to create markets. Instead, Oracle is known to play hardball to dominate its markets.

Winner: Customers that pay for Sun’s enterprise software. Oracle will take good care of them, though naturally there will be some product consolidation. Software customers may like being taken care of by a company that’s focused on software, not hardware.

Loser: Customers that use open-source or community-supported versions of Sun’s software. Oracle is not in the free software business, except when that free software supports its paid software business. Don’t expect that to change.

Winner: Enterprise Linux vendors. Red Hat and other enterprise Linux distros will be dancing if Oracle decides that it doesn’t want to be in the Solaris business. On the other hand, this purchase makes it less likely that Oracle will spend big dollars to buy Red Hat in the near future.

Loser: MySQL customers. If Oracle keeps MySQL, expect it to sit at the bottom of the heap as a lead-in for upgrades to Oracle’s big-gun database products. If Oracle decides to kill MySQL or spin it off, that’s going to mean disruption for the community.

Winner: Eclipse Foundation. Buh-bye, NetBeans! Oracle is heavily invested in Eclipse, and would be unlikely to continue investing in NetBeans. It’s hard to imagine that anyone would buy it, and the community probably couldn’t thrive if Oracle set it free.

Loser: Sun’s hardware customers. If Oracle stays in the hardware business, expect those Sun boxes to be only a bit player in Oracle’s product portfolio. If Oracle sells the hardware business, whoever buys it will probably milk it. How does “IBM System s (SPARC)” sound to you? Not very attractive.

Biggest Winner: Sun’s shareholders, including employees with options. After watching their shares plummet in value, and after getting a scare from IBM’s paltry offer, they must be counting their blessings right now.


Email messages without subject lines — grrrr!

Among the most peevish of my pet peeves are email messages that have no subject line. Why do people send them?

I know, I know, it’s generally accidental. Unfortunately, not all email applications warn users when they’re sending a message without a subject line. While most do warn, often you can set a configuration preference to disable such warnings.

The graphic shows the pop-up message that Mac Mail provides. As far as I know, there’s no way to disable the alert. Good!

Memo to world: Sending email without a subject line is pretty rude. Subject lines help us find messages in our inbox, and also let us link threads together. Test your email software to make sure that it warns you. If it doesn’t, check your settings to turn that feature on (or back on).

Memo to my friend Nancy, who always uses the subject line “from Nancy”: That’s just as bad! I already know that the message is from you, since I see your name in the “From” field. I have a hundred messages from you, on multiple threads, and they all have the subject lines “from Nancy” or “re: From Nancy” — stop it!


When the cloud was good, it was very very good. But when it was bad, it was horrid

Cloud computing took a big hit this week amid two significant service outages.

The biggest one, at least as it affects enterprise computing, is the eight-hour failure of Amazon’s Simple Storage Service. Check out the Amazon Web Services service health dashboard, and then select Amazon S3 in the United States for July 20. You’ll see that problems began at 9:05 am Pacific Time with “elevated error rates,” and that service wasn’t reported as being fully restored until 5:00 pm.

About the error, Amazon said,

We wanted to share a brief note about what we observed during yesterday’s event and where we are at this stage. As a distributed system, the different components of Amazon S3 need to be aware of the state of each other. For example, this awareness makes it possible for the system to decide to which redundant physical storage server to route a request. In order to share this state information across the system, we use a gossip protocol. Yesterday, we experienced a problem related to gossiping our internal state information, leaving the system components unable to interact properly and causing customers’ requests to Amazon S3 to fail. After exploring several alternatives, we determined that we had to temporarily take the service offline so that we could clear all gossipped state and restart gossip to rebuild the state.

These are sophisticated systems and it generally takes a while to get to root cause in such a situation. We’re working very hard to do this and will be providing more information here when we’ve fully investigated the incident. We also wanted to let you know that for this particular event, we’ll be waiving our standard SLA process and applying the appropriate service credit to all affected customers for the July billing period. Customers will not need to send us an e-mail to request their credits, as these will be automatically applied. This transaction will be reflected in our customers’ August billing statements.
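
If you haven’t run into the term before, a gossip protocol has each node periodically trade its view of the system’s state with a randomly chosen peer, so information spreads through the cluster the way rumors spread through an office. Here’s a toy sketch in Python — purely illustrative, and nothing like Amazon’s actual implementation:

```python
import random

# Each node holds a view: a map of node_id -> (version, status). In one
# round of gossip, every node swaps views with a random peer, and both
# keep the newest version of each entry.
def merge(view_a, view_b):
    merged = dict(view_a)
    for node_id, (version, status) in view_b.items():
        if node_id not in merged or version > merged[node_id][0]:
            merged[node_id] = (version, status)
    return merged

def gossip_round(views):
    nodes = list(views)
    for node in nodes:
        peer = random.choice([n for n in nodes if n != node])
        combined = merge(views[node], views[peer])
        views[node] = views[peer] = combined

views = {
    "a": {"a": (1, "up")},
    "b": {"b": (1, "up")},
    "c": {"c": (2, "overloaded")},
}
for _ in range(3):
    gossip_round(views)
print(views["a"])  # after a few rounds, every node sees every entry
```

The failure Amazon describes is the flip side of that convergence: once bad state enters the gossip stream, every node eventually learns it — which is why they had to take the service offline and clear the gossiped state entirely.
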

Kudos to Amazon for issuing a billing adjustment. However, as we all know, the business cost of a service failure like this vastly exceeds the cost you pay for the service. If your applications were offline for eight hours because Amazon S3 was malfunctioning, that really hurts your bottom line. This wasn’t their first service failure, either: Amazon S3 went down in February as well.

Less significant to enterprises, but just as annoying to those affected, was a problem with email accounts hosted on Apple’s MobileMe service. MobileMe is the new name for the .Mac service, which was updated in mid-July along with the launch of the iPhone 3G. Unfortunately, not everything worked right. As you can see from Apple’s dashboard, some subscribers can’t access their email. Currently, this affects about 1% of subscribers — but it’s been that way since last Friday.

According to Apple,

We understand this is a serious issue and apologize for this service interruption. We are working hard to restore your service.

This reminds me of the poem by that great Maine writer Henry Wadsworth Longfellow:

There was a little girl
Who had a little curl
Right in the middle of her forehead;
And when she was good
She was very, very good,
But when she was bad she was horrid.


Testosterone-fueled software development

A business-technology blogger for the Wall Street Journal, Rebecca Buckman, posits that there’s an innate difference in coding style between male and female programmers.

In her June 6 posting, “Men Write Code from Mars, Women Write More Helpful Code from Venus,” Buckman leads by throwing out another gender stereotype. This broad brushstroke, presented as unassailable fact, undermines her conclusion’s credibility right off the bat.

“We all know men hate to ask for directions. Apparently they loathe putting directions in computer code, too,” Buckman writes.

Buckman based her broad characterization of male and female programmers on the comments of one female software executive in Silicon Valley: Ingres’ Emma McGrattan.

McGrattan’s point, as amplified by Buckman, is that smart women write beautifully clear software to communicate better with their colleagues, while stupid men write cryptic code to show off how clever they think they are. Yay, women. Boo, men.

That’s why McGrattan believes there’s a “big need to fix testosterone-fueled code at Ingres because only about 20% of the engineers are women.”

What a load of nonsense. I expect better from the WSJ.