
Do you want to be a billionaire?

Would you like a billion dollars? Software companies, both startups and established firms, are selling like hotcakes. Some are selling for millions of U.S. dollars. Some are selling for billions. While the bulk of the sales price often goes back to venture financiers, a sale can be sweet for equity-holding employees, and even for non-equity employees who get a bonus. Hurray for stock options!

A million U.S. dollars is a lot of money. A billion dollars is a mind-blowing quantity of money, at least for me. A billion dollars is how much money Sun Microsystems paid to buy MySQL in 2008. Nineteen billion dollars is how much money Facebook is spending to buy the WhatsApp messaging platform in 2014 (see “With social media, it’s about making and spending lots of money”).

I’m going to share some analysis from Berkery Noyes, an investment bank that tracks mergers and acquisitions in the software industry. Here’s info excerpted from their Q1 2014 Software Industry Trends Report:

Software transaction volume declined four percent over the past three months, from 435 to 419. However, this represented a 14 percent increase compared to Q1 2013. Deal value gained 72 percent, from $22.6 billion in Q4 2013 to $38.8 billion in Q1 2014. This rise in aggregate value was attributable in large part to Facebook’s acquisition of Whatsapp [sic], a cross-platform mobile messaging application, for $16 billion. The top ten largest transactions accounted for 61 percent of the industry’s total value in Q1 2014, compared to 55 percent in Q4 2013 and 38 percent Q1 2013.

The Niche Software segment, which consists of software that is targeted to specific vertical markets, underwent a ten percent volume increase in Q1 2014. In terms of growth areas within the segment, deal volume pertaining to the Healthcare IT market increased 31 percent. Meanwhile, the largest Niche Software transaction during Q1 2014 was Thoma Bravo’s acquisition of Travelclick [sic], which provides cloud-based hotel management software, for $930 million.

According to Berkery Noyes, the niche software market was the largest of four segments defined by the bank’s analysts. You can read about the transactions in the business, consumer and infrastructure software segments in their report.

Why would someone acquire a software company? Sometimes it’s because of the customer base or the strength of a brand name. Sometimes it’s to eliminate a competitor. Sometimes it’s to grab intellectual property (like source code or patents). And sometimes it’s to lock up some specific talent. That’s particularly true of very small software companies that are doing innovative work and have rock-star developers.

Those “acqui-hires” are often lucrative for the handful of employees. They get great jobs, hiring bonuses and, if they have equity, a share of the purchase price. But not always. I was distressed to read about an acquisition where, according to one employee, “Amy”:

Under the terms of Google’s offer, Amy’s startup received enough money to pay back its original investors, plus about $10,000 in cash for each employee. Amy’s CEO was hired as a mid-level manager, and her engineering colleagues were given offers from Google that came with $250,000 salaries and significant signing bonuses. She was left jobless, with only $10,000 and a bunch of worthless stock.

The implication in the story, “The Secret Shame of an Unacquired Tech Worker,” is that this is sexist: The four male employees were hired by Google, and the one female employee was not.

We don’t know the real story here, and frankly, we probably never will. Still, stories like this ring true because of the brogrammer culture in Silicon Valley, and in the tech industry.

Let’s end with a bit of good news in that regard. According to market research firm Evans Data Corp.:

The number of females in software development has increased by 87% since first being measured in 2001, according to Evans Data’s recently released Developer Marketing 2014 survey.  In 2014, 19.3% of software developers are women, or approximately three and a half million female software developers worldwide.  While today’s number is strong compared to 2001, it is even stronger compared to the years of 2003 to 2009 when the percent of female developers dipped into the single digit range.
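A quick back-of-the-envelope check on those figures (my arithmetic, not Evans Data's):

```python
# Evans Data: 19.3% of developers are women, roughly 3.5 million worldwide.
female_share = 0.193
female_devs = 3.5e6

# Implied total worldwide developer population
total_devs = female_devs / female_share
print(f"{total_devs / 1e6:.1f} million")  # → 18.1 million
```

That implies a worldwide developer population of roughly 18 million, which is consistent with industry estimates of the era.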

We are making progress.


Read about Carla Schroder’s nerd life – and it’s a good life

“I tried working for some tech companies like Microsoft, Tektronix, IBM, and Intel. What a fiasco. I can’t count how many young men with way less experience and skills than me snagged the good fun hands-on tech jobs, while I got stuck doing some kind of crap customer service job. I still remember this guy who got hired as a desktop technician. He was in his 30s, but in bad health, always red and sweaty and breathing hard. It took him forever to do the simplest task, like connecting a monitor or printer. He didn’t know much and was usually wrong, but he kept his job. I busted my butt to show I was serious and already had a good skill set, and would work my tail off to excel, and they couldn’t see past that I wasn’t male. So I got the message, mentally told them to eff off and stuck with freelancing.”

So writes Carla Schroder in her blog post, “My Nerd Life: Too Loud, Too Funny, Too Smart, Too Fat” on linux.com. Her story is an important one for female techies – and all techies. Read it.


Amazing women in tech? Just say “Yes”

We need all the technical talent we can get. Whether we are talking developers, architects, network staff, IT admins, managers, hardware, software or firmware, the more women in technology, the better. For everyone – for companies, for customers, for women and for men.

I have recently started working with an organization called WITI – that’s Women in Technology International. (My nickname is now “WITI Alan.”)

WITI is a membership organization. Women who join get access to amazing resources for networking, professional development, career opportunities and more. Companies who join as corporate sponsors can take advantage of WITI’s incredible solutions for empowering women’s networks, including employee training and retention services, live events and more.

My role with WITI is going to be to help women in technology tell their stories. We kicked this off at January’s International Consumer Electronics Show in Las Vegas, and we’ll be continuing this at numerous events in 2014 – including the WITI Annual Summit, coming to Santa Clara from June 1-3, 2014. (You can see me here at the WITI booth at CES with Michele Weisblatt.)

If you are a woman in technology, or if you know women in technology, or you understand the value of increasing the number of women in technology, please support the super-important work being done by WITI.


Dancing with Apple cofounder Steve Wozniak

I’ve had the opportunity to meet and listen to Steve Wozniak several times over the years. He’s always funny and engaging, and his scriptless riffs get better all the time. With this one, he had me rolling in the aisle.

The Woz’s hour-long talk (and Q&A session) covered familiar ground: His hacking the phone system with blue boxes (and meeting Captain Crunch), working his way through college, meeting Steve Jobs, designing the Apple I and Apple II computers, the dispute about the Apple Macintosh vs. Apple Lisa, his amnesia after a plane crash, his dedication to elementary-school teaching, his appearance on the TV competition Dancing with the Stars in 2009, and so on.

Many of us have heard and read these stories before — and love them.

Read all about his talk here, in my story on the SmartBear blog….


Microsoft keeps stumbling

Microsoft’s woes are too big to ignore.

Problem area number one: The high-profile Surface tablet/notebook device is flopping. While the 64-bit Intel-based Surface Pro hasn’t sold well, the 32-bit ARM-based Surface RT tanked. Big time. Microsoft just slashed its price — maybe that will help. Too little too late?

To quote from Nathan Ingraham’s recent story in The Verge:

Microsoft just announced earnings for its fiscal Q4 2013, and while the company posted strong results it also revealed some details on how the Surface RT project is costing the business money. Microsoft’s results showed a $900 million loss due to Surface RT “inventory adjustments,” a charge that comes just a few days after the company officially cut Surface RT prices significantly. This $900 million loss comes out of the company’s total Windows revenue, though its worth noting that Windows revenue still increased year-over-year. Unfortunately, Microsoft still doesn’t give specific Windows 8 sales or revenue numbers, but it probably performed well this quarter to make up for the big Surface RT loss.

At the end of the day, though, it looks like Microsoft just made too many Surface RT tablets — we heard late last year that Microsoft was building three to five million Surface RT tablets in the fourth quarter, and we also heard that Microsoft had only sold about one million of those tablets in March. We’ll be listening to Microsoft’s earnings call this afternoon to see if they further address Surface RT sales or future plans.

Microsoft has spent heavily, and invested a lot of its prestige, in the Surface. It needs to fix Windows 8 and make this platform work.

Problem area number two: A dysfunctional structure. A recent story in the New York Times reminded me of this 2011 cartoon describing six tech companies’ org charts. Look at Microsoft. Yup.

Steve Ballmer, who has been CEO since 2000, is finally trying to do something about the battling business units. The new structure, announced on July 11, is called “One Microsoft,” and in a public memo by Ballmer, the goal is described as:

Going forward, our strategy will focus on creating a family of devices and services for individuals and businesses that empower people around the globe at home, at work and on the go, for the activities they value most. 

Editing and restructuring the info in that memo somewhat, here’s what the six key non-administrative groups will look like:

Operating Systems Engineering Group will span all OS work, from console to mobile device to PC to back-end systems. The core cloud services for the operating system will be in this group.

Devices and Studios Engineering Group will have all hardware development and supply chain from the smallest to the largest devices, and studios experiences including all games, music, video and other entertainment.

Applications and Services Engineering Group will have broad applications and services core technologies in productivity, communication, search and other information categories.

Cloud and Enterprise Engineering Group will lead development of back-end technologies like datacenter, database and specific technologies for enterprise IT scenarios and development tools, plus datacenter development, construction and operation.

Advanced Strategy and Research Group will be focused on the intersection of technology and policy, and will drive the cross-company looks at key new technology trends.

Business Development and Evangelism Group will focus on key partnerships especially with innovation partners (OEMs, silicon vendors, key developers, Yahoo, Nokia, etc.) and broad work on evangelism and developer outreach. 

If implemented as described, this new organization should certainly eliminate waste, including redundant research and product developments. It might improve compatibility between different platforms and cut down on mixed messages.

However, it may also constrain the freedom to innovate, and promote the unhealthy “Windows everywhere” philosophy that has hamstrung Microsoft for years. It’s bad to spend time creating multiple operating systems, multiple APIs, multiple dev tool chains, multiple support channels. It’s equally bad to make one operating system, API set, dev tool chain and support channel fit all platforms and markets.

Another concern is the movement of developer outreach into a separate group that’s organizationally distinct from the product groups. Will that distance Microsoft’s product developers from customers and ISVs? Maybe. Will the most lucrative products get better developer support? Maybe.

Microsoft has excelled in developer support, and I’d hate to see that suffer as part of the new strategy. 

Read Steve Ballmer’s memo. What do you think?

Z Trek Copyright (c) Alan Zeichick

Cloud failures: It’s not if, it’s when

Apple is sporting a nasty black eye, and the shiner isn’t only because iPad sales are slipping – with a 14% year-on-year decline reported. This time, it’s because QoS on the company’s cloud servers is ugly, ugly, ugly.

As of my writing (on Thursday, July 25), Apple’s developer portal has been offline for days. As you can see on the dashboard, just about everything is down. If you go to a dev center, you see this message:

We apologize for the significant inconvenience caused by our developer website downtime. We’ve been working around the clock to overhaul our developer systems, update our server software, and rebuild our entire database. While we complete the work to bring our systems back online, we want to share the latest with you.

We plan to roll out our updated systems, starting with Certificates, Identifiers & Profiles, Apple Developer Forums, Bug Reporter, pre-release developer libraries, and videos first. Next, we will restore software downloads, so that the latest betas of iOS 7, Xcode 5, and OS X Mavericks will once again be available to program members. We’ll then bring the remaining systems online. To keep you up to date on our progress, we’ve created a status page to display the availability of our systems.

As you may have read elsewhere, the reason for the outage is apparently that a researcher found a massive security hole in the app dev center system. To prevent the flaw from being exploited, Apple took the entire system down – on July 18. That’s right, it’s been over a week.

Ouch.

And then, today, July 25, there are reports that the authentication server needed to set up new iPhone accounts is offline. Apple’s IT department certainly isn’t looking too savvy right now – and perhaps this points to bigger challenges within the company’s spending priorities.

However, before anyone piles onto Apple, bear in mind that service outages are not uncommon, especially in the cloud. Certainly, they are not new; I’ve written about them before, such as in 2008’s “When the cloud was good, it was very very good, but when it was bad, it was horrid” and 2011’s “Skynet didn’t take down Amazon Web Services.”

Cloud failure is not a matter of if. It’s a matter of when. When huge corporations like Amazon and Apple can suffer these sorts of outages, anyone can, no matter how big.

What’s the game plan? Do you have a fail-over strategy to spool up a backup provider? Do you have messaging ready for your customers and partners? Alternatives to suggest?
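One piece of that game plan can be sketched in a few lines: try the primary provider first, and fail over to a backup when it doesn't respond. This is a minimal sketch; the provider URLs here are hypothetical placeholders, not real endpoints:

```python
import urllib.request

# Ordered list of providers: primary first, backups after.
# These URLs are invented for illustration only.
PROVIDERS = [
    "https://api.primary-cloud.example/health",
    "https://api.backup-cloud.example/health",
]

def first_available(providers, timeout=2):
    """Return the first provider that answers, or None if all are down."""
    for url in providers:
        try:
            with urllib.request.urlopen(url, timeout=timeout) as resp:
                if resp.status == 200:
                    return url
        except OSError:
            continue  # provider unreachable; try the next one
    return None  # everything is down: time for the status page and customer messaging
```

A real fail-over plan also needs data replication and DNS or traffic-manager switching, but even a health check this simple forces you to decide, in advance, what "down" means and who gets notified.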

I have no idea how much money Apple is losing due to these outages – or how much its developer partners and customers are affected. Apple, however, is big enough to handle the hit. How about you?


The developer is king

“You should double your top line revenue by making your products more awesome, not by doubling the size of your sales department.”

That was one of the insights shared during a technology roundtable held on July 16 in San Francisco. Called “The Developer is King,” the discussion was moderated by Don Dodge of Google Ventures, formerly a startup evangelist at Microsoft and an engineer at such diverse firms as AltaVista, Napster and Groove Networks. Also on the panel: John Collison, founder of online payment site Stripe; Tom Preston-Werner, founder of GitHub; Suhail Doshi, co-founder of Web analytics firm Mixpanel; and Lew Cirne, founder of app monitoring firm New Relic.

The panel was filled with pithy aphorisms about why so many developers are succeeding as entrepreneurs. For example: “developers aren’t just techies; they are artists who create things,” and “a good startup founder is someone who doesn’t live only to write code, but who likes to solve problems.”

What made this conversation particularly interesting is that not only are these founders all developers, but their customers are also developers. The panelists offered some true words of wisdom for anyone targeting developers:

• Developers are hard to please. You have to build products that just work — you can’t create success through whiz-bang marketing.

• Developers will see your product and think they can build it themselves. It’s often not hard to duplicate your product. So you have to focus on the customers, ecosystem and simplicity.

• If you are building a commercial offering atop open source software, show that you help developers get their work done more easily than the open source version.

• Tools are quite viral; developers are great at telling their friends what works for them — and what doesn’t work for them.

• Focus on the initial user experience, and make customers more productive immediately. Contrast your offering with big platforms that require a lot of work to install, configure, train and use before the customer sees any benefit.

• The way to innovate is to try lots of things – and create a culture that tolerates failure.

• When hiring, a cultural fit beats anything on the resume. You can teach skills – you can’t teach character.

• Don’t set out to build a company; instead, start out creating a solution to a real problem, and then grow that into a business.

• Don’t get hung up on analyst estimates of market size. Create markets, don’t pursue them.

and my favorite,

• You shouldn’t build a company by focusing on a current fad or gold rush. Rather, figure out where people are frustrated or having problems. Make something that people want. Figure out how to make people happy.


Saying farewell to the mouse-man, Douglas Engelbart

Dr. Douglas Engelbart, who passed away on July 2, was best known as the inventor of the computer mouse. While Dr. Engelbart was the brains behind many revolutionary ideas, his demonstration of a word processor using a mouse in 1968 paved the way for the graphical user interfaces in Xerox’s Alto (1973), Apple’s Lisa (1983) and Macintosh (1984), Microsoft’s Windows (1985) and IBM’s OS/2 Presentation Manager (1988).

Future generations may regard the mouse as a transitional technology. Certainly the touch interface, popularized in the iPad, Android tablets and Windows 8, is making a dent in the need for the mouse — though my Microsoft Surface Pro is far easier to use with a mouse, in addition to the touch screen.

Voice recognition is also making powerful strides. When voice is combined with a touch screen, it’s possible to envision the post-WIMP (Windows, Icons, Menus and Pointing Devices) mobile-style user experience surpassing mouse-driven systems.

Dr. Engelbart, who was recently fêted in Silicon Valley, was 88. Here are some links to help us gain more insight into his vision:

Obituary in the New York Times, by John Markoff.

“The Mother of All Demos” from 1968. Specifically, see clips 3 and 12, where Dr. Engelbart edits documents with a mouse.

A thoughtful essay about Dr. Engelbart’s career, by Tom Foremski.

I never had the honor of meeting Dr. Engelbart. There was a special event commemorating his accomplishments at Stanford Research Institute in 2008, but unfortunately I was traveling.

It’s remarkable for one person to change the world in such a significant way – and so fast. Dr. Engelbart and his team invented not only the mouse, but also personal computing as we know it today. It is striking how that 1968 demo resembles desktop and notebook computing circa 2013. Not bad. Not bad at all. May his memory be a blessing.


Test Early, Test Often

Quality Assurance. Testing. No matter what you call it – and of course, there are subtle distinctions between testing and QA – the discipline is essential for successfully creating professional-grade software.

Sure, a one-person shop or a small consultancy might get away without having formal test teams or serious QA policies. Most of us can’t afford to work that way. The cost of software failure, to us and to our customers, can be huge in so many ways.

SD Times and sdtimes.com recently asked readers about test and QA in a research study. Here are some of the results; how well do the answers match your organization’s profile?

Does your organization have separate development and test teams? (Please check one only)

Yes, all development teams and test/QA teams are separate 35.9%
Some development and test/QA teams are separate, some are integrated 33.4%
All test and development teams are integrated 27.4%
Don’t know 3.3%

If any of the test/QA teams in your organization are separate, where do those test teams report? (Please check all that apply)

To the development team 16.2%
To a development manager, director, or VP of development 33.8%
To an IT manager not managing development 22.2%
To a software architect or project leader on a particular project 19.7%
To the CIO/CTO 9.2%
To line of business managers 14.8%
Don’t know 8.1%

What background do your test/QA managers and directors typically have? (Please check all that apply)

Development 20.3%
Test/QA only 28.9%
Development and test/QA 48.9%
General IT background 31.7%
General management background 18.5%
No particular background – we train them from scratch 14.2%

Does your company outsource any of its software quality assurance or testing? (Please check one only)

Yes, all of it 3.7%
Yes, some of it 32.3%
No, none of it 58.1%
Don’t know 5.9%

Who is responsible for internally-developed application performance testing and monitoring in your company? (Please check all that apply)

Software/Application Developers 68.2%
Software/Application Development Management 54.2%
Testers 52.3%
Testing Management 43.9%
Systems administrators 34.9%
IT top management (development) (VP or above) 29.3%
Networking personnel 25.2%
IT top management (non-development) (VP or above) 24.6%
Line-of-business management 21.5%
Consultants 20.2%
Service providers 19.0%
Networking management 18.1%

What is the state of software security testing at your company? (Please check all that apply)

Software security is checked by the developers 41.2%
Software security is checked by the test/QA team 31.6%
Software security is tested by a separate security team 26.9%
Software security testing is done for Web applications 25.7%
Software security is checked by the IT/networking department 25.4%
Software security testing is done for in-house applications 24.1%
Software security testing is done for public-facing applications 21.7%
We don’t have a specific security testing process 20.4%
Software security is checked by contractors 9.3%
Software security testing is not our responsibility 3.1%

At what stage is your company, or companies that you consult, using the cloud for software testing? (Please check one only)

No plans to use the cloud for software testing 42.3%
We are studying the technology but have not started yet 21.2%
We are experimenting with using the cloud for software testing 16.0%
We are using the cloud for software testing on a routine basis 10.7%
Don’t know 9.8%

Lots of good data here!


Building on Microsoft Build

If you were at Microsoft Build this week in San Francisco, you hung out with six thousand of your closest friends. At least, your closest friends who are enterprise .NET developers, or who are building apps for some version of Windows 8.

Those aren’t necessarily the same people. The Microsoft world is more bifurcated than ever before.

There’s the solid yet slow-moving world of the Microsoft enterprise stack. Windows Server, SQL Server, Exchange, SharePoint, Azure and all that jazz. This part of Microsoft thinks that it’s Oracle or IBM.

And then there’s the quixotic set of consumer-facing products. Xbox, Windows Phone, the desktop version of Windows 8, and of course, snazzy new hardware like the Surface tablet. This part of Microsoft thinks that it’s Apple or Google – or maybe Samsung.

While the company’s most important (and most loyal) customer base is the enterprise, there’s no doubt that Microsoft wants to be seen as Apple, not IBM. Hip. Creative. Innovative. In touch with consumers.

#Microsoft wants to trend on Twitter.

To thrive in the consumer world, the company must dig deeper and do better. The highlight of Build was the preview of Windows 8.1, with user experience improvements that undo some of the damage done by Windows 8.0.

It’s great that you can now boot into the “desktop,” or traditional Windows. That is important for both desktop and tablet users. Yet the platform remains frenetic, inconsistent and missing key apps in the Tile motif.

While the Tile experience is compelling, it’s incomplete. You can’t live in it 100%. Yet Windows 8.0 locked you away from living in the old “desktop” environment. Windows 8.1 helps, but it’s not enough.

In his keynote address (focused on consumer tech), Microsoft CEO Steve Ballmer pushed two themes.

One was that the company is moving to ship software faster. Citing the one-year timeline between Windows 8.0 and Windows 8.1 — instead of the traditional three-year cycle — the unstated message is that Microsoft is emulating Apple’s annual platform releases. “Rapid Release is the new norm,” Ballmer said.

A second theme is that Microsoft’s story is still Windows, Windows, Windows. This is no change from the past. Yes, Microsoft plays better with other platforms than ever before. Even so, Redmond wants to control every screen — and can’t understand why you might use anything other than Windows.

The more things change, the more they stay the same.


Four common mobile development mistakes

Web sites developed for desktop browsers look, quite frankly, terrible on a mobile device. The look and feel is often wrong, very wrong. Text is the wrong size. Gratuitous clip art on the home page chews up bandwidth. Features like animations won’t behave as expected. Don’t get me started on menus — or on the use-cases for how a mobile user would want to use and navigate the site.

Too often, some higher-up says, “Golly, we must make our website more friendly,” and what that results in is a half-thought-out patch job. Not good. Not the right information, not the right workflow, not the right anything.

One organization, UserTesting.com, says that there are four big pitfalls that developers (and designers) encounter when creating mobile versions of their websites. The company, which focuses on usability testing, says that the biggest issues are:

Trap #1 – Clinging to Legacy: ‘Porting’ a Computer App or Website to Mobile
Trap #2 – Creating Fear: Feeding Mobile Anxiety
Trap #3 – Creating Confusion: Cryptic Interfaces and Crooked Success Paths
Trap #4 – Creating Boredom: Failure to Quickly Engage the User

Makes sense, right? UserTesting.com explores each of these traps in its detailed report, “The Four Mobile Traps.”

The report says,

Companies creating mobile apps and websites often underestimate how different the mobile world is. They assume incorrectly that they can create for mobile using the same design and business practices they learned in the computing world. As a result, they frequently struggle to succeed in mobile.

These companies can waste large amounts of time and money as they try to understand why their mobile apps and websites don’t meet expectations. What’s worse, their awkward transition to mobile leaves them vulnerable to upstart competitors who design first for mobile and don’t have the same computing baggage holding them back. From giants like Facebook to the smallest web startup, companies are learning that the transition to mobile isn’t just difficult, it’s also risky.

Look at your website. Is it mobile friendly? I mean, truly designed for the needs, devices, software and connectivity of your mobile users?

If not — do something about it.


See no evil, hear no evil, speak no evil, code no evil

Data can be abused. The rights of individuals can be violated. Bits of disparate information can be tracked without a customer’s knowledge, and used to piece together identities or other profile information that a customer did not intend to divulge. Thanks to Big Data and other analytics, patterns can be tracked that would dismay customers or partners.
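To make that risk concrete, here is a toy sketch (with entirely fabricated data) of how two innocuous-looking datasets can be joined on a quasi-identifier to build a profile no customer knowingly divulged:

```python
# Two datasets that each seem harmless on their own.
purchases = [
    {"zip": "94105", "birth_year": 1980, "item": "prenatal vitamins"},
    {"zip": "10001", "birth_year": 1975, "item": "golf clubs"},
]
voter_roll = [
    {"zip": "94105", "birth_year": 1980, "name": "Jane Doe"},
    {"zip": "10001", "birth_year": 1975, "name": "John Smith"},
]

# Join on the quasi-identifier (zip, birth_year): suddenly the
# anonymous purchase history is attached to a name.
index = {(v["zip"], v["birth_year"]): v["name"] for v in voter_roll}
profiles = [
    {"name": index[(p["zip"], p["birth_year"])], "item": p["item"]}
    for p in purchases
    if (p["zip"], p["birth_year"]) in index
]
print(profiles[0])  # {'name': 'Jane Doe', 'item': 'prenatal vitamins'}
```

The join is a few lines of code; the ethical and legal questions it raises are the hard part.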

What is the responsibility of the software development team to make sure that a company does the right thing – both morally and legally? The straight-up answer from most developers, and most IT managers outside the executive suite, is probably, “That’s not our problem.” That is not a very good answer.

Corporations and other organizations have senior managers, such as owners, presidents, CEOs and boards of directors. There is no doubt that those individuals have the power to say yes – and the power to say no.

Top bosses might consult with legal authorities, such as in-house counsel or outside experts. The ultimate responsibility for making the right decision rests with the ultimate decision-makers. I am not a lawyer, but I expect that in a lawsuit, any potential liability belongs with managers who misuse data. Programmers who coded an analytics solution would not be named or harmed.

This topic has been on my mind for some time, as I ponder both the ethics and the legalities implicit in large-scale data mining. Certainly this has been a major subject of discussion by pundits and elected officials, at least in the United States, when it comes to customer info and social-media posts being captured and utilized by marketers.

Some recent articles on this subject:

Era of Online Sharing Offers Benefits of ‘Big Data,’ Privacy Trade-Offs

The Challenge of Big Data for Data Protection

Big Data Is Opening Doors, but Maybe Too Many

What are we going to do in the face of questionable software development requirements? Whether we are data scientists, computer scientists or other IT professionals, it is quite unclear. A few developers might prefer to resign rather than write software they believe crosses a moral line. Frankly, I doubt that many would do so.

Some developers might say, “I didn’t understand the implications.” Or they might say, “If I don’t code this application, management will fire me and get someone else to do it.” Or they might even say, “I was just following orders.”


Hurray for COBOL and the mainframe

Perhaps I’m an old fogey, but I can’t help but smile when I see press releases like this: “IBM Unveils New Software to Enable Mainframe Applications on Cloud, Mobile Devices.” 

Everything old is new again, as the late Australian musician Peter Allen famously sang in his song of that name.

Mainframes were all the rage in the 1960s and 1970s. Though large organizations still used mainframes as the basis of their business-critical transaction systems in the 1990s and 2000s, the excitement was around client/server and n-tier architectures built up from racks of low-cost commodity hardware.

Over the past 15 years or so, it’s become clear that distributed processing for Web applications fits neatly into that clustered model. Assemble a few racks of servers and add a load-balancing appliance, and you’ve got all the scalability and reliability anyone needs.
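The load-balancing piece of that model is conceptually simple. A round-robin scheduler across a server pool can be sketched like this (the server names are invented for illustration):

```python
import itertools

# A hypothetical pool of commodity servers sitting behind the balancer.
pool = ["rack1-node1", "rack1-node2", "rack2-node1", "rack2-node2"]
servers = itertools.cycle(pool)  # round-robin: loop over the pool forever

# Hand each incoming request to the next server in turn.
requests = [f"req-{i}" for i in range(6)]
assignments = [(req, next(servers)) for req in requests]
# req-4 wraps around to rack1-node1 again
```

Real load-balancing appliances add health checks, session affinity and weighted scheduling, but round-robin distribution is the heart of the clustered model.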

But you know, from the client perspective, the cloud looks like, well, a thundering huge mainframe.

Yes, I am an old fogey, who cut his teeth on FORTRAN, COBOL, PL/1 and CICS on Big Blue’s big iron (that is to say, IBM System/370). Yes, I can’t help but think, “Hmm, that’s just like a mainframe” far too often. And yes, the mainframe is very much alive.

IBM’s press release says:

Today, nearly 15 percent of all new enterprise application functionality is written in COBOL. The programming language also powers many everyday services such as ATM transactions, check processing, travel booking and insurance claims. With more than 200 billion lines of COBOL code being used across industries such as banking, insurance, retail and human resources, it is crucial for businesses to have the appropriate framework to improve performance, modernize key applications and increase productivity.

I believe that. Sure, there are lots of applications written in Java, C++, C# and JavaScript. Those are on the front end, where a failed database read or write, or a non-responsive screen, is an annoyance, nothing more. On the back end, if you want the fastest possible response time, without playing games with load balancers, and without failures, you’re still looking at a small number of big boxes, not a large number of small boxes.

This fogey is happy that the mainframe is alive and well.


Big Expectations and Big Challenges for Big Data

According to IDG Research, 80% of business leaders say that Big Data should enable more informed business decisions – and 37% say that the insights provided by Big Data should prove critical to those decisions.

A February 2013 survey on Big Data was designed and executed jointly by IDG Research and Kapow Software, which sells an integration platform for Big Data. As with all vendor surveys, bear in mind that Kapow wants to make Big Data sound exciting, and to align the questions with its own products and services.

That said, the results of the survey of 200 business leaders are interesting:

• 71% said that Big Data should help increase their competitive advantage by keeping them ahead of market trends

• 68% said that Big Data should improve customer satisfaction

• 62% believe Big Data should increase end-user productivity by providing real-time access to business information

• 60% said Big Data should improve information security and/or compliance

• 55% said Big Data should help create compelling new products and services

• 33% said Big Data should help them monitor and respond to social media in real time

Those are big expectations for Big Data! The results to date… not so much. The study revealed that only one-third of organizations surveyed have implemented any sort of Big Data initiative – but another third expect to do so over the next year.

What are the barriers to Big Data success? The study’s answers:

• 53% say a lack of awareness of Big Data’s potential

• 49% say concerns about the time-to-value of the data

• 45% say having the right employee skills and training

• 43% say ability to extract data from the correct sources


Get ready for the #SDTimes100

The software development world keeps on changing. Just when we think we have a handle on something as simple as application lifecycle management, or cloud computing, or mobile apps, we get new models, new innovations, new technologies.

Forget plugging pair programming or continuous delivery or automated testing before checking code into the repository. The industry has moved on. Today, the innovation is around DevOps and Big Data and HTML5 and app stores and… well… it keeps changing.

Tracking and documenting those changes – that’s what we do at SD Times. Each year, the editors stop, catch our breath, and make a big list of the top innovators of the software development industry. We identify the leaders – the companies, the open-source projects, the organizations who ride the cutting edge.

To quote from Rex Kramer in the movie Airplane!, the SD Times 100 are the boss, the head man, the top dog, the big cheese, the head honcho, number one…

Who are the SD Times 100? This week, all will be revealed. We will begin tweeting out the SD Times 100 on Thursday. Follow the action by watching @sdtimes on Twitter, or look for hashtag #SDTimes100.

After all the tweeting is complete, the complete list will be published to www.sdtimes.com. Be part of the conversation!


In-memory databases poised for takeoff

The classic database engines – like the big relational behemoths from IBM, Microsoft and Oracle – store their data on disk. So do many of the open-source databases, like MySQL and PostgreSQL, as well as the vast array of emerging NoSQL databases. While such database engines keep all the rows and columns on relatively slow disks, they can boost performance by putting some elements, including indices and sophisticated predictive caches, on faster solid-state storage or even faster main memory.

From a performance perspective, it would be great to store everything in main memory. It’s fast, fast, fast. It’s also expensive, expensive, expensive, and, in conventional systems, not persistent. That’s why database designers and administrators leverage a hierarchy: A few key elements in the fastest, most costly main memory; more data in fast, costly solid-state storage; the bulk in super-cheap rotating disks. In some cases, of course, some of the data goes into a fourth layer in the hierarchy, off-line optical or tape storage.

In-memory databases challenge those assumptions for applications where database response time is the bottleneck to application performance. Sure, main memory is still fabulously expensive, but it’s not as costly as it used to be. New non-volatile RAM technologies can make main memory somewhat persistent without dramatically harming read/write times. (To the best of my knowledge, NVRAM remains slower than standard RAM – but not enough to matter.)

That’s not to say that your customer database, your server logs, or your music library should be stored within an in-memory database. Nope. Not even close. But as you examine your application architecture, think about database contents that dramatically affect raw performance, user experience or API response time. If you can isolate those elements, and store them within an in-memory database, you might realize several orders of magnitude improvement at minimal cost — and with potentially less complex code than you’d need to manage a hierarchical storage system.
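To make the idea concrete, here is a minimal sketch using SQLite’s “:memory:” mode to stand in for a dedicated in-memory database. The table, column names and sample scores are illustrative assumptions: the point is that only the small, latency-critical slice of the data lives in RAM, while the bulk stays in your regular disk-backed store.

```python
import sqlite3

# An in-memory database: nothing here touches disk.
hot = sqlite3.connect(":memory:")
hot.execute("CREATE TABLE session_scores (user_id TEXT PRIMARY KEY, score REAL)")

# Load only the latency-critical elements -- e.g. per-user scores
# that drive API response times -- into the in-memory table.
rows = [("alice", 0.91), ("bob", 0.42), ("carol", 0.77)]
hot.executemany("INSERT INTO session_scores VALUES (?, ?)", rows)

def lookup(user_id):
    """Sub-millisecond read from the in-memory hot path."""
    cur = hot.execute(
        "SELECT score FROM session_scores WHERE user_id = ?", (user_id,))
    row = cur.fetchone()
    return row[0] if row else None

print(lookup("alice"))
```

A production in-memory engine adds replication and (with NVRAM) persistence, but the architectural move is the same: carve out the hot data and serve it from memory.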

Not long ago, in-memory databases were a well-kept secret. The secret is out, according to research from Evans Data. Their new Global Development survey says that the share of developers using in-memory databases worldwide has increased by 40%, from 18% to 26%, during the last six months. An additional 39% globally say they plan to incorporate in-memory databases into their development work within the next 12 months.

Z Trek Copyright (c) Alan Zeichick

Let’s boost developer velocity by 30x

Not long ago, if the corporate brass wanted to change major functionality in a big piece of software, the IT delivery time might be six to 12 months, maybe longer. Once upon a time, that was acceptable. Not today.

Thanks to agile, many software changes can be delivered in, say, six to 12 weeks. That’s a huge improvement — but not huge enough. Business imperatives might require that IT deploy new application functionality in six to 12 days.

Sounds impossible, right? Maybe. Maybe not. I had dinner a few days ago with S. “Soma” Somasegar, the corporate vice president of Microsoft’s Developer Division. He laughed – and nodded – when I mentioned the need for a 30x shift in software delivery from months to days.

After all, as Soma pointed out, Microsoft is deploying new versions of its cloud-based Team Foundation Service every three weeks. The company has also realized that revving Visual Studio itself every two or three years isn’t serving the needs of developers. That’s why his team has begun rolling out regular updates that include not only bug fixes but also new features. The latest is Update 2 to Visual Studio 2012, released in late April, which added new features for agile planning, quality assurance and line-of-business app development, plus improvements to the developer experience.

I like what I’m hearing from Soma and Microsoft about their developer tools, and about their direction. For example, the company appears sincere in its engagement of the open source community through Microsoft Open Technologies — but I’ll confess to still being a skeptic, based on Microsoft’s historical hostility toward open source.

Soma said that it’s vital not only for Microsoft to contribute to open source, but also to let open source communities engage with Microsoft. It’s about time!

Soma also cited the company’s new-found dedication to DevOps. He said that future versions of both on-premises and cloud-based tools will help tear down the walls between development and deployment. That’s where the 30x velocity improvement might come from.

Another positive shift is that Microsoft appears to truly accept that other platforms are important to developers and customers. He acknowledges that the answer to every problem cannot be to use Microsoft technologies exclusively.

Case in point: Soma said that fully 60% of Microsoft developers are building applications that touch at least three different platforms. He acknowledged that Microsoft still believes that it has the best platforms and tools, but said, “We now know that developers make other choices for valid reasons. We want to meet developers where they are” – that is, engaging with other platforms.

Soma’s words may seem like a modest and obvious statement, but it’s a huge step forward for Microsoft.


Mobile developer mojo

Tickets for the Apple Worldwide Developer Conference went on sale on Thursday, April 25. They sold out in two minutes.

Who says that the iPhone has lost its allure? Not developers. Sure, Apple’s stock price is down, but at least Apple Maps on iOS doesn’t show the bridge over Hoover Dam dropping into Black Canyon any more.

Two minutes.

To quote from a story on TechCrunch,

Tickets for the developer-focused event at San Francisco’s Moscone West, which features presentations and one-on-one time with Apple’s own in-house engineers, sold out in just two hours in 2012, in under 12 hours in 2011, and in eight days in 2010.

Who attends the Apple WWDC? Independent software developers, enterprise developers and partners. Thousands of them. Many are building for iOS, but there are also developers creating software or services for other aspects of Apple’s huge ecosystem, from e-books to Mac applications.

Two minutes.

Mobile developers love tech conferences. Take Google’s I/O developer conference, scheduled for May 15-17. Tickets sold out super-fast there as well.

The audience for Google I/O is potentially more diverse, mainly because Google offers a wider array of platforms. You’ve got Android, of course, but also Chrome, Maps, Play, AppEngine, Google+, Glass and others besides. My suspicion, though, is that enterprise and entrepreneurial interest in Android is filling the seats.

Mobile. That’s where the money is. I’m looking forward to seeing exactly what Apple will introduce at WWDC, and Google at Google I/O.

Meanwhile, if you are an Android developer and didn’t get into Google I/O before it sold out – or if you are looking for a technical conference 100% dedicated to Android development – let me invite you to register for AnDevCon Boston, May 28-31. We still have a few seats left. Hope to see you there.


Coping with the data

As I write this on Friday, Apr. 19, it’s been a rough week. A tragic week. Boston is on lockdown, as the hunt for the suspected Boston Marathon bombers continues. Explosion at a fertilizer plant in Texas. Killings in Syria. Suicide bombings in Iraq. And much more besides.

The Boston incident struck me hard. Not only as a native New Englander who loves that city, and not only because I have so many friends and family there, but also because I was near Copley Square only a week earlier. My heart goes out to all of the past week’s victims, in Boston and worldwide.

Changing the subject entirely: I’d like to share some data compiled by Black Duck Software and North Bridge Venture Partners. This is their seventh annual report about open source software (OSS) adoption. The notes are analysis from Black Duck and North Bridge.

How important will the following trends be for open source over the next 2-3 years?

#1 Innovation (88.6%)
#2 Knowledge and Culture in Academia (86.4%)
#3 Adoption of OSS into non-technical segments (86.3%)
#4 OSS Development methods adopted inside businesses (79.3%)
#5 Increased awareness of OSS by consumers (71.9%)
#6 Growth of industry specific communities (63.3%)

Note: Over 86% of respondents ranked Innovation and Knowledge and Culture of OSS in Academia as important/very important.

How important are the following factors to the adoption and use of open source? Ranked in response order:

#1 – Better Quality
#2 – Freedom from vendor lock-in
#3 – Flexibility, access to libraries of software, extensions, add-ons
#4 – Elasticity, ability to scale at little cost or penalty
#5 – Superior security
#6 – Pace of innovation
#7 – Lower costs
#8 – Access to source code

Note: Quality jumped to #1 this year, from third place in 2012.

How important are the following factors when choosing between using open source and proprietary alternatives? Ranked in response order:

#1 – Competitive features/technical capabilities
#2 – Security concerns
#3 – Cost of ownership
#4 – Internal technical skills
#5 – Familiarity with OSS Solutions
#6 – Deployment complexity
#7 – Legal concerns about licensing

Note: A surprising result was that “Formal Commercial Vendor Support” was ranked as the least important factor – 12% of respondents ranked it as unimportant. Support has traditionally been held as an important requirement by large IT organizations; with awareness of OSS rising, the requirement is rapidly diminishing.

When hiring new software developers, how important are the following aspects of open source experience? Ranked in response order:

2012
#1 – Variety of projects
#2 – Code contributions
#3 – Experience with major projects
#4 – Experience as a committer
#5 – Community management experience

2013
#1 – Experience with relevant/specific projects
#2 – Code contributions
#3 – Experience with a variety of projects
#4 – Experience as a committer
#5 – Community management experience

Note: The 2013 results signal a shift to “deep vs. broad experience” where respondents are most interested in specific OSS project experience vs. a variety of projects, which was #1 in 2012.

There is a lot more data in the Future of Open Source 2013 survey. Go check it out. 


Big Data and PC Sales Data

Last week, we held the debut Big Data TechCon in Cambridge, Mass. It was a huge success – more attendees than we expected, which is great. (With a debut event, you never really know.)

We had lots of sessions, many of which were like trying to drink from a fire hose. That’s a good thing.

A commonality is that there is no single thing called Big Data. There are oodles of problems that have to do with capturing, processing and storing large quantities of structured and unstructured data. Some of those problems are called Big Data today, but some have evolved out of diverse disciplines like data management, data warehousing, business intelligence and matrix-based statistics.

Problems that seemed simple to solve when you were talking about megabytes or terabytes are not simple when you’re talking about petabytes.

You may have heard about the “Four V’s of Big Data” – Volume, Velocity, Variety and Veracity. Some Big Data problems are impacted by some of these V’s. Other Big Data problems are impacted by other V’s.

Think about problem domains where you have very large multidimensional data sets to be analyzed, like insurance or protein folding. Those petabytes are static or updated somewhat slowly. However, you’d like to be able to run a broad range of queries. That’s an intersection of data warehousing and business intelligence. You’ve got volume and veracity. Not much variety. Velocity is important on reporting, not on data management.

Or you might have a huge mass of real-time data. Imagine a wide variety of people, like in a social network, constantly creating all different types of data, from text to links to audio to video to photos to chats to comments. You not only have to store all this; you also have to decide, quickly, what to present to whom, through relationships, permissions and filters, and implement a behind-the-scenes recommendation engine to prioritize the flow. Oh, and you have to do it all sub-second. There, all four V’s come into play.
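One small slice of that velocity problem – keeping a “what’s trending right now” answer current as events stream in – can be sketched with a sliding window. The window size and topic names below are illustrative assumptions, not anything a real social network uses.

```python
from collections import Counter, deque

WINDOW_SECONDS = 60          # how far back "right now" reaches

events = deque()             # (timestamp, topic) pairs, oldest first
counts = Counter()           # live counts within the window

def record(topic, now):
    """Ingest one event and expire anything that fell out of the window."""
    events.append((now, topic))
    counts[topic] += 1
    while events and now - events[0][0] > WINDOW_SECONDS:
        _, old_topic = events.popleft()
        counts[old_topic] -= 1

def trending(n=3):
    """Top-n topics inside the current window."""
    return [topic for topic, _ in counts.most_common(n)]

t = 0
for topic in ["cats", "news", "cats", "sports", "cats", "news"]:
    record(topic, t)
    t += 10
print(trending())
```

At scale the same shape shows up as stream processors and approximate counters sharded across a cluster, but the window-and-count structure is the core of it.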

Much in Big Data has to do with how you model the data or how you visualize it. In non-trivial cases, there are many ways of implementing a solution. Some run faster, some are slower; some scale more, others scale less; some can be done by coding into your existing data infrastructure, and others require drastic actions that bolt on new systems or invite rip-and-replace.

Big Data is fascinating. Please join us for the second Big Data TechCon, coming to the San Francisco Bay Area in October. See www.bigdatatechcon.com.

While in Cambridge wrapping up the conference, I received a press release from IDC: “PC Shipments Post the Steepest Decline Ever in a Single Quarter, According to IDC.”

To selectively quote:

Worldwide PC shipments totaled 76.3 million units in the first quarter of 2013 (1Q13), down -13.9% compared to the same quarter in 2012 and worse than the forecast decline of -7.7%.

Despite some mild improvement in the economic environment and some new PC models offering Windows 8, PC shipments were down significantly across all regions compared to a year ago. Fading Mini Notebook shipments have taken a big chunk out of the low-end market while tablets and smartphones continue to divert consumer spending. PC industry efforts to offer touch capabilities and ultraslim systems have been hampered by traditional barriers of price and component supply, as well as a weak reception for Windows 8. The PC industry is struggling to identify innovations that differentiate PCs from other products and inspire consumers to buy, and instead is meeting significant resistance to changes perceived as cumbersome or costly.

The industry is going through a critical crossroads, and strategic choices will have to be made as to how to compete with the proliferation of alternative devices and remain relevant to the consumer. 

It’s all about the tablets, folks. That’s right: iPads and Android-based devices like the Samsung Galaxy, Kindle Fire, Barnes & Noble Nook and Google Nexus. Attempts to make standard PCs more tablet-like (such as the Microsoft Surface devices) just aren’t cutting it. Just as we moved from minicomputers to desktops, and from desktops to notebooks, we are moving from notebooks to tablets.

(I spent most of the time at the Big Data TechCon working on a 7-inch tablet with a Bluetooth keyboard. I barely used my notebook at all. The tablet/keyboard had a screen big enough to write stories with, a real keyboard with keys, and best of all, would fit into my pocket.)

Just as desktops/notebooks have different operating systems, applications, data storage models and user experiences than minicomputers (and minicomputer terminals), so too the successful tablet devices aren’t going to look like a notebook with a touchscreen. Apps, not applications; cloud-based storage; massively interconnected networks; inherently social. We are at an inflection point. There’s no going back.


The 8-year-old Git is coming on strong

Git, the open-source version control system, is becoming popular with enterprise developers. Or so it appears not only from anecdotal evidence I hear from developers all the time, but also from a new marketing study from CollabNet.

The study, called “The State of Git in the Enterprise,” was conducted by InformationWeek, but was paid for by CollabNet, which coincidentally sells tools and services to help development teams use Git. You should bear that in mind when interpreting the study, which you can only receive by giving CollabNet your contact information.

That said, there are five interesting findings in the January 2013 study, which surveyed 248 development and business technology professionals at companies with 100 or more employees who use source code management tools:

First: Most developers are not using or planning to use Git. But of those that do, usage is split among on-premises, private-cloud and public-cloud deployments.

How do you deploy (or intend to deploy by 2013) Git?

On premises: 30%
Private cloud/virtualized: 23%
Public cloud: 10%
Don’t use/do not intend to use: 54%

Second: What best describes your use of Git today?

Git is our corporate standard: 5%
Git is one of several SCMs we use: 20%
Still kicking the tires on Git: 18%
Not currently using Git: 57%

Third: What do you like about Git?

Power branching/merging: 61%
Network performance: 53%
Everyone seems to be using it: 35%
It’s our corporate standard: 13%

Fourth: How do you conduct code reviews?

Automated and manual: 46%
Manual only: 24%
Manual, but only occasionally: 17%
Automated only: 7%
Not at all: 6%

Fifth: By the end of 2013, which SCM tools do you plan to use?

Microsoft TFS/VSS: 33%
Subversion: 32%
Git: 27%
IBM ClearCase: 22%
CVS: 21%
Perforce: 11%
Mercurial: 7%
None: 4%

Some of these technologies have been around for a long time. For example, CVS first appeared in 1986. CollabNet started Subversion in 2000, and it’s now a top-level Apache project. By contrast, Git’s initial release was only in 2005, and it flew under the radar for years before getting traction. Git’s rise to the third position on this study is impressive.


Moving into Big Data mode

Packing lists – check.  Supplies ordered – check. Show bags on schedule – check. Speakers all confirmed – check. Missing laptop power cord located – check. Airline tickets verified – check. Candy purchased for reservation desk – check.

Our team is getting excited for the debut Big Data TechCon. It’s coming up very shortly: April 8-10 in Boston.

What drove us to launch this technical conference? Frustration, really, that there were mainly two types of face-to-face conferences surrounding Big Data.

The first were executive-level meetings that could be summarized as “Here’s WHY you should be jumping on the Big Data bandwagon.” Thought leadership, perhaps, but little that someone could walk away with.

The second were training sessions or user meetings focused on specific technologies or products. Those are great if you are already using those products and need to train your staff on specific tools.

What was missing? A practical, technical conference focused on HOW TO do Big Data. How to choose between a wide variety of tools and technologies, without bias toward a particular platform. How to kick off a Big Data project, or scale existing projects. How to avoid pitfalls. How to define and measure success. How to leverage emerging best practices.

All that with dozens of tutorials and technical classes, plus inspiring keynotes and lots and lots of networking opportunities with the expert speakers and fellow attendees. After all, folks learn in both the formal classroom and the informal hallway and lunch table.

The result – Big Data TechCon, April 8-10 in Boston. If you are thinking about attending, now’s the time to sign up. Learn more at www.bigdatatechcon.com.

See you in Boston!


Android + Chrome = Confusion

What is going on at Google? I’m not sure, and neither are the usual pundits.

Last week, Google announced that Andy Rubin, the long-time head of the Android team, is moving to another role within the company, and will be replaced by Sundar Pichai — the current head of the company’s Chrome efforts.

To quote from Larry Page’s post:

Having exceeded even the crazy ambitious goals we dreamed of for Android—and with a really strong leadership team in place—Andy’s decided it’s time to hand over the reins and start a new chapter at Google. Andy, more moonshots please!

Going forward, Sundar Pichai will lead Android, in addition to his existing work with Chrome and Apps. Sundar has a talent for creating products that are technically excellent yet easy to use—and he loves a big bet. Take Chrome, for example. In 2008, people asked whether the world really needed another browser. Today Chrome has hundreds of millions of happy users and is growing fast thanks to its speed, simplicity and security. So while Andy’s a really hard act to follow, I know Sundar will do a tremendous job doubling down on Android as we work to push the ecosystem forward. 

What is the real story? The obvious speculation is that Google may have too many mobile platforms, and may look to merge the Android and Chrome OS operating systems.

Ryan Tate of Wired wrote, in “Andy Rubin and the Great Narrowing of Google,”

The two operating system chiefs have long clashed as part of a political struggle between Rubin’s Android and Pichai’s Chrome OS, and the very different views of the future each man espouses. The two operating systems, both based on Linux, are converging, with Android growing into tablets and Chrome OS shrinking into smaller and smaller laptops, including some powered by chips using the ARM architecture popular in smartphones.

Tate continues,

There’s a certain logic to consolidating the two operating systems, but it does seem odd that the man in charge of Android – far and away the more successful and promising of the two systems – did not end up on top. And there are hints that the move came as something of a surprise even inside the company; Rubin’s name was dropped from a SXSW keynote just a few days before the Austin, Texas conference began.

Other pundits seem equally confused. Hopefully, we’ll know what’s going on soon. Registration for Google’s I/O conference opened – and closed – on March 13. If you blinked, you missed it. We’ll obviously be covering the Android side of this at our own AnDevCon conference, coming to Boston on May 28-31.


Is Big Data a fancy way of saying Big Social?

What do companies use Big Data technologies to analyze? Sales transactions. Social media trends. Scientific data. Social media trends. Weather readings. Social media trends. Prices for raw materials. Social media trends. Stock values. Social media trends. Web logs. And social media trends.

Sometimes I wonder if the entire point of Big Data is to sort through tweets. And Pinterest, Facebook and Tumblr – as well as closed social media networks like Salesforce.com’s Chatter and Microsoft’s recently acquired Yammer.

Perhaps this is a reflection that “social” is more than a way for businesses to disintermediate and reach customers directly. (Remember “disintermediation”? It was the go-to word during the early dot-com era of B-to-B and B-to-C e-commerce, and implied unlimited profits.)

Social media – nowadays referred to simply as “social” – is proving to be very effective in helping organizations improve communications. Document repositories and databases are essential, of course. Portal systems are vital. But traditional ways of communication, namely e-mail and standard one-to-one instant messaging, aren’t getting the job done, not in big organizations. Employees drown in their overflowing inboxes, and don’t know whom to message for information or input or workflow.

Enter a new Big Data angle on social. It’s one that goes beyond sifting through public messages to identifying what’s trending so you can sell more products or get on top of customer dissatisfaction before it goes viral. (Not to say those aren’t important, but that’s only the tip of the iceberg.)

What Big Data analysis can show you is not just what is going on and what the trends are, but who is driving them – or who, at least, is ahead of the curve.

Use analytics to find out which of your customers are tastemakers – and cultivate them. Find out which of your partners are generating the most traction – and deepen those ties. And find out which of your employees, through in-house social tools like instant messaging, blogs, wikis and forums, are posting the best information, are attracting followers and comments, and are otherwise leading the pack.
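A hedged sketch of that “find your tastemakers” idea: score each person by simple engagement signals and rank them. The signals, weights and sample people below are invented for the example; a real model would be tuned against your own data.

```python
# Sample in-house social activity; all names and numbers are made up.
people = [
    {"name": "Ana", "posts": 40, "followers": 300, "comments": 120},
    {"name": "Raj", "posts": 75, "followers": 80,  "comments": 40},
    {"name": "Mei", "posts": 20, "followers": 500, "comments": 200},
]

def engagement_score(p):
    # Followers and comments received suggest others value the content
    # more than raw posting volume does, so weight them more heavily.
    return 1.0 * p["posts"] + 2.0 * p["followers"] + 3.0 * p["comments"]

tastemakers = sorted(people, key=engagement_score, reverse=True)
print([p["name"] for p in tastemakers])
```

Notice that the most prolific poster is not the top tastemaker – the ranking rewards influence over volume, which is the whole point.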

Treasure those people, especially those who are in your IT and development departments.

Big Social is the key to your organization’s future. Big Data helps you find and turn that key. We’ll cover both those trends at Big Data TechCon, coming to Boston from April 8-10. Hope to see you there.


Bug Invaders! Angry Code! World of Compilecraft!

Everything, it seems, is a game. When I use the Waze navigation app on my smartphone, I earn status for reporting red-light cameras. What’s next: If I check code into a version-control system early, do I win a prize? Get points? Become a Code Warrior Level IV?

Turning software development into a game is certainly not entirely new. Some people live for “winning,” and like getting points – or status – by committing code to open-source projects or by reporting bugs as a beta tester. For the most part, however, that was minor. The main reason to commit the code or document the defect was to make the product better. Gaining status should be a secondary consideration – a reward, if you will, not a motivator.

For some enterprise workers, however, gamification of the job can be more than a perk or added bonus. It may be the primary motivator for a generation reared on computer games. Yes, you’ll get paid if you get your job done (and fired if you don’t). But you’ll work harder if you are encouraged to compete against other colleagues, against other teams, against your own previous high score.
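The mechanics behind that kind of workplace gamification can be tiny: award points for actions and keep a leaderboard. The action names and point values below are invented for illustration, not drawn from any real tool.

```python
# Hypothetical point values for developer actions.
POINTS = {"early_commit": 10, "bug_report": 5, "code_review": 3}

scores = {}

def award(developer, action):
    """Credit a developer with the points for one action."""
    scores[developer] = scores.get(developer, 0) + POINTS[action]

award("dana", "early_commit")
award("dana", "code_review")
award("lee", "bug_report")

# The leaderboard is what players compete against -- including
# their own previous high score.
leaderboard = sorted(scores.items(), key=lambda kv: kv[1], reverse=True)
print(leaderboard)
```

Everything else in a gamification platform – badges, levels, streaks – is layered on top of exactly this accounting.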

Would gamification work with, say, me? I don’t think so. But from what I gather, it’s truly a generational divide. I’m a Baby Boomer; when I was a programmer, Back in the Day, I put in my hours for a paycheck and promotions. What I cared about most: What my boss thought about my work.

For Generation Y / Millennials (in the U.S., generally considered to be those born between 1982 and 2000), it’s a different game.

Here are some resources that I’ve found about gamification in the software development profession. What do you think about them? Do you use gamification techniques in your organization to motivate your workers?

Gamification in Software Development and Agile

Gamifying Software Engineering and Maintenance

Gamifying software still in its infancy, but useful for some

Some Thoughts on Gamification and Software

TED Talk: Gaming can make a better world 


Big challenges with data and Big Data

Just about everyone is talking about Big Data, and I’m not only saying that because I’m conference chair for Big Data TechCon, coming up in April in Boston.

Take Microsoft, for example. On Feb. 13, the company released survey results that talked about their big customers’ biggest data challenges, and how those relate to Big Data.

In its “Big Data Trends: 2013” study, Microsoft talked to 282 U.S. IT decision-makers who are responsible for business intelligence, and presumably, other data-related issues. To quote some findings from Microsoft’s summary of that study:

• 32% expect the amount of data they store to double in the next two to three years.

• 62% of respondents currently store at least 100 TB of data. 

• Respondents reported an average of 38% of their current data as unstructured.

• 89% already have a dedicated budget for a Big Data solution.

• 51% of companies surveyed are in the middle stages of planning a Big Data solution.

• 13% have fully deployed a Big Data solution.

• 72% have begun the planning process but have not yet tested or deployed a solution; of those currently planning, 76% expect to have a solution implemented in less than one year.

• 62% said developing near-real-time predictive analytics or data-mining capabilities during the next 24 months is extremely important.

• 58% rated expanding data storage infrastructure and resources as extremely important.

• 53% rated increased amounts of unstructured data to analyze as extremely important.

• Respondents expect an average of 37% growth in data during the next two to three years.

I can’t help but be delighted by one last finding from Microsoft’s study: “Most respondents (54 percent) listed industry conferences as one of the two most strategic and reliable sources of information on big data.”

Hope to see you at Big Data TechCon.


The complications of cloud adoption

Cloud computing is seductive. Incredibly so. Reduced capital costs. No more power and cooling of a server closet or data center. High-speed Internet backbones. Outsourced disaster recovery. Advanced edge caching. Deployments are lightning fast, with capacity ramp-ups only a mouse-click away – making the cloud a panacea for Big Data applications.

Cloud computing is scary. Vendors come and vendors go. Failures happen, and they are out of your control. Software is updated, sometimes with your knowledge, sometimes not. You have to take their word for security. And the costs aren’t always lower.

An interesting new study from KPMG, “The Cloud Takes Shape,” digs into the expectations of cloud deployment – and the realities.

According to the study, cloud migration was generally a success, but not a painless one: 33% of senior executives using the cloud said that the implementation, transition and integration costs were too high; 30% cited challenges with data loss and privacy risks; and 30% were worried about the loss of control. Also, 26% were worried about the lack of visibility into future demand and associated costs; 26% fretted about the lack of interoperability standards between cloud providers; and 21% were challenged by the risk of intellectual property theft.

There’s a lot more depth in the study, and I encourage you to download and browse through it. (Given that KPMG is a big financial and tax consulting firm, there’s a lot in the report about the tax challenges and opportunities in cloud computing.)

The study concludes,

Our survey finds that the majority of organizations around the world have already begun to adopt some form of cloud (or ‘as-a-service’) technology within their enterprise, and all signs indicate that this is just the beginning; respondents expect to move more business processes to the cloud in the next 18 months, gain more budget for cloud implementation and spend less time building and defending the cloud business case to their leadership. Clearly, the business is becoming more comfortable with the benefits and associated risks that cloud brings.

With experience comes insight. It is not surprising, therefore, that the top cloud-related challenges facing business and IT leaders has evolved from concerns about security and performance capability to instead focus on some of the ‘nuts and bolts’ of cloud implementation. Tactical challenges such as higher than expected implementation costs, integration challenges and loss of control now loom large on the cloud business agenda, demonstrating that – as organizations expand their usage and gain more experience in the cloud – focus tends to turn towards implementation, operational and governance challenges.


You can’t analyze what you don’t capture

Big Data can sometimes mean Big Obstacles. And often those obstacles are simply that the Big Data isn’t there.

That’s what more than 1,400 CIOs told Robert Half Technology, a staffing agency. According to the study, whose data was released in January, only 23% of CIOs said their companies collected customer data about demographics or buying habits. Of those that did collect this type of data, 53% of the CIOs said they had insufficient staff to access or analyze that data.

Ouch. 

The report was part of Robert Half Technology’s 2013 Salary Guide. There is a page about Big Data, which says,

When you consider that more than 2.7 billion likes and comments are generated on Facebook every day — and that 15 out of 17 U.S. business sectors have more data stored per company than the U.S. Library of Congress — it’s easy to understand why companies are seeking technology professionals who can crack the big data “code.”

Until recently, information collected and stored by companies was a mishmash waiting to be synthesized. This was because most companies didn’t have an effective way to aggregate it.

Now, more powerful and cost-effective computing solutions are allowing companies of all sizes to extract the value of their data quickly and efficiently. And when companies have the ability to tap a gold mine of knowledge locked in data warehouses, or quickly uncover relevant patterns in data coming from dynamic sources such as the Web, it helps them create more personalized online experiences for customers, develop highly targeted marketing campaigns, optimize business processes and more.


Honors for the father of fuzzy logic, Lotfi Zadeh

“In contrast to classical logical systems, fuzzy logic is aimed at a formalization of modes of reasoning that are approximate rather than exact. Basically, a fuzzy logical system may be viewed as a result of fuzzifying a standard logical system. Thus, one may speak of fuzzy predicate logic, fuzzy modal logic, fuzzy default logic, fuzzy multivalued logic, fuzzy epistemic logic, and so on. In this perspective, fuzzy logic is essentially a union of fuzzified logical systems in which precise reasoning is viewed as a limiting case of approximate reasoning.”

So began one of the most important technical articles published by AI Expert Magazine during my tenure as its editor: “The Calculus of Fuzzy If/Then Rules,” by Lotfi A. Zadeh, in March 1992.

Even then, more than 20 years ago, Dr. Zadeh was revered as the father of fuzzy logic. I recall my interactions with him on that article very fondly.

I was delighted to learn that Fundación BBVA, the philanthropic foundation of the Spanish bank BBVA, has recognized Dr. Zadeh with their 2012 Frontiers of Knowledge Award.

To quote from the Web page for the award,

The BBVA Foundation Frontiers of Knowledge Award in the Information and Communication Technologies (ICT) category has been granted in this fifth edition to the electrical engineer Lotfi A. Zadeh, “for the invention and development of fuzzy logic.” This “revolutionary” breakthrough, affirms the jury in its citation, has enabled machines to work with imprecise concepts, in the same way humans do, and thus secure more efficient results more aligned with reality. In the last fifty years, this methodology has generated over 50,000 patents in Japan and the U.S. alone. 

The key paper, the one that started it all, was “Fuzzy Sets,” published by Dr. Zadeh in June 1965 in the journal “Information and Control.” You can read the paper here as a PDF. I would not call it light reading.
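The core idea of that 1965 paper, graded membership, is simple enough to sketch in a few lines of code. In a fuzzy set, an element belongs to a degree between 0 and 1 rather than strictly in or out, and Zadeh defined complement, intersection and union over those degrees. The “tall” set and its ramp below are my own illustrative example, not taken from the paper:

```python
def tall(height_cm):
    """Membership degree in the fuzzy set 'tall' (a simple linear ramp).

    The 160-190 cm ramp is an arbitrary illustration, not from Zadeh.
    """
    if height_cm <= 160:
        return 0.0
    if height_cm >= 190:
        return 1.0
    return (height_cm - 160) / 30.0

# Zadeh's standard operations on membership degrees:
def fuzzy_not(mu):           # complement
    return 1.0 - mu

def fuzzy_and(mu_a, mu_b):   # intersection = minimum
    return min(mu_a, mu_b)

def fuzzy_or(mu_a, mu_b):    # union = maximum
    return max(mu_a, mu_b)

print(tall(175))             # 0.5 -- somewhat tall
print(fuzzy_not(tall(175)))  # 0.5 -- and equally somewhat not-tall
```

Note how a 175 cm person is simultaneously “tall” and “not tall” to degree 0.5 each; that graceful handling of borderline cases is exactly what classical two-valued logic cannot express.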

Congratulations, Dr. Zadeh, for your many contributions to computer science and software engineering – and to the modern world.


Big Data, by any other name, would smell as sweet

Modern companies thrive by harnessing and interpreting data. The more data we have, and the more we focus on analyzing it, the better we can make decisions. Data about our customers, data about purchasing patterns, data about network throughput, data in server logs, data in sales receipts. When we crunch our internal data, and cross-reference it against external data sources, we get goodness. That’s what Big Data is all about.

Data crunching and data correlation aren’t new, of course. That’s what business intelligence is all about. Spotting trends and making predictions is what business analysts have been doing for 40 years or more. From weather forecasters to the World Bank, from particle physicists to political pollsters, all that’s new is that our technology has gotten better. Our hardware, our software and our algorithms are a lot better.

Admittedly, some political pollsters in the recent United States presidential election didn’t seem to have better data analytics. That’s another story for another day.

Is “Big Data” the best term for talking about data acquisition and predictive analytics using Hadoop, Map/Reduce, Cassandra, Avro, HBase, NoSQL databases and so on? Maybe. Folks like Strata conference chair Edd Dumbill and TechCrunch editor Leena Rao think not.
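(For readers who haven’t met the programming model behind a couple of those names: Map/Reduce splits a job into a map phase that emits key/value pairs and a reduce phase that aggregates the values for each key. Here’s a toy, single-machine word count of my own devising; a real Hadoop job distributes these same two phases across a cluster.)

```python
from collections import defaultdict

def map_phase(documents):
    """Map step: emit a (word, 1) pair for every word in every document."""
    for doc in documents:
        for word in doc.lower().split():
            yield (word, 1)

def reduce_phase(pairs):
    """Reduce step: sum the values for each key, i.e. count each word."""
    counts = defaultdict(int)
    for word, n in pairs:
        counts[word] += n
    return dict(counts)

docs = ["big data is big", "data about data"]
print(reduce_phase(map_phase(docs)))
# {'big': 2, 'data': 3, 'is': 1, 'about': 1}
```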

Indeed, Rao suggests, “Let’s banish the term ‘big data’ with pivot, cloud and all the other meaningless buzzwords we have grown to hate.” She continues, “the term itself is outdated, and consists of an overly general set of words that don’t reflect what is actually happening now with data. It’s no longer about big data, it’s about what you can do with the data.”

Yes, “Big Data” is a fairly generic phrase, and our focus should rightfully be on benefits, not on the 1s and 0s themselves. However, the phrase neatly fronts a broad concept that plenty of people seem to understand very well, thank you very much. Language is a tool; if the phrase Big Data gets the job done, we’ll stick with it, both as a term to use in SD Times and as the name of our technical training conference focused on data acquisition, predictive analytics, etc., Big Data TechCon.

The name doesn’t matter. Big Data. Business Intelligence. Predictive Analytics. Decision Support. Whatever. What matters is that we’re doing it.