Some very cool Silverlight demos

These are referenced elsewhere (in the gallery on the Silverlight.net site), but here are a couple of the samples I find particularly interesting:

Definitely worth looking at, and seeing what is possible.

Business life lesson – Don’t let anyone steal your dream : Atlantic Canada’s Small Business Blog – IQI Strategic Management Inc.

This is an interesting post, and fits in well with other things which have been on my mind lately, and with things about which I have posted.

It occurs to me that over the years, I really have let the world steal my dreams. I think we all do this – we get so wrapped up in the day-to-day “operations” of life that we lose track of the grand visions. We also tend to be told that we need to think realistically, and be reasonable, and play it safe. We spend much of our lives being taught what is possible, and even worse, what is impossible. I think that is why so much advancement in science, arts, and other fields comes from the young, because they have not yet learned that what they are trying to do is “impossible”. 

One of the nice things about a grand vision is that you spend much less time worrying about whether it is possible or not, and more time just working towards it.

A Picture of the Multicore Crisis -> Moore’s Law and Software

I was reading A Picture of the Multicore Crisis, and got to thinking about something which has bothered me for a long time. The issue relates to Moore’s Law and the growth of processing capacity (whether through raw clock speed, the multicore approach, or magic and hamsters). Looking back over the last decade or so, we probably have something like 10-20 times the processing power we had then.

As both a producer and a user of server-side software, I have to wonder – why are my servers (document management, document production, and many others) not providing a corresponding increase in throughput? Why do many server systems maintain the same performance over time, or offer only marginal improvements?

(I leave aside client-side performance for now, because on the client side much of the improvement has shown up in different ways: new capabilities like multimedia, prettier graphics in the UI, and the ability to multitask and keep 10 different applications open at the same time.)

So, why are my servers not 10 times as fast as they were? I can think of a few reasons:

  1. As has been discussed in other places, the shift from clock-speed-driven improvements to a multicore approach has had an impact. Much software, especially older software, is not written in a way which takes advantage of multiple processors. And re-engineering this software to better use multiple processors is often non-trivial, especially when you have to worry about backwards compatibility, supporting a large number of customers, finding time to add the new features product management wants, and so on. Very few of us can afford to divert a significant group of development resources for an extended period of time, and it is frequently hard to justify from a business perspective.
  2. Even if your software is architected for multiple processors, often the algorithm is inherently “single threaded” in places, which throttles the whole process – this is Amdahl’s law in action (see the sketch after this list).
  3. Also, even if you are well architected for multiple processors, this does not come for free. The overhead introduced in coordinating the parallel work can easily consume a non-trivial portion of your processor gains.
  4. Even excluding the shift to multicore, much software has not kept up with the performance improvements provided through pure clock speed. There are a number of reasons for this:
    • We are frequently very feature driven. The desire to compete, expand and grow often leads us to add features to existing software at an alarming rate. While this is necessary from a business perspective, often the addition of these new features slows the software down faster than the hardware speeds it up. This is why I think it is very important to architect software so as to isolate “core” processing from “features” – that way, features can be removed from the configuration when not needed, and not allowed to impede performance (a sketch of this idea follows below). It is also why it is important, in each development cycle on a product, to verify that performance on the same hardware is at least as good as before.
    • Processing power is not the whole story (yeah, I know, we all know this). Much of our software is not entirely CPU bound. The bottlenecks are often elsewhere. Much of our processing, especially for large documents, is more bound by memory, disk speed, network speed, and dependencies on other systems. Given that, there is only a limited amount of benefit to be gained through pure processor speed.
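
To make point (2) concrete, here is a minimal sketch of Amdahl’s law, assuming a hypothetical workload split into a serial and a parallel fraction; the overhead term is my own stand-in for the coordination costs mentioned in point (3), not a measured model:

```python
# Amdahl's law: the serial fraction of a workload caps the speedup you can
# get from adding cores, and coordination overhead erodes it further.

def speedup(parallel_fraction, cores, overhead_per_core=0.0):
    serial = 1.0 - parallel_fraction
    coordination = overhead_per_core * cores  # hypothetical linear cost model
    return 1.0 / (serial + parallel_fraction / cores + coordination)

# Even with 90% of the work parallelizable, 8 cores give well under 8x:
print(round(speedup(0.90, 8), 2))          # ~4.71x in theory
print(round(speedup(0.90, 8, 0.005), 2))   # ~3.96x once coordination bites
```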

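And to make the feature-isolation idea under point (4) concrete, here is a minimal sketch of a config-driven pipeline; the stage names and config keys are entirely hypothetical. Disabled features are never wired into the pipeline, so they cost nothing at run time:

```python
# A config-driven pipeline: the core processing always runs, while optional
# feature stages are only attached when the deployment configuration asks
# for them. All names here are illustrative, not from any real product.

CONFIG = {"watermark": False, "audit_log": True}  # deployment-level toggles

def core_process(doc):
    return doc.upper()  # stand-in for the real "core" work

def watermark(doc):
    return doc + " [watermarked]"

def audit_log(doc):
    print("processed", len(doc), "chars")
    return doc

OPTIONAL_STAGES = [("watermark", watermark), ("audit_log", audit_log)]

def build_pipeline(config):
    # Disabled features are simply never added to the pipeline.
    stages = [core_process]
    stages += [fn for name, fn in OPTIONAL_STAGES if config.get(name)]
    return stages

def run(doc, config=CONFIG):
    for stage in build_pipeline(config):
        doc = stage(doc)
    return doc

print(run("quarterly report"))
```
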
Why No One Plays on CodePlex

In response to the post Microsoft’s Open Source Software is Junk? and the article which triggered it, I would like to offer a few comments:

  1. CodePlex is not “Microsoft’s Open Source Software”. CodePlex is a sandbox where others can create open source software based on the Microsoft platform(s). 
  2. To say there are no interesting projects on CodePlex is something of an exaggeration. To say that it is “all junk” is just a sensationalistic headline trying to suck in readers. That said, much of the more interesting stuff actually comes from Microsoft’s Patterns and Practices group (such as the Enterprise Library), and so open source zealots do not recognize it. I would agree that there are few, if any, mature projects there which did not originate inside Microsoft.
  3. Do people not enjoy developing in .NET? Well, given the number of people using it, I would say many do enjoy developing using .NET (and no, they are not all Microsoft cronies, and they are not all forced to by evil, imperialistic employers). 
  4. While there are few mature .NET projects on CodePlex, that does NOT mean open source projects based on .NET do not exist. Look around SourceForge. There are a lot of successful, valuable projects there based on the .NET platform.
  5. Look at the profile of the typical open source developer. Typically, they are coming from a very anti-Microsoft state of mind. Given that, they are not likely to develop their great idea on the .NET platform (even if it would be an ideal platform for it – they are making emotional and philosophical decisions, not technical ones). Even if they can bring themselves to use a Microsoft platform, they are definitely not going to host that project on a Microsoft-controlled site, where the evil empire could steal their radically brilliant work.
  6. Until relatively recently, there were no Microsoft-supplied free tools to develop on .NET (there have been a couple of open source tools, such as #develop, which is of course hosted on SourceForge). Open source developers are even less likely to pay Microsoft for the privilege of developing on .NET.
  7. Look at the life cycle of “successful” open source projects. Apache and Linux have been around for a very long time. Of course they are going to be much more mature than anything on the .NET side (though I am not sure many open source projects in any context will have the level of success these have had). Firefox hardly started from scratch, but from a large pre-existing code base – a significant advantage. If Microsoft were to open source IE, you might see a big jump in open source browser development on top of it (though I doubt it, given point (5)).
  8. The existence of successful open source projects (again, leaving aside Apache and Linux) is largely a by-product of having lots of open source projects. It is like ideas: the more of them you have, the more likely it is that some are good. There are not enough open source projects on .NET to have that “critical mass”, and given point (5), there may never be.

Is Vista as bad as they say?

Over the last few months (or the last year or more), it has become extremely fashionable to beat up on Vista. Heck, it is a great way to generate hits on your site or blog, maybe get Dugg, whether you have anything useful to say or not. I am talking about posts like this, or this, or this whole blog.

Personally, I run Vista on several machines, and have had few problems which were not related to the failure of third parties to provide updated drivers or updated versions of their software for Vista (it sometimes makes me wonder if there has been a conspiracy on the part of other vendors to purposely sabotage Vista – but more likely they just have not bothered to provide what their customers paid for). I also still run XP on a couple of boxes, and Win2K3. On my main development box, I also run a number of operating systems in VMware, including WinXP, Win2K3, Fedora, Ubuntu, and several “minimalist” Linux distros for playing around with.

An unfortunate fact of life is that all operating systems available right now suck, at least in one aspect or another. Linux suffers from many driver limitations (though this is getting better), and from a wannabe user interface that spends far too much time trying to look like Windows while missing the point of usability altogether. Windows (all versions) suffers from security issues, and from performance and stability issues inherent in trying to be all things to all people. I will not comment on Mac OS X, because I have not run it. It is also somewhat irrelevant, since I cannot run it unless I buy Apple’s hardware.

Vista has its own usability issues, and some that are pointed out are valid. The UAC implementation is moronic. The UI path you have to follow to connect to a wireless network is annoying. Here is one I discovered today – disk defragmentation. When you defragment your hard drive, you get this useful dialog:

[Screenshot: the Vista Disk Defragmenter dialog]

Isn’t that helpful? No progress indication. No estimated time to completion. Just a statement that it could take anywhere from a few minutes to a few hours. Gee, thanks.

The problem is, this kind of thing is not unique to Vista, or to Windows in general. It is pervasive in all operating systems, and in almost all software written to run on them. Most software is filled with minor usability gaps like this.

So stop beating up on Vista (unless you need the traffic), and start thinking about how to make the whole situation better.

VentureBeat » Google continues its assault on Microsoft, offering StarOffice suite

Two thoughts spring to mind:

  1. If StarOffice cannot compete successfully against MS Office, does it matter that Google is bundling it?
  2. If Google had faith in its web-based office applications, and in the vision of all apps as web apps, why is it bundling a desktop-based office suite?

Apple’s Mac Set to Soar?

I am always amazed (and somewhat amused) to listen to the press and many bloggers pound on Microsoft, and hold up Apple as this golden idol of alternatives. Don’t get me wrong, I love Macs – I have ever since I started using and programming them back in the late 80s. I even liked the Newton. And the new iMacs – damn I want one.

But there are a few points of the “Microsoft is evil / Apple is great” discussion that I find deeply amusing and ironic:

  1. Apple, with Steve Jobs, handed the desktop market to Microsoft on a platter. The Mac UI in the early eighties was way beyond anything Microsoft would produce until Windows 95. With that lead, Apple could have owned the desktop. However, through the closed, anti-clone, “we must maintain the purity of the platform” view they held through the eighties, they gave that advantage away. Even though DOS was crap in terms of usability, and Windows was graphical crap, the PC won out because of the availability of cheap clones and many, many hardware choices. Once again, inferior technology won because the people behind the better technology acted stupidly. (Note that Steve Jobs continued this stupidity, with more great technology, at NeXT.)
  2. Apple has always been the ultimate “closed platform”. Standards rarely come into play. If you want to develop on the Mac (at least anything useful), you use their tools. Until recently, even all of the hardware was non-standard. If Microsoft were anywhere near as closed as Apple, the Justice Department would have shut them down. Heck, on many Apple devices, you are not even allowed to change your own battery, or add an industry-standard memory card.
  3. Apple has rarely created technology which benefited (in a tech community sense) anyone but Apple. Consider Microsoft’s Tablet PC platform. Microsoft could have “gone it alone” on the Tablet, as Apple would have (and probably will). Instead, Microsoft defined the specification for a Tablet PC, and left it to hardware vendors and startups to build the hardware, and to ISVs to build the applications, thus creating a sub-industry benefiting many businesses beyond Microsoft. Compare that to Apple and the launch of the iPhone.

Again, I love Apple, and I think they have some of the best design people in the world. But I do not fool myself into believing that they are in business for anyone’s benefit but their own.

But I want to be Disruptive!

I have spent a great deal of time over the last couple of years thinking about the process of innovation, different types of innovation, and how to innovate in a small but established organization versus a startup organization. I was reading Innovator’s Dilemmas: Do You Really Need To Be Disruptive? over on consultaglobal this weekend, and got to comparing some of Jose’s thoughts with work I have done in the last year.

As Jose says in that post, he is more interested in the process of defining a product roadmap in terms of gradual innovation, and in managing product portfolios. We have been very successful with this type of innovation, having a strong product management process for our existing product suite. In my role, I have been more interested in how we do larger scale innovation – how do we come up with the innovations now which are going to drive our growth 2+ years from now?

I have defined an innovation cycle as shown below.

[Diagram: the innovation cycle – four quadrants (Ideas, Play, Build, Evolve), traveled counter-clockwise starting from the upper right]

Recognizing that disruptive innovation is, well, disruptive: as this cycle is traveled counter-clockwise starting from the upper right, we move from a high-chaos, low-process environment to progressively higher process and lower chaos.

In this model, the lower right quadrant represents what we are really good at: evolutionary innovation driven by product management. The upper right quadrant represents the starting point – the idea generation engine. This is traditionally a hit-and-miss process of collecting ideas from various parts of the organization (or from just a few people), and trying to pick which ones to invest time and money in. It is my belief that this activity can be wrapped in a process without destroying the creativity needed to really come up with ideas. Among the activities I consider important in this quadrant are:

  • Establish some context for innovation (see this earlier post)
  • Get ideas from everybody, not just R&D or Product Management
  • Get out and talk to customers
  • Involve your staff who are in front of customers, especially professional services people if you have them
  • Engage in structured/facilitated brainstorming with groups from various cross-sections of your company
  • Know how you are going to evaluate ideas and decide which ones to investigate more deeply

The last point is important – it is no use having lots of ideas if you have no way to evaluate them. No organization can go deep on all the ideas generated, and a small organization can only really attack a couple. See this earlier post for my thoughts on using the Needs, Approach, Benefits, Competition (NABC) approach. At the end of this stage, each surviving idea should have a reasonable Needs definition, with a rough indication of the other three categories.

The next quadrant is what I have called Play. This is where we take the ideas which survive the evaluation in the Ideas stage and start to play with them: flesh them out, create prototypes, and generally move the NABC definition forward. Early in this phase, the Approach needs to be clarified, while the Needs are evaluated more deeply. Later in this stage, if a viable Approach is identified, and the Needs continue to make sense, then the Benefits and Competition need to be addressed (note that in reality it is never anywhere near this linear, but this is for the benefit of description). By the end of this stage, we should be able to present a fairly strong value proposition for those ideas which have survived the process.

The next stage is to Build the products (ok, probably only one) for which the value proposition seems best. I will not get into the build process, except to say that the NABC analysis should be kept at the forefront throughout, and that we should not be afraid to make hard decisions if things stop making sense.

The final stage is the Evolution stage, where the product moves into the incremental, evolutionary development cycle of a completed product. Note that for a new product, there may be some iteration between Build and Evolve.

Finally, the cycle is closed by having ideas from ongoing product evolution feed back into the Ideas stage.

So, is it ever this neat and clean and linear? Well, no. But that does not mean it is not valuable to have a model which you at least pretend you are following!