Innovation and Lean

I’ve been doing a lot of reading around Disruption Theory, Lean product development, and Lean Startup theory, including the application of all three. One of the more interesting aspects of Disruption Theory is that people hire products to do specific jobs. This is fairly similar to the question that Lean Startup and Lean product development ask: what problem are you trying to solve? This hired-to-do-a-job approach seems to impose some limitations if those jobs aren’t continually re-evaluated. For example, in The Innovator’s Solution the authors talk about RIM and its BlackBerry device. The book was written in the early 2000s, when there was still an actual debate about whether the device should include cellular service. They argue that the job the BlackBerry is hired to do is to provide short periods of distraction to business people.

The risk of not re-evaluating these jobs on a frequent basis, such as quarterly or annually, is missing out on something like the iPhone. What I found interesting in this section of the book is that the authors argue against cramming every type of feature into a single device. I think that, at the time, this was sound advice given the limitations of the technology, the costs of doing so, and the immaturity of the markets.

I believe this is where the Lean Startup approach would really help. The Innovator’s Solution essentially argues for a minimum viable product for a given job. Afterwards, by collecting data on how users actually use the product, the team can learn in which direction the product should mature. By engaging continually with customers and asking the right questions, it’s possible to determine if and when the job the product is hired to do starts to change over time.

For example, iTunes was originally designed to be a lightweight piece of music-playing software. The job was to play music. Over time, because of the goal of moving upmarket and capturing other markets, Apple added new features, changing the jobs that iTunes was capable of fulfilling. In some cases this led to clearly overserving customers, and iTunes has since been accused of becoming bloatware. With the right metrics, Apple would know whether it was losing market share or whether its market share was being artificially maintained by the iPhone and iPad. This means the music-playing space is clearly ripe for disruption: the most popular product overserves most of the market and causes an excess performance drain on the systems running it. This is why, despite iTunes’s popularity, services like Last.fm, Pandora, and Google Music are so popular. They are meeting the market where it is and where it is moving.

Over the next few weeks I plan to explore these theories and techniques in more detail. I plan to work towards something of a unifying theory, then attempt to deploy it in a startup of my own and write a book about the process. I have no idea what startup I plan to start, but that’ll be half the fun! As a writer for KBMOD, I also plan to work with the leaders of that team and deploy these theories with them, hopefully with positive results for those guys.

Review: The Innovators

Written by Steve Jobs biographer Walter Isaacson, this book takes a look at how we got to now in the computing world. Starting with Ada Lovelace and ending with the pair of Sergey Brin and Larry Page, Isaacson covers dozens of the people who have enabled our world to be what it is today. He came to a counterintuitive result through his research: the lone inventor didn’t exist. He found that the most successful companies and organizations had a mixture of personalities and passions that led to success. His history is full of the men and women who made computing happen. Interestingly, it wasn’t until the ’70s and ’80s that women stopped being prominent in the history of computing. This is unfortunate, as many of their contributions are still crucial to software development today. COBOL, subroutines, and most of the rules around software development were developed by women working either in the Navy or at institutions that partnered with the military.

Much of this book was not new to me, as I had already read histories of Bell Labs and Xerox PARC; still, a great deal of it was news to me. Isaacson doesn’t focus entirely on the US, even though a great deal of the history of computing took place here. He also explores the contributions of the Germans, the British, and, to a lesser extent, the Chinese.

If you’re a fan of computers and like the history of technology, this book is definitely for you. Isaacson discusses the cultures of the organizations that allowed technology to flourish (Intel) and the types of environments where it did not (Shockley Semiconductor). He shows how no company created anything in a vacuum and how many ideas were, in fact, independently co-created.

Technology is a messy, collaborative endeavor that could have turned out very differently with just a few changes. Without collaboration, risk taking, and some big personalities, we wouldn’t have the computers we have today.

Net Neutrality Vs. Title II – They Aren’t the Same

Since Title II passed I’ve seen a lot of articles that either express buyer’s remorse or come from people who were always against Title II and are gloating that it’s going to be overturned. For example, Wired ran an op-ed yesterday that used major points from Commissioner Pai’s dissent against using Title II. Title II is clearly a divisive issue; the guys over at KBMOD, where I also write, are completely divided over its supposed benefits. I sincerely hope that when we look back at this debate we see it as a confusing bit of history, because nothing happened: the Internet didn’t change and remained an open platform for everyone to use easily and equally.

Net Neutrality and Title II are not the same thing. Title II is an old law, originally written in 1934 to regulate a single monopoly in the hope of creating more competition. It wasn’t successful, but the legacy of Title II played an important role in the creation and development of the Internet. Title II was the policy regime under which ARPANET was developed. Whenever a scientist at MIT wanted to use a graphically powerful computer in Utah, Title II was in full effect on that data connection. Furthermore, Title II was the law of the land for all of dial-up Internet, which was actually a very good thing. The fact that there was local-loop unbundling meant that you could buy Internet service from a company other than your phone company. It was also likely, given how low the costs were, that these ISPs didn’t have to pay many of the taxes that the phone company, whose lines you used to reach the Internet, did. We already know that Title II can foster, and has fostered, a culture of innovation.

Net Neutrality is different from Title II: it was the architectural approach the initial designers took in creating the Internet. There were a few key reasons for this: it was easier, it required less computing power, and the majority of the early pioneers believed in what became the Open Source movement. Early on, it was the exception rather than the norm for scientists to patent their computer research. That’s likely because most of these researchers were mathematicians and physicists who came from a military background (WWI and WWII and all); they weren’t used to patenting, given their educational background and the secrecy required of work contributing to the war effort.

In the ’70s, providing preferential treatment to one packet of data over another would have required tools that simply prevented the data from arriving at its destination in a timely fashion. Remember, this was a time when the personal computer didn’t exist and computing was done on mainframes and terminals (interestingly, we’re going back to that a bit with the cloud). The routers would have had to be mainframes themselves to decode the data and figure out what type of data it was before sending it to its next location. This was seen as a waste of computing power as well as an invasion of privacy. The point of packets was to help keep the data safe and secure as much as to maximize capacity on the lines connecting the computers.
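To make the computational point concrete, here is a minimal, purely illustrative sketch in Python. The packet structure, field names, and classification rules are all invented for illustration; real routers obviously don’t work this way. The point is only that a neutral router can forward on the header alone, while a prioritizing router has to decode and classify the payload of every packet before deciding what to do with it.

```python
# Purely illustrative sketch (hypothetical packet format and rules, not any
# real router's logic): contrasting neutral forwarding with preferential
# treatment of packets.

def forward_neutral(packet):
    """Neutral routing: read only the header's destination and pass the
    packet along. The payload is never examined."""
    return packet["header"]["destination"]

def forward_with_priority(packet):
    """Preferential routing: the payload must be decoded and classified
    before a priority can be assigned -- extra work on every packet, and an
    inspection of the user's data."""
    payload = packet["payload"].decode("utf-8", errors="ignore")
    if payload.startswith("VIDEO"):      # invented classification rules
        priority = 0                     # highest priority
    elif payload.startswith("EMAIL"):
        priority = 1
    else:
        priority = 2
    return packet["header"]["destination"], priority

# Example usage with a made-up packet.
packet = {
    "header": {"source": "mit.edu", "destination": "utah.edu"},
    "payload": b"EMAIL: meeting notes ...",
}

print(forward_neutral(packet))        # utah.edu
print(forward_with_priority(packet))  # ('utah.edu', 1)
```

Even in this toy version, the prioritizing path does strictly more work per packet; scale that up to 1970s hardware and the designers’ choice to stay neutral looks less like ideology and more like engineering necessity.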

One of the largest complaints about implementing Title II is that there’s not enough economic evidence to support it. I believe that to be true to some extent; it’s hard to forecast something while it’s happening, especially since the FCC was unlikely to get legal access to the Netflix-Comcast/Verizon deals to verify equal (or perhaps preferred) access to their lines. Netflix clearly showed that Comcast and Verizon were intentionally allowing issues they could easily have resolved, and they resolved them immediately after they got paid. With Comcast and Verizon planning to foreclose the video streaming market in this fashion, violating the spirit of Net Neutrality, some sort of regulation was needed to prevent that foreclosure.

I would rather not have had any sort of regulation go into effect. However, I believe that the actions Comcast and Verizon are taking are anticompetitive and anti-consumer. Time Warner Cable supposedly makes a 97% profit margin on its broadband service, which isn’t a surprise when you have a local monopoly or duopoly in broadband.

Could there have been a better way? Yes. The FCC could have taken action to force increased competition, such as setting a goal for every city in the US to have no fewer than three broadband providers and providing assistance to municipalities that wanted to build their own networks to meet that goal. Ironically, the one provision that would help with that, local-loop unbundling, was not included in the Title II rule. Unbundling would reduce the cost for a new ISP to enter the market because it wouldn’t have to build its own network; having to build a network is exactly what has slowed Google Fiber down so considerably.

New FCC Rules and Competition

A friend retweeted the tweet below today, and it got me thinking about the broader context of the FCC rules that passed last Thursday.

Two things struck me about this tweet. First, it’s disappointing that the author doesn’t understand Title II better, considering he co-founded the EFF. Second, Title II as implemented was designed to do nothing about ISP competition. As I wrote on KBMOD this week, the Net Neutrality order has no provision for “unbundling,” which would promote competition amongst ISPs at the local level. Unbundling, according to Wikipedia, is a regulation that requires existing line owners (such as Comcast) to open up their lines to anyone that wants to sell cable, Internet, or telephony access. Unbundling, under a much more restrictive Title II, is the only reason that AOL was successful as a business model. Since this provision of Title II was forborne, these rules will not, in fact, promote competition among ISPs at all.

Instead, the FCC, at least in my opinion, looked at the Internet as a general-purpose platform technology. They were looking to ensure competition ON the technology, not between technology carriers. For example, the FCC wants to see as much competition as possible between companies like Netflix, Amazon Prime Video, Hulu, and Comcast’s Xfinity service. However, they want to make sure that Comcast cannot foreclose the video delivery market by leveraging its existing monopoly in telecommunications. That means preventing Comcast from creating rules or an environment in which Netflix cannot compete and Comcast customers MUST use the Xfinity service because the alternatives don’t function well (foreclosure is what got Microsoft in trouble over web browsers).

The FCC did enact a rule that will impact competition at the local level, though. It’s a limited rule because it affects only Tennessee and North Carolina: it preempts state law by stating that it is legal for municipalities to develop their own broadband networks. Broadband build-out is prohibitively expensive for an entrepreneur setting up a network alone; however, with the backing of a municipality willing to share the risk and the reward, it might be possible to build out a broadband network on a limited scale. Municipalities aren’t the ideal solution to this; it would be far preferable for other businesses to move into these areas and build new broadband networks, but unless they have a massive amount of money, like Google, that’s unlikely to happen. A bridge between the two is a public-private partnership, in which a private enterprise with the telecommunications expertise partners with a municipality that has the demand and the financial support to build a network.

With the ruling on municipal broadband being so limited, it’s not going to make much of an initial impact; however, it’s likely that other municipalities will try to jump on that bandwagon and overrule laws at the state level (as a note, I’m not going to argue whether they have the authority to do this; I’m just looking at the potential impact of the rule).

Innovation Isn’t Just a Buzzword

It’s rather unfortunate that everything has to be an innovation these days. Even worse, it seems that for a business to be effective it must drive disruptive innovations. Innovations are simply inventions that have been successful in the market; those inventions might actually be business model changes that have succeeded in penetrating the market. I personally find using innovation as a framework to analyze business pressures extremely interesting. I did this today in an interview, and it felt really good to be able to create context around the changes impacting the health insurance industry.

Several years ago I wrote about the four types of innovation: Incremental, Modular, Architectural, and Radical. This is a bit different from the framework Christensen argues for, since he only looks at two types, Incremental and Disruptive; I believe that Disruptive encapsulates both Architectural and Radical. Architectural innovations are business model changes around a very similar product, one that significantly undercuts existing businesses or creates a new market. Radical innovations create a new market but also attack existing customers through a new business model and a completely different type of product. Think of a fan competing against a window air conditioner unit (radical), while central air is an architectural change relative to the window unit.

[Figure: the four types of innovation]

I believe that this framework is just as useful for businesses analyzing their environment as Porter’s Five Forces, because it forces businesses to confront disruptive innovations they might otherwise have overlooked. Without this framework, businesses are likely to ignore the new-entrants force because they don’t believe those companies will ever compete with them. However, historical evidence shows that entrants with different business models, or different metrics for their performance, eventually supplant incumbents. I believe this type of analysis should be conducted annually or biannually, as many industries and markets face continually increasing uncertainty and faster rates of change than in the past.