Net Neutrality Vs. Title II – They Aren’t the Same

Since Title II passed I've seen a lot of articles that either indicate buyer's remorse or come from people who were always against Title II and are now gloating that it will be overturned. For example, Wired ran an op-ed yesterday that used major points from Chairman Pai's dissent against using Title II. Title II is clearly a divisive issue; the guys over at KBMOD, where I also write, are completely divided over its supposed benefits. I sincerely hope that when we look back at this debate we see it as a confusing bit of history because nothing happened: the Internet didn't change and remained an open platform for everyone to use easily and equally.

Net Neutrality and Title II are not the same thing. Title II is an old law, originally written in 1934, meant to regulate a single monopoly in the hope of creating more competition. It wasn't successful, but the legacy of Title II played an important role in the creation and development of the Internet. Title II was the policy regime under which ARPANET was developed: whenever a scientist at MIT wanted to use a graphically powerful computer in Utah, Title II was in full effect on that data system. Furthermore, Title II was the law of the land for all of dial-up Internet, which was actually a very good thing. Local-loop unbundling meant that you could buy Internet service from a company other than your phone company. It is also likely, given how low the costs were, that these ISPs didn't have to pay many of the taxes the phone company did on the line you used to reach the Internet. We already know that Title II has fostered, and can foster, a culture of innovation.

Net Neutrality is different from Title II: it was the architectural approach the initial designers took when creating the Internet. There were a few key reasons for this: it was easier, it required less computing power, and the majority of the early pioneers believed in what became the Open Source movement. Early on it was the exception rather than the norm for scientists to patent their computer research. Most of these researchers were mathematicians and physicists who came out of military work (WWI and WWII and all), so between their academic culture and the secrecy the war effort required, patenting simply wasn't their habit.

Providing preferential treatment to one packet of data over another would have required tools that, in the 70s, would simply have prevented the data from arriving at its destination in a timely fashion. Remember, this was before the personal computer existed, when computing was done on mainframes and terminals (interestingly, we're going back to that a bit with the cloud). The routers would have had to be mainframes themselves to decode each packet and figure out what type of data it carried before sending it on to its next hop. This was seen as a waste of computing power as well as an invasion of privacy. The point of packets was to keep the data safe and secure as much as it was to maximize capacity on the lines connecting the computers.
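To make that contrast concrete, here's a toy sketch of my own (not any real router's code) showing the difference between a "dumb" router that forwards on the header alone and one that does deep packet inspection, opening the payload to decide how to treat it:

```python
# Toy illustration: neutral forwarding vs. deep packet inspection.
from dataclasses import dataclass

@dataclass
class Packet:
    dest: str       # routing header: all a neutral router needs to read
    payload: bytes  # application data: opaque to a neutral router

def dumb_forward(packet: Packet) -> str:
    # A neutral ("dumb pipe") router reads only the header.
    return packet.dest

def deep_packet_inspect(packet: Packet) -> str:
    # Prioritizing one kind of traffic means opening the payload,
    # which costs extra computing power and exposes the contents.
    if b"video" in packet.payload:
        return "deprioritize"
    return "forward"

p = Packet(dest="10.0.0.2", payload=b"video frame bytes")
print(dumb_forward(p))         # -> 10.0.0.2
print(deep_packet_inspect(p))  # -> deprioritize
```

The asymmetry is the whole point: the dumb path never touches the payload, so it is both cheaper and more private, exactly the trade-off the early designers made.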

One of the largest complaints about implementing Title II is that there's not enough economic evidence to support it. I believe that to be true to some extent; it's hard to forecast something while it's happening, especially since the FCC was unlikely to get legal access to the Netflix-Comcast/Verizon deals that were supposed to ensure equal (or maybe preferred) access to their lines. Netflix clearly showed that Comcast and Verizon were intentionally causing congestion they could easily resolve, and that they did resolve it immediately after they got paid. With Comcast and Verizon planning to foreclose the video streaming market in this fashion, in violation of the spirit of Net Neutrality, some sort of regulation was needed to prevent that foreclosure.

I would rather not have had any regulation go into effect at all. However, I believe the actions Comcast and Verizon are taking are anticompetitive and anti-consumer. Time Warner Cable reportedly makes a 97% margin on its broadband service, which isn't a surprise when you have a local monopoly or duopoly in broadband.

Could there have been a better way? Yes. The FCC could have taken action to force increased competition: something like setting a goal that every city in the US have no fewer than three broadband providers, and providing assistance to municipalities that wanted to build their own networks to meet that goal. Ironically, the one provision that would help with that, local-loop unbundling, was not included in the Title II rule. Unbundling would lower the cost for a new ISP to enter the market, since it wouldn't have to build its own network; having to build one is exactly what has slowed Google Fiber down so considerably.

The NSA, FBI, and Internet Security

Over the past few months we've learned a lot about how the US government looks at its own citizens, thanks to the actions of Edward Snowden. He's done us a great service by forcing a conversation the NSA and FBI didn't want us to have. The NSA recently lied to the Senate by claiming it never tracked US citizens through their cell phones. We would never have known about these activities if it weren't for Snowden.

Snowden used email to send information back and forth between himself and Glenn Greenwald. Since email sits in one of those fuzzy gray areas of the law around data retention and government access, this caused a bit of a problem. To make things more difficult, Snowden used an encrypted email service called Lavabit. Its encryption was strong enough that when the FBI requested data, they were confounded and essentially attempted to blackmail (legally, of course) the owner into handing over the encryption key. That would have effectively rendered the service worthless to its paying users, who were paying precisely to have their email secured from both public and private entities.

The more we hear about how the US government has been behaving toward Internet security, the more we learn that the NSA and other US agencies are doing their best to thwart it. They worked with NIST and weakened an encryption standard it developed. The problem with such backdoors is that if one is there for the "good guys" (whoever that might be), it's also there for the "bad guys" (whoever that might be). This isn't just about exotic encryption keys; it's about things we use every day without realizing it. Whenever we use a website with "https" in its address we are using a basic encryption protocol, SSL. Think about when you're banking: you see the https. Google now lets you use it when sending information to and from them. This encryption has also been broken by the NSA, and if our personal traffic can be broken by the NSA it can be broken by other people. Does this mean we're likely to see a rash of new fraud or theft cases? No, since it's been compromised for some time. But people know about it now, and that is cause for greater concern.
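For a sense of what that "https" layer involves on the client side, Python's standard library shows the default TLS posture: a properly configured client insists on verifying the server's certificate and hostname, which is precisely the kind of check a deliberately weakened stack would quietly skip. This is just an illustrative peek at stdlib defaults, not a claim about what specifically was broken:

```python
# Inspect the default client-side TLS (the "S" in https) settings
# provided by Python's standard library.
import ssl

ctx = ssl.create_default_context()

# Modern defaults require certificate verification and hostname checks;
# a backdoored implementation is one that skips checks like these.
print(ctx.check_hostname)                    # -> True
print(ctx.verify_mode == ssl.CERT_REQUIRED)  # -> True
```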

What can we do about this? First, look into more secure encryption methods; I wouldn't be surprised if Google and projects like HTTPS Everywhere change their algorithms as a result. Second, contact your representative and your senators. I'm lucky that my senator in Oregon (Ron Wyden) is very vocal; not everyone is, so please help inform your leaders. Third, buy from companies that you know haven't handed data over to the NSA, avoid Facebook and the like, and basically try to follow the great writing Sean did several months ago over on KBMOD. He nailed it then, and it's even more pressing now to keep up with security.

Stuxnet, Flame and security

First of all, I’d like to thank all my readers, I’ve had over 10,000 views in my first year of blogging. That’s amazing and is so many more views than I expected to ever have. Thank you for making it well worth my time to blog!

Recently a friend of mine asked me to comment on the latest cyber attack, Flame, uncovered by Kaspersky, a Russian security firm. It's still not entirely certain who unleashed the attack, but at the time I argued that it could have been Israel acting alone, as they have a very capable tech sector: they put out high quality software, they have security experts, and they host serious R&D from US companies like Microsoft and Intel.

Flame targeted Iranian computer systems, very much like Stuxnet did. At the time, it was unclear who released Stuxnet, which attacked Iranian centrifuges; it could easily have been Israel acting alone or with help from the US. Being a realist, I fully expected the US to be involved, though I did not expect Obama to have issued the order himself. Given that history, it is equally likely that Flame was initiated by the US as well.

Flame targeted data being sent over the Internet, such as PDF, Office and AutoCAD files, and according to Kaspersky did not actively attack anything the way Stuxnet did. That doesn't mean it isn't being used by a spy agency. It's also interesting that the infected computers were all outside the US, which suggests a US spy agency could be behind it, since those agencies are not usually allowed to spy on US citizens.

These two programs leave me with a great deal of concern, because "the Pentagon has concluded that computer sabotage coming from another country can constitute an act of war, a finding that for the first time opens the door for the U.S. to respond using traditional military force." Does this mean that if Iran responded with military force, our own Pentagon would have to concede they were justified? I don't think the Pentagon would admit it, but by its own standard it essentially already has.

Aside from the risk of war, this also gives a regime like Iran's greater leverage to argue for a more suppressed Internet. They can now claim, without any stretch, that they are doing it for national security; their centrifuges really have been attacked (Stuxnet) and their people really are being spied on (Flame). Other repressive regimes will likely use Flame as justification for cracking down on the Internet as well. There may also be repercussions for Microsoft, as Flame exploited a weakness in its automatic update mechanism.

This also raises the question of what other cyber programs Obama has given the OK to. As the most technically savvy president we've had since the rise of the Internet, I think he fully understands the choices he is making. With Bush it could be argued that he didn't really understand what he was approving, since he lacked an in-depth knowledge of how people use the Internet, how systems interact with technology, and how viruses like this could turn against their creators. Obama should know better: once in the wild, a worm can mutate in ways that turn it against the people who released it, and we could end up damaging ourselves.

I think these actions will weaken our position in any negotiations with Iran, and with other countries we have pushed toward a more open Internet. They could argue, perhaps rightly, that we only want the Internet open so it's easier for us to infiltrate.

I don't believe that's the reason. I believe the Internet is an amazing tool that has improved people's condition to at least some extent. It has allowed a freer flow of knowledge, but it can be used for wrong just as easily as any other medium or communication tool.

A bit remiss

Sorry, dear readers, I've been very bad about writing any blogs lately. As you all know, I've had some pretty big changes in the past two months: I moved back from the Netherlands to the US, did some consulting work, and just started a job at AMD. Consequently, I haven't been able to post as much as I have in the past.

Because of these changes I wasn't able to pay enough attention to the CISPA fiasco that just occurred in the US. This law is a terrible step in the direction of data tyranny, and I'm not even being hyperbolic about that. I wrote about the risks of having a voluntary data sharing program, and in my review of Consent of the Networked I discussed the different data and government regimes out in the "wild." These concerns are valid, and we need to be aware of what's going on. Now, I have to say we pretty much blew our collective Internet protest load with the SOPA/PIPA protests, which is actually a problem. I would hazard that in many ways CISPA is as bad as or worse than SOPA, yet I didn't see as much chatter about CISPA on reddit, Twitter, Google+ or Facebook as I did about SOPA.

I think there are a few reasons for this. First, the majority of people could clearly understand the risks associated with SOPA. Those risks are straightforward: they affect us tomorrow, not in some future time period, and in many ways SOPA-like actions can already happen today. That makes it extremely obvious why SOPA/PIPA were terrible laws that should be opposed at many levels. Second, with CISPA coming so quickly after the SOPA/PIPA protests, there was likely protest fatigue, or disbelief that another law as bad as or worse than SOPA could come through so quickly; given the language used at the time, SOPA would have "broken the Internet," so how could anything be worse? Third, there was more support from large companies for CISPA than for SOPA, and apparently that matters more than we realized. We were able to push Wikipedia, Facebook, and other large companies to protest SOPA; in this case Facebook and Microsoft supported the law, while Google sat on the sidelines saying nothing.

From this standpoint, people who weren't happy with CISPA but didn't understand its importance likely didn't do anything about it. When a fantastic website like Wikipedia blacks out in protest, though, it pushes people who are on the fence to actually act.

CISPA and SOPA are both bad, but in very different ways. CISPA is something of an abstraction of risk. Losing your privacy might not seem like a big deal when so many people already voluntarily give up so much information about themselves on Facebook and Twitter. The second abstraction is a lack of understanding of the impact of the data sharing: it's unclear what exactly the Feds would do with the data once they have it, and it's unclear how the data would be shared within the government. It is likely, though, that it would be shared throughout the government, including the military, which many privacy experts say essentially legalizes military spying on US civilians. The third problem is that many people feel that if you aren't doing anything wrong you have nothing to worry about. That's a fallacy: even people doing nothing wrong can get in trouble, and I've discussed cases where people were fired for posting drunken pictures on Facebook. Finally, this type of law represents the biggest big government we can imagine; there's no reason the government needs to know what we're doing in this level of detail.

It’s going to be a long and difficult fight to keep our internet free. However, it’s something that we must do and I believe we can do it. We will just need to keep vigilant and work together to ensure that our internet stays our internet.

Are we talking past each other with the net neutrality debate?

I started reading (yes, another book) "Internet Architecture and Innovation" on my flight to Portland Tuesday night. It's going to be a really interesting read, if you like the Internet, economics and innovation, of course. One of the first parts discusses the history of the Internet and a design principle called end-to-end. Complicating things, there are two meanings of the same principle. In one version, only the endpoints "talk" to each other and understand the information. This isn't literal; if I'm Skyping you, the data doesn't go straight from my PC to yours, it passes through many machines, but the idea is that only your PC and mine know we are Skyping. In the second version, some intermediaries might know that we are Skyping, through something called deep packet inspection, where a router is able to read the information it forwards. Both versions are still called end-to-end, which is obviously a problem.
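As a toy illustration of the strict reading (my own sketch, with a simple XOR standing in for real endpoint encryption), the relay in the middle just moves bytes it cannot interpret, while all the intelligence sits at the two ends:

```python
# Toy model of the strict end-to-end principle: the network's middle
# is a dumb byte mover; only the endpoints understand the data.

def endpoint_encode(message: str, key: int) -> bytes:
    # Stand-in for real endpoint encryption (XOR, purely for illustration).
    return bytes(b ^ key for b in message.encode())

def relay(data: bytes) -> bytes:
    # A neutral intermediary: forwards bytes untouched, learns nothing.
    return data

def endpoint_decode(data: bytes, key: int) -> str:
    return bytes(b ^ key for b in data).decode()

wire = relay(endpoint_encode("hello", key=42))
print(endpoint_decode(wire, key=42))  # -> hello
```

Under the second reading of end-to-end, `relay` would be allowed to examine `data` before forwarding it, and the two architectures diverge from there.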

Another easy example: one version would require equal upload and download speeds, the other doesn't. Say you upload a picture; in one version it would take you the same time to upload it as to download it back to your PC the next day. We know that's not how it works.

Until reading this book I really thought the Internet was truly designed in an equal and neutral manner. That isn't quite the case. The two design principles produce internets that look very different, and we would expect the network to evolve differently depending on which understanding was applied.

For consumers, the first option is obviously better: the network behind the Internet is neutral, a "dumb" pipe. Why is that better? Because no one can intercept your data, change the speed at which you receive your information, or cap your downloads. It's bad for network owners, though, because they can't charge for or filter specific content as easily; they become just a pipe that information flows through.

What drives this discussion is the difference in incentives and in the contexts in which the design rules are applied. Since the participants believe they are talking about the same thing, the disconnect breeds confusion. That leads to another obvious problem: our clueless elected officials. They don't understand how the Internet works at the simplest level, let alone the esoteric, minute differences in this argument. It's no wonder they have tried to make this topic go away with backdoor deals.

This has also led to confusion within the Internet community over how the telecoms can say the Internet wasn't developed as a neutral platform. In a way they are correct; in other ways they are wrong. It's a matter of what was being discriminated against. Before, it was upload versus download speeds; now it could be content. To them there's no difference. For us, it matters a whole lot more.