Unity and Enshittification

Last week, I wrote about Enshittification, which is a great lead-in for what just happened with Unity. Unity is a game development engine. It’s a very popular platform, and a number of universities use it as part of their game development degrees. Popular YouTuber Mark Brown, of Game Maker’s Toolkit, uses Unity in his series exploring novice game development.

Which is rather unfortunate, because Unity has decided to retroactively claw back value from game developers. On September 12th they announced a Runtime Fee charged for every new install of a game. It might not sound like a lot, $0.20 (at the lowest end) per install; however, if your game is priced low because it’s a small game, that fee could be a fifth of your revenue. Add in Steam’s 30% cut and half of your revenue is already chewed up in fees. The other challenging part of this fee is that it would apply to every computer an end user installs the game on.

I have a Steam Deck, a laptop, and a gaming PC. If I installed a $0.99 game on all three systems, then after Steam’s cut and three install fees, the company that developed the game would keep less than a dime of my purchase. That’s untenable for game developers; it’s not a survivable product strategy. Devs and publishers would need to charge at least $2.00 for every game to ensure they made money, which might kill the viability of that game.
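To make that arithmetic concrete, here’s a quick back-of-the-envelope sketch in Python, just for illustration. The $0.20 figure is the lowest announced Runtime Fee tier and 30% is Steam’s standard cut; the three installs are just my own setup from above.

```python
# Revenue math for a $0.99 game under Unity's Runtime Fee.
# Assumes the lowest announced fee tier ($0.20/install) and
# Steam's standard 30% revenue share.

PRICE = 0.99
STEAM_CUT = 0.30               # Steam's standard revenue share
UNITY_FEE_PER_INSTALL = 0.20   # lowest announced Runtime Fee tier

def developer_take(price: float, installs: int) -> float:
    """Net revenue after Steam's cut and Unity's per-install fee."""
    after_steam = price * (1 - STEAM_CUT)
    return after_steam - installs * UNITY_FEE_PER_INSTALL

for installs in (1, 2, 3):
    print(f"{installs} install(s): developer keeps "
          f"${developer_take(PRICE, installs):.2f}")

# Output:
# 1 install(s): developer keeps $0.49
# 2 install(s): developer keeps $0.29
# 3 install(s): developer keeps $0.09
```

Even at the lowest fee tier, a few installs of a cheap game eat almost the entire margin.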

In a related cash grab, eX-Twitter is contemplating charging a fee for every user on the platform. No matter how much or how little you use the site, there’s now a subscription price. The benefit of that product was always the free association and random interactions on the platform. Another way it’s been enshittified has been the significant growth of Nazi users.

Companies that dramatically change their terms of use will drive customers away, along with the companies already building on the platform and those that might have. The developers of Terraria, who didn’t even use the engine, have elected to fund two open source game engines as a result.

I think it’s important for technology companies to take a long look at what users are saying in response to Twitter, Unity, Google (and its Google Graveyard), and the telecom companies. End users, the lifeblood of any platform, avoid these companies and their products as much as they can. They do not want to be exploited. It’s toxic.

The executive class does like these schemes, as they push up the “value” of their products on the stock market. However, this is ultimately at the expense of long-term revenue and value to customers. Working with your customers, both immediate and end customers, will drive value for your business in the long run. Trust lost is value lost. Don’t enshittify your products.

Net Neutrality Vs. Title II – They Aren’t the Same

Since Title II passed, I’ve seen a lot of articles that either indicate buyer’s remorse or have always been against Title II and are gloating that it’s going to be overturned. For example, Wired had an op-ed yesterday that used major points from Commissioner Pai’s dissent against using Title II. Title II is clearly a divisive issue, as the guys over at KBMOD, where I also write, are completely divided over its supposed benefits. I sincerely hope that when we look back at this debate, we see it as a confusing bit of history because nothing happened: the Internet didn’t change and remained an open platform for everyone to use easily and equally.

Net Neutrality and Title II are not the same thing. Title II is an old law, originally written in 1934 to regulate a single monopoly in the hope of creating more competition. It wasn’t successful, but the legacy of Title II played an important role in the creation and development of the Internet. Title II was the policy regime under which ARPANET was developed: whenever a scientist at MIT wanted to use a graphically powerful computer in Utah, Title II was in full effect on that data connection. Furthermore, Title II was the law of the land for all of dial-up Internet, which was actually a very good thing. Local-loop unbundling meant that you could buy Internet service from someone other than your phone company. It was also likely, given how low their costs were, that these ISPs didn’t have to pay many of the taxes that the phone company, whose line you used to reach them, did. We already know that Title II has fostered, and can foster, a culture of innovation.

Net Neutrality is different from Title II: it was the architectural approach the initial designers took when creating the Internet. There were a few key reasons for this: it was easier, it required less computing power, and the majority of the early pioneers believed in what became the Open Source movement. Early on, it was the exception rather than the norm for scientists to patent their computer research. Most of these researchers were mathematicians and physicists who came out of a military background (WWI and WWII and all), so between their academic training and the secrecy required for the war effort, patenting simply wasn’t part of their culture.

To provide preferential treatment to one packet of data over another would have required tools that, in the 70s, would simply have prevented the data from arriving at its destination in a timely fashion. Remember, this was a time when the personal computer didn’t exist and computing used mainframes and terminals to do the work (interestingly, we’re going back to that a bit with the cloud). The routers would have had to be mainframes themselves to decode the data and figure out what type of data it was before sending it to its next location. This was seen as a waste of computing power as well as an invasion of privacy. The point of packets was to help keep the data safe and secure as much as to maximize capacity on the lines connecting the computers.
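To illustrate the difference (this is a toy sketch, not how any real router, then or now, is implemented): a neutral router only ever reads the header, while preferential treatment means opening up the payload to guess what kind of traffic it is. The packet layout and the “classify” logic here are invented for the example.

```python
# Illustrative sketch: neutral forwarding vs. payload inspection.
# The Packet structure and classification rule are made up for
# this example; real routers and real DPI are far more involved.

from dataclasses import dataclass

@dataclass
class Packet:
    destination: str   # header field: where the packet is going
    payload: bytes     # application data: opaque to a neutral router

def forward_neutral(packet: Packet) -> str:
    # A neutral router reads only the header and moves on.
    return f"send to {packet.destination}"

def forward_with_inspection(packet: Packet) -> str:
    # Preferential treatment means decoding the payload to guess
    # what kind of data it is -- extra work, and a privacy problem.
    priority = "high" if b"video" in packet.payload else "low"
    return f"send to {packet.destination} at {priority} priority"

pkt = Packet(destination="utah-mainframe", payload=b"video frame data")
print(forward_neutral(pkt))           # header only
print(forward_with_inspection(pkt))   # has to open the payload
```

The neutral path does strictly less work per packet, which is exactly why it was the only practical choice on 1970s hardware.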

One of the largest complaints about implementing Title II is that there’s not enough economic evidence to support it. I believe that to be true to some extent; it’s hard to forecast something that’s happening as it’s happening, especially since the FCC was unlikely to get legal access to the Netflix-Comcast/Verizon deals that were meant to ensure equal (or maybe preferred) access to their lines. Netflix clearly showed that Comcast and Verizon were intentionally causing issues they could easily resolve, and resolved immediately after they got paid. With Comcast and Verizon planning to foreclose the video streaming market in this fashion, violating the spirit of Net Neutrality, some sort of regulation was needed to prevent that foreclosure.

I would rather not have had any sort of regulation go into effect. However, I believe that the actions Comcast and Verizon are taking are anticompetitive and anti-consumer. Time Warner Cable supposedly makes 97% profit on its broadband service, which isn’t a surprise when you have a local monopoly or duopoly on broadband.

Could there have been a better way? Yes, the FCC could have taken action to force increased competition: something like setting a goal for every city in the US to have no fewer than three broadband providers, and providing assistance to municipalities that wanted to build their own networks to meet that goal. Ironically, the one provision that would help with that, local-loop unbundling, was not included in the Title II rule. Unbundling would reduce the cost for a new ISP entering the market, since it wouldn’t have to build its own network, and building a network from scratch is exactly what has slowed Google Fiber down considerably.

New FCC Rules and competition

A friend retweeted the tweet below today, and it got me thinking about the broader context of the FCC rules that passed last Thursday.

Two things struck me about this tweet. First, it’s disappointing that the author doesn’t understand Title II better, considering he co-founded the EFF. Second, Title II as implemented was designed to do nothing about ISP competition. As I wrote on KBMOD this week, the Net Neutrality order has no provision for “unbundling,” which would promote competition amongst ISPs at the local level. Unbundling, according to Wikipedia, is a regulation that requires existing line owners (such as Comcast) to open up their lines to anyone that wants to sell cable, Internet, or telephony access. Unbundling, under a much more restrictive Title II, is the only reason that AOL was successful as a business model. Since this provision of Title II was forborne, Title II will not, in fact, promote competition among ISPs at all.

Instead, the FCC, at least in my opinion, looked at the Internet as a general purpose platform technology. They were looking to ensure competition ON the technology, not between technology carriers. For example, the FCC wants to see as much competition as possible between companies like Netflix, Amazon Prime Video, Hulu, and Comcast’s Xfinity service. However, they want to make sure that Comcast cannot foreclose the video delivery market by leveraging its existing monopoly in telecommunications. Foreclosure would mean Comcast creating rules, or an environment, where Netflix cannot compete and Comcast customers MUST use the Xfinity service because the alternatives don’t function well (foreclosure is the thing that got Microsoft in trouble with web browsers).

The FCC did enact a rule that will impact competition at the local level, though. It’s a limited rule, because it impacts only Tennessee and North Carolina: it preempts state law by declaring it legal for municipalities in those states to develop their own broadband networks. Broadband build-out is prohibitively expensive for an entrepreneur setting up a network; however, with the backing of a municipality willing to share the risk and the reward, it might be possible for an entrepreneur to build out a broadband network on a limited scale. Municipalities aren’t the ideal solution. It would be significantly preferable for other businesses to move into an area and build new broadband networks, but unless they have a massive amount of money, like Google, that’s unlikely to happen. A bridge between the two is a public-private partnership, where private enterprise, which has the telecommunications expertise, partners with a municipality, which has the demand and financial support, to build a network.

With the ruling on municipal broadband being so limited, it’s not going to make much of an initial impact; however, it’s likely that other municipalities will try to jump on the bandwagon and overrule laws at the state level. (As a note, I’m not going to argue whether they have the authority to do this; I’m just looking at the potential impact of the rule.)

Rules Change for Grants to Build Out Networks

Recently there has been a serious debate between the FCC and the major telecoms about the minimum speed that qualifies as broadband. It’s pretty obvious that there’s a strong disagreement between most customers and their ISPs. For the most part, rural ISPs are pretty terrible; if you live outside of a major city, it’s unlikely that you’ll have very fast Internet service. For a country of our size and population, an extremely large portion of our population does have access to the Internet, but we don’t have the deepest Internet penetration in the world, which, for a country of our wealth, is something of a shame. We’ve been investing through government grants since the mid-90s, and we haven’t seen the return on investment we’d expect as investors. We paid for companies like Verizon and Comcast to invest in our network, and I mean we as in the taxpayers. We’re paying for them to get rich off of grants.

Internet Population and Penetration

Smaller countries like the Netherlands and the UK have significantly greater penetration. Sure, they have smaller populations than we do, but they also have significantly faster Internet speeds across the board, including in rural areas. South Korea has speeds an order of magnitude higher than ours, despite the fact that we’re a significantly richer country.

Today the FCC made one of its first positive moves in a really long time: it decided that the minimum speed for broadband must be 10 Mbps, which is a huge step in the right direction. This raises the minimum speed a company must deliver for its build-out investment to earn a grant from 4 Mbps to 10 Mbps. This is the right direction for our country, and I’m really excited about the possibilities. It means the FCC is starting to really understand that the telecoms don’t fully have our best interests in mind when they make their arguments. We’ll see what happens in the upcoming months.

More than two sides, the complexity of a story

In a lot of my writing, I typically focus on one aspect of a story. For example, in my writing about Ferguson I really focused on the wrong I believed the police were doing. I didn’t really touch on the violence the protesters were doing to the community (contained to the first few days) or the violence they were committing against the police. I didn’t ignore it, personally or as I was thinking about the articles; I just didn’t discuss it because it didn’t fit the story I was trying to outline. That’s perfectly fine. You can’t fit everything into any given story. However, that doesn’t mean the omission was support for the actions of the protesters. I abhor that behavior and I think it really negatively impacted their message.

The past few days, we’ve had some pretty serious leaks: over 100 celebrities have had their nude images leaked, and the suspected culprit is iCloud. The iPhone, like most Android phones, has the option to automatically back up your photos to online storage. Apparently, there was a vulnerability in the Find My iPhone service that allowed a person to try to access an account as many times as they wanted. This meant that brute force methods for cracking an account’s login would work eventually. It might have taken days or longer for whatever algorithm was used to crack the logins, but eventually it would have worked; there’s no way for it not to. Essentially, the approach runs through as many permutations of the login as possible, and it could be run concurrently on multiple systems to test passwords in parallel. It’s pretty horrible that someone was able to sneak into iCloud and steal these pictures; however, it’s also incumbent on the users and the owners of these systems to ensure that these simple lapses don’t happen.
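To see why unlimited attempts is the fatal lapse, here’s a rough sketch of the math. Every number in it (guess rate, password space) is an assumption for illustration, not a measurement of iCloud or of the actual attack.

```python
# Back-of-the-envelope: why "no limit on login attempts" breaks everything.
# All figures are assumptions for illustration only.

GUESSES_PER_SECOND = 10        # assumed rate one machine can try logins
PASSWORD_SPACE = 10_000_000    # assumed number of likely passwords to try

def days_to_exhaust(workers: int) -> float:
    """Worst-case days to try every password, split across machines."""
    seconds = PASSWORD_SPACE / (GUESSES_PER_SECOND * workers)
    return seconds / 86_400  # seconds per day

for workers in (1, 10, 100):
    print(f"{workers:>3} machine(s): {days_to_exhaust(workers):6.1f} days")

# Output:
#   1 machine(s): 11.6 days
#  10 machine(s):  1.2 days
# 100 machine(s):  0.1 days
```

With no rate limiting, the attacker’s only cost is time, and parallel machines divide that time. Even modest rate limiting (a handful of attempts per hour per account) pushes the same attack out to centuries.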

The users of these services bear a responsibility for understanding what happens to their data once it leaves their phones. This is a requirement for any user, not just the famous. The famous should probably have someone help them with their security settings, as it’s unlikely that many of them have the desire or knowledge to do it on their own. Not that this is any different for much of the rest of the population: they are as vulnerable as the famous, but aren’t targets simply because they’re uninteresting.

In both cases, it’s fully acceptable to be upset by both sides of the story. It’s not contradictory to say that police violence and militarization are bad and that the criminal element of the Ferguson protests is bad too. It’s also fine to say that you shouldn’t hack, and that the people who develop and use these systems are accountable as well. In most of our stories, complexities are withheld or ignored because there is an angle the writer is going for, because the story would take too long, or because the writer has a low opinion of the readers. In my case, I was going for a specific angle with the Ferguson stories, because I assumed it was obvious to the reader that the violence committed by the protesters was both known and understood to be a terrible wrong. Not mentioning it did make the police seem less rational than they were behaving, though.

In the case of the leaks, most of the attention has been put on the leaker and the people enjoying the leaks; however, it’s important to keep in mind that the companies also have a responsibility to keep that data safe.