Philanthropy, private industry, and science

Apparently I'm not too happy with the NYT Magazine and its exposés of late. First there was the long article about millennials and how they don't want to work for the "old guard," which is ahistorical and ignores a great deal of the similarities between the Silicon Valley of today and the Silicon Valleys and similar environs of the past.

Now it is rushing about in concern over private scientific research. Apparently, it's a big new problem. It's neither new nor a problem. First, some historical context. Scientific labs as we know them today were really founded through industrial labs, initially in the German dye industry of the late 1800s. Sure, there were university labs, but they weren't tackling problems on the scale the industrial labs were. Those labs had problems that couldn't be solved in academic settings. The universities were training grounds for scientists, but in many cases the scientists actually did their doctoral research at Bayer or a similar dye company. These dye companies almost all became pharmaceutical companies over time because of the similarity between the chemistry of dyes and that of pharmaceuticals.

This was in the 1800s, and it really hasn't abated. I've written about Bell Labs and Xerox in the past; they are essentially the Bayer equivalents for telecom, semiconductors, and computers.

Science has always been a combination of public, private, and university research. In fact, research that I conducted during my master's degree showed that the INTERACTION between private industry and universities produces the most important work (in terms of citations). Our concern should not be whether science is going private. Our concern should be whether the results are shared with the broader scientific community. That's the biggest risk, and it's one of the biggest problems with industrial scientific research – it never sees the light of day even when it becomes a product.

Why doesn't it? Simply because, for some processes, not patenting the technique is better protection. When something is relatively easy to copy (an iPhone), it's best to patent it, because the patent is what protects you. When something is very difficult to copy (a nitride layer on an Intel chip), it's best to hide the process as deeply as possible. Ideally, from Intel's perspective, any attempt to reverse engineer the underlying process for that nitride layer destroys the product itself. For Intel this is the best result; for the rest of the world it's suboptimal, as GlobalFoundries and TSMC will struggle for years to reverse engineer the layer, if they ever can. This slows the innovation process as a whole, but we're willing to suffer the inefficiency because Intel makes some nice chips.

Beyond this debate, the author is upset that someone would want to push scientific research in a direction that might only help white people or rich people. Unfortunately, this is capitalism. We may not like it in basic research that is going to be used to cure diseases, but we tolerate it with Intel, so we need to be realistic and tolerate it here too. Furthermore, I think the author doesn't understand that adjacent lines of disease research will arise and we'll learn more about all humans, not just white folks. Ironically, at this point the author calls out a researcher who is working with an Oracle billionaire – a researcher who works at Rockefeller University.

Many of what are now seen as seminal research institutions started out through the very philanthropy the author is upset about. Carnegie Mellon University was the combination of two Pittsburgh institutions, one started by an industrialist and the other by a banker, and it is one of the most respected research organizations in the world. Those men were driven by the same desire to push scientific research as Bill Gates and the other (mostly) men on the list.

Is this a perfect system? Not by a long shot. However, in the current political environment scientists are going to take money from whatever source they can; it's simple practicality. A professor will typically have anywhere between one and ten grad students, and at the PhD level those students will likely be fully funded by the professor. If that professor does not get funding, those students don't get to keep working and either have to find another adviser or quit. Here's the kicker in the case where the professor does get money: a large proportion of that funding is taken as overhead and allocated to less profitable parts of the organization. At the University of Texas, this meant that the EE department was probably funding part of the chemistry department. Some departments are like the football team, while others are like the swimming team. The swimming team might be winners, but they play in a small market.
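
To make that arithmetic concrete, here is a minimal sketch of how overhead eats into a grant, using entirely hypothetical numbers (a $500,000 award, a 50% indirect-cost rate, and $60,000 per student-year); real rates and costs vary widely by institution:

```python
# Hypothetical illustration only: the grant size, indirect-cost (overhead) rate,
# and per-student cost below are made-up numbers, not figures from any real
# university or funding agency.

grant_total = 500_000        # total award received from the funding agency
indirect_rate = 0.50         # overhead charged on top of direct costs
cost_per_student = 60_000    # stipend + tuition + benefits per student-year

# If overhead is charged on direct costs, then total = direct * (1 + rate),
# so the money the lab can actually spend is:
direct_costs = grant_total / (1 + indirect_rate)
overhead = grant_total - direct_costs

students_funded = int(direct_costs // cost_per_student)

print(f"Direct costs available to the lab: ${direct_costs:,.0f}")  # ~$333,333
print(f"Overhead kept by the institution:  ${overhead:,.0f}")      # ~$166,667
print(f"Student-years the grant supports:  {students_funded}")     # 5
```

Under these made-up numbers, roughly a third of the award never reaches the lab, which is exactly the cross-subsidy the football/swimming analogy describes.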

If we truly want to change the way we fund scientific research, we need to increase public investment across multiple institutions. We need to increase funding across multiple types of research fields, specifically focusing on the intersections between academic fields, and push for collaboration between industry and universities as well as across national boundaries. All of these improve the citation rate and quality of the research. We can even partner public funds with private funds – we just need full disclosure.

The problem isn't privatization. We've oscillated between heavily publicly funded eras (the 1960s-70s with NASA) and heavily privately funded ones. In all cases science has marched on – we just need to make sure it keeps marching.

Science, evidence, and paradigms

Last night there was a big debate between Bill Nye the Science Guy and creationist Ken Ham, meant to inform people about the science supporting evolution and how it refutes the "science" behind creationism. One of the key questions during the debate was what would be required to convince Bill Nye that creationism was true and evolution was false. His answer was, essentially, "evidence." While this is the ideal answer for a scientist, I find it unlikely. This, of course, isn't a popular opinion. It's not that Bill Nye doesn't believe he would change his mind, or change it quickly – it's that doing so is unlikely. People aren't purely rational; in a purely rational world, yes, that's exactly what would happen. Even scientists have a serious problem with this. Scientists still suffer from the same sort of denial as global warming denialists, except the impact is largest inside their own field rather than outside it.

How do we know this is true? According to Thomas Kuhn, whenever theories are incommensurable it's unlikely that a leading theorist in the field will switch to the new theory or paradigm. What does this mean? It becomes easier to understand if we think about scientific theories in terms of technology. Take jets and propellers for airplanes. It was clear in the early '50s that jet engines were the way to go, but not all companies decided to pursue that type of engine; some decided instead to keep tweaking the capabilities of props. A similar reaction happened with sail technology and steam engines: in that case sail was still more effective than steam, and it took years before steam could catch up, let alone surpass sail.

The same thing happens with scientific theories. Flaws start to appear that the theory cannot easily explain. In the geocentric theory, for example, planets would seem to track backwards over time and then begin to move forward again. Theorists explained this by giving the planets small circles that they traced out during their normal revolution around the Earth. The mathematics of the theory became increasingly complex and seemingly less realistic. The heliocentric approach reduced the complexity, eliminated the small circles, and allowed for the eventual creation of Newtonian physics. However, when Newtonian physics started to break down and Einstein proposed relativity, his theory met significant resistance for years. Essentially, it took until that generation retired for relativity to be fully accepted by the broader scientific community. This happens to scientific theories on a regular basis.
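
As a minimal sketch of what those small circles were doing mathematically (the symbols here are chosen purely for illustration, not taken from any historical text): in the deferent-and-epicycle model, a planet's position as seen from a stationary Earth is

$$
x(t) = R\cos(\omega_d t) + r\cos(\omega_e t), \qquad
y(t) = R\sin(\omega_d t) + r\sin(\omega_e t),
$$

where $R$ and $\omega_d$ describe the large circle (the deferent) centered on the Earth, and $r$ and $\omega_e$ describe the small circle (the epicycle) riding on it. Retrograde motion shows up whenever the epicycle term temporarily swings the planet backwards faster than the deferent carries it forward. In the heliocentric picture the apparent position is simply the difference of two ordinary orbits, $\vec{r}_{\text{planet}}(t) - \vec{r}_{\text{Earth}}(t)$, so the extra circles drop out as an artifact of measuring from a moving Earth.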

In fact, there are some pretty serious debates going on about the full mechanics of evolution. The original basis of the theory – heredity, competition/pressure, and variation – still holds, but the nuances are being debated. For instance, Richard Dawkins's theories have started to fall a bit out of favor, while we're learning that some of the things we do during our lives affect our genes. Those changes can be inherited, which could change the next generation – an idea that is Lamarckian to the core. However, Dawkins will likely not accept a theory different from the one he's devoted his life to. So, while to some extent it's true that scientists will and do change their minds, it's more likely that science will change while individual scientists take significantly longer, if they ever change at all.

Advances in healthcare science will change our lives

We are at the very beginning of a healthcare industrial revolution. Perhaps we're beyond just the beginning; we've seen huge leaps and bounds in diagnosis and treatment over the last 20 years. In my lifetime AIDS has gone from a quickly terminal disease to one with which people can live relatively healthy lives nearly as long as the average person's – all without needing sums of money like Magic Johnson's.

This alone is something to celebrate, but there's a lot more work to be done in easing the pain of suffering patients. Many Americans suffer from debilitating heart disease, which in some cases requires a heart transplant. Transplants are extremely risky procedures because the body can and does reject the new heart. So, even after receiving the new heart, many patients are on immune-suppressing drugs. This increases their risk of infection and of contracting other diseases, which reduces their quality of life and increases the risk of premature death due to complications of the transplant. Recently, there's been a serious breakthrough in building a heart scaffold.

What is a heart scaffold? It is essentially a structure that allows your own cells to rebuild your heart. In the case of this breakthrough, a donor heart was stripped of all the specific cells that would have been rejected by the transplant patient. The underlying structure remains because it is generic tissue that is transferable between species. The breakthrough allows patients to use their own stem cells and heart cells to convert the scaffolding into a functioning heart that is the patient's own, not a transplant heart. This reduces the likelihood of rejection, eliminates the need for immune-suppressing drugs, and improves the results.

In the paragraph above, I said transferable between species. The heart being used isn't even a human heart; it's a pig heart. This greatly increases the supply, because the number of hearts available for transplant is otherwise limited to people who are willing to donate and whose hearts are in good enough condition to go into another patient. We can hope for similar advances for other types of organs as well.

This isn't the only advance we're seeing in this vein. There is a pen that can draw cells onto bone, and 3D printers are being developed to print organs, like livers.

We live in exciting times for science in healthcare. We just need to figure out how to get Medicare and insurers to pay for what comes next.

Review: Dealers of Lightning, the Story of Xerox PARC

This is the third historical book about a business that I've read. The first was the history of Bell Labs, and compared to that book this was a wild ride in terms of organization. It bounces back and forth across a span of ten years, while The Idea Factory (the Bell Labs book) was a stately procession moving forward through time. I believe the major difference is that while a lot was happening at Bell Labs, it wasn't crammed into 10 years; it occurred over 40 years or more, which allowed that author to pick and choose the people to follow. In Dealers of Lightning so much was happening at the same time, often with the same people, that it forced the author to jump backwards and forwards through time.

Despite that, it really made me realize how much of the technology we have today we owe to PARC researchers in the '70s. If you're using a tablet, one of the very first visionaries behind that concept was Alan Kay. He first envisioned it in the '60s, and from what is described in the book, the iPad is pretty much true to his vision. Amazing, to be honest.

Here's a list of things they made:
Object-oriented programming
Ethernet
The first mass-produced PC
The predecessor to Word
The original desktop interface
VLSI design, which has enabled the development of basically every semiconductor chip
The first graphics chip
Copy, cut, and paste
The right click
The first laser printer
The predecessor to PostScript (Adobe)
A piece of software where you could edit text and pictures at the same time
A computer in 1982 that had 6,000 Japanese characters, could type in 100+ languages, and had capabilities that wouldn't be matched again until the '90s

PARC dramatically influenced Apple, Microsoft, 3Com (which Metcalfe founded after leaving PARC), Adobe (founded by two PARC researchers), and many other companies.

Xerox was a visionary company to fund a research organization like PARC, and PARC was likely one of the last of its kind. Very few companies today maintain a similar branch of research facilities pushing basic and applied scientific research. I suggest reading this book, if only to understand where the technology we all use and love came from.

I give this book 4/5: well researched, a great topic, and difficult to write because of all the concurrent activity.

Cash reserves, risks and innovation

In my last post I discussed the large cash reserves that companies have been holding since the 2007 recession. As I mentioned, there are several reasons for this, and some of it has to do with a lack of R&D investment. R&D is an expensive undertaking: it requires both trained scientists and equipment to conduct the research, plus technicians and other employees to support the effort. This isn't cheap. As we can see in the bottom half of the chart, all types of research funding have decreased recently.

R&D is not a sure thing by any stretch of the imagination. This is why companies are partnering with universities to share the burden: universities do much of the basic and applied research, while industry develops it into products, which is where the money and the greatest certainty are. You can't really blame companies for this, but they need to keep developing their own technologies regardless of the work being performed at universities. To compensate, many companies engage in corporate venturing, where they fund a startup to conduct research and get a product to a certain point, then possibly buy the company once it matures, set up an exclusive license, or license the technology. This reduces the large company's risk exposure.


The final piece that has increased since the late '80s is the amount of litigation due to patent infringement. In 2011, spending on patent litigation was $29 billion. That is a lot of money – about a quarter of the money Apple holds in its reserves. We also know that Apple is one of the largest spenders on litigation. I know there are a lot of Apple lovers out there, but Apple could have invested that money in more products and reduced the risk of a flop with the next iPhone. We all know that iOS 6 was a major disappointment for many people; spreading its revenue stream across more sources, backed by some cool research, could mitigate the fallout from that, or from iOS 7 being more of the same.


Litigation is such an outsized risk because it can lead to your entire firm being shut down by a non-practicing entity. This reduces the incentive to innovate and increases the incentive to hoard cash.