SaaS is Dead. Long Live SaaE

How has economic evolution changed the way birthday cakes are made? In their classic Harvard Business Review article, Joseph Pine II and James H. Gilmore asked this question and gave their answer:

  1. In an agrarian economy, parents make birthday cakes by themselves. 
  2. In an industrial economy, parents purchase premixed ingredients from the market.
  3. In a service economy, parents order cakes as a yearly service for their kids.
  4. In an experience economy, there are “experience vendors” who stage an awesome birthday party for the kids. 

Their framework is also applicable to software.

  1. Agrarian economy: every business needs to write software for its own use cases.
  2. Industrial economy: there are software vendors dedicated to building software, which customers can purchase to solve their needs.
  3. Service economy: The software is delivered as a service (often as a subscription), but customers still have to build on top of the service to create an awesome experience. This is the Software-as-a-Service (SaaS) model. 
  4. Experience economy: We are entering a new era of Software-as-an-Experience (SaaE). In the SaaE model, we use software to directly orchestrate experiences for our users. Software is no longer just a utility but a conduit of digital experiences.

SaaE is a fundamental leap forward from the SaaS business model because it significantly reduces the friction of creating awesome user experiences. According to Pine and Gilmore:

an experience occurs when a company intentionally uses services as the stage, and goods as props, to engage individual customers in a way that creates a memorable event. …  No two people can have the same experience, because each experience derives from the interaction between the staged event (like a theatrical play) and the individual’s state of mind.

In the SaaS model, the software service itself is rigid. There is a predetermined way to use the software, often fixed during the product build stage. Customers keep their subscriptions only as long as the problems they face can be solved by that predetermined flow. When this is no longer the case, they churn.

In the SaaE model, the software is adaptive rather than rigid. The ultimate goal is to deliver the best experience to users by intentionally staging a sequence of software services.

To reach the goal, we need two new software layers:

  1. A feedback-collection layer that continually integrates each customer's feedback into the product.
  2. A learning layer that adjusts the software's behaviors to deliver the best customer experience.

Figure: Two layers of the SaaE.

These two layers were previously impractical in the software building process because personalizing the software flow for each user took an enormous amount of effort.
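
To make the two layers concrete, here is a minimal, purely illustrative Python sketch (all names and numbers are hypothetical): the feedback layer records each customer's reaction to a staged experience, and the learning layer uses that feedback to adapt which experience each user gets next, with a little epsilon-greedy exploration.

```python
import random
from collections import defaultdict

class FeedbackLayer:
    """Collects per-user feedback on each staged experience (variant)."""
    def __init__(self):
        self.rewards = defaultdict(list)  # (user, variant) -> list of scores

    def record(self, user, variant, score):
        self.rewards[(user, variant)].append(score)

class LearningLayer:
    """Adapts the flow per user: picks the variant with the best observed
    feedback, exploring occasionally (epsilon-greedy)."""
    def __init__(self, feedback, variants, epsilon=0.1):
        self.feedback = feedback
        self.variants = variants
        self.epsilon = epsilon

    def choose(self, user):
        if random.random() < self.epsilon:
            return random.choice(self.variants)  # explore a new experience
        def avg(v):
            scores = self.feedback.rewards[(user, v)]
            return sum(scores) / len(scores) if scores else 0.0
        return max(self.variants, key=avg)       # exploit the best-known one

fb = FeedbackLayer()
learner = LearningLayer(fb, ["guided_tour", "quick_start"])
variant = learner.choose("alice")
fb.record("alice", variant, 0.8)  # feedback flows back into the product
```

Real SaaE systems would of course use far richer signals and models; the point is only that the two layers form a closed loop around each individual user.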

The advancement of AI has made it possible to personalize a unique experience for each user. We have already done it: YouTube's personalized feed creates a unique video-watching experience for every user, and Amazon's product recommendations create a unique purchase experience for every customer. 

Those systems are currently referred to as recommendation systems. But they are just the tip of the iceberg of a larger paradigm shift: the advent of the Software-as-an-Experience age. 


Morris Chang: From Refugee to the Godfather of Taiwan’s Semiconductor Industry

Morris Chang may not be a household name in the Western world, but his achievements are comparable to those of Western business titans like Rockefeller and Carnegie. His life was one miracle after another. After fleeing from China to the US during the Chinese Civil War, he worked his way up to become the general manager of the semiconductor business and the third-ranking executive at Texas Instruments, one of the biggest semiconductor companies in the world. Chang was one of the first Chinese Americans to become a top business leader. In his 50s, he returned to Taiwan, founded Taiwan Semiconductor Manufacturing Company (TSMC), and became the “godfather” of Taiwan's semiconductor industry.

Chang was born in China in 1931. Most of his early life was deeply shaped by war: he was forced to flee three times during the Second Sino-Japanese War and the Chinese Civil War. After the wars, he came to the United States at the age of 17 to study at Harvard University, a school then primarily focused on the arts and humanities. Chinese Americans were an underrepresented minority at the time, and most worked in low-end restaurant or laundry businesses; academic jobs were among the few alternatives that fit Chang's ambition of making a big societal impact. He later transferred to MIT in the hope of becoming an engineering scholar. Unfortunately, he failed the qualifying exams twice at MIT, had to abandon his academic path, and entered the job market after obtaining his master's degree. 

He entered the semiconductor industry in his first job at Sylvania, then an industry leader. However, his team was disbanded three years later. He then moved to Texas Instruments, which was famous for inventing the integrated circuit (IC) and was growing fast. After working at Texas Instruments for three years, he went back to Stanford to pursue a Ph.D. and then returned to Texas Instruments to continue his corporate career. By the 1970s, Chang was the general manager of Texas Instruments' entire semiconductor business. 

Chang was a keen observer of the semiconductor industry. While working at Texas Instruments, he saw that many brilliant people in the company hoped to create new businesses but were blocked by the heavy investment required to get started. As chips became more and more sophisticated, chip manufacturing became extremely capital-intensive. The cost of creating a chip manufacturing line (also known as a “foundry” or “fab”) could easily exceed 3-4 billion US dollars. Moreover, new startups could not sustain a stream of demand large enough to keep a manufacturing line busy all the time, which is the only way to justify such heavy investments.

In contrast, chip design requires much less capital. It would be a win-win if there were a “pure-play” company that focused on manufacturing so that startups could focus on design. This model of chip making, known as fabless manufacturing because it splits design from manufacturing, proved crucial to the boom of the semiconductor industry. There was plenty of chip design talent in the US, but very few people were good at both chip manufacturing and cost management. Chang was one of the few who had that expertise.

Chang began to face career setbacks in the early 1980s. At the time, Texas Instruments shifted its focus away from semiconductors to become a diversified device manufacturer. Chang disagreed with the shift and had to leave the company. After this setback, Chang decided to turn his observation into action. At the same time, the Taiwanese government was eager to break into high-end industries like chip manufacturing, and Chang was the perfect person to lead the cause. After a short stint at another company, Chang accepted the government's invitation to become the first chairman of the Industrial Technology Research Institute, an institute that played a critical role in the industrial transformation of the island. With the government's support, Chang founded TSMC one year later. TSMC created a whole new industry of “pure-play” chip manufacturing (a.k.a. the foundry industry). By focusing only on manufacturing, not design, TSMC assured its partners, typically US chip design firms, that it would not compete with them or share their trade secrets with competitors. 

Now TSMC is undoubtedly the market leader in the industry, holding 28% of the market in a recent study. Thanks to TSMC, Taiwan has also become crucial to the global semiconductor supply chain, as a recent Bloomberg article illustrates.


Global semiconductor market share (image via Counterpoint Research)

In a recent talk, Chang summarized what he considers the key reasons for TSMC's success. He attributed it to three factors. 

The first is the Taiwanese people's hard-working spirit. For example, during his second stint at TSMC, Chang started the “nightingale program,” which ran both day and night shifts to ensure there were R&D activities 24/7. Such a program would be unimaginable in US companies. According to Chang, the nightingale program was the key reason TSMC could eclipse all of its competitors in technology. In chip manufacturing, the feature size a manufacturing process can produce is a key indicator of its technology level: a smaller size means more devices in the same area but is much harder to manufacture. After losing the 14nm race, TSMC reached the 10nm, 7nm, and 5nm manufacturing processes one after another in just a few years. To date, none of its competitors have reached the 10nm milestone. Please refer to RISC-V, China, Nightingales for more details.
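
As a back-of-the-envelope illustration of why feature size matters so much (a rough approximation; real density gains depend on many other process factors):

```python
# Rough approximation: device area scales with the square of the feature
# size, so the density gain of a process shrink is (old / new) ** 2.
def density_gain(old_nm: float, new_nm: float) -> float:
    return (old_nm / new_nm) ** 2

print(density_gain(14, 7))   # a 14nm -> 7nm shrink: ~4x the devices per area
print(density_gain(10, 5))   # a 10nm -> 5nm shrink: ~4x as well
```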

Figure: The timeline of manufacturing processes at major foundry companies.

The second factor is local professional management. This is crucial because chip manufacturing is operations-heavy and efficiency-driven. Chang also noted that managerial talent does not transfer well across borders because of cultural and other factors.

The third factor is the good infrastructure provided by the Taiwanese government. It is easy to see how good infrastructure makes transporting goods easier, but Chang also made a subtler point: the good high-speed railway and the small size of the island make it possible to relocate talent anywhere in Taiwan without separating people from their families. The benefits of good infrastructure for people management are often ignored by governments, yet they are crucial for talent-intensive businesses.

All three points come down to one thing: TSMC can attract a huge amount of disciplined, high-quality talent and retain it through good management and a convenient life. The company has an enviable 3-4% employee turnover rate, and those who do leave become the most sought-after talent in the industry. I also highly recommend this great essay by Kevin Xu based on Chang's talk.

Throughout his life, Chang overcame one challenge after another and turned setbacks into new opportunities. After becoming a refugee three times in his youth, he immigrated to the US to pursue a new life. After failing the MIT Ph.D. qualifying exams, which ended his academic pursuit, Chang entered the newborn semiconductor industry and worked tirelessly to become an expert in semiconductor manufacturing. After his career setback at Texas Instruments, he summoned the courage to leave the US, where he had spent 36 years, for Taiwan to create TSMC, and became the godfather of the island's semiconductor industry. 

Chang is also a great writer. I highly recommend his Chinese autobiography, which covers his early life up to age 33 (unfortunately, I haven't found a translated version yet). He is working on the second half of the autobiography and will hopefully publish it soon. I am very much looking forward to reading it and will share a sequel in the future.


Renaissance of Intelligence

For people who are interested in artificial intelligence, the past decade has felt like another Renaissance: the boundaries between humans and machines have been repeatedly redefined by new technologies, from machines that can beat world champions in games to AI assistants that can talk like real humans.

This article shares some stories behind this incredible AI Renaissance. The sources include my first-hand observations in the field as well as Cade Metz's recent book Genius Makers, which I highly recommend.

A Sputnik Moment

A key reason for the current revolution is the reinvention of deep learning, a technology that simulates the human brain with complex network architectures on computers.

The idea is not new. Scientists have long searched for the truth of human intelligence, and a natural starting point is our brain, the only intelligent machinery built by Mother Nature. Artificial neural networks, the predecessors of deep learning, were very popular between the 1950s and 1980s but lost their popularity because there was not enough data and the computers of the time were too weak to solve any interesting problems.

It would take another two decades before its revival. In the late 2000s, a group of young scientists started to connect the power of the booming Internet with artificial intelligence research. In 2009, Dr. Fei-Fei Li,1 an assistant professor at Princeton, compiled a vast database of Internet images (the ImageNet dataset). ImageNet soon became the benchmark for computer vision, a subfield of artificial intelligence. In 2012, Geoffrey Hinton and his team improved the benchmark metrics by more than ten percent, a jaw-dropping gain an order of magnitude larger than any previous improvement.

This was a “Sputnik moment” for the artificial intelligence research community. At the time, the mainstream approach was for scientists to figure out a solution themselves and program software based on it. Hinton's success in the ImageNet challenge showed that an alternative approach, letting neural networks learn a solution without prescription from humans, worked better. 

The Godfather Heads to Industry

Huge amounts of data and computing resources were essential to this success. No one knew this better than Geoffrey Hinton himself, also known as the godfather of deep learning. Hinton was one of the early researchers who popularized back-propagation, a fundamental algorithm used to train neural networks. When the field entered a winter between the 1990s and early 2000s, most researchers switched to other directions as funding became scarce. Hinton, however, remained a stubborn proponent of the idea and kept trying to revive it.
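
To make back-propagation concrete, here is a minimal sketch (my own illustration, not from the book): a one-hidden-layer network trained on the XOR problem with full-batch gradient descent, where the backward pass applies the chain rule layer by layer.

```python
import numpy as np

# Minimal back-propagation: a 2-8-1 sigmoid network trained on XOR
# (illustrative only, not production code).
rng = np.random.default_rng(0)
X = np.array([[0., 0.], [0., 1.], [1., 0.], [1., 1.]])
y = np.array([[0.], [1.], [1.], [0.]])

W1, b1 = rng.normal(size=(2, 8)), np.zeros(8)
W2, b2 = rng.normal(size=(8, 1)), np.zeros(1)
sigmoid = lambda z: 1.0 / (1.0 + np.exp(-z))

def forward():
    h = sigmoid(X @ W1 + b1)          # hidden activations
    return h, sigmoid(h @ W2 + b2)    # network output

_, out = forward()
initial_loss = float(np.mean((out - y) ** 2))

for _ in range(5000):
    h, out = forward()
    # backward pass: propagate the error layer by layer (chain rule)
    d_out = (out - y) * out * (1 - out)   # error at the output layer
    d_h = (d_out @ W2.T) * h * (1 - h)    # error pushed back to hidden layer
    W2 -= 0.5 * h.T @ d_out; b2 -= 0.5 * d_out.sum(0)
    W1 -= 0.5 * X.T @ d_h;   b1 -= 0.5 * d_h.sum(0)

_, out = forward()
final_loss = float(np.mean((out - y) ** 2))
print(initial_loss, final_loss)  # the loss drops as the weights are learned
```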

Hinton knew that his research would need resources from elsewhere, and only the big Internet companies had pockets deep enough and data big enough to make the idea work. In addition to his academic achievements, Hinton also had great business savvy. With two of his students, Hinton founded DNNResearch in 2012 and soon decided to sell it to the big Internet companies. The book Genius Makers gives a vivid description of how Hinton orchestrated the auction in a Lake Tahoe hotel and how tech companies from all over the world wooed him. DNNResearch was eventually acquired by Google for 44 million US dollars.

More important than the price tag is the precedent Hinton set. To lure Hinton, Google allowed him to keep his positions on both sides, though he had to be an “intern” at Google to work around the company's rules. Before Hinton, it was rare for eminent researchers to work for tech companies for fear of losing their tenured university positions. Soon after the purchase, many AI researchers followed Hinton's example and joined technology companies, including Yann LeCun, another deep learning pioneer who later led Facebook's AI lab, and Andrew Ng, who led the research lab at Baidu. What's more, following their advisors, students from various research labs flocked to big technology companies.

Among the technology companies, Google (and its parent company Alphabet) stood out for its unparalleled role in this wave of the AI Renaissance. Its research divisions, Google Brain and DeepMind, are the driving force behind many of the greatest breakthroughs. What's more, the fact that Google could use AI to create so many profitable applications had a demonstration effect on all other companies.

One important person behind this is Jeff Dean, a legendary engineer who laid the foundation of Google's infrastructure. In 2011, Andrew Ng introduced the concept of deep learning to Jeff, who was immediately intrigued: Jeff was looking for his next application, and deep learning was a perfect one. Andrew, Jeff, and another researcher, Greg Corrado, founded the Google Brain team. As a founding engineer of Google, Jeff has great influence on Google's management team and enjoys enormous popularity within its engineering and research organizations (so much so that people made fun of him by creating “Jeff Dean facts”). Jeff created an umbrella under which the Google Brain team could operate without worrying about anything else.

At Google Brain, Andrew Ng and his colleagues helped create the system that learned the “cat” concept from millions of YouTube videos, which drew a lot of media attention and publicized the field. Andrew Ng eventually left the Brain team to work on his own startup, but he recommended Geoffrey Hinton as his replacement, which triggered the DNNResearch acquisition. Under the leadership of Jeff and Hinton, Google Brain contributed significantly to the field, both by pushing the research frontier and by publishing the TensorFlow framework, which makes the technology accessible to outside communities.

New World Champion

One limitation of the techniques Hinton was pursuing (a.k.a. supervised deep learning) is that they require datasets labeled by humans. Demis Hassabis co-founded DeepMind to address this limitation and explore other applications. He wanted to build systems that did not depend on human supervision and could perform better than humans. A childhood chess prodigy, Hassabis believed games were the best starting point. Although games had been a proving ground for AI since the 1950s, no one has been more committed and successful in this direction than Hassabis.

Hassabis and his DeepMind team combined deep learning with reinforcement learning, a technology that allows computers to adapt their behavior through trial and error (the same way we humans learn). With this combination, DeepMind built a system that could discover nuances never before found by humans in popular video games like Breakout, and published the results in Nature. The publication drew the attention of Google's management, and in 2014, Google purchased DeepMind for more than $500M; this time both Hinton and Jeff were on the buyer side. With the resources of Google, DeepMind doubled down on its mission. In May 2017, DeepMind's AlphaGo AI beat the world champion Ke Jie, and since then it has kept beating humans in one field after another. In addition to DeepMind, Google's other AI division also released the BERT system, which significantly improved performance on natural language tasks.
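
A minimal sketch of the trial-and-error idea (illustrative only, and vastly simpler than DeepMind's systems): tabular Q-learning on a five-state corridor, where an agent learns from reward alone that moving right reaches the goal.

```python
import random

# Tabular Q-learning on a tiny corridor: states 0..4, reward at state 4.
# The agent learns by trial and error which action (left/right) to take.
random.seed(0)
N, GOAL = 5, 4
Q = [[0.0, 0.0] for _ in range(N)]   # Q[state][action]; 0 = left, 1 = right
alpha, gamma, eps = 0.5, 0.9, 0.2    # learning rate, discount, exploration

for episode in range(200):
    s = 0
    while s != GOAL:
        # epsilon-greedy: mostly exploit, sometimes explore
        if random.random() < eps:
            a = random.randrange(2)
        else:
            a = max((0, 1), key=lambda x: Q[s][x])
        s2 = max(0, s - 1) if a == 0 else min(N - 1, s + 1)
        r = 1.0 if s2 == GOAL else 0.0
        # temporal-difference update: nudge Q toward reward + future value
        Q[s][a] += alpha * (r + gamma * max(Q[s2]) - Q[s][a])
        s = s2

policy = [max((0, 1), key=lambda a: Q[s][a]) for s in range(N)]
print(policy)  # the learned policy moves right toward the goal
```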

There were also many breakthroughs outside Google and DeepMind. For example, OpenAI, co-founded by some of Hinton's students and Silicon Valley elites like Elon Musk and YC president Sam Altman, tackled many other games and robotics applications through reinforcement learning and released language models that achieved amazing results. The successes of Google, DeepMind, OpenAI, and other AI research teams have brought public interest in AI to an unprecedented level.

What’s ahead?

A keen observer will notice that the current AI Renaissance consists of many small cycles. Each cycle starts when a difficult yet well-defined benchmark problem is solved. Thanks to huge public attention, the team that solved the problem can acquire a huge amount of resources to continue its research, and then tackles the next, more challenging benchmark with a larger model. The cycles were started by academics and their students and were reinforced by big technology companies. People knew, consciously or not, that this was the best way to attract attention, funding, and talent.

Nevertheless, ImageNet and the game of Go are still not real-world problems. In addition, there have been increasing concerns that this pattern of AI research consumes enormous resources and makes AI models overly complex. For example, the GPT-3 language model released by OpenAI includes 175 billion parameters, and each training run costs around 4.6 million dollars. Moreover, many AIs that overfit man-made tasks turn out to perform poorly in real-world applications.

We should, and will, break such cycles. Building cost-effective AI and making it really work in real-world applications is crucial to keeping the movement going. In the next decade, there will be many more exciting stories ahead of us.

Disclaimer: All opinions are mine and not endorsed by my current or previous employers.

  1. Fei-Fei Li was an assistant professor at Princeton University at the time but later moved to Stanford. The original version called her a Stanford professor by mistake; thanks to Jike Chong for pointing it out.

In Search of Memory

Eric Kandel won the Nobel Prize in 2000 for his contribution to the understanding of memory at the molecular level. His autobiography, In Search of Memory, describes both his escape from Austria to America and his career-long inquiry into the science behind our memory. This essay is based on the autobiography and extensive research on the web.

Escape from Holocaust

Eric was born into a Jewish family in Vienna, and his childhood was dominated by the Nazis' growing influence in Austria. From its very beginning in the 1920s, the Nazi party aimed to merge all German-speaking people into a Greater Germany. In 1938, Hitler forced the Austrian chancellor Schuschnigg to resign and sent his troops to occupy the country, the largest German-speaking state outside Germany. The event, known as the Anschluss, was welcomed by many Austrian Germans, who felt Austria had not been fairly treated in the Treaty of Saint-Germain signed after WWI. After the Nazis took power in Austria, many Austrian Jews were forced to leave the country because of the violence targeting them (e.g., Kristallnacht). Thanks to the help of a local Jewish organization, the Kultusgemeinde, Eric's family was able to emigrate to the United States in 1939. Eric was only nine at the time.

Hitler announces the Anschluss on the Heldenplatz, Vienna, 15 March 1938.

Most of the Jews who did not escape Nazi Austria became victims of the Holocaust. The experience of escaping it influenced Eric throughout his life. After winning the Nobel Prize in 2000, he used his influence to press the Austrian government to recognize the suffering of the Jewish community during the Anschluss, which had been largely ignored after WWII, and to advocate for the rights of the Jewish community in the country.


After emigrating to the US, Eric finished his schooling first at a Jewish school and then attended Harvard College. There, Eric was attracted to psychoanalysis because it was imaginative, comprehensive, and empirically grounded. His attraction was reinforced by the fact that its founder, Freud, was Viennese, Jewish, and had also been forced to leave Vienna. He later enrolled at New York University, aspiring to become a psychoanalyst. In the fall of 1955, Eric decided to take an elective at Columbia University with the neurophysiologist Harry Grundfest. From then on, Eric's research career gradually shifted toward finding the biological basis of mental function.

Eric was particularly interested in the formation of memory. In 1890, William James concluded that memory must involve at least two different processes: a short-term process and a long-term process. The basic units of the brain are neurons, which are connected through synapses. Signals from one neuron are passed to the next through chemical neurotransmitters available in the synapse. One common hypothesis is that short-term memory is stored as the distribution of neurotransmitters across different synapses: a stimulus activates a spatial pattern of activity across neurons in a brain region, which depletes the neurotransmitters, and the resulting distribution of neurotransmitters forms a trace of the stimulus; this trace is the short-term memory.

The short-term memory trace decays over time as neurotransmitters are regenerated. As a result, short-term memories need to be consolidated into long-term storage. Behavioral experiments suggest this happens through repetition: the well-known “practice makes perfect.”

Scientists also realized the importance of the hippocampus in turning short-term memory into long-term memory, thanks to extensive research on Henry Molaison (H.M.), probably the most famous patient in the history of memory research. After an operation in which his hippocampus was removed, H.M.'s intelligence was intact, yet he lost the ability to form new memories. Beyond this vague picture, scientists had very little knowledge of the exact biochemical process of memory. It was against this background that Eric entered the field of memory research.

Most of Molaison’s two hippocampi were removed bilaterally.


The first question Eric needed to answer was how neurons adjust their connections based on environmental stimuli. Unfortunately, human brains are far too complex for thorough analysis; each human brain has about 100 billion neurons. As a result, Eric experimented on Aplysia instead, whose brain has only about 20,000 cells, making it a perfect model animal for analyzing how neurons work. In 1962, Eric joined the lab of the French scientist Ladislav Tauc, one of the few scientists working on Aplysia at the time, as a post-doc to learn about this interesting sea slug. Eric's work on Aplysia laid the foundation for understanding the mechanism of memory; indeed, Eric presented a picture of Aplysia wearing a Nobel medal at his Nobel Prize ceremony:

“Aplysia Won the Nobel Prize”

In Search of Memory

Eric and his team realized that long-term memories are formed through anatomical changes in the neurons. A single neuron has approximately 1,300 presynaptic terminals (only about 40% of which are active), with which it contacts about 25 different target cells. Through the consolidation process that creates long-term memory, both the percentage of active presynaptic terminals and their total number increase; the number of synapses changes during learning. A memory is recalled when a certain sensory stimulus triggers a “read-out” of the new state of the synapses, which has been altered by learning.

In 1953, Watson and Crick proposed the famous double-helix model of DNA, which opened up the new world of molecular biology. In the memory-research field, Louis Flexner of the University of Pennsylvania discovered that a drug inhibiting protein synthesis would disrupt long-term memory. Eric realized that the same process applies to Aplysia: long-term memory storage requires the synthesis of new proteins.

One revolutionary breakthrough in molecular biology was the realization that gene function can be regulated up and down in response to environmental signals. Inspired by this breakthrough, Eric continued to investigate the role of genes in learning and memory formation. Through research on Aplysia, Eric and his team found that long-term memory is formed by switching certain genes on and off, which increases or inhibits the growth of certain synapses.

For decades, Kandel has been studying how we create short-term and long-term memories at the molecular level. His work helps reveal the full picture of the memory-forming mechanism:

  1. Memory storage takes place in at least two stages: a short-term memory lasting minutes is converted, by a process of consolidation that requires the synthesis of new protein, into a stable, long-term memory lasting days, weeks, or even longer. 
  2. A single stimulus strengthens the synapse through the depletion of neurotransmitters, which forms the short-term memory.
  3. Repeated stimulation switches on certain genes and causes the growth of new synapses, which creates long-term memory.
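
The two-stage picture can be caricatured in a toy simulation (my own illustration, not from the book, with made-up parameters): each stimulus tops up a short-term trace that decays over time, and only repeated stimulation in quick succession pushes the trace over a consolidation threshold, "switching on" a long-term memory that no longer fades.

```python
# Toy model of the two-stage memory mechanism (illustrative numbers only):
# a decaying short-term trace, plus a consolidation threshold that stands
# in for the gene switch triggered by repeated stimulation.
def simulate(stimulus_times, horizon, decay=0.7, threshold=2.5):
    short_term, long_term = 0.0, False
    for t in range(horizon):
        short_term *= decay          # trace fades as transmitters regenerate
        if t in stimulus_times:
            short_term += 1.0        # a stimulus strengthens the synapse
        if short_term > threshold:
            long_term = True         # repetition -> consolidation
    return short_term, long_term

# A single stimulus: a trace forms, then fades; no long-term memory.
print(simulate({0}, 10))
# Repeated stimulation in quick succession: consolidation occurs.
print(simulate({0, 1, 2, 3, 4}, 10))
```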

Eric's journey from Austrian refugee to Nobel laureate is a great example of how the tolerant and open environment of America can release boundless energy from immigrants like him and inspire them to think in new ways. In contrast, Vienna, once a center of art and science, lost its glory under the suppressive occupation of the Nazis. Nearly a century later, his experience still matters to us.


What can modern business leaders learn from Genghis Khan?

Genghis Khan as portrayed in a 14th-century Yuan era album;

Genghis Khan created the Mongol Empire, the biggest contiguous empire in human history. At its height, the Mongol Empire covered a land area of more than 9.15 million square miles and a population of more than 100 million. Even more surprising, the Mongol population itself was only a few million.

Why were the Mongols able to conquer the world with such a tiny population? One important reason is that Genghis Khan created a specialized organization that could leverage the most advanced technology of the time (Mongolian horses) to solve the most ambitious problem (conquering the world). 

Despite the small number of soldiers, there were a huge number of horses in the Mongol army: each Mongol soldier had 3-4 Mongolian horses at his disposal at any time. Mongolian horses had great endurance and were the most advanced military technology of the cold-weapon era. In contrast, their enemies either had no horses or only inferior ones. 

More importantly, Genghis Khan organized his soldiers in a way that leveraged the advantages of those horses to the full extent. The command structure of the Mongol army was much more flexible than that of other armies of the period: lower-level leaders had significant license to execute orders in the way they considered best. This super-flexible organization allowed Mongol armies to attack en masse, divide into smaller groups to encircle enemies and lead them into an ambush, or split into small groups to mop up a fleeing, broken army. Because they could fully leverage the mobility of their horses, a few Mongolian cavalry soldiers could easily defeat hundreds of foot soldiers.

Thanks to horses, the Mongol army could cover up to 100 miles (160 km) per day, unheard of among other armies of the time. Mongol soldiers could travel thousands of miles without stopping by rotating horses along the way. Because of this mobility, the Mongol Empire could allocate resources on a global scale to defeat every local enemy. For example, the Mongols were able to fight both the Muslim world and China at the same time, and after conquering the Muslim world, they leveraged the technology they acquired there (like the counterweight trebuchet) to destroy the Song dynasty.

Genghis Khan and his Mongol armies teach us two things:

  1. New technology requires a new form of human organization to fully leverage its power. 
  2. An organization that could leverage the new power would be able to unlock even more new opportunities.

In the past few decades, we have been creating new technologies to extend our brains. One notable technology is artificial intelligence (AI), which allows machines to make predictions and decisions autonomously. The relationship between these new AI tools and humans is similar to that between horses and Mongol soldiers.

A business would need to transform its organizational structure to fully leverage the power of AI tools.

  1. In many traditional businesses, the bottom of the organizational chart is a huge number of employees who work on operational tasks. As a result, management is based on carrots and sticks. More advanced management techniques (like motivation alignment) are reserved for strategic positions.
  2. In AI-first organizations, even junior employees will have hundreds of AI tools at their disposal, and their influence on the organization is equivalent to that of a much higher-level person in a traditional organization. Organizational management needs to be motivation-driven throughout the organization. The organization also can, and needs to, be leaner and flatter, which encourages innovation.

The hidden workforce of AI-first organizations

Proactively leveraging AI tools not only reduces cost but also unleashes new powers (as horses did for Genghis Khan's troops).

  1. The natural way to grow an organization is to hire more humans. However, more people create a communication burden and operational overhead. As an organization grows, the Return-On-Investment (ROI) of extra hiring will eventually drop below 1, which prevents the company from scaling further. 
  2. “Hiring” AI systems, in contrast, does not incur this extra overhead. What’s more, AI systems typically get smarter as more people use them. As a result, the ROI increases as the usage of the AI system increases.
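The contrast above can be sketched with a toy model. All the numbers and functional forms below are my own illustrative assumptions, not data from any study: each hire adds a fixed unit of output but pairwise communication links carry a small coordination cost, while an AI system's value grows with adoption against a roughly flat cost.

```python
# Toy model (illustrative assumptions, not real data):
# each hire produces 1 unit of output, but every pairwise
# communication link costs a small amount of productivity.
def human_roi(headcount, overhead_per_link=0.002):
    output = headcount
    links = headcount * (headcount - 1) / 2  # everyone talks to everyone
    cost = headcount + overhead_per_link * links  # salary + coordination
    return output / cost

# An AI system gets smarter with usage; assume value grows with
# the number of users while the cost stays roughly flat.
def ai_roi(users, fixed_cost=100, value_per_user=1.5):
    return (value_per_user * users) / fixed_cost

# ROI of extra hiring declines as the organization grows...
assert human_roi(10) > human_roi(100) > human_roi(1000)
# ...while the ROI of an AI system rises with adoption.
assert ai_roi(50) < ai_roi(500) < ai_roi(5000)
```

The exact constants don't matter; the point is the shape of the two curves: quadratic coordination overhead eventually swamps linear output, while usage-driven value compounds.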

The only ceiling for the scaling of an AI system is on the technical side. Currently, most commercially viable AI systems are designed for a single problem, and for most problems, AI systems haven’t reached human-level performance yet. This will be a bottleneck for the foreseeable future, but more and more AI systems will be invented as time goes by. Human + AI collaboration will be a strongly disruptive force in industries where AI solutions are available. 

Hiring more people doesn’t make the manager’s job redundant; instead, it makes it more important. Similarly, the adoption of AI systems doesn’t make their users redundant. It will increase the scope of what those users, and the whole organization, can do. Humans are tremendously flexible and can always find creative new uses for new capabilities. For example, AI may be able to help doctors diagnose basic medical conditions, but it won’t be able to replace doctors. Instead, doctors will be able to focus on more complicated medical problems. As long as humans haven’t reached immortality, there will always be new problems for doctors to solve.

We don’t want another Mongol empire that causes deaths, but we do need business growth that makes human life better. In addition to scaling the human part of the organization, every business leader should also consider where their “horses” are and how to provide organizational support so employees can use them.


  1. Wikipedia: Mongol military tactics and organization.
  2. “The Mongol Empire’s Best Weapon: The Mongolian Horse,” History on the Net, Salem Media.
English Newsletter

Conan’s Newsletter No. 16

Book of the Week: Shoe Dog

I typically recommend a book only if it is worth reading multiple times. Shoe Dog, by Nike co-founder Phil Knight, is one such book. I first read it a few months back when my manager recommended it and completed another pass on a road trip to Death Valley. Every read gives me fresh thoughts because Phil is a great storyteller, and Shoe Dog is not a typical memoir.

The book starts with a vivid description of Phil’s world trip right before he founded Nike. In the 1960s, German products dominated the American sports shoe market. As a former college track runner, Phil envisioned that Japanese running shoes would become significant competitors to German shoes. During his stop in Japan, he made contact with the Japanese shoemaker Onitsuka. When Phil returned from his world trip with a contract to distribute Onitsuka shoes in the U.S., he started his legendary Nike journey.

In the early days of Nike, cash flow was always a bottleneck for its growth. Although American venture capital was booming then, most of it was in Silicon Valley, far from Nike’s headquarters in Portland. Besides, the shoe business was not the high-growth field V.C.s were looking for. As a result, Nike had to grow through bootstrapping and bank loans. For the first five years, Phil had to keep a day job to earn Nike cash while working on Nike in the evenings and on weekends. Nike was continuously squeezed by its bankers and almost fell into bankruptcy in 1975.

Nike rode the tide of globalization. From Nike, you can see how globalization (particularly with Japan) profoundly influenced the United States in the 1970s. Nike was initially just a distributor of Onitsuka in the USA. Later, when it started selling its own shoes, it relied on loans from Nissho, a Japanese trading company.

Flying geese paradigm

The history of Nike is also a history of supply chain outsourcing from the U.S. to East Asia. Although it is the most famous sports shoe brand globally, Nike itself manufactures nothing and relies entirely on a global supply chain, which gave Nike the edge over Adidas. Nike’s supply chain was first in Japan, then moved to Taiwan and later to China. Nike is not alone; many other American companies (such as Apple and Tesla) followed the pattern.

The Japanese scholar Kaname Akamatsu came up with the concept of the flying geese paradigm for the phenomenon in which Asian countries catch up with the West like flying geese: the production of commoditized goods continuously moves from the more advanced countries to the less advanced ones in the regional hierarchy.

The flying geese paradigm is the reason there have been so many economic miracles in East Asia in the past few decades (the Japanese economic miracle, the Miracle on the Han River, the Taiwan Miracle).

Then why Asia? Some crucial reasons are the region’s social and cultural characteristics: a hard-working ethic, collectivism, and (initially) low labor cost, which are quite different from and complementary to Western culture. You can get more context on the difference in the excellent documentary American Factory.

However, the social and cultural characteristics that help these countries catch up with the West are a double-edged sword. Although the flying geese paradigm created economic miracles in those countries, it also makes them prone to the “technology snapshot trap”: a phenomenon in which a society develops involutely within a “snapshot” of outdated technology because it fails to continuously learn from outside or innovate from within.

For example, Japan developed advanced automobile and electronics industries in the 1970s but failed to lead the personal computer revolution. Korea and Taiwan picked up the semiconductor industry but missed the Internet. Recently, “involution” has also become a hot topic on Chinese social media. More and more people complain that society is starting to stagnate and that they face ever fiercer competition over limited resources.

In East Asia, the working population suffers from severe overwork (e.g., 996, karoshi) in the catch-up process. Overwork culture prevents people from learning new things and reduces fertility rates, which will drive up labor costs and reduce the competitive advantage of the society in the long run. The obedient culture in the region also reduces the diversity of ideas and disruptive innovation within it.

Companies like Nike combine the advantages of both ends, leveraging the West’s marketing and sales creativity while delivering high-quality yet cheap products through the Asian supply chain. However, this fundamentally drives tension at both ends and is the underlying reason for a couple of trade wars.

We are facing a dilemma over globalization. American people complain about the loss of manufacturing jobs, and Asian countries complain that the West captures most of the profits. We are at the crossroads of deglobalization, and the pandemic adds fuel to the process. A healthy society needs to strive for the right balance between the two cultures. Both ends should learn more from each other and take the opportunity to transform their cultures and industry structures.


Apple’s Revenge on Intel

One hot topic recently is that Apple released its new ARM chip, the M1. It is not the first time Apple has designed chips: it has successfully designed chips for its iPhones and iPads. It is also not the first time Apple has used non-Intel chips in its Mac products; Macs have used Intel chips only since 2006.

Then why is it important? In short, this is a declaration of war from Apple on Intel and a game-changer for Reduced Instruction Set Computer (RISC) designs in performance-sensitive applications.

What are Instruction Sets?

Developers use chip instruction sets to communicate with computer chips. Metaphorically chip instruction sets are similar to the alphabets of human languages.

There are only twenty-six characters in English, but more than three thousand in Chinese. Similarly, the size of chip instruction sets also varies. Reduced Instruction Set Computer (RISC) refers to building chips with a small instruction set. In contrast, Complex Instruction Set Computer (CISC) refers to building chips with an extensive instruction set. (Please see here for more descriptions.)
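To make the alphabet analogy concrete, here is a toy sketch in Python (my own illustration with hypothetical mnemonics; real x86 and ARM instructions differ) of how one complex CISC-style instruction decomposes into several simple RISC-style steps:

```python
# Toy illustration: adding a value from memory into a register.
# A CISC-style chip might offer one complex instruction that reads
# memory and adds in a single step; a RISC-style chip exposes only
# simple load/add steps, so the same work takes more instructions.
memory = {"x": 7}
registers = {"r1": 0, "r2": 35}

# CISC-style: one instruction performs a memory read plus an add.
def add_mem(dst, addr):
    registers[dst] += memory[addr]

# RISC-style: the same work split into two simple instructions.
def load(dst, addr):
    registers[dst] = memory[addr]

def add(dst, src):
    registers[dst] += registers[src]

add_mem("r2", "x")            # CISC: r2 = 35 + 7 = 42

registers["r2"] = 35          # reset, then the RISC sequence:
load("r1", "x")               # 1) load x into r1
add("r2", "r1")               # 2) add r1 into r2
assert registers["r2"] == 42  # same result, more (but simpler) steps
```

The trade-off in a nutshell: simpler instructions are easier to make fast and energy-efficient in hardware, at the cost of needing more of them per task.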

A little history

Early computer chips were all CISC and were mostly designed by Intel. In the 1980s, there was a movement to reduce the instruction set. The ARM architecture was created in this period, and the Apple–IBM–Motorola alliance built the PowerPC chips for Macintosh computers.

On the other side of the table, the Windows–Intel alliance (a.k.a. “WinTel”) kept investing heavily in CISC. The rest is history: WinTel crushed Apple computers in personal computing, and Apple had to switch to Intel chips in 2006. ARM survived only in the then-niche market of IoT devices thanks to its energy efficiency.

Then the mobile Internet era came, thanks to Apple’s iPhone release. ARM was appealing for those applications because people care about the battery life of smartphones. As a result, ARM captured 90% of the market share for mobile processors. Intel lost the mobile war because it suffered from the Innovator’s Dilemma and wasn’t willing to risk upsetting its existing CISC business.

Despite ARM’s success in mobile phones, Intel still holds the crown for applications that require high performance. Many people think this is due to CISC’s inherent superiority in high-performance computation and that Intel is therefore safe in those fields.

With the release of the M1, Apple declared this wrong. Intel maintained CISC’s advantage in high-performance applications through massive investment, and previously there was no significant player who could compete.

Except for Apple. Some early users mention that the M1’s performance could be comparable to NVIDIA’s popular 1080Ti GPU. The TensorFlow team also showed that the new M1 chips could outperform many workstations for AI applications, which have the highest computation requirements. 

What’s more, Apple has a great track record of disrupting industries. A lot of ARM manufacturers will follow Apple’s path to optimize ARM for high-performance applications, and they are eager to do so, given that the mobile phone market is saturating.

Besides, NVIDIA has agreed to acquire ARM. The merger would give both companies an edge in the age of AI. The road ahead for Intel is not rosy. Will the aging titan be able to hold its position? It’s hard to say. But one thing is sure: more competition in the field is a great thing for companies in downstream areas like cloud and AI, which benefit from increased computation power and reduced cost.

The market share of ARM in different fields

Note: There is an interesting podcast from A16z about Apple Silicon. 16 Minutes #46: Apple Silicon — A Long Game, Changing the Game

Note: Although it is very promising, please wait a few months before deciding to upgrade to Big Sur or an M1 chip if you want to use it for ML training. A lot of libraries are not compatible with the new system yet (tweet).


Conan’s Newsletter No. 14

The Market Curve


  1. The Market Curve. Mike Vernal from Sequoia points out that a great product is not sufficient for a great business; a great market is needed too. In this essay, Mike introduces the concept of the “Market Curve,” a long-tail curve describing the relationship between the number of customers and revenue per customer for a given market size. Mike divides companies into five categories (Enterprise, SMB, Prosumer, Commerce + Marketplaces, and Consumer Apps) and offers some great examples for each type.
  2. Moving upmarket and the ascent of SMB SaaS. Adam Fisher from Bessemer Venture Partners shares his thoughts on how SMB SaaS companies can move to the left side of the Market Curve. There are two broad types of go-to-market strategies. One is the customer-pull strategy, which relies on the growth of customers. The other is the bottom-up strategy, which targets individual employees or specific types of employees as entry points within an organization. Adam then shares ten best practices for SMB-focused SaaS vendors to move upmarket.
  3. To own or not to own delivery? Grocers reassess the Instacart dilemma. This article discusses the dilemma food retailers face in dealing with eCommerce platforms like Instacart. I have been relying on services like Instacart since the start of the pandemic. This weekend was the first time I did grocery shopping in a physical store, and it felt very strange (and inefficient) to me. I think grocery eCommerce will continue to be a thing after the pandemic ends.

Interesting Facts

  • How could plankton in the Cretaceous influence modern American politics? This fascinating tweet stream summarizes the formation of a “swoosh” of counties in red states that have consistently voted for the Democratic Party in past decades. In the Cretaceous, the area was a coastal shore where millions of plankton lived. As the planet cooled down, the oceans receded, but the dead bodies of plankton made the soil extra organic and more suitable for growing cotton. As a result, many African-Americans whose ancestors worked on cotton plantations live in the region today, and they vote for Democrats. You can also read this essay and this wiki for more descriptions of this interesting link between ancient history and modern politics.



Conan’s Newsletter No. 13

This is an overwhelming week for everyone, so I will only recommend one book — a book about a president. Nothing more and nothing less.

The book is Destiny and Power: The American Odyssey of George Herbert Walker Bush. The 41st President’s biography also gives a great view of American politics from the sixties to the end of the 20th century. Is there a better time than now to reminisce about American traditions?

Bush’s road to the White House was by no means rosy. He experienced much more failure than success throughout his career. He was defeated twice in his Senate bids (1964 and 1970), lost to Reagan in the 1980 Republican primary, and failed to win a second term. The loss of the 1992 presidential election to Bill Clinton was exceedingly hurtful for him. He called the pain “ghastly” many years after he left the White House.

No matter what happened, the 41st president held American values and traditions close to his heart. He had rivals but virtually no enemies, and he proved himself an attractive and reliable man to those who knew him. 

Nothing exemplifies this more than his handling of the government transition after the agonizing defeat. In a White House tour after the election, Bush told Clinton: “I want to tell you something. When I leave here, you are going to have no trouble from me. The campaign is over. It was tough, but I’m out of here, and I will do nothing to complicate your work, and I just want you to know that.” He also left a letter of good wishes for Bill Clinton when he left the White House, a beautiful symbol of American values. Even without his other outstanding achievements, this letter alone would make George H.W. Bush a revered and memorable president.

The letter that George H. W. Bush left for Bill Clinton.


Conan’s Newsletter No. 12

Book of the Week

This memoir from the less famous Netflix co-founder Marc Randolph is an excellent read if you are interested in good stories about startups. Startup-founding stories are often about a group of geniuses who, with a eureka moment, create a terrific product to change the world. Those stories are beautiful but unfortunately less useful for us; the real world doesn’t work in that romantic way. Successful people are hesitant to tell us the real story, and even if they do, they may have survivorship bias. As a result, I am always interested in the stories of either invisible founding members or companies that are not home-run hits.

Marc Randolph fits that type: he was a founder of Netflix and its CEO for the first year, but he later departed and stayed outside the limelight. This book reveals many details about Netflix’s early days, including conceiving the idea and building the initial prototypes. Marc also describes a lot of his personal life from when he was working on the Netflix idea. 

The book also includes some more dramatic moments, e.g., when Reed Hastings (the more famous founder) sidelined Marc using a PowerPoint presentation, and Marc’s eventual decision to leave Netflix. Many of these moments are personal and emotional, but Marc describes what happened from his perspective objectively. He is honest about his limitations and fully respects Reed Hastings and his decisions. He sets his contribution to Netflix straight but gives Reed most of the credit for what Netflix has achieved.


GOTO 2012 • Scaling Yourself • Scott Hanselman. This video is a great and fun tutorial from Scott Hanselman about improving your focus and productivity. Although I already knew many of the concepts, I still watched it end-to-end because Scott presents them in a delightful and precise way.

One tip I find interesting is this: “Conserving your keystrokes is important … You should never write a long email to someone, anything longer than three sentences should be in Blogs/Wiki/Product document/Knowledge database, anywhere but in your email. Email is where your keystrokes die.”


  1. How to price your SaaS product. This excellent article from Patrick Campbell discusses the pricing of SaaS products. Here are my takeaways from the article:
    1. There are three critical steps for reasonable pricing.  1) Understand and quantify what value you bring to your customers. 2) Understand what your ideal customer profiles are. 3) Do user research and experiment frequently.
    2. Patrick also offers ten rapid-fire bonus tips. Here are some interesting ones to me:
      1. Revenue per customer is 30% higher when you use the proper currency symbol. 
      2. In B2B, value propositions can swing the willingness to pay ±20%. In DTC, it’s ±15%
      3. Don’t discount over 20%. Large discounts get people to convert, but they don’t stick around.
      4. Social proof is important. Case studies can boost willingness to pay by 10-15% in both B2B and DTC.
      5. Design helps boost the willingness to pay by 20%.
  2. Measuring the engagement of an open-source software community. This study from Bessemer Venture Partners discusses the metrics that are useful to measure open-source communities’ engagement. My takeaways:
    1. The authors consider the North Star metric for a project to be its unique monthly contributor activity. A contributor is any user who has created a GitHub Issue or Issue Comment, or logged a Pull Request or Commit, in a given month. 
    2. Out of the top 10,000 projects, only 2% have reached 250 monthly contributors in 6 or more months. One hundred contributors per month is a substantial milestone.
    3. The authors expect more and more companies will open-source their core technologies—for the mutual benefit that open source provides to both the community and the company—and focus on monetizing only a small portion of their user base. 
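The North Star metric above is simple to compute from raw activity data. Here is a minimal sketch (the event format and names are my own hypothetical illustration; the study does not publish code):

```python
from collections import defaultdict

# Count unique monthly contributors from a list of activity events.
# Per the study's definition, a contributor is anyone who filed an
# issue, commented on an issue, opened a pull request, or committed
# in a given month. Each event is (username, "YYYY-MM", event_type).
CONTRIBUTOR_EVENTS = {"issue", "issue_comment", "pull_request", "commit"}

def monthly_contributors(events):
    by_month = defaultdict(set)  # month -> set of unique usernames
    for user, month, event_type in events:
        if event_type in CONTRIBUTOR_EVENTS:
            by_month[month].add(user)
    return {month: len(users) for month, users in by_month.items()}

events = [
    ("alice", "2020-11", "commit"),
    ("alice", "2020-11", "issue_comment"),  # same user, counted once
    ("bob",   "2020-11", "issue"),
    ("carol", "2020-11", "star"),           # not a contributor event
    ("bob",   "2020-12", "pull_request"),
]
assert monthly_contributors(events) == {"2020-11": 2, "2020-12": 1}
```

Note the use of sets: a user who commits and comments in the same month still counts only once, which is what makes the metric a count of unique contributors rather than raw activity volume.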

Other Stuff

Cecilia Chiang, an S.F. legend and the matriarch of Chinese food in America, dies at 100. Chiang’s incredible life goes beyond food and encapsulates the 20th-century history of Chinese culture in San Francisco. Chiang fled China in 1949, went first to Japan, and then came to the US to found the groundbreaking Mandarin restaurant. She changed the course of Chinese restaurants in America, introducing many dishes that became the canon of Chinese food in the United States: potstickers, hot-and-sour soup, sizzling rice soup, beggar’s chicken, and the bestseller, smoked tea duck.

Chiang is also the subject of a documentary: Soul of a Banquet (Amazon Prime Video, YouTube trailer).