What Do You Know About the History of Intel, AMD, and Nvidia, the World’s Largest Processor Manufacturers? & Do You Know How Deeply Interconnected They Are?

Whether you are a tech geek, a professional gamer, or just a casual user, you have almost certainly heard of the three giant companies whose processors power nearly all of our devices.

These companies deserve much of the credit for making computers of all kinds as powerful as they are today. You could almost say their inventions brought Silicon Valley to life.

We always hear about the heated competition between these companies every time new processors or technologies are released, not to mention the debates that constantly break out among their fans about which one is best.

But have you ever wondered about the history of these companies? How did they get to where they are now? Who founded them? What ups and downs have they been through? And why are they so popular?

In this article, we’ll take you back in time through the history of the three companies, Intel, AMD, and Nvidia, share some interesting facts you may not have heard, and show how deeply interconnected they are.

1. Intel

Intel logo
Intel Headquarters. Source: Wikipedia

Before starting to talk about Intel, let’s review some of the historical events that led to electronic circuits and processors in their current form, which in turn led to the company’s creation. Let’s go back a little, specifically to 1947, when the “transistor” first appeared. It was a very small semiconductor device used in electrical circuits to switch current between two states, 1 and 0 (this binary state is the foundation of all digital electronics: current either flows or it doesn’t), so a transistor can either pass a current or block it. Today a single processor contains billions of transistors, with some server processors packing around 40 billion. The transistor as we know it was invented by three scientists: William Shockley, John Bardeen, and Walter Brattain.
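To make the 1-and-0 idea concrete, here is a toy Python sketch (our own illustration, not how real hardware is designed) that models a transistor as a simple on/off switch and combines a few of them into the logic gates processors are built from:

```python
# Toy model: a transistor as a voltage-controlled switch.
# A real transistor is an analog device; this captures only the on/off idea.
def transistor(gate: int) -> int:
    """Return 1 if the switch conducts current, 0 if it blocks it."""
    return 1 if gate else 0

# Wiring switches together yields logic gates, the building blocks of a CPU.
def and_gate(a: int, b: int) -> int:
    # Two transistors in series: current flows only if both conduct.
    return transistor(a) & transistor(b)

def not_gate(a: int) -> int:
    # An inverter: the output is the opposite of the input.
    return 1 - transistor(a)

print(and_gate(1, 1))  # 1 -> current flows
print(and_gate(1, 0))  # 0 -> current blocked
print(not_gate(0))     # 1
```

Billions of such switches, etched into silicon, are what the rest of this story is about.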

But at that time, two big problems arose. The first was that transistors had to be connected with wires, and the second was that electric circuits and processors were manufactured for a single, specific purpose; there was no processor that could handle everything. We will come back to this shortly.

In the beginning, scientists dreamed of unifying all the components of an electrical circuit, from wires to transistors and beyond, onto one electronic chip: the integrated circuit. The American physicist Robert Noyce invented the first such circuit on silicon in 1959. It works by placing a thin layer of metal on a silicon chip, then etching away the appropriate areas of the metal to form a complete electrical circuit on that chip.

But what is this Robert Noyce story?

Robert Noyce. Source: Wikipedia

Let’s go back to William Shockley, who supervised the team that invented the transistor. Several years later he decided to move to the western United States to establish his own semiconductor design laboratory, and by 1956 he had managed to gather some of the best American scientists of the time and hire them at his company.

But due to disagreements with Shockley, eight scientists who were unimpressed with his management decided to leave and start a new company together; they became known as the “Traitorous Eight,” and among them was Robert Noyce. They began looking for investors and approached a company called “Fairchild Camera and Instrument,” founded by Sherman Fairchild. He was impressed by the scientists’ vision and ambitions, and created a new division of the company called Fairchild Semiconductor.

The Traitorous Eight. © Wayne Miller

The new division was formally established in 1957, with a plan to make transistors and semiconductors out of silicon instead of germanium, the most common material in the semiconductor industry at the time. The reasoning was that silicon is inexpensive because it is extracted from sand, and silicon electronics are cheap enough that a damaged part can simply be replaced rather than repaired.

Now you know where the Silicon Valley revolution originated! Credit goes to Fairchild Semiconductor and everyone who worked there at the time.

It was within this company, in 1959, that Robert Noyce created the integrated circuit. This meant it was now possible to shrink the electrical circuits, wires, and devices that used to take up an entire room onto a small chip of silicon and a bit of copper!

Over the years, employees began to leave Fairchild and pursue other ventures, as competitors such as Motorola and Texas Instruments appeared in the semiconductor field. That’s when Noyce realized he should create his own company and start a new project. So he and his colleague Gordon Moore, another of the “Traitorous Eight,” left the company and founded the one we’ve always loved: Intel!

From left to right: Andrew Grove, Robert Noyce, and Gordon Moore. Source: Wikipedia

From here our story with Intel begins (the name is short for Integrated Electronics). Founded in 1968, it started out as a memory manufacturer, backed by millions of dollars in funding. Intel’s first hire was Andrew Grove, who had also worked at Fairchild Semiconductor!

In 1969, a Japanese company called Busicom contracted with Intel to design 12 integrated chips for its new calculators.

Remember the two problems we talked about earlier? The first was solved by the invention of the integrated circuit, but what about the second, the fact that every application still needed its own dedicated chip?

Hence the great idea of an Intel engineer named Ted Hoff, who was responsible for designing the architecture of each of those chips. Hoff suggested to Noyce a different approach: instead of 12 chips, design a single chip that could handle everything. Noyce welcomed the idea and encouraged him to pursue it. Hoff duly designed the architecture for this processor, and the last step was to turn those paper designs into silicon and manufacture the chip. To do this, Intel hired the physicist and engineer Federico Faggin, also from Fairchild Semiconductor, who successfully translated Hoff’s designs into silicon, manufactured the final processor, and delivered it to the customer in 1971.

That same year, Intel launched the 4004, the world’s first commercial single-chip microprocessor (CPU)… a complete processing unit inside a single chip that could be programmed to do almost anything! Imagine how important this invention was to our daily lives: people at the time could hardly believe that a finger-sized, $60 chip could hold the capabilities of a room full of circuits, processors, and wires, like the famous ENIAC machine.

Intel 4004 processor

From here began Intel’s series of successes in the world of processors. Many companies tried to compete, but most of them couldn’t keep up, and Intel remained the leading processor manufacturer throughout all those years. One reason for the company’s early success was its CEO Andrew Grove, widely regarded as one of the greatest and smartest CEOs in history.

Intel’s golden era arguably began in 2006 with the first release of the Core processors, which were genuinely excellent. Good, fast processors and the absence of competition from AMD at the time gave Intel outright dominance of the market, with a share of more than 80%. That dominance only grew from 2011 onward with the release of the Core i3, i5, i7, and most recently i9 processors, and it seemed clear that no one could compete with their power.

Over the years Intel has moved from success to success, still holding the lion’s share of the computer processor market and developing its chips in remarkable ways. Unfortunately, the company has had its flops as well: it failed to keep pace with technological developments in recent years, and it installed executives with business or accounting backgrounds rather than real engineering experience with processors, unlike its competitors AMD and Nvidia. Intel also famously missed the smartphone wave; it never gave that market enough attention, and its presence there is now essentially non-existent.

Still, Intel continues to make powerful processors that everyone loves, has successfully entered industries such as artificial intelligence, self-driving cars, and drones, and recently announced its first discrete graphics card, Intel Arc. It is great to see competition return after Intel’s fierce rival, AMD, dominated the processor market over the last two years. But how did AMD take control after its brutal downfall? Let’s get to the full AMD story.

2. AMD

AMD logo
Radeon Technologies Group (the graphics division of AMD) logo
AMD headquarters. Source: Wikipedia

AMD’s story is just as interesting. Let’s see how the company went from a modest processor manufacturer, through a brutal fall a few years ago that brought it to the edge of bankruptcy, to a comeback that has let it take control and, for now at least, beat Intel.

In 1969, Jerry Sanders, an electrical engineer and marketing manager at Fairchild Semiconductor, left the company with 7 colleagues to start their own company, AMD (Advanced Micro Devices), a year after Intel was founded.

Jerry Sanders

Have you noticed that the founders of Intel and AMD were all working for the same company before creating their own companies?

The company started as a second-source chip manufacturer for a number of other companies such as Fairchild and National Semiconductor (a second source is a company licensed to manufacture and sell chips originally designed by another company).

In 1975, AMD entered the processor market with its Am9080, a clone compatible with the Intel 8080, and released other processors after that. But everything changed in 1978, when Intel introduced the new x86 architecture that still powers our machines today.

In 1981, IBM released its personal computer, whose basic design our machines still follow, and decided to rely on Intel processors and the new x86 architecture, on one condition: Intel had to provide a second source for its chips, to guarantee supply if Intel’s own production was ever cut off. Hence the cooperation between Intel and AMD: the two companies signed a 10-year agreement giving each the right to use the other’s technologies and manufacture its processors.

Relations between the two companies remained good until 1984, when Intel noticed that AMD was able to make processors faster and better than its own. On top of that, Intel released its powerful new i386 processor and decided to withhold its new technologies and processor designs from AMD, in violation of the agreement between them. From here the war between the two companies began; the dispute went to court, and AMD won the case after several years.

In fact, the conflict extended into many more lawsuits over several years, and although AMD won most if not all of them, the fight hurt the company: it drained a great deal of money and was one of the causes of AMD’s downfall, while Intel managed to turn the drawn-out litigation to its advantage even as it lost the cases. But let us first go through AMD’s achievements over the years; we will then discuss the reasons for its fall, its rise again, and its dominance of the market.

Despite the breakdown of the agreement, every time AMD designed a processor similar to an Intel one using reverse engineering (that is, studying how the processor works, then cloning it and building an equivalent design), the AMD version ended up faster than its Intel counterpart. This approach brought the company to $2 billion in sales in 1994, until it decided for the first time to design its own processor from scratch, without cloning Intel, and called it the K5.

AMD K5 processor

Unfortunately, this processor came too late, after the Pentium had already taken over the entire market, and a number of problems in the chip itself led to poor sales. AMD realized it could not confront the Pentium with its existing capabilities, so it bought a company called NexGen, which was making Pentium-class competitors. After the acquisition, AMD released its new K6 processor, which proved a resounding success.

That was AMD’s golden period, especially 1999, when the company released the famous Athlon processor, at the time the fastest x86 processor in the world. And it didn’t stop there: in 2003 it released the world’s first 64-bit x86 processor, based on an architecture it had first announced in 1999 (all modern computers and phones now run on 64-bit architectures, meaning the processor can handle larger numbers and more memory than older 32-bit designs). Much of the credit for these processors goes to an engineer named Jim Keller, the lead architect of the 64-bit K8 processor, who years later returned to the company to lead the development of the new Ryzen processors before leaving once again.
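A quick back-of-the-envelope illustration of what that jump in word size means (our own example, not tied to any specific AMD chip): the largest value a register can hold, and the memory a pointer can address, grow enormously.

```python
# Illustrative only: the largest unsigned integer an n-bit register can hold.
max_32 = 2**32 - 1  # 4,294,967,295 (about 4.3 billion)
max_64 = 2**64 - 1  # 18,446,744,073,709,551,615 (about 18 quintillion)

print(f"32-bit max value: {max_32:,}")
print(f"64-bit max value: {max_64:,}")

# The same limit caps addressable memory: a 32-bit pointer can reach
# at most 4 GiB of RAM, a ceiling that 64-bit processors effectively removed.
print(f"32-bit addressable memory: {2**32 // 2**30} GiB")
```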

Jim Keller

In 2006, AMD purchased ATI, a company that designs graphics processors, which meant AMD was now competing with two companies instead of one: Intel in central processors and Nvidia in graphics processors. For most of the years that followed, the competition between AMD and Nvidia was fierce, unlike in CPUs, where Intel always held the largest share of the market.

But all this changed in 2006, when Intel released the Core processors, which had no real competitor, and from here the problems that led to AMD’s later collapse began to appear. So what were the reasons for that collapse?

In fact, there are many reasons. First, the lawsuits with Intel drained the company’s money badly. Second, AMD seriously mismanaged its finances: it was spending billions on advanced processor factories it didn’t really need, and it bought ATI for a huge $5.4 billion. Third, there was broad mismanagement inside the company and a failure to make sound plans for the future. And fourth, one of the biggest causes of the downfall: bad products!

After the arrival of Intel’s Core processors in 2006, AMD could barely compete with them. But nothing compares to the dreadful FX processors sold from 2011 to 2016, which were weak in every sense of the word and could not keep up with even Intel’s mid-range chips, on top of serious problems and prices too high for the performance offered. On the graphics side the competition with Nvidia was closer, but the same problems dragged down AMD’s graphics processors too, and the company lost a large share of that market to Nvidia.

All this led to AMD’s fall and near-bankruptcy in 2014, when a single share of the company was worth only two dollars!
Until everything changed… That same year, a new CEO joined AMD: Lisa Su, an engineer with a PhD and more than 20 years of experience in electronics and processors.

Lisa Su

Su and a number of new managers with deep processor experience put the company back on track, refocusing on the one thing AMD had been good at since its inception: making powerful processors. That may sound easy, but building a multi-year roadmap never is.

Indeed, the company began work on Project Zen, a new processor architecture built entirely from scratch. At the end of 2016, AMD unveiled its all-new Ryzen processors, and everyone was amazed: for the first time in nearly 10 years, AMD could compete with Intel and even surpass it. What distinguished Ryzen was its very low power consumption, its high core counts, and its very low price; you could buy a Ryzen processor that outperformed Intel’s most powerful chip for literally half the price.

In 2019, the company launched the third generation of Ryzen chips, a lineup that included a 32-core desktop processor as well as the world’s first 64-core x86 server processor, and for the first time AMD managed to outperform Intel in gaming performance. It became clear that Intel had lost this battle, and to this day it is still struggling to keep up with AMD.

AMD has become the number one choice for gamers, content creators, and tech geeks, and its market share keeps growing year after year. Its share price recently reached nearly $100, and the company now makes what is arguably the most powerful processor in the world. Will the competition between Intel and AMD heat up again soon? Probably!

In graphics processors, the company finally managed in 2020 to keep pace with Nvidia after years without real competition, and there is no doubt the rivalry will intensify in 2022 when both companies launch new generations of their chips.

3. Nvidia

Nvidia logo
Nvidia headquarters

Finally, let’s talk about the company that everyone loves, especially gamers, and how it managed to revolutionize the world of graphics processors and dominate the market to this day.

But first, what exactly is a GPU? In short, a GPU is a special type of processor primarily concerned with processing graphics: rendering 2D and 3D shapes and displaying them on the screen. From a technical point of view, a graphics processor is characterized by its ability to process data in parallel (parallel processing), while a central processor handles data serially (serial processing), and each approach has its own practical applications.

An illustration of what the insides of a CPU and a GPU look like
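Here is a rough Python sketch of that mindset difference (illustrative only; NumPy merely simulates the parallel style on a CPU, while a real GPU gives every element its own hardware thread):

```python
import numpy as np

# Eight "pixels" to brighten; a real frame would have millions.
pixels = np.array([10, 20, 30, 40, 50, 60, 70, 80], dtype=np.float32)

# CPU-style serial processing: one element at a time, in order.
def brighten_serial(data):
    out = np.empty_like(data)
    for i in range(len(data)):  # each step waits for the previous one
        out[i] = data[i] * 1.5
    return out

# GPU-style parallel processing: one operation applied to all elements
# at once, since no pixel depends on any other.
def brighten_parallel(data):
    return data * 1.5

print(brighten_serial(pixels))
print(brighten_parallel(pixels))  # same result, expressed in parallel form
```

Because graphics workloads consist of millions of independent pixels and vertices, the parallel formulation is exactly what GPU hardware is built to exploit.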

In the 1970s, gaming consoles did not have “specialized” graphics processors, only silicon chips programmed to display specific shapes; for many years afterward, graphics and shapes were processed by the central processor.

In the mid-eighties, processors began to appear that could handle some graphics work on their own, without the central processor. With the emergence of operating systems with graphical interfaces, such as Windows 1.0, these chips spread, and people could buy them as cards to install in their machines. The cards could perform some tasks beyond displaying graphics on the screen, such as drawing lines and shapes without relying on the central processor.

Over the years these cards evolved: they processed shapes faster, gained the ability to handle 3D, and new companies specializing in graphics cards were established, such as Matrox, 3dfx, ATI, and of course… Nvidia!

Nvidia was founded in 1993 by the Taiwanese-American electrical engineer Jensen Huang, together with Chris Malachowsky and Curtis Priem. Interestingly enough, the reason behind Huang’s passion for processors and electronics was his current rival, AMD!

Jensen Huang

When he was at university, Huang saw in the lab a poster featuring one of AMD’s famous processors, one used in many gaming machines. He was captivated by it, and as soon as he graduated he went to work at AMD for two years, before eventually leaving to create his own company, Nvidia.

The company started as a strong competitor to the other graphics card manufacturers and became famous for its Riva TNT card, capable of ultra-fast 3D graphics processing, which helped build Nvidia’s popularity and strengthen its position in the market.

Then everything changed in 1999, when Nvidia took control of the market with the release of the GeForce 256, the first modern graphics processor (and the first marketed as a “GPU”), which took graphics hardware to a new level. It could handle complex lighting effects and “transformation” in hardware (producing a two-dimensional image on the screen from three-dimensional scenes), operations that computers had previously been able to perform only on the central processor.

Geforce 256
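To give a feel for what that “transformation” step does, here is a minimal perspective projection in Python (a deliberately simplified sketch of the idea, not GeForce code):

```python
# Map a 3-D point in camera space onto a 2-D screen (perspective projection).
def project(x: float, y: float, z: float, focal: float = 1.0):
    """Project a point onto the image plane sitting at distance `focal`."""
    if z <= 0:
        raise ValueError("point must be in front of the camera (z > 0)")
    return (focal * x / z, focal * y / z)

# The same corner of a cube lands nearer the screen centre as it recedes,
# which is exactly how 3-D scenes gain their sense of depth.
print(project(1.0, 1.0, 2.0))   # (0.5, 0.5)
print(project(1.0, 1.0, 10.0))  # (0.1, 0.1)
```

A GPU performs this kind of arithmetic for millions of vertices every frame, which is why moving it off the central processor was such a leap.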

Nvidia continued its streak of success with the announcement of the GeForce 3 in 2001, which introduced the “pixel shader,” a feature that allowed visual effects to be computed on every pixel of an image. All of this drove Nvidia’s remaining competitors out of the market, as they were unable to match its powerful, fast cards with their complex features; only ATI, which was sold to AMD in 2006, managed to keep competing. To this day, however, Nvidia holds the largest share of the graphics processor market.
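The pixel shader idea is easy to picture as a small function that runs independently for every pixel. Real shaders are written in GPU languages such as HLSL or GLSL; this Python version (a sepia-tone effect, our own example) only illustrates the per-pixel concept:

```python
# A toy "pixel shader": one function, run once per pixel, no shared state.
def sepia_shader(r: int, g: int, b: int) -> tuple:
    """Compute one output pixel from one input pixel (sepia tone)."""
    tr = min(255, int(0.393 * r + 0.769 * g + 0.189 * b))
    tg = min(255, int(0.349 * r + 0.686 * g + 0.168 * b))
    tb = min(255, int(0.272 * r + 0.534 * g + 0.131 * b))
    return tr, tg, tb

image = [[(120, 200, 80), (10, 20, 30)]]  # a tiny 1x2 "image"
shaded = [[sepia_shader(*px) for px in row] for row in image]
print(shaded)
```

Because each pixel is computed independently, a GPU can run millions of these shader invocations at once, the same parallelism described earlier.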

In recent years, the company released its RTX 2000 series (and more recently the RTX 3000), considered the most powerful graphics processors for personal computers on the market. They feature a unique technology used in new games called “ray tracing,” a method of simulating light that produces strikingly realistic, high-definition visuals and effects.
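At its heart, ray tracing fires rays from the camera into the scene and tests what they hit. Below is the single most basic such test, ray-sphere intersection, as a minimal Python sketch (real ray tracers layer bounces, materials, and shadows on top of millions of these tests per frame):

```python
# Does a ray starting at `origin`, travelling along `direction`, hit a sphere?
def ray_hits_sphere(origin, direction, center, radius):
    """Solve |origin + t*direction - center|^2 = radius^2 for t."""
    ox, oy, oz = (origin[i] - center[i] for i in range(3))
    dx, dy, dz = direction
    a = dx * dx + dy * dy + dz * dz
    b = 2 * (ox * dx + oy * dy + oz * dz)
    c = ox * ox + oy * oy + oz * oz - radius * radius
    discriminant = b * b - 4 * a * c
    return discriminant >= 0  # non-negative means the ray hits the sphere

# A ray fired straight down the z-axis hits a sphere centred at z = 5...
print(ray_hits_sphere((0, 0, 0), (0, 0, 1), (0, 0, 5), 1.0))  # True
# ...while one fired upward misses it.
print(ray_hits_sphere((0, 0, 0), (0, 1, 0), (0, 0, 5), 1.0))  # False
```

The RTX cards add dedicated hardware to accelerate exactly this kind of intersection math, which is what makes real-time ray tracing feasible in games.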

The company enjoys great respect and an excellent reputation among gamers, tech geeks, and young people in general. People always associate the names Nvidia and GeForce with power and performance, and Nvidia graphics hardware is usually the first thing people look for when building a computer or buying a laptop.

In fact, Nvidia is more than just a manufacturer of graphics cards. The company now builds processors and designs modern technologies for artificial intelligence, robotics, self-driving cars, and huge data centers and servers, and it is now close to acquiring Arm, the company that designs the processor architecture that virtually all of our phones run on!
