Decades of work have paid off for Nvidia. The next computer revolution is here, and the company is set to dominate its competition, according to Jefferies.
“IBM dominated in the 1950s with the mainframe computer, DEC in the mid-1960s with the transition to mini-computers, Microsoft and Intel as PCs ramped, and finally Apple and Google as cell phones became ubiquitous,” analyst Mark Lipacis wrote in a note to clients. “We believe the next tectonic shift is happening now and NVDA stands to benefit the way these aforementioned tech giants did in prior transitions.”
Nvidia has been working on its CUDA computing platform and its graphics processing unit (GPU) technology for years. Traditionally, a computer has worked serially, executing one task at a time on the central processing unit (CPU).
Shortly after GPUs were introduced in the 1990s, programmers began using them to break tasks into lots of smaller problems and solve them all at the same time on the GPU. This is called “parallel processing.”
For certain types of problems, like rendering lots of graphics elements in a video game, GPUs were far superior to the single-minded CPU. They were slower at any individual task, but could handle lots of tasks at the same time. Nvidia developed a programming platform, called CUDA, to take advantage of the way its GPUs could handle these multi-faceted problems. CUDA made it easy to break a problem into multiple parts that ran much faster on a GPU than on a traditional CPU.
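The split-and-combine pattern described above can be sketched in a few lines of code. This is an illustration only, not Nvidia's CUDA API (real CUDA programs are written in C/C++ and launch thousands of GPU threads); here Python's standard-library thread pool stands in for the parallel hardware:

```python
# Illustrative sketch of parallel processing: break one big task into
# smaller independent pieces, solve them concurrently, combine the results.
# Real CUDA code would run these pieces on thousands of GPU threads.
from concurrent.futures import ThreadPoolExecutor

def partial_sum(chunk):
    """The small sub-problem each worker solves independently."""
    return sum(x * x for x in chunk)

def parallel_sum_of_squares(data, workers=4):
    # Break the task into smaller, independent chunks...
    size = max(1, len(data) // workers)
    chunks = [data[i:i + size] for i in range(0, len(data), size)]
    # ...solve them all at the same time...
    with ThreadPoolExecutor(max_workers=workers) as pool:
        results = pool.map(partial_sum, chunks)
    # ...and combine the partial answers.
    return sum(results)

print(parallel_sum_of_squares(list(range(1000))))  # → 332833500
```

The answer matches the serial computation exactly; the point is only that the work is divided among workers that run concurrently, which is why workloads made of many independent sub-problems map so well onto GPUs.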
Fast-forward to today, when artificial intelligence and deep learning are the hot trends. Companies like Google, Tesla and Amazon are using artificial intelligence to program self-driving cars, conquer ancient board games and develop smart personal assistants. Luckily for Nvidia, artificial intelligence and deep learning workloads are perfectly suited to run on its GPUs and CUDA platform.
Jefferies thinks these two technologies give Nvidia a huge advantage over the competition.
“We see NVDA as a major beneficiary of the 4th Tectonic Shift in Computing, where serial processing (x86) architectures give way to massively parallel processing capabilities as the next wave of connected devices approach 10b units by 2022,” Jefferies said.
As tech giants build out new data centers to handle their ballooning artificial intelligence research, they often turn to Nvidia to supply the hundreds or thousands of GPUs they need. MIT Technology Review recently said Nvidia has spent around $3 billion to develop its current data center chip, and the move has paid off: the publication named Nvidia the smartest company in the world in 2017, in part because of that investment.
Nvidia has been making waves in the autonomous-car business as well. The company recently announced partnerships with Baidu, Volvo and Volkswagen to improve their self-driving car technologies, and its hardware is already used in vehicles made by Tesla, Audi and Toyota.
Cryptocurrency mining is another example of a process that runs better on GPUs. Nvidia has been raking in profits in that area too, and one Wall Street bank thinks it will be just another sector that Nvidia will come to dominate.
Investors have been rewarding Nvidia as it takes the computer world by storm. Shares of Nvidia are up 48.55% this year.
While it might take some time before Nvidia’s $87.04 billion market cap comes close to the companies that dominated the last computing revolution (Alphabet at $598.61 billion and Apple at $751.88 billion), Jefferies has faith in the company. The investment bank raised its price target to $180, up about 19% from Nvidia’s current price.