One of the most striking debuts in Pichai’s talk was the second version of Google’s custom “TPU,” or “tensor processing unit,” chips for cloud computing.
Those chips had traditionally been used for simpler types of processing, but Google now says it will make them available both for “inference,” the simpler task, such as answering a Web query, and for “training,” the more computationally intensive task of building machine-learning models.
Pichai said the chips reflect the company’s focus on being “A.I.-first,” a slogan that is replacing last year’s “mobile-first.” He also unveiled “Google.ai,” an umbrella for several efforts, including “state of the art research” and “applied” artificial intelligence.
Pichai said the TPU boards will be made available for outsiders to use through Google’s cloud computing service. He also made a point of noting that Google is making available the “great” GPU chips from Nvidia (NVDA), a reference to the “Volta” parts Nvidia announced last week.
“We want Google cloud to be the best cloud for machine learning.”
In a note this morning, Trip Chowdhry of Global Equities Research, who has written frequently on how the TPU relates to Nvidia GPUs, writes that “Google statements can safely be ignored for now” and Nvidia is in no danger.
“Our research is indicating that Google Cloud will continue to grow its GPU farm by >100% this year.”
Other analysts were captivated by other announcements at the presentation. Overall, the responses indicated that those covering the stock were highly satisfied.
Pacific Crest’s Andy Hargreaves writes that “there was nothing ground-breaking,” but that “we view the enhancements to Google Photos, the Google Assistant, Google…