I Remember AI

Oct 14, 2019 | News

At the 2019 GlobalFoundries Technology Conference (GTC) and the 2019 AI Hardware Summit, speakers offered some interesting insights into the capabilities of today's and tomorrow's AI-enabled smart systems and the memory technology that underlies those capabilities.

Thomas Caulfield, GF's CEO, gave an example at the GTC of how virtualized models of interrelated automobile systems could speed development of all the important aspects of vehicle safety and performance.

Caulfield also described how the company's AI-powered Fusion Design Platform can lead to better models and fewer mistakes going into production.

At the 2019 AI Hardware Summit there were talks by established companies and many AI start-ups, as well as the companies that supply them, on what it takes to build a successful AI ecosystem.  John Hennessy of Stanford University and Alphabet showed examples of tensor processing unit (TPU)-based data center AI training systems that use large amounts of high bandwidth memory (HBM).
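To see why training systems lean on HBM, a rough roofline-style estimate helps (the hardware figures below are illustrative assumptions, not numbers from Hennessy's talk): whenever a workload performs relatively few operations per byte moved, throughput is capped by memory bandwidth rather than by peak compute.

```python
# Back-of-the-envelope roofline estimate of attainable training throughput.
# All hardware figures are illustrative assumptions, not vendor specs.

PEAK_FLOPS = 100e12  # assumed accelerator peak compute: 100 TFLOP/s
HBM_BW = 900e9       # assumed HBM bandwidth: 900 GB/s
DDR_BW = 100e9       # assumed conventional DRAM bandwidth: 100 GB/s

def attainable_flops(intensity, bandwidth):
    """Roofline model: throughput = min(peak compute, intensity * bandwidth),
    where intensity is FLOPs performed per byte moved from memory."""
    return min(PEAK_FLOPS, intensity * bandwidth)

intensity = 50.0  # assumed ~50 FLOPs per byte for a training step
print(f"With HBM: {attainable_flops(intensity, HBM_BW) / 1e12:.0f} TFLOP/s")
print(f"With DDR: {attainable_flops(intensity, DDR_BW) / 1e12:.0f} TFLOP/s")
# HBM: ~45 TFLOP/s; conventional DRAM: ~5 TFLOP/s, bandwidth-bound.
```

Below that balance point, adding compute buys nothing without more memory bandwidth, which is one reason data center training chips stack HBM close to the processor.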

AI applications require fast local memory and storage to support them while minimizing data movement (the sketch below suggests why).  This is raising the capacity and performance requirements of memory for training AI models, as well as the low-power efficiency demands of AI inference in edge and endpoint applications.  According to the Michael Dukakis Institute for Leadership and Innovation (MDI), AI technology can be applied in both software and hardware, spanning datasets, algorithms, intended impacts, goals, and purposes, to enhance system performance and transparency.
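On the data-movement point: a quick estimate using commonly cited, order-of-magnitude per-operation energies (assumptions for illustration, not measurements from the summit) shows why low-power inference designs keep data local.

```python
# Rough energy comparison of arithmetic vs. data movement for inference.
# Per-operation energies are order-of-magnitude assumptions, not measurements.

E_MAC = 4e-12     # ~4 pJ: one 32-bit multiply-accumulate
E_SRAM = 10e-12   # ~10 pJ: fetch 32 bits from on-chip SRAM
E_DRAM = 640e-12  # ~640 pJ: fetch 32 bits from off-chip DRAM

ops = 1e9  # assume ~1 billion MACs, roughly one small CNN inference

print(f"compute only:       {ops * E_MAC * 1e3:.0f} mJ")
print(f"operands from SRAM: {ops * (E_MAC + E_SRAM) * 1e3:.0f} mJ")
print(f"operands from DRAM: {ops * (E_MAC + E_DRAM) * 1e3:.0f} mJ")
# Pulling every operand from off-chip DRAM costs over 100x the arithmetic
# itself, so low-power edge inference hardware favors fast local memory.
```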

The original article can be found here.