Bringing DevOps Control To Bear On AI Applications

Sep 1, 2019 | News

Enterprises are putting a lot of time, money, and resources behind their nascent Artificial Intelligence (AI) efforts, banking on the fact that they can automate the way applications leverage the massive amounts of customer and operational data they are keeping. The challenge is not just bringing machine learning into the datacenter. It has to fit into the workflow without impeding it. For many, that's easier said than done.

Dotscience, a startup made up of veterans from the DevOps world, came out of stealth mode this week and published a report suggesting that enterprises may not be reaping the rewards from the dollars they are putting behind their AI projects. According to the report, based on a survey of 500 IT professionals, more than 63 percent of businesses are spending anywhere from $500,000 to $10 million on AI programs, while more than 60 percent also said they are confronting challenges with the operations of these programs. Another 64.4 percent of those deploying AI in their environments found that it takes between seven and 18 months to move AI workloads from an idea into production.

There’s a need to ensure that machine learning developers can not only collaborate and produce reproducible code but also easily track models and data, trace a model from its training data back to the raw data (provenance), view relationships between parameters and metrics, and monitor models to ensure they are behaving as expected. In addition, they need to be able to attach external S3 datasets and to attach to any system, from a laptop or a GPU-powered machine to datacenter hardware and cloud instances.
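The tracking and provenance capabilities described above can be sketched generically. This is an illustrative Python sketch, not Dotscience's actual API; the `Run` class and all names in it are hypothetical. The key idea is that content-hashing the training data ties a model run back to the exact raw data it was trained on:

```python
import hashlib
import json
import time

def fingerprint(data: bytes) -> str:
    """Content hash used to tie a model back to its exact training data."""
    return hashlib.sha256(data).hexdigest()

class Run:
    """Hypothetical minimal run record: parameters, metrics, data provenance."""

    def __init__(self, name: str):
        self.record = {
            "name": name,
            "started": time.time(),
            "params": {},
            "metrics": {},
            "inputs": [],
        }

    def log_param(self, key: str, value) -> None:
        self.record["params"][key] = value

    def log_metric(self, key: str, value) -> None:
        self.record["metrics"][key] = value

    def log_input(self, label: str, data: bytes) -> None:
        # Two runs trained on identical raw data produce identical
        # fingerprints, so provenance can be traced and compared.
        self.record["inputs"].append({"label": label, "sha256": fingerprint(data)})

    def to_json(self) -> str:
        return json.dumps(self.record, indent=2)

# Example usage with made-up data and parameter names.
raw = b"col_a,col_b\n1,2\n3,4\n"
run = Run("churn-model-v1")
run.log_input("training-data", raw)
run.log_param("learning_rate", 0.01)
run.log_metric("accuracy", 0.93)
print(run.to_json())
```

A record like this is what lets teams compare parameter/metric relationships across runs and confirm that a deployed model was trained on the data they think it was.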


The end-to-end integration of AI applications with enterprise systems is essential for any business, a point also highlighted in the AI Ethics report by AI World Society (AIWS) on developing AI algorithms and data management with ethical principles and practices.