Your New Cloud for AI May Be Inside a Colo | CIO
Enterprises moving their artificial intelligence projects into full-scale development are discovering that initial infrastructure choices drive escalating costs. Companies whose AI model training infrastructure is not located near their data lake pay steeper costs as data sets grow larger and AI models become more complex.
The reality is that the cloud is not a hammer that should be used to hit every AI nail. The cloud is great for experimentation, when data sets are smaller and model complexity is light. But over time, data sets and AI models grow more complex as companies seek greater accuracy from the models. Data gravity creeps in: generated data is kept on premises while AI training models remain in the cloud. This causes escalating compute and storage costs and increased latency in developer workflows.
In the IDC 2020 Cloud Pulse Survey, 84% of businesses said they were repatriating workloads from the public cloud back to on-premises infrastructure, citing data gravity, concerns about security and sovereignty, or the need for more frequent model training.
Potential headaches of DIY on-prem infrastructure
However, this repatriation can create new headaches for data science and IT teams, which must design, deploy, and manage infrastructure optimized for AI as workloads return on premises. Often the burden of platform development can fall on data science