IDC: Why Developing and Deploying AI Technology on Workstations Makes Sense
Today, many businesses are working hard on AI initiatives, including generative AI, that do not require a supercomputer. Indeed, a lot of AI development — and increasingly AI deployment, notably at the edge — is actually taking place on powerful workstations. Workstations offer numerous advantages for AI development and deployment. They free the AI scientist or developer from having to negotiate server time, and they provide GPU acceleration at a time when server-based GPUs remain hard to obtain in the datacenter. They are also highly affordable compared with servers, representing a smaller, one-time expense rather than a rapidly accumulating bill for a cloud instance, and they offer the reassurance that sensitive data stays securely on premises. In this way, they also relieve the scientist or developer of the anxiety of racking up costs while merely experimenting with AI models.
IDC is seeing the edge grow faster than on-premises or cloud environments as an AI deployment scenario. Here too, workstations play an increasingly vital role as AI inferencing platforms, often not even requiring GPUs but instead performing inferencing on software-optimized CPUs. The use cases for AI inferencing at the edge on workstations are growing rapidly and include AIOps, disaster response, radiology, oil and gas exploration, land management, telehealth, traffic management, manufacturing plant monitoring, and drones.
This white paper looks at the increasing role that workstations play in AI development and deployment and briefly discusses Dell's portfolio of workstations for AI, powered by NVIDIA RTX GPUs.