The Evolution of Device Clouds: From Public to Private to Shared
As a Product Manager responsible for delivering administrative capabilities within the Digital.ai Testing platform, I view infrastructure through a specific lens: the balance between availability and control. My persona, the Cloud Administrator, is the gatekeeper. The administrator allocates resources, manages user permissions, and above all ensures that the resources are available when testing teams need them.
Over the years, our industry has changed a lot. We moved from the total control of physical hardware to the wild flexibility of the public cloud. But neither extreme solved the real problem. Today, we’re charting a different course, one that challenges the false binary the industry has accepted for too long.
The future is not about choosing between “locked down” and “open to all”. It’s about a third state: the Shared Instance.
Let me show you how we got here, and why this distinction matters more than most vendors want to admit.
Phase 1: The Era of “Hardware Hugging” (In-House & On-Prem)
We all remember the traditional model: the in-house device lab. USB cables snaking across desks. Swollen mobile device batteries that needed replacing every six months. Manual OS updates that consumed entire afternoons. That one iPhone 6 that somehow still worked when everything else had been retired.
There was something satisfying about the tangible control: you could literally walk over and pick up the device causing problems. But the operational burden was crushing.
For our security-conscious customers, particularly in banking and defense, this evolved into formal On-Premises and Air-Gapped solutions. In these setups, the infrastructure is fully isolated, so data remains under full control.
Devices never touch the public internet. Test data never leaves the building.
Why this emerged
Banking apps handling financial data. Healthcare apps managing patient records. Government applications with classified information. These couldn’t run on infrastructure where another company’s tests might have touched the same device hours earlier.
The industry’s answer was simple: “These devices are yours and yours alone, isolated in your environment. No one else touches them.”
This solved real needs
- Full isolation of devices and data for applications handling regulated or classified information
- Custom infrastructure configuration to match internal security, network, and compliance requirements
- Guaranteed data residency and sovereignty, with no dependency on shared public resources
- Air-gapped deployments for environments that cannot connect to the public internet at all
The problem this created
But here’s what I observed managing these environments: this solved the security problem but created a new one – economics.
- High total cost: Organizations were spending significant time, resources, and money maintaining the infrastructure.
- Slow access to new devices: I watched customers wait three months to test on the iPhone 15 after launch. The purchase had to be approved by management. Procurement had to order it. IT staff had to configure it and add it to the lab. Meanwhile, their users were already downloading their app on the latest iPhone.
On-Premises gave enterprises the security they needed, but at costs that didn’t scale, and with device coverage that couldn’t keep pace with market fragmentation.
And yet, for truly sensitive workloads, this remains valid. Air-gapped and on-premises solutions aren’t going away; they’re necessary for classified data and legally restricted scenarios. But they shouldn’t be the default answer for every testing need.
Phase 2: The SaaS Fork (Public vs. Dedicated Cloud)
As the market moved to SaaS, it split into two binary options. Neither fully serves the modern enterprise Cloud Admin who’s trying to balance security, coverage, and cost.
Option 1: The Public Cloud
Public device clouds emerged as the first solution to a problem that was becoming economically impossible: developers couldn’t afford to buy every device their users owned.
Devices dynamically allocated on a first-come, first-served basis with limited control, shared across all customers of the platform.
Why this emerged
In the early 2010s, mobile fragmentation exploded. Android launched dozens of new devices quarterly. iOS added new models annually. Testing your app manually on physical devices became economically impossible for all but the largest enterprises.
Public clouds offered a breakthrough: instant access to hundreds of devices without purchasing hardware.
Pay-as-you-go. Zero setup. Just click and test.
For startups and fast-moving development teams, this was transformative.
This solved real needs
- Eliminated the need to purchase and maintain physical device labs
- Enabled rapid, on-demand validation without setup or infrastructure planning
- Made broad device coverage economically accessible, even with limited budgets
The limitations that emerged
But as enterprises adopted these platforms, problems surfaced that I heard repeatedly in customer conversations:
- Limited control: “Our application requires specific VPN and network configurations to connect to internal environments. Public cloud devices rely on standardized network setups that don’t support this.”
- Availability issues: “When Apple releases a new iPhone, demand spikes instantly. We end up waiting in queues during critical testing windows.”
- Compliance gaps: “Our security team reviewed the architecture and rejected it. Multi-tenant public infrastructure isn’t acceptable for applications handling sensitive financial data.”
Public clouds democratized mobile testing, but they weren’t built for enterprise security and control requirements.
Option 2: The Dedicated Cloud
To solve the security gap, the industry standardized on Dedicated (Private) Cloud offerings: single-tenant environments where devices are reserved exclusively for one customer.
Devices reserved exclusively for a single customer, offering full control without the need to manage the lab infrastructure.
This solved real needs
- A private, single-tenant cloud environment dedicated to your organization
- Security and regulatory compliance through complete isolation
- Full control over devices and configurations to optimize test scenarios
- Reduced operational overhead for lab maintenance, upgrades, and IT administration

This gave enterprises the security posture they needed, but it inherited some of the economic problems of on-premises solutions.
The limitations that emerged
- Continued cost of dedicated devices, resulting in limited device diversity and incomplete test coverage.
- Restrictive device allocation, especially during peak periods such as pre-release testing or device-specific debugging, reduces testing flexibility and can lead to release delays.
The Disconnect: What Customers Actually Needed
By then, we had two extreme options:
- Public: Affordable, accessible, diverse device coverage—but security concerns and limited control.
- Dedicated: Secure, controlled, compliant—but expensive and limited device variety.
But when I sat down with customers and asked about their actual testing workflows, they described needs that didn’t fit either extreme:
“Our CI/CD pipeline runs functional tests across 50 devices every hour. We need those tests to run on our private network with our VPN configuration, execute quickly, and reuse device setups across test suites. Public clouds don’t support this, and dedicated devices are too expensive to scale.”
“We use dedicated devices for production testing with real customer data. But when it comes to development, our teams just need to validate bug fixes quickly across a wide range of devices. They don’t need dedicated resources—they need coverage.”
The pattern was clear: customers needed something between public and dedicated.
They needed:
- Shared infrastructure economics with on-demand access to devices
- Broad device and OS coverage, comparable to public clouds
- Enterprise-grade security and isolation, without exclusive device ownership
- Full network control, including VPN and site-to-site configurations
- Reliable execution of large-scale test suites, without disruptive setup or teardown between runs
The industry had framed the problem as a binary choice. That framing was wrong, and it’s where we started building something different.
Phase 3: Shared Devices in Private Clouds—The Third Way
This is where Digital.ai Testing has focused its innovation over the past few years. Not because we’re trying to be different for the sake of it, but because we listened to what customers actually needed and set out to solve it.
What this means
Devices are dynamically allocated on a first-come, first-served basis, but with greater control and flexibility than Public Devices. This model suits large test suites, workloads that require specific configurations (e.g., the same VPN configuration as dedicated devices), and non-disruptive usage.
You get the economics of shared utilization with the security posture of a private environment.
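To make the allocation model concrete, here is a minimal sketch of first-come, first-served device sharing within a private instance. All class, field, and method names here are hypothetical illustrations, not the Digital.ai Testing API; the key idea is that device configuration (such as a VPN profile) persists across runs, so tests reuse a setup instead of re-provisioning.

```python
from collections import deque
from dataclasses import dataclass
from typing import List, Optional

@dataclass
class Device:
    """Hypothetical device record; vpn_profile persists between test runs."""
    name: str
    os_version: str
    vpn_profile: Optional[str] = None

class SharedDevicePool:
    """Illustrative first-come, first-served allocator (not a real API)."""

    def __init__(self, devices: List[Device]) -> None:
        self.available = deque(devices)
        self.waiters: deque = deque()

    def acquire(self, team: str) -> Optional[Device]:
        # Devices go to the first requester; later requests queue up.
        if self.available:
            return self.available.popleft()
        self.waiters.append(team)
        return None  # caller waits until a device is released

    def release(self, device: Device) -> None:
        # Configuration (e.g., the VPN profile) is kept on release,
        # so the next test suite reuses the setup rather than
        # tearing it down and re-provisioning.
        self.available.append(device)
```

The point of the sketch is the release path: because the device stays inside one customer’s private instance, its network configuration never has to be wiped between suites, which is what makes large, non-disruptive CI/CD runs practical.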
Why this emerged
Three forces converged to make this model both possible and necessary:
1. Cloud security technology matured
Private Cloud architecture advanced to the point where we could create genuine isolation within multi-tenant infrastructure. Your devices, network configurations, and data remain completely separate from other customers, even though the underlying cloud infrastructure is shared.
This level of isolation wasn’t reliably achievable in 2015. But now, it has become standard in enterprise cloud architecture.
2. Cost Pressures Intensified
Teams wanted the same security guarantees with better utilization economics. The old answer, “because security requires it”, wasn’t satisfying anymore.
3. Testing Needs Diversified
Teams today don’t have just one type of testing; they have multiple workloads, each with different requirements:
- High-security production testing using real customer data (requires dedicated resources)
- Large-scale CI/CD functional testing with synthetic data (prioritizes scale and coverage over exclusivity)
- Broad compatibility testing across hundreds of device and OS combinations (not economically viable on dedicated devices alone)
- Day-to-day development validation of bug fixes (needs fast access and variety, not guaranteed availability)
A one-size-fits-all infrastructure, whether public or dedicated, doesn’t reflect how testing actually happens today.
What This Means for Your Testing Strategy
If you’re evaluating device cloud options today, I’d challenge you to reject the “public vs. private” framing entirely. It’s outdated and doesn’t reflect how modern testing actually works.
Instead, ground the decision on how your teams really work. To help you find those answers, here are some thoughtful questions to ask:
1. What are your actual workload requirements?
- Do all workloads handle sensitive or regulated data, or only some of them?
- Do you require guaranteed device availability at all times, or only during defined windows?
- Do your tests depend on persistent device configurations, or do they require clean environments each run?
Most teams, when they actually map this out, discover they have a mix.
2. Can you separate workloads by security requirements?
- High-security → Dedicated SaaS or On-Prem (production data, regulated or restricted applications)
- Mid-security → Shared Devices within a Private instance (functional testing, CI/CD, compatibility)
- Low-security → Public or Shared Devices within a Private instance (early development, non-sensitive validation)
The key takeaway: not every testing workload requires the highest security tier.
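The triage above can be sketched as a simple decision function. This is an illustrative sketch of the article’s framework, not a vendor tool; the attribute names are assumptions, and real evaluations would weigh more factors (data residency, budget, availability windows).

```python
def recommend_tier(handles_sensitive_data: bool,
                   needs_guaranteed_availability: bool,
                   needs_private_network: bool) -> str:
    """Map a testing workload to a deployment tier (hypothetical triage)."""
    if handles_sensitive_data or needs_guaranteed_availability:
        # Production data, regulated apps, or always-on access:
        # exclusive resources are warranted.
        return "Dedicated SaaS / On-Prem"
    if needs_private_network:
        # CI/CD and compatibility runs behind a VPN: shared devices
        # inside a private instance cover this without exclusivity.
        return "Shared Devices (Private instance)"
    # Early development and non-sensitive validation.
    return "Public or Shared Devices"
```

Running each workload through a rule like this usually confirms the takeaway: only a minority of workloads land in the top tier, and paying dedicated-tier prices for all of them is where the economics break down.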
Conclusion: Evolution, Not Revolution
The evolution from Public to Private to Shared wasn’t driven by vendor innovation for innovation’s sake. It was driven by customer needs and market forces that the industry couldn’t ignore:
Shared Devices within a Private SaaS environment emerged because organizations needed to balance economics they couldn’t escape, device diversity they couldn’t test without, and security they couldn’t afford to sacrifice.
At Digital.ai Testing, we’ve learned that the future of device cloud infrastructure isn’t about choosing a single deployment model. It’s about building architectures flexible enough to match different workloads to appropriate tiers, all within a secure, compliant environment that the Cloud Administrators can actually manage.
The real question was never “public or private?”
The real question is: “How do we give testing teams the coverage they need without exposing our data or breaking our budget?”
That’s what “Shared, Not Exposed” means. And that’s the future we’re building.
Rethink your device cloud strategy and explore how shared devices in a private SaaS environment can unlock scale without compromise.