
I started my career at the tail end of the mainframe era (IBM ES/9000, OS/390, MVS, DB2, etc.) and I have been fortunate to watch enterprise IT procurement reinvent itself a few times since. I am not going to suggest modern IT purchasing is sexy and exciting, but “back in my day” enterprise IT procurement was legitimately boring. It was often performed without a second thought right alongside fleet vehicles and other expensive fixed assets.
In the mainframe era, computing was centralized. Hardware, software, and workloads were tightly coupled, and the vendors you could choose from were exceedingly few. Seriously, you really had only two choices, and they were both wildly expensive.
As a result, these contracts were usually large, and the pace of change was slow, but at least (your perception of) control was inherently high. You bought capacity, ran the business, and went home. It turns out that IT, finance, procurement, and users were inadvertently aligned because there really was only one way to consume the available technology.
Then came the early 1990s. The PC revolution moved computing from the data center to the desktop. Client/server architectures took off. Software was no longer monolithic; it was modular, installable, and increasingly decentralized. That changed IT procurement almost overnight. Departments and divisions began buying their own software. Licenses were purchased in bulk (often based on overly optimistic growth projections), and shelfware became routine. Deployment planning was also manual, slow, and imprecise. Asset tracking relied on spreadsheets (if you were lucky).
For the first time, enterprises faced a new problem of visibility. What was installed? What was actually used? What was compliant with policy? What is the policy? What could be rationalized? And something else happened too: users gained autonomy while corporate IT lost a little control. Procurement tried to standardize and finance attempted to forecast despite the futility. Naturally, everyone’s respective incentives began to diverge. Users wanted speed and functionality while IT wanted stability and supportability. Procurement wanted leverage and discounts while finance wanted predictability (and also discounts). The gaps between everyone widened and “experience” started to mean different things depending on who you asked.
This period gave rise to early software asset management (SAM) and deployment intelligence tools. Vendors like Lakeside Software were among those helping enterprises understand endpoint environments; not just what they owned, but what they needed. Instead of guessing about upgrade cycles or Windows migrations, organizations could use data to plan. The challenge had shifted from centralized scarcity to distributed complexity, and increasingly, to organizational misalignment.
Virtualization, SaaS, and Subscription Economics
The next phase (roughly the 2005 to 2015 window) accelerated the fragmentation. Virtualization and containers abstracted infrastructure, SaaS abstracted deployment, and cloud abstracted ownership.
Procurement models evolved. Perpetual licenses became subscriptions, CapEx shifted to OpEx, device-based licensing morphed into user-based licensing, and enterprise agreements evolved into cloud consumption commitments.
Looking back, classic shelfware was no longer the defining problem. Now we had redundant SaaS tools, underutilized subscriptions, shadow IT, poorly aligned consumption forecasting, and cloud cost overruns.
In short, visibility became far more difficult and the disconnects deepened. Users could sign up for enterprise IT services with nothing more than a credit card. Business units could adopt SaaS without permission from IT. Finance saw spend accelerate, and procurement negotiated contracts that didn’t always reflect real usage. Adding insult to injury, IT was often asked to secure and support tools it didn’t select. Everyone had dashboards, but nobody had shared truth (sound familiar?).
Marketplaces, MACC, and Ecosystem Economics
We’re now in a different era again. Cloud marketplaces have become procurement engines where enterprises increasingly purchase ecosystem software. Constructs like the Microsoft Azure Consumption Commitment (MACC) tie software procurement directly to cloud spend, which changes the incentives. Software companies (ISVs) align with those marketplaces so that customers can optimize purchases against existing commitments. Procurement, finance, and IT are pushed to coordinate.
Indirectly, software selection has now become ecosystem selection. Marketplace purchasing isn’t just a transactional convenience. It’s structural alignment. It forces finance, procurement, and IT to operate against a common consumption model.
At the same time, AI is adding another layer of volatility. AI isn’t just a feature; it’s becoming an infrastructure multiplier. Agents consume APIs, invoke services, and generate compute demand nobody forecast. AI agents require data pipelines, so poor software choices now have compounding effects. In a world of elastic cloud workloads and accelerating automation, procurement isn’t just about price or feature lists. It’s about telemetry, integration depth, ecosystem compatibility, consumption alignment, and operational intelligence.
If you think about it, the same core problem from the 1990s still exists: visibility and optimization. The difference is that the surface area (and associated risk) is exponentially larger. The organizations that reduce friction between users, IT, finance, and procurement, and operate from shared telemetry instead of assumptions, will move faster and waste less.
This is where modern endpoint intelligence platforms and marketplace-based procurement converge: operational truth alongside financial and contractual alignment. Together, they reduce the historic disconnect.
For us at Lakeside, this is also where SysTrack fits naturally. If you are buying through a marketplace and applying spend to MACC, you still need to know what is actually happening across endpoints: what is installed, what is used, what is degrading experience, and what can be optimized. Without that shared truth, you are just moving spend through a different channel. With it, marketplace procurement becomes a lever for both governance and outcomes.
Full Circle (again?)
Looking back, mainframes were centralized and controlled. Then the PC era decentralized everything and created sprawl. The cloud re-centralized infrastructure but further decentralized the services.
Today’s marketplace-driven enterprise now sits somewhere in between. We have distributed workloads running on centralized platforms, purchased through ecosystem commitments.
The enterprises that win won’t simply buy software. They’ll buy into the right ecosystem with the right data, the right telemetry, and the right partners to continuously optimize what they consume.
Procurement has evolved from purchasing capacity, to managing licenses, to governing consumption, to orchestrating ecosystems. The difference now is that the tools exist to reconnect the user experience, operational reality, and financial model into a single system of truth. We still have work to do.
If you are sitting on a MACC, the pragmatic question is not whether you should use it, but whether you are using it to drive measurable operational outcomes. If you’d like to evaluate SysTrack through the Azure Marketplace, we can help you map marketplace procurement to endpoint-level reality so commitments translate into utilization, performance, and continuous optimization.
Getting Started
If you’re exploring how to turn marketplace purchases and MACC commitments into real operational outcomes, start by grounding decisions in endpoint-level truth.
Learn how Lakeside helps organizations connect Microsoft Marketplace procurement to visibility, optimization, and measurable results, and take the next step when you’re ready.



