Cloud Success Stories – Part 2

Cloud adoption has been on a trajectory of steady acceleration for several years.

At the end of 2019, our own cloud adoption research found that two-thirds of IT leaders surveyed intended to take a “cloud-first” posture for new applications and workloads.

Furthermore, while the global pandemic has disrupted technology plans and budgets, in many cases we’ve seen that organizations that had made progress toward cloud adoption were better positioned to pivot in their responses than those that hadn’t.

The pandemic has highlighted the business value of the cloud, helping organizations to sustain operations, support newly remote workers and pivot in response to new economic conditions.

In fact, we expect cloud adoption and migration to speed up as organizations rethink their workspaces and go-to-market strategies for post-pandemic recovery.

Sharing Customer Stories

Many of the organizations we work with at Softchoice have started their cloud adoption journeys, or are accelerating them, with the intent to build differentiated, next-generation product offerings on a modern infrastructure foundation.

However, each organization – and each application – is on a journey of its own. We wanted to share our experience helping 1,400+ organizations transition to the cloud and help others benefit from what they’ve learned.

This series will explore real-life stories on the journey to the cloud. In this article, we’ll look at two organizations and how Microsoft Azure helped them advance toward greater business agility.

Work Truck Solutions

The Challenge: Work Truck Solutions (WTS) wanted to refresh its technology stack to support an all-new online marketplace providing authoritative, up-to-date data on commercial trucks and vans.   

“Softchoice guided us toward the right Azure resources and helped us make a crucial upgrade that benefited our audiences” – Craig Vitt, Software Engineering Manager, Work Truck Solutions

The Journey:

  • Work Truck Solutions had deployed on Azure as early as 2012, but its new marketplace domain needed to reinforce the company’s status as a source of industry knowledge.
  • Because the new application needed to serve a national audience, unlike the company’s earlier dealer-centric offerings, WTS needed a technology stack that combined scalability with manageable costs.
  • The company partnered with Softchoice to select and implement the right Azure resources to meet its key performance indicators (KPIs), including site traffic, lead generation and conversions.
  • Using the latest technologies, Softchoice ensured that the application could be deployed to numerous environments and aligned with DevOps workflows.
  • The initial launch exceeded expectations, and as of April 2020, the marketplace had aggregated 160,000+ commercial vehicles, including data on customizable bodies, upfits and more.

Next Steps

  • WTS has embarked on a roadmap toward expanded capabilities, integrating duplicate continuous integration (CI) and testing environments so it can run sprints in parallel.
  • With Softchoice, WTS is also focused on reducing technical debt related to legacy Azure resources, finding additional cost savings, and ensuring continued scalability and growth.

Read the full case study

Lumenpulse

The Challenge: Lumenpulse needed to replace legacy infrastructure supporting its ERP systems to support scalability and digital transformation without disrupting its 24/7/365 operations.

“We’ve built a foundation for technological transformation at Lumenpulse. We’re anticipating many gains in productivity, efficiency and scalability.” – Alexandre Azevedo, IT Director, Lumenpulse

The Journey:

  • During a period of rapid expansion, Lumenpulse became concerned about the ability of its legacy ERP systems, running on end-of-life Windows Server 2008 and SQL Server 2008, to scale.
  • The company needed to transition to a future-proof ERP platform without interrupting ERP access for its fast-growing global manufacturing operations.
  • With Softchoice, Lumenpulse conducted a workload assessment of the existing environment and mapped its virtual machines to determine readiness for migration to the public cloud.
  • After making the decision to migrate to Microsoft Azure, Lumenpulse worked with Softchoice to implement an array of Azure resources to modernize its ERP and business operations.
  • The full deployment was completed on time after 10 months, after which Lumenpulse was onboarded to a fully managed cloud and end-user support arrangement with Softchoice.
  • Lumenpulse has since benefited from streamlined routine tasks and lower costs, achieved by tiering inactive data to low-cost storage and using Azure Reserved Virtual Machine Instances.

Next Steps:

  • Through the Softchoice Keystone Operations Center, Lumenpulse has 24/7/365 access to Microsoft-certified technical engineers for cloud infrastructure monitoring and escalation.
  • Under the Softchoice Cloud Solution Provider (CSP) program, they also benefit from flexible monthly billing to keep the costs of their Azure deployments under control.

Read the full case study

What’s Next for Your Cloud Journey?

We’ve covered two stories of organizations that reinvented their operations and product offerings through cloud adoption and migration.

But no cloud transition is ever fully complete. Working with a strategic managed services partner like Softchoice will help you:

  • Achieve the right mix of cloud services to meet your business needs
  • Take the risk out of cloud adoption and migration
  • Reduce and control costs in your cloud environment
  • Drive product and service innovation while maintaining security and compliance
  • Address cloud infrastructure skills gaps

Planning to migrate one or more workloads to the public cloud? 

Learn more about how we can help by exploring Softchoice Cloud Services.

Where to Find Savings in Your Cloud or Data Center Environment

This is Part 1 of our two-part series on Driving Efficiency through Infrastructure Optimization. Read Part 2, “How to Add to Your IT Environment without Adding Costs.”

For IT departments, the mandate to do more with less and get the most out of technology investments isn’t new. But today there’s much more pressure to find and seize immediate opportunities to cut costs.

In addition to rationalizing software and restructuring contracts, on-premise data center and public cloud infrastructure are two high impact areas for potential short-term savings.

There are some common challenges, however. In the cloud, lack of visibility and formal governance practices makes it harder to learn where and how to find savings. In the data center, the need to avoid new capital expenditures makes it necessary to free up existing capacity to support new projects.

In fact, cloud-consuming organizations waste an average of 30% of their cloud spend due to redundant resources (Source: RightScale). At the same time, inactive data accounts for 50% of total storage capacity, taking up valuable space (Source: NetApp).

The best options for making short-term financial impact in the infrastructure environment are:

  • Reducing cloud costs by improving management and visibility
  • Freeing up data center compute and storage capacity to avoid future costs

Below, we go deeper into each of these cost saving opportunities.

Reducing Cloud Costs through Improved Management and Visibility

Without careful management, public cloud infrastructure costs can get out of control.

Because organizations can procure and consume public cloud resources much more easily than their on-premise counterparts, losing track of workloads and the associated spend is a common problem.

Redundant resources, the absence of adequate monitoring tools and lack of control over who initiates or decommissions workloads in the cloud all contribute to over-spend.

The Flexera State of the Cloud Report for 2020 found that 79% of those surveyed cited managing costs as a top cloud challenge, second only to security.  The report also found that enterprise companies overspent their cloud budgets by 23% on average in 2019.

To right-size public cloud infrastructure and drive cost efficiency, consider the following actions:

  • Find and remove overprovisioned or idle resources: Identifying and reviewing accounts with low I/O activity helps you determine which resources could be decommissioned with minimal impact to the business (see the first sketch after this list).
  • Implement and enforce formal cloud governance: A formal cloud governance policy helps you better understand the structure of cloud costs, establish accountability and control access and decision-making around cloud resources.
  • Adopt a cloud management platform: A cloud management platform enhances visibility into your public cloud environment and supports better budget forecasting based on real-time usage. Categorizing cloud instances with metadata tags for billing, environments, applicable compliance requirements and more lets IT teams track usage and associated costs across instances, even in a hybrid or multicloud environment. IT can then augment and automate tagging using cloud-native tools for policy enforcement. Together, these measures ensure that utilization meets requirements while reducing financial risk.
  • Optimize cloud storage: As with on-premise infrastructure, automating the categorization and storage of active and inactive data into performance and capacity tiers in the cloud helps drive further efficiency.
  • Implement automated scaling: Putting automated scaling in place allows you to scale resources up when needed and back down the rest of the time, replacing the need to provision permanently for peak utilization, which is often a needless expense.
  • Use reserved versus on-demand instances: The leading public cloud providers offer discounts for reserving instances in advance for anticipated needs rather than paying higher rates for on-demand usage (see the second sketch after this list).
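
To make the idle-resource and tagging points above concrete, here is a minimal, provider-agnostic Python sketch. It assumes a hypothetical cost-and-usage export in CSV form with columns resource_id, team, monthly_cost and avg_cpu_percent, and uses average CPU as the utilization signal (low I/O or network activity works the same way). Real exports from Azure Cost Management or AWS Cost Explorer use different column names, so treat this as a sketch of the approach rather than a drop-in tool.

```python
# cost_visibility.py - a minimal, provider-agnostic sketch of tag-based cost reporting
# and idle-resource flagging. The CSV layout (resource_id, team, monthly_cost,
# avg_cpu_percent) is hypothetical; real billing exports use different column names.
import csv
from collections import defaultdict

IDLE_CPU_THRESHOLD = 5.0  # flag resources averaging under 5% CPU as idle candidates


def analyze(export_path: str) -> None:
    cost_by_team = defaultdict(float)
    untagged = []
    idle_candidates = []

    with open(export_path, newline="") as f:
        for row in csv.DictReader(f):
            cost = float(row["monthly_cost"])
            team = row["team"].strip() or "UNTAGGED"
            cost_by_team[team] += cost

            if team == "UNTAGGED":
                untagged.append(row["resource_id"])
            if float(row["avg_cpu_percent"]) < IDLE_CPU_THRESHOLD:
                idle_candidates.append((row["resource_id"], cost))

    print("Monthly spend by team tag:")
    for team, cost in sorted(cost_by_team.items(), key=lambda kv: -kv[1]):
        print(f"  {team:<12} ${cost:,.2f}")

    print(f"\n{len(untagged)} resources have no team tag (a governance gap to close).")
    print("Idle candidates to review for decommissioning:")
    for resource_id, cost in sorted(idle_candidates, key=lambda rc: -rc[1]):
        print(f"  {resource_id}  (${cost:,.2f}/month)")


if __name__ == "__main__":
    analyze("cloud_usage_export.csv")
```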
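The reserved-versus-on-demand decision is, at its core, break-even arithmetic: a reservation pays off once expected utilization exceeds the ratio of the reserved rate to the on-demand rate. The sketch below walks through that arithmetic with made-up rates; actual discounts vary by provider, region, term length and instance family.

```python
# reservation_breakeven.py - illustrative arithmetic only; the rates below are made up.

ON_DEMAND_RATE = 0.20   # $/hour, hypothetical on-demand price for one instance size
RESERVED_RATE = 0.13    # $/hour, hypothetical effective price with a 1-year reservation
HOURS_PER_MONTH = 730


def monthly_costs(expected_utilization: float) -> tuple[float, float]:
    """Return (on_demand_cost, reserved_cost) for an expected utilization between 0.0 and 1.0."""
    on_demand = ON_DEMAND_RATE * HOURS_PER_MONTH * expected_utilization
    # A reservation is billed for every hour of the term, whether or not the instance runs.
    reserved = RESERVED_RATE * HOURS_PER_MONTH
    return on_demand, reserved


if __name__ == "__main__":
    for utilization in (0.3, 0.5, 0.65, 0.8, 1.0):
        od, res = monthly_costs(utilization)
        choice = "reserve" if res < od else "stay on-demand"
        print(f"utilization {utilization:.0%}: on-demand ${od:,.0f} vs reserved ${res:,.0f} -> {choice}")
    # The break-even point is simply RESERVED_RATE / ON_DEMAND_RATE (65% with these rates).
    print(f"break-even utilization: {RESERVED_RATE / ON_DEMAND_RATE:.0%}")
```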

Looking to learn more about managing in the cloud? Get the guide

Freeing Up Data Center Resources to Avoid Costs

Compared with adding new usage-based public cloud resources, the marginal cost of continuing to operate an owned data center is often negligible. However, when capacity isn’t optimized for efficiency, the result is additional capital expenditure when the time comes to support new applications or projects.

For instance, many organizations over-provision data center hardware to avoid the problem of running short of capacity within their virtualized infrastructure. Meanwhile, inactive data stored on-premises takes up valuable storage resources that could be tapped for other initiatives.

To free up on-premise infrastructure and avoid unnecessary future spend, we recommend these steps:

  • Optimize virtual machine resources: Optimizing workload placement and right-sizing VM allocations reduces risk and capacity waste by reclaiming resources from over-sized virtual machines (VMs). At the same time, increasing VM density by rebalancing VMs helps safely meet workload requirements while avoiding resource contention.
  • Optimize on-premise storage: While not a direct cost reduction, optimizing on-premise storage allows you to extend the life of existing storage and defer capital costs. Tiering storage to the cloud automates the categorization of active and inactive data. By moving inactive data to a lower-cost cloud storage provider, you free up on-premise capacity for new projects and pay for additional storage at a lower monthly rate (see the sketch after this list).
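
To make the tiering idea above concrete, here is a minimal sketch that walks a directory tree and buckets files into active and inactive sets by last-access age, reporting how much capacity could move to a lower-cost cloud tier. The 90-day cutoff and scan path are assumptions; a production tiering solution would also handle policy, recall and compliance requirements.

```python
# tiering_report.py - a minimal sketch: bucket files by last-access age to size a tiering opportunity.
# The 90-day cutoff and scan root are assumptions; many systems mount with noatime, in which case
# modification time (st_mtime) is a more reliable proxy for activity.
import os
import time

INACTIVE_AFTER_DAYS = 90
SCAN_ROOT = "/data"  # hypothetical path to the file share or export you want to analyze


def tiering_report(root: str) -> None:
    cutoff = time.time() - INACTIVE_AFTER_DAYS * 86400
    active_bytes = inactive_bytes = 0

    for dirpath, _dirnames, filenames in os.walk(root):
        for name in filenames:
            try:
                stat = os.stat(os.path.join(dirpath, name))
            except OSError:
                continue  # skip files that disappear or can't be read during the scan
            if stat.st_atime < cutoff:
                inactive_bytes += stat.st_size
            else:
                active_bytes += stat.st_size

    total = active_bytes + inactive_bytes or 1  # avoid division by zero on an empty tree
    print(f"Active data:   {active_bytes / 1e9:,.1f} GB")
    print(f"Inactive data: {inactive_bytes / 1e9:,.1f} GB "
          f"({inactive_bytes / total:.0%} of scanned capacity is a candidate for a lower-cost tier)")


if __name__ == "__main__":
    tiering_report(SCAN_ROOT)
```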

Next Steps to Finding Cost Savings in Your Environment

Finding short-term opportunities and immediate steps to reduce infrastructure spending may require the help of an experienced and specialized solutions provider like Softchoice.

We offer the following solutions to assist organizations like yours to find and take advantage of these savings opportunities.

  • Cloud Cost Assessment: Analyze your existing public cloud workloads to uncover immediate cost-savings opportunities and improve visibility into cloud cost drivers.
  • Data Center Technology Review: Pinpoint opportunities to optimize infrastructure with the goal of freeing up existing capacity to offset future capital expenses. The review targets server, storage, virtualization, hybrid cloud, backup and file systems.
  • Cloud Data Tiering Accelerator: Identify inactive data stored on-premises that could be moved to lower-cost public cloud storage to free up on-premises capacity.

Our team of licensing and technology vendor experts is ready to help you find efficiencies wherever you are in your journey from response to recovery.

Looking for help to find and address cost savings opportunities in your IT environment?

Connect with an Expert.

Cloud Success Stories – Part 1

Multicloud has become a popular approach for organizations moving to the cloud.  

Although it isn’t practical in all business cases, RightScale finds that 84% of companies already run applications in a mix of cloud environments.[1]

In the last several years, Softchoice has seen many of our customers revisiting their approach to the cloud to gain the distinct advantages of several cloud platforms. At the same time, many are looking to de-risk their cloud strategy by avoiding vendor lock-in.

But today’s measure of success in the cloud isn’t just how an organization gains in efficiency or agility. Instead, it’s how fast the cloud can drive real business transformation.  

Sharing Customer Stories

Many of our customers are turning to one or more clouds to stay true to the goal: Spend more time delivering great products and services than maintaining infrastructure.  

Nonetheless, each organization – and each application – is on a journey of its own. We wanted to share our experience helping 1,400+ organizations transition to the cloud and help others benefit from what they’ve learned.  

This series will explore real-life stories on the journey to the cloud. In this article, we’ll look at three organizations and how they integrated Google Cloud into their cloud strategies to deliver the best possible customer experiences.  

Michael Young, Vice President – Technology Strategy, Birch Hill Equity  

The Challenge: Birch Hill Equity wanted to adopt serverless architecture to the greatest extent possible to focus on delivering analytics products rather than maintaining infrastructure.  

“Multicloud certainly ties into our strategy at Birch Hill… Our number one priority is for as much of what we do as possible to be serverless.”

The Journey 

  • Birch Hill started in the public cloud with AWS, using PostgreSQL, Databricks and a data warehouse to support its then-lightweight data center needs.
  • The company’s growing portfolio of analytics products required ever-faster response times, prompting it to adopt Google BigQuery for its sub-second response times and the absence of infrastructure to maintain.
  • Google identity and secure access through OAuth met some critical security needs while allowing a small team to run data-intensive analytics workloads.  
  • Today, Google Cloud allows Birch Hill to spin up new analytics offerings quickly, while AWS supports heavy Databricks workloads on EC2 clusters along with other infrastructure components.

“We wanted to focus our time on delivering analytics products, not maintaining our cloud.”   

Next Steps: 

  • Birch Hill still faces challenges in implementing and managing effective security across multiple clouds – a common difficulty for multicloud adopters.  
  • Meanwhile, the company struggles with skills shortages in DevOps, infrastructure and architecture design, preferring to focus on expanding its analyst bench.  

Read the full conversation 

Sergei Leschinsky, Senior Director – Information Services, Polar Inc.  

The Challenge:  Polar needed to minimize delays and embrace a distributed network to support exponential growth and global expansion for its real-time bidding product for digital advertising.  

“Distributed geographies became an essential part of the Polar Platform, which is a distinct advantage of public cloud.”

The Journey: 

  • Polar started its cloud journey as an early adopter, extending some of its production workloads to the public cloud – however, the project was unsuccessful.  
  • Overcoming this early false start, the company began a second migration with AWS, favoring its industry-leading breadth of services and solutions.
  • As its product entered a period of rapid growth, Polar consolidated CDN providers and started bringing its heaviest-traffic workloads to Google Cloud. The result was a 50% savings in egress traffic.   
  • This allowed them to take advantage of geo-locations to support expansion in Europe and Australia. 
  • Today, Polar is using Google Cloud to support compute, load balancing and MySQL while AWS supports its data storage needs.  

Next Steps: 

  • Polar’s next steps in the cloud are to migrate its remaining high-traffic workloads to its Google Cloud environment.  
  • However, as a customer with a smaller footprint, the company sometimes finds it an uphill battle to get cloud providers’ attention to escalate and resolve issues.
  • They also find it difficult to navigate changes in billing structures and programs across several large, complex and innovative service providers.

“Our approach and need for public cloud today are very different than what we were trying to use it for in the past.”  

Read the full conversation.  

Norman Shi, Chief Technology Officer, Gradient.io  

The Challenge: As a startup, Gradient needed to process massive amounts of data in very short periods to support its SaaS tool ranking brand performance on Amazon’s retail platform.  

“Eventually, your application requirements will get to a stage where you require a higher level of infrastructure that offers greater scale, elasticity and processing speed.” 

The Journey: 

  • As a cloud-native company, Gradient started its journey without legacy infrastructure, allowing it to select the cloud provider or providers best able to meet its needs.
  • Although Gradient recognized the strengths of AWS, potential data-hosting conflicts with its retailer customers made it impractical for the company’s needs.
  • The company built its technology stack on Google Cloud to take advantage of its exceptional data collection and processing capabilities.  
  • Gradient also wanted to benefit from Google Cloud’s user-friendly interface and open source services like Kubernetes.  
  • Today, Gradient uses Google Cloud to power and optimize its SaaS dashboard for a fast-growing customer base.  

Next Steps:  

  • As a small but growing company in the cloud, Gradient still struggles with resource constraints and the challenge of accessing Google Cloud-specialized skills.
  • They also have some trouble tracking, managing and optimizing their cloud spend as their offering goes through a period of rapid growth.  

“These services are game-changers for any organization that wants to process terabytes and petabytes of data.”

Read the full conversation 

What’s Next for Your Cloud Journey?

“The cloud journey is not always a pleasant or complete success at first.”

We’ve covered three real-life journeys that led to successful cloud transitions. However, no cloud transition is ever fully complete. Working with a strategic managed services partner like Softchoice will help you:  

  • Achieve the right mix of cloud services to meet your business needs 
  • Take the risk out of cloud adoption and migration 
  • Optimize your cloud spending across multiple providers 
  • Balance product and service innovation with proper cloud governance  
  • Upskill your team on every aspect of the cloud 

Learn more about how we can help by exploring Softchoice Cloud Services. 

Planning to migrate one or more workloads to the public cloud? First, check out this Forrester report, 10 Facts Tech Leaders Should Know About Cloud Migration. 

 [1] RightScale 2019 State of the Cloud Report from Flexera