Tuesday, April 11, 2017

Tabula Rosa Systems Blog Of 4/11/17 - Where does your data center end?

Buy the books at

 www.amazon.com/author/paulbabicki
====================================================


cloudstrategymag.com
August 22, 2016
Where does your data center end?
It might seem like an odd question, as you can probably point to the physical building or buildings that house your data center(s). But does that physical installation line up with the logical concept of the “data center” held by your business?
Most businesses today have started to extend their IT footprint into the public cloud. Cloud service providers (CSPs) like AWS, Microsoft Azure, and the Google Cloud Platform have brought public cloud computing to the mainstream.
The benefits are considerable: shifting capital expenses (CAPEX) to operating expenses (OPEX), accurately tying expenses to projects, immediate fulfillment, complete automation, and more. However, the public cloud also brings significant challenges to your overall IT strategy.
The biggest IT question facing today’s business is, “How can you secure and manage assets effectively in a hybrid environment?”

Hybrid is the New Normal

You might be tempted to dismiss a hybrid environment — where you have assets in your own data center and in the public cloud — as a temporary issue. Maybe you believe that teams within the business are simply experimenting with this technology. Or that it’s only for development and test workloads. Perhaps you’ve even set out an “all in” strategy and expect to get rid of your existing data center footprint.
The fastest of these scenarios, going “all in,” will still take at least 12 to 18 months if your migration goes as planned. And when was the last project you delivered that actually went according to plan?
You will be dealing with the challenges of a hybrid environment for the foreseeable future. It’s time to develop a strong strategy and body of knowledge in dealing with them.

Stark Differences

At first glance, the challenges presented by a hybrid environment don’t appear to be too different when compared to a multi-data center deployment. This is a problem set that is well understood. As soon as businesses started to use multiple data centers and regional deployments, workflows and procedures were adjusted to handle multiple locations.
The real challenge of a hybrid environment lies in the fact that the data center and the cloud are fundamentally different environments. It’s very much a case of apples and oranges.
Without hands-on experience, it can be hard to truly grasp how substantial the difference is. Apples and oranges might not do it justice. A horse and buggy compared to a modern automobile might be a more apt analogy. Both will get you somewhere, but the automobile will do it faster and in more comfort.
The reason for this difference is code.
Everything in the public cloud is addressable via code. This not only reduces delivery times through heavy automation, but it has started a change in the culture of solution delivery.
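As a minimal sketch of what “addressable via code” means in practice, assuming the AWS SDK for Python (boto3) and already-configured credentials (the AMI ID and tag values below are placeholders), a new server is just an API call:

import boto3

# One API call replaces a ticket, an approval queue, and a manual build.
ec2 = boto3.client("ec2", region_name="us-east-1")
response = ec2.run_instances(
    ImageId="ami-0123456789abcdef0",   # placeholder AMI
    InstanceType="t3.micro",
    MinCount=1,
    MaxCount=1,
    TagSpecifications=[{
        "ResourceType": "instance",
        "Tags": [{"Key": "project", "Value": "hybrid-pilot"}],
    }],
)
print("Launched:", response["Instances"][0]["InstanceId"])
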
When everything can be written in code, teams start to approach solution delivery like modern software development. This is the core of the “DevOps” (development + operations) model. Despite the name, every team that touches solution delivery takes part; “DevSecProjBizOps” was simply too cumbersome an acronym to catch on.
The DevOps approach to solution delivery mirrors the agile software development process. Teams work together in a collaborative manner. Changes are made in small increments, tested, validated with the business, and rolled out constantly. This stands in stark contrast to traditional, waterfall methods.
In the waterfall approach, each team hands off their work to the next in line. Development creates the application, security evaluates it, operations deploys it, and the business uses it. Rarely do all players in the solution collaborate on solving problems or improving the end result.
This disconnect results in long delivery times, misaligned goals, tensions between teams, and other problems. In the context of approaching a hybrid data center, it means that one team (typically operations) has a disproportionate influence on technology and processes around the infrastructure.
It’s that undue influence that can hobble teams and innovation. That’s not to say that operations is doing a poor job. It’s just that the problem set (how to deliver modern IT solutions) is only being seen from one particular angle. That lack of diversity leads to short-sighted solutions.
When teams work together to regularly deliver incremental improvements to a solution, a different view of the infrastructure is created. Traditional maintenance windows and fallback scenarios are no longer the best way forward.
Deploying small incremental changes — even to critical systems — with strong tests and a fail forward mentality is a more effective way for teams to work together to meet business goals.
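A hedged sketch of that fail-forward loop in Python; deploy_new_version and health_check here are hypothetical stand-ins for your own deployment tooling and monitoring checks:

import time

def deploy_new_version(version: str) -> None:
    """Hypothetical: push the new build to a small slice of traffic."""
    print(f"Deploying {version} to a canary slice")

def health_check() -> bool:
    """Hypothetical: query monitoring for error-rate or latency regressions."""
    return True

def roll_out(version: str, checks: int = 3, interval_seconds: int = 60) -> None:
    """Ship a small change, watch it, and fail forward instead of rolling back."""
    deploy_new_version(version)
    for _ in range(checks):
        time.sleep(interval_seconds)
        if not health_check():
            # Fail forward: the fix rides in the next small increment,
            # rather than waiting for a maintenance window and a rollback.
            print("Health check failed; ship the fix as the next increment")
            return
    print(f"{version} looks healthy; promote to the full fleet")
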

Avoiding Split Processes

It’s simple to write out how DevOps models should work, but in reality this is a difficult culture change, and it takes time to get to a new way of working. You can make it easier, however, by taking a “data-center-out” approach.
Since the public cloud is new, or at least relatively new, territory, you can establish fresh workflows and processes there. In the data center, you have an existing set of processes and the software that implements them. This is where the majority of change is going to happen, so it makes sense to start there.
The goal is to avoid creating a set of processes and workflows for each environment. This means that the tools chosen by the various teams are going to have to support both traditional data center and cloud workflows. That in and of itself can be a challenge. When it’s not possible to find tools that natively support both environments, make sure that you can at least get data in and out of the tool, and automate it with little difficulty.
Being able to move data between tools in an automated fashion will allow you to stitch together the toolchain you need. This is a key point, because each new tool carries overhead: your teams will have to learn how to use it as well as support it. Each tool you add should be carefully evaluated for its impact as well as its upside.
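A minimal sketch of that stitching, assuming both tools expose simple REST APIs (the URLs, endpoints, and tokens below are illustrative placeholders, not real products):

import requests

EXPORT_URL = "https://dc-cmdb.example.internal/api/v1/assets"  # data center tool (placeholder)
IMPORT_URL = "https://cloud-inventory.example.com/api/assets"  # cloud-side tool (placeholder)

def sync_assets(export_token: str, import_token: str) -> int:
    """Pull asset records out of one tool and push them into the other."""
    assets = requests.get(
        EXPORT_URL,
        headers={"Authorization": f"Bearer {export_token}"},
        timeout=30,
    ).json()

    for asset in assets:
        requests.post(
            IMPORT_URL,
            json=asset,
            headers={"Authorization": f"Bearer {import_token}"},
            timeout=30,
        ).raise_for_status()
    return len(assets)

Run on a schedule, a small bridge like this keeps both views consistent without adding yet another full product for teams to learn.
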
The first step is to do an inventory of your current workflows and processes. This should include the software and tools that help enable these processes. The next step is critical to your success. You need to understand why you have these processes in place.
A data center is like any other complex system: over time it accumulates processes that are non-essential, or even detrimental, to efficient operations. This happens frequently with security controls and mitigations. As data changes and threats evolve, security measures put in place for one reason often become counterproductive or simply ineffective in new circumstances.
Planning your approach to a hybrid environment is a perfect time to re-evaluate these items and see if they are still needed or could be improved on.
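A minimal sketch of what that inventory might capture; every entry below is illustrative, and the point is recording the “why” and whether it still holds:

# Illustrative entries only; replace with your own workflows and reasons.
workflow_inventory = [
    {
        "workflow": "server provisioning",
        "tooling": "ticket queue plus manual hypervisor build",
        "why": "change-control policy written years ago",
        "still_needed": "re-evaluate",
    },
    {
        "workflow": "perimeter firewall change",
        "tooling": "weekly change-board review",
        "why": "compliance segmentation requirement",
        "still_needed": "yes",
    },
]

for item in workflow_inventory:
    print(f"{item['workflow']}: {item['why']} -> {item['still_needed']}")
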

Automate as Much as Possible 

In the public cloud portion of your hybrid environment, things should be nearly 100% automated. The goal for modernizing your data center tools and processes should be to get as close to total automation as possible. This will allow the DevOps cultural change to take effect inside the data center as well.
Waiting hours or days to provision new computing resources or storage won’t cut it when the same resources can be provisioned in seconds in the public cloud. Most of the work implementing a unified strategy for hybrid is going to be in this area.
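One hedged sketch of what a single, unified provisioning entry point could look like; the cloud path uses boto3, while provision_vm_on_premises is a hypothetical placeholder for whatever your data center virtualization tooling exposes:

import boto3

def provision_vm_on_premises(size: str) -> str:
    """Hypothetical hook into on-premises tooling (vCenter, etc.)."""
    raise NotImplementedError("wire this to your data center automation")

def provision(target: str, size: str = "t3.micro") -> str:
    """One workflow for both sides of the hybrid environment."""
    if target == "cloud":
        ec2 = boto3.client("ec2", region_name="us-east-1")
        result = ec2.run_instances(
            ImageId="ami-0123456789abcdef0",  # placeholder AMI
            InstanceType=size,
            MinCount=1,
            MaxCount=1,
        )
        return result["Instances"][0]["InstanceId"]
    if target == "datacenter":
        return provision_vm_on_premises(size)
    raise ValueError(f"unknown target: {target}")

The point is not the specific SDK; it is that requesters use one workflow and the environment-specific detail stays behind the interface.
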
If you can’t automate a particular area (capacity planning being a key example), work hard to ensure that the workflows and processes around it are clear, simple, and not a roadblock to the larger effort. For most teams this means a huge reduction in delivery times for various data center services, and for obvious reasons that’s a big challenge.
The good news here is that a lot of data centers have the technology and tools in place to accomplish these improvements. Heavy investments in virtualization for servers and storage, modern switching and network infrastructure, and flexible perimeter security defenses all support this speed of delivery. It is very much a cultural challenge.
The downside of not addressing these areas is that your data center usage will decrease, simply because it’s easier to deploy to the public cloud side of the environment. Failing to maximize the existing investment in your data centers sells the business short. A well-organized data center can continue to deliver business value throughout the current technology life cycle (typically five to seven years) if you update the processes around it to support a more collaborative way of working.

Broad Visibility

While automation may be the weak point on the data center side of the hybrid environment, visibility tends to be the challenge on the public cloud side. It’s not that you’ll have a hard time seeing the assets you deploy in the public cloud — they are, after all, only an API call away — it’s that existing tools have a hard time dealing with the volume of assets in the cloud and the pace at which they change.
Enterprise monitoring and discovery tools are built around a simple premise: most assets in the data center are deployed and stay relatively static for three to seven years. In the public cloud, that thinking goes out the window. Assets now live for minutes, hours, or longer. When they are no longer needed, they are terminated, then recreated as new assets when demand returns.
This flexibility is one of the major appeals of the cloud and one of the biggest monitoring and security challenges. It is an outcome of the culture shift to DevOps, and dealing with it properly requires a change in tooling.
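A minimal sketch of what cloud-native discovery looks like at its simplest, assuming boto3 and configured AWS credentials: snapshot the running instances on each poll and diff against the previous snapshot, rather than assuming a static inventory:

import boto3

def current_instance_ids(region: str = "us-east-1") -> set:
    """Enumerate every EC2 instance visible right now, one page at a time."""
    ec2 = boto3.client("ec2", region_name=region)
    instance_ids = set()
    paginator = ec2.get_paginator("describe_instances")
    for page in paginator.paginate():
        for reservation in page["Reservations"]:
            for instance in reservation["Instances"]:
                instance_ids.add(instance["InstanceId"])
    return instance_ids

previous_snapshot = current_instance_ids()
# ... on the next polling interval ...
current_snapshot = current_instance_ids()
print("New assets:", current_snapshot - previous_snapshot)
print("Gone since last poll:", previous_snapshot - current_snapshot)
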
If your current monitoring and security tool sets do not have native AWS, Microsoft Azure, or Google Cloud Platform functionality, it’s time to start evaluating new tools.
A new wave of cloud-first tools is taking over the enterprise market. Built with scale and flexibility in mind, these tools can easily handle traditional data center assets as well. They also treat automation as a key pillar, which simplifies the effort to integrate them into your teams’ workflows and reduces their overhead.
Remember: A good tool gives a lot more to your teams than it requires in upkeep.

Hybrid Success

Hybrid environments are here to stay. Simply trying to forklift your existing tools, workflows, and processes from the data center into a hybrid data center/public cloud environment is a recipe for disaster.
Successful adoption means working hard to enable a more collaborative model for teams. In the spirit of DevOps, this deep cultural shift means delivering value to the business in smaller increments, but at a much faster pace.
This agility not only helps reduce mistakes, but also brings a strong alignment of IT with business goals.
To accomplish this, you’ll need to start with the data center. Map out existing tools, workflows, and processes, then analyze why they are in place. Eliminate the excess and focus on automating as many of the remaining flows as possible. Where that’s not possible, make them as efficient as you can and remove roadblocks regularly.
As your public cloud deployment grows, visibility is critical. The volume of assets and the speed at which they change are difficult for existing tools to keep up with. Look at moving to monitoring and security tools that natively support these new environments.
The road to a successful deployment is difficult. The good news is that the technological challenges are straightforward and there’s a lot of guidance available. The cultural change is the hardest part, but the benefits are significant.
The investment to create a flexible hybrid environment with a matching IT culture is worth it. You’ll have teams capable of delivering modern solutions that support the business and allow you to take things to the next level.
==================================  
Good Netiquette And A Green Internet To All!  =====================================================================
Tabula Rosa Systems - Tabula Rosa Systems (TRS) is dedicated to providing Best of Breed Technology and Best of Class Professional Services to our Clients. We have a portfolio of products which we have selected for their capabilities, viability and value. TRS provides product, design, implementation and support services on all products that we represent. Additionally, TRS provides expertise in Network Analysis, eBusiness Application Profiling, ePolicy and eBusiness Troubleshooting.

We can be contacted at:

sales@tabularosa.net  or 609 818 1802.
 ===============================================================
In addition to this blog, Netiquette IQ has a website with great assets which are being added to on a regular basis. I have authored the premier book on Netiquette, “Netiquette IQ - A Comprehensive Guide to Improve, Enhance and Add Power to Your Email”. My new book, “You’re Hired! Super Charge Your Email Skills in 60 Minutes. . . And Get That Job!”, has just been published and will be followed by a trilogy of books on Netiquette for young people. You can view my profile, reviews of the book and content excerpts at:

 www.amazon.com/author/paulbabicki

Anyone who would like to review the book and have the review posted on my blog or website, please contact me at paul@netiquetteiq.com.

In addition to this blog, I maintain a radio show on BlogtalkRadio and an online newsletter via paper.li. I have established Netiquette discussion groups on LinkedIn and Yahoo. I am also a member of the International Business Etiquette and Protocol Group and Minding Manners, among others. I regularly consult for the Gerson Lehrman Group, a worldwide network of subject matter experts, and I have been contributing to the blogs Everything Email and emailmonday. My work has appeared in numerous publications and I have presented to groups such as The Breakfast Club of NJ and PSG of Mercer County, NJ.


Additionally, I am the president of Tabula Rosa Systems, a “best of breed” reseller of products for communications, email, network management software, security products and professional services.  Also, I am the president of Netiquette IQ. We are currently developing an email IQ rating system, Netiquette IQ, which promotes the fundamentals outlined in my book.

Over the past twenty-five years, I have enjoyed a dynamic and successful career and have attained an extensive background in IT and electronic communications by selling and marketing within the information technology market.
