
Part I: Best Practices for Optimizing Cloud Usage and Costs

The funny thing about developers is that once they get hold of fancy new toys, there can be a tendency to play with them as much as possible. This may sound familiar if you are using Microsoft Azure or another public cloud for application development and have watched your costs go through the roof.

Luckily, that doesn’t have to be the case.

While you don’t want to clip your developers’ wings, striking a balance between freedom and responsibility, and making ‘cost consciousness’ part of your company culture, will go a long way toward reducing your Azure spend and better managing your cloud infrastructure.

The decision to go beyond cost and focus on optimization is also an important step in moving towards a true cloud cost management and optimization (CCMO) program.

Start by asking yourself some key questions:

  • How much is my infrastructure costing me?
  • How much money am I wasting on idle infrastructure?
  • What guardrails do I have in place?

Once you have made an honest evaluation of where you think you are, start implementing these best practices to put you on the road to immediate Azure cost savings.

Delete Unattached Disk Storage

It’s not uncommon to see thousands of dollars in unattached Disk Storage (Page Blobs) within Azure accounts.

That’s because when you delete a Virtual Machine, by default, any disks attached to the VM aren’t deleted. This feature helps prevent data loss from an unintentional VM deletion. However, after a VM is deleted, the Disk Storage remains active and you will continue to pay the full price of the disk.

Because of the dynamic nature of cloud computing, it’s easy for users to quickly spin up and spin down workloads, but that means the risk of leaving behind unattached storage is high. Check for unattached Disk Storage in your infrastructure to cut thousands of dollars from your monthly Azure bill.
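A quick sketch of how that check might look with the AzureRM PowerShell module, assuming managed disks (unmanaged disks, i.e. page blobs in storage accounts, need a separate check of each VHD’s lease status):

```powershell
# Sketch (AzureRM module): list unattached managed disks for review.
# ManagedBy is empty when no VM owns the disk.
$unattached = Get-AzureRmDisk | Where-Object { $_.ManagedBy -eq $null }
$unattached | Select-Object Name, ResourceGroupName, DiskSizeGB

# After confirming a disk is safe to remove (deletion is permanent):
# Remove-AzureRmDisk -ResourceGroupName <rg> -DiskName <name> -Force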

Xgility Pro Tip: Delete your Disk Storage if it has been unattached for two weeks. It’s unlikely the same storage will be utilized again.

Delete Aged Snapshots

Many organizations use Snapshots on Blob and Disk Storage to create point-in-time recovery points in case of data loss or disaster. However, Snapshots can accumulate quickly if not closely monitored. Individual Snapshots are not costly, but the cost can grow quickly when several are provisioned.

Additionally, users can configure settings to automatically create subsequent snapshots without scheduling older snapshots for deletion. Monitoring Snapshot cost and usage per VM will help ensure Snapshots do not get out of control.
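A minimal sketch of that monitoring step, using the AzureRM module (the 30-day cutoff is an example, not a recommendation from the original article):

```powershell
# Sketch (AzureRM module): report snapshots older than 30 days for review.
$cutoff = (Get-Date).AddDays(-30)
Get-AzureRmSnapshot |
    Where-Object { $_.TimeCreated -lt $cutoff } |
    Select-Object Name, ResourceGroupName, TimeCreated, DiskSizeGB
```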

Xgility Pro Tip: Set a baseline number of snapshots that should be retained per object. Most of the time a recovery will occur from the most recent snapshot.

Terminate Zombie Assets

Zombie assets are infrastructure components that are running in your cloud environment but not being used. For example, they could be former VMs that were never turned off. Zombie VMs can also occur when VMs fail during the launch process, or when script errors fail to deprovision them. Zombie assets can also take the form of idle Load Balancers that aren’t being used effectively, or an idle SQL Database.

Microsoft charges for these assets when they’re in a running state. They should be isolated, evaluated, and terminated if not needed. Take a backup of the asset before terminating or stopping it to ensure recovery if necessary.

Xgility Pro Tip: Identify VMs that have a Max CPU <5% over the past 30 days as a starting point for finding zombie assets.
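That 30-day Max CPU check can be sketched with the AzureRM module. This is illustrative only; `Get-AzureRmMetric` parameter names vary slightly between AzureRM.Insights versions:

```powershell
# Illustrative sketch: flag VMs whose Max CPU stayed under 5% for 30 days.
$start = (Get-Date).AddDays(-30)
foreach ($vm in Get-AzureRmVM) {
    $metric = Get-AzureRmMetric -ResourceId $vm.Id -MetricName "Percentage CPU" `
        -StartTime $start -EndTime (Get-Date) -TimeGrain 01:00:00 `
        -AggregationType Maximum -WarningAction SilentlyContinue
    $maxCpu = ($metric.Data | Measure-Object -Property Maximum -Maximum).Maximum
    if ($maxCpu -lt 5) {
        Write-Output "$($vm.Name): Max CPU under 5% - possible zombie"
    }
}
```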

Upgrade VMs to the Latest Generation

In 2014, Microsoft introduced the next generation of Azure deployment, called Azure Resource Manager (ARM), sometimes referred to as v2. This update offers functionality like resource grouping, advanced tagging, role-based access control, and templates. While prices for ARM and Azure Classic (Azure v1) are the same, the management improvements help save time.

For example, using ARM, you can easily batch deploy new VMs from a JSON template, rather than deploying them one at a time. You can tag assets so they are more easily viewable by Line of Business.
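For instance, a template deployment is a single cmdlet call (file and resource group names here are illustrative):

```powershell
# Sketch: batch-deploy VMs from an ARM (JSON) template in one operation.
New-AzureRmResourceGroupDeployment `
    -ResourceGroupName "rg-app-prod" `
    -TemplateFile ".\vm-batch.json" `
    -TemplateParameterFile ".\vm-batch.parameters.json"
```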

In addition, for some VM types there is the option to upgrade to the latest version. While the VM price points are the same, the performance improvements may enable you to run fewer VMs.

For example, upgrading a D-series VM gives you 35% faster processing and greater scalability for the same price.

Xgility Pro Tip: Migrate from Azure Classic to ARM for improved performance, additional features, and better manageability. Since both Classic and ARM assets can now be managed in the same console, you can migrate workloads at your own pace.

Rightsize VMs

Rightsizing an Infrastructure as a Service (IaaS) offering such as VMs is the cost reduction initiative with the potential for the biggest impact.

Over-provisioning a VM can lead to significantly higher costs. Without performance monitoring or cloud management tools, it’s hard to tell when assets are over- or under-provisioned.

Be sure to consider CPU, memory, disk, and network in/out utilization. Reviewing these trended metrics over time can help you reduce the size of a VM without affecting application performance. Because it’s common for VMs to be underutilized, you can reduce costs by ensuring that all VMs are the right size.

Xgility Pro Tip: Look for VMs that have an Avg CPU < 5% and Max CPU < 20% for 30 days as viable candidates for rightsizing or termination.
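A sketch of that dual threshold with the AzureRM module (illustrative; here the “Max” side is approximated as the highest hourly average, which is slightly more forgiving than true peak CPU):

```powershell
# Illustrative sketch: rightsizing candidates have Avg CPU < 5% and
# Max CPU < 20% over the trailing 30 days.
$start = (Get-Date).AddDays(-30)
foreach ($vm in Get-AzureRmVM) {
    $data = (Get-AzureRmMetric -ResourceId $vm.Id -MetricName "Percentage CPU" `
        -StartTime $start -EndTime (Get-Date) -TimeGrain 01:00:00 `
        -AggregationType Average -WarningAction SilentlyContinue).Data
    $avg = ($data | Measure-Object -Property Average -Average).Average
    $max = ($data | Measure-Object -Property Average -Maximum).Maximum
    if ($avg -lt 5 -and $max -lt 20) {
        Write-Output "$($vm.Name): rightsizing candidate"
    }
}
```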

It’s important to remember these tips are not meant to be one-time activities but ongoing processes. Watch for Part II of our blog series to learn five more tips that can help you save costs in Azure.

Source: CloudHealth, 10 Best Practices for Reducing Spend in Azure, 2018

Do you need to gain control over your Azure costs and cloud sprawl?
Learn how our Cloud Optimization Platform Service can provide visibility into your cloud spend and automate best practices to reduce your Azure costs by an average of 20%-30%.
See it in action: Request a demo or try it free for 14 days.

How effectively are you managing your Azure costs?

Cloud Management Benchmark Quiz


[Infographic] 10 Ways to Reduce Your Azure Spend

Microsoft Azure adoption is on the rise. But many organizations have learned the hard way that moving to the public cloud doesn’t always guarantee cost savings. In fact, many have seen cloud bills two to three times higher than expected.

Luckily, that doesn’t have to be the case. The first step to combating rising Microsoft Azure costs is to gain visibility across your entire organization’s cloud spend. Then, use the 10 best practices outlined in our infographic, including deleting unattached storage, deleting aged snapshots, and terminating zombie assets.

Download PDF Version


 



Azure Automation: Infrastructure as Code

Automation is Key to Getting the Most Out of the Cloud

The cloud offers many benefits to companies that move their infrastructure there or develop applications using cloud-native services, but automation is the key to making the cloud efficient and cost-effective. To fully grasp how important automation is, we have to step into the “way back” machine for a second and return to the days of on-premises servers, storage, and network gear. In the old days, IT was typically siloed into four or five teams, each with expertise in a different IT component: server, storage, network, application, and security. When a basic move, add, or change needed to be made, each team was not only consulted but also played a role in making the change happen. This made basic changes time-consuming and fairly complicated to orchestrate. Additionally, changes were typically done manually, because the siloed groups almost never created cross-functional tools or scripts. This drove up cost and slowed the pace of innovation.

Now let’s jump forward to modern times. The public cloud offers a new way to manage change for IT organizations, and it is forcing IT professionals to evolve from builders into orchestrators. In the public cloud, IT professionals can perform cross-functional changes using one common, ubiquitous console that has access to servers, storage, network, applications, and security. But that’s not the best part: you can now use a common scripting and API interface to perform all of the orchestration, and you don’t need multiple subject matter experts with compartmentalized product knowledge to participate. All of the cloud services you use can be treated as application code, which allows for direct programming and automation. Moves, adds, and changes are now developed in code and executed in seconds with a simple command. This accelerates an organization’s ability to change and allows it to innovate rapidly.

Let’s explore how you can do this with Azure services. I think you will find it enlightening.

Azure PowerShell

Azure provides a set of PowerShell cmdlets (AzureRM) that use the Azure Resource Manager model for managing, provisioning, and updating Azure resources. By default, when executing these scripts, the user is required to log into Azure from PowerShell using the Connect-AzureRmAccount cmdlet, which provides authentication and authorization.

If you plan to automate Azure management with PowerShell, the scripts should be executed under an Azure Active Directory (AAD) Service Principal rather than your own credentials.

Azure Service Principal

An Azure service principal is a security identity used by user-created apps, services, and automation tools to access specific Azure resources. Think of it as a ‘user identity’ (username and password or certificate) with a specific role, and tightly controlled permissions. A service principal should only need to do specific things, unlike a general user identity. It improves security if you only grant it the minimum permissions level needed to perform its management tasks.

Azure Automation

The following example creates a self-signed certificate, associates the certificate with an Azure Active Directory (AAD) Service Principal, and connects to Azure from PowerShell; the AAD Service Principal provides the authentication and authorization.

1. Use Connect-AzureRmAccount to log into Azure from PowerShell using an account that has contributor rights.

2. Execute the following PowerShell commands to create a self-signed certificate; the certificate will be stored in the local (server) Windows Certificate Store.
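The commands themselves did not survive formatting here; a sketch of the standard pattern, with the certificate subject and store location assumed:

```powershell
# Sketch: create a self-signed certificate in the machine store and capture
# the base64-encoded public key for use as the service principal credential.
$cert = New-SelfSignedCertificate -CertStoreLocation "cert:\LocalMachine\My" `
    -Subject "CN=infrastructure" -KeySpec KeyExchange
$KeyValue = [System.Convert]::ToBase64String($cert.GetRawCertData())
```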

3. Execute the following PowerShell command to create an Azure Service Principal.
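A sketch of this step, assuming the $KeyValue captured in Step #2 and the display name used in the article:

```powershell
# Sketch: create the service principal, using the certificate's key value
# from Step #2 as its credential.
$sp = New-AzureRmADServicePrincipal -DisplayName "infrastructure" `
    -CertValue $KeyValue -StartDate $cert.NotBefore -EndDate $cert.NotAfter
```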

Technical Note #1
The PowerShell command in Step #2 creates the self-signed certificate along with the key value ($KeyValue) of the certificate. The $KeyValue is used as the -CertValue parameter when the Azure Service Principal is created in Step #3 and assigned the display name infrastructure.

The Azure Service Principal is created in Azure Active Directory in App registrations.

4. Execute the following PowerShell command to assign Contributor rights to the Azure Service Principal.
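A sketch of the role assignment; the short wait (an assumption here) gives the new principal time to propagate through AAD first:

```powershell
# Sketch: wait for the new principal to propagate, then grant Contributor.
Start-Sleep -Seconds 20
New-AzureRmRoleAssignment -RoleDefinitionName Contributor `
    -ServicePrincipalName $sp.ApplicationId
```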

5. Execute the following PowerShell commands to retrieve the Azure Tenant Id, the Application Id of the Azure Service Principal created in Step #3, and the Thumbprint of the self-signed certificate created in Step #2.
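One way this step might look, reusing the $sp and $cert variables from the earlier sketches:

```powershell
# Sketch: gather the three values needed for certificate-based sign-in.
$TenantId      = (Get-AzureRmContext).Tenant.Id
$ApplicationId = $sp.ApplicationId
$Thumbprint    = $cert.Thumbprint
```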

Technical Note #2
The Tenant Id ($TenantId) is the Id associated with your Azure tenant, the Application Id ($ApplicationId) identifies the Azure Active Directory Service Principal created in Step #3, and the Thumbprint ($Thumbprint) comes from the certificate created in Step #2.

Connecting to Azure using a Service Principal

6. Using the variables $TenantId, $ApplicationId and $Thumbprint from Step #5 as input parameters, execute the following PowerShell command (Connect-AzureRmAccount).
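The documented certificate-based sign-in pattern looks like this:

```powershell
# Sketch: sign in as the service principal using the certificate thumbprint.
Connect-AzureRmAccount -ServicePrincipal -TenantId $TenantId `
    -ApplicationId $ApplicationId -CertificateThumbprint $Thumbprint
```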

When Connect-AzureRmAccount executes successfully, it displays the Azure context information: the Active Directory account (Account, i.e., the Service Principal), Subscription Name (SubscriptionName), Subscription Id, Active Directory tenant (Tenant Id), and Azure Environment (Environment).

Take it One Step Further

Now that the Service Principal has been created, let’s take it one step further and automate an Azure PowerShell script using a CI/CD environment.

The CI/CD environment will be Jenkins, which provides an automation server for Continuous Integration and Continuous Delivery.

Azure Network Infrastructure

An Azure Virtual Network has been provisioned with dev, stage, and prod subnets; Azure Network Security Groups are associated with each subnet.

The Jenkins server (xprovisioning) is a Windows Server 2016 Datacenter machine; jenkins-2.138.2 and the AzureRM PowerShell cmdlets have been installed on it.

Jenkins Project

A Jenkins project titled Create-Update Network Security Group is a Freestyle project configured to execute PowerShell scripts.

The Jenkins project leverages the Jenkins Credentials Binding plugin, which allows you to configure and inject credentials or text as environment variables that are stored as secrets in Jenkins.

The Jenkins project Build will execute the Azure PowerShell script Infrastructure_NSG_CreateUpdate.ps1.

 

The PowerShell script reads the Jenkins environment variables (secret text), connects to Azure from PowerShell using a Service Principal, executes a publicly available Microsoft script (from GitHub) that updates an Azure Network Security Group, and logs out of Azure.
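A hypothetical sketch of what Infrastructure_NSG_CreateUpdate.ps1 might look like; the environment variable names and the child script’s parameters are assumptions:

```powershell
# Sketch: connect using secret-text variables injected by Jenkins.
Connect-AzureRmAccount -ServicePrincipal -TenantId $env:TenantId `
    -ApplicationId $env:ApplicationId -CertificateThumbprint $env:Thumbprint

# Run Microsoft's publicly available script that applies NSG rules from a CSV
# (parameters omitted here).
.\CreateUpdateNSGFromCsv.ps1

Disconnect-AzureRmAccount
```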

Jenkins Dashboard

The Jenkins Dashboard provides access to all the Jenkins projects that have been set up for the environment; click on the Create-Update Network Security Group project.

Jenkins Project Dashboard

The Jenkins Project Dashboard allows a user to modify, build or delete the project. The following is the output of the successfully executed Jenkins project Create-Update Network Security Group.

Putting It All Together

An Azure Service Principal was created with a self-signed certificate; this provides authentication and authorization for connecting to Azure.

A Jenkins Freestyle project was created that executes the PowerShell script Infrastructure_NSG_CreateUpdate.ps1. The project also leveraged the Jenkins Credentials Binding plugin, which allowed the Tenant Id ($TenantId) and Application Id ($ApplicationId) created in Step #3, and the Thumbprint ($Thumbprint) created in Step #2, to be stored as secret text.

The PowerShell script Infrastructure_NSG_CreateUpdate.ps1 calls a PowerShell script, CreateUpdateNSGFromCsv.ps1, which updates the inbound security rules for the Azure Network Security Group prod-nsg associated with the prod-subnet.

A network security group contains security rules that allow or deny inbound network traffic to, or outbound network traffic from, several types of Azure resources. You can enable network security group flow logs to analyze network traffic to and from resources that have an associated network security group.

Network Security Group – prod-nsg

Before the execution of the PowerShell script, the network security group contained one custom inbound security rule: Priority 100, Name Allow_RDP, port 3389. This rule allowed Remote Desktop Protocol (RDP) access to virtual servers in the prod-subnet.

After the execution of the PowerShell script, additional inbound security rules were added, and the priority of the inbound security rule Allow_RDP was changed to Priority 200.

The additional inbound security rules allow inbound web traffic on ports 80 (HTTP) and 443 (HTTPS) and inbound SQL Server traffic on port 1433.
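For reference, one of those added rules could be expressed directly with AzureRM cmdlets rather than the CSV-driven script (resource group name is illustrative):

```powershell
# Sketch: add an inbound HTTP rule to prod-nsg and save the change.
$nsg = Get-AzureRmNetworkSecurityGroup -Name "prod-nsg" -ResourceGroupName "rg-prod"

Add-AzureRmNetworkSecurityRuleConfig -NetworkSecurityGroup $nsg -Name "Allow_HTTP" `
    -Priority 100 -Direction Inbound -Access Allow -Protocol Tcp `
    -SourceAddressPrefix "*" -SourcePortRange "*" `
    -DestinationAddressPrefix "*" -DestinationPortRange 80 | Out-Null

Set-AzureRmNetworkSecurityGroup -NetworkSecurityGroup $nsg
```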

Six Myths About Moving to the Cloud: What You Really Need to Know About Moving to Office 365

Most organizations that choose to move to the cloud do so because they have decided they need it for business agility and want the cost savings that come with it.

If your organization is considering Microsoft Office 365 as your first step in moving applications to hosted solutions, you may have found similar inconsistencies in your research — making it difficult to separate fact from fiction.

A common misconception about Office 365, for example, is that it is simply a version of Office accessed by a browser.

To help in your migration to the cloud, here are six myth-busting facts about Office 365.

Myth 1: Office 365 is just Office tools in the cloud, and we can only use it online.

Fact: Office 365 is the Office you already know, plus productivity tools that will help you work more efficiently.

Whether at your desk or on the go, you have the tools to focus on the work that matters to your mission. And, since Office 365 lives in the cloud, these tools stay up to date, are simple to use and manage, and are ready to work when you are.

Myth 2: If our data moves to the cloud, our organization will no longer have control over our technology.

Fact: You still have total control over technology, but your IT department won’t have to worry about constant updates.

When you move to the cloud, time spent maintaining hardware and upgrading software is significantly reduced—eliminating headaches with it. Now your IT team can focus on advancing your organization’s technology, rather than being a repair service. Plus, you will have more time to spend improving operations and launching agile initiatives.

Instead of spending more and more portions of your budget on servers for email storage and workloads, you can think strategically and support organizational managers in a much more agile fashion, quickly responding to their needs.

Myth 3: Corporate spies, cyber thieves, and governments will have access to my data if it is in the cloud.

Fact: It’s your data, not anyone else’s.

This is a top fear about the cloud among many organizations, but it is unfounded. Your IT team manages access, sets up rights and restrictions, and provides smartphone access and options. Further, your organization remains the sole owner. You retain the rights, title, and interest in the data stored in Office 365.
Visit the Microsoft Trust Center to learn how they help safeguard your data »

Myth 4: Cloud migration is too much for my organization to handle.

Fact: We’re here to help every step of the way.

When you start considering how to move petabytes of data to the cloud, it’s easy to see why some people think “going cloud” is too big a challenge for IT departments and staff, alike. We’re not going to tell you it’s simple, but you really can get Office 365 up and running quickly.
Learn more about our Office 365 Strategic Services »

Myth 5: We have to learn all new tools to manage SharePoint Online.

Fact: SharePoint Online abstracts the infrastructure, without changing anything else.

All of your hard work learning how to manage SharePoint is not lost! SharePoint Online shares the same familiar administration and management tools, whether your deployment is in the cloud, on location, or in a hybrid of the two. Although your customizations aren’t populated to the cloud, all the administrative controls remain the same.

When moving to SharePoint Online, you no longer need to bother with controlling the workload implementation—instead, your IT team can focus on controlling its configuration. With the convenient, one-time, expert-led implementation that SharePoint Online handles, your IT team can reallocate time they used to spend on implementation and can concentrate on building strong, strategic tools for the organization. SharePoint Online simply abstracts the infrastructure, enabling you to focus on the solution.

Myth 6: I have to move everything to the cloud. It is an all-or-nothing scenario.

Fact: You can move to the cloud at your own pace or use a hybrid approach.

While some early cloud supporters advocated for moving your entire organization to the cloud all at once, this isn’t a process that needs to happen overnight. Most implementations start with a hybrid approach—moving a single workload, like email, and growing from there.

The hybrid cloud creates a consistent platform that spans data centers and the cloud, simplifying IT and delivering apps and data to users on virtually any device, anywhere. It gives you control to deliver the computing power and capabilities that your organization demands and to scale up or down as needed without wasting your onsite technology investments.

As many organizations move their productivity workloads to the cloud, the path for each workload is different, and the time it takes for those migrations varies. We can help you move workloads such as file sync-and-share or email first, and then help you figure out the right long-term plan for more.

Final Thoughts

Feel free to share this with colleagues who need help separating fact from fiction when it comes to Office 365 in the cloud. It’s good to be on the same page, and you’ll save time by not having to argue about these myths.

Are you considering moving to Microsoft Office 365? If so, we can help. Explore our solutions and services for details or connect with an Xgility expert today.

The Principle of Least Privilege Access in the Cloud

The principle of least privilege states that each component should allocate sufficient privileges to accomplish its specified functions, but no more. This limits the scope of the component’s actions, which has two desirable effects: the security impact of a failure, corruption, or misuse of the component will be minimized, and the security analysis of the component will be simplified.

NIST Special Publication 800-160: Systems Security Engineering: Considerations for a Multidisciplinary Approach in the Engineering of Trustworthy Secure Systems, November 2016.

 

The principle of least privilege (POLP) has long been a best practice for computer security.  In practical application, administrative users will use regular user accounts for routine activities, and use a separate, administrative login to perform administrative functions.  POLP is a component of many different compliance programs, including FISMA, PCI, CIS, etc.

In the Windows desktop, User Access Control (UAC) performs a POLP function – blocking or requesting access for administrative privileges as needed.

 

Using “Run As Administrator” also performs a POLP function – programs run with normal privileges unless explicitly requested to run with administrator privileges.

In the Windows Server environment, Microsoft has long recommended using separate administrator and regular user accounts for system administrators. (From The Administrator Accounts Security Planning Guide, published in 1999 and no longer available from Microsoft):

 

Why You Should Not Log On To Your Computer as an Administrator

If you regularly log on to your computer as an administrator to perform common application-based tasks, you make the computer vulnerable to malicious software and other security risks because malicious software will run with the same privileges you used to log on. If you visit an Internet site or open an e-mail attachment, you can damage the computer because malicious code could be deployed that will download and execute on your computer.

If you log on as an administrator of a local computer, malicious code can, among other things, reformat your hard disk drive, delete your files, and create a new user account that has administrative privileges. If you log on as a member of the Domain Admins group, Enterprise Admins group, or Schema Admins group in the Active Directory® service, malicious code can create a new domain user account that has administrative access or put schema, configuration, or domain data at risk.

There are obstacles to adopting this practice of separate accounts with cloud services such as Office 365. Creating separate accounts for administrators requires multiple subscriptions for a single user, for example. Managing multiple accounts within a browser context can be confusing, leading to less safe usage of administrator accounts. Privileges for Office 365 are managed through users, roles, and groups within the Office 365 Admin portal.

Assigning privileges through roles allows you to limit the privileges assigned to user accounts. For example, a user account can be assigned the Reports Reader role so that the user can view the activity reports in Office 365 without being assigned any other administrative privileges. Other examples of roles available within Office 365 are below.
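A sketch of that role assignment, assuming the MSOnline PowerShell module and a hypothetical user address:

```powershell
# Sketch (MSOnline module assumed): grant a user only the Reports Reader role.
Connect-MsolService
Add-MsolRoleMember -RoleName "Reports Reader" `
    -RoleMemberEmailAddress "reports.viewer@contoso.com"
```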

Organizations may find that they need a higher level of granularity and control of administrators.  To assist in managing the security of privileged accounts, Microsoft provides Azure Active Directory (AD) Privileged Identity Management (PIM).  This is available with Azure AD Premium P2 licenses.  Adding this application to the Azure portal provides a high level of protection for privileged accounts.  Azure PIM can secure privileged accounts by requiring Azure Multi-Factor Authentication (MFA), placing time limits on the granting of privileges (like “Just-in-Time Administration” in Windows Server 2016), detecting attacks, and allowing conditional access.  These controls reduce the attack surface and help maintain the principle of least privilege for Azure AD administrator accounts.

Cloud and Office 365 administration requires a paradigm shift when compared to Windows server administration.

When our team performs security assessments, we look for no fewer than two and no more than five global (tenant) administrators.  Many customers make the mistake of giving the Power BI or Exchange administrator global admin rights when all they require for their job is administration rights to that specific workload.

 

If you have any questions or would like to schedule a security assessment for your organization, please contact us.

 

 

Related articles:

Office 365 vs. Your Information Security Program

https://www.xgility.com/office-365-vs-information-security-program/

Unraveling Office 365 Groups

https://www.xgility.com/unraveling-office-365-groups/

 

 

Author:  John Norton

Editors:  Alex Finkel and Kurt Greening

Top Azure Consulting Companies

Below is a list of what I believe are the top Azure consulting companies in the DC Metro area, including Maryland, Northern Virginia, and Washington, D.C.  The ratings are based mostly on industry insider knowledge, including factors such as satisfaction of known customers, consultant turnover, and experience with key third-party solutions from the Azure Marketplace.  Top companies can do more than just migrate virtual machines to Azure; they offer services to transform applications using the power of the cloud.

The top Azure consulting companies are active speakers in local user groups, Microsoft Gold Partners, Cloud Solution Providers, and participate in Microsoft programs such as Azure Everywhere Assessments/Pilots/POC, Go Fast, and Software Assurance Planning Days.

Azure is new and rapidly evolving.  As Microsoft customers move from installing packaged software on servers to the cloud, opportunities arise for new partners, and trusted relationships with customers and vendors may change.  You should expect this list to be updated frequently.  The list below is in no particular order (so don’t email me to complain if your company is #9, we can still be friends).

 

Booz Allen Hamilton

These guys have a pretty large Microsoft practice focused primarily in the Federal government.  Their customers include most of the intelligence community and the Internal Revenue Service (IRS).  Dan Usher is as active in the Azure technical community as he is in the SharePoint community.

 

Accenture

These guys are large, national, and prefer to work on the biggest projects.  They have a large presence in the Federal government, including DHS, and the Fortune 1000.  Our team regularly sees Accenture consultants at local Azure user groups.

 

Xgility

Xgility started as a SharePoint consulting and application development company.  Over the last few years we have recognized that the Microsoft cloud reduces the time to market and deploy technology solutions for our customers.  Recent projects include developing mobile field agent reporting apps, building a proof of concept (POC) lab for a government customer, migrating virtual machines to Azure for a large trade association, and modernizing an ecommerce application for a major insurance company to take advantage of Platform as a Service.  Xgility is a Gold partner, CSP, GSA schedule holder, and a certified small business.

 

CSRA

CSRA traditionally has been a provider of general IT program management services.  In the past they have hosted customers in their data center and implemented mostly private cloud (virtual server) environments.  Recently they have stepped up their commitment to Microsoft, and Azure specifically.  Our team has had the opportunity to work with them on past projects in the Federal government.

 

AIS

AIS, also known as Applied Information Sciences, grew out of small business status by performing mostly on government contracts.  While not as active in the user group community as some, they still have a good (mostly federal) customer base in the DC metro area.

 

Planet Technologies

Planet Technologies is a Microsoft partner headquartered in Montgomery County, MD.  They have a good presence in state and local government and also do federal work.  Patrick Curran, one of their consultants, is an active speaker in the Azure and SharePoint communities.

 

 

If you have implemented SaaS solutions like Office 365 and Salesforce, it may be time to evaluate moving the servers in your data center to a public cloud.  Have you worked with another Azure consulting company you really like?  Drop us a note.

 

 

Author:  Kurt Greening

Editor:  Alex Finkel

Office 365 – Desktop Update Best Practices

When it comes to cloud migrations, most customers dedicate their efforts to the migration, only to realize there is more work to be done.  Enterprise customers have been accustomed to Microsoft version releases every couple of years.  However, subscription-based cloud services update often.  Large enterprises managing desktops and applications often dedicate teams to test and deploy updates.  Consumers who manage their own devices and smartphones are accustomed to seeing their systems and apps update automatically and more often.

As Office 365 subscribers, organizations should be aware of subscription channels and their update frequency.  Information Technology users should be on the First Release Deferred Channel, with monthly feature updates.  The rest of the enterprise should run the Deferred Channel, which updates three times per year.  Consumers and Windows Insiders are kept current with monthly builds.

Great news!  Starting this September, the updates will move to a semi-annual model.  Microsoft announced these changes in a blog and then shared the support article with this helpful chart:

Windows 10 will also follow the same model, with releases in September and then March.  While this change can be confusing, it seems Microsoft is listening to customers who have been managing the update frenzy as they try to keep current and secure with the Microsoft cloud.  Having to deal with this twice a year for the desktop OS and Office apps sounds almost blissful.

 

Are you thinking of moving to a subscription-based model for Microsoft applications such as Word, Outlook, PowerPoint, and Excel?  If you are interested in migrating but concerned about the change, the team of experts at Xgility can help.  Want a free trial of Office 365?  Get your Office 365 E5 trial subscription here.  For a free 30-minute consultation, please contact us.

 

 

Author:  Chris Ertz

Editors:  Alex Finkel and Kurt Greening

Why an Accurate Active Directory Profile is Critical

When was the last time you updated your work profile?  In some cases the answer is never, as you may be thinking, “Isn’t that IT’s job, and didn’t I complete several HR forms when I joined the company?”  Chances are you’ve updated your LinkedIn profile more frequently than your company bio.  Many of us spend a lot of time on social media sites, including Twitter and Facebook, telling the world (or a few friends) all about ourselves.

You may want to think more about your work profile, as Microsoft just announced an updated profile experience coming soon to Office 365.  Items such as your manager, teammates, and documents can be surfaced right from your name.  Leveraging the Office Graph and the same engine behind Delve, the new profile experience makes connecting with colleagues quick and easy.

While some of this profile data is a result of your uploading and sharing of information in Office 365, your primary stats are stored in your company’s Active Directory.  With Office 365, a copy of your AD is synchronized to Azure, and for startups and small businesses, Azure AD is often the primary directory.  Either way, it is important for your profile to be complete.  In fact, many SharePoint workflows break because the approving manager is missing from the user’s profile.
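To see why completeness matters, a directory export can be audited for the attributes that approval workflows depend on.  The sketch below is a hypothetical illustration, not part of any product mentioned here: the field names mirror common Azure AD user attributes, but the input is simply a list of dictionaries, such as you might get from a CSV or JSON export of your directory.

```python
# Audit a directory export for profile attributes that workflows depend on.
# Field names (displayName, manager, mobilePhone, department) are assumptions
# modeled on common Azure AD user attributes; adapt them to your own export.

REQUIRED_FIELDS = ("manager", "mobilePhone", "department")

def find_incomplete_profiles(users, required=REQUIRED_FIELDS):
    """Return a dict mapping display name -> list of missing fields."""
    gaps = {}
    for user in users:
        # Treat None and empty strings as missing.
        missing = [field for field in required if not user.get(field)]
        if missing:
            gaps[user.get("displayName", "<unknown>")] = missing
    return gaps

if __name__ == "__main__":
    directory = [
        {"displayName": "Ann Lee", "manager": "Bob Ray",
         "mobilePhone": "555-0100", "department": "Sales"},
        {"displayName": "Cal Dow", "manager": None,
         "mobilePhone": "", "department": "IT"},
    ]
    for name, missing in find_incomplete_profiles(directory).items():
        print(f"{name} is missing: {', '.join(missing)}")
```

A report like this makes it easy to spot the users whose missing manager field would break an approval workflow before the workflow fails in production.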

Xgility has partnered with a new startup called Hyperfish, founded by a former Microsoft Product Manager and a former Nintex CEO, that helps organizations collect profile data.  We’re excited to help our customers improve their employee profile data with this service.

As the administrator, I let Hyperfish analyze my company directory and then scheduled its machine-learning HyperBot to help collect the missing profile data.  Using our company brand, we set the tone of the communications and chose the channels to communicate through.  Here it is in action:

 
At Xgility, we like to come up with new ways to help our customers increase collaboration and productivity.  For our customers, keeping Active Directory up to date may be part of skills tracking, knowledge management, or automated workflows.  If you have ever needed to text a coworker at the last minute due to a conference room change, you will appreciate this tool.  Microsoft has some work to do to make skills tracking in Active Directory easier, but we believe there are exciting things to come.  Making sure your directory and profile are complete and up to date will set the foundation for transformational results.

If you would like to see the PowerPoint presentation on this topic from SharePoint Saturday DC, you can view it here.

Try the free Active Directory analyzer and photo tool, or contact us for a free 30-minute demo.

 

 

Author:  Chris Ertz

Editors:  Kurt Greening and Alex Finkel

Does Our Government Contractor Need to Move to Gov-Cloud?

If you work in the DC metropolitan area, you are aware that there are thousands of companies that serve the needs of the Federal Government.  If you doubt these numbers, FOIA requests and SAM (the System for Award Management) list registered contractors.  Government contractors often have to adhere to many of the same security and compliance standards as the Federal Government, especially when accessing, sharing, and storing sensitive or classified material.

GAO has released guidelines for government contractors including, but not limited to, having a business continuity plan, virus protection, encryption, and two-factor authentication.  Requirements for two-factor authentication are pushed down to defense contractors through three DFARS clauses.  These clauses require defense contractors (and all of their subcontractors) that possess or transmit controlled unclassified information from DoD to fully implement NIST 800-171 and provide the DoD CIO with a certification memo.  The two-factor requirement is embedded in the controls of NIST 800-171.

We expect these requirements to continue to become more stringent as the Federal government evaluates cybersecurity risks and adversaries target government contractors.  Many small and medium government contractors have found that the cloud can be both more cost-effective and more secure.  As government contractors evaluate the move to the cloud, many have asked us: which cloud?

Microsoft built a Government datacenter offering Office 365 and Azure services for .gov agencies, but government contractors with commercial .com domains were not allowed to use those services without a letter of approval from a government agency.  Most agencies were reluctant to provide that letter unless the contractor was building a SaaS application that would host government data.  As contractors assisted agencies in migrating to the FedRAMP-approved services, they wondered how this could impact their own internal cloud migrations and their relationships with government customers.  Contractors no longer need to worry, because Office 365 Enterprise and Microsoft Azure are now in scope for FedRAMP at the Moderate Impact Level.

At Xgility, we’ve migrated Federal, State, and Local agencies as well as government contractors to Microsoft Cloud platforms.  These customers are achieving new levels of productivity and security in the cloud.  We agree with Microsoft that Office 365 offers more security than on-premises services, and since you need multi-factor authentication to participate in this ecosystem, Office 365 has you covered there too.

Xgility Gov Cloud Office 365

 

If you would like to learn more about Microsoft’s Cloud or would like a free 30-minute consultation, contact us.

 

 

Author:  Chris Ertz

Editors:  Alex Finkel and Kurt Greening