Cloud migration isn’t only a hosting change

Technology

May 6, 2020 - 7 minute read

Mariusz Jadach

Have you ever been asked to migrate on-premises software to the cloud? This may seem to be a standard request, but when you start looking at the details, it can turn out that the change of hosting location is only a small portion of the work that needs to be done. What other requirements or additional tasks can pop up? I’ll try to answer this question from the perspective of a migration project we’re doing for one of our customers at Objectivity.

Initial Views and Assumptions

The customer wanted to have an existing web application migrated from on-premises servers to an Azure cloud subscription. This step was part of their long-term cloud strategy, which assumed that the migration would result in cost savings as well as increased flexibility of the application during future development.

The application is built using the .NET Framework and its main components are an ASP.NET MVC application, a Web API, and a relational SQL Server database (a general architecture overview is depicted in Fig. 1). Internet Information Services (IIS) on on-premises Windows Server machines was used to host the .NET application components.

Fig. 1: Architecture overview

Initially, the migration work seemed to be straightforward. We and the customer wanted to avoid the lift-and-shift approach (using cloud IaaS resources to reflect the original on-premises infrastructure). The expected and agreed solution was to use appropriate PaaS resources in the Azure cloud – App Services for the web and API parts, and the SQL Database for the application database.

What Really Needs to Be Done

We started the project with a kick-off workshop with the customer. Our conversations revealed additional requirements and certain application dependencies and constraints affecting the migration:

  • The customer wanted to change the application authentication mechanism completely. This meant replacing the existing mix of WS-Federation, on-premises ADFS and Active Directory with the more modern OpenID Connect protocol and Azure Active Directory as the identity provider.
  • The Azure SQL Database managed instance with a disabled public endpoint was the only acceptable option for data storage.
  • The application used SQL Server Reporting Services (SSRS) and SQL Server Analysis Services (SSAS) components, which are not available in the Azure SQL Database managed instance (though both are still available elsewhere in Azure and in Power BI). Our idea was to reimplement certain application features to get rid of the dependencies on these two components.
  • We decided to create new CI/CD pipelines for the migrated application using Azure DevOps (it was already being used as a Git repository). The pipelines were meant to replace existing configurations in TeamCity that supported only part of the application’s environments (some deployments were done manually).

The target application architecture after the migration is shown in Fig. 2 below.

Fig. 2: Target architecture overview

Authentication in the Cloud

The authentication mechanism change turned out to be the most technically challenging and time-consuming piece of the work. As I mentioned earlier, the customer decided to use an existing Azure Active Directory instance as the identity provider and authentication server (instead of the on-premises ADFS). In addition, we replaced the WS-Federation protocol with the modern OpenID Connect (OIDC) protocol, which is an authentication layer built on top of OAuth 2.0. These changes make the application more flexible and ready for future interoperability with various OIDC-compliant identity providers.
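The sketch below gives a rough idea of what the switch looks like on the ASP.NET side: an OWIN startup class wiring up cookie and OpenID Connect middleware against Azure AD. This is a minimal illustration rather than the project’s actual code – the client ID, authority, redirect URI and scopes are placeholders.

```csharp
// Minimal sketch (placeholder values) of swapping WS-Federation for OpenID Connect
// in an ASP.NET MVC application using the OWIN security middleware.
using Microsoft.Owin;
using Microsoft.Owin.Security;
using Microsoft.Owin.Security.Cookies;
using Microsoft.Owin.Security.OpenIdConnect;
using Owin;

[assembly: OwinStartup(typeof(MyApp.Startup))]

namespace MyApp
{
    public class Startup
    {
        public void Configuration(IAppBuilder app)
        {
            app.SetDefaultSignInAsAuthenticationType(CookieAuthenticationDefaults.AuthenticationType);

            // Session cookie issued after a successful Azure AD sign-in.
            app.UseCookieAuthentication(new CookieAuthenticationOptions());

            // OpenID Connect middleware pointing at the Azure AD tenant.
            app.UseOpenIdConnectAuthentication(new OpenIdConnectAuthenticationOptions
            {
                ClientId = "00000000-0000-0000-0000-000000000000",               // app registration (placeholder)
                Authority = "https://login.microsoftonline.com/{tenant-id}/v2.0",
                RedirectUri = "https://myapp.azurewebsites.net/",
                ResponseType = "code id_token",                                  // authorisation-code-based flow
                Scope = "openid profile offline_access",
                AuthenticationMode = AuthenticationMode.Active
                // In the real flow, an AuthorizationCodeReceived notification would redeem
                // the authorisation code and populate a server-side token cache.
            });
        }
    }
}
```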

This specific customer requirement resulted in deep code changes across many of the solution’s components. The authentication of direct AJAX calls from the JavaScript code in a user’s browser to the API layer (bypassing the MVC backend) was an especially interesting technical challenge. The OIDC specification recommends the authorisation code flow for an MVC web application. In such a flow, the server-side part (the MVC backend) plays the role of a confidential client application and no tokens are sent to the user’s browser by default. We followed the recommendation but still needed to handle the authentication of the many existing direct AJAX calls from the browser’s JavaScript code to the Web APIs. Such a “mixed” scenario is not standard for the OIDC or OAuth 2.0 protocols. We decided to create a special, secured MVC action that returns a valid access token for the API layer. The JavaScript code can call this action to get an access token and then attach it as a bearer token to AJAX requests to the API.
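Below is a hedged sketch of what such a token-returning action could look like, assuming MSAL.NET (Microsoft.Identity.Client) handles token acquisition and caching on the server side. The configuration keys, the session key holding the MSAL account identifier and the API scope are illustrative placeholders, not the project’s actual names.

```csharp
// Hypothetical sketch of the "token relay" MVC action described above.
// Assumes MSAL.NET with a server-side token cache populated during the OIDC sign-in.
using System.Configuration;
using System.Threading.Tasks;
using System.Web.Mvc;
using Microsoft.Identity.Client;

[Authorize]
public class TokenController : Controller
{
    private static readonly IConfidentialClientApplication MsalApp =
        ConfidentialClientApplicationBuilder
            .Create(ConfigurationManager.AppSettings["ida:ClientId"])
            .WithClientSecret(ConfigurationManager.AppSettings["ida:ClientSecret"])
            .WithAuthority(ConfigurationManager.AppSettings["ida:Authority"])
            .Build();

    // Called by the JavaScript code; returns a bearer token for direct AJAX calls to the Web API.
    [HttpGet]
    public async Task<ActionResult> ApiAccessToken()
    {
        // The MSAL account identifier would have been stored (e.g. in session) at sign-in time.
        var account = await MsalApp.GetAccountAsync((string)Session["msalAccountId"]);

        // Silent acquisition from the server-side cache, so refresh tokens never reach the browser.
        var result = await MsalApp
            .AcquireTokenSilent(new[] { "api://my-api-app/.default" }, account)
            .ExecuteAsync();

        return Json(new { accessToken = result.AccessToken, expiresOn = result.ExpiresOn },
                    JsonRequestBehavior.AllowGet);
    }
}
```

On the browser side, the JavaScript code simply calls this action, caches the token until it expires, and sends it in the Authorization: Bearer header of each AJAX request to the API.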

Database Migration

The customer’s intention was to minimise the risk of the database migration, which is why they asked us to use the Azure SQL Database managed instance. This Azure service offers the highest level of compatibility with on-premises SQL Server instances, along with strong security and isolation. For instance, the application database uses custom functions relying on CLR (Common Language Runtime) assemblies, a feature supported only in the managed instance mode. We did not want to replace these functions, because it could be troublesome and lead to increased migration costs.

Moreover, data security and isolation are extremely important for the customer, so the managed instance mode seemed to be the only acceptable option. Due to security requirements, we’re not allowed to enable or use the public endpoint option of the managed instance. Consequently, we had to utilise the VNet Integration feature of our API App Services to make database connectivity possible. The feature routes the App Services’ outbound traffic through a dedicated subnet in the managed instance’s virtual network.

SQL Server Reporting and Analysis Services

Getting rid of the SSRS dependency was an easy task. This SQL Server component was used by the original application only for generating simple PDF documents for the users. The same functionality can be achieved with one of the many available JavaScript libraries that render a PDF document in the user’s web browser (without any server-side involvement). We took this approach, and the removal of SSRS resulted in significant cost savings and lower solution complexity.

As we expected from the very beginning, the removal of the SSAS dependency is a more serious challenge. The existing solution offers users the possibility to create their own custom reports in Power BI, based on the application’s data. The data is exposed as a Power BI dataset object in a workspace. The key point is that the dataset model is dynamic – when a user creates special entities in the application, new corresponding data tables must become visible and available in the Power BI dataset automatically. Today, this behaviour is achieved easily with SSAS. New data tables are created in the analytical SSAS database by the application, using the Microsoft SSAS API library. These tables are then visible in the Power BI dataset model automatically, allowing users to build improved reports. Dynamic dataset models are supported by Power BI only for SSAS – they don’t work for ordinary relational databases, which we want to use as the preferred data source for the dataset.

Wouldn’t it be perfect if we could create or modify Power BI datasets (based on a relational database) programmatically using an API library, similarly to the current SSAS API approach? Fortunately, there is a Power BI feature named “XMLA read/write endpoints” that is currently in the private preview state (March 2020; read-only endpoints are already globally available). The preview feature allows third-party applications to access Power BI datasets via the Analysis Services API libraries and to modify their semantic data model. This is possible because Power BI uses the Analysis Services engine internally to power its datasets.
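To illustrate the idea, here is a hedged sketch of adding a new entity table to a Tabular model through the Tabular Object Model (the Microsoft.AnalysisServices.Tabular library). The same code can target either an SSAS instance or a Power BI workspace exposed through an XMLA read/write endpoint – only the connection string changes. The server, database and column names, as well as the M expression, are illustrative placeholders.

```csharp
// Illustrative sketch: adding a new entity table to a Tabular model via the
// Tabular Object Model. The connection string decides whether the target is an
// SSAS instance or a Power BI workspace behind an XMLA read/write endpoint.
using Microsoft.AnalysisServices.Tabular;

public static class DatasetModelUpdater
{
    public static void AddEntityTable(string xmlaConnectionString, string databaseName, string entityName)
    {
        var server = new Server();
        try
        {
            // e.g. "Data Source=my-ssas-vm" for the interim SSAS instance, or
            // "Data Source=powerbi://api.powerbi.com/v1.0/myorg/MyWorkspace" for Power BI.
            server.Connect(xmlaConnectionString);
            var model = server.Databases.GetByName(databaseName).Model;

            // New table corresponding to the entity the user just created in the application.
            var table = new Table { Name = entityName };
            table.Columns.Add(new DataColumn { Name = "Id", DataType = DataType.Int64, SourceColumn = "Id" });
            table.Columns.Add(new DataColumn { Name = "Name", DataType = DataType.String, SourceColumn = "Name" });

            // Partition pulling the entity's rows from the relational database (placeholder M expression).
            table.Partitions.Add(new Partition
            {
                Name = entityName,
                Source = new MPartitionSource
                {
                    Expression =
                        "let Source = Sql.Database(\"my-sql-server\", \"AppDb\") " +
                        "in Source{[Schema=\"dbo\",Item=\"" + entityName + "\"]}[Data]"
                }
            });

            model.Tables.Add(table);
            model.SaveChanges();   // Pushes the metadata change to the server over XMLA.
        }
        finally
        {
            server.Disconnect();
        }
    }
}
```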

Thanks to our partnership with Microsoft, we were invited to the private preview tests. We verified that the “XMLA read/write endpoints” feature works for our use case and confirmed that it solves our problem as expected. As the feature is not yet available globally, we can’t use it in production. So, we discussed this with the customer and decided that the application code should temporarily support both approaches: the existing one with SSAS and the new one with the Power BI XMLA endpoints. Until the “XMLA read/write endpoints” feature is globally available, an interim solution will use an SSAS instance hosted in an Azure virtual machine and the existing application code.

Continuous Integration and Deployment

The process of building the application code, testing, and deployment had to be redefined as a result of the cloud migration. Defining new CI/CD pipelines in Azure DevOps wasn’t a big issue, mainly because the team had previous experience with this platform. The new build definition created in DevOps is much clearer and more intuitive than its original counterpart in TeamCity.

The new release pipeline (CD) created in DevOps enables fully automatic deployments to all application environments – previously, only some of them were supported by a TeamCity deployment configuration. Deployment automation for all environments was a clear and important project requirement. While creating the release pipeline, we proposed and implemented the Roundhouse tool for database versioning and deployment. The application team had good experience with the tool, which should improve future database maintenance. Before this improvement, all database changes were applied manually by running custom SQL patch scripts.

The SQL Database managed instance with no public endpoint can’t be accessed by Microsoft-hosted DevOps agents – it’s not visible outside of its own Azure virtual network. Therefore, we plan to use self-hosted DevOps agents that will be located within the same VNet in the customer’s Azure subscription.

Apart from the main release pipeline, additional DevOps pipelines were created for migrated Selenium, SpecFlow and tSQLt tests.

Summary

The migration project is still ongoing, but most of the work is now complete and some of the pre-production environments are up and running in Azure. The lesson we learnt during the course of this project is that cloud migration isn’t only about a change of application hosting. There can be a multitude of requirements, dependencies, or constraints that aren’t clearly visible at the beginning but surface later during the migration. Dealing with such additional concerns can be more important, more technically challenging, and more time-consuming than the ‘pure’ cloud migration process. Apart from this, migrating to the cloud can be a good opportunity to discover and implement certain improvements within the delivery processes or the application itself.

This may seem obvious, but it’s worth mentioning that good cooperation with the customer is crucial for us. Our kick-off workshop at the beginning of the project was a great idea, because it revealed most of the customer’s expectations and technical challenges. Together, we were able to set a general direction for the new cloud-based solution.

If you’d like to learn more about the cloud, download our “Cloud Done Right: Effective Cost Management” eBook.
