How to Work with Aggregate Functions in Cosmos DB SQL (Packt)

Learn how to work with aggregate functions in this article by Gastón C. Hillar, an independent consultant, freelance author, and speaker who has been working with computers since he was eight years old, and Daron Yöndem, a Microsoft Regional Director and a Microsoft MVP for 11 years.

— post by Gastón C. Hillar and Daron Yöndem, provided by Packt —

Working with aggregate functions

Cosmos DB SQL provides support for aggregations in the SELECT clause. For example, the following query will use the SUM aggregate function to sum all the values in the expression and calculate the total number of levels in the filtered games. The query uses the ARRAY_LENGTH built-in function to calculate the length of the levels array for each game and uses it as an argument for the SUM aggregate function.

The code file for this article can be found at https://bit.ly/2FB9DDg. The sample is included in the sql_queries/videogame_1_17.sql file:
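
The query takes roughly the following shape. This is a sketch rather than the exact contents of the sample file: it assumes each game document is aliased as g, and the WHERE clause that filters the games in the sample is omitted:

    SELECT SUM(ARRAY_LENGTH(g.levels))
    FROM g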

The following lines show the results of the query. Notice that the element of the array includes a key named $1:

Whenever you use an expression in the SELECT clause that is not a property name and you don’t specify the desired alias name, Cosmos DB generates a key that starts with the $ prefix and continues with a number that starts at 1. Hence, if you have three expressions that aren’t property names and don’t include their desired aliases, Cosmos DB will use $1, $2, and $3 for the properties in the output results.

If you only want to generate the value without a key in the result, you can use the VALUE keyword. The following query uses this keyword. The code file for the sample is included in the sql_queries/videogame_1_18.sql file:
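
Again as a sketch, with the same assumed g alias, the query looks something like this:

    SELECT VALUE SUM(ARRAY_LENGTH(g.levels))
    FROM g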

The following lines show the results of the query. Notice that the element of the array doesn’t include the key:

It is also possible to achieve the same goal using the COUNT aggregate function combined with the IN keyword. The following query uses the COUNT aggregate function to count the number of items in the expression and calculate the total number of levels in the iterated levels for all the games. The code file for the sample is included in the sql_queries/videogame_1_19.sql file:
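
A sketch of this approach is shown below; the level iteration variable and the totalLevels alias are illustrative names, not necessarily the ones used in the sample file:

    SELECT COUNT(1) AS totalLevels
    FROM level IN g.levels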

The following lines show the results of the query. Notice that the query specified the desired alias:

Now you want to calculate the average tower power for the levels defined in the video games. The towerPower property is not defined for all the levels, and it is only available for the levels of the game whose id is equal to 1. Whenever you use the AVG aggregate function to calculate an average for an expression, only the documents that have the property will be part of the average calculation. Hence, the levels that don’t have the towerPower property won’t affect the average.

The following query uses the AVG aggregate function combined with the IN keyword to iterate all the levels of the games that have the towerPower property and compute its average value. The code file for the sample is included in the sql_queries/videogame_1_20.sql file:
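
With the same illustrative names as before, the query looks something like this:

    SELECT AVG(level.towerPower) AS averageTowerPower
    FROM level IN g.levels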

The following lines show the results of the query. Notice that the query specified the desired alias:

If you found this article interesting, you can explore Guide to NoSQL with Azure Cosmos DB to create scalable applications by taking advantage of NoSQL document databases on the cloud with .NET Core. Guide to NoSQL with Azure Cosmos DB will help you build an application that works with a Cosmos DB NoSQL document database with C#, the .NET Core SDK, LINQ, and JSON.

Cheers!

Creating a function in Azure (Packt)

Functions in a serverless architecture consist of logic that serves a single, well-defined purpose. They are executed using an ephemeral compute service and can be scaled automatically based on demand. Azure Functions is Microsoft’s solution for serverless functions.

— post by Joseph Ingeno, provided by Packt —

In my book, Software Architect’s Handbook, I discuss serverless architecture and the use of functions. In this blog post, we will create a new function in Azure. Functions in Azure provide you with a choice of programming language (C#, JavaScript, Java, F#, with others coming in the future) and allow you to bring in dependencies from the NuGet and npm package managers. The runtime that powers Azure Functions can be found on GitHub.

There are different ways that a function for Azure can be created, such as using Visual Studio, Visual Studio Code, or the Azure command line interface (Azure CLI). However, to keep this demo free of any tool installation, we will create the function through the Azure portal.

The following steps will be necessary to create a function in the Azure portal:

  • Logging in to the Azure portal
  • Creating a function app
  • Creating a function in the new function app
  • Testing the function
  • Customizing the function
  • Cleaning up resources

 

Logging in to the Azure portal

If you do not already have one, the first step is to create a free Azure account by navigating to https://azure.microsoft.com. Each account receives some free credit to allow you to try out Azure services.

Once you have an account, you can log in to the Azure portal by going to http://portal.azure.com.

 

Creating a function app

A function app in Azure is a container that hosts the execution of individual functions. Before we create a function, we must create a function app that will host it.

  1. In the top left corner, select the Create a resource option, followed by Compute.

  2. Select Function App.

  3. Create a function app by providing values for the various settings:
    • App Name: A unique name for the function app.
    • Subscription: The subscription under which the function app will be created.
    • Resource Group: A resource group is a container that holds resources related to the Azure solution. Create a new resource group or use an existing one for the function.
    • Hosting Plan: The hosting plan for the function app. The Consumption plan is charged on a pay-per-execution basis and dynamically allocates resources based on the app’s load. The App Service plan lets you define a capacity allocation with predictable costs and scale.
    • Location: The region where your function app will execute. Select one near you or near other services that your function will need to access.
    • Storage: An Azure storage account is used to store and access your Azure Storage data objects. Create a new storage account or select an existing one to be used by the function.

  4. Click the Create button to deploy the new function app. Once the application has been deployed, you will receive a notification.

 

Creating a function

Once your new function app has been created, we can create a function to be used in the container. We’ll be creating an HTTP triggered function that can be executed via an HTTP call.

  1. If Function Apps is already under your Favorites, simply select it.

  2. If it isn’t already in your favorites, select All services. Find Function Apps and click the star next to it to make it one of your favorites. After doing so, select Function Apps.

  3. You should see the function app that you created previously. Click the Create New icon next to Functions under your function app to create a new function.

  4. To create a new function, we can either select a premade one or create one on our own. In this case, we will use one of the premade ones.

Select Webhook + API and choose one of the languages. In this example, I have elected to use C#. Click the Create this function button when you are finished.

If C# was selected as the language, the code for the premade function is as follows:
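
The exact code depends on the runtime version, but the premade C# script (run.csx) looks roughly like the following sketch:

    using System.Net;

    public static async Task<HttpResponseMessage> Run(HttpRequestMessage req, TraceWriter log)
    {
        log.Info("C# HTTP trigger function processed a request.");

        // Parse the name from the query string
        string name = req.GetQueryNameValuePairs()
            .FirstOrDefault(q => string.Compare(q.Key, "name", true) == 0)
            .Value;

        // Fall back to the request body if the query string did not contain a name
        dynamic data = await req.Content.ReadAsAsync<object>();
        name = name ?? data?.name;

        return name == null
            ? req.CreateResponse(HttpStatusCode.BadRequest, "Please pass a name on the query string or in the request body")
            : req.CreateResponse(HttpStatusCode.OK, "Hello " + name);
    }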

 

Testing the function

Now that the function has been created, we can test it to confirm that it works properly.

  1. Your function has now been created and should be visible in the Functions section underneath your function app.

  2. With your function selected, click the Get function URL link to get the URL for your function.

  3. The previous step will cause the Get function URL dialog box to be displayed. Select default (Function key) for the Key drop-down, and then click Copy to copy the URL.

The URL contains a key that is required, by default, to access the function over HTTP.

  4. Paste the URL into a browser. As we saw when looking at the code, there is a query string parameter for a name. Add &name=<yourname> to the end of the URL, replacing <yourname> with your actual name. Press the Enter key to execute the request.

If the name that you passed was “Joe,” then depending on the browser that you used, you would either see the string “Hello Joe” or you would see it in XML format.

Trace information is written to logs during function execution. With the focus being placed on your function in the Azure portal, the contents of the logs can be viewed at the bottom of the page.

 

Customizing the function

You may want to customize the function that you just created in terms of things like the HTTP methods that are supported by the function and the routing template. Under your specific function, select the Integrate option.

There are various options for the HTTP trigger that can be customized.

Allowed HTTP methods

You can configure your function to support all HTTP methods (in which case the Selected HTTP methods section with its checkboxes is hidden) or selected HTTP methods. In the Selected HTTP methods section, you can choose which HTTP methods you want the function to support (GET, POST, DELETE, HEAD, PATCH, PUT, OPTIONS, and TRACE).

Authorization level

You can choose between the following authorization levels:

  • Function
  • Admin
  • Anonymous

With Function selected as the authorization level, an API key is required to access the function. This key was being passed as part of the URL:
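
With placeholder values for the app name, function name, and key, the URL follows this general pattern:

    https://<your-function-app>.azurewebsites.net/api/<YourFunctionName>?code=<your-function-key>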

If we set the authorization level to Anonymous, our function will be accessible without an API key. This makes the URL look as follows:
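
Again with placeholder values, the key simply disappears from the URL:

    https://<your-function-app>.azurewebsites.net/api/<YourFunctionName>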

Route template

You can change the route that is used to invoke the function. For example, if you were to enter “/HelloTrigger” in the Route template textbox, the entered route template value will be used in the URL instead of the function name (which is the default). The resulting URL is as follows:
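
Keeping the placeholder app name from before, the URL becomes something like this:

    https://<your-function-app>.azurewebsites.net/api/HelloTrigger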

The “api” part of the URL is part of the base path prefix and is handled by a global setting.

Cleaning up resources

Resource groups in Azure are containers that hold resources such as function apps, functions, and storage accounts. Deleting a resource group deletes everything that it contains. Once you are done with the demo, you may want to delete the resource group that was used so you are not billed for any resources.

  1. In the Azure portal, go to the relevant resource group page. One of the ways that this can be done is by selecting Resource groups and then selecting a resource group.

  2. An alternative approach is to select our function app as we had done previously and select its resource group from there.

  3. With the resource group in focus, select the Delete resource group button.

  4. Follow the instructions and click the Delete button.

It can take a few minutes, but eventually, the resource group and its contents will be deleted. A notification will appear to let you know when it is done.

Joseph Ingeno is a software architect who has designed and developed many different software applications. During his career, he has worked on projects for a number of different business domains, using a variety of technologies, programming languages, and frameworks. His book, Software Architect’s Handbook, was published by Packt and is now available. The Software Architect’s Handbook is a comprehensive guide to help developers, architects, and senior programmers advance their career in the software architecture domain.

Cheers!

Creating, Debugging and Deploying an Azure Function (Packt)

You can learn how to create, debug, and deploy an Azure Function by reading this tutorial by Daniel Bass, a developer who develops complex backend systems entirely on Azure, making heavy use of event-driven Azure Functions and Azure Data Lake.

Serverless programming has been a buzzword in technology for a while now, first implemented for arbitrary code by Amazon on Amazon Web Services (AWS) in 2014. The term normally refers to snippets of backend code running in environments that are wholly managed by the cloud provider, totally invisible to developers. This approach has some astounding benefits, enabling an entirely new paradigm of computing architecture.

This article will focus on Microsoft’s serverless product, Azure Functions. In this article, you’ll create an Azure Function in Visual Studio, debug it locally, and deploy it to an Azure cloud instance. You can refer to https://github.com/TrainingByPackt/Serverless-Architectures-with-Azure/tree/master/Lesson%201 to access the complete code for this article.

To develop Azure Functions for production, you need a computer running Windows and Visual Studio 2015 or later; however, the smoothest experience is in Visual Studio 2017, version 15.4 or later. If your computer can run Visual Studio, it can handle Azure Function development.

Creating Your First Function to Receive and Process Data from an HTTP Request

Before you begin, confirm that you have Visual Studio 2017 version 15.4 installed; if not, download and install it. Visual Studio 2017 has a comprehensive suite of Azure tools, including Azure Function development. To do so, perform the following steps:

1. Open the Visual Studio Installer, which will show you the version of Visual Studio that you have installed and allow you to select the Azure development workload and install it if it is missing; update Visual Studio to the latest version, if required:

2. Click on Modify, select the Azure development workload, and click on Modify again:

Now, you can create a new Azure Function as a part of your serverless architecture that listens to HTTP requests to a certain address as its trigger. Begin by implementing the following steps:

1. Create a new solution. The example is called BeginningAzureServerlessArchitecture; the functions that you create will be deployed to this namespace.

2. Use the Visual C# | Cloud | Azure Functions project template. Select the Empty trigger type and leave the default options, but set storage to None. This will create a Function App, which is a logical wrapper for several functions that will get deployed and scaled together:

3. You now have a solution with two files in it: host.json and local.settings.json. The local.settings.json file is used solely for local development, where it stores all details on connections to other Azure services.

It is important to note that when uploading something to a public repository, be very careful not to commit unencrypted connection settings – by default they will be unencrypted. host.json is the only file required to configure any functions running as a part of your Function App. This file can have settings that control the function timeout, security settings for every function, and a lot more.
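
For example, a host.json that overrides the default function timeout on the version 1 runtime could look something like this (the value is only an illustration):

    {
      "functionTimeout": "00:10:00"
    }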

4. Now, right-click on the project and select Add New Item. Once again, choose the Azure Function template:

5. On the next screen, select Http trigger with parameters, and set the Access rights to Anonymous. Right-click on your solution and select Enable NuGet Package Restore:

6. You will now have a C# file called PostTransactions.cs. It consists of a single method, Run, with an awful lot in the method signature: an attribute and an annotation. Some of this will be familiar to you if you are an experienced C# developer, and it is important to understand this signature.

Configuration as code is an important modern development practice. Rather than having servers reconfigured or configured manually by developers before code is deployed to them, configuration as code dictates that the entire configuration required to deploy an application to production is included in the source code.

This allows for variable replacement by your build/release agent, as you will (understandably) want slightly different settings, depending on your environment. Azure Functions implement this principle, with a configuration split between the host.json file for app-wide configurations and app settings, and the Run method signature for individual functions. Therefore, you can deploy an Azure Function to production with only the code that you find in the GitHub repository:

Outcome

You created an Azure Function, understood the roles the different files play, and learned about configuration as code.

The FunctionName annotation defines the name of the function within the Function App. This can be used for triggering your function, or it can be kept separate. The first parameter is an HttpRequestMessage object with an HttpTrigger attribute. This is what varies when you choose different triggers (for example, a timer trigger will have an object with a TimerTrigger attribute).

This attribute has several arguments. The first is the authorization level. Do you remember setting this when you created the function? It was called Access rights in the template. This defines the level of authorization that the function will demand of HTTP requests. The five levels are as follows:

  • Anonymous: No key required; anyone with the path can call it an unlimited number of times.
  • User: Need a valid token, generated by a user that has AD permission to trigger the Function App. Useful for high-security environments, where each service needs to manage its own security. Generally, token-based authentication is much more desirable than key-based.
  • Function: Need the function key—a unique key created for each function in a Function App upon deployment. Any host key will also work. This is the most common form of authorization for basic deployments.
  • System: Need the master key—a key at the Function App level (called a host key) that cannot be deleted but can be renewed.
  • Admin: Need any host key.

One thing to bear in mind is that if you set a function to be high security and use System or Admin authorization, then any client that you give that key to will also be able to access any other functions in the Function App (if they can work out the path). Make sure that you separate high-security functions into different apps.

The next parameters are GET and POST, which define the HTTP verbs that will activate the function. Generally, from a microservices architecture point of view, you should only have one verb to prevent you from having to do bug-prone switching logic inside of the function. You can simply create four separate functions if you want GET, POST, PUT, and DELETE on an artifact.

Finally, there is a string assigned to the property route. This is the only bit of routing logic that the function itself can see, and it simply defines the subpath from the Function App. It accepts WebAPI syntax, which you can see in the curly braces, /{name}. This will assign any text that appears where the curly braces are to a parameter called name.

This completes the HttpTrigger object. The three parameters left in the method signature are an HttpRequestMessage object, which allows you to access the HttpRequestMessage that triggered the function; a string parameter called name, which is what the string in the curly braces in the path will get bound to; and a TraceWriter for logging.

The current logic of the Function App can be seen in the following example, and you should see that it will take whatever name is put into it and send back an HTTP response saying Hello to that name.
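
The repository version is authoritative, but the HTTP trigger with parameters template generates something close to the following sketch (the namespace and the default route value are what Visual Studio produces and may differ in your project):

    using System.Net;
    using System.Net.Http;
    using Microsoft.Azure.WebJobs;
    using Microsoft.Azure.WebJobs.Extensions.Http;
    using Microsoft.Azure.WebJobs.Host;

    namespace BeginningAzureServerlessArchitecture
    {
        public static class PostTransactions
        {
            [FunctionName("PostTransactions")]
            public static HttpResponseMessage Run(
                [HttpTrigger(AuthorizationLevel.Anonymous, "get", "post", Route = "HttpTriggerCSharp/name/{name}")]
                HttpRequestMessage req, string name, TraceWriter log)
            {
                log.Info("C# HTTP trigger function processed a request.");

                // The {name} part of the route is bound to the name parameter
                return req.CreateResponse(HttpStatusCode.OK, "Hello " + name);
            }
        }
    }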

Debugging an Azure Function

You now have a working Azure Function that can be deployed to Azure or run locally. You’ll first host and debug the function locally, to show the development cycle in action.

Debug an Azure Function

In this section, you’ll run an Azure Function locally and debug it, so you can develop new functions and test their functionality before deploying to the public cloud. To ensure that it happens correctly, you’ll require the single function created from the HTTP trigger with parameters template.

Currently, your machine does not have the correct runtime to run an Azure Function, so you need to download it:

1. Click on the Play button in Visual Studio, and a dialog box should ask you if you want to download Azure Functions Core Tools – click on Yes. A Windows CMD window will open, with the lightning bolt logo of Azure Functions. It will bootstrap the environment and attach the debugger from Visual Studio. It will then list the endpoints the Function App is listening on.

2. Open the Postman app and copy and paste the endpoint into it, selecting either a POST or GET verb. You should get the response Hello {name}. Try changing the {name} in the path to your name, and you will see a different response. You can download Postman at https://www.getpostman.com/.

3. Create a debug point in the Run method by clicking in the margin to the left of the code:

4. Use Postman to send the request.

5. You are now able to use the standard Visual Studio debugging features and inspect the different objects as shown in the following screenshot:

6. Set your verb to POST, and add a message in the payload. See if you can find the verb in the HttpRequestMessage object in debug mode. It should be in the method property.

Outcome

You have debugged an Azure Function and tested it using Postman. As you can see from running the function locally, you, the developer, do not need to write any of the usual boilerplate code for message handling or routing. You don’t even need to use ASP.NET controllers, or set up middleware. The Azure Functions container handles absolutely everything, leaving your code to simply do the business logic.

Activity: Improving Your Function

In this activity, you’ll add a JSON payload to the request and write code to parse that message into a C# object.

Prerequisites

You’ll require a function created from the HTTP trigger with parameters template.

Scenario

You are creating a personal finance application that allows users to add their own transactions, integrate with other applications, and perhaps allow their credit card to directly log transactions. It will be able to scale elastically to any number of users, saving money when you don’t have any users.

Aim

Parse a JSON payload into a C# object, starting your RESTful API.

Steps for Completion

1. Change the Route to transactions.

2. Remove the get verb, leaving only post. Remove the string parameter called name:

3. Add the Newtonsoft.Json package, if it isn’t already present. You can do this by right-clicking on the solution and selecting Manage NuGet Packages | Browse | Newtonsoft.Json.

4. Right-click on the project and add a folder called Models, and then add a C# class called Transaction. Add two properties to this class: a DateTime property called ExecutionTime, and a Decimal property called Amount:
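
A minimal sketch of that class follows; the namespace is an assumption based on the project name used earlier:

    using System;

    namespace BeginningAzureServerlessArchitecture.Models
    {
        public class Transaction
        {
            // When the transaction was executed
            public DateTime ExecutionTime { get; set; }

            // The amount of the transaction
            public decimal Amount { get; set; }
        }
    }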

5. Use JsonConvert.DeserializeObject<Transaction>() on the request content to deserialize the HttpRequestMessage into an instantiation of this class. To do this, you need to import the Models namespace and Newtonsoft.Json. This will parse the JSON payload and use the Amount value to fill the corresponding property on the Transaction object:
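
A sketch of that step, assuming the Run method has been made async and that using directives for Newtonsoft.Json and the Models namespace have been added at the top of the file:

    // Read the raw request body and parse the JSON payload into a Transaction
    string body = await req.Content.ReadAsStringAsync();
    Transaction transaction = JsonConvert.DeserializeObject<Transaction>(body);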

6. Change the return message to use a property of the new Transaction object, for example, You entered a transaction of £47.32!. Then go to Postman, open the Body tab, and select raw.

7. Enter the following JSON object:
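
For example, a payload along these lines (the values are only illustrative):

    {
      "ExecutionTime": "2018-06-15T12:30:00",
      "Amount": 47.32
    }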

8. Run locally to test. Make sure that you change the endpoint to /transactions in Postman.

Outcome

You learned how to access the HttpRequestMessage, and you now have a function that can read a JSON message and turn it into a C# object. During this subtopic, you debugged an Azure Function. Visual Studio enables this by downloading azure-functions-core-tools, but unfortunately it doesn’t make the tool available on the general command line, only in command windows started from Visual Studio. If you want to use it independently, you can download it with npm: npm install -g azure-functions-core-tools for version 1 (fully supported) and npm install -g azure-functions-core-tools@core for version 2 (beta). You can then set up a debug configuration in Visual Studio that calls an external program with the command func host start when you click on the Debug button.

This package is a lot more than just a debug environment, however; it actually has a CLI for everything you could possibly need in Azure Function development. Open up a command window (in Visual Studio, if you haven’t downloaded it independently) and type func help; you should see a full list of everything the CLI can do. Notable commands are func host start, which starts the local debug environment, and func azure functionapp fetch-app-settings {functionappname}, which lets you download the app settings of a function deployed to Azure so that you can test integration locally, as well. These need to be run in the same folder as the host.json file.

Deploying an Azure Function

An Azure Function is obviously geared towards being hosted on the Azure cloud, rather than locally or on your own computer. Visual Studio comes with a complete suite of tools to deploy and manage Azure services, and this includes full support for Azure Functions. The azure-functions-core-tools CLI that you downloaded to provide a local debug environment also has a set of tools for interacting with Azure Functions in the cloud, if you prefer CLIs.

It is possible to run Azure Functions on your own servers, using the Azure Functions runtime. This is a good way to utilize the sunk cost that you have already spent on servers, combined with the unlimited scale that Azure offers (if demand exceeds your server capacity). It’s probably only worth it in terms of cost if you have a significant amount of unused Windows server time because this solution inevitably requires more management than normal Azure Function deployments. To deploy to Azure, you’ll need an Azure login with a valid Azure subscription.

Deploying to Azure

In this section, you’ll deploy your first function to the public cloud and learn how to call it. You’ll go live with your Azure Function and start creating your serverless architecture. To ensure that it happens correctly, you’ll need a function project and a valid Azure subscription. You can begin by implementing the following steps:

1. Right-click on your project and select Publish…. Now select Azure Function App | Create New, as shown in the following screenshot:

2. Enter a memorable name and create a resource group and a consumption app service plan to match the following:

3. Click on the Publish button to publish your function.

4. Open a browser, navigate to http://portal.azure.com, and find your function. You can use the search bar and search the name of your function. Click on your Function App, and then click on the function name. Click on Get function URL in the upper-right corner, paste the address of your function in Postman, and test it. If you have a paid subscription, these executions will cost a small amount of money – you are only charged for the compute resources that you actually use. On a free account, you get a million executions for free.

Outcome

You now have a fully deployed and working Azure Function in the cloud.

This is not, however, the recommended way to deploy to production; Azure Resource Manager (ARM) templates are. ARM templates are JavaScript Object Notation (JSON) files in which the resources that you want to deploy are described declaratively. An ARM template is idempotent, which means it can be run as many times as required and the output will be the same each and every time; Azure handles the execution and applies only the changes that need to be made.
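
A minimal, empty ARM template is just a small JSON skeleton along these lines:

    {
      "$schema": "https://schema.management.azure.com/schemas/2015-01-01/deploymentTemplate.json#",
      "contentVersion": "1.0.0.0",
      "parameters": {},
      "variables": {},
      "resources": [],
      "outputs": {}
    }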

If you found this article interesting, explore Daniel Bass’ Beginning Serverless Architectures with Microsoft Azure to quickly get up and running with your own serverless development on Microsoft Azure. This book will provide you with the context you need to get started on a larger project of your own, leaving you equipped with everything you need to migrate to a cloud-first serverless solution.

Cheers!

P.S. This post was provided by good people at Packt Publishing. Thank you, Ron!

How to Leverage SSIS in ADF (Packt)

Learn how to leverage SSIS on premises to interact with cloud data in ADF v2 in this tutorial by Christian Cote, Michelle Gutzait, and Giuseppe Ciaburro, the authors of Hands-On Data Warehousing with Azure Data Factory.

SQL Server Integration Services (SSIS) has been Microsoft’s ETL tool of choice for more than a decade. A lot of enterprises have used SSIS to load their on-premises data warehouses since its inception in SQL Server 2005.

In the last couple of years, IT departments have had to deal with different kinds of data and specific toolsets to process them. SSIS has successfully been able to access cloud data from on-premises ETL servers since 2015, with the Azure Feature Pack (https://docs.microsoft.com/en-us/sql/integration-services/azure-feature-pack-for-integration-services-ssis?view=sql-server-2017). However, issues occur when most of the ETL is in the cloud and SSIS is only a small part of the chain. And, up until now, it was very complex to use ADF v1 as the orchestrator in the cloud with some SSIS package calls in the pipeline.

The following sections will describe how SSIS on-premises can be successfully leveraged to interact with cloud data in ADF v2.

Sample setup

The first thing that you need to do is set up an SSIS solution. You can use Microsoft’s Wide World Importers samples, which can be found at https://github.com/Microsoft/sql-server-samples/releases/tag/wide-world-importers-v1.0.

From the list, you can download the databases and the SSIS solution:

  • Daily.ETL.ispac: This is the SSIS solution that takes data from WideWorldImporters to the WideWorldImportersDW database
  • WideWorldImporters-Full.bacpac: This is the transactional database that is used as a source for the data warehouse
  • WideWorldImportersDW-Full.bacpac: This is the sample data warehouse database that you’ll use in this article

Sample databases

The first step is to set up sample databases. For this, you’ll need to install SQL Server Management Studio (SSMS). This program is freely available from Microsoft at https://docs.microsoft.com/en-us/sql/ssms/download-sql-server-management-studio-ssms.

Start SSMS to connect to your server and right-click on the Databases folder in Object Explorer. Select Import Data-tier Application… from the contextual menu as shown in the following screenshot:

The Import Data-tier Application wizard will open. Now click on Next.

In the Import Settings step, make sure that the Import from local disk option is selected and click on Browse…. From the Open window, navigate to the location from where you previously downloaded the .bacpac files and click on Open, as shown in the following screenshot:

Click on Next on the Import Settings screen.

You can now proceed to the Database Settings step. Make sure that the entry in the New database name textbox is WideWorldImporters and click on Next.

The Summary will be displayed as shown in the following screenshot. Click on Finish to start the BACPAC import process:

The database is created as well as all the objects in it. This process may take a few minutes. Click on Close to terminate the process:

Repeat the same process for the other BACPAC WideWorldImportersDW-Full.bacpac. Make sure that you specify WideWorldImportersDW as the database name.

SSIS components

The SSIS setup usually requires an existing installation of SQL Server, with the Integration Services feature selected. With ADF v2, this step is no longer mandatory since you can deploy the packages to an Azure SQL database.

That being said, it’s always better to test your SSIS solution on premises before deploying it to Azure. The next section will describe how to set up a sample SSIS solution.

Integration Services Catalog setup

This step is necessary for brand new SQL Server installations where you may want to deploy SSIS packages. When logged in to the server with SSMS, simply right-click on the Integration Services Catalogs folder and select Create Catalog…

The Create Catalog window will open up. You’ll need to fill in the properties and click on OK to create the integration services catalog:

Next, you need to create a folder for your project. In Object Explorer, expand the Integration Services Catalogs node, right-click on SSISDB, and select Create Folder… from the contextual menu:

The Create Folder window will open up. Set the properties as shown in the following screenshot and click on OK:

You’re now ready to open the solution in Visual Studio. Here, you can deploy it to your newly created on-premises Integration Services Catalogs folder.

Sample solution in Visual Studio

For this section, you’ll need an existing installation of Visual Studio 2015 or 2017 with SQL Server Data Tools (SSDT) installed. SSDT is available at https://docs.microsoft.com/en-us/sql/ssdt/download-sql-server-data-tools-ssdt?view=sql-server-2017.

Once SSDT is opened, click on FILE | New | Project…:

The New Project window will open up. Select Integration Services Import Project Wizard, type WWImportersDaily in the textbox beside Name, and click on OK:

The Integration Services Import Project Wizard window will open; click on Next to proceed to the next step:

In the Select Source window, select the Project deployment file radio button and click on Browse…. This will open up the Open file window. Navigate to the folder where the .ispac SSIS sample file was downloaded and click on Open:

Once back to the wizard, click on Next to proceed to the next step.

In the Review window, click on the Import button.

Once the project has been imported, you can see it in Solution Explorer:

You’ll have to modify the two connection managers if your server is not the default instance. To do so, double-click on WWI_DW_Destination_DW.conmgr and WWI_Source_DB.conmgr, and change their server properties.

If the connections are online, you can run the package and see if everything is working fine.

Deploying the project on premises

You’re now ready to deploy and test your project on your local server:

1. From the Solution Explorer, right-click on the WWImportersDaily project and select Deploy. Integration Services Deployment Wizard will open up as shown in the following screenshot. Click on Next to proceed to the next step:

2. Connect to your SQL Server where the integration services catalog was created before and click on Browse…. The Browse for Folder or Project window will open.

3. Expand SSISDB and select the DWDailyRefresh folder.

4. Click on OK to close the window:

5. On the Select Destination window, click on Next.

6. The next step is to validate; however, on-premises deployments don’t use it.

7. Click on Next to proceed to the Review window. Here, you can review whether you have selected the right .ispac file and confirm that you’re going to deploy it to the right folder in SSIS:

8. Now click on Deploy to start the deployment of the .ispac file.

9. The last step, Results, shows the progress of your deployment. If everything is successful (as it should be), click on the Close button:

You’re now done with the deployment.

10. The next step is to execute the package via the catalog. You can switch to SQL Server Management Studio for this.

11. In the Object Explorer, expand Integration Services Catalogs and navigate to the DailyETLMain.dtsx package.

12. Right-click on it and select Execute… from the contextual menu to open up the execution window:

13. Go to the Connection Managers tab, check the Connection String property for both connectors, and adjust them if necessary. If everything is okay, click on OK to start the package execution:

14. Click on Yes to see the report. Once the report has opened, you might notice that the package is still running. When you refresh it, though, you’ll see that the package has been successfully executed:

You have successfully leveraged SSIS on-premises. If you found this article to be an interesting read, you can explore Hands-On Data Warehousing with Azure Data Factory to learn how to build your own ETL solutions and address the key challenges faced while building them.

Cheers!

P.S. This post was provided by good people at Packt Publishing. Thank you, Ron!

Getting started with Microsoft Azure Stack (ASDK) (8)

After seven posts on how to get started with Microsoft Azure Stack/Azure Stack Development Kit, I think it’s finally time to share some additional resources that can help you on this amazing journey!

First (and the most important one… and the one that I already mentioned a few times) is the official documentation – hosted at docs.microsoft.com, regularly updated… with everything you need, right at your fingertips! Amazing!

Next, there are two books I’ve been reading (feel free to leave a comment if you know any other book or resource I’ve missed) – the first one is Packt’s Building Hybrid Clouds with Azure Stack by Markus Klein and Susan Roesner (thank you, Ron!). The second book is Sams’ Microsoft Hybrid Cloud Unleashed with Azure Stack and Azure by Kerrie Meyler, Steve Buchanan, Mark Scholman, Jakob Gottlieb Svendsen and Janaka Rangama. Great books written by great authors!

Also, PluralSight is offering the short Microsoft Azure Stack: The Big Picture by Jason Helmick – totally free!

And if you plan to certify on Azure Stack – the official exam 70-537: Configuring and Operating a Hybrid Cloud with Microsoft Azure Stack is available (accompanied by the official MOC, of course)!

Lastly – make also sure that you don’t miss Jeffrey’s inspiring talk!

Hope it helps!

Getting started with Microsoft Azure Stack (ASDK) (7)

One other important aspect of running your Microsoft Azure Stack/ASDK environment is updating. You’ll need to update your deployment regularly to stay supported.

So… how do we do it?

First, you’ll download updates by using the provided Azure Stack Updates Downloader utility.

If you remember the following video, starting at 1:04, you could see the Downloader in action:

Next, you’ll unpack and upload the updates to your Stack deployment (to the updateadminaccount storage account, to be exact):

As the last step, Azure Stack will pick up the info about the uploaded updates and let you press the (magic) Update now button.

Then… we wait and our updates will be (magically) installed. And that’s it! Cool!

I’ve recorded a short video about this one as well:

Note, though, that updating of the nested Azure Stack/ASDK (i.e. installed inside of a virtual machine, either on Hyper-V, Azure or somewhere else) will probably not work… and this is perfectly fine (and expected)!

Cheers!

P.S. The whole YouTube playlist is available here.

Getting started with Microsoft Azure Stack (ASDK) (6)

Once you’re all set with preparing your Azure Stack/ASDK infrastructure, it’s time to let other users actually use it by forming tenants.

For this, you’ll first create a Plan, which contains all the services a tenant is allowed to use, with appropriate Quotas set on these services. Next, you’ll create an Offer consisting of the plan you just created (and any others you may have created earlier). With that, you can create a Subscription which will, finally, allow your user to consume Azure Stack/ASDK services (i.e. provision virtual machines and any other service you may have put in your plan).

I’ve made a short video about provisioning the above mentioned for my newly created tenant user:

Note that my user is using the “tenant portal” (https://portal.local.azurestack.external/) for provisioning its services, and not the “admin portal” (https://adminportal.local.azurestack.external/).

Cheers!

Getting started with Microsoft Azure Stack (ASDK) (5)

It’s not like we can deploy only WordPress or Ubuntu on top of our ASDK – we can do so much more! And now I’ll show you how to add a Windows Server 2016 image to your Azure Stack (ASDK) environment.

First, you’ll need to connect to your ASDK environment by using the Login-AzureRmAccount command. Then you’ll need the Windows Server 2016 ISO, and with an easy New-AzsServer2016VMImage command, you’ll import the Windows Server 2016 bits and create a Windows Server 2016 image/template for your Azure Stack environment.
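
Roughly, the two commands look like this (the environment name and the ISO path are placeholders for your own values):

    # Log in to the ASDK admin environment created with Add-AzureRmEnvironment
    Login-AzureRmAccount -EnvironmentName "AzureStackAdmin"

    # Build the Windows Server 2016 image from a local ISO and publish it to the stack
    New-AzsServer2016VMImage -ISOPath "D:\ISO\WindowsServer2016.iso"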

There is also a video showing you all the steps. Note that you can also add the latest Cumulative Update (CU) to the template with a simple switch called -IncludeLatestCU (how cool is that?!):

After that, you can easily create a new virtual machine based on Windows Server 2016 template you’ve just created, like in the following video:

Cheers!

Getting started with Microsoft Azure Stack (ASDK) (4)

And now, a short and sweet one – once you have downloaded, installed and registered your ASDK, you can finally start using it!

The first thing I would do is grab something from the Azure Marketplace – inside your Admin Portal (https://adminportal.local.azurestack.external/), you can open up the Marketplace management section and then click on the Add from Azure button:

This gives you a whole selection of images available on Azure, which can be downloaded and used on your ASDK. Just for fun, in the next video, I’ll download WordPress (on Ubuntu) and a VM extension called Microsoft Antimalware (hmmm… I wonder what it does?):

Once downloaded, you can provision yourself a brand new WordPress instance, running on top of Ubuntu, on top of your own ASDK, as you can see in the following video:

Happy weekend!

Getting started with Microsoft Azure Stack (ASDK) (3)

As a final preparation step (following my previous post) in using the ASDK, we need to first register it. For that, an Azure subscription is required!

So… let’s do it.

First, we need to make sure we have the required Azure Stack PowerShell modules. It’s also convenient to make PSGallery a trusted repository for installing the modules, if we need to install them. Then we can download all the tools we’ll need, and finally register our (connected) ASDK environment, using the following commands:
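
A rough sketch of those steps, based on the official documentation at the time of writing (module versions, paths, and the privileged endpoint name are placeholders – check the docs for the exact commands matching your ASDK build):

    # Trust PSGallery and install the Azure Stack compatible PowerShell modules
    Set-PSRepository -Name "PSGallery" -InstallationPolicy Trusted
    Install-Module -Name AzureRm.BootStrapper
    Use-AzureRmProfile -Profile 2017-03-09-profile -Force
    Install-Module -Name AzureStack

    # Download the AzureStack-Tools repository (it contains the registration module)
    Invoke-WebRequest -Uri "https://github.com/Azure/AzureStack-Tools/archive/master.zip" -OutFile master.zip
    Expand-Archive -Path master.zip -DestinationPath . -Force

    # Log in to the Azure subscription that will be used for the registration
    Login-AzureRmAccount

    # Register the connected ASDK by using the privileged endpoint (AzS-ERCS01 by default)
    Import-Module .\AzureStack-Tools-master\Registration\RegisterWithAzure.psm1
    Set-AzsRegistration -PrivilegedEndpointCredential (Get-Credential) `
        -PrivilegedEndpoint AzS-ERCS01 -BillingModel Development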

You can find the whole procedure explained in more detail inside the official documentation, of course.

And… once registered – we can finally start using our ASDK!

As before – I’ve also prepared a short video about registering the ASDK:

Happy exploring!