MS Cloud Foundation

Architecture Foundation - Appendix

 

 

 

 

Company / Organization: becke.ch 

Scope: 0.0 {language={en}; document-type={ott}; organization={becke.ch}; interest={business}; domain={architecture}; level={foundation}; technology={ms-cloud,azure,office-365,dynamics-365}}

Version: 1.0.0

File-name: becke-ch--ms-cloud--s1-v1--foundation.odt

Author: ms-cloud--s0-v1@becke.ch  

Copyright © 2019 becke.ch – All rights reserved

 

Document Version History

Version | Date | Author | Description
1.0.0 | 07.2019 | Raoul Becke | Initial version: changes according to requirements; see the table below, version 1.0.0.

 

 

Work-Product Version History

Version | Date | Author | Requirements | Components Changed
1.0.0 | 07.2019 | Raoul Becke | Due to its huge size, the appendix of document “becke-ch--ms-cloud--s0-v1” (see [1]) was moved into this separate document. | Document

 

 

 

 

 

Table of Contents

1. Introduction

2. References and glossary

2.1. References

2.2. Glossary (terms, abbreviations, acronyms)

A. Appendix

A.1. Subscription

A.1.1. Trial Subscription (private)

A.1.2. Azure Portal Login

A.1.3. Trial Subscription Expired Subscribe Again

A.1.4. Organization Subscription

A.1.5. Change Subscription Directory

A.2. My Visual Studio: Developer License (MSDN): Activate Azure Benefit on Alternate Account

A.2.1. Error: Oops: It appears that you have already used your MSDN benefit for a Microsoft Azure Subscription

A.3. User Management

A.3.1. Create a new Standard User

A.3.2. Log-In & Set Password

A.3.3. Guest User

A.4. Group Management

A.4.1. Create a new Security Group

A.5. Database

A.5.1. Query Editor

A.5.2. SQL User: Create SQL User, Create Schema and Grant Access To Create Table, Insert, Select (Created identical 2nd logins for “MSDN Subscription” but having “...1...v1...” in the name)

A.5.3. AD User: Create Schema and Grant Access To Create Table, Insert, Select

A.5.4. AD Group: Create Schema and Grant Access To Create Table, Insert, Select

A.5.5. Timeout Issues, Query Information & sys.dm_exec_requests

A.6. Logic App

A.6.1. Create

A.6.2. Logic App Designer

A.7. Azure Data Factory – V2: ETL: Extract, Transform & Load: SSIS

A.7.1. SQL Server Data Tools (SSDT) – Installation

A.7.2. Call a Web Service using SSIS then Store and Query the Results – NOT WORKING!

A.7.3. Provision the Azure-SSIS Integration Runtime in Azure Data Factory

A.8. Azure Data Factory – V1: EL: Extract & Load: Copy Data

A.8.1. Copy Data (Preview)

A.8.2. Modify & Deploy

A.8.3. Deployment Script

A.8.4. Incremental Load & Data Delay

A.8.5. Performance Improvement aka Reduce Interval below 15 minutes

A.8.6. ERROR & SOLUTION

A.9. Key Vault

A.9.1. Security

A.9.2. Setup

A.10. Power BI

A.10.1. Desktop Installation

A.10.2. Power BI Desktop

A.10.3. Power BI Pro

A.10.4. Assign license to user

A.10.5. Portal

A.10.6. Create Workspace

A.11. Storage Account

A.11.1. Set-Up

A.11.2. File Share

A.11.3. Security

A.11.4. Build-Time: Administration: RBAC

A.12. Azure Container Instance (ACI)

A.12.1. Create Container Instance

A.12.2. Execute a command in a running Azure container instance

A.13. Azure Cloud Services

A.13.1. Creation

A.13.2. Setup

A.14. Create new Directory

 

Illustration Index


 

 

Index of Tables


Table 1: References

Table 2: Glossary

 

1. Introduction

This document gives an overview of the MS cloud offerings with a focus on MS Azure. It is a summary of many different Microsoft Cloud articles spread over the Internet.

This document contains only the appendix, moved over from “becke-ch--ms-cloud--s0-v1” (see [1]).

 

2. References and glossary

2.1. References

 

Reference | Location | Remarks
[1] | HTML, HTML A4, PDF | MS Cloud foundation: Architecture Foundation.

Table 1: References

 

2.2. Glossary (terms, abbreviations, acronyms)

Terms / Abbreviations / Acronyms | Description
DC | Data-center; see Error: Reference source not found

Table 2: Glossary

 

 

A. Appendix
A.1. Subscription
A.1.1. Trial Subscription (private)
  1. Go to “https://azure.microsoft.com/en-us/free/”

  2. Click on the link “Start free”; you will then be redirected to “https://login.microsoftonline.com/common/oauth2/authorize”

  3. Click on the link “Use another account”

  4. Click on the link “Create a new Microsoft account”; you are then redirected to the page “https://signup.live.com/signup”

  5. Enter the following information:

     a. Email: ms-cloud--s0-v1@becke.ch

     b. Pwd: e…

     c. Optional: de-select “Send me promotional emails from Microsoft”

  6. Click on the button “Next”

 

 

 
  7. Check your email account:

Email

From: Microsoft account team <account-security-noreply@accountprotection.microsoft.com>

To: ms-cloud--s0-v1@becke.ch

Subject: Verify your email address

 

Microsoft account Verify your email address To finish setting up your Microsoft account, we just need to make sure this email address is yours. To verify your email address use this security code: 8... If you didn't request this code, you can safely ignore this email. Someone else might have typed your email address by mistake. Thanks, The Microsoft account team

 

 

  8. Enter the code you received via email and click “Next”

 

 
  9. Enter the characters you see on the captcha and click “Next”; you are then redirected to the page “https://account.azure.com/signup”

 

 
  10. Enter the information “About you”:

      a. Country/Region: Switzerland

      b. First Name: Microsoft

      c. Last Name: Cloud (ms-cloud--s0-v1)

      d. Email: ms-cloud--s0-v1@becke.ch

      e. Work Phone: 004179…

      f. Organization (Optional): leave empty

  11. Identity verification by phone:

      a. Switzerland (+41)

      b. 79…

      c. Enter the message you received into the text-message field: 785…

  12. Identity verification by card:

      a. Payment method: New Debit/Credit Card

      b. Card number: …

      c. Expiration date: 07/2018

      d. CVV: …

      e. Name on card: Raoul …

      f. Address line 1: S... 3

      g. Address line 2: [empty]

      h. City: Elgg

      i. State/Province/District: [empty]

      j. Postal Code: 8353

  13. Agreement: click on “Agree”; you are then redirected to the page “https://account.azure.com/signuplanding/Index”

 

The subscription is active and you can proceed to the following main administration pages:

 

A.1.2. Azure Portal Login
  1. Navigate to “http://portal.azure.com”, which redirects you to “https://login.microsoftonline.com”

  2. Enter your credentials:

     a. User: ms-cloud--s0-v1@becke.ch

     b. Password: ...

  3. You are then redirected to: https://login.live.com/

 
A.1.3. Trial Subscription Expired Subscribe Again
  1. Follow the steps described in chapter A.1.1, but when reaching the step “https://account.azure.com/signup” you get the message: “You already used your one free trial or you are not eligible for it.”

  2. Trying with a different user: ms-cloud--s0-v2@becke.ch

BUT after the last step you get the error message: “We cannot proceed with signup due to an issue with your account. Please contact support.”

 
A.1.4. Organization Subscription

 

https://account.azure.com/organization

 

Domain name handling insufficient: the domain-name handling in Microsoft Azure is insufficient – see chapter Error: Reference source not found.

 

  1. About you:

     a. FIRST NAME: Microsoft

     b. LAST NAME: Cloud (ms-cloud--s0-v1)

     c. CONTACT EMAIL ADDRESS: ms-cloud--s0-v1@becke.ch

     d. ORGANIZATION NAME (OPTIONAL): becke.ch

  2. Your login and country information:

     a. DOMAIN NAME: beckech (.onmicrosoft.com) “check availability” (see issues above)

     b. NEW USER ID: ms-cloud--s0-v1 (@beckech.onmicrosoft.com)

     c. COUNTRY/REGION: Switzerland

     d. CREATE NEW PASSWORD: E…

     e. CONFIRM NEW PASSWORD: E…

  3. Your mobile number and verification:

     a. Switzerland (+41)

     b. SEND TEXT MESSAGE / CALL ME: 79 ... (You get an SMS from +447781470659)

     c. Verify: 155...

  4. You are redirected to: Log-In to Microsoft Azure: https://login.microsoftonline.com

 

Login: ms-cloud--s0-v1@beckech.onmicrosoft.com

Password: E...

 

 

  5. You are redirected to: Account Azure: https://account.azure.com

  6. About you:

     a. * Country/Region: Switzerland

     b. * First Name: Microsoft

     c. * Last Name: Cloud (ms-cloud--s0-v1)

     d. * Email address for important notifications: ms-cloud--s0-v1@becke.ch

     e. * Work Phone: 79 ...

     f. Organization (Optional): becke.ch

  7. Identity verification by phone:

     a. Switzerland (+41)

     b. 79 ... (“Send text message”) (You get an SMS from +447781470659)

     c. Verify: 572... (“Verify Code”)

 

 

 

 
  8. Identity verification by card:

     a. Payment method: New Credit/Debit Card: Visa / MasterCard / American Express (Your card will not be charged, though you might see a temporary authorization hold.)

     b. * Card number: 3776 …

     c. * Expiration date: 07/2018

     d. * CVV: …

     e. * Name on card: Ra…

     f. * Address line 1: So…

     g. Address line 2 (Optional): [empty]

     h. * City: E…

     i. State/Province/District (Optional): [empty]

     j. * Postal Code: 8...

  9. Agreement:

     a. Select: “I agree to the subscription agreement, offer details, and privacy statement.”

     b. Optional: select: “Microsoft may use my email and phone to provide special Microsoft Azure offers.”

 

Welcome:

 

Get started with your Azure subscription: https://portal.azure.com

 

AND the good thing is: even though I previously already had a trial subscription (see above), which had expired, I could again get a trial subscription:

 
A.1.5. Change Subscription Directory

There exist two different ways to change the subscription directory:

 
 
 

 

 

 

A.2. My Visual Studio: Developer License (MSDN): Activate Azure Benefit on Alternate Account

Chrome (or Firefox): always use the Chrome (or Firefox) browser, e.g. to avoid “strange” SSO behavior of IE.

Email: Wait for your confirmation of the MSDN Platform Subscription:

 

My Visual Studio: Developer License (MSDN): https://my.visualstudio.com/subscriptions

Log in with your business account.

Go to “Subscriptions” and click on “Add alternate account”:

 

Go to “Benefits” and copy the hyperlink on “Activate”: https://account.windowsazure.com/signup?offer=MS-AZ... (DO NOT CLICK ON THE HYPERLINK!)

 

Close all browsers.

Log-In with your private account: https://my.visualstudio.com/

Enter the hyperlink (you copied previously) in the browser: https://account.windowsazure.com/signup?offer=MS-AZ...

 

Select: “I agree to the subscription agreement, offer details, and privacy statement” and click on “Purchase”

DONE – you can now click on “Or get started with your Azure subscription”: https://portal.azure.com/

 

SUCCESS:

 
A.2.1. Error: Oops: It appears that you have already used your MSDN benefit for a Microsoft Azure Subscription

If you get the error: Oops: It appears that you have already used your MSDN benefit for a Microsoft Azure Subscription:

 

Solution: You need to go to the person responsible for your MSDN subscription and have your MSDN subscription deleted and assigned anew. You will then receive the email again and can start as described in the previous chapter.

 

 

A.3. User Management
A.3.1. Create a new Standard User

We create a new standard user for DB access:

 

Create a database active directory user for DB Access:

Create a database active directory user for DB Access that is member of group db-user-group--s0-v1:

 

Create another user as db administrator:

 

Create user for account access:

 

Create user for key vault access:
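
The users above are created through the Azure portal. As a rough command-line equivalent, here is a minimal sketch with the Azure CLI; the names are this chapter’s examples, the password is a placeholder, and the exact flags should be verified against your CLI version:

az ad user create \
    --display-name "db-user--s0-v1" \
    --user-principal-name "db-user--s0-v1@beckech.onmicrosoft.com" \
    --password "E..."

# Optionally make the user a member of the DB user group (see chapter A.4):
az ad group member add --group "db-user-group--s0-v1" \
    --member-id $(az ad user show --id "db-user--s0-v1@beckech.onmicrosoft.com" --query objectId -o tsv)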

 

A.3.2. Log-In & Set Password

http://portal.azure.com

 

 

 

 

 

 

A.3.3. Guest User
  1. Click on “Users”

  2. Click on “+ New Guest User”

  3. Enter the user details:

  4. In the email invitation, click on the button “Get Started”:

  5. Scenario “User has no MS Account”: there exists no MS cloud account with this email address; a new account and password (Eo…) need to be created:

     a. An email is sent to verify the email address, and the user needs to enter the code on the signup page:

     b. Fill out the captcha (to prove you’re a real person and not a robot):

     c. And finally the user needs to review and accept the permissions:

        i. Sign you in: becke.ch will be able to sign you into their apps and resources.

        ii. Read your profile info: becke.ch will be able to read your basic profile (name and email address).

     d. This brings the user to the home page: https://account.activedirectory.windowsazure.com

     e. Respectively, when logging into the Azure portal, the user gets the following screen:

     f. And when logged in as administrator in the Azure portal, we can see that this guest user is part of our AAD:

  6. Scenario “User has MS Account”: all steps are identical to “5.” with the following difference:

     a. The user is instantly visible in the AAD:

 
A.4. Group Management

Based on chapter Error: Reference source not found

A.4.1. Create a new Security Group

We create a new security group (NOT an Office 365 group) containing the DB administrators:

 

http://www.techmikael.com/2017/02/all-you-never-wanted-to-know-about.html

The Azure AD Admin UI allows you to create the following:
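
As a command-line alternative, a minimal sketch of the same group creation with the Azure CLI (group name from this chapter; note that the command requires --mail-nickname even for a pure security group):

az ad group create \
    --display-name "db-user-group--s0-v1" \
    --mail-nickname "db-user-group--s0-v1"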

 

A.5. Database

 

 

Database: (Created an identical 2nd DB and DB-Server for “MSDN Subscription” but having “...s1-v1...” in the name):

Server: (Created an identical 2nd DB and DB-Server for “MSDN Subscription” but having “...s1-v1...” in the name):

A.5.1. Query Editor
 

Login:

 
A.5.2. SQL User: Create SQL User, Create Schema and Grant Access To Create Table, Insert, Select (Created identical 2nd logins for “MSDN Subscription” but having “...1...v1...” in the name)

Log-In with db-admin--s0-v1:

CREATE USER db-user--s0-v1;

Failed to execute query. Error: Incorrect syntax near '-'.

CREATE USER dbuser0v1;

Failed to execute query. Error: 'dbuser0v1' is not a valid login or you do not have permission.

https://docs.microsoft.com/en-us/azure/sql-database/sql-database-manage-logins

CREATE USER dbuser0v1 WITH PASSWORD = 'E...';

 

CREATE SCHEMA schm--s0-v1;

Actually the dashes in the schema name cause some issues

CREATE SCHEMA schema0v1;

https://docs.microsoft.com/en-us/sql/t-sql/statements/grant-object-permissions-transact-sql

GRANT ALL ON schm--s0-v1 TO dbuser0v1;

Failed to execute query. Error: Incorrect syntax near 'schm'.

GRANT ALL ON schema0v1 TO dbuser0v1;

Failed to execute query. Error: Cannot find the object 'schema0v1', because it does not exist or you do not have permission.

https://docs.microsoft.com/en-us/sql/t-sql/statements/grant-schema-permissions-transact-sql

GRANT ALL ON SCHEMA :: schema0v1 TO dbuser0v1;

Failed to execute query. Error: The all permission has been deprecated and is not available for this class of entity

An alternative to “ALL” is “CONTROL”:

GRANT CONTROL ON SCHEMA :: schema0v1 TO dbuser0v1;

GRANT CREATE TABLE TO dbuser0v1;

GRANT CREATE PROCEDURE TO dbuser0v1;

 

https://docs.microsoft.com/en-us/sql/t-sql/statements/revoke-schema-permissions-transact-sql

REVOKE ALTER ON SCHEMA :: schema0v1 TO dbuser0v1;

 

 

https://stackoverflow.com/questions/497317/how-can-i-view-all-grants-for-an-sql-database

SELECT

    class_desc

  , CASE WHEN class = 0 THEN DB_NAME()

         WHEN class = 1 THEN OBJECT_NAME(major_id)

         WHEN class = 3 THEN SCHEMA_NAME(major_id) END [Securable]

  , USER_NAME(grantee_principal_id) [User]

  , permission_name

  , state_desc

FROM sys.database_permissions

Log-in with user: dbuser0v1

 

CREATE TABLE schema0v1.Benutzer (  ID int NOT NULL, LastName nvarchar(50), FirstName nvarchar(50), modification datetime);

Failed to execute query. Error: CREATE TABLE permission denied in database 'becke-ch--app--s0-v1--uc'.

https://serverfault.com/questions/252808/with-sql-server-2008-can-you-grant-create-table-permission-within-a-schema

Requires CREATE TABLE permission in the database and ALTER permission on the schema in which the table is being created.
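
Matching the statements above, a minimal pair of grants that satisfies this requirement for dbuser0v1 looks as follows (GRANT CONTROL on the schema, as used earlier, already implies ALTER; ALTER is spelled out here for clarity):

GRANT CREATE TABLE TO dbuser0v1;
GRANT ALTER ON SCHEMA :: schema0v1 TO dbuser0v1;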

 

--INSERT INTO schema0v1.Benutzer VALUES (1, 'Raoul', 'Becke', SYSDATETIME());

INSERT INTO schema0v1.Benutzer VALUES (1, 'Raoul', 'Becke', '20170810 03:44:09 AM');

INSERT INTO schema0v1.Benutzer VALUES (2, 'Ludwig', 'Becke', '20170809 07:16:10 AM');

INSERT INTO schema0v1.Benutzer VALUES (3, 'Yaro', 'Becke', '20170809 07:21:10 AM');

INSERT INTO schema0v1.Benutzer VALUES (4, 'Ian', 'Becke', '20170809 07:26:10 AM');

SELECT * FROM schema0v1.Benutzer;

SELECT count(*) FROM schema0v1.Benutzer;

 

 

DROP TABLE schema0v1.Benutzer;

https://docs.microsoft.com/en-us/sql/t-sql/statements/truncate-table-transact-sql

TRUNCATE TABLE schema0v1.Benutzer;

CREATE SCHEMA schema0v1dest;

GRANT CONTROL ON SCHEMA :: schema0v1dest TO dbuser0v1;

GRANT CREATE TABLE TO dbuser0v1;

CREATE TABLE schema0v1dest.Benutzer (  ID int NOT NULL, LastName nvarchar(50), FirstName nvarchar(50), modification datetime);

A.5.3. AD User: Create Schema and Grant Access To Create Table, Insert, Select

 

Log-In with db-admin--s0-v1:

https://docs.microsoft.com/en-us/azure/sql-database/sql-database-manage-logins

CREATE USER [db-user--s0-v1@beckech.onmicrosoft.com] FROM EXTERNAL PROVIDER;

Failed to execute query. Error: Principal 'db-user--s0-v1@beckech.onmicrosoft.com' could not be created. Only connections established with Active Directory accounts can create other Active Directory users.

SOLUTION: See chapter Error: Reference source not found.

Log-In with db-ad-admin--s0-v1@beckech.onmicrosoft.com (SELECT “Active Directory Password Authentication”!!):

CREATE USER [db-user--s0-v1@beckech.onmicrosoft.com] FROM EXTERNAL PROVIDER;

CREATE USER [dbaduser0v1@beckech.onmicrosoft.com] FROM EXTERNAL PROVIDER;

CREATE SCHEMA schema1v1;

GRANT CONTROL ON SCHEMA :: schema1v1 TO [db-user--s0-v1@beckech.onmicrosoft.com];

GRANT CONTROL ON SCHEMA :: schema1v1 TO [dbaduser0v1@beckech.onmicrosoft.com];

GRANT CREATE TABLE TO [db-user--s0-v1@beckech.onmicrosoft.com];

GRANT CREATE TABLE TO [dbaduser0v1@beckech.onmicrosoft.com];

 

Log-in with user: db-user--s0-v1@beckech.onmicrosoft.com (SELECT “Active Directory Password Authentication”)

 

CREATE TABLE schema1v1.Benutzer (  ID int NOT NULL, LastName nvarchar(25), FirstName nvarchar(25));

INSERT INTO schema1v1.Benutzer VALUES (1, 'Raoul2', 'Becke2');

SELECT * FROM schema1v1.Benutzer;

 

A.5.4. AD Group: Create Schema and Grant Access To Create Table, Insert, Select

Precondition: Make sure the group “db-user-group--s0-v1” was created and user “dbaduser1v1@beckech.onmicrosoft.com” has been assigned.

Log-In with db-ad-admin--s0-v1@beckech.onmicrosoft.com (SELECT “Active Directory Password Authentication”!!):

https://docs.microsoft.com/en-us/azure/sql-database/sql-database-aad-authentication-configure  

To create a contained database user representing an Azure AD or federated domain group, provide the display name of a security group:

CREATE USER [db-user-group--s0-v1] FROM EXTERNAL PROVIDER;

 

GRANT CONTROL ON SCHEMA :: schema1v1 TO [db-user-group--s0-v1];

GRANT CREATE TABLE TO [db-user-group--s0-v1];

 

Log-in with user: dbaduser1v1@beckech.onmicrosoft.com (SELECT “Active Directory Password Authentication”)

 

CREATE TABLE schema1v1.Benutzer (  ID int NOT NULL, LastName nvarchar(25), FirstName nvarchar(25));

INSERT INTO schema1v1.Benutzer VALUES (1, 'Raoul2', 'Becke2');

SELECT * FROM schema1v1.Benutzer;

 

 

A.5.5. Timeout Issues, Query Information & sys.dm_exec_requests

ERROR: Failed to execute query. Error: Execution Timeout Expired.  The timeout period elapsed prior to completion of the operation or the server is not responding.

 

ANALYZE: Run the following query:

SELECT sqltext.TEXT,

req.session_id,

req.status,

req.command,

req.cpu_time,

req.total_elapsed_time

FROM sys.dm_exec_requests req

CROSS APPLY sys.dm_exec_sql_text(sql_handle) AS sqltext

Which returns the following result – e.g.:

 

https://stackoverflow.com/questions/17817177/what-does-a-status-of-suspended-and-high-diskio-means-from-sp-who2

A query gets suspended when it is requesting access to a resource that is currently not available. This can be a logical resource like a locked row or a physical resource like a memory data page. The query starts running again, once the resource becomes available.

High disk IO means that a lot of data pages need to be accessed to fulfill the request.

That is all that I can tell from the above screenshot. However, if I were to speculate, you probably have an IO subsystem that is too slow to keep up with the demand. This could be caused by missing indexes or an actually too slow disk. Keep in mind, that 15000 reads for a single OLTP query is slightly high but not uncommon.

https://docs.microsoft.com/en-us/sql/relational-databases/system-dynamic-management-views/sys-dm-exec-requests-transact-sql

Finding all currently blocked requests

The following example queries sys.dm_exec_requests to find information about blocked requests.

SELECT sqltext.TEXT, req.session_id , req.status , req.blocking_session_id  

    , req.wait_type , req.wait_time , req.wait_resource  

    , req.transaction_id  

FROM sys.dm_exec_requests req  

CROSS APPLY sys.dm_exec_sql_text(sql_handle) AS sqltext

WHERE req.status = N'suspended'

 

SOLUTION: https://docs.microsoft.com/en-us/sql/t-sql/language-elements/kill-transact-sql : Start with the root blocking query, i.e. the query that has blocking_session_id = 0, and kill it; this should then unblock all subsequent queries!

KILL 95

 

A.6. Logic App
A.6.1. Create

 

 

 
A.6.2. Logic App Designer

Switch to “Manually enter connection information”:

ERROR: Please check your account info and/or permissions and try again. Details: A network-related or instance-specific error occurred while establishing a connection to SQL Server. The server was not found or was not accessible. Verify that the instance name is correct and that SQL Server is configured to allow remote connections. (provider: TCP Provider, error: 0 - No such host is known.) inner exception: Microsoft SQL: A network-related or instance-specific error occurred while establishing a connection to SQL Server. The server was not found or was not accessible. Verify that the instance name is correct and that SQL Server is configured to allow remote connections. (provider: TCP Provider, error: 0 - No such host is known.)

SOLUTION:

ERROR: Please check your account info and/or permissions and try again. Details: Cannot open server "beckech.onmicrosoft.com" requested by the login. The login failed.

inner exception: Microsoft SQL: Cannot open server "beckech.onmicrosoft.com" requested by the login. The login failed.

https://social.msdn.microsoft.com/Forums/ie/en-US/05a593b1-3d16-4339-82ad-45c8afe7210b/sql-connector-azure-ad-authentication?forum=azurelogicapps

What are supported authentication types for SQL connector in Logic APP?

Does it support the Azure AD authentication (as described  here https://azure.microsoft.com/en-gb/documentation/articles/sql-database-aad-authentication/). In Edit API connection, I tried to change the Authentication type to Active Directory Password, but same is not working.

The SQL Connector only supports windows and sql server login for authentication. It doesn't yet support AAD based auth.

It will be great if you can open feature request on our uservoice .

 

 

A.7. Azure Data Factory – V2: ETL: Extract, Transform & Load: SSIS

As mentioned in chapter Error: Reference source not found - ADF V2 now fully supports E(xtract) T(ransform) & L(oad) capabilities using SSIS (SQL Server Integration Services).

A.7.1. SQL Server Data Tools (SSDT) – Installation

SQL Server Data Tools, BIDS, Visual Studio – What Do You Need? See Article: https://whitepages.unlimitedviz.com/2015/04/sql-server-data-tools-bids-visual-studio/

Server | Toolset Name | Location
SQL Server 2008 R2 and below | Business Intelligence Development Studio (BIDS) | On SQL media
SQL Server 2012 and 2012 SP1 | SQL Server Data Tools | On SQL media
SQL Server 2014 | SQL Server Data Tools – BI | Download for Visual Studio 2012 / Download for Visual Studio 2013

 

Download and install SQL Server Data Tools (SSDT) for Visual Studio: See https://docs.microsoft.com/en-us/sql/ssdt/download-sql-server-data-tools-ssdt?view=sql-server-2017

Visual Studio versus “stand-alone”: if Visual Studio is not installed, selecting “Install a new SQL Server Data Tools instance” installs SSDT with a minimal version of Visual Studio; for the best experience, Microsoft recommends using SSDT with the latest version of Visual Studio. BUT personally I would suggest keeping it simple: if you only focus on SSIS (SSAS or SSRS) development anyway, just install SSDT standalone from here: https://docs.microsoft.com/en-us/sql/ssdt/download-sql-server-data-tools-ssdt?view=sql-server-2017#ssdt-for-vs-2017-standalone-installer

  1. Go to: https://docs.microsoft.com/en-us/sql/ssdt/download-sql-server-data-tools-ssdt?view=sql-server-2017#ssdt-for-vs-2017-standalone-installer

  2. Download: English (United States): https://go.microsoft.com/fwlink/?linkid=875613&clcid=0x409

  3. Run the installer: “SSDT-Setup-ENU.exe”

  4. Click “Next”

  5. Install all features, i.e. use the default selection:

     a. “Install new SQL Server Data Tools for Visual Studio 2017 Instance”

     b. Set a nickname for the new Visual Studio 2017 instance: “SSDT”

     c. Install tools for these SQL Server features:

        i. SQL Server Database

        ii. SQL Server Analysis Services

        iii. SQL Server Integration Services

  6. Click “Install”

  7. Log in as administrator to continue the installation.

 

ERROR: Error: Failed to get VS Instances: Retrieving the COM class factory for the component with CLSID {177F0C4A-1CD3-4DE7-A32C-71DBBB9FA36D} failed due to the following error: 80040154 Class not registered (exception from HRESULT: 0x80040154 (REGDB_E_CLASSNOTREG))., stackTrace:    at System.RuntimeTypeHandle.CreateInstance(RuntimeType type, Boolean publicOnly, Boolean noCheck, Boolean& canBeCached, RuntimeMethodHandleInternal& ctor, Boolean& bNeedSecurityCheck)

   at System.RuntimeType.CreateInstanceSlow(Boolean publicOnly, Boolean skipCheckThis, Boolean fillCache, StackCrawlMark& stackMark)

   at System.RuntimeType.CreateInstanceDefaultCtor(Boolean publicOnly, Boolean skipCheckThis, Boolean fillCache, StackCrawlMark& stackMark)

   at System.Activator.CreateInstance(Type type, Boolean nonPublic)

   at System.Activator.CreateInstance(Type type)

   at Microsoft.Sql.DataTools.ManagedBootstrapperApp.ViewModel.MainViewModel.GetVsInstances()

[0A80:0A84][2018-07-14T18:07:59]e000: MainViewModel.OnPackageAction: Cannot get SQL instance

[0A80:0A84][2018-07-14T18:07:59]i000: MainViewModel.OnPackageAction: Full list of installed instances:

SOLUTION: https://blogs.msdn.microsoft.com/heaths/2017/09/01/cleaning-up-corrupt-visual-studio-instances/

We have mitigated a couple of possible causes and continue to explore both mitigations and long-term remedies, but in the meantime, you will need to clean up, or obliterate, the instance and re-install.

Open an elevated command prompt.

Run: %ProgramFiles(x86)%\Microsoft Visual Studio\Installer\resources\app\layout\InstallCleanup.exe -i

This will clean up only instance data such as the installation directory and state. This will make re-installing the instance faster since any workloads and components you had installed would remain cached and would not be re-downloaded.

  8. If everything goes well, there should now be an entry in the start menu, “Visual Studio 2017 (SSDT)”, and the program is located in: “C:\Program Files (x86)\Microsoft Visual Studio\2017\SQL\Common7\IDE\devenv.exe”

 
  9. Starting “Visual Studio 2017 (SSDT)”, the next step is either to “sign in” with an existing account, “register” for a new account, or “skip this step”:

     a. I decided to sign in:

     b. Log in with my Azure account “ms-cloud--s0-v1@beckech.onmicrosoft.com” and password:

        i. Settings (General, JavaScript, Visual Basic, Visual C#, Visual C++, Web Development, Web Development (only code)): choose default: General

        ii. Color scheme (Blue, Blue (additional contrast), Dark, Light): default: Blue

 

 

  10. If this entry is missing, then I suggest de-installing all the following packages (and starting the installation again): “Microsoft Visual C++ 2010 x86 Redistributable”, “Microsoft SQL Server 2012 Native Client”, “Microsoft-System-CLR-Typen für SQL Server 2017”, “Microsoft SQL Server 2016 LocalDB”, “Microsoft Visual Studio Tools for Applications 2017”, “Microsoft Visual C++ 2017 Redistributable (x86)”, “Microsoft SQL Server Data Tools – Visual Studio 2017”, “Microsoft Visual C++ 2017 Redistributable (x64)”, “Microsoft SQL Server Data Tools für Visual Studio”, “Microsoft Visual Studio Installer”.

 
A.7.2. Call a Web Service using SSIS then Store and Query the Results – NOT WORKING!

This sample is based on https://www.codeproject.com/Articles/1182165/Call-a-Web-Service-using-SSIS-then-Store-and-Query applied to the Azure platform.

  1. Analogous to chapter A.5.2, connect to the database “becke-ch--app--s1-v1--uc” with user “dbuser1v1”

  2. Create an “input” table “CityData” containing the “PostalCode”, and an “output” table “WebServiceData” containing the entire address (city, state, area, …) as an XML string.

CREATE TABLE [schema1v1].[CityData]([PostalCode] VARCHAR(10) NOT NULL) ON [PRIMARY];

CREATE TABLE [schema1v1].[WebServiceData]([City] [xml] NULL) ON [PRIMARY] TEXTIMAGE_ON [PRIMARY];

The xml data type lets you store XML documents and fragments in a SQL Server database. An XML fragment is an XML instance that is missing a single top-level element. You can create columns and variables of the xml type and store XML instances in them – see here.

TEXTIMAGE_ON { filegroup | "default" } Indicates that the text, ntext, image, xml, varchar(max), nvarchar(max), varbinary(max), and CLR user-defined type columns (including geometry and geography) are stored on the specified filegroup.

  3. Insert some data into the “input” table “CityData”:

INSERT INTO [schema1v1].[CityData] VALUES ('76520');

INSERT INTO [schema1v1].[CityData] VALUES ('76541');

INSERT INTO [schema1v1].[CityData] VALUES ('76574');

 

  4. Create a stored procedure to get all postal codes. This postal-code result set will then be used as input for the SSIS Foreach Loop Container (see below):

CREATE PROCEDURE [schema1v1].[pr_GetPostalCodes] AS SET NOCOUNT ON SELECT [PostalCode] FROM [schema1v1].[CityData];

 

  5. Start “Visual Studio 2017 (SSDT)”, respectively the program located in “C:\Program Files (x86)\Microsoft Visual Studio\2017\SQL\Common7\IDE\devenv.exe”:

  6. Create a new “Business Intelligence” → “Integration Services” project: select “Integration Services Project” and name it "Call Web Service".
     Name: becke-ch--ssis--s0-v1--webservice
     Location: c:\ws\becke-ch--ssis--s0-v1\webservice
     Solution name: becke-ch--ssis--s0-v1--webservice

 
     a. In the main window titled “Package.dtsx”, in the tab “Flow Control”, right-click and select “New Connection...”:

        i. In the “Add SSIS Connection Manager”, select ADO.NET, if possible respectively supported, as the connection type. For a comparison of which driver to use, see here: ADO versus OLE DB versus ODBC: https://datatellblog.wordpress.com/2015/01/13/ssis-data-flows-ado-net-vs-ole-db-vs-odbc/ (ADO versus ADO.NET: see https://msdn.microsoft.com/de-ch/library/bb978899.aspx).

        ii. In the “Configure ADO.NET Connection Manager” dialog, click “New...”:

        iii. In the “Connection Manager” dialog, enter the connection details:
             IMPORTANT: The firewall needs to be opened so that the “Visual Studio 2017 (SSDT)” machine can connect to the Azure DB (becke-ch--app--s1-v1.database.windows.net)!
             IMPORTANT: You need to enter the database name before clicking on “Test connection”, because the user “dbuser1v1” only has access to this database; otherwise the connection test will fail!
             Provider (“.NET Provider”: “SqlClient Data Provider”, “OracleClient Data Provider” or “Odbc Data Provider”; “.NET Provider for OleDb”: “Microsoft OLE DB Provider Oracle”, “Microsoft OLE DB Provider Analysis Service”, etc.): default: “.NET-Provider\SqlClient Data Provider”
             Server name: becke-ch--app--s1-v1.database.windows.net
             Authentication: SQL Authentication
             User name: dbuser1v1
             Password: E…
             Database name: becke-ch--app--s1-v1--uc

 
     b. In the main window titled “Package.dtsx”, switch to the tab “Parameters” and add the package variables: city (Package (scope), Object (data type), System.Object (value)), postalCode (Package (scope), String (data type)) and postalCodes (Package (scope), Object (data type), System.Object (value)).
        IMPORTANT: On the top right of the “Package.dtsx” window you need to click on the button “Variables”, otherwise (by default) the window section containing the variables is not displayed!

 
     c. In the main window titled “Package.dtsx”, switch back to the tab “Flow Control” and, from the SSIS Toolbox explorer, favorites section, drag & drop the “Execute SQL Task” into the “Flow Control” window. Double-click on “Execute SQL Task” and call the schema1v1.pr_GetPostalCodes stored procedure as follows:
        General: Name: default: Execute SQL Task
        General: Description: default: Execute SQL Task
        Options: TimeOut: default: 0
        Options: CodePage: default: 1252
        Options: TypeConversionMode (None, Allowed (default)): Allowed
        ResultSet (None (default), Single Row, Full Result Set, XML): Full result set
        SQL Statement: ConnectionType (Excel, OLE DB (default), ODBC, ADO, ADO.NET, SQLMOBILE): ADO.NET
        SQL Statement: Connection (choose from the drop-down the connection configured above): becke-ch--app--s1-v1.database.windows.net.becke-ch--app--s1-v1--uc.dbuser1v1
        SQL Statement: SQLSourceType (Direct input (default), File connection, Variable): Direct input
        SQL Statement: SQLStatement: schema1v1.pr_GetPostalCodes
        SQL Statement: IsQueryStoreProcedure (False (default), True): True
        SQL Statement: BypassPrepare: True

 
     d. In the “Execute SQL Task Editor”, on the left, click on “Resultset” and map the Execute SQL Task result set to the postalCodes variable, setting the Result Name (string) to "0":

 
     e. From the SSIS Toolbox explorer, container section, drag & drop the “Foreach Loop Container”, connect the “Execute SQL Task” to the container, and then configure the Collection and Variable Mappings.

     f. Configure the Collection settings for the Foreach Loop Container:
        Enumerator (Foreach Item Enumerator (default), Foreach File Enumerator, Foreach ADO Enumerator, Foreach ADO.NET Schema Rowset Enumerator, Foreach From Variable Enumerator, ...): Foreach ADO Enumerator
        ADO object source variable: User::postalCodes
        Enumeration mode (Rows in the first table, Rows in all the tables (ADO.NET dataset only), All tables (ADO.NET dataset only)): Rows in all the tables

     g. In the “Foreach Loop Container Editor”, configure the Variable Mappings for the Foreach Loop Container. Use postalCode (singular) for each postal code in the postalCodes object.
        Variable: User::postalCode
        Index: 0

 
     h. Create a new HTTP Connection Manager (the HTTP connection manager offers the possibility to do basic authentication AND client/server-certificate 2-way SSL!):

        i. Right-click in the Connection Managers pane, click “New Connection…” and select HTTP.

        ii. Click Add…

        iii. Enter the Server URL pointing to the endpoint of the SOAP web service (https://becke-ch-ssis-s0-v1.appspot.com:443/uszipsoap), which is also used to obtain the WSDL for the web service (https://becke-ch-ssis-s0-v1.appspot.com/uszipsoap?WSDL), and click OK.

     i. Download the WSDL file “https://becke-ch-ssis-s0-v1.appspot.com/uszipsoap?WSDL” to “c:\ws\becke-ch--ssis--s0-v1\webservice\uszipsoap.wsdl”.
        PROBLEM: SSIS (AND OTHER MS TOOLS/LIBRARIES) CANNOT HANDLE NESTED WSDL/XSD FILES CONTAINING IMPORT STATEMENTS, AND FOR FLATTENING SUCH WSDL FILES ONLY COMMERCIAL TOOLS (E.G. XMLSPY ETC.) ARE AVAILABLE!
        Therefore I fell back to downloading the original (MS) WSDL file “https://github.com/cheezy/soap-object/blob/master/features/wsdl/uszip.asmx.wsdl.xml” and changing the http address!

 

     j. Add a Web Service Task to the Foreach Loop Container:

        i. On the General tab:
           Name: default: Web Service Task
           Description: default: Web Service Task
           Connection: HttpConnection: select the HTTP Connection Manager created in the previous step.
           WSDLFile: click the ellipsis button and browse to the WSDL file stored in “c:\ws\becke-ch--ssis--s0-v1\webservice\uszipsoap.wsdl”.
           OverwriteWSDLFile: True.
           DO NOT click the Download WSDL button, otherwise your WSDL file gets corrupted – and I don’t know why!

        ii. Configure the Input and Output properties for the Web Service Task:

           1. Input:
              Service: USZip
              Method (GetInfoByZIP, GetInfoByState, GetInfoByCity, GetInfoByAreaCode): GetInfoByZIP
              Clicking the checkbox in the Variable column will switch the Value column to a drop-down list.
              Value: User::postalCode

           2. Output tab:
              OutputType (File Connection (default), Variable): Variable
              Variable: User::city, and click OK.

 
     k. Add another Execute SQL Task and configure it to call the schema1v1.pr_AddCity stored procedure (a sketch of this procedure follows after this list). Configure the sequence between the Web Service Task and the Execute SQL Task accordingly.

        i. On the General tab:
           General: Name: Execute SQL Task
           General: Description: Execute SQL Task
           Options: TimeOut: 0
           Options: CodePage: 1252
           Options: TypeConversionMode: Allowed
           ResultSet: None
           SQL Statement: ConnectionType: ADO.NET
           SQL Statement: Connection: becke-ch--app--s1-v1.database.windows.net.becke-ch--app--s1-v1--uc.dbuser1v1
           SQL Statement: SQLSourceType: Direct input
           SQL Statement: SQLStatement: schema1v1.pr_AddCity
           SQL Statement: IsQueryStoreProcedure: True
           SQL Statement: BypassPrepare: True

        ii. On the Parameter Mappings tab: configure the Parameter Mapping properties of the Execute SQL Task:
           Click the Add button.
           Select User::city from the Variable Name drop-down list.
           Select Xml as the Data Type.
           Type "city" without the quotation marks as the Parameter Name and click OK.

 
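The stored procedure schema1v1.pr_AddCity called in step k is never defined in this walkthrough. A minimal sketch of what it presumably looks like, assuming (as in the CodeProject original) it simply stores the web-service response XML in the WebServiceData table from step 2:

CREATE PROCEDURE [schema1v1].[pr_AddCity] @city xml
AS
SET NOCOUNT ON
-- Store the raw GetInfoByZIP response XML for later querying
INSERT INTO [schema1v1].[WebServiceData] ([City]) VALUES (@city);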

ERROR: "Microsoft.SqlServer.Dts.Tasks.WebServiceTask.WebserviceTaskException: Could not execute the Web method. The error is: Fehler beim Reflektieren des Typs 'System.Xml.XmlText'.

SSIS-Paket "c:\ws\becke-ch--ssis--s0-v1\webservice\becke-ch--ssis--s0-v1--webservice\becke-ch--ssis--s0-v1--webservice\Package.dtsx" gestartet.

Fehler: 0xC002F304 bei Task "Webdienst", Task "Webdienst": Fehler mit folgender Fehlermeldung: "Microsoft.SqlServer.Dts.Tasks.WebServiceTask.WebserviceTaskException: Could not execute the Web method. The error is: Fehler beim Reflektieren des Typs 'System.Xml.XmlText'..

   bei Microsoft.SqlServer.Dts.Tasks.WebServiceTask.WebMethodInvokerProxy.InvokeMethod(DTSWebMethodInfo methodInfo, String serviceName, Object connection)

   bei Microsoft.SqlServer.Dts.Tasks.WebServiceTask.WebServiceTaskUtil.Invoke(DTSWebMethodInfo methodInfo, String serviceName, Object connection, VariableDispenser taskVariableDispenser)

   bei Microsoft.SqlServer.Dts.Tasks.WebServiceTask.WebServiceTask.executeThread()".

Fehler beim Task: Task "Webdienst"

Warnung: 0x80019002 bei Foreach-Schleifencontainer: SSIS-Warnungscode "DTS_W_MAXIMUMERRORCOUNTREACHED". Die Execution-Methode wurde erfolgreich ausgeführt, aber die Anzahl von ausgelösten Fehlern (1) hat den maximal zulässigen Wert erreicht (1). Deshalb tritt ein Fehler auf. Dieses Problem tritt auf, wenn die Anzahl von Fehlern den in "MaximumErrorCount" angegebenen Wert erreicht. Ändern Sie den Wert für "MaximumErrorCount", oder beheben Sie die Fehler.

Warnung: 0x80019002 bei Package: SSIS-Warnungscode "DTS_W_MAXIMUMERRORCOUNTREACHED". Die Execution-Methode wurde erfolgreich ausgeführt, aber die Anzahl von ausgelösten Fehlern (1) hat den maximal zulässigen Wert erreicht (1). Deshalb tritt ein Fehler auf. Dieses Problem tritt auf, wenn die Anzahl von Fehlern den in "MaximumErrorCount" angegebenen Wert erreicht. Ändern Sie den Wert für "MaximumErrorCount", oder beheben Sie die Fehler.

SSIS-Paket "c:\ws\becke-ch--ssis--s0-v1\webservice\becke-ch--ssis--s0-v1--webservice\becke-ch--ssis--s0-v1--webservice\Package.dtsx" fertig gestellt: Fehler.

Das Programm "[4348] DtsDebugHost.exe: DTS" wurde mit Code 0 (0x0) beendet.

 

REASON: It seems that “Microsoft.SqlServer.Dts.Tasks.WebServiceTask” cannot handle the following construct containing “<s:complexType mixed="true">”:

      <s:element name="GetInfoByZIPResponse">

        <s:complexType>

          <s:sequence>

            <s:element minOccurs="0" maxOccurs="1" name="GetInfoByZIPResult">

              <s:complexType mixed="true">

                <s:sequence>

                  <s:any />

                </s:sequence>

              </s:complexType>

            </s:element>

          </s:sequence>

        </s:complexType>

      </s:element>

LETS TRY THE FOLLOWING:

      <s:element name="GetInfoByZIPResponse">

        <s:complexType>

          <s:sequence>

            <s:element minOccurs="0" maxOccurs="1" name="GetInfoByZIPResult" type="s:string">

            <!--s:element minOccurs="0" maxOccurs="1" name="GetInfoByZIPResult">

              <s:complexType mixed="true">

                <s:sequence>

                  <s:any />

                </s:sequence>

              </s:complexType>

            </s:element-->

          </s:sequence>

        </s:complexType>

      </s:element>

THIS SEEMS TO WORK – QED: “<s:complexType mixed="true">” AND/OR “<s:sequence> <s:any />...” IS NOT WORKING!

There is not much useful information on the web regarding: WebserviceTaskException System.Xml.XmlText complexType any

https://stackoverflow.com/questions/16759748/xmlattribute-xmltext-cannot-be-used-to-encode-complex-types

https://stackoverflow.com/questions/2567414/correct-xml-serialization-and-deserialization-of-mixed-types-in-net

FACT: SSIS IS UNUSABLE FOR WEB SERVICES BECAUSE IT CANNOT PROPERLY HANDLE WSDL SPECIFICATIONS!

 

A.7.3. Provision the Azure-SSIS Integration Runtime in Azure Data Factory

https://docs.microsoft.com/en-gb/azure/data-factory/tutorial-deploy-ssis-packages-azure

  1. Create an Azure SQL DB & server (see chapter A.5) with the following settings:

Server:

Security: DB: We don’t set up a firewall for now and we don’t do AAD authentication (instead using SQL authentication).

  2. Select New on the left menu, select Data + Analytics, and then select Data Factory:

     a. Name (must be globally unique – see https://docs.microsoft.com/en-gb/azure/data-factory/naming-rules): becke-ch-ssis-s0-v1-uc

     b. Subscription: MSDN Subscription

     c. Resource group (Create new/Use existing): Use existing: becke-ch--ssis--s0-v1

     d. Version: V2

     e. Location: West Europe

  3. Select Author & Monitor to open the Data Factory user interface (UI) on a separate tab:

     a. On the Let's get started page, select the Configure SSIS Integration Runtime tile.

     b. On the General Settings page of Integration Runtime Setup, complete the following steps:

        i. Name (can contain only letters, numbers and the dash (-) character; the first and last characters must be a letter or number; every dash must be immediately preceded and followed by a letter or a number; consecutive dashes are not permitted in an integration runtime name!): becke-ch-ssis-s0-v1-uc-xyz

        ii. Description: enter the description of your integration runtime.

        iii. Location (select the same location as your database server to host SSISDB!): West Europe

        iv. Node Size (select a large node size (scale up) if you want to run many compute/memory-intensive packages; here, select the smallest: D1 V2: 1: 3.50 GiB: 50 GiB: $0.592/hour: $0.202/hour (~66%)):

        v. Node Number (select a large cluster with many nodes (scale out) if you want to run many packages in parallel: 1..10): 1

        vi. Edition/License (select the SQL Server edition/license for your integration runtime: Standard or Enterprise; select Enterprise if you want to use advanced/premium features on your integration runtime): Standard

        vii. Save Money (select Yes if you want to bring your own SQL Server license with Software Assurance to benefit from cost savings with hybrid use): No

        viii. Click Next.

     c. On the SQL Settings page, complete the following steps:

        i. Subscription (subscription that has your database server to host SSISDB): MSDN Subscription

        ii. Location (location of your database server to host SSISDB; select the same location as your integration runtime!): West Europe

        iii. Catalog Database Server Endpoint (created on your behalf as a standalone database, part of an Elastic Pool, or in a Managed Instance (Preview); see Compare SQL Database and Managed Instance (Preview) and Create Azure-SSIS IR in a virtual network): becke-ch--ssis--s0-v1.database.windows.net

        iv. Use AAD authentication (see Create Azure-SSIS IR with AAD authentication): do not select

        v. Admin Username (SQL authentication username for your database server): db-admin--s0-v1

        vi. Admin Password: Eo...

        vii. Catalog Database Service Tier (Basic/Standard/Premium tier or Elastic Pool name): Basic

        viii. Click Test Connection and, if successful, click Next.

     d. On the Advanced Settings page, complete the following steps:

        i. Maximum Parallel Executions Per Node (maximum number of packages to execute concurrently per node in your integration runtime cluster; select a low number if you want to use more than one core to run a single large/heavy-weight package that is compute/memory-intensive; select a high number if you want to run one or more small/light-weight packages in a single core): 1

        ii. Custom Setup Container SAS URI (optionally enter the Shared Access Signature (SAS) Uniform Resource Identifier (URI) of your Azure Storage blob container where your setup script and its associated files are stored; see Custom setup for Azure-SSIS IR): leave empty

        iii. Select a VNet (join your integration runtime to a virtual network): do not select

        iv. Click Finish to start the creation of your integration runtime (takes approximately 20 to 30 minutes to complete).

When you provision an instance of an Azure-SSIS IR, the Azure Feature Pack for SSIS and the Access Redistributable are also installed. These components provide connectivity to Excel and Access files and to various Azure data sources, in addition to the data sources supported by the built-in components. You can also install additional components. For more info, see Custom setup for the Azure-SSIS integration runtime.

     e. On the Connections tab, switch to Integration Runtimes if needed. Select Refresh to refresh the status.

  4. Create an Azure-SSIS integration runtime:

     a. In the Azure Data Factory UI, switch to the Edit tab, select Connections, and then switch to the Integration Runtimes tab to view existing integration runtimes in your data factory.
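
For reference, the same provisioning can also be scripted instead of clicked through. A minimal sketch using the AzureRM PowerShell module, with the names and values from the walkthrough above; the cmdlets follow the PowerShell variant of the Microsoft tutorial linked at the start of this chapter, so verify the exact parameter set against your module version:

# Credential of the SSISDB catalog admin (db-admin--s0-v1)
$cred = Get-Credential

# Create (or update) the Azure-SSIS integration runtime definition
Set-AzureRmDataFactoryV2IntegrationRuntime -ResourceGroupName "becke-ch--ssis--s0-v1" `
    -DataFactoryName "becke-ch-ssis-s0-v1-uc" -Name "becke-ch-ssis-s0-v1-uc-xyz" `
    -Type Managed -Location "West Europe" `
    -NodeSize "Standard_D1_v2" -NodeCount 1 -Edition "Standard" `
    -MaxParallelExecutionsPerNode 1 `
    -CatalogServerEndpoint "becke-ch--ssis--s0-v1.database.windows.net" `
    -CatalogAdminCredential $cred -CatalogPricingTier "Basic"

# Start it; this is the step that takes the 20 to 30 minutes mentioned above
Start-AzureRmDataFactoryV2IntegrationRuntime -ResourceGroupName "becke-ch--ssis--s0-v1" `
    -DataFactoryName "becke-ch-ssis-s0-v1-uc" -Name "becke-ch-ssis-s0-v1-uc-xyz" -Force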

A.8. Azure Data Factory – V1: EL: Extract & Load: Copy Data

As mentioned in chapter Error: Reference source not found, ADF V1 only supported E(xtract) & L(oad) capabilities, and the transformation had to be done in the source and/or destination, mostly using stored procedures. This functionality is still available in V2 and should be preferred for simple copying of data, instead of the rather heavy-weight SSIS functionality.

https://docs.microsoft.com/en-us/azure/data-factory/data-factory-azure-blob-connector

https://blogs.msdn.microsoft.com/bigdatasupport/2016/01/23/incremental-data-load-from-azure-table-storage-to-azure-sql-using-azure-data-factory/

 

A.8.1. Copy Data (Preview)

Based on: https://docs.microsoft.com/en-us/azure/data-factory/data-factory-copy-data-wizard-tutorial

To copy data from one location to another, Azure offers a preview feature to perform this task:

  1. Open your data factory and click on “Copy data (PREVIEW)”, which opens “https://datafactory.azure.com” in a new tab:

 
  2. Properties: enter a name and description for the copy-data task and specify how often you want to run the task:

     a. Task name (required): CopyPipeline-hpw

     b. Task description: optional: ...

     c. Task cadence or task schedule: Run once now / Run regularly on schedule: select: Run regularly on schedule

        i. Recurring pattern: Monthly/Weekly/Daily/Hourly/Minute: select: every 15 minutes
           ATTENTION: THE MINIMUM IS A 15-MINUTE RECURRENCE!!!

     d. Start date time (UTC): 07/24/2017 03:00 am

     e. End date time (UTC): 12/30/2099 11:00 pm

 

 
  3. Source data store: specify the source data store for the copy task. You can use an existing data-store connection or specify a new data store. Click HERE to suggest new copy sources or give comments.

     a. Select “Azure SQL Database”

 

 
  4. Connection: specify the Azure SQL Database:

     a. Connection name (required): Source-SQLAzure-hpw

     b. Server / database selection method (required): From Azure subscription / Enter manually: From Azure subscription

     c. Azure subscription (required): select your subscription

     d. Server name (required): becke-ch--app--s0-v1

     e. Database name (required): becke-ch--app--s0-v1--uc

     f. User name (required): db-user--s0-v1@beckech.onmicrosoft.com

     g. Password (required): ...

 

ERROR: Error when processing request, Type=Microsoft.Data.Mashup.MashupValueException, Message=Microsoft SQL: Cannot open server "beckech.onmicrosoft.com" requested by the login. The login failed., Source=Microsoft.Data.Mashup,', ActivityId: ....

REASON/SOLUTION: AD authentication is not supported; use SQL authentication instead.

 

 
  5. Select the table: [schema0v1].[Benutzer]

 

 
  6. Apply filter: None / custom query: None

 

  7. Destination: repeat the same steps as for the source,

BUT the destination sources are limited:

 
 
  8. Table mapping: for each table you selected to copy in the source data store, select a corresponding table in the destination data store, or specify the stored procedure to run at the destination. Select: [schema0v1].[Benutzer]

 
  9. Schema mapping: choose how source and destination columns are mapped.

 
  10. Settings: more options for data movement: leave the settings at their defaults:

      a. Performance settings:

         i. Advanced settings:

            1. Parallel copy:

               a. Cloud units: 1, 2, 4, 8, 16, 32: Auto (default)

               b. Parallel copies: 1, 2, 3, 4, ..., 32: Auto (default)

 

Summary:

 

Start Deployment:

 
A.8.2. Modify & Deploy

Open the Data Factory and click on “Author and Deploy”:

 

Write the output data to a separate schema, “schema0v1dest”:

  1. Open the output dataset:

  2. Change “tableName”: "[schema0v1].[Benutzer]" → "[schema0v1dest].[Benutzer]" and click “Deploy”:

{
    "name": "OutputDataset-xfz",
    "properties": {
        "structure": [
            {
                "name": "ID",
                "type": "Int32"
            },
            {
                "name": "LastName",
                "type": "String"
            },
            {
                "name": "FirstName",
                "type": "String"
            }
        ],
        "published": false,
        "type": "AzureSqlTable",
        "linkedServiceName": "Destination-SQLAzure-xfz",
        "typeProperties": {
            "tableName": "[schema0v1dest].[Benutzer]"
        },
        "availability": {
            "frequency": "Minute",
            "interval": 15
        },
        "external": false,
        "policy": {}
    }
}

 

A.8.3. Deployment Script

Automation script: exporting a data-factory template via automation script from the resource group (see chapter Error: Reference source not found) is not possible and gives the following error:

Export template operation completed with errors. Some resources were not exported. Please see details for more information. (Code: ExportTemplateCompletedWithErrors)

REASON: https://social.msdn.microsoft.com/Forums/azure/en-US/06b0a24a-c088-4ab1-ace0-5957bb4c2440/data-factory-resource-provider-wont-create-template-from-resource-group?forum=AzureDataFactory : ARM template export is not supported for Azure Data Factory. However, you can still create/edit an ARM template for deployment of Data Factories. Please see https://docs.microsoft.com/en-us/azure/data-factory/data-factory-build-your-first-pipeline-using-arm .

SOLUTION/WORKAROUND: There currently exists no solution to this problem, i.e. these resources need to be handled individually, outside the automation script.

Download the script from above, save it under the name “becke-ch--app--s0-v1--df--uc.json” and modify its content based on the one that was generated using the wizard in A.8.1.

 

Open questions: comparing the downloaded script with the script generated by the wizard:

Section: Linked Service – only in the wizard script:

...
    "name": "Destination-SQLAzure-xfz",
    "properties": {
        "hubName": "becke-ch-app-s0-v1-uc_hub",
        "type": "AzureSqlDatabase",
...

Section: Dataset – only in the wizard script:

...
        ],
        "published": false,
        "type": "AzureSqlTable",
        "linkedServiceName": "Source-SQLAzure-xfz",
        "typeProperties": {
            "tableName": "[schema0v1].[Benutzer]"
        },
        "availability": {
            "frequency": "Minute",
            "interval": 15
        },
        "policy": {}
    }
...

Section: Dataset – only in the downloaded script:

...
            "type": "datasets",
            "name": "[variables('azureSqlDatasetNameFrom')]",
            "dependsOn": [
              "[variables('dataFactoryName')]",
              "[variables('azureSqlLinkedServiceNameFrom')]"
            ],
            "apiVersion": "2015-10-01",
            "properties": {
...

Section: Pipeline – only in the wizard script:

...
                    "sink": {
                        "type": "SqlSink",
                        "writeBatchSize": 0,
                        "writeBatchTimeout": "00:00:00"
                    },
...
                "policy": {
                    "timeout": "1.00:00:00",
                    "concurrency": 1,
                    "executionPriorityOrder": "NewestFirst",
                    "style": "StartOfInterval",
                    "retry": 3,
                    "longRetry": 0,
                    "longRetryInterval": "00:00:00"
                },
...
        "start": "2017-07-05T06:33:01.03Z",
        "end": "2099-12-30T23:00:00Z",
        "isPaused": false,
        "hubName": "becke-ch-app-s0-v1-uc_hub",
        "pipelineMode": "Scheduled"
    }
...

Section: Pipeline – only in the downloaded script:

...
            ],
            "apiVersion": "2015-10-01",
            "properties": {
                    "sink": {
                      "type": "SqlSink",
                      "sqlWriterCleanupScript": "$$Text.Format('DELETE FROM {0}', 'emp')"
                    },
...

 

IMPORTANT CHANGES: Edit the deployment script “becke-ch--app--s0-v1--df--uc.json” and perform the following important changes, otherwise the deployment fails with the errors below (see chapter A.8.6 for the full error output and solutions):

The provided location 'West Europe' is not available for resource type 'Microsoft.DataFactory/dataFactories'

Name: 'becke-ch--app--s0-v1--df--uc' is not a valid name. For Data Factory naming restrictions, please see http://msdn.microsoft.com/library/dn835027.aspx",

  "code": "InvalidName"

 

 

Deployment:

  1. Follow the steps in chapter: Error: Reference source not found.

  2. Deploy the Data Factory entities using the Resource Manager template you created above: New-AzureRmResourceGroupDeployment -Name becke-ch--app--s0-v1--adf--uc-deployment -ResourceGroupName becke-ch--app--s0-v1 -TemplateFile C:\ws\data\becke-ch--ms-cloud--s0-v1\document\becke-ch--app--s0-v1--df--uc.json -TemplateParameterFile  C:\ws\data\becke-ch--ms-cloud--s0-v1\document\becke-ch--app--s0-v1--df--uc-param.json

PS C:\Users\raoul-becke--s0-v1> New-AzureRmResourceGroupDeployment -Name becke-ch--app--s0-v1--adf--uc-deployment -ResourceGroupName becke-ch--app--s0-v1 -TemplateFile C:\ws\data\becke-ch--ms-cloud--s0-v1\document\becke-ch--app--s0-v1--df--uc.json -TemplateParameterFile  C:\ws\data\becke-ch--ms-cloud--s0-v1\document\becke-ch--app--s0-v1--df--uc-param.json

DeploymentName          : becke-ch--app--s0-v1--adf--uc-deployment

ResourceGroupName       : becke-ch--app--s0-v1

ProvisioningState       : Succeeded

Timestamp               : 25.07.2017 20:14:26

Mode                    : Incremental

TemplateLink            :

Parameters              :

                          Name             Type                       Value

                          ===============  =========================  ==========

                          sqlServerNameFrom  String                     becke-ch--app--s0-v1

                          databaseNameFrom  String                     becke-ch--app--s0-v1--uc

                          sqlServerUserNameFrom  String                     dbuser0v1

                          sqlServerPasswordFrom  SecureString

                          sqlTableFrom     String                     [schema0v1].[Benutzer]

                          sqlServerNameTo  String                     becke-ch--app--s0-v1

                          databaseNameTo   String                     becke-ch--app--s0-v1--uc

                          sqlServerUserNameTo  String                     dbuser0v1

                          sqlServerPasswordTo  SecureString

                          sqlTableTo       String                     [schema0v1dest].[Benutzer]

Outputs                 :

DeploymentDebugLogLevel :

 

A.8.4. Incremental Load & Data Delay

https://docs.microsoft.com/en-us/azure/data-factory/data-factory-scheduling-and-execution

https://docs.microsoft.com/en-us/azure/data-factory/data-factory-repeatable-copy

As described in the articles above we can achieve an incremental load by adapting the “sqlReaderQuery” to the “WindowStart” and “WindowEnd” times of the slice:

"sqlReaderQuery": "$$Text.Format('SELECT * FROM [schema0v1].[Benutzer] WHERE modification >= \\'{0:yyyy-MM-dd HH:mm}\\' AND modification < \\'{1:yyyy-MM-dd HH:mm}\\'', WindowStart, WindowEnd)"

ISSUE: Sometimes data arrives delayed, i.e. a record is propagated and persisted only after its modification time-stamp, so the slice covering that time-stamp can miss it.

Dataset: Offset: The offset that can be set on a dataset does not help, because it shifts the whole slice start and end; what we want is that a given slice retrieves its records while accounting for the propagation/persistence delay.

SOLUTION: TODATETIMEOFFSET: https://docs.microsoft.com/en-us/sql/t-sql/functions/todatetimeoffset-transact-sql : In the following sample we assume that the maximum delay between a record's modification time-stamp and its actual persistence is 5 minutes (the second argument of TODATETIMEOFFSET is an offset in minutes):

"sqlReaderQuery": "$$Text.Format('SELECT * FROM [schema0v1].[Benutzer] WHERE modification >= TODATETIMEOFFSET (\\'{0:yyyy-MM-dd HH:mm}\\', +5) AND modification < TODATETIMEOFFSET (\\'{1:yyyy-MM-dd HH:mm}\\', +5)', WindowStart, WindowEnd)"

ALTERNATIVE SOLUTION: https://docs.microsoft.com/en-us/azure/data-factory/data-factory-create-datasets

dataDelay  The time to delay the check on the availability of the external data for the given slice. For example, you can delay an hourly check by using this setting.

The setting only applies to the present time. For example, if it is 1:00 PM right now and this value is 10 minutes, the validation starts at 1:10 PM.

Note that this setting does not affect slices in the past. Slices with Slice End Time + dataDelay < Now are processed without any delay.

Times greater than 23:59 hours should be specified by using the day.hours:minutes:seconds format. For example, to specify 24 hours, don't use 24:00:00. Instead, use 1.00:00:00. If you use 24:00:00, it is treated as 24 days (24.00:00:00). For 1 day and 4 hours, specify 1.04:00:00.
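A sketch of where dataDelay would go in the input dataset definition, assuming the ADF v1 “externalData” policy section (the 10-minute value is an example, not taken from a verified deployment):

        "availability": {
            "frequency": "Minute",
            "interval": 15
        },
        "external": true,
        "policy": {
            "externalData": {
                "dataDelay": "00:10:00"
            }
        }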

 

A.8.5. Performance Improvement aka Reduce Interval below 15 minutes

PRECONDITION: The incremental load (see previous chapter A.8.4) has been established.

ISSUE: Your database changes are processed at most every 15 minutes, on HH:00:00, HH:15:00, HH:30:00 and HH:45:00 - but you need the changes at the destination every 5 minutes (on HH:00:00, HH:05:00, HH:10:00, HH:15:00, HH:20:00, ...)

SOLUTION: Edit the deployment script “becke-ch--app--s0-v1--df--uc.json” and perform the following changes:

  1. Create two more copies of the existing Pipeline section, Input-Dataset section and Output-Dataset section. Give these copies different names (“name” attribute), and in the Pipeline section reference these names accordingly in the “dependsOn”, “inputs” and “outputs” attributes.

          {

            "type": "datasets",

            "name": "[concat(variables('azureSqlDatasetNameFrom'),'05')]",

            "dependsOn": [

          {

            "type": "datasets",

            "name": "[concat(variables('azureSqlDatasetNameFrom'),'10')]",

            "dependsOn": [

          {

            "type": "datasets",

            "name": "[concat(variables('azureSqlDatasetNameTo'),'05')]",

            "dependsOn": [

          {

            "type": "datasets",

            "name": "[concat(variables('azureSqlDatasetNameTo'),'10')]",

            "dependsOn": [

          {

            "type": "datapipelines",

            "name": "[concat(variables('pipelineName'),'05')]",

            "dependsOn": [

              "[concat(variables('azureSqlDatasetNameFrom'),'05')]",

              "[concat(variables('azureSqlDatasetNameTo'),'05')]"

            ],

            "apiVersion": "2015-10-01",

            "properties": {

              "activities": [

                {

                  "inputs": [

                    {

                      "name": "[concat(variables('azureSqlDatasetNameFrom'),'05')]"

                    }

                  ],

                  "outputs": [

                    {

                      "name": "[concat(variables('azureSqlDatasetNameTo'),'05')]"

                    }

                  ],

          {

            "type": "datapipelines",

            "name": "[concat(variables('pipelineName'),'10')]",

            "dependsOn": [

              "[concat(variables('azureSqlDatasetNameFrom'),'10')]",

              "[concat(variables('azureSqlDatasetNameTo'),'10')]"

            ],

            "apiVersion": "2015-10-01",

            "properties": {

              "activities": [

                {

                  "inputs": [

                    {

                      "name": "[concat(variables('azureSqlDatasetNameFrom'),'10')]"

                    }

                  ],

                  "outputs": [

                    {

                      "name": "[concat(variables('azureSqlDatasetNameTo'),'10')]"

                    }

                  ],

 

  2. Because datasets are only processed every 15 minutes (barring failures), exactly on HH:00:00, HH:15:00, HH:30:00 and HH:45:00, you need to give the two dataset copies an offset: "offset": "00:05:00" respectively "offset": "00:10:00":

          {

            "type": "datasets",

            "name": "[concat(variables('azureSqlDatasetNameFrom'),'05')]",

            "properties": {

              "availability": {

                "frequency": "Minute",

                "interval": 15,

                "offset": "00:05:00"

              },

          {

            "type": "datasets",

            "name": "[concat(variables('azureSqlDatasetNameFrom'),'10')]",

            "properties": {

              "availability": {

                "frequency": "Minute",

                "interval": 15,

                "offset": "00:10:00"

              },

          {

            "type": "datasets",

            "name": "[concat(variables('azureSqlDatasetNameTo'),'05')]",

            "properties": {

              "availability": {

                "frequency": "Minute",

                "interval": 15,

                "offset": "00:05:00"

              },

          {

            "type": "datasets",

            "name": "[concat(variables('azureSqlDatasetNameTo'),'10')]",

            "properties": {

              "availability": {

                "frequency": "Minute",

                "interval": 15,

                "offset": "00:10:00"

              },

 

  3. With the incremental load established – see chapter A.8.4 – you need to make sure that the overlapping slices do not load the same data several times. This is achieved by adapting the TODATETIMEOFFSET calls so that the effective window size is reduced to 5 minutes:

          {

            "type": "datapipelines",

            "name": "[variables('pipelineName')]",

            "properties": {

              "activities": [

                {

                  "typeProperties": {

                    "source": {

                        "type": "SqlSource",

                        "sqlReaderQuery": "$$Text.Format('SELECT * FROM [schema0v1].[Benutzer] WHERE modification >= TODATETIMEOFFSET (\\'{0:yyyy-MM-dd HH:mm}\\', -5) AND modification < TODATETIMEOFFSET (\\'{1:yyyy-MM-dd HH:mm}\\', +5)', WindowStart, WindowEnd)"

          {

            "type": "datapipelines",

            "name": "[concat(variables('pipelineName'),'05')]",

            "properties": {

              "activities": [

                {

                  "typeProperties": {

                    "source": {

                        "type": "SqlSource",

                        "sqlReaderQuery": "$$Text.Format('SELECT * FROM [schema0v1].[Benutzer] WHERE modification >= TODATETIMEOFFSET (\\'{0:yyyy-MM-dd HH:mm}\\', -5) AND modification < TODATETIMEOFFSET (\\'{1:yyyy-MM-dd HH:mm}\\', +5)', WindowStart, WindowEnd)"

          {

            "type": "datapipelines",

            "name": "[concat(variables('pipelineName'),'10')]",

            "properties": {

              "activities": [

                {

                  "typeProperties": {

                    "source": {

                        "type": "SqlSource",

                        "sqlReaderQuery": "$$Text.Format('SELECT * FROM [schema0v1].[Benutzer] WHERE modification >= TODATETIMEOFFSET (\\'{0:yyyy-MM-dd HH:mm}\\', -5) AND modification < TODATETIMEOFFSET (\\'{1:yyyy-MM-dd HH:mm}\\', +5)', WindowStart, WindowEnd)"

 

 

A.8.6. ERROR & SOLUTION

ERROR: The provided location 'West Europe' is not available for resource type 'Microsoft.DataFactory/dataFactories':

PS C:\Users\raoul-becke--s0-v1> New-AzureRmResourceGroupDeployment -Name becke-ch--app--s0-v1--adf--uc-deployment -ResourceGroupName becke-ch--app--s0-v1 -TemplateFile C:\ws\data\becke-ch--ms-cloud--s0-v1\document\becke-ch--app--s0-v1--df--uc.json -TemplateParameterFile  C:\ws\data\becke-ch--ms-cloud--s0-v1\document\becke-ch--app--s0-v1--df--uc-param.json

New-AzureRmResourceGroupDeployment : 21:56:20 - Error: Code=LocationNotAvailableForResourceType; Message=The provided location 'West Europe' is not available for resource type 'Microsoft.DataFactory/dataFactories'. List of available regions for the resource type is 'westus,northeurope,eastus,westcentralus'.

At line:1 char:1

+ New-AzureRmResourceGroupDeployment -Name becke-ch--app--s0-v1--adf--u ...

+ ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~

    + CategoryInfo          : NotSpecified: (:) [New-AzureRmResourceGroupDeployment], Exception

    + FullyQualifiedErrorId : Microsoft.Azure.Commands.ResourceManager.Cmdlets.Implementation.NewAzureResourceGroupDep

   loymentCmdlet

New-AzureRmResourceGroupDeployment : The deployment validation failed

At line:1 char:1

+ New-AzureRmResourceGroupDeployment -Name becke-ch--app--s0-v1--adf--u ...

+ ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~

    + CategoryInfo          : CloseError: (:) [New-AzureRmResourceGroupDeployment], InvalidOperationException

    + FullyQualifiedErrorId : Microsoft.Azure.Commands.ResourceManager.Cmdlets.Implementation.NewAzureResourceGroupDep

   loymentCmdlet

SOLUTION: Fix location error: Edit “becke-ch--app--s0-v1--df--uc.json” and set: "location": "North Europe"

 

ERROR: Name: 'becke-ch--app--s0-v1--df--uc' is not a valid name. For Data Factory naming restrictions, please see http://msdn.microsoft.com/library/dn835027.aspx",  "code": "InvalidName"

PS C:\Users\raoul-becke--s0-v1> New-AzureRmResourceGroupDeployment -Name becke-ch--app--s0-v1--adf--uc-deployment -ResourceGroupName becke-ch--app--s0-v1 -TemplateFile C:\ws\data\becke-ch--ms-cloud--s0-v1\document\becke-ch--app--s0-v1--df--uc.json -TemplateParameterFile  C:\ws\data\becke-ch--ms-cloud--s0-v1\document\becke-ch--app--s0-v1--df--uc-param.json

New-AzureRmResourceGroupDeployment : 22:01:45 - Resource Microsoft.DataFactory/datafactories

'becke-ch--app--s0-v1--df--ucdzsa54yhmvwma' failed with message '{

  "message": "Name: 'becke-ch--app--s0-v1--df--uc' is not a valid name. For Data Factory naming restrictions, please see http://msdn.microsoft.com/library/dn835027.aspx",

  "code": "InvalidName"

}'

At line:1 char:1

+ New-AzureRmResourceGroupDeployment -Name becke-ch--app--s0-v1--adf--u ...

+ ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~

    + CategoryInfo          : NotSpecified: (:) [New-AzureRmResourceGroupDeployment], Exception

    + FullyQualifiedErrorId : Microsoft.Azure.Commands.ResourceManager.Cmdlets.Implementation.NewAzureResourceGroupDep

   loymentCmdlet

New-AzureRmResourceGroupDeployment : 22:01:55 - Template output evaluation skipped: at least one resource deployment

operation failed. Please list deployment operations for details. Please see https://aka.ms/arm-debug for usage details.

At line:1 char:1

+ New-AzureRmResourceGroupDeployment -Name becke-ch--app--s0-v1--adf--u ...

+ ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~

    + CategoryInfo          : NotSpecified: (:) [New-AzureRmResourceGroupDeployment], Exception

    + FullyQualifiedErrorId : Microsoft.Azure.Commands.ResourceManager.Cmdlets.Implementation.NewAzureResourceGroupDep

   loymentCmdlet

New-AzureRmResourceGroupDeployment : 22:01:55 - Template output evaluation skipped: at least one resource deployment

operation failed. Please list deployment operations for details. Please see https://aka.ms/arm-debug for usage details.

At line:1 char:1

+ New-AzureRmResourceGroupDeployment -Name becke-ch--app--s0-v1--adf--u ...

+ ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~

    + CategoryInfo          : NotSpecified: (:) [New-AzureRmResourceGroupDeployment], Exception

    + FullyQualifiedErrorId : Microsoft.Azure.Commands.ResourceManager.Cmdlets.Implementation.NewAzureResourceGroupDep

   loymentCmdlet

DeploymentName          : becke-ch--app--s0-v1--adf--uc-deployment

ResourceGroupName       : becke-ch--app--s0-v1

ProvisioningState       : Failed

Timestamp               : 25.07.2017 20:01:52

Mode                    : Incremental

TemplateLink            :

Parameters              :

                          Name             Type                       Value

                          ===============  =========================  ==========

                          sqlServerNameFrom  String                     becke-ch--app--s0-v1

                          databaseNameFrom  String                     becke-ch--app--s0-v1--uc

                          sqlServerUserNameFrom  String                     dbuser0v1

                          sqlServerPasswordFrom  SecureString

                          sqlTableFrom     String                     [schema0v1].[Benutzer]

                          sqlServerNameTo  String                     becke-ch--app--s0-v1

                          databaseNameTo   String                     becke-ch--app--s0-v1--uc

                          sqlServerUserNameTo  String                     dbuser0v1

                          sqlServerPasswordTo  SecureString

                          sqlTableTo       String                     [schema0v1dest].[Benutzer]

Outputs                 :

DeploymentDebugLogLevel :

SOLUTION: Fix naming error – see https://docs.microsoft.com/en-us/azure/data-factory/data-factory-naming-rules – double hyphens “--” are not allowed: Edit “becke-ch--app--s0-v1--df--uc.json” and set: "dataFactoryName": "beckech-app-s0-v1-df-uc",

 

ISSUE: “Waiting: Dataset dependencies” and “The upstream dependencies are not ready”

SOLUTION: See https://stackoverflow.com/questions/38123673/azure-waiting-dataset-dependencies-in-manual-created-pipelines : setting the “external” property in the input dataset JSON definition to true lets Azure Data Factory know that the data is generated externally and does not come from another pipeline in the factory. A new deployment is needed, because changing the existing dataset in place fails with:

Error: Dataset 'AzureSqlDatasetTo' is output of activity 'CopyFromAzureSQLToAzureSQL' in pipeline 'SQL2SQLPipeline'. It cannot change to external.

    a. Edit “becke-ch--app--s0-v1--df--uc.json”:

      i. Input dataset: set: "external": true

              "availability": {

                "frequency": "Minute",

                "interval": 15

              },

              "external": true

            }

      ii. Output dataset: set: "external": false

              "availability": {

                "frequency": "Minute",

                "interval": 15

              },

              "external": false

            }

 

ERROR: Cannot set active period Start=..., End=... for pipeline 'SQL2SQLPipeline00' due to conflicts on Output with Pipeline: SQL2SQLPipeline05, Activity CopyFromAzureSQLToAzureSQL05, Period: Start=..., End=...

Cannot set active period Start=08/07/2017 20:45:00, End=08/07/2017 23:30:00 for pipeline 'SQL2SQLPipeline00' due to conflicts on Output: AzureSqlDatasetTo with Pipeline: SQL2SQLPipeline05, Activity CopyFromAzureSQLToAzureSQL05, Period: Start=08/07/2017 20:45:00, End=08/07/2017 23:30:00

SOLUTION: https://social.msdn.microsoft.com/Forums/expression/en-US/17802772-cf18-4e8c-a0e6-3af805c7856a/pipelineactivity-conflict?forum=AzureDataFactory

Use different datasets! You need a separate input and output dataset for each pipeline! If you only use different output datasets, this particular error goes away, but you then cannot have different start and end times!

 

ERROR: Database operation failed on server 'Source:tcp:becke-ch--app--s0-v1.database.windows.net,1433' with SQL Error Number '208'. Error message from database execution : Invalid object name 'parameters'..

SOLUTION: The sqlReaderQuery was missing the concat operation, was therefore not constructed correctly, and was looking for a table named “parameters('sqlTableFrom')”:

                  "typeProperties": {

                    "source": {

                        "type": "SqlSource",

                        "sqlReaderQuery": "[concat('SELECT * FROM ',parameters('sqlTableFrom'))]"

                    },

...

 

 

A.9. Key Vault
A.9.1. Security
  1. AD Security Group: According to the naming convention in chapter Error: Reference source not found, create an AD security group (see chapter Error: Reference source not found): “becke-ch--app--s0-v1--adgrp--kv”

A.9.2. Setup
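A minimal PowerShell sketch for the set-up, assuming the naming convention above (the vault name is an assumption – similar to the data-factory naming restriction in chapter A.8.6, vault names do not allow consecutive hyphens):

# Create the key vault in the existing resource group:
New-AzureRmKeyVault -VaultName 'becke-ch-app-s0-v1-kv' -ResourceGroupName 'becke-ch--app--s0-v1' -Location 'North Europe'

# Grant the AD security group created above read access to secrets:
$group = Get-AzureRmADGroup -SearchString 'becke-ch--app--s0-v1--adgrp--kv'
Set-AzureRmKeyVaultAccessPolicy -VaultName 'becke-ch-app-s0-v1-kv' -ObjectId $group.Id -PermissionsToSecrets get,list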

 

 

 

 
 
A.10. Power BI
A.10.1. Desktop Installation
  1. Download: https://powerbi.microsoft.com/en-us/desktop/

  2. Execute: PBIDesktop_x64.msi

  3. Welcome Screen: Click “Next”:

  4. Microsoft Software License Terms: Select “I accept the terms in the License Agreement” and click “Next”:

  5. Destination Folder: Leave the default “C:\Program Files\Microsoft Power BI Desktop\” and click “Next”:

  6. Ready to install: Click “Install” - you need to log in as administrator to perform the installation!

 
A.10.2. Power BI Desktop

 

  1. Splash Screen: “Sign in to get the most out of Power BI … sign in to create and share dashboards, and get automatic insights from your data ...” - I skip this step because I don’t have a “Power BI Pro” license with this account to create group workspaces, create content packs and publish them.

  2. Click “Get Data”, select “More ...” and select “Azure SQL Database” (it actually pops up the same dialog as if you were selecting “SQL Server”):

  3. SQL Server Database: Enter the connection details:

    a. Server: becke-ch--app--s0-v1.database.windows.net

    b. Database (optional): becke-ch--app--s0-v1--uc

    c. Data connectivity mode: Select “DirectQuery” (to always show the latest data in your report – alternatively you can select “Import”, which is the default):

      i. DirectQuery: https://powerbi.microsoft.com/en-us/documentation/powerbi-desktop-use-directquery/

        1. Supported data sources: SQL Server, Azure SQL Database, Azure SQL Data Warehouse, SAP HANA, Oracle Database, Teradata Database, Amazon Redshift (Preview), Impala (Preview), Snowflake (Preview)

        2. Benefits:

          a. Build visualizations over very large datasets, where it would otherwise be unfeasible to first import all of the data

          b. DirectQuery reports always use current data

          c. The 1 GB dataset limitation does not apply to DirectQuery

        3. Limitations:

          a. All tables must come from a single database

          b. If the Query Editor query is overly complex, an error occurs; to remedy it you must either delete the problematic step in Query Editor or import the data instead of using DirectQuery

          c. Relationship filtering is limited to a single direction, rather than both directions

          d. Time intelligence capabilities are not available in DirectQuery

          e. By default, limitations are placed on the DAX expressions allowed in measures

          f. There is a 1 million row limit for returning data when using DirectQuery

        4. Important:

          a. Performance and load: All DirectQuery requests are sent to the source database, so the time required to refresh a visual depends on how long that back-end source takes to respond with the results of the query (or queries).

          b. Using Row Level Security (RLS) can have a significant impact as well; a non-RLS dashboard tile shared by multiple users results in a single query to the database, but using RLS on a dashboard tile usually means the refresh of a tile requires one query per user, significantly increasing the load on the source database and potentially impacting performance.

          c. Security: All users who consume a published report connect to the back-end data source using the credentials entered after publication to the Power BI service. BUT the good thing is that RLS is being considered!

    d. Advanced Options:

      i. Command timeout in minutes (optional): [leave empty] (default)

        1. https://powerbi.microsoft.com/en-us/blog/power-bi-desktop-march-update-feature-summary/ : You can now specify a command timeout value (in minutes) when connecting to database sources. This was previously possible only via custom formula authoring. (This should help avoid connection timeouts on large datasets, which truncate the data.)

      ii. SQL statement (optional, requires database): [leave empty] (default)

      iii. Include relationship columns: Selected (default)

        1. https://community.powerbi.com/t5/Desktop/include-relationship-columns-option-in-query-editor/td-p/56644 : Functionality: Given tableA with a relationship to tableB: when connecting to the database with 'Include Relationship Columns' ticked, clicking 'Select related tables' in the Navigator window automatically picks tableB; without that option ticked, 'Select related tables' does not pick tableB.

      iv. Navigate using full hierarchy: Not selected (default)

      v. Enable SQL Server Failover support: Not selected (default)

 

 

Authentication: SQL Azure connector with Azure AD Auth: NOT SUPPORTED!

https://ideas.powerbi.com/forums/265200-power-bi-ideas/suggestions/10172688-sql-azure-connector-with-azure-ad-auth

 

 

 

Therefore trying Database / SQL User authentication:

 

ERROR: Details: "Microsoft SQL: Cannot open server 'becke-ch--app--s0-v1' requested by the login. Client with IP address '77.58.5.221' is not allowed to access the server.  To enable access, use the Windows Azure Management Portal or run sp_set_firewall_rule on the master database to create a firewall rule for this IP address or address range.  It may take up to five minutes for this change to take effect.".
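To enable access, create a firewall rule for the client IP address, either in the portal (below) or from PowerShell; a sketch (the rule name is an assumption):

New-AzureRmSqlServerFirewallRule -ResourceGroupName 'becke-ch--app--s0-v1' -ServerName 'becke-ch--app--s0-v1' -FirewallRuleName 'power-bi-desktop-client' -StartIpAddress '77.58.5.221' -EndIpAddress '77.58.5.221'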

 

https://portal.azure.com – open the SQL Server “becke-ch--app--s0-v1”:

 

 

 

Power BI Desktop: Navigator:

 

A.10.3. Power BI Pro

Developer License (MSDN)

https://my.visualstudio.com/subscriptions

Click on "MSDN Platforms": https://my.visualstudio.com/subscriptions#

Go to "Power BI Pro" and click "Get Code": https://my.visualstudio.com/

"Success – The coupon code has been successfully retrieved. Click "Activate" to proceed."

 

You get the message (shown in garbled German in the UI): “You already have a license for your work account which is administered by your organization. If you want to register for Power BI Pro Developer (MSDN), sign out from your work account and create a new account that you administer yourself.”

Click on "Create new account" (Neues Konto erstellen")

You are then signed out and redirected to: https://signup.microsoft.com

But instead of creating a new account click on “Add to an existing subscription”:

 

Login: https://login.microsoftonline.com

Email: ms-cloud--s0-v1@beckech.onmicrosoft.com

Password: …

 
 

Enter address details and click on “Continue”:

 

Check order details and click on “Continue”:

 

 

Check Payment method: monthly payment (default)

And click on "Submit order"

 

 

 

Now everything is ready:

Assign new users to the subscription: https://portal.office.com/admin/default.aspx#ActiveUsersPage

Go to administrator start page: https://www.office.com

A.10.4. Assign license to user
  1. Login to: https://portal.office.com

  2. On the left side open “Billing” and click on “Licenses”

  3. On the line “Power BI Pro” click on “Assign now”:

  4. Under “Users → Active Users” select the user you want to assign the license to, e.g. “ms-cloud--s0-v1”

  5. On the line “Product licenses ...” click on the “Edit” link

  6. Select the location, activate the “Power BI Pro” license and click “Save”:

 
A.10.5. Portal

Login to: https://www.office.com

 

Click on the “Power BI” icon: https://app.powerbi.com/ :

 

Click on “Sign in”:

 
A.10.6. Create Workspace

Click on “Workspaces” and click on “Create app workspace”:

 

 

 

A.11. Storage Account
A.11.1. Set-Up

https://docs.microsoft.com/en-us/azure/storage/storage-introduction

 
A.11.2. File Share

 

 

 
A.11.3. Security

https://docs.microsoft.com/en-us/azure/storage/storage-security-guide

Build-Time: Administration: RBAC: During build-time, respectively for administration, security is based on RBAC (Role-Based Access Control) – BUT ONLY IF the Resource Manager based deployment model was chosen! This means that authentication and authorization are based on AAD, i.e. we can bind the resource roles Owner/Contributor/Reader to AD users and/or AD security groups.

Run-Time: Data Security: Storage Account Keys or Shared Access Signatures and Stored Access Policies: Data Plane Security refers to the methods used to secure the data objects stored in Azure Storage – the blobs, queues, tables, and files.
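A PowerShell sketch of the run-time pattern, retrieving the account key and deriving a short-lived, read-only shared access signature for a file share (the storage-account and share names are assumptions):

# Retrieve the first account key and build a storage context:
$key = (Get-AzureRmStorageAccountKey -ResourceGroupName 'becke-ch--app--s0-v1' -Name 'beckechapps0v1')[0].Value
$ctx = New-AzureStorageContext -StorageAccountName 'beckechapps0v1' -StorageAccountKey $key

# Issue a SAS token that grants read access to the share for one hour:
New-AzureStorageShareSASToken -ShareName 'share0v1' -Permission 'r' -ExpiryTime (Get-Date).AddHours(1) -Context $ctx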

 

A.11.4. Build-Time: Administration: RBAC

If a user has no role on the storage account or on the higher-level artifacts (resource group or subscription), then the storage account and its underlying Blobs, Files, Tables and Queues are not visible to him.

Reader: Adding a user as reader to the storage account via IAM:

 

Adding a user as Reader gives him access to view the Storage Account BUT he cannot access the underlying Blobs, Files, Tables and Queues:

Access denied

You do not have access

Looks like you don't have access to this content. To get access, please contact the owner.

Contributor: Giving a user the Contributor role gives him access to view the Storage Account AND to the underlying Blobs, Files, Tables and Queues. Contributors can perform all operations on the storage account except assigning access. Someone with this role can view and regenerate the storage account keys, and with the storage account keys they can access the data objects.
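Such role assignments can also be scripted; a PowerShell sketch (the user, subscription id and storage-account name are assumptions):

New-AzureRmRoleAssignment -SignInName 'ms-cloud--s0-v1@beckech.onmicrosoft.com' -RoleDefinitionName 'Contributor' -Scope '/subscriptions/<subscription-id>/resourceGroups/becke-ch--app--s0-v1/providers/Microsoft.Storage/storageAccounts/beckechapps0v1'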

 
A.12. Azure Container Instance (ACI)

https://docs.microsoft.com/en-us/azure/container-instances/

A.12.1. Create Container Instance
 
 

Summary:

Basics

Subscription: MSDN Platforms

Resource group: becke-ch--httpd--s0-v1

Location: West Europe

Container name: becke-ch--httpd--s0-v1--2wayssl

Container image type: Public

Container image: httpd:2.4

Configuration

OS Type: Linux

Number of cores: 1

Memory (GB): 0.5

Public IP address: Yes

DNS name label: becke-ch--httpd--s0-v1--2wayssl

Port: 80

Open additional ports: No

Port protocol: TCP

Restart policy: Always

Environment variable: -

Add additional environment variables: No

Command override: -

 

And last but not least, accessing the running container: http://becke-ch--httpd--s0-v1--2wayssl.westeurope.azurecontainer.io
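The same container group can also be created non-interactively with the Azure CLI; a sketch mirroring the summary above (not a captured session):

az container create ^
  --resource-group becke-ch--httpd--s0-v1 ^
  --name becke-ch--httpd--s0-v1--2wayssl ^
  --image httpd:2.4 ^
  --os-type Linux ^
  --cpu 1 ^
  --memory 0.5 ^
  --ip-address Public ^
  --dns-name-label becke-ch--httpd--s0-v1--2wayssl ^
  --ports 80 ^
  --restart-policy Always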

 
A.12.2. Execute a command in a running Azure container instance

Precondition: Install Azure CLI as described in chapter Error: Reference source not found.

az container exec --resource-group <group-name> --name <container-group-name> --exec-command "<command>"

C:\Users\raoul-becke--s0-v1>az container exec --resource-group becke-ch--httpd--s0-v1 --name becke-ch--httpd--s0-v1--2wayssl --exec-command "/bin/bash"

<c6f8662aae6477-803088677-tph13:/usr/local/apache2#

 

 

 

A.13. Azure Cloud Services

https://docs.microsoft.com/en-us/azure/cloud-services/cloud-services-choose-me

Azure Cloud Services is an example of a platform as a service (PaaS). Like Azure App Service, this technology is designed to support applications that are scalable, reliable, and inexpensive to operate. In the same way that App Service is hosted on virtual machines (VMs), so too is Azure Cloud Services. However, you have more control over the VMs. You can install your own software on VMs that use Azure Cloud Services, and you can access them remotely.

More control also means less ease of use. Unless you need the additional control options, it's typically quicker and easier to get a web application up and running in the Web Apps feature of App Service compared to Azure Cloud Services.

There are two types of Azure Cloud Services roles – web roles, which are hosted via IIS, and worker roles, which run your code standalone. The only difference between the two is how your role is hosted on the VMs.

It is suggested that all the VMs in a single application run in the same cloud service. Users access the application through a single public IP address, with requests automatically load balanced across the application's VMs. The platform scales and deploys the VMs in an Azure Cloud Services application in a way that avoids a single point of hardware failure.

Even though applications run in VMs, it's important to understand that Azure Cloud Services provides PaaS, not infrastructure as a service (IaaS). Here's one way to think about it. With IaaS, such as Azure Virtual Machines, you first create and configure the environment your application runs in. Then you deploy your application into this environment. You're responsible for managing much of this world, by doing things such as deploying new patched versions of the operating system in each VM. In PaaS, by contrast, it's as if the environment already exists. All you have to do is deploy your application. Management of the platform it runs on, including deploying new versions of the operating system, is handled for you.

There's nothing like a Virtual Machines data disk. Instead, an Azure Cloud Services application should explicitly write all state to Azure SQL Database, blobs, tables, or some other external storage. Building applications this way makes them easier to scale and more resistant to failure, which are both important goals of Azure Cloud Services.

 

A.13.1. Creation

  1. Click “Create”

  2. Enter the basic information:

    a. DNS name: becke-ch--myapp--s0-v1

    b. Subscription:

    c. Resource Group: …

    d. Location: West Europe

  3. Optional: Select a package: Optionally a package file “.cspkg” or “.zip” and a configuration file “.cscfg” can be uploaded from the local file system or from blob storage and given a deployment label.

 

A.13.2. Setup
 
A.14. Create new Directory

The problem with the private (trial) subscription in chapter A.1.1 is that a directory and domain are created which do not fit our naming-convention needs (for organization (trial) subscriptions, see chapter A.1.4, we can choose a domain name that fits our needs):

(Trial) subscription:

 

 

https://portal.azure.com/?whr=live.com#create/Microsoft.AzureActiveDirectory