MS Cloud Foundation
Architecture Foundation - Appendix
Company / Organization: becke.ch
Scope: 0.0 {language={en}; document-type={ott}; organization={becke.ch}; interest={business}; domain={architecture}; level={foundation}; technology={ms-cloud,azure,office-365,dynamics-365}}
Version: 1.0.0
File-name: becke-ch--ms-cloud--s1-v1--foundation.odt
Author: ms-cloud--s0-v1@becke.ch
Copyright © 2019 becke.ch – All rights reserved
Document Version History
Version | Date | Author | Description |
1.0.0 | 07.2019 | Raoul Becke | Initial version: Changes according to requirements; see table below version 1.0.0. |
Work-Product Version History
Version | Date | Author | Requirements | Components Changed |
1.0.0 | 07.2019 | Raoul Becke | Due to its size, the appendix of document “becke-ch--ms-cloud--s0-v1” (see [1]) was moved into this separate document. | Document |
Illustration Index
Index of Tables
Table 1: References
Table 2: Glossary
This document gives an overview of the MS cloud offerings with a focus on MS Azure. It is a summary of many different Microsoft Cloud articles spread over the Internet.
This document contains only the appendix moved over from “becke-ch--ms-cloud--s0-v1” (see [1]).
Reference | Location | Remarks |
[1] | MS Cloud foundation: Architecture Foundation. | |
Terms / Abbreviations / Acronyms | Description |
DC | Data center – see the corresponding chapter in [1] |
1.Go to “https://azure.microsoft.com/en-us/free/ “
2.Click on the link “Start free”; you will then be redirected to “https://login.microsoftonline.com/common/oauth2/authorize”
3.Click on link “Use another account”
4.Click on the link “Create a new Microsoft account”; you are then redirected to the page: “https://signup.live.com/signup”
5.Enter the following information:
a.Email: ms-cloud--s0-v1@becke.ch
b.Pwd: e…
c.Optional De-Select “Send me promotional emails from Microsoft”
6.Click on button “Next”
7.Check your email account:
From: Microsoft account team <account-security-noreply@accountprotection.microsoft.com>
Subject: Verify your email address
Microsoft account – Verify your email address: To finish setting up your Microsoft account, we just need to make sure this email address is yours. To verify your email address use this security code: 8... If you didn't request this code, you can safely ignore this email. Someone else might have typed your email address by mistake. Thanks, The Microsoft account team
8.Enter the code you received via Email and click “Next”
9.Enter the characters you see on the captcha and click “Next”; you are then redirected to the page “https://account.azure.com/signup”
10.Enter the information “About you”:
a.Country Region: Switzerland
b.First Name: Microsoft
c.Last Name: Cloud (ms-cloud--s0-v1)
d.Email: ms-cloud--s0-v1@becke.ch
e.Work Phone: 004179…
f.Organization Optional: Leave Empty
11.Identity verification by phone:
a.Switzerland (+41)
b.79…
c.Enter the message you received into the text message field: 785…
12.Identity verification by card:
a.Payment method: New Debit/Credit Card
b.Card number: …
c.Expiration date: 07/2018
d.CVV: …
e.Name on card: Raoul …
f.Address line 1: S... 3
g.Address line 2: [empty]
h.City: Elgg
i.State/Province/District: [empty]
j.Postal Code: 8353
13.Agreement – click on “Agree” - you are then redirected to page: “https://account.azure.com/signuplanding/Index “
The subscription is active and you can proceed to the following main administration pages:
•Azure Portal: https://portal.azure.com/?whr=live.com
•Account/Subscription administration: https://account.azure.com/Subscriptions
1.Navigate to “http://portal.azure.com” which redirects you to “https://login.microsoftonline.com “
2.Enter your credentials:
a.User: ms-cloud--s0-v1@becke.ch
b.Password: ...
3.You are then redirected to: https://login.live.com/
1.Follow the steps described in chapter A.1.1 but when reaching step “https://account.azure.com/signup ” you get the message: “You already used your one free trial or you are not eligible for it.”:
2.Trying with a different user: ms-cloud--s0-v2@becke.ch
BUT after the last step you get the error message: “We cannot proceed with signup due to an issue with your account. Please contact support.”
https://account.azure.com/organization
Domain Name Handling Insufficient: The domain name handling in Microsoft Azure is insufficient – see the corresponding chapter in [1].
1.About you
a.FIRST NAME: Microsoft
b.LAST NAME: Cloud (ms-cloud--s0-v1)
c.CONTACT EMAIL ADDRESS: ms-cloud--s0-v1@becke.ch
d.ORGANIZATION NAME (OPTIONAL): becke.ch
2.Your login and country information:
a.DOMAIN NAME: beckech (.onmicrosoft.com) “check availability” (see issues above)
b.NEW USER ID: ms-cloud--s0-v1 (@beckech.onmicrosoft.com)
c.COUNTRY/REGION: Switzerland
d.CREATE NEW PASSWORD: E…
e.CONFIRM NEW PASSWORD: E…
3.Your mobile number and verification:
a.Your mobile number and verification: Switzerland (+41)
b.SEND TEXT MESSAGE / CALL ME: 79 ... (You get an SMS from +447781470659)
c.Verify: 155...
4.You are redirected to: Log-In to Microsoft Azure: https://login.microsoftonline.com :
Login: ms-cloud--s0-v1@beckech.onmicrosoft.com
Password: E...
5.You are redirected to: Account Azure: https://account.azure.com
6.About you:
a.* Country/Region: Switzerland
b.* First Name: Microsoft
c.* Last Name: Cloud (ms-cloud--s0-v1)
d.* Email address for important notifications: ms-cloud--s0-v1@becke.ch
e.* Work Phone: 79 ...
f.Organization (Optional): becke.ch
7.Identity Verification by phone:
a.Switzerland (+41)
b.79 ... (“Send text message”) (You get an SMS from +447781470659)
c.Verify: 572... (“Verify Code”)
8.Identity Verification by card:
a.Payment method: New Credit/Debit Card: Visa / MasterCard / American Express (Your card will not be charged, though you might see a temporary authorization hold.)
b.* Card number: 3776 …
c.* Expiration date: 07/2018
d.* CVV: …
e.* Name on card: Ra…
f.* Address line 1: So…
g.Address line 2 (Optional): [empty]
h.* City: E…
i.State/Province/District (Optional): [empty]
j.* Postal Code: 8...
9.Agreement:
a.Select: “I agree to the subscription agreement, offer details, and privacy statement.”
b.Optional: Select: “Microsoft may use my email and phone to provide special Microsoft Azure offers.”
Welcome:
Get started with your Azure subscription: https://portal.azure.com
AND the good thing: even though I previously had a trial subscription (see above) which has expired, I could again get a trial subscription:
There exist 2 different ways to change the subscription directory:
•Classic Portal: https://manage.windowsazure.com/
◦Click on “SETTINGS”: Select Subscription and click on “Edit Directory”
◦Drop-Down: Select Directory: “becke.ch - directory - scope: 0: {stage={development}} - version 1” and click “Next”
◦Click “Complete” and then click “OK” to reload the portal
•Azure Accounts Center:
Chrome (or Firefox): Always use the Chrome (or Firefox) browser, e.g. to avoid “strange” SSO behavior of IE.
Email: Wait for your confirmation of the MSDN Platform Subscription:
My Visual Studio: Developer License (MSDN): https://my.visualstudio.com/subscriptions
Log in with your business account.
Go to “Subscriptions” and click on “Add alternate account”:
Go to “Benefits” and copy the hyperlink on “Activate”: https://account.windowsazure.com/signup?offer=MS-AZ... (DO NOT CLICK ON THE HYPERLINK!)
Close all browsers.
Log-In with your private account: https://my.visualstudio.com/
Enter the hyperlink (you copied previously) in the browser: https://account.windowsazure.com/signup?offer=MS-AZ...
Select: “I agree to the subscription agreement, offer details, and privacy statement” and click on “Purchase”
DONE – you can now click on “Or get started with your Azure subscription”: https://portal.azure.com/
SUCCESS:
If you get the error: Oops: It appears that you have already used your MSDN benefit for a Microsoft Azure Subscription:
Solution: Go to the person responsible for your MSDN subscription and have your MSDN subscription deleted and assigned anew. You will then receive the email again and can start as described in the previous chapter.
We create a new Standard User for the DB Access:
•Name: Database User - scope: 0: {stage={global}} - version: 1
•User name: db-user--s0-v1@beckech.onmicrosoft.com
•Profile (Address book information for the user - default): Not configured
•Groups (default): 0 groups selected
•Directory role: User
•Password: X...
Create a database active directory user for DB Access:
•Name: Database AD User - scope: 0: {stage={global}} - version: 1
•User name: dbaduser0v1@beckech.onmicrosoft.com
•Profile (Address book information for the user - default): Not configured
•Groups (default): 0 groups selected
•Directory role: User
•Password: G...
Create a database active directory user for DB Access that is member of group db-user-group--s0-v1:
•Name: Database AD User - scope: 1: {stage={global}} - version: 1
•User name: dbaduser1v1@beckech.onmicrosoft.com
•Profile (Address book information for the user - default): Not configured
•Groups (default): db-user-group--s0-v1
•Directory role: User
•Password: D...
Create another user as db administrator:
•Name: Database AD Administrator - scope: 0: {stage={global}} - version: 1
•User name: db-ad-admin--s0-v1@beckech.onmicrosoft.com
•Profile (Address book information for the user - default): Not configured
•Groups (default): 0 groups selected
•Directory role: User
•Password: Z...
Create user for account access:
•Name: Storage Account User - scope: 0: {stage={global}} - version: 1
•User name: storage-account-user--s0-v1@beckech.onmicrosoft.com
•Profile (Address book information for the user - default): Not configured
•Groups (default): 0 groups selected
•Directory role: User
•Password: J...
Create user for key vault access:
•Name: Key Vault User - scope: 0: {stage={global}} - version: 1
•User name: key-vault-user--s0-v1@beckech.onmicrosoft.com
•Profile (Address book information for the user - default): Not configured
•Groups (default): 0 groups selected
•Directory role: User
•Password: K...
•User name: db-user--s0-v1@beckech.onmicrosoft.com
•Password: X...
•Change password: Li…
•User name: db-ad-admin--s0-v1@beckech.onmicrosoft.com
•Password: Z...
•Change password: Li...
•User name: dbaduser0v1@beckech.onmicrosoft.com
•Password: Z...
•Change password: Li…
•User name: dbaduser1v1@beckech.onmicrosoft.com
•Password: D...
•Change password: E...
•User name: storage-account-user--s0-v1@beckech.onmicrosoft.com
•Password: J...
•Change password: E...
•User name: key-vault-user--s0-v1@beckech.onmicrosoft.com
•Password: K…
•Change password: E…
1.Click on “Users”
2.Click on “+ New Guest User”
3.Enter the user details:
◦User name: e.g.: mytest@gmail.com
◦Include a personal message with the invitation: e.g. “This is a test ...”
4.In the email invitation click on the button “Get Started”:
5.Scenario: “User has no MS Account”: There exists no MS cloud account with this email address; a new account and password (Eo…) need to be created:
a.An email is sent to verify the email address and the user needs to enter the code on the signup page:
b.Fill out the captcha (to prove you’re a real person and not a robot):
c.And finally the user needs to review and accept the permissions:
i.Sign you in: becke.ch will be able to sign you into their apps and resources.
ii.Read your profile info: becke.ch will be able to read your basic profile (name and email address).
d.Which brings the user to the home page: https://account.activedirectory.windowsazure.com
e.Respectively, when logging into the Azure portal, the user gets the following screen:
f.And when logging in as administrator in the Azure portal we can see that this guest user is part of our AAD:
6.Scenario: “User has MS Account”: All steps are identical to “5.” with the following difference:
a.The user is instantly visible in the AAD:
Based on the corresponding chapter in [1]:
We create a new security group (NOT an Office 365 group) containing the DB users:
•Name: db-user-group--s0-v1
•Description: Database User Group - scope: 0: {stage={global}} - version: 1
•Membership Type: “Assigned”
•Enable Office features? NO
http://www.techmikael.com/2017/02/all-you-never-wanted-to-know-about.html
The Azure AD Admin UI allows you to create the following:
•Security Group – an AAD security group with explicitly added members.
◦Membership type = Assigned
◦Enable Office features = No
•Security Group – an AAD security group with dynamic members.
◦Membership type = Dynamic User (or device)
◦Enable Office features = No
•An Office 365 group with explicitly added members.
◦Membership type = Assigned
◦Enable Office features = Yes
•An Office 365 group with dynamic members
◦Membership type = Dynamic User (or device)
◦Enable Office features = Yes
Database: (Created an identical 2nd DB and DB-Server for “MSDN Subscription” but having “...s1-v1...” in the name):
•Database name: becke-ch--app--s0-v1--uc
•Subscription: Free Trial
•Resource group (Create new/Use existing): Create new: becke-ch--app--s0-v1
•Select source: Blank database
•Want to use SQL elastic pool? (Yes/Not now): Not now
•Pricing tier: Configure required settings: Default: “Standard S3: 100 DTU, 250 GB (Monthly cost 135.46 CHF)” → change to: “Basic: 5 DTU, 2 GB (Monthly cost 4.51 CHF)”
•Collation: SQL_Latin1_General_CP1_CI_AS
•Pin to dashboard: Optional: Yes (makes it easier to find it)
•Click “Create”
•(Ignore: Automation options)
Server: (Created an identical 2nd DB and DB-Server for “MSDN Subscription” but having “...s1-v1...” in the name):
•Configure required settings: Server: Create a new server (No servers found)
•Server name: becke-ch--app--s0-v1 (.database.windows.net)
•Server admin login: db-admin--s0-v1
•Password: E••••••••
•Confirm password: E••••••••
•Location: West Europe
Login:
Log-In with db-admin--s0-v1:
CREATE USER db-user--s0-v1;
Failed to execute query. Error: Incorrect syntax near '-'.
CREATE USER dbuser0v1;
Failed to execute query. Error: 'dbuser0v1' is not a valid login or you do not have permission.
https://docs.microsoft.com/en-us/azure/sql-database/sql-database-manage-logins
CREATE USER dbuser0v1 WITH PASSWORD = 'E...';
CREATE SCHEMA schm--s0-v1;
The dashes in the name cause issues: unquoted T-SQL identifiers may not contain dashes, and a double dash (“--”) even starts a line comment, so schm--s0-v1 cannot be used as-is.
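If the dashed naming should be kept, T-SQL identifiers can be delimited with square brackets. A minimal sketch (not used in this walkthrough, which switches to the simpler name schema0v1 instead):
CREATE SCHEMA [schm--s0-v1];
GRANT CONTROL ON SCHEMA :: [schm--s0-v1] TO dbuser0v1;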
CREATE SCHEMA schema0v1;
https://docs.microsoft.com/en-us/sql/t-sql/statements/grant-object-permissions-transact-sql
GRANT ALL ON schm--s0-v1 TO dbuser0v1;
Failed to execute query. Error: Incorrect syntax near 'schm'.
GRANT ALL ON schema0v1 TO dbuser0v1;
Failed to execute query. Error: Cannot find the object 'schema0v1', because it does not exist or you do not have permission.
https://docs.microsoft.com/en-us/sql/t-sql/statements/grant-schema-permissions-transact-sql
GRANT ALL ON SCHEMA :: schema0v1 TO dbuser0v1;
Failed to execute query. Error: The all permission has been deprecated and is not available for this class of entity
An alternative to “ALL” is “CONTROL”:
GRANT CONTROL ON SCHEMA :: schema0v1 TO dbuser0v1;
GRANT CREATE TABLE TO dbuser0v1;
GRANT CREATE PROCEDURE TO dbuser0v1;
https://docs.microsoft.com/en-us/sql/t-sql/statements/revoke-schema-permissions-transact-sql
REVOKE ALTER ON SCHEMA :: schema0v1 TO dbuser0v1;
https://stackoverflow.com/questions/497317/how-can-i-view-all-grants-for-an-sql-database
SELECT
class_desc
, CASE WHEN class = 0 THEN DB_NAME()
WHEN class = 1 THEN OBJECT_NAME(major_id)
WHEN class = 3 THEN SCHEMA_NAME(major_id) END [Securable]
, USER_NAME(grantee_principal_id) [User]
, permission_name
, state_desc
FROM sys.database_permissions
Log-in with user: dbuser0v1
CREATE TABLE schema0v1.Benutzer ( ID int NOT NULL, LastName nvarchar(50), FirstName nvarchar(50), modification datetime);
Failed to execute query. Error: CREATE TABLE permission denied in database 'becke-ch--app--s0-v1--uc'.
https://serverfault.com/questions/252808/with-sql-server-2008-can-you-grant-create-table-permission-within-a-schema
Requires CREATE TABLE permission in the database and ALTER permission on the schema in which the table is being created.
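A minimal sketch of the grants that satisfy this requirement, run as db-admin--s0-v1 (the CONTROL grant on the schema shown above already implies ALTER, so usually only the database-level CREATE TABLE right is missing):
GRANT CREATE TABLE TO dbuser0v1;
GRANT ALTER ON SCHEMA :: schema0v1 TO dbuser0v1;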
--INSERT INTO schema0v1.Benutzer VALUES (1, 'Raoul', 'Becke', SYSDATETIME());
INSERT INTO schema0v1.Benutzer VALUES (1, 'Raoul', 'Becke', '20170810 03:44:09 AM');
INSERT INTO schema0v1.Benutzer VALUES (2, 'Ludwig', 'Becke', '20170809 07:16:10 AM');
INSERT INTO schema0v1.Benutzer VALUES (3, 'Yaro', 'Becke', '20170809 07:21:10 AM');
INSERT INTO schema0v1.Benutzer VALUES (4, 'Ian', 'Becke', '20170809 07:26:10 AM');
SELECT * FROM schema0v1.Benutzer;
SELECT count(*) FROM schema0v1.Benutzer;
DROP TABLE schema0v1.Benutzer;
https://docs.microsoft.com/en-us/sql/t-sql/statements/truncate-table-transact-sql
TRUNCATE TABLE schema0v1.Benutzer;
CREATE SCHEMA schema0v1dest;
GRANT CONTROL ON SCHEMA :: schema0v1dest TO dbuser0v1;
GRANT CREATE TABLE TO dbuser0v1;
CREATE TABLE schema0v1dest.Benutzer ( ID int NOT NULL, LastName nvarchar(50), FirstName nvarchar(50), modification datetime);
Log-In with db-admin--s0-v1:
https://docs.microsoft.com/en-us/azure/sql-database/sql-database-manage-logins
CREATE USER [db-user--s0-v1@beckech.onmicrosoft.com] FROM EXTERNAL PROVIDER;
Failed to execute query. Error: Principal 'db-user--s0-v1@beckech.onmicrosoft.com' could not be created. Only connections established with Active Directory accounts can create other Active Directory users.
SOLUTION: Configure an Azure Active Directory administrator for the SQL server and connect with that account – see the next section.
Log-In with db-ad-admin--s0-v1@beckech.onmicrosoft.com (SELECT “Active Directory Password Authentication”!!):
CREATE USER [db-user--s0-v1@beckech.onmicrosoft.com] FROM EXTERNAL PROVIDER;
CREATE USER [dbaduser0v1@beckech.onmicrosoft.com] FROM EXTERNAL PROVIDER;
CREATE SCHEMA schema1v1;
GRANT CONTROL ON SCHEMA :: schema1v1 TO [db-user--s0-v1@beckech.onmicrosoft.com];
GRANT CONTROL ON SCHEMA :: schema1v1 TO [dbaduser0v1@beckech.onmicrosoft.com];
GRANT CREATE TABLE TO [db-user--s0-v1@beckech.onmicrosoft.com];
GRANT CREATE TABLE TO [dbaduser0v1@beckech.onmicrosoft.com];
Log-in with user: db-user--s0-v1@beckech.onmicrosoft.com (SELECT “Active Directory Password Authentication”)
CREATE TABLE schema1v1.Benutzer ( ID int NOT NULL, LastName nvarchar(25), FirstName nvarchar(25));
INSERT INTO schema1v1.Benutzer VALUES (1, 'Raoul2', 'Becke2');
SELECT * FROM schema1v1.Benutzer;
Precondition: Make sure the group “db-user-group--s0-v1” was created and user “dbaduser1v1@beckech.onmicrosoft.com” has been assigned.
Log-In with db-ad-admin--s0-v1@beckech.onmicrosoft.com (SELECT “Active Directory Password Authentication”!!):
https://docs.microsoft.com/en-us/azure/sql-database/sql-database-aad-authentication-configure
To create a contained database user representing an Azure AD or federated domain group, provide the display name of a security group:
CREATE USER [db-user-group--s0-v1] FROM EXTERNAL PROVIDER;
GRANT CONTROL ON SCHEMA :: schema1v1 TO [db-user-group--s0-v1];
GRANT CREATE TABLE TO [db-user-group--s0-v1];
Log-in with user: dbaduser1v1@beckech.onmicrosoft.com (SELECT “Active Directory Password Authentication”)
CREATE TABLE schema1v1.Benutzer ( ID int NOT NULL, LastName nvarchar(25), FirstName nvarchar(25));
INSERT INTO schema1v1.Benutzer VALUES (1, 'Raoul2', 'Becke2');
SELECT * FROM schema1v1.Benutzer;
ERROR: Failed to execute query. Error: Execution Timeout Expired. The timeout period elapsed prior to completion of the operation or the server is not responding.
ANALYZE: Run the following query:
SELECT sqltext.TEXT,
req.session_id,
req.status,
req.command,
req.cpu_time,
req.total_elapsed_time
FROM sys.dm_exec_requests req
CROSS APPLY sys.dm_exec_sql_text(sql_handle) AS sqltext
Which returns the following result – e.g.:
A query gets suspended when it is requesting access to a resource that is currently not available. This can be a logical resource like a locked row or a physical resource like a memory data page. The query starts running again, once the resource becomes available.
High disk IO means that a lot of data pages need to be accessed to fulfill the request.
That is all that I can tell from the above screenshot. However, if I were to speculate, you probably have an IO subsystem that is too slow to keep up with the demand. This could be caused by missing indexes or an actually too slow disk. Keep in mind, that 15000 reads for a single OLTP query is slightly high but not uncommon.
Finding all currently blocked requests
The following example queries sys.dm_exec_requests to find information about blocked requests.
SELECT sqltext.TEXT, req.session_id , req.status , req.blocking_session_id
, req.wait_type , req.wait_time , req.wait_resource
, req.transaction_id
FROM sys.dm_exec_requests req
CROSS APPLY sys.dm_exec_sql_text(sql_handle) AS sqltext
WHERE req.status = N'suspended'
SOLUTION: https://docs.microsoft.com/en-us/sql/t-sql/language-elements/kill-transact-sql – Start with the root blocking query, i.e. the query that has blocking_session_id = 0, and kill its session; this should then unblock all subsequent queries!
KILL 95
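To find the head of a blocking chain directly, instead of reading it off the full result set, the same DMV can be filtered to sessions that block others but are themselves not blocked – a sketch (note that an idle head blocker with an open transaction has no row in sys.dm_exec_requests and would not show up here; the session id 95 above is simply the value observed in this run):
SELECT req.session_id, req.command, req.wait_type
FROM sys.dm_exec_requests req
WHERE req.blocking_session_id = 0
AND req.session_id IN (SELECT blocking_session_id FROM sys.dm_exec_requests WHERE blocking_session_id <> 0);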
•Name: becke-ch--app--s0-v1--uc
•Subscription: Free Trial
•Resource group: Create new/Use existing: Use existing: becke-ch--app--s0-v1
•Location: West Europe
•Pin to dashboard: Optional: Yes
•Automation options: Ignore
Switch to “Manually enter connection information”:
•*Connection Name: becke-ch--app--s0-v1--uc--dbuser0v1
•*SQL server name: becke-ch--app--s0-v1
•*SQL database name: becke-ch--app--s0-v1--uc
•*Username: db-user--s0-v1@beckech.onmicrosoft.com (also tried: dbuser0v1 and dbaduser0v1@beckech.onmicrosoft.com)
•*Password: E...
ERROR: Please check your account info and/or permissions and try again. Details: A network-related or instance-specific error occurred while establishing a connection to SQL Server. The server was not found or was not accessible. Verify that the instance name is correct and that SQL Server is configured to allow remote connections. (provider: TCP Provider, error: 0 - No such host is known.) inner exception: Microsoft SQL: A network-related or instance-specific error occurred while establishing a connection to SQL Server. The server was not found or was not accessible. Verify that the instance name is correct and that SQL Server is configured to allow remote connections. (provider: TCP Provider, error: 0 - No such host is known.)
SOLUTION:
•*SQL server name: becke-ch--app--s0-v1.database.windows.net
ERROR: Please check your account info and/or permissions and try again. Details: Cannot open server "beckech.onmicrosoft.com" requested by the login. The login failed.
inner exception: Microsoft SQL: Cannot open server "beckech.onmicrosoft.com" requested by the login. The login failed.
What are supported authentication types for SQL connector in Logic APP?
Does it support the Azure AD authentication (as described here https://azure.microsoft.com/en-gb/documentation/articles/sql-database-aad-authentication/). In Edit API connection, I tried to change the Authentication type to Active Directory Password, but same is not working.
…
The SQL Connector only supports windows and sql server login for authentication. It doesn't yet support AAD based auth.
It will be great if you can open feature request on our uservoice .
As mentioned in [1], ADF V2 now fully supports E(xtract), T(ransform) & L(oad) capabilities using SSIS (SQL Server Integration Services).
SQL Server Data Tools, BIDS, Visual Studio – What Do You Need? See Article: https://whitepages.unlimitedviz.com/2015/04/sql-server-data-tools-bids-visual-studio/
Server | Toolset Name | Location |
SQL Server 2008 R2 and below | Business Intelligence Development Studio (BIDS) | On SQL Media |
SQL Server 2012 and 2012 SP1 | SQL Server Data Tools | On SQL Media |
SQL Server 2014 | SQL Server Data Tools – BI | Download for Visual Studio 2012 Download for Visual Studio 2013 |
Download and install SQL Server Data Tools (SSDT) for Visual Studio: See https://docs.microsoft.com/en-us/sql/ssdt/download-sql-server-data-tools-ssdt?view=sql-server-2017
Visual Studio versus “Stand-Alone”: If Visual Studio is not installed, selecting “Install a new SQL Server Data Tools instance” installs SSDT with a minimal version of Visual Studio, but for the best experience we recommend using SSDT with the latest version of Visual Studio. BUT personally I would suggest keeping it simple: if you only focus on SSIS (SSAS or SSRS) development, just install SSDT standalone from here: https://docs.microsoft.com/en-us/sql/ssdt/download-sql-server-data-tools-ssdt?view=sql-server-2017#ssdt-for-vs-2017-standalone-installer
1.Go to: https://docs.microsoft.com/en-us/sql/ssdt/download-sql-server-data-tools-ssdt?view=sql-server-2017#ssdt-for-vs-2017-standalone-installer
2.Download: English (United States): https://go.microsoft.com/fwlink/?linkid=875613&clcid=0x409
3.Run the installer: “SSDT-Setup-ENU.exe”
4.Click “Next”
5.Install all features i.e. use default selection:
a."Install new SQL Server Data Tools for Visual Studio 2017 Instance"
b.Set a nickname for the new Visual Studio 2017 instance: "SSDT"
c.Install tools for these SQL Server features:
i.SQL Server Database
ii.SQL Server Analysis Services
iii.SQL Server Integration Services
6.Click "Install"
7.Log-in as administrator to continue the installation.
ERROR: Error: Failed to get VS Instances: Retrieving the COM class factory for the component with CLSID {177F0C4A-1CD3-4DE7-A32C-71DBBB9FA36D} failed due to the following error: 80040154 Class not registered (Exception from HRESULT: 0x80040154 (REGDB_E_CLASSNOTREG))., stackTrace: at System.RuntimeTypeHandle.CreateInstance(RuntimeType type, Boolean publicOnly, Boolean noCheck, Boolean& canBeCached, RuntimeMethodHandleInternal& ctor, Boolean& bNeedSecurityCheck)
at System.RuntimeType.CreateInstanceSlow(Boolean publicOnly, Boolean skipCheckThis, Boolean fillCache, StackCrawlMark& stackMark)
at System.RuntimeType.CreateInstanceDefaultCtor(Boolean publicOnly, Boolean skipCheckThis, Boolean fillCache, StackCrawlMark& stackMark)
at System.Activator.CreateInstance(Type type, Boolean nonPublic)
at System.Activator.CreateInstance(Type type)
at Microsoft.Sql.DataTools.ManagedBootstrapperApp.ViewModel.MainViewModel.GetVsInstances()
[0A80:0A84][2018-07-14T18:07:59]e000: MainViewModel.OnPackageAction: Cannot get SQL instance
[0A80:0A84][2018-07-14T18:07:59]i000: MainViewModel.OnPackageAction: Full list of installed instances:
SOLUTION: https://blogs.msdn.microsoft.com/heaths/2017/09/01/cleaning-up-corrupt-visual-studio-instances/
We have mitigated a couple of possible causes and continue to explore both mitigations and long-term remedies, but in the meantime, you will need to clean up, or obliterate, the instance and re-install.
Open an elevated command prompt.
Run: %ProgramFiles(x86)%\Microsoft Visual Studio\Installer\resources\app\layout\InstallCleanup.exe -i
This will clean up only instance data such as the installation directory and state. This will make re-installing the instance faster since any workloads and components you had installed would remain cached and would not be re-downloaded.
8.If everything goes well, there should now be an entry in the Start menu: “Visual Studio 2017 (SSDT)”, and the program is located in: “C:\Program Files (x86)\Microsoft Visual Studio\2017\SQL\Common7\IDE\devenv.exe”
9.Starting “Visual Studio 2017 (SSDT)” the next step is either to “sign in” with an existing account, “register” for a new account or “skip this step”:
a.I decided to sign-in:
b.Log in with my azure account “ms-cloud--s0-v1@beckech.onmicrosoft.com” and password:
i.Settings (General, JavaScript, Visual Basic, Visual C#, Visual C++, Web Development, Web Development (only code)): Choose Default: General
ii.Color scheme (Blue, Blue (additional contrast), Dark, Light): Default: Blue
10.If this entry is missing, I suggest de-installing all the following packages (and starting the installation again): “Microsoft Visual C++ 2010 x86 Redistributable”, “Microsoft SQL Server 2012 Native Client”, “Microsoft System CLR Types for SQL Server 2017”, “Microsoft SQL Server 2016 LocalDB”, “Microsoft Visual Studio Tools for Applications 2017”, “Microsoft Visual C++ 2017 Redistributable (x86)”, “Microsoft SQL Server Data Tools – Visual Studio 2017”, “Microsoft Visual C++ 2017 Redistributable (x64)”, “Microsoft SQL Server Data Tools for Visual Studio”, “Microsoft Visual Studio Installer”.
This sample is based on https://www.codeproject.com/Articles/1182165/Call-a-Web-Service-using-SSIS-then-Store-and-Query applied to the Azure platform.
1.Analogous to chapter A.5.2, connect to the database “becke-ch--app--s1-v1--uc” with user “dbuser1v1”
2.Create an “input” table “CityData” containing the “PostalCode” and an “output” table “WebServiceData” containing the entire address (city, state, area, …) as XML string.
CREATE TABLE [schema1v1].[CityData]([PostalCode] VARCHAR(10) NOT NULL) ON [PRIMARY];
CREATE TABLE [schema1v1].[WebServiceData]([City] [xml] NULL) ON [PRIMARY] TEXTIMAGE_ON [PRIMARY];
The xml data type lets you store XML documents and fragments in a SQL Server database. An XML fragment is an XML instance that is missing a single top-level element. You can create columns and variables of the xml type and store XML instances in them – see here.
TEXTIMAGE_ON { filegroup | "default" } Indicates that the text, ntext, image, xml, varchar(max), nvarchar(max), varbinary(max), and CLR user-defined type columns (including geometry and geography) are stored on the specified filegroup.
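As a small illustration of the xml type, a stored value can later be inspected with the value() method. The element names below assume the USZip response format and are an assumption, not taken from this document:
DECLARE @city xml = '<NewDataSet><Table><CITY>Temple</CITY><STATE>TX</STATE></Table></NewDataSet>';
SELECT @city.value('(/NewDataSet/Table/CITY)[1]', 'nvarchar(50)') AS City;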
3.Insert some data into the “input” tables “CityData”:
INSERT INTO [schema1v1].[CityData] VALUES ('76520');
INSERT INTO [schema1v1].[CityData] VALUES ('76541');
INSERT INTO [schema1v1].[CityData] VALUES ('76574');
4.Create a stored procedure to get all postal codes. This postal code result set will then be used as input for the SSIS-Foreach-Loop-Container see below:
CREATE PROCEDURE [schema1v1].[pr_GetPostalCodes] AS SET NOCOUNT ON SELECT [PostalCode] FROM [schema1v1].[CityData];
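As a quick sanity check (not part of the original walkthrough), the procedure can be executed directly and should return the three postal codes inserted above:
EXEC [schema1v1].[pr_GetPostalCodes];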
5.Start “Visual Studio 2017 (SSDT)” respective the Program located in: “"C:\Program Files (x86)\Microsoft Visual Studio\2017\SQL\Common7\IDE\devenv.exe"”:
6.Create a new “Business Intelligence → Integration Services” project: select “Integration Services Project” and name it "Call Web Service".
Name: becke-ch--ssis--s0-v1--webservice
Location: c:\ws\becke-ch--ssis--s0-v1\webservice
Solution name: becke-ch--ssis--s0-v1--webservice
a.In the main-window titled “Package.dtsx”, in the tab “Control Flow”, right-click and select “New Connection...”:
i.In the “Add SSIS-Connection-Manager” dialog select ADO.NET as connection type, if possible respectively supported. For a comparison of which driver to use see here: ADO versus OLE DB versus ODBC: https://datatellblog.wordpress.com/2015/01/13/ssis-data-flows-ado-net-vs-ole-db-vs-odbc/ (ADO versus ADO.NET see https://msdn.microsoft.com/de-ch/library/bb978899.aspx).
ii.In the “Configure ADO.NET Connection Manager” dialog, click “New...”:
iii.In the “Connection Manager” dialog enter the connection details:
IMPORTANT: The firewall needs to be opened so that the “Visual Studio 2017 (SSDT)” machine can connect to the Azure DB (becke-ch--app--s1-v1.database.windows.net)!
IMPORTANT: You need to enter the database name before clicking on “Test connection” because the user “dbuser1v1” has only access to this database, otherwise the connection test will fail!
Provider (“.NET-Provider”: “SqlClient Data Provider” or “OracleClient Data Provider” or “Odbc Data Provider”. “.NET-Provider for OleDb”: “Microsoft OLE DB Provider Oracle”, “Microsoft OLE DB Provider Analysis Service”, etc.): Default: “.NET-Provider\SqlClient Data Provider”:
Server name: becke-ch--app--s1-v1.database.windows.net
Authentication: SQL Authentication
User name: dbuser1v1
Password: E…
Database name: becke-ch--app--s1-v1--uc
b.In the main-window titled “Package.dtsx” switch to the tab “Parameters” and add package variables: city (Package (Scope), Object (Data type), System.Object (Value)), postalCode (Package (Scope), String (Data type)) and postalCodes (Package (Scope), Object (Data type), System.Object (Value)):
IMPORTANT: On the top right of the “Package.dtsx” window you need to click on the button “Variables” otherwise (by default) the window section containing the variables is not displayed!
c.In the main-window titled “Package.dtsx”, switch back to the tab “Control Flow” and from the SSIS-Toolbox explorer, favorites section, drag & drop the “Execute SQL Task” into the “Control Flow” window. Double click on “Execute SQL Task” and call the schema1v1.pr_GetPostalCodes stored procedure as follows:
General: Name: Default: Execute SQL Task
General: Description: Default: Execute SQL Task
Options: TimeOut: Default: 0
Options: CodePage: Default: 1252
Options: TypeConversionMode (None, Allowed (Default)): Allowed
ResultSet (None (Default), Single Row, Full Result Set, XML): Full result set
SQL Statement: ConnectionType (Excel, OLE DB (Default), ODBC, ADO, ADO.NET, SQLMOBILE): ADO.NET
SQL Statement: Connection (choose from drop down the connection configured see above): becke-ch--app--s1-v1.database.windows.net.becke-ch--app--s1-v1--uc.dbuser1v1
SQL Statement: SQLSourceType (Direct input (Default), File connection, Variable): Direct input
SQL Statement: SQLStatement: schema1v1.pr_getPostalCodes
SQL Statement: IsQueryStoreProcedure (False (Default), True): True
SQL Statement: BypassPrepare: True
d.In the “Execute SQL Task Editor” on the left click on “Resultset” and map the Execute SQL Task Result Set to the postalCodes variable and set the Result Name (string) to: "0":
e.From the SSIS-Toolbox explorer, container section, drag & drop the “Foreach Loop Container”, connect the “Execute SQL Task” to the container and then configure the Collection and Variable Mappings.
f.Configure the Collection settings for the Foreach Loop Container
Enumerator (Foreach-Element-Enumerator (Default), Foreach-File-Enumerator, Foreach-ADO-Enumerator, Foreach-ADO.NET-Schemarowset Enumerator, Foreach Enumerator for Variable, ...): Foreach-ADO-Enumerator
ADO object source variable: User::postalCodes
Enumeration mode (Rows in the first table, Rows in all the tables (ADO.NET dataset only), All tables (ADO.NET dataset only)): Rows in all the tables
g.In the “Foreach Loop Container Editor” configure the Variable Mappings for the Foreach Loop Container. Use postalCode (singular) for each postal code in the postalCodes object.
Variable: User::postalCode
Index: 0
h.Create a new HTTP Connection Manager (the http connection manager offers the possibility to do basic authentication AND Client/Server Certificate 2-way-SSL!):
i.Right click in the Connection Managers pane and click “New Connection…" and select HTTP.
ii.Click Add…
iii.Enter the Server URL pointing to the endpoint of the SOAP web service: https://becke-ch-ssis-s0-v1.appspot.com:443/uszipsoap – respectively the URL to obtain the WSDL for the web service: https://becke-ch-ssis-s0-v1.appspot.com/uszipsoap?WSDL – and click OK.
i.Download the WSDL file “https://becke-ch-ssis-s0-v1.appspot.com/uszipsoap?WSDL” to “c:\ws\becke-ch--ssis--s0-v1\webservice\uszipsoap.wsdl”
PROBLEM: SSIS (AND OTHER MS TOOLS/LIBRARIES) CANNOT HANDLE NESTED WSDL/XSD FILES CONTAINING IMPORT STATEMENTS, AND FOR FLATTENING SUCH WSDL FILES ONLY COMMERCIAL TOOLS (E.G. XMLSPY ETC.) ARE AVAILABLE!
Therefore, as a fallback, I downloaded the original (MS) WSDL file “https://github.com/cheezy/soap-object/blob/master/features/wsdl/uszip.asmx.wsdl.xml” and changed the HTTP address!
j.Add a Web Service Task to the Foreach Loop Container:
i.On the General tab:
Name: Default: Web Service Task
Description: Default: Web Service Task
Connection: HttpConnection: Select the HTTP Connection Manager created in the previous step.
WSDLFile: Click the ellipsis button and browse to the WSDL file stored in “c:\ws\becke-ch--ssis--s0-v1\webservice\uszipsoap.wsdl”.
OverwriteWSDLFile: True.
DO NOT Click the Download WSDL button. Otherwise your WSDL file gets corrupted – and I don’t know why!
ii.Configure the Input and Output properties for the Web Service Task
1.Input:
Service: USZip
Method (GetInfoByZIP, GetInfoByState, GetInfoByCity, GetInfoByAreaCode): GetInfoByZip
Clicking the checkbox in the Variable column will switch the Value column to a dropdown list.
Value: User::postalCode.
2.Output tab
OutputType (File Connection (Default), Variable): Variable.
Variable: User::city and click OK.
k.Add another Execute SQL Task and configure it to call the schema1v1.pr_AddCity stored procedure (a sketch of a possible pr_AddCity definition follows after these settings). Configure the sequence between the Web Service Task and the Execute SQL Task accordingly.
i.On the General Tab:
General: Name: Execute SQL Task
General: Description: Execute SQL Task
Options: TimeOut: 0
Options: CodePage: 1252
Options: TypeConversionMode: Allowed
ResultSet: None
SQL Statement: ConnectionType: ADO.NET
SQL Statement: Connection: becke-ch--app--s1-v1.database.windows.net.becke-ch--app--s1-v1--uc.dbuser1v1
SQL Statement: SQLSourceType: Direct input
SQL Statement: SQLStatement: schema1v1.pr_AddCity
SQL Statement: IsQueryStoreProcedure: True
SQL Statement: BypassPrepare: True
ii.On the Parameter Mappings Tab: Configure the Parameter Mapping properties of the Execute SQL Task:
Click the Add button.
Select User::city from the Variable Name dropdown list.
Select Xml as the Data Type.
Type "city" without the quotation marks as the Parameter Name and click OK.
ERROR: "Microsoft.SqlServer.Dts.Tasks.WebServiceTask.WebserviceTaskException: Could not execute the Web method. The error is: Fehler beim Reflektieren des Typs 'System.Xml.XmlText'.
SSIS-Paket "c:\ws\becke-ch--ssis--s0-v1\webservice\becke-ch--ssis--s0-v1--webservice\becke-ch--ssis--s0-v1--webservice\Package.dtsx" gestartet.
Fehler: 0xC002F304 bei Task "Webdienst", Task "Webdienst": Fehler mit folgender Fehlermeldung: "Microsoft.SqlServer.Dts.Tasks.WebServiceTask.WebserviceTaskException: Could not execute the Web method. The error is: Fehler beim Reflektieren des Typs 'System.Xml.XmlText'..
bei Microsoft.SqlServer.Dts.Tasks.WebServiceTask.WebMethodInvokerProxy.InvokeMethod(DTSWebMethodInfo methodInfo, String serviceName, Object connection)
bei Microsoft.SqlServer.Dts.Tasks.WebServiceTask.WebServiceTaskUtil.Invoke(DTSWebMethodInfo methodInfo, String serviceName, Object connection, VariableDispenser taskVariableDispenser)
bei Microsoft.SqlServer.Dts.Tasks.WebServiceTask.WebServiceTask.executeThread()".
Fehler beim Task: Task "Webdienst"
Warnung: 0x80019002 bei Foreach-Schleifencontainer: SSIS-Warnungscode "DTS_W_MAXIMUMERRORCOUNTREACHED". Die Execution-Methode wurde erfolgreich ausgeführt, aber die Anzahl von ausgelösten Fehlern (1) hat den maximal zulässigen Wert erreicht (1). Deshalb tritt ein Fehler auf. Dieses Problem tritt auf, wenn die Anzahl von Fehlern den in "MaximumErrorCount" angegebenen Wert erreicht. Ändern Sie den Wert für "MaximumErrorCount", oder beheben Sie die Fehler.
Warnung: 0x80019002 bei Package: SSIS-Warnungscode "DTS_W_MAXIMUMERRORCOUNTREACHED". Die Execution-Methode wurde erfolgreich ausgeführt, aber die Anzahl von ausgelösten Fehlern (1) hat den maximal zulässigen Wert erreicht (1). Deshalb tritt ein Fehler auf. Dieses Problem tritt auf, wenn die Anzahl von Fehlern den in "MaximumErrorCount" angegebenen Wert erreicht. Ändern Sie den Wert für "MaximumErrorCount", oder beheben Sie die Fehler.
SSIS-Paket "c:\ws\becke-ch--ssis--s0-v1\webservice\becke-ch--ssis--s0-v1--webservice\becke-ch--ssis--s0-v1--webservice\Package.dtsx" fertig gestellt: Fehler.
Das Programm "[4348] DtsDebugHost.exe: DTS" wurde mit Code 0 (0x0) beendet.
REASON: It seems that “Microsoft.SqlServer.Dts.Tasks.WebServiceTask” cannot handle “<s:complexType mixed="true">”
<s:element name="GetInfoByZIPResponse">
<s:complexType>
<s:sequence>
<s:element minOccurs="0" maxOccurs="1" name="GetInfoByZIPResult">
<s:complexType mixed="true">
<s:sequence>
<s:any />
</s:sequence>
</s:complexType>
</s:element>
</s:sequence>
</s:complexType>
</s:element>
LET’S TRY THE FOLLOWING:
<s:element name="GetInfoByZIPResponse">
<s:complexType>
<s:sequence>
<s:element minOccurs="0" maxOccurs="1" name="GetInfoByZIPResult" type="s:string">
<!--s:element minOccurs="0" maxOccurs="1" name="GetInfoByZIPResult">
<s:complexType mixed="true">
<s:sequence>
<s:any />
</s:sequence>
</s:complexType>
</s:element-->
</s:sequence>
</s:complexType>
</s:element>
THIS SEEMS TO WORK – QED – “<s:complexType mixed="true">” AND/OR “<s:sequence> <s:any />...” ARE NOT WORKING!
There is not much useful information on the web regarding: WebserviceTaskException System.Xml.XmlText complexType any
https://stackoverflow.com/questions/2567414/correct-xml-serialization-and-deserialization-of-mixed-types-in-net
FACT: SSIS IS UNUSABLE FOR WEB SERVICES BECAUSE IT CANNOT PROPERLY HANDLE WSDL SPECIFICATIONS!
https://docs.microsoft.com/en-gb/azure/data-factory/tutorial-deploy-ssis-packages-azure
1.Create an Azure SQL DB & Server (see chapter A.5) with the following settings:
•Database name: becke-ch--ssis--s0-v1--uc
•Subscription: MSDN Subscription
•Resource group (Create new/Use existing): Create new: becke-ch--ssis--s0-v1
•Select source: Blank database
•Want to use SQL elastic pool? (Yes/Not now): Not now
•Pricing tier: Configure required settings: Default: “Standard S3: 100 DTU, 250 GB (Monthly cost 135.46 CHF)” → change to: “Basic: 5 DTU, 2 GB (Monthly cost 4.51 CHF)”
•Collation: SQL_Latin1_General_CP1_CI_AS
•Click “Create”
•(Ignore: Automation options)
Server:
•Configure required settings: Server: Create a new server (No servers found)
•Server name: becke-ch--ssis--s0-v1 (.database.windows.net)
•Server admin login: db-admin--s0-v1
•Password: E••••••••
•Confirm password: E••••••••
•Location: West Europe
•IMPORTANT: Select: “Allow azure services to access server”!
Security: DB: We don’t set-up a firewall for now and we don’t do AAD authentication (instead using SQL authentication).
2.Select New on the left menu, select Data + Analytics, and then select Data Factory
a.Name (must be globally unique – see https://docs.microsoft.com/en-gb/azure/data-factory/naming-rules ): becke-ch-ssis-s0-v1-uc
b.Subscription: MSDN Subscription
c.Resource group (Create new/Use existing): Use existing: becke-ch--ssis--s0-v1
d.Version: V2
e.Location: West Europe
3.Select Author & Monitor to open the Data Factory user interface (UI) on a separate tab
a.On the Let's get started page, select the Configure SSIS Integration Runtime tile
b.On the General Settings page of Integration Runtime Setup, complete the following steps:
i.Name (Integration Runtime Name can contain only letters, numbers and the dash (-) character. The first and last characters must be a letter or number. Every dash (-) character must be immediately preceded and followed by a letter or a number. Consecutive dashes are not permitted in integration runtime name!): becke-ch-ssis-s0-v1-uc-xyz
ii.Description: enter the description of your integration runtime.
iii.Location (select the same location of your database server to host SSISDB!): West Europe
iv.Node Size (Select a large node size (scale up), if you want to run many compute/memory-intensive packages – select the smallest: D1 V2: 1: 3.50 GiB: 50 GiB: $0.592/hour: $0.202/hour (~66%)):
v.Node (Select a large cluster with many nodes (scale out), if you want to run many packages in parallel: 1..10): 1
vi.Edition/License: (select SQL Server edition/license for your integration runtime: Standard or Enterprise. Select Enterprise, if you want to use advanced/premium features on your integration runtime): Standard
vii.Save Money (Select Yes, if you want to bring your own SQL Server license with Software Assurance to benefit from cost savings with hybrid use): No
viii.Click Next.
c.On the SQL Settings page, complete the following steps:
i.Subscription (subscription that has your database server to host SSISDB): MSDN Subscription
ii.Location (location of your database server to host SSISDB. Select the same location of your integration runtime!): West Europe
iii.Catalog Database Server Endpoint: (created on your behalf as a standalone database, part of an Elastic Pool, or in a Managed Instance (Preview). See Compare SQL Database and Managed Instance (Preview). See Create Azure-SSIS IR in a virtual network): becke-ch--ssis--s0-v1.database.windows.net
iv.Use AAD authentication: (See Create Azure-SSIS IR with AAD authentication): Do not select
v.Admin Username (SQL authentication username for your database server): db-admin--s0-v1
vi.Admin Password: Eo...
vii.Catalog Database Service Tier (Basic/Standard/Premium tier or Elastic Pool name): Basic
viii.Click Test Connection and if successful, click Next.
d.On the Advanced Settings page, complete the following steps:
i.Maximum Parallel Executions Per Node (maximum number of packages to execute concurrently per node in your integration runtime cluster. Select a low number if you want to use more than one core to run a single large/heavy-weight package that is compute/memory-intensive. Select a high number if you want to run one or more small/light-weight packages on a single core): 1
ii.Custom Setup Container SAS URI (optionally enter Shared Access Signature (SAS) Uniform Resource Identifier (URI) of your Azure Storage Blob container where your setup script and its associated files are stored, see Custom setup for Azure-SSIS IR): Leave empty
iii.Select a Vnet (join your integration runtime to a virtual network): Do not select
iv.Click Finish to start the creation of your integration runtime (takes approximately 20 to 30 minutes to complete).
When you provision an instance of an Azure-SSIS IR, the Azure Feature Pack for SSIS and the Access Redistributable are also installed. These components provide connectivity to Excel and Access files and to various Azure data sources, in addition to the data sources supported by the built-in components. You can also install additional components. For more info, see Custom setup for the Azure-SSIS integration runtime.
e.On the Connections tab, switch to Integration Runtimes if needed. Select Refresh to refresh the status
4.Create an Azure-SSIS integration runtime
a.In the Azure Data Factory UI, switch to the Edit tab, select Connections, and then switch to the Integration Runtimes tab to view existing integration runtimes in your data factory.
As mentioned in [1], ADF V1 only supported E(xtract) & L(oad) capabilities and the transformation had to be done in the source and/or destination, mostly using stored procedures. This functionality is still available in V2 and should be preferred when doing simple copying of data, instead of using the rather heavy-weight SSIS functionality.
https://docs.microsoft.com/en-us/azure/data-factory/data-factory-azure-blob-connector
Based on: https://docs.microsoft.com/en-us/azure/data-factory/data-factory-copy-data-wizard-tutorial
To copy data from one location to another, Azure offers a preview feature to perform this task:
1.Open your data factory and click on “Copy data (PREVIEW)” which opens “https://datafactory.azure.com” in a new tab:
2.Properties: Enter name and description for the copy data task and specify how often you want to run the task.
a.Task name (required): CopyPipeline-hpw
b.Task description: Optional: ...
c.Task cadence or Task schedule: Run once now/Run regularly on schedule: Select: Run regularly on schedule
i.Recurring pattern: Monthly/Weekly/Daily/Hourly/Minute: Select: every 15 minutes
ATTENTION: THE MINIMUM IS A 15 MINUTE RECURRENCE!!!
d.Start date time (UTC): 07/24/2017 03:00 am
e.End date time (UTC): 12/30/2099 11:00 pm
3.Source data store: Specify the source data store for the copy task. You can use an existing data store connection or specify a new data store. Click HERE to suggest new copy sources or give comments.
a.Select “Azure SQL Database”
4.Connection: Specify the Azure SQL Database:
a.Connection name (required): Source-SQLAzure-hpw
b.Server / database selection method (required): From Azure subscription / Enter manually: From Azure subscription
c.Azure subscription (required): Select your subscription
d.Server name (required): becke-ch--app--s0-v1
e.Database name (required): becke-ch--app--s0-v1--uc
f.User name (required): db-user--s0-v1@beckech.onmicrosoft.com
g.Password (required): ...
ERROR: Error when processing request, Type=Microsoft.Data.Mashup.MashupValueException, Message=Microsoft SQL: Cannot open server "beckech.onmicrosoft.com" requested by the login. The login failed., Source=Microsoft.Data.Mashup,', ActivityId: ....
REASON/SOLUTION: AD Authentication not supported use instead SQL Authentication
5.Select the table: [schema0v1].[Benutzer]
6.Apply Filter: None / custom query: None
7.Destination: Repeat the same steps as we did for the source.
BUT the destination data stores are limited to:
•Azure Blob Storage
•Azure Data Lake Store
•Azure DocumentDB
•Azure SQL Database
•Azure SQL Data Warehouse
•Azure Table Storage
•Azure Search
•File System
•Oracle
•SQL Server
8.Table mapping: For each table you have selected to copy in the source data store, select a corresponding table in the destination data store or specify the stored procedure to run at the destination. Select: [schema0v1].[Benutzer]
9.Schema mapping: Choose how source and destination columns are mapped.
10.Settings: More options for data movement: Leave the settings at their defaults:
a.Performance settings:
i.Advanced settings:
1.Parallel Copy:
a.Cloud units: 1, 2, 4, 8, 16, 32: Auto (default)
b.Parallel copies: 1,2,3,4...,32: Auto (default)
Summary:
Start Deployment:
Open the Data Factory and click on “Author and Deploy”:
Write output data to separate schema: “schema0v1dest”:
1.Open the Output Dataset:
2.Change “tableName”: "[schema0v1].[Benutzer]" → "[schema0v1dest].[Benutzer]" and click “Deploy”
{
"name": "OutputDataset-xfz",
"properties": {
"structure": [
{
"name": "ID",
"type": "Int32"
},
{
"name": "LastName",
"type": "String"
},
{
"name": "FirstName",
"type": "String"
}
],
"published": false,
"type": "AzureSqlTable",
"linkedServiceName": "Destination-SQLAzure-xfz",
"typeProperties": {
"tableName": "[schema0v1dest].[Benutzer]"
},
"availability": {
"frequency": "Minute",
"interval": 15
},
"external": false,
"policy": {}
}
}
Automation Script: Exporting a data-factory template via automation script from the resource group (see [1]) is not possible and gives the following error:
Export template operation completed with errors. Some resources were not exported. Please see details for more information. (Code: ExportTemplateCompletedWithErrors)
•The schema of resource type 'Microsoft.DataFactory/dataFactories' is not available. Resources of this type will not be exported to the template. (Code: ResourceTypeSchemaNotFound, Target: Microsoft.DataFactory/dataFactories)
REASON: https://social.msdn.microsoft.com/Forums/azure/en-US/06b0a24a-c088-4ab1-ace0-5957bb4c2440/data-factory-resource-provider-wont-create-template-from-resource-group?forum=AzureDataFactory : ARM template export is not supported for Azure Data Factory. However, you can still create/edit an ARM template for deployment of Data Factories. Please see https://docs.microsoft.com/en-us/azure/data-factory/data-factory-build-your-first-pipeline-using-arm .
SOLUTION/WORKAROUND: There currently exists no solution to this problem i.e. these resources need to be handled outside the automation script individually.
•Or: https://docs.microsoft.com/en-us/azure/data-factory/data-factory-build-your-first-pipeline-using-arm
Download the script from the link above, save it under the name “becke-ch--app--s0-v1--df--uc.json” and modify its content based on the one that was generated using the wizard in A.8.1.
Open questions: Comparing the downloaded script with the script generated by the wizard:
Section | Only in Wizard | Only in Download |
Linked Service | ... "name": "Destination-SQLAzure-xfz", "properties": { "hubName": "becke-ch-app-s0-v1-uc_hub", "type": "AzureSqlDatabase", ... |
|
Dataset | ... ], "published": false, "type": "AzureSqlTable", "linkedServiceName": "Source-SQLAzure-xfz", "typeProperties": { "tableName": "[schema0v1].[Benutzer]" }, "availability": { "frequency": "Minute", "interval": 15 }, … "policy": {} } ... | ... "type": "datasets", "name": "[variables('azureSqlDatasetNameFrom')]", "dependsOn": [ "[variables('dataFactoryName')]", "[variables('azureSqlLinkedServiceNameFrom')]" ], "apiVersion": "2015-10-01", "properties": { … |
Pipeline | ... "sink": { "type": "SqlSink", "writeBatchSize": 0, "writeBatchTimeout": "00:00:00" }, ... "policy": { "timeout": "1.00:00:00", "concurrency": 1, "executionPriorityOrder": "NewestFirst", "style": "StartOfInterval", "retry": 3, "longRetry": 0, "longRetryInterval": "00:00:00" }, … "start": "2017-07-05T06:33:01.03Z", "end": "2099-12-30T23:00:00Z", "isPaused": false, "hubName": "becke-ch-app-s0-v1-uc_hub", "pipelineMode": "Scheduled" } ... | ... ], "apiVersion": "2015-10-01", "properties": { … "sink": { "type": "SqlSink", "sqlWriterCleanupScript": "$$Text.Format('DELETE FROM {0}', 'emp')" }, … |
IMPORTANT CHANGES: Edit the deployment script “becke-ch--app--s0-v1--df--uc.json” and perform the following important changes:
•Dataset: Policy:
◦"executionPriorityOrder": "OldestFirst"
◦"style": "EndOfInterval"
•Set: "location": "North Europe"
◦To avoid the following error message:
The provided location 'West Europe' is not available for resource type 'Microsoft.DataFactory/dataFactories'
•Double hyphens “--” are not allowed: Set: "dataFactoryName": "beckech-app-s0-v1-df-uc"
◦To avoid the following error message:
Name: 'becke-ch--app--s0-v1--df--uc' is not a valid name. For Data Factory naming restrictions, please see http://msdn.microsoft.com/library/dn835027.aspx",
"code": "InvalidName"
•Input Data-Set set: "external": true
•Output Data-Set: set: "external": false
◦To avoid the following runtime issue: “Waiting: Dataset dependencies” And “The upstream dependencies are not ready”
Deployment:
1.Follow the steps in the corresponding chapter in [1].
2.Deploy Data Factory entities using the Resource Manager template you created above: New-AzureRmResourceGroupDeployment -Name becke-ch--app--s0-v1--adf--uc-deployment -ResourceGroupName becke-ch--app--s0-v1 -TemplateFile C:\ws\data\becke-ch--ms-cloud--s0-v1\document\becke-ch--app--s0-v1--df--uc.json -TemplateParameterFile C:\ws\data\becke-ch--ms-cloud--s0-v1\document\becke-ch--app--s0-v1--df--uc-param.json
PS C:\Users\raoul-becke--s0-v1> New-AzureRmResourceGroupDeployment -Name becke-ch--app--s0-v1--adf--uc-deployment -ResourceGroupName becke-ch--app--s0-v1 -TemplateFile C:\ws\data\becke-ch--ms-cloud--s0-v1\document\becke-ch--app--s0-v1--df--uc.json -TemplateParameterFile C:\ws\data\becke-ch--ms-cloud--s0-v1\document\becke-ch--app--s0-v1--df--uc-param.json
DeploymentName : becke-ch--app--s0-v1--adf--uc-deployment
ResourceGroupName : becke-ch--app--s0-v1
ProvisioningState : Succeeded
Timestamp : 25.07.2017 20:14:26
Mode : Incremental
TemplateLink :
Parameters :
Name Type Value
=============== ========================= ==========
sqlServerNameFrom String becke-ch--app--s0-v1
databaseNameFrom String becke-ch--app--s0-v1--uc
sqlServerUserNameFrom String dbuser0v1
sqlServerPasswordFrom SecureString
sqlTableFrom String [schema0v1].[Benutzer]
sqlServerNameTo String becke-ch--app--s0-v1
databaseNameTo String becke-ch--app--s0-v1--uc
sqlServerUserNameTo String dbuser0v1
sqlServerPasswordTo SecureString
sqlTableTo String [schema0v1dest].[Benutzer]
Outputs :
DeploymentDebugLogLevel :
https://docs.microsoft.com/en-us/azure/data-factory/data-factory-scheduling-and-execution
https://docs.microsoft.com/en-us/azure/data-factory/data-factory-repeatable-copy
As described in the articles above we can achieve an incremental load by adapting the “sqlReaderQuery” to the “WindowStart” and “WindowEnd” times of the slice:
"sqlReaderQuery": "$$Text.Format('SELECT * FROM [schema0v1].[Benutzer] WHERE modification >= \\'{0:yyyy-MM-dd HH:mm}\\' AND modification < \\'{1:yyyy-MM-dd HH:mm}\\'', WindowStart, WindowEnd)"
ISSUE: Sometimes data is synchronized with a delay, i.e. the record is propagated and persisted only after its modification time-stamp.
Data Set: Offset: The offset that can be set on the data set does not help, because it shifts the whole slice start and end; what we want is that for a certain slice we retrieve the records while taking the propagation/persistence delay into account.
SOLUTION: TODATETIMEOFFSET: https://docs.microsoft.com/en-us/sql/t-sql/functions/todatetimeoffset-transact-sql: In the following sample we assume that the maximum delay of persistence compared to the modification time-stamp is 5 minutes:
"sqlReaderQuery": "$$Text.Format('SELECT * FROM [schema0v1].[Benutzer] WHERE modification >= TODATETIMEOFFSET (\\'{0:yyyy-MM-dd HH:mm}\\', +5) AND modification < TODATETIMEOFFSET (\\'{1:yyyy-MM-dd HH:mm}\\', +5)', WindowStart, WindowEnd)"
ALTERNATIVE SOLUTION: https://docs.microsoft.com/en-us/azure/data-factory/data-factory-create-datasets
dataDelay The time to delay the check on the availability of the external data for the given slice. For example, you can delay an hourly check by using this setting.
The setting only applies to the present time. For example, if it is 1:00 PM right now and this value is 10 minutes, the validation starts at 1:10 PM.
Note that this setting does not affect slices in the past. Slices with Slice End Time + dataDelay < Now are processed without any delay.
Times greater than 23:59 hours should be specified by using the day.hours:minutes:seconds format. For example, to specify 24 hours, don't use 24:00:00. Instead, use 1.00:00:00. If you use 24:00:00, it is treated as 24 days (24.00:00:00). For 1 day and 4 hours, specify 1.04:00:00.
PRECONDITION: The incremental load (see previous chapter A.8.4) has been established.
ISSUE: Your database changes are only processed every 15 minutes at the earliest, i.e. at HH:00:00, HH:15:00, HH:30:00 and HH:45:00 – but you need the changes at the destination every 5 minutes (at: HH:00:00, HH:05:00, HH:10:00, HH:15:00, HH:20:00, ...)
SOLUTION: Edit the deployment script “becke-ch--app--s0-v1--df--uc.json” and perform the following changes:
1.Create 2 more copies of the existing Pipeline section, Input-Dataset section and Output-Dataset section. You need to give these copies different names (“name” attribute), and in the Pipeline section you need to reference these names accordingly in the “dependsOn”, “inputs” and “outputs” attributes.
…
{
"type": "datasets",
"name": "[concat(variables('azureSqlDatasetNameFrom'),'05')]",
"dependsOn": [
…
{
"type": "datasets",
"name": "[concat(variables('azureSqlDatasetNameFrom'),'10')]",
"dependsOn": [
…
{
"type": "datasets",
"name": "[concat(variables('azureSqlDatasetNameTo'),'05')]",
"dependsOn": [
…
{
"type": "datasets",
"name": "[concat(variables('azureSqlDatasetNameTo'),'10')]",
"dependsOn": [
…
{
"type": "datapipelines",
"name": "[concat(variables('pipelineName'),'05')]",
"dependsOn": [
…
"[concat(variables('azureSqlDatasetNameFrom'),'05')]",
"[concat(variables('azureSqlDatasetNameTo'),'05')]"
],
"apiVersion": "2015-10-01",
"properties": {
"activities": [
{
…
"inputs": [
{
"name": "[concat(variables('azureSqlDatasetNameFrom'),'05')]"
}
],
"outputs": [
{
"name": "[concat(variables('azureSqlDatasetNameTo'),'05')]"
}
],
…
{
"type": "datapipelines",
"name": "[concat(variables('pipelineName'),'10')]",
"dependsOn": [
…
"[concat(variables('azureSqlDatasetNameFrom'),'10')]",
"[concat(variables('azureSqlDatasetNameTo'),'10')]"
],
"apiVersion": "2015-10-01",
"properties": {
"activities": [
{
…
"inputs": [
{
"name": "[concat(variables('azureSqlDatasetNameFrom'),'10')]"
}
],
"outputs": [
{
"name": "[concat(variables('azureSqlDatasetNameTo'),'10')]"
}
],
…
2.Because datasets are only processed every 15 minutes (if there are no failures), exactly at HH:00:00, HH:15:00, HH:30:00 and HH:45:00, you need to give the 2 copies of the dataset an offset: "offset": "00:05:00" and "offset": "00:10:00" respectively:
…
{
"type": "datasets",
"name": "[concat(variables('azureSqlDatasetNameFrom'),'05')]",
…
"properties": {
"availability": {
"frequency": "Minute",
"interval": 15,
"offset": "00:05:00"
},
…
{
"type": "datasets",
"name": "[concat(variables('azureSqlDatasetNameFrom'),'10')]",
…
"properties": {
…
"availability": {
"frequency": "Minute",
"interval": 15,
"offset": "00:10:00"
},
…
{
"type": "datasets",
"name": "[concat(variables('azureSqlDatasetNameTo'),'05')]",
…
"properties": {
…
"availability": {
"frequency": "Minute",
"interval": 15,
"offset": "00:05:00"
},
…
{
"type": "datasets",
"name": "[concat(variables('azureSqlDatasetNameTo'),'10')]",
…
"properties": {
…
"availability": {
"frequency": "Minute",
"interval": 15,
"offset": "00:10:00"
},
…
3.Having the incremental load established – see chapter A.8.4 – you need to make sure that the overlapping slices do not load the same data multiple times. This can be achieved by adapting the TODATETIMEOFFSET calls so that the window size is reduced to 5 minutes (the -5 offset moves the start boundary 5 minutes later and the +5 offset moves the end boundary 5 minutes earlier, in UTC terms):
…
{
"type": "datapipelines",
"name": "[variables('pipelineName')]",
…
"properties": {
"activities": [
{
…
"typeProperties": {
"source": {
"type": "SqlSource",
"sqlReaderQuery": "$$Text.Format('SELECT * FROM [schema0v1].[Benutzer] WHERE modification >= TODATETIMEOFFSET (\\'{0:yyyy-MM-dd HH:mm}\\', -5) AND modification < TODATETIMEOFFSET (\\'{1:yyyy-MM-dd HH:mm}\\', +5)', WindowStart, WindowEnd)"
…
{
"type": "datapipelines",
"name": "[concat(variables('pipelineName'),'05')]",
…
"properties": {
"activities": [
{
…
"typeProperties": {
"source": {
"type": "SqlSource",
"sqlReaderQuery": "$$Text.Format('SELECT * FROM [schema0v1].[Benutzer] WHERE modification >= TODATETIMEOFFSET (\\'{0:yyyy-MM-dd HH:mm}\\', -5) AND modification < TODATETIMEOFFSET (\\'{1:yyyy-MM-dd HH:mm}\\', +5)', WindowStart, WindowEnd)"
…
{
"type": "datapipelines",
"name": "[concat(variables('pipelineName'),'10')]",
…
"properties": {
"activities": [
{
…
"typeProperties": {
"source": {
"type": "SqlSource",
"sqlReaderQuery": "$$Text.Format('SELECT * FROM [schema0v1].[Benutzer] WHERE modification >= TODATETIMEOFFSET (\\'{0:yyyy-MM-dd HH:mm}\\', -5) AND modification < TODATETIMEOFFSET (\\'{1:yyyy-MM-dd HH:mm}\\', +5)', WindowStart, WindowEnd)"
…
ERROR: The provided location 'West Europe' is not available for resource type 'Microsoft.DataFactory/dataFactories':
PS C:\Users\raoul-becke--s0-v1> New-AzureRmResourceGroupDeployment -Name becke-ch--app--s0-v1--adf--uc-deployment -ResourceGroupName becke-ch--app--s0-v1 -TemplateFile C:\ws\data\becke-ch--ms-cloud--s0-v1\document\becke-ch--app--s0-v1--df--uc.json -TemplateParameterFile C:\ws\data\becke-ch--ms-cloud--s0-v1\document\becke-ch--app--s0-v1--df--uc-param.json
New-AzureRmResourceGroupDeployment : 21:56:20 - Error: Code=LocationNotAvailableForResourceType; Message=The provided location 'West Europe' is not available for resource type 'Microsoft.DataFactory/dataFactories'. List of available regions for the resource type is 'westus,northeurope,eastus,westcentralus'.
At line:1 char:1
+ New-AzureRmResourceGroupDeployment -Name becke-ch--app--s0-v1--adf--u ...
+ ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
+ CategoryInfo : NotSpecified: (:) [New-AzureRmResourceGroupDeployment], Exception
+ FullyQualifiedErrorId : Microsoft.Azure.Commands.ResourceManager.Cmdlets.Implementation.NewAzureResourceGroupDep
loymentCmdlet
New-AzureRmResourceGroupDeployment : The deployment validation failed
At line:1 char:1
+ New-AzureRmResourceGroupDeployment -Name becke-ch--app--s0-v1--adf--u ...
+ ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
+ CategoryInfo : CloseError: (:) [New-AzureRmResourceGroupDeployment], InvalidOperationException
+ FullyQualifiedErrorId : Microsoft.Azure.Commands.ResourceManager.Cmdlets.Implementation.NewAzureResourceGroupDep
loymentCmdlet
SOLUTION: Fix location error: Edit “becke-ch--app--s0-v1--df--uc.json” and set: "location": "North Europe"
ERROR: Name: 'becke-ch--app--s0-v1--df--uc' is not a valid name. For Data Factory naming restrictions, please see http://msdn.microsoft.com/library/dn835027.aspx", "code": "InvalidName"
PS C:\Users\raoul-becke--s0-v1> New-AzureRmResourceGroupDeployment -Name becke-ch--app--s0-v1--adf--uc-deployment -ResourceGroupName becke-ch--app--s0-v1 -TemplateFile C:\ws\data\becke-ch--ms-cloud--s0-v1\document\becke-ch--app--s0-v1--df--uc.json -TemplateParameterFile C:\ws\data\becke-ch--ms-cloud--s0-v1\document\becke-ch--app--s0-v1--df--uc-param.json
New-AzureRmResourceGroupDeployment : 22:01:45 - Resource Microsoft.DataFactory/datafactories
'becke-ch--app--s0-v1--df--ucdzsa54yhmvwma' failed with message '{
"message": "Name: 'becke-ch--app--s0-v1--df--uc' is not a valid name. For Data Factory naming restrictions, please see http://msdn.microsoft.com/library/dn835027.aspx",
"code": "InvalidName"
}'
At line:1 char:1
+ New-AzureRmResourceGroupDeployment -Name becke-ch--app--s0-v1--adf--u ...
+ ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
+ CategoryInfo : NotSpecified: (:) [New-AzureRmResourceGroupDeployment], Exception
+ FullyQualifiedErrorId : Microsoft.Azure.Commands.ResourceManager.Cmdlets.Implementation.NewAzureResourceGroupDep
loymentCmdlet
New-AzureRmResourceGroupDeployment : 22:01:55 - Template output evaluation skipped: at least one resource deployment
operation failed. Please list deployment operations for details. Please see https://aka.ms/arm-debug for usage details.
At line:1 char:1
+ New-AzureRmResourceGroupDeployment -Name becke-ch--app--s0-v1--adf--u ...
+ ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
+ CategoryInfo : NotSpecified: (:) [New-AzureRmResourceGroupDeployment], Exception
+ FullyQualifiedErrorId : Microsoft.Azure.Commands.ResourceManager.Cmdlets.Implementation.NewAzureResourceGroupDep
loymentCmdlet
New-AzureRmResourceGroupDeployment : 22:01:55 - Template output evaluation skipped: at least one resource deployment
operation failed. Please list deployment operations for details. Please see https://aka.ms/arm-debug for usage details.
At line:1 char:1
+ New-AzureRmResourceGroupDeployment -Name becke-ch--app--s0-v1--adf--u ...
+ ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
+ CategoryInfo : NotSpecified: (:) [New-AzureRmResourceGroupDeployment], Exception
+ FullyQualifiedErrorId : Microsoft.Azure.Commands.ResourceManager.Cmdlets.Implementation.NewAzureResourceGroupDep
loymentCmdlet
DeploymentName : becke-ch--app--s0-v1--adf--uc-deployment
ResourceGroupName : becke-ch--app--s0-v1
ProvisioningState : Failed
Timestamp : 25.07.2017 20:01:52
Mode : Incremental
TemplateLink :
Parameters :
Name Type Value
=============== ========================= ==========
sqlServerNameFrom String becke-ch--app--s0-v1
databaseNameFrom String becke-ch--app--s0-v1--uc
sqlServerUserNameFrom String dbuser0v1
sqlServerPasswordFrom SecureString
sqlTableFrom String [schema0v1].[Benutzer]
sqlServerNameTo String becke-ch--app--s0-v1
databaseNameTo String becke-ch--app--s0-v1--uc
sqlServerUserNameTo String dbuser0v1
sqlServerPasswordTo SecureString
sqlTableTo String [schema0v1dest].[Benutzer]
Outputs :
DeploymentDebugLogLevel :
SOLUTION: Fix naming error – see https://docs.microsoft.com/en-us/azure/data-factory/data-factory-naming-rules – double hyphens “--” are not allowed: Edit “becke-ch--app--s0-v1--df--uc.json” and set: "dataFactoryName": "beckech-app-s0-v1-df-uc",
ISSUE: “Waiting: Dataset dependencies” And “The upstream dependencies are not ready”
SOLUTION: See https://stackoverflow.com/questions/38123673/azure-waiting-dataset-dependencies-in-manual-created-pipelines
Setting the “external” property in the input dataset JSON definition to true lets Azure Data Factory know that the data is generated externally and is not the output of another pipeline in the factory.
A new deployment is needed because otherwise we get:
Error
Dataset 'AzureSqlDatasetTo' is output of activity 'CopyFromAzureSQLToAzureSQL' in pipeline 'SQL2SQLPipeline'. It cannot change to external.
a.edit “becke-ch--app--s0-v1--df--uc.json”
i.Input Data-Set set: "external": true
…
"availability": {
"frequency": "Minute",
"interval": 15
},
"external": true
}
…
ii.Output Data-Set: set: "external": false
…
"availability": {
"frequency": "Minute",
"interval": 15
},
"external": false
}
…
ERROR: Cannot set active period Start=..., End=... for pipeline 'SQL2SQLPipeline00' due to conflicts on Output with Pipeline: SQL2SQLPipeline05, Activity CopyFromAzureSQLToAzureSQL05, Period: Start=..., End=...
Cannot set active period Start=08/07/2017 20:45:00, End=08/07/2017 23:30:00 for pipeline 'SQL2SQLPipeline00' due to conflicts on Output: AzureSqlDatasetTo with Pipeline: SQL2SQLPipeline05, Activity CopyFromAzureSQLToAzureSQL05, Period: Start=08/07/2017 20:45:00, End=08/07/2017 23:30:00
SOLUTION: https://social.msdn.microsoft.com/Forums/expression/en-US/17802772-cf18-4e8c-a0e6-3af805c7856a/pipelineactivity-conflict?forum=AzureDataFactory
Use different datasets! You need a separate Input- and Output-Dataset for each pipeline! If you only use different Output-Datasets, this particular error is fixed, but you can then not have different start- and end-times!
ERROR: Database operation failed on server 'Source:tcp:becke-ch--app--s0-v1.database.windows.net,1433' with SQL Error Number '208'. Error message from database execution : Invalid object name 'parameters'..
SOLUTION: The sqlReaderQuery was missing the concat operation and was therefore not constructed correctly; the query contained the literal text “FROM parameters('sqlTableFrom')”, so SQL Server looked for a table named 'parameters':
…
"typeProperties": {
"source": {
"type": "SqlSource",
"sqlReaderQuery": "[concat('SELECT * FROM ',parameters('sqlTableFrom'))]"
},
...
1.AD Security Group: According to the naming convention in chapter Error: Reference source not found create an AD Security Group (see chapter Error: Reference source not found): “becke-ch--app--s0-v1--adgrp--kv”
•Name: Follow the naming convention in chapter Error: Reference source not found BUT double hyphens “--” are not allowed, therefore use single hyphens instead: becke-ch-app-s0-v1-kv (see the PowerShell sketch after this list)
•Subscription: Choose the subscription: “Pay-As-You-Go”
•Resource Group: Create new/Use existing: Use existing: becke-ch--app--s0-v1
•Location: West Europe
•Pricing tier: Default: Standard
•Access policies: Default: 1 principal selected
•Advanced access policy: Default is non selected:
◦Configure advanced access policy
▪Enable access to Azure Virtual Machines for deployment: Do not select
▪Enable access to Azure Resource Manager for template deployment: Select
▪Enable access to Azure Disk Encryption for volume encryption: Select
•Pin to dashboard: Default: Not checked
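For reference, a minimal PowerShell sketch of the same Key Vault creation, assuming the AzureRM.KeyVault and AzureRM.Resources modules; the access-policy assignment for the AD security group from step 1 and the chosen secret permissions are assumptions for illustration:
# Create the vault with the portal settings above (template deployment and disk encryption enabled)
New-AzureRmKeyVault -VaultName becke-ch-app-s0-v1-kv -ResourceGroupName becke-ch--app--s0-v1 -Location "West Europe" -Sku Standard -EnabledForTemplateDeployment -EnabledForDiskEncryption
# Grant the AD security group access to secrets (example permissions)
$grp = Get-AzureRmADGroup -SearchString "becke-ch--app--s0-v1--adgrp--kv"
Set-AzureRmKeyVaultAccessPolicy -VaultName becke-ch-app-s0-v1-kv -ObjectId $grp.Id -PermissionsToSecrets get,list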
1.Download: https://powerbi.microsoft.com/en-us/desktop/
2.Execute: PBIDesktop_x64.msi
3.Welcome Screen: Click “Next”:
4.Microsoft Software License Terms: Select “I accept the terms in the License Agreement” and Click “Next”:
5.Destination Folder: Leave Default: “C:\Program Files\Microsoft Power BI Desktop\” and click “Next”:
6.Ready to install: Click “Install” - you need to log in as administrator to perform the installation!
1.Splash Screen: “Sign in to get the most out of Power BI … sign in to create and share dashboards, and get automatic insights from your data ...” – I skip this step because I don’t have a “Power BI Pro” license with this account to create group workspaces, create content packs and publish them
2.Click “Get Data”, select “more ...” and select “Azure SQL Database” (actually it pops up the same dialog as if you were selecting “SQL Server”):
3.SQL Server Database: Enter the connection details:
a.Server: becke-ch--app--s0-v1.database.windows.net
b.Database (optional): becke-ch--app--s0-v1--uc
c.Data connectivity mode: Select “DirectQuery” (to always have the latest data shown in your report – alternatively you can select “Import”, which is the default):
i.Direct Query: https://powerbi.microsoft.com/en-us/documentation/powerbi-desktop-use-directquery/
1.Supported Datasources: SQL Server, Azure SQL Database, Azure SQL Data Warehouse, SAP HANA, Oracle Database, Teradata Database, Amazon Redshift (Preview), Impala (Preview), Snowflake (Preview)
2.Benefits:
a.build visualizations over very large datasets, where it otherwise would be unfeasible to first import all of the data
b.DirectQuery reports always use current data
c.1 GB dataset limitation does not apply to DirectQuery
3.Limitations
a.All tables must come from a single database
b.If a Query Editor query is overly complex, an error will occur. To remedy the error you must either delete the problematic step in Query Editor, or import the data instead of using DirectQuery
c.Relationship filtering is limited to a single direction, rather than both directions
d.Time intelligence capabilities are not available in DirectQuery.
e.By default, limitations are placed on DAX expressions allowed in measures
f.1 million row limit for returning data when using DirectQuery
4.Important
a.Performance and load - All DirectQuery requests are sent to the source database, so the time required to refresh a visual is dependent on how long that back-end source takes to respond with the results from the query (or queries).
b.Using Row Level Security (RLS) can have a significant impact as well; a non-RLS dashboard tile shared by multiple users results in a single query to the database, but using RLS on a dashboard tile usually means the refresh of a tile requires one query per user, thus significantly increasing load on the source database and potentially impacting performance.
c.Security - All users who consume a published report connect to the back-end data source using the credentials entered after publication to the Power BI service. BUT the good thing is that RLS is being considered!
d.Advanced Options:
i.Command time out in minutes (optional): [leave empty] (default)
1.https://powerbi.microsoft.com/en-us/blog/power-bi-desktop-march-update-feature-summary/
You can now specify a command timeout value (in minutes) when connecting to database sources. This was previously possible only via custom formula authoring. (This should help avoid connection timeouts on large data-sets, which would otherwise truncate the data.)
ii.SQL statement (optional, requires database): [leave empty] (default)
iii.Include relationship columns: Selected (default)
1.https://community.powerbi.com/t5/Desktop/include-relationship-columns-option-in-query-editor/td-p/56644
Functionality: If tableA has a relationship to tableB, then, when connecting to the database with 'Include relationship columns' ticked, clicking 'Select related tables' in the Navigator window automatically picks tableB as well; if the option is not ticked, 'Select related tables' will not pick tableB.
iv.Navigate using full hierarchy: Not selected (default)
v.Enable SQL Server Failover support: Not selected (default)
Authentication: SQL Azure connector with Azure AD Auth: NOT SUPPORTED!
Therefore trying Database / SQL User authentication:
•User name: dbuser0v1
•Password: E...
ERROR: Details: "Microsoft SQL: Cannot open server 'becke-ch--app--s0-v1' requested by the login. Client with IP address '77.58.5.221' is not allowed to access the server. To enable access, use the Windows Azure Management Portal or run sp_set_firewall_rule on the master database to create a firewall rule for this IP address or address range. It may take up to five minutes for this change to take effect.".
https://portal.azure.com – open the SQL Server “becke-ch--app--s0-v1”:
•Select “Firewall” on the left side and click on “Add client IP” to add the following IP Address “Client IP address 77.58.5.221”
•Click “Save”
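The same firewall rule can also be created from PowerShell; a sketch assuming the AzureRM.Sql module (the rule name “client-77-58-5-221” is just an illustrative label):
# Allow the client IP reported in the error message above
New-AzureRmSqlServerFirewallRule -ResourceGroupName becke-ch--app--s0-v1 -ServerName becke-ch--app--s0-v1 -FirewallRuleName client-77-58-5-221 -StartIpAddress 77.58.5.221 -EndIpAddress 77.58.5.221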
Power BI Desktop: Navigator:
•Select Related Tables: “schema0v1.Benutzer” and click “
https://my.visualstudio.com/subscriptions
Click on "MSDN Platforms": https://my.visualstudio.com/subscriptions#
Go to "Power BI Pro" and click "Get Code": https://my.visualstudio.com/
"Success – The coupon code has been successfully retrieved. Click "Activate" to proceed."
You get the message (displayed in German in the portal): “You already have a license for your work account which is administered by your organization. If you want to register for Power BI Pro Developer (MSDN), sign out from your work account and create a new account that you administer yourself.”
Click on "Create new account" (“Neues Konto erstellen”)
You are then signed out and redirected to: https://signup.microsoft.com
But instead of creating a new account click on “Add to an existing subscription”:
Login: https://login.microsoftonline.com
Email: ms-cloud--s0-v1@beckech.onmicrosoft.com
Password: …
Enter address details and click on “Continue”:
•Address 1: S...
•Address 2: [empty]
•Zip Code: 8...
•City: E...
•Short-name for Canton: ZH
•Telephone: 079...
Check order details and click on “Continue”:
•Power BI Pro Developer (MSDN): Fr. 0.00 per month / term: 1 year. 1 user – Fr. 0.00
•Total: Fr. 0.00
Check Payment method: monthly payment (default)
And click on "Submit order"
Now everything is ready:
Assign new users to the subscription: https://portal.office.com/admin/default.aspx#ActiveUsersPage
Go to administrator start page: https://www.office.com
1.Login to: https://portal.office.com
2.On the left side open “Billing” and click on “Licenses”
3.On the line “Power BI Pro” click on “Assign now”:
4.Under “Users → Active Users” select the user you want to assign the license to e.g. “ms-cloud--s0-v1”
5.On the line “Product licenses ...” click on the “Edit” link
6.Select the location and activate the “Power BI Pro” license and click “Save”:
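Alternatively, the license can be assigned from PowerShell; a sketch assuming the MSOnline module – the AccountSkuId “beckech:POWER_BI_PRO” is an assumption, check Get-MsolAccountSku for the actual value in your tenant:
Connect-MsolService
# The usage location must be set before a license can be assigned
Set-MsolUser -UserPrincipalName ms-cloud--s0-v1@beckech.onmicrosoft.com -UsageLocation CH
Set-MsolUserLicense -UserPrincipalName ms-cloud--s0-v1@beckech.onmicrosoft.com -AddLicenses "beckech:POWER_BI_PRO"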
Login to: https://www.office.com
Click on the “Power BI” icon: https://app.powerbi.com/ :
Click on “Sign in”:
Click on “Workspaces” and click on “Create app workspace”:
•Name your workspace: becke-ch--app--s0-v1--uc
•Workspace ID: becke-ch--app--s0-v1--uc
•Available
•Add workspace members: [leave empty]
This functionality is currently very limited; only individual members can be added:
https://powerbi.microsoft.com/en-us/documentation/powerbi-service-create-apps/
Today you can add only individuals to the members and admin lists. Soon you'll be able to add multiple AD security groups or modern groups to these lists to allow for easier management.
•Advanced
◦Premium: Off
https://docs.microsoft.com/en-us/azure/storage/storage-introduction
•Name (.core.windows.net): beckechapp0v1
•Deployment model: Resource manager (default)/Classic: Resource manager
•Account kind: General purpose (default)/Blob storage: General purpose
◦A general-purpose storage account gives you access to Azure Storage services such as Tables, Queues, Files, Blobs and Azure virtual machine disks under a single account. This type of storage account has two performance tiers.
▪A standard storage performance tier which allows you to store Tables, Queues, Files, Blobs and Azure virtual machine disks.
▪A premium storage performance tier which currently only supports Azure virtual machine disks.
◦A Blob storage account is a specialized storage account for storing your unstructured data as blobs (objects) in Azure Storage. Blob storage accounts are similar to your existing general-purpose storage accounts and share all the great durability, availability, scalability, and performance features that you use today including 100% API consistency for block blobs and append blobs.
•Performance: Standard (default)/Premium: Standard
◦Standard: Backed by magnetic drives and provides the lowest cost per GB. Best for applications that require bulk storage and access data infrequently.
◦Premium: Backed by solid state drives and offer consistent, low-latency performance.
•Replication: Locally redundant storage (LRS)/Zone-redundant storage (ZRS)/Geo-redundant storage (GRS)/Read-access geo-redundant storage (RA-GRS)(default): Read-access geo-redundant storage (RA-GRS)
◦Locally redundant storage (LRS). Locally redundant storage maintains three copies of your data. LRS is replicated three times within a single data center in a single region. LRS protects your data from normal hardware failures, but not from the failure of a single data center.
◦Zone-redundant storage (ZRS). Zone-redundant storage maintains three copies of your data. ZRS is replicated three times across two to three facilities, either within a single region or across two regions, providing higher durability than LRS. ZRS ensures that your data is durable within a single region.
◦Geo-redundant storage (GRS). GRS maintains six copies of your data. With GRS, your data is replicated three times within the primary region, and is also replicated three times in a secondary region hundreds of miles away from the primary region, providing the highest level of durability. In the event of a failure at the primary region, Azure Storage will failover to the secondary region. GRS ensures that your data is durable in two separate regions.
◦Read-access geo-redundant storage (RA-GRS). Read-access geo-redundant storage replicates your data to a secondary geographic location, and also provides read access to your data in the secondary location. Read-access geo-redundant storage allows you to access your data from either the primary or the secondary location, in the event that one location becomes unavailable. Read-access geo-redundant storage is the default option for your storage account when you create it.
•Storage service encryption (blobs and files): Disabled (default)/Enabled: Disabled
◦https://azure.microsoft.com/documentation/articles/storage-service-encryption/
Azure Storage Service Encryption (SSE) for Data at Rest helps you protect and safeguard your data to meet your organizational security and compliance commitments. With this feature, Azure Storage automatically encrypts your data prior to persisting to storage and decrypts prior to retrieval. The encryption, decryption, and key management are totally transparent to users.
▪Who manages the encryption keys? The keys are managed by Microsoft.
▪Can I use my own encryption keys? We are working on providing capabilities for customers to bring their own encryption keys.
▪Can I revoke access to the encryption keys? Not at this time; the keys are fully managed by Microsoft.
•Secure transfer required: Disabled (default)/Enabled: Disabled
•Subscription: Free Trial
•Resource group: Create new/Use existing: becke-ch--app--s0-v1
•Location: West Europe
•Name: becke-ch-app-s0-v1-uc
◦Only lowercase letters, numbers and hyphens; must begin and end with a letter or a number; cannot contain 2 consecutive hyphens.
•Quota: (leave empty)
◦Max/Default quota size is 5120 GB
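A minimal PowerShell sketch of the same setup, assuming the AzureRM.Storage and Azure.Storage modules and the settings chosen above:
# General-purpose account, standard performance, RA-GRS replication
New-AzureRmStorageAccount -ResourceGroupName becke-ch--app--s0-v1 -Name beckechapp0v1 -Location "West Europe" -SkuName Standard_RAGRS -Kind Storage
# File share inside the account (in older AzureRM versions the key is exposed as .Key1 instead of a list)
$key = (Get-AzureRmStorageAccountKey -ResourceGroupName becke-ch--app--s0-v1 -Name beckechapp0v1)[0].Value
$ctx = New-AzureStorageContext -StorageAccountName beckechapp0v1 -StorageAccountKey $key
New-AzureStorageShare -Name becke-ch-app-s0-v1-uc -Context $ctx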
https://docs.microsoft.com/en-us/azure/storage/storage-security-guide
Build-Time: Administration: RBAC: At build time, i.e. for administration, security is based on RBAC (Role-Based Access Control) BUT ONLY IF the Resource Manager based deployment model was chosen! This means that authentication and authorization are based on AAD, i.e. we can bind the resource roles Owner/Contributor/Reader to AD users and/or AD security groups.
Run-Time: Data Security: Storage Account Keys or Shared Access Signatures and Stored Access Policies: Data Plane Security refers to the methods used to secure the data objects stored in Azure Storage – the blobs, queues, tables, and files.
•Storage Account Keys: Storage account keys are 512-bit strings created by Azure that, along with the storage account name, can be used to access the data objects stored in the storage account.
Giving out your storage account key is like sharing the keys of your storage kingdom. It grants complete access. Someone could use your keys and upload their entire music library to your storage account. They could also replace your files with virus-infected versions, or steal your data. Giving away unlimited access to your storage account is something that should not be taken lightly.
•Shared Access Signatures: A Shared Access Signature is a string containing a security token that can be attached to a URI that allows you to delegate access to storage objects and specify constraints such as the permissions and the date/time range of access.
◦Stored Access Policies: You can create a Stored Access Policy for an entire container, file share, table, or queue. Then you can use this as the basis for the SAS URIs you create. Permissions based on Stored Access Policies can be easily revoked. You can have up to 5 policies defined on each container, queue, table, or file share. If you have a logical set of parameters that are similar each time, using a Stored Access Policy is a better idea.
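A sketch of the Stored Access Policy / SAS combination described above, assuming the Azure.Storage module; the container name “data” and the policy name are illustrative:
# Build a storage context from the account key
$key = (Get-AzureRmStorageAccountKey -ResourceGroupName becke-ch--app--s0-v1 -Name beckechapp0v1)[0].Value
$ctx = New-AzureStorageContext -StorageAccountName beckechapp0v1 -StorageAccountKey $key
# Stored access policy on the container: read + list, valid for 7 days
New-AzureStorageContainerStoredAccessPolicy -Container data -Policy read-only-policy -Permission rl -ExpiryTime (Get-Date).AddDays(7) -Context $ctx
# SAS token that references the policy; revoking the policy revokes all SAS tokens based on it
New-AzureStorageContainerSASToken -Name data -Policy read-only-policy -Context $ctx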
If a user has no role on the storage account or on the higher level artifacts: resource group or subscription then the storage account and its underlying Blobs, Files, Tables and Queues are not visible.
Reader: Adding a user as reader to the storage account via IAM:
•Role: Reader
•Selected Members: Storage Account User - scope: 0: {stage={global}} - version: 1
storage-account-user--s0-v1@beckech.onmicrosoft.com
Adding a user as Reader gives him access to view the Storage Account BUT he cannot access the underlying Blobs, Files, Tables and Queues:
Access denied
You do not have access
Looks like you don't have access to this content. To get access, please contact the owner.
Contributor: Giving a user the contributor role gives him access to view the Storage Account AND to the underlying Blobs, Files, Tables and Queues. They can perform all operations on the storage account except assigning access. Someone with this role can view and regenerate the storage account keys. With the storage account keys, they can access the data objects.
•Data plane: Azure Portal: CRD: A contributor can easily access the data plane and C(reate) R(ead) and D(elete) data objects using the azure portal itself (no need to install a separate storage explorer client!)
◦Cons: This would actually be a nice way to use RBAC together with AAD to control access to the data plane as well, BUT unfortunately the contributor has too many rights: he can also read the storage keys and, using a storage key, access the storage account directly without AAD authentication (see the sketch below).
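A short sketch illustrating this drawback (same modules as in the sketches above): once the keys are readable, the data plane is reachable without any AAD check:
# A Contributor can read the account keys ...
$key = (Get-AzureRmStorageAccountKey -ResourceGroupName becke-ch--app--s0-v1 -Name beckechapp0v1)[0].Value
# ... and then access blobs/files/tables/queues authenticated only by the key
$ctx = New-AzureStorageContext -StorageAccountName beckechapp0v1 -StorageAccountKey $key
Get-AzureStorageContainer -Context $ctx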
https://docs.microsoft.com/en-us/azure/container-instances/
•Container name: becke-ch--httpd--s0-v1--2wayssl
•Container image type (Public | Private): Public
•Container image: httpd:2.4
•Resource Group: Create New: becke-ch--httpd--s0-v1
•Location: West Europe
•OS Type (Windows | Linux (default)): Linux
•Number of cores (1(default)-4): 1
•Memory GB (1.5 (default)): 0.5
•Networking:
◦Public IP Address: Yes
◦DNS name label: becke-ch--httpd--s0-v1--2wayssl
Under Configuration, specify a DNS name label for your container. The name must be unique within the Azure region you create the container instance. Your container will be publicly reachable at <dns-name-label>.<region>.azurecontainer.io.
◦Port (80 default): 80
◦Open additional ports: No
◦Port protocol (TCP (default) | UDP): TCP
•Advanced:
◦Restart Policy (Always (default) | On failure | Never): Always
◦Environment variable:
Setting environment variables in your container instances allows you to provide dynamic configuration of the application or script run by the container. To set environment variables in a container, specify them when you create a container instance. You can set environment variables when you start a container with the Azure CLI, Azure PowerShell, and the Azure portal.
◦Add additional environment variables: No
◦Command override:
Summary:
Basics
Subscription: MSDN Platforms
Resource group: becke-ch--httpd--s0-v1
Location: West Europe
Container name: becke-ch--httpd--s0-v1--2wayssl
Container image type: Public
Container image: httpd:2.4
Configuration
OS Type: Linux
Number of cores: 1
Memory (GB): 0.5
Public IP address: Yes
DNS name label: becke-ch--httpd--s0-v1--2wayssl
Port: 80
Open additional ports: No
Port protocol: TCP
Restart policy: Always
Environment variable: -
Add additional environment variables: No
Command override: -
And last but not least accessing the running container: http://becke-ch--httpd--s0-v1--2wayssl.westeurope.azurecontainer.io
Precondition: Install Azure CLI as described in chapter Error: Reference source not found.
az container exec --resource-group <group-name> --name <container-group-name> --exec-command "<command>"
C:\Users\raoul-becke--s0-v1>az container exec --resource-group becke-ch--httpd--s0-v1 --name becke-ch--httpd--s0-v1--2wayssl --exec-command "/bin/bash"
<c6f8662aae6477-803088677-tph13:/usr/local/apache2#
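For completeness, the container group from this chapter could also be created non-interactively with the Azure CLI instead of the portal; a sketch based on the settings summarized above (flag spellings may differ between CLI versions):
az container create --resource-group becke-ch--httpd--s0-v1 --name becke-ch--httpd--s0-v1--2wayssl --image httpd:2.4 --os-type Linux --cpu 1 --memory 0.5 --ip-address Public --ports 80 --dns-name-label becke-ch--httpd--s0-v1--2wayssl --location westeurope --restart-policy Always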
https://docs.microsoft.com/en-us/azure/cloud-services/cloud-services-choose-me
Azure Cloud Services is an example of a platform as a service (PaaS). Like Azure App Service, this technology is designed to support applications that are scalable, reliable, and inexpensive to operate. In the same way that App Service is hosted on virtual machines (VMs), so too is Azure Cloud Services. However, you have more control over the VMs. You can install your own software on VMs that use Azure Cloud Services, and you can access them remotely.
More control also means less ease of use. Unless you need the additional control options, it's typically quicker and easier to get a web application up and running in the Web Apps feature of App Service compared to Azure Cloud Services.
There are two types of Azure Cloud Services roles. The only difference between the two is how your role is hosted on the VMs:
•Web role: Automatically deploys and hosts your app through IIS.
•Worker role: Does not use IIS, and runs your app standalone.
It is suggested that all the VMs in a single application run in the same cloud service. Users access the application through a single public IP address, with requests automatically load balanced across the application's VMs. The platform scales and deploys the VMs in an Azure Cloud Services application in a way that avoids a single point of hardware failure.
Even though applications run in VMs, it's important to understand that Azure Cloud Services provides PaaS, not infrastructure as a service (IaaS). Here's one way to think about it. With IaaS, such as Azure Virtual Machines, you first create and configure the environment your application runs in. Then you deploy your application into this environment. You're responsible for managing much of this world, by doing things such as deploying new patched versions of the operating system in each VM. In PaaS, by contrast, it's as if the environment already exists. All you have to do is deploy your application. Management of the platform it runs on, including deploying new versions of the operating system, is handled for you.
There's nothing like a Virtual Machines data disk. Instead, an Azure Cloud Services application should explicitly write all state to Azure SQL Database, blobs, tables, or some other external storage. Building applications this way makes them easier to scale and more resistant to failure, which are both important goals of Azure Cloud Services.
1.Click “Create”
2.Enter the basic information:
a.DNS name: becke-ch--myapp--s0-v1
b.Subscription: …
c.Resource Group: …
d.Location: West Europe
3.Optional: Select a package: Optionally a package file “.cspkg” or “.zip” and a configuration file “.cscfg” can be uploaded from local file system or from blob storage and given a deployment label.
The problem with the private (trial) subscription in chapter A.1.1 is that a directory and a domain are created which do not fit our (naming convention) needs (for organization (trial) subscriptions – see chapter A.1.4 – we can choose a domain name that fits our needs):
(Trial) subscription:
•Email (is fine): ms-cloud--s0-v1@becke.ch
•Directory: msclouds0v1becke
This resulted from the private trial subscription and looks like it was derived from the email address, BUT it should be “beckech.onmicrosoft.com” for the base/root domain/directory and “beckech0v1.onmicrosoft.com” for further domains/directories – dashes “-” are not allowed, even though they are allowed in the DNS naming convention :-( !
•Domain: msclouds0v1becke (the same remark as for the directory above applies):
https://portal.azure.com/?whr=live.com#create/Microsoft.AzureActiveDirectory
•Organization name: becke.ch - directory - scope: 0: {stage={development}} - version 1
•Initial domain name (dashes “-” are not allowed in the ms-cloud (sub) domain names – even though they are allowed according to the DNS naming convention): beckechdir0v1
beckechdir0v1.onmicrosoft.com
•Country or regions: Switzerland