//Cloud notes from my desk -Maheshk

"Fortunate are those who take the first steps.” ― Paulo Coelho

[Azure Service Fabric] Five steps to achieve Event aggregation and collection using EventFlow in Service Fabric

Monitoring and diagnostics are a critical part of application development, whether you are diagnosing issues in production or during development. They make it easy to identify application issues, hardware issues, and performance data that point to areas for improvement. The workflow has three parts: 1) Event Generation, 2) Event Aggregation, 3) Analysis.

1) Event Generation –> creation and generation of events and logs. Logs can be infrastructure-level events (anything from the cluster) or application-level events (from the apps and services).

2) Event Aggregation –> generated events need to be collated and aggregated before they can be displayed.

3) Analysis –> the aggregated events are analyzed and visualized in some format.

Once we decide on the log provider, the next phase is aggregation. In Service Fabric, event aggregation can be achieved using (a) Azure Diagnostics logs (an agent installed on the VMs) or (b) EventFlow (in-process log collection).

Agent-based log collection is a good option if the event source and destination do not change and have a one-to-one mapping. Any change requires a cluster-level update, which can be tedious and time consuming. In this model, the logs land in storage first and then move on to the display phase.

With EventFlow, in-process logs are sent directly to a remote visualization service. Changing the data destination does not require any cluster-level change as it does with the agent-based approach; the destination can be changed at any time in the eventFlowConfig.json file. Depending on criticality, both can be used together if required. In general, Azure Diagnostics logs are recommended mostly for infrastructure-level log collection, whereas EventFlow is suggested for application-level logs. The last step is Event Analysis, where we analyze and visualize the incoming data. Azure Service Fabric has good integration support for OMS and Application Insights.

In this article, let us see how one can easily use EventFlow in a Service Fabric stateful application in five steps.

Step1:- Create a new Service Fabric project by selecting the “Stateful Service” template. Change the .NET Framework version of the project to 4.6 or above.

Step2:- Right-click the service project and manage its NuGet packages. Search for “Diagnostics.EventFlow” and add the following packages.

    Microsoft.Diagnostics.EventFlow.ServiceFabric
    Microsoft.Diagnostics.EventFlow.Outputs.ApplicationInsights
    Microsoft.Diagnostics.EventFlow.Inputs.EventSource


Step3:- Update the eventFlowConfig.json file as below. The EventFlow pipeline reads this JSON file to decide which event sources to capture and which destinations to send the data to, so modify it to capture the desired data and configure the desired destination.
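As a reference, here is a minimal sketch of what such a config might look like, assuming an EventSource input and an Application Insights output; the provider name and instrumentation key below are placeholders you would replace with your own values:

{
  "inputs": [
    {
      "type": "EventSource",
      "sources": [
        { "providerName": "MyCompany-EventFlowDemo-Stateful1" }
      ]
    }
  ],
  "outputs": [
    {
      "type": "ApplicationInsights",
      "instrumentationKey": "00000000-0000-0000-0000-000000000000"
    }
  ],
  "schemaVersion": "2016-08-11"
}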


Step4:- Update the “ServiceEventSource.cs” class. The name set in the EventSource attribute on this class is the provider name that eventFlowConfig.json must reference, so keep the two in sync.
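A minimal sketch of the relevant part is below, assuming the simplified provider name used in the config sketch above (the real template class has additional service-specific logging methods):

using System.Diagnostics.Tracing;

// The Name here must match the providerName configured under the EventSource input in eventFlowConfig.json.
[EventSource(Name = "MyCompany-EventFlowDemo-Stateful1")]
internal sealed class ServiceEventSource : EventSource
{
    public static readonly ServiceEventSource Current = new ServiceEventSource();

    [Event(1, Level = EventLevel.Informational, Message = "{0}")]
    public void ServiceMessage(string message)
    {
        WriteEvent(1, message);
    }
}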


Step5:- Instantiate the EventFlow pipeline in the service's startup code (Program.Main) and start writing service messages.
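A minimal sketch of Program.Main with the pipeline wrapped around service registration, assuming the default Stateful1 service from the template and the simplified ServiceEventSource above (the pipeline name string is only a health-entity identifier placeholder; by default the factory reads eventFlowConfig.json from the service's Config package):

using System;
using System.Threading;
using Microsoft.Diagnostics.EventFlow.ServiceFabric;
using Microsoft.ServiceFabric.Services.Runtime;

internal static class Program
{
    private static void Main()
    {
        // Keep the pipeline alive for the lifetime of the service process.
        using (var diagnosticsPipeline =
            ServiceFabricDiagnosticPipelineFactory.CreatePipeline("EventFlowDemo-Stateful1-DiagnosticsPipeline"))
        {
            ServiceRuntime.RegisterServiceAsync(
                "Stateful1Type",
                context => new Stateful1(context)).GetAwaiter().GetResult();

            ServiceEventSource.Current.ServiceMessage("Service registered; EventFlow pipeline is active.");

            Thread.Sleep(Timeout.Infinite);
        }
    }
}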


Deploy the application and confirm everything is green, with no deployment or dependency issues.


To verify the trace logs, log in to portal.azure.com > your Application Insights resource > Search, and refresh (allow a few minutes for the data to start flowing in).


Reference article:-

https://github.com/Azure/diagnostics-eventflow

https://docs.microsoft.com/en-us/azure/service-fabric/service-fabric-diagnostics-event-aggregation-eventflow


2017-07-11 Posted by | .NET, Azure Dev, C#, LogCollection, ServiceFabric, VS2017 |

[Azure PowerShell] How to improve the performance of blob copy by placing inline C#/.NET code

Performance issues are always hard and tricky to fix. Recently I was asked to check why a given PowerShell script took so long and got slower over heavy iteration. In the beginning the copy operation took only a few minutes, but over many iterations we saw the performance degrade to 2x, 3x…

Sample looping of copy command,

    foreach($SrcBlob in $SrcBlobs)
    {
          $DestBlob = "root/" + $SrcBlob.Name
          Start-AzureStorageBlobCopy -SrcBlob $SrcBlob.Name -SrcContainer $SrcContainerName -Context $SrcContext -DestBlob $DestBlob -DestContainer $DestContainerName -DestContext $DestContext -Force
    }

We also used Measure-Command to capture the execution duration, printing the value for each iteration and the total at the end of the loop to confirm. We ran the loop copying 5000+ blobs between storage accounts and found that the PowerShell cmdlet execution got slower with each iteration for some reason. On investigation, we confirmed this was due to a known limitation with PowerShell.

    $CopyTime = Measure-Command {
          Start-AzureStorageBlobCopy -SrcBlob $SrcBlob.Name -SrcContainer $SrcContainerName -Context $SrcContext -DestBlob $DestBlob -DestContainer $DestContainerName -DestContext $DestContext -Force
    }

Yes, there is a known issue with PowerShell running slowly over large loops, due to the way PowerShell works: on the 16th iteration the content of the loop is compiled dynamically, and .NET then runs some extra checks on every call. Thanks to some of our internal folks for providing clarity on this. To overcome it there is a suggested workaround, which is what we used here: we moved the copy logic into inline .NET code hosted in the PowerShell script to improve the performance. This way, the extra checks run only once instead of on every iteration. You will find detailed information here –> Why can’t PowerShell run loops fast? https://blogs.msdn.microsoft.com/anantd/2014/07/25/why-cant-powershell-run-loops-fast/

How to place C# as inline code within PowerShell:-

$reflib = (Get-Item "c:\temp\Microsoft.WindowsAzure.Storage.dll").FullName
[void][Reflection.Assembly]::LoadFrom($reflib)

$Source = @"
using Microsoft.WindowsAzure.Storage;
using Microsoft.WindowsAzure.Storage.Auth;
using Microsoft.WindowsAzure.Storage.Blob;
using System;

namespace ns
{
    public static class copyfn1
    {
        public static void Copy_bw_SA_Blobs_Test(string sourceAccountKey, string destAcKey, string SrcSAName, string DestSAName, string SrcContainerName, string DestContainerName, string SrcPrefix, string DestPrefix)
        {
            // Source account and container
            StorageCredentials scSrc = new StorageCredentials(SrcSAName, sourceAccountKey);
            CloudStorageAccount srcAc = new CloudStorageAccount(scSrc, true);
            CloudBlobClient cbcSrc = srcAc.CreateCloudBlobClient();
            CloudBlobContainer contSrc = cbcSrc.GetContainerReference(SrcContainerName);

            // Generate a SAS key and use it for delegated access
            SharedAccessBlobPolicy sasConstraints = new SharedAccessBlobPolicy();
            sasConstraints.SharedAccessExpiryTime = DateTime.UtcNow.AddHours(24);
            sasConstraints.Permissions = SharedAccessBlobPermissions.Write
                | SharedAccessBlobPermissions.List | SharedAccessBlobPermissions.Add
                | SharedAccessBlobPermissions.Delete | SharedAccessBlobPermissions.Create
                | SharedAccessBlobPermissions.Read;
            string sasContainerToken = contSrc.GetSharedAccessSignature(sasConstraints);

            // The URI string for the container, including the SAS token
            string containersas = contSrc.Uri + sasContainerToken;
            CloudBlobContainer container = new CloudBlobContainer(new Uri(containersas));

            // Destination account - no SAS required
            StorageCredentials scDst = new StorageCredentials(DestSAName, destAcKey);
            CloudStorageAccount DstAc = new CloudStorageAccount(scDst, true);
            CloudBlobClient cbcDst = DstAc.CreateCloudBlobClient();
            CloudBlobContainer contDst = cbcDst.GetContainerReference(DestContainerName);

            foreach (var eachblob in container.ListBlobs(SrcPrefix, true, BlobListingDetails.Copy))
            {
                CloudBlob srcBlob = (CloudBlob)eachblob;
                string srcpath = srcBlob.Name;
                string dstpath = (DestPrefix != "") ? srcpath.Replace(SrcPrefix, DestPrefix) : srcpath;
                Console.WriteLine("Files copying-" + dstpath);

                if (srcBlob.BlobType == BlobType.BlockBlob)
                {
                    CloudBlockBlob dstblob = contDst.GetBlockBlobReference(dstpath);
                    dstblob.StartCopy((CloudBlockBlob)srcBlob);
                }
                else if (srcBlob.BlobType == BlobType.AppendBlob)
                {
                    CloudAppendBlob dstblob = contDst.GetAppendBlobReference(dstpath);
                    dstblob.StartCopy((CloudAppendBlob)srcBlob);
                }
                else if (srcBlob.BlobType == BlobType.PageBlob)
                {
                    CloudPageBlob dstblob = contDst.GetPageBlobReference(dstpath);
                    dstblob.StartCopy((CloudPageBlob)srcBlob);
                }
            }
        }
    }
}
"@

Add-Type -ReferencedAssemblies $reflib -TypeDefinition $Source -Language CSharp -PassThru

[ns.copyfn1]::Copy_bw_SA_Blobs_Test("acc_key1", "acc_key2", "storage_acc1", "storage_acc2",
    "src_container_name", "dest_container_name", "sales/2017/Jan", "sales/2017/backup/")

 

With this inline C# code, we were able to reduce the copy duration and mitigate the performance degradation over heavy loops.

Let me know if this helps in some way, or if you hit any issue with it.

2017-04-17 Posted by | .NET, Azure, C#, Powershell |

[Azure Functions] Intro to Serverless applications in Azure

“Serverless computing (SC)” is one of the most cited buzzwords in cloud computing, alongside AI/ML, microservices, IoT, AR/VR/MR, etc. Almost every tech conference has at least one session on the topic. Serverless computing has been around for some time, but it has recently been adopted and used in conjunction with various cloud-born solutions because of its ease of use.

What actually is SC then? It is the next generation of “true” PaaS compute, also called the next layer of abstraction in cloud compute. In cloud computing terms it is called function as a service (FaaS), aka event-driven computing, where our code is invoked on demand. Studies predict that in the future (around 2022) we will be buying bunches of functions to meet many of today's simple requirements. In the serverless model, the cloud provider fully manages the underlying infrastructure, so as developers we focus only on our requirements and integration points. The idea is to break the application into siloed functions (say add, update, validate, send an email, or delete some backend records) invoked on demand by events submitted over REST API endpoints.

Background? AWS was the first to introduce this model of computing, with AWS Lambda in 2014. Within two years, Microsoft (Azure Functions), IBM (OpenWhisk), Google (Google Cloud Functions) and even mid-tier providers had their own offerings. At the current rate, we wouldn’t be surprised to see FaaS become the main tool used to get “many” disjointed computing tasks done outside the mainstream application: anything from a simple validation to complex log analysis or job submission, invoked from any device, platform, or language. A lot of cool things happening in this space (tooling, continuous delivery, source control integration support, etc.) bring this confidence.

Is it microservices? It may look similar to microservices, but it is actually not. In microservices we can group multiple such services to form a single service, whereas in serverless computing our code is executed in isolation (silos), autonomously and at a granular level.

Scenarios? It can be used for log analysis, IoT backends, cognitive bots, mobile backends, and REST APIs such as querying a weather forecast on a button-click event. The most common one is a user uploading an image, processing it (resize, crop, etc.), and producing the required size for the reading device type.

Benefits? The main advantages it brings are “cost”, “scalability”, “development cycle”, “tooling setup” and freedom from infrastructure maintenance. You pay only for the milliseconds of compute used while the function runs, not for a whole instance.

What does Azure offer in this space (serverless architectures)? Azure Functions allows developers already working with the Azure platform to easily integrate functions into their existing Azure-based workflow. It is a serverless compute service that executes a little bit of code on demand. It provides more or less the same features as AWS Lambda and Google Cloud Functions; this Stack Overflow page summarizes the differences in detail. The best part about this service is that it exposes “functions as a service” over HTTP endpoints, which can be triggered by timer-based events or by external service events such as a blob uploaded to storage or a data insertion in the backend. It can be scaled based on the customer's selection. By the way, Azure Functions is built on top of App Service and the WebJobs SDK.

Any other serverless service from Azure? Azure Functions and Azure Logic Apps both offer serverless compute. The key difference is that Logic Apps does not provide a way to write custom code.

Let us see sample Azure Functions usage in a real-time use case.


This example is borrowed from my Pluralsight training video to demonstrate the power of Azure Functions. In this demo, the user uploads an image to Azure Storage; the “BlobTrigger” event invokes the Azure Function, which then submits the image to the Cognitive Services Emotion API.

Step1:- Create a Cognitive Services Emotion API resource from the Azure portal and keep the key ready. https://www.microsoft.com/cognitive-services/en-us/Emotion-api/documentation


Step2:- Create a new storage account with a container named “images”; this is where we will drop the images that the Cognitive service will pick up as input.
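If you prefer to create the container from code instead of the portal, here is a minimal sketch using the classic storage SDK (the connection string is a placeholder; the container name must match the trigger path used later):

using Microsoft.WindowsAzure.Storage;
using Microsoft.WindowsAzure.Storage.Blob;

class CreateImagesContainer
{
    static void Main()
    {
        // Placeholder connection string for the "functionimages" storage account.
        var account = CloudStorageAccount.Parse(
            "DefaultEndpointsProtocol=https;AccountName=functionimages;AccountKey=<your_key>");
        var client = account.CreateCloudBlobClient();

        // "images" must match the blobTrigger path configured for the function.
        var container = client.GetContainerReference("images");
        container.CreateIfNotExists();
    }
}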


Step3:- Create a function, say “imageprocessingdemo”, by selecting the “BlobTrigger-CSharp” template, and use the code shown in the Code section below.


Step4:- Input the image path, storage account details from where the images will be pulled after the trigger event.


 

Step5:- Please note that the function gets compiled when you hit Save. Upon running, we can see the emotion scores for the uploaded image.

2017-03-18T21:39:32.590 GetUri:https://functionimages.blob.core.windows.net/images/crying-baby%5B1%5D.jpg?sv=2015-12-11&sr=b&sig=ZBktU%2F4T60SfgeTgTpQXF%2FF%2xxxxxNP93YN1U%3D&st=2017-03-18T21:09:32Z&se=2017-03-18T22:09:32Z&sp=r
2017-03-18T21:39:38.197 results:[{"faceRectangle":{"height":967,"left":1560,"top":653,"width":967},"scores":{"anger":1.18772381E-09,"contempt":1.18056471E-08,"disgust":3.18779553E-06,"fear":7.162184E-08,"happiness":5.468896E-10,"neutral":1.40484147E-07,"sadness":0.9999966,"surprise":1.34302474E-11}}]
2017-03-18T21:39:38.197 Function completed (Success, Id=ce4f2b14-9cee-4148-a111-d38f5d2aaa16)

Code:-

#r "Newtonsoft.Json"
#r "Microsoft.WindowsAzure.Storage"

using Microsoft.WindowsAzure.Storage.Blob;
using Newtonsoft.Json;
using System.Text;

public static async Task Run(CloudBlockBlob myBlob, string name, TraceWriter log)
{
    log.Info($"C# Blob trigger function processed blob\n Name:{name}");

    var uri = GetUri(myBlob);
    log.Info($"GetUri:{uri}");

    var analysis = await GetAnalysis(uri);
    log.Info($"results:{analysis}");
}

public static Uri GetUri(CloudBlockBlob blob)
{
    var sasPolicy = new SharedAccessBlobPolicy
    {
        Permissions = SharedAccessBlobPermissions.Read,
        SharedAccessStartTime = DateTime.Now.AddMinutes(-30),
        SharedAccessExpiryTime = DateTime.Now.AddMinutes(30)
    };
    var sastoken = blob.GetSharedAccessSignature(sasPolicy);
    var uri = new Uri($"{blob.Uri.ToString()}{sastoken}");
    return uri;
}

public static async Task<string> GetAnalysis(Uri uri)
{
    var client = new HttpClient();
    var data = JsonConvert.SerializeObject(new { url = uri.ToString() });
    var request = new HttpRequestMessage
    {
        RequestUri = new Uri("https://api.projectoxford.ai/emotion/v1.0/recognize"),
        Method = HttpMethod.Post
    };

    request.Headers.Add("Ocp-Apim-Subscription-Key", "e538a0xxfexxxxc6f41965894");
    request.Content = new StringContent(data, Encoding.UTF8, "application/json");

    var response = await client.SendAsync(request);
    var result = await response.Content.ReadAsStringAsync();
    return result;
}

Bindings:-

{
  "bindings": [
    {
      "name": "myBlob",
      "type": "blobTrigger",
      "direction": "in",
      "path": "images/{name}",
      "connection": "functionimages_STORAGE"
    }
  ],
  "disabled": false
}

A few other use cases where Azure Functions can be used:

  • Find and clean invalid data from the backend storage table every 15 minutes (Timer based; see the timer-binding sketch after this list)
  • Transform user-uploaded files in storage – say, convert to a different format, compress, resize, or insert into data rows (Event based)
  • Read a sales-data Excel file from OneDrive, analyze it, and create new sheets with charts (SaaS event processing)
  • Loaded web page calling a webhook > create an ad based on the user profile (Applications – serverless web application architectures)
  • Photo taken on mobile, webhook called to store it in Azure Storage, which is then scaled to different resolutions using Azure Functions (async background processing as a mobile backend)
  • Massage streaming data and transform it into structured data for DB insertion
  • Message sent to a chatbot and responding to the questions (real-time bot messaging)
  • ETL loads (Extract, Transform & Load)
  • Sending an email reminder after some alert/event, say an insurance renewal
  • Automated billing reminders
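For the timer-based cases, a minimal sketch of a timer-trigger binding is below. Azure Functions CRON expressions include a seconds field, so "0 */15 * * * *" fires every 15 minutes (the binding name is a placeholder):

{
  "bindings": [
    {
      "name": "myTimer",
      "type": "timerTrigger",
      "direction": "in",
      "schedule": "0 */15 * * * *"
    }
  ],
  "disabled": false
}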

Reference:-

https://functions.azure.com

https://tryappservice.azure.com

https://crontab.guru/  (useful to generate the “schedule” expression for timer-based triggers; don’t forget to add the seconds field)

https://www.troyhunt.com/azure-functions-in-practice/ 

http://mikhail.io/2017/03/azure-functions-as-facade-for-azure-monitoring/

Let me know if you have used Azure Functions for your interesting scenario.

2017-03-19 Posted by | .NET, Azure Functions, C#, PaaS, Storage |

[Azure Batch] Server failed to authenticate the request

Today I happened to work on this problem, where a developer running unit test code that creates Azure Batch jobs in a row and checks their status in a tight loop was getting a forbidden error. Interestingly, the first couple of job-creation calls worked, but after that the calls failed continuously. I grabbed a Fiddler log to see the request and response and noticed something like the trace below.
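For context, the job create/check/delete pattern looked roughly like this sketch (account URL, key, pool ID and job IDs are placeholders, not the developer's actual test harness):

using System;
using Microsoft.Azure.Batch;
using Microsoft.Azure.Batch.Auth;

class Repro
{
    static void Main()
    {
        // Placeholder Batch account URL, name and key.
        var credentials = new BatchSharedKeyCredentials(
            "https://xxxxxx.eastus2.batch.azure.com", "account_name", "account_key");

        using (BatchClient batchClient = BatchClient.Open(credentials))
        {
            for (int i = 0; i < 10; i++)
            {
                string jobId = "deleteJob" + i;

                CloudJob job = batchClient.JobOperations.CreateJob();
                job.Id = jobId;
                job.PoolInformation = new PoolInformation { PoolId = "pool1" };
                job.Commit();

                // Tight-loop status check: these GET requests are what the proxy started caching.
                CloudJob created = batchClient.JobOperations.GetJob(jobId);
                Console.WriteLine(jobId + ": " + created.State);

                batchClient.JobOperations.DeleteJob(jobId);
            }
        }
    }
}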

Fiddler trace:

HTTP/1.1 403 Server failed to authenticate the request. Make sure the value of Authorization header is formed correctly including the signature.

Content-Length: 864

Content-Type: application/json;odata=minimalmetadata

Server: Microsoft-HTTPAPI/2.0

request-id: fd81d621-xxxxx-b252-1db46324f1a6

Strict-Transport-Security: max-age=31536000; includeSubDomains

X-Content-Type-Options: nosniff

DataServiceVersion: 3.0

Date: Tue, 13 Dec 2016 15:51:34 GMT

Cache-Control: proxy-revalidate

Proxy-Connection: Keep-Alive

Connection: Keep-Alive

{
  "odata.metadata":"https://xxxxxx.eastus2.batch.azure.com/$metadata#Microsoft.Azure.Batch.Protocol.Entities.Container.errors/@Element","code":"AuthenticationFailed","message":{
    "lang":"en-US","value":"Server failed to authenticate the request. Make sure the value of Authorization header is formed correctly including the signature.\nRequestId:xxxxx-564c-4930-xxx-xxx\nTime:2016-12-13T15:51:34.2917937Z"
  },"values":[
    {
      "key":"AuthenticationErrorDetail","value":"The MAC signature found in the HTTP request '6nxk/xxxxxx/xmdrWt55RnMRmsg=' is not the same as any computed signature. Server used following string to sign: 'GET\n\n\n\n\n\n\nTue, 13 Dec 2016 15:42:18 GMT\n\n0x8D4236E9EB80606\n\n\nocp-date:Tue, 13 Dec 2016 15:51:35 GMT\n/xxxxx/jobs/deleteJob0\napi-version:2016-07-01.3.1'."
    }
  ]
}

Findings:

This appears to be some kind of caching – https://social.msdn.microsoft.com/Forums/SqlServer/en-US/67183d62-60ab-4ef0-a1ca-b765d85ea2f6/authenticationfailed?forum=azurebatch. It is caused by client-side caching at the proxy layer. The proxy server on the client side usually caches GET responses, and for subsequent GETs it tries to serve the request from its cache by sending a different request to the server with an If-Match header containing the ETag. This causes a problem here because the proxy server does not change the Authorization header; it reuses the same auth header the client sent. Hence, the auth header the client sends no longer matches what the server expects for the modified request.

Fix: 

Adding the below config value to the App.config file resolved the issue. The developer was able to continue executing his test harness, submitting and deleting 10 jobs successfully without caching issues.

<system.net>
    <requestCaching defaultPolicyLevel="NoCacheNoStore" />
</system.net>
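If changing App.config is not an option, the same no-cache policy can presumably be set in code at process startup; a minimal sketch, assuming the client runs on the .NET Framework HTTP stack (which honors the WebRequest cache policy):

using System.Net;
using System.Net.Cache;

static class CachePolicySetup
{
    // Call once at startup, before any Batch requests are issued.
    public static void DisableRequestCaching()
    {
        WebRequest.DefaultCachePolicy = new RequestCachePolicy(RequestCacheLevel.NoCacheNoStore);
    }
}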

Hope this helps.

2016-12-20 Posted by | Azure, Azure Batch, C#, VS2015 |

Quick tip on Service Fabric Remoting service development

Azure Service Fabric needs no introduction. It is our next-gen PaaS offering, also called PaaS v2. It has been used internally for many years, battle-tested, and released as an SDK for consumption. Some well-known offerings such as Azure SQL, Azure DocumentDB, and Skype run on Service Fabric. We already see the developer community using it in production and hear a lot of goodness about it.

It is free; anyone can download the SDK, develop and run on a laptop or in their own data center, or publish to Azure. It works on Windows and Linux as well. It has a lot of rich features over the previous PaaS offering (Cloud Services), so we are seeing a lot of traction from big companies considering it for critical applications.

This sample is based on this example:- https://azure.microsoft.com/en-us/documentation/articles/service-fabric-reliable-services-communication-remoting/

Service-side project settings: Set the platform target to x64 if you want to use the Reliable Collections or Reliable Actors APIs; failing to set this throws a binding exception like the one below.

System.BadImageFormatException was unhandled
  FileName=Microsoft.ServiceFabric.Services, Version=5.0.0.0, Culture=neutral, PublicKeyToken=31bf3856ad364e35
  FusionLog=Assembly manager loaded from:  C:\Windows\Microsoft.NET\Framework\v4.0.30319\clr.dll
Running under executable  D:\Cases_Code\remotingclienttest\bin\Debug\remotingclienttest.vshost.exe
--- A detailed error log follows.
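The platform target can be set via project Properties > Build > Platform target, or directly in the service project's .csproj; a minimal sketch of the relevant property (set it in each build configuration's PropertyGroup):

<PropertyGroup>
  <PlatformTarget>x64</PlatformTarget>
</PropertyGroup>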

 


 

For the client side/calling method, I did not see detailed setup information here: https://azure.microsoft.com/en-us/documentation/articles/service-fabric-reliable-services-communication-remoting/. I found that these three DLLs have to be referenced in the client-side project to consume the service; I simply copied them from the service-side sample packages folder to the calling project's folder.
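For reference, a minimal sketch of the client-side call, assuming the service exposes a remoting interface named IHelloWorldStateful and the default Int64-ranged partition scheme (the interface, application and service names below are placeholders):

using System;
using System.Threading.Tasks;
using Microsoft.ServiceFabric.Services.Client;
using Microsoft.ServiceFabric.Services.Remoting;
using Microsoft.ServiceFabric.Services.Remoting.Client;

// Placeholder remoting contract; it must match the interface implemented by the service.
public interface IHelloWorldStateful : IService
{
    Task<string> GetHelloWorld();
}

public static class Client
{
    public static async Task CallServiceAsync()
    {
        // Placeholder application/service name from the Service Fabric application manifest.
        var serviceUri = new Uri("fabric:/MyApplication/HelloWorldStateful");

        // Partition key 0 assumes the default Int64-ranged partition covering that key.
        IHelloWorldStateful proxy = ServiceProxy.Create<IHelloWorldStateful>(
            serviceUri, new ServicePartitionKey(0));

        string message = await proxy.GetHelloWorld();
        Console.WriteLine(message);
    }
}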


sample code available – https://1drv.ms/u/s!ApBwDDnGdg5BhNd-KQHtWtaH-sbRcA

2016-11-13 Posted by | .NET, Azure Dev, C#, Microservices, PaaS, ServiceFabric, VS2015 |

How to list all available VM sizes in a region using .NET (ARM endpoint)

Today I had a query from a developer asking how to silently authenticate and fetch the list of available VM sizes in a particular region using .NET code. More precisely, they wanted to fetch this detail from their worker role by calling the URI from this article with silent authentication: https://msdn.microsoft.com/en-us/library/azure/mt269440.aspx

Method: GET
Request URI: https://management.azure.com/subscriptions/{subscription-id}/providers/Microsoft.Compute/locations/{location}/vmSizes?api-version={api-version}

At first sight I thought this was an RDFE endpoint (older portal/SMAPI), but on closer look it turned out to be an ARM endpoint.

How to identify whether the URL is an RDFE or ARM endpoint? URLs under management.core.windows.net are RDFE (classic Service Management) endpoints, whereas URLs under management.azure.com are ARM (Azure Resource Manager) endpoints.

Please note that for an RDFE endpoint we would have to use either certificate-based or native-client authentication.

Since this is an ARM endpoint, we need to follow the service principal route to get the bearer token needed for the GET calls to the URI.

Step1:-

Perform the following actions one by one carefully, as described here – https://azure.microsoft.com/en-us/documentation/articles/resource-group-create-service-principal-portal/

  1. Create an Active Directory application
  2. Get client id and authentication key
  3. Get tenant id
  4. Set delegated permissions
  5. Assign application to role

Step2:-

using System;
using System.IO;
using System.Net;
using Microsoft.IdentityModel.Clients.ActiveDirectory;

namespace ConsoleApplication1
{
    class Program
    {
        static void Main(string[] args)
        {
            var context = new AuthenticationContext("https://login.microsoftonline.com/" + "your_tenantid");
            ClientCredential credential = new ClientCredential("your_client_ID", "your_client_secret");
            AuthenticationResult result = context.AcquireToken("https://management.azure.com/", credential);
            var token = result.CreateAuthorizationHeader().Substring("Bearer ".Length);

            string uri = @"https://management.azure.com/subscriptions/<your_subscription_Id>/providers/Microsoft.Compute/locations/Southeast Asia/vmSizes?api-version=2015-05-01-preview";
            HttpWebRequest request = (HttpWebRequest)HttpWebRequest.Create(uri);
            request.Headers.Add("Authorization:Bearer " + token);
            var response = request.GetResponse().GetResponseStream();
            var output = new StreamReader(response).ReadToEnd();

            Console.WriteLine(output);
        }
    }
}

P.S.:- I used ADAL 2.28.2.795 to avoid async complexities.

On executing, the response JSON listing the available VM sizes for the region is printed to the console.


2016-10-19 Posted by | .NET, AAD, ARM, Azure, Azure Dev, C#, PaaS |
