//Cloud notes from my desk -Maheshk

“Fortunate are those who take the first steps.” ― Paulo Coelho

Ignite 2019 – Wednesday session #mywatchlist #todo

(1) Figuring out Azure Functions

Tailwind Traders is curious about the concept behind “serverless” computing – the idea that they can run small pieces of code in the cloud, without having to worry about the underlying infrastructure. In this session, we cover the world of Azure Functions, starting with an explanation of the servers behind serverless, exploring the languages and integrations available, and ending with a demo of when to use Logic Apps and Microsoft Flow.

From <https://myignite.techcommunity.microsoft.com/sessions/83218?source=sessions>

(2) Moving your database to Azure

Northwind kept the bulk of its data in an on-premises data center, which hosted servers running both SQL Server and MongoDB. After the acquisition, Tailwind Traders worked with the Northwind team to move their data center to Azure. In this session, see how to migrate an on-premises MongoDB database to Azure Cosmos DB and an on-premises SQL Server database to Azure SQL Database. From there, walk through performing the migration and ensuring minimal downtime while you switch over to the cloud-hosted providers. From <https://myignite.techcommunity.microsoft.com/sessions/82989?source=sessions>

(3) Migrating web applications to Azure

When Tailwind Traders acquired Northwind earlier this year, they decided to consolidate their on-premises applications with Tailwind Traders’ current applications running on Azure. Their goal: vastly simplify the complexity that comes with an on-premises installation. In this session, examine how a cloud architecture frees you up to focus on your applications, instead of your infrastructure. Then, see the options to “lift and shift” a web application to Azure, including: how to deploy, manage, monitor, and backup both a Node.js and .NET Core API, using Virtual Machines and Azure App Service. From <https://myignite.techcommunity.microsoft.com/sessions/82988?source=sessions>

(4) Azure Portal: 10 tips to get more out of Azure

Whether you’re new to Azure or a seasoned Azure user, come learn everything that’s new in the Azure Portal to be more productive managing Azure resources. We also walk you through tips and tricks for creating and managing your resources efficiently. From <https://myignite.techcommunity.microsoft.com/sessions/84100?source=sessions>

(5) Debugging and interacting with production applications

Now that Tailwind Traders is running fully on Azure, the developers must find ways to debug and interact with the production applications with minimal impact and maximal efficiency. Azure comes with a full set of tools and utilities that can be used to manage and monitor your applications. In this session, see how streaming logs work to monitor the production application in real time. We also talk about deployment slots that enable easy A/B testing of new features and show how Snapshot Debugging can be used to live debug applications. From there, we explore how you can use other tools to manage your websites and containers live. From <https://myignite.techcommunity.microsoft.com/sessions/82991?source=sessions>

(6) .NET platform overview and roadmap

Join Scott Hunter and Scott Hanselman as they talk about the present and future of .NET. Come see the latest updates in .NET Core 3.1 and how you can be more productive building intelligent applications that run on any platform. Learn about the latest Windows desktop support, features for building better web apps and microservices, and much more. Also, see what the future has in store with .NET 5. From <https://myignite.techcommunity.microsoft.com/sessions/81592?source=sessions>

(7) Why you should care about containers and how to get started

A lot has been said about containers recently, but why should you care? Containers are not an “all or nothing” proposition, and understanding when they can be beneficial is key to a successful implementation. Come and learn from the containers team how you can get started with this technology, plus some tips and tricks that will help you in your containerization journey! From <https://myignite.techcommunity.microsoft.com/sessions/89324?source=sessions>

(8) Deploy apps to Kubernetes using CI/CD in 20 minutes

With 20 minutes on the clock, can you go from code in a Git repo, to containers running on a Kubernetes cluster, all while following some best practices? Join this demo-driven session to see how you can use Azure Pipelines to create your build and release pipelines as code, build containers and deploy them to Azure Kubernetes Service.

From <https://myignite.techcommunity.microsoft.com/sessions/83971?source=sessions>

(9) Five powerful tips to make the most of your mentor/mentee relationship

Mentors have a powerful impact in the workplace, but relationships don’t always go as we envisioned. Whether you are looking for your first mentor or you want to help someone else, you may find the process frustrating. Mentoring is more than just telling someone what to do, it requires the ability to understand the mentee and their goals. As the mentee it requires showing up prepared and having some kind of idea what you want to achieve. How do you find the right mentor? How can you prepare to be coached when you’re not sure what you want? What can you do to ensure you get the most out of your mentoring relationship? We talk success strategies and ways you can find the right mentoring relationship to bring out the best in both of you. From <https://myignite.techcommunity.microsoft.com/sessions/78900?source=sessions>

(10) Building event driven apps with Azure Cosmos DB and Azure Functions

Users expect modern apps to offer event-driven, real-time experiences. In this demo-packed session, learn how to build event-driven apps using Azure Cosmos DB and Azure Functions. Learn how to subscribe to changes in Azure Cosmos DB collections and trigger logic in real time without needing to manage any servers. Understand real-world use cases in multi-billion-dollar industries, from retail and IoT to gaming and more. From <https://myignite.techcommunity.microsoft.com/sessions/79933?source=sessions>

(11) Azure CLI Deep Dive for Developers, DevOps and Architects

Are you a developer or architect who likes to discover, prototype, and experiment with the cloud using a command line? Are you a DevOps professional who wants to automate deployments using your tool of choice, like GitHub, Jenkins, etc.? In this session, we’ll take a deep dive into how you can use the Azure CLI for these scenarios, separately and together. From <https://myignite.techcommunity.microsoft.com/sessions/85030?source=sessions>

(12) Building enterprise capable serverless applications

Industry and customer needs push enterprises to innovate and modernize their applications at a faster rate than ever before. Serverless solutions are a clear and natural choice for such demand due to their proven developer productivity gains. However, enterprises also require services that can respond to their critical needs around networking, security, performance, DevOps, the ability to run on-premises, and compatibility with industry standards (e.g. Kubernetes). In this session, we explore how serverless development on the Azure platform helps satisfy all these requirements. We go over the details of enterprise scenarios, such as resource automation, highlighting premium capabilities in Azure Functions and other Azure services. From <https://myignite.techcommunity.microsoft.com/sessions/81605?source=sessions>

(13) Staying up to date with the Microsoft Azure roadmap

Cloud services like Azure are evolving faster than any other technology we use today. However, as a technologist responsible for helping your organization keep up with this pace of change and make sense of it all, it is easy to be overwhelmed. In this session, the Azure Service Operations team shares how we track, manage, and communicate change – so you can stay ahead of new capabilities, changes, and deprecations in Azure. From <https://myignite.techcommunity.microsoft.com/sessions/83896?source=sessions>

(14) Career skills: IT pro to cloud pro – strap on your jetpack!

With Azure, Microsoft Teams, Microsoft Security, Windows Autopilot, and Microsoft Intune among the hottest topics in IT today, you need to seriously up your game. We are seeing a seismic shift toward the cloud, and IT professionals need the skills to embrace the cloud, collaboration, and mobility. Join cloud professional and career mentor Andrew Bettany (MCT, MVP, and author) as he outlines the key skill areas you must embrace to succeed. From <https://myignite.techcommunity.microsoft.com/sessions/79810?source=sessions>

(15) Breaking the mold: The tech career journey guide

The world of years past is no more. We all live in a world of change, from technology to our personal lives; change is the only constant. Technology is changing fast, and with that come challenges, especially when we think about careers and the journey you go through. Changing roles, outsourcing, and downsizing, followed by the increasing use of cloud services, have definitely disrupted this industry. What is a person to do, take the BLUE pill or the RED pill? Join this session as we discuss current challenges for professionals today and the steps you can take for your career journey. From <https://myignite.techcommunity.microsoft.com/sessions/83489?source=sessions>

(16) Azure architecture framework: How to leverage it to improve your workloads

Evaluate your workload against the five pillars of the Azure architecture framework (resiliency, scalability, cost, security, and operations). Come learn how you can leverage this powerful framework to improve your workloads to meet your business needs. With a concrete demo using one of the reference architectures published in the Azure Architecture Center, we highlight how it can be improved and why certain decisions were made. From <https://myignite.techcommunity.microsoft.com/sessions/87490?source=sessions>

(17) Deploy an app in Azure Kubernetes and App Services with MySQL

Come and join us to see how easy it is to quickly deploy a web-scale application leveraging the tight integration between Azure Kubernetes Service, App Services, and Azure Database for MySQL. We share tips and tricks for how to get the best performance and the most secure setup for your applications. From <https://myignite.techcommunity.microsoft.com/sessions/83538?source=sessions>

(18) Microservices and cloud native: Standard Bank’s DevOps journey

At Standard Bank we’re always exploring new technologies, and we’ve deployed two separate apps in the insurance space built using microservices and relying on many Azure services. However, the real hero in our digital transformation journey was Azure DevOps. We look at how we used Azure Pipelines and other DevOps services on Azure to build and deploy our apps. This includes App Center (for building and distributing Kotlin, Swift, and React Native apps), and infrastructure-as-code with Azure Pipelines and ARM to automate our infra and app deployments. We also explain how Azure DevOps has had a positive effect on our working environment, boosting team confidence, and why we feel it is the ideal platform for us moving forward. From <https://myignite.techcommunity.microsoft.com/sessions/83987?source=sessions>

(19) How to use automation for Linux workloads deployment on Azure

Automation is a life saver for IT implementers and site reliability engineers. A good configuration can save time and energy while maintaining the best possible performance for all your pipelines. Join the Azure Customer Support team to hear best practices on using Terraform + Ansible to deploy your Linux workloads running on Azure, easily connect the services that matter, and reduce manual work and overhead. From <https://myignite.techcommunity.microsoft.com/sessions/81978?source=sessions>

(20) Streaming data with Azure Event Hubs and Kafka

Come learn about the newest Azure Event Hubs support for Kafka, enabling both fully-managed and open-source streaming solutions.

From <https://myignite.techcommunity.microsoft.com/sessions/83937?source=sessions>

(21) Keep on learning and look ten years younger!

Speakers

Bill Ayers – Flow Simulation Ltd.

IT and development are seen by many as a young person’s game; we’ve done a lot to encourage women, which is great, but there’s also a stigma around age and an expectation that you have to move into management as your career progresses. But not everybody is suited to that. So how do you stay relevant and keep track of constantly changing technology? See strategies for learning and keeping up to date using resources like Microsoft Learn and certification. And if you find yourself in a non-technical role, there’ll be some pointers on how to get back into technology.

(22) Nine-and-a-half years without certification

We love what we do – but many of us don’t love exams, and many of us perform our day-to-day roles or have gone through our professional lives without ever getting certified. Is certification necessary for our success? Learn why certification is necessary for long-term career development, why we must consistently challenge workplace cultures where certification is not valued and why we must deconstruct and challenge our own reasons for not owning our continuing professional development. From <https://myignite.techcommunity.microsoft.com/sessions/78972?source=sessions>

(23) Mentoring: Is it your most powerful networking tool?

Mentorship, whether that be as a mentor or as a mentee, is perhaps one of the single most important things that one can look to do as part of undertaking long-term career improvements. Come and learn a bit more about what mentorships can entail, both as a mentor or as a mentee and just how simple and easy it can be to get started.

From <https://myignite.techcommunity.microsoft.com/sessions/80806?source=sessions>

(24) Bring the performance of your Azure applications to the next level

The combination of proximity placement groups, generation 2 Azure VMs, Azure Ultra Disks, pre-provisioning services and ephemeral OS disks can bring the performance of your applications to the next level. Choosing the right Azure VM and taking advantage of Azure Advisor recommendations gives you additional ways to get the most out of Azure IaaS for your workloads. This engaging session demonstrates practical ways to significantly improve the performance of your Azure applications. From <https://myignite.techcommunity.microsoft.com/sessions/82916?source=sessions>

(25) Optimize Azure spend while maximizing cloud potential

Learn how Microsoft has migrated 94% of its workloads to Azure and how we’re optimizing our cloud spend from Microsoft Core Services Engineering and Operations (CSEO)—the experts who build, deploy, and operate the systems that run the Microsoft enterprise. This group provides the networking, infrastructure, and services for Microsoft’s line-of-business teams and in this session, hear what we’ve learned along the way and how we’ve matured our cost governance model from reactive to proactive. We share practical guidance to optimize your cloud spend while maximizing your cloud potential. From <https://myignite.techcommunity.microsoft.com/sessions/84625?source=sessions>

(26) Java on Azure: Building Spring Boot microservices

Spring Boot and Spring Cloud are the de-facto choices for many companies building microservices, using Java. In this session, we share our best practices and tools to go from development to production using Spring Boot and Azure, with a specific focus on microservices configuration, resiliency and scalability in the cloud. We also cover monitoring and security and discuss how Spring Boot applications can scale and handle failure on Azure. If you are thinking about building microservices, this is a session you don’t want to miss.

From <https://myignite.techcommunity.microsoft.com/sessions/81594?source=sessions>

(27) What best practices are used in Azure when deploying changes?

Azure services get frequent updates to bring features and fixes. Azure teams follow best practices that help us reduce the risk of deploying changes. In this chat we discuss how Azure deploys updates, share the learnings we had along the way, and talk about what steps we are taking to start sharing our internal processes and tooling with our customers. From <https://myignite.techcommunity.microsoft.com/sessions/89336?source=sessions>

2019-11-25 Posted by | #ignite2019, .NET, OSS, PaaS, Uncategorized | , | Leave a comment

Why Azure Kubernetes Service (AKS) vs others

What is AKS?
- Deploys a managed Kubernetes cluster in Azure.
- Reduces the complexity and operational overhead of managing Kubernetes by offloading much of that responsibility to Azure.
- Handles critical tasks like health monitoring and maintenance for you.
- Masters are managed by Azure; you only manage and maintain the agent nodes.
- Free: you pay only for the agent nodes, not for the masters.

Why AKS vs others?
- Streamlined application onboarding with integrated VSTS CI/CD via DevOps Projects
- Deep integration with Azure Monitor and Log Search
- Azure Dev Spaces for AKS enables multiple developers to collaborate and rapidly iterate on and debug microservices directly in an AKS dev environment
- Open source thought leadership through projects like Virtual Kubelet, Helm, Draft, Brigade, and Kashti, and our contributions to the open source community
- Support for scenarios such as elastic bursting using Azure Container Instances (ACI) and Virtual Kubelet
- Use Key Vault for increased security and control over Kubernetes keys and passwords; create and import encryption keys in minutes
- Developers and operations can be assured their workloads get automated OS and framework patching with ACR Build
- Rich tooling support: VS Code/Visual Studio integration (VS Code is a free code editor; try it today, you’ll thank us)

Best practice guidance
———————-
> For integration with existing virtual networks or on-premises networks, use advanced networking in AKS; it also provides greater separation of resources and controls in an enterprise environment.

Two different ways to deploy AKS clusters into virtual networks:
+ Basic networking – Azure manages the virtual network resources as the cluster is deployed and uses the kubenet Kubernetes plugin.
+ Advanced networking – Deploys into an existing virtual network, and uses the Azure Container Networking Interface (CNI) Kubernetes plugin. Pods receive individual IPs that can route to other network services or on-premises resources.
The Container Networking Interface (CNI) is a vendor-neutral protocol that lets the container runtime make requests to a network provider. Azure CNI assigns IP addresses to pods and nodes and provides IP address management (IPAM) features as you connect to existing Azure virtual networks. Each node and pod receives an IP address in the Azure virtual network, and no additional routing is needed for them to communicate.
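Because every node and pod draws an IP from the subnet under advanced networking, it pays to size the subnet up front. A minimal Python sketch of the calculation, following the AKS sizing guidance; the function name and the one-node upgrade headroom are my assumptions:

```python
# Back-of-the-envelope IP planning for Azure CNI (advanced networking).
# Every node and every pod consumes one IP from the subnet, so reserve
# enough for the nodes plus their maximum pod counts, with one extra
# node's worth of headroom for a rolling upgrade.

def cni_ips_required(node_count: int, max_pods_per_node: int = 30) -> int:
    """Return the number of subnet IPs to reserve for an AKS cluster."""
    nodes = node_count + 1  # one spare node for upgrade headroom (assumption)
    return nodes + nodes * max_pods_per_node  # node IPs + pod IPs

# e.g. a 3-node cluster with 30 pods per node
print(cni_ips_required(3))
```

For a 3-node cluster this reserves 124 addresses, which is why a /25 or larger subnet is a common starting point.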

$ az aks create --resource-group myAKSCluster --name myAKSCluster --generate-ssh-keys \
--aad-server-app-id <server-app-id> \
--aad-server-app-secret <server-app-secret> \
--aad-client-app-id <client-app-id> \
--aad-tenant-id <tenant-id>

$ az aks get-credentials --resource-group myAKSCluster --name myAKSCluster --admin
Merged "myAKSCluster-admin" as current context ..

$ kubectl get nodes

NAME STATUS ROLES AGE VERSION
aks-nodepool1-42032720-0 Ready agent 1h v1.9.6
aks-nodepool1-42032720-1 Ready agent 1h v1.9.6
aks-nodepool1-42032720-2 Ready agent 1h v1.9.6

2019-03-06 Posted by | AKS, Azure Dev, Kubernetes, Linux, Microservices, PaaS | | Leave a comment

[Azure PaaS] Why to consider Azure PaaS?

Happy new year! As a Technical Evangelist, I work with a bunch of ISVs, where I meet technical architects for engagements that include architecture design talks, reviews, or POC assistance. To me, it is much easier to work with “Uber” architects than with PowerPoint architects, who resist PaaS services aggressively and often describe them as a burden. As a leader in PaaS offerings, we encourage developers and startups to focus on their “apps” rather than spending time managing the underlying resources. PaaS gives us real freedom to operate in the cloud easily and also provides an easy integration path to various tools, reports, and monitoring. Here I wanted to list the advantages of adopting PaaS services over IaaS.

1) Open and hybrid: Azure is committed to being open, speaking the languages customers choose, and supporting the platforms they prefer. We openly work with many open source communities, integrating their demands, acknowledging the direction of the development community, and contributing fixes back to community repos; this puts us in a win/win situation. The hybrid paths and options in Azure are very useful for leveraging existing on-premises investments. Azure PaaS provides a high-control, high-productivity, intelligent platform that you can consider for both open source and hybrid scenarios.

2) Data-driven intelligence: Azure is the only cloud provider with comprehensive monitoring solutions that cover both cloud and on-premises in a single pane. Azure Monitor, Application Insights, Log Analytics, and related services help you gather intelligence about your services easily, and Azure Security Center helps detect threats early while avoiding false positives.

3) Continuous innovation: Azure services integrate easily with VSTS, which provides end-to-end DevOps covering planning, build and release, and the required tooling. Azure Cognitive Services and AI offerings are way ahead of other providers; in fact, Azure provides 29 Cognitive Services APIs, which is a true value proposition and a sign of innovation.

4) Cross-platform: Using Visual Studio tools for Xamarin, developers can build native Android, iOS, and Windows apps sharing 90% of the code across device platforms. Non-developers can also build scalable applications for desktop and mobile on top of cloud and on-premises services using PowerApps (no coding skills required).

5) Productivity and tools: Azure offers unparalleled developer productivity for PaaS development with Visual Studio, through which you can even step into remote code in a few minutes (remote debugging).


Do you still see PaaS as a burden?

#StayLearning..

2018-01-18 Posted by | Azure, Open Source, PaaS | | Leave a comment

[Azure Service Fabric] Use of EnableDefaultServicesUpgrade property

Recently I hit an issue where a Service Fabric application upgrade failed to deploy as expected after changing the instance count in Cloud.xml. Here is what I tried and the error I received.

Problem:

  1. Create a stateless project with the latest Azure Service Fabric SDK 5.5.
  2. Deploy first with Stateless1_InstanceCount set to -1 (default).
  3. Now set Stateless1_InstanceCount to, say, 2 in Cloud.xml and redeploy with the upgrade option checked.

While publishing this upgrade from Visual Studio, I saw that a property value was expected to be “true”, but there was no clue at first glance.

Visual Studio error:

1>------ Build started: Project: Application3, Configuration: Debug x64 ------
2>------ Publish started: Project: Application3, Configuration: Debug x64 ------
2>Started executing script 'GetApplicationExistence'.
2>Finished executing script 'GetApplicationExistence'.
2>Time elapsed: 00:00:01.5800095
-------- Package started: Project: Application3, Configuration: Debug x64 ------
Application3 -> D:\Cases_Code\Application3\Application3\pkg\Debug
-------- Package: Project: Application3 succeeded, Time elapsed: 00:00:00.7978341 --------
2>Started executing script 'Deploy-FabricApplication.ps1'.
2>. 'D:\Cases_Code\Application3\Application3\Scripts\Deploy-FabricApplication.ps1' -ApplicationPackagePath 'D:\Cases_Code\Application3\Application3\pkg\Debug' -PublishProfileFile 'D:\Cases_Code\Application3\Application3\PublishProfiles\Cloud.xml' -DeployOnly:$false -ApplicationParameter:@{} -UnregisterUnusedApplicationVersionsAfterUpgrade $false -OverrideUpgradeBehavior 'None' -OverwriteBehavior 'SameAppTypeAndVersion' -SkipPackageValidation:$false -ErrorAction Stop
2>Copying application package to image store...
2>Copy application package succeeded
2>Registering application type...
2>Register application type succeeded
2>Start upgrading application...
2>Unregister application type '@{FabricNamespace=fabric:; ApplicationTypeName=Application3Type; ApplicationTypeVersion=1.1.0}.ApplicationTypeName' and version '@{FabricNamespace=fabric:; ApplicationTypeName=Application3Type; ApplicationTypeVersion=1.1.0}.ApplicationTypeVersion' ...
2>Unregister application type started (query application types for status).
2>Start-ServiceFabricApplicationUpgrade : Default service descriptions can not be modified as part of upgrade.
2>Modified default service: fabric:/Application3/Stateless1. To allow it, set EnableDefaultServicesUpgrade to true.
2>At C:\Program Files\Microsoft SDKs\Service
2>Fabric\Tools\PSModule\ServiceFabricSDK\Publish-UpgradedServiceFabricApplication.ps1:248 char:13
2>+             Start-ServiceFabricApplicationUpgrade @UpgradeParameters
2>+             ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
2>    + CategoryInfo          : InvalidOperation: (Microsoft.Servi...usterConnection:ClusterConnection) [Start-Servi
2>   ceFabricApplicationUpgrade], FabricException
2>    + FullyQualifiedErrorId : UpgradeApplicationErrorId,Microsoft.ServiceFabric.Powershell.StartApplicationUpgrade
2>
2>Finished executing script 'Deploy-FabricApplication.ps1'.
2>Time elapsed: 00:00:22.5520036
2>The PowerShell script failed to execute.
========== Build: 1 succeeded, 0 failed, 1 up-to-date, 0 skipped ==========
========== Publish: 0 succeeded, 1 failed, 0 skipped ==========

Upon searching our internal discussion forum, I noticed this property needs to be updated through resources.azure.com or through PowerShell.

By default, the instance count is set to “-1” in Cloud.xml or the application manifest XML; the value “-1” deploys to all available nodes. In some situations you may need to reduce the instance count, and if that is the case, follow either of the options below.

Option #1 (update through the resources.azure.com portal)

1) From the error message it is clear that the SF cluster expects the property “EnableDefaultServicesUpgrade” to be set to true to proceed with this upgrade.

2) This link explains adding SF cluster settings from the resources.azure.com portal – https://docs.microsoft.com/en-us/azure/service-fabric/service-fabric-cluster-fabric-settings (refer to the steps at the top of the page).

3) Update your cluster settings as below and wait at least 30-40 minutes, depending on the number of nodes, etc.
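For reference, the setting lands in the cluster resource’s fabricSettings array under the ClusterManager section; a sketch of the fragment to add to the PUT body, with the shape following the Service Fabric cluster settings docs:

```json
"fabricSettings": [
  {
    "name": "ClusterManager",
    "parameters": [
      {
        "name": "EnableDefaultServicesUpgrade",
        "value": "true"
      }
    ]
  }
]
```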


4) After this PUT request, you will see a small banner message saying the cluster is upgrading on the portal.azure.com SF cluster overview blade.

5) Wait till the upgrade banner goes away, then run a GET from resources.azure.com to confirm the value is now reflected.

Option #2 (update through PowerShell)

You can use the below PS to update this value.

$ClusterName = "<your client connection endpoint, e.g. abc.westus.cloudapp.azure.com:19000>"
$CertThumbprint = "xxxxxx5a813118ef9cf523a4df13d"

Connect-ServiceFabricCluster -ConnectionEndpoint $ClusterName -KeepAliveIntervalInSec 10 `
    -X509Credential `
    -ServerCertThumbprint $CertThumbprint `
    -FindType FindByThumbprint `
    -FindValue $CertThumbprint `
    -StoreLocation CurrentUser `
    -StoreName My

Update-ServiceFabricService -Stateless fabric:/KeyPair.WebService/KeyPairAPI -InstanceCount 2

https://docs.microsoft.com/en-us/powershell/module/servicefabric/update-servicefabricservice?view=azureservicefabricps 

Final step:

After the settings update, go back to Visual Studio (2017) and try publishing the app upgrade again. This time the application should deploy without any error.

You can confirm this by checking the number of nodes where the app is deployed. From the Service Fabric Explorer (SFX) portal, you can see the application deployed on just 2 nodes instead of all available nodes.

I had a 3-node cluster, where I set the instance count to 2 to see the reduction.

Note: the only caveat is that the SFX portal manifest won’t reflect the latest instance count; it will still show “-1”, which you can ignore.

2017-05-24 Posted by | ARM, Azure, PaaS, Powershell, ServiceFabric | | 1 Comment

[Azure Billing] Sample REST API postman request

The Azure Billing REST APIs help you predict and manage Azure costs. The following APIs are available to query billing data.

1) Resource Usage (Preview)

Get consumption data for an Azure subscription- https://msdn.microsoft.com/en-us/library/azure/mt219001.aspx

Sample working query:-
https://management.azure.com/subscriptions/your-subscription-id/providers/Microsoft.Commerce/UsageAggregates?api-version=2015-06-01-preview&reportedStartTime=2014-05-01T00%3a00%3a00%2b00%3a00&reportedEndTime=2017-04-01T00%3a00%3a00%2b00%3a00&aggregationGranularity=Daily&showDetails=false
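If you prefer to assemble the query programmatically, here is a small Python sketch that URL-encodes the same parameters; the helper name is mine and the subscription id is a placeholder:

```python
# Build the UsageAggregates request URL from its parameters.
# "your-subscription-id" is a placeholder; usage_url is an illustrative helper.
from urllib.parse import urlencode

def usage_url(subscription_id: str, start: str, end: str,
              granularity: str = "Daily", show_details: bool = False) -> str:
    base = ("https://management.azure.com/subscriptions/"
            f"{subscription_id}/providers/Microsoft.Commerce/UsageAggregates")
    params = {
        "api-version": "2015-06-01-preview",
        "reportedStartTime": start,
        "reportedEndTime": end,
        "aggregationGranularity": granularity,
        "showDetails": str(show_details).lower(),
    }
    return f"{base}?{urlencode(params)}"

url = usage_url("your-subscription-id",
                "2014-05-01T00:00:00+00:00", "2017-04-01T00:00:00+00:00")
print(url)
```

Send the resulting URL with an `Authorization: Bearer <token>` header, as with any Azure Resource Manager request.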


2) Resource RateCard (Preview)

Get price and metadata information for resources used in an Azure subscription- https://msdn.microsoft.com/en-us/library/azure/mt219004.aspx

Sample working query:-
https://management.azure.com/subscriptions/your-subscription-id/providers/Microsoft.Commerce/RateCard?api-version=2016-08-31-preview&$filter=OfferDurableId eq 'MS-AZR-0063P' and Currency eq 'USD' and Locale eq 'en-US' and RegionInfo eq 'US'


PS: This assumes you have a valid bearer token for authorization.

https://docs.microsoft.com/en-us/rest/api/index?redirectedfrom=MSDN#send-the-request

2017-04-28 Posted by | Azure, Billing, PaaS | | Leave a comment

[Azure Functions] Intro to Serverless applications in Azure

“Serverless computing” is one of the most cited buzzwords in cloud computing, after AI/ML, microservices, IoT, and AR/VR/MR. Almost every tech conference has at least one session on the topic. Serverless computing has been around for some time, but it has recently been adopted and used in conjunction with various cloud-born solutions because of its ease.

So what is serverless computing? It is the next generation of “true” PaaS compute, also called the next layer of abstraction in cloud compute. In cloud computing terms it is called function as a service (FaaS), aka event-driven computing, where our code is invoked on demand. Studies predict that in the future (2022) we will be buying bunches of functions to meet most simple requirements. In the serverless model, the cloud provider fully manages the underlying infrastructure, so as developers we focus only on our requirements and integration points. The idea is to break our application into siloed functions, such as add, update, validate, send an email, or delete some backend records, triggered by events submitted over REST API endpoints (on demand).

Background? AWS was the first to introduce this model of computing, with AWS Lambda in 2014. Within two years, Microsoft (Azure Functions), IBM (OpenWhisk), Google (Google Cloud Functions), and even mid-tier providers had their own offerings. At the current rate, we wouldn’t be surprised to see FaaS become the main tool for getting done the “many” disjointed computing tasks around our mainstream applications: anything from a simple validation to complex log analysis or job submission, invoked from any device, platform, or language. Lots of cool work happening in this space (tooling, continuous delivery, source control integration) gives us this confidence.

Is it microservices? It may look similar to microservices, but it is not. In microservices, we can group multiple such services to form a single service; in serverless computing, our code executes in isolation (silos), autonomously, and at a granular level.

Scenarios? It can be used for log analysis, IoT back ends, cognitive bots, mobile back ends, and REST APIs, such as querying a weather forecast on a button click. The most common example is a user uploading an image, which is processed (resized, cropped, etc.) to the size required for the reading device type.

Benefits? The main advantages it brings are cost, scalability, a faster development cycle, simpler tooling setup, and freedom from infrastructure maintenance. You pay only for the milliseconds of compute used while the function runs, not for a whole instance.

What does Azure offer in this space (serverless architectures)? Azure Functions allows developers already working with the Azure platform to integrate functions easily into their existing Azure-based workflow. It is a serverless compute service that executes small pieces of code on demand. It provides more or less the same features as AWS Lambda and Google Cloud Functions; this Stack Overflow page summarizes the differences in detail. The best part about this service is that it exposes “Functions as a Service” over HTTP endpoints, which can also be triggered by timer-based events or external service events, such as a blob uploaded to storage or data inserted in a backend. It scales based on the plan the customer selects. By the way, Azure Functions is built on top of App Service and the WebJobs SDK.
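To make the HTTP-endpoint idea concrete, here is a minimal sketch of an HTTP-triggered function in the C# script (run.csx) style, based on the classic quickstart template; it only runs inside the Azure Functions runtime, and the greeting logic is just illustrative.

```
// run.csx - minimal HTTP trigger sketch (needs the Azure Functions runtime)
using System.Net;
using System.Linq;

public static HttpResponseMessage Run(HttpRequestMessage req, TraceWriter log)
{
    log.Info("HTTP trigger fired.");

    // Read the "name" query string parameter, if supplied
    string name = req.GetQueryNameValuePairs()
        .FirstOrDefault(q => string.Compare(q.Key, "name", true) == 0)
        .Value;

    // Respond over the same HTTP endpoint that triggered us
    return name == null
        ? req.CreateResponse(HttpStatusCode.BadRequest, "Pass a name on the query string")
        : req.CreateResponse(HttpStatusCode.OK, $"Hello {name}");
}
```

Calling the function's URL with `?name=Tailwind` would return "Hello Tailwind"; the platform handles hosting, scaling, and the HTTP plumbing.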

Any other serverless service from Azure? Azure Functions and Azure Logic Apps both offer serverless compute. The key difference is that Logic Apps doesn’t provide a way to write custom code.

Let us see a sample Azure Functions usage in a real-world use case.


This example is borrowed from my Pluralsight training video to demonstrate the power of Azure Functions. In this demo, the user uploads an image to Azure Storage; the “BlobTrigger” event invokes the Azure Function, which then submits the image to the Cognitive Services Emotion API.

Step 1: Create a Cognitive Services Emotion API resource from the Azure portal and keep the key handy. https://www.microsoft.com/cognitive-services/en-us/Emotion-api/documentation


Step 2: Create a new storage account with a container named “images”; this is where we will drop the images for the Cognitive service to fetch.


Step 3: Go ahead and create a function, say “imageprocessingdemo”, by selecting the “BlobTrigger-CSharp” template, and add the code below.


Step 4: Enter the image path and the storage account details from which the images will be pulled after the trigger event.


 

Step 5: Please note that Azure Functions compiles your code when you hit Save. So, upon “Run”, we can see the uploaded image’s score in the log:

2017-03-18T21:39:32.590 GetUri: https://functionimages.blob.core.windows.net/images/crying-baby%5B1%5D.jpg?sv=2015-12-11&sr=b&sig=ZBktU%2F4T60SfgeTgTpQXF%2FF%2xxxxxNP93YN1U%3D&st=2017-03-18T21:09:32Z&se=2017-03-18T22:09:32Z&sp=r
2017-03-18T21:39:38.197 results: [{"faceRectangle":{"height":967,"left":1560,"top":653,"width":967},"scores":{"anger":1.18772381E-09,"contempt":1.18056471E-08,"disgust":3.18779553E-06,"fear":7.162184E-08,"happiness":5.468896E-10,"neutral":1.40484147E-07,"sadness":0.9999966,"surprise":1.34302474E-11}}]
2017-03-18T21:39:38.197 Function completed (Success, Id=ce4f2b14-9cee-4148-a111-d38f5d2aaa16)

Code:-

#r "Newtonsoft.Json"
#r "Microsoft.WindowsAzure.Storage"

using System;
using System.Net.Http;
using System.Text;
using System.Threading.Tasks;
using Microsoft.WindowsAzure.Storage.Blob;
using Newtonsoft.Json;

public static async Task Run(CloudBlockBlob myBlob, string name, TraceWriter log)
{
    log.Info($"C# Blob trigger function processed blob\n Name: {name}");

    // Generate a short-lived, read-only SAS URI for the uploaded blob
    var uri = GetUri(myBlob);
    log.Info($"GetUri: {uri}");

    // Submit the image URL to the Emotion API and log the raw JSON result
    var analysis = await GetAnalysis(uri);
    log.Info($"results: {analysis}");
}

public static Uri GetUri(CloudBlockBlob blob)
{
    // Read-only SAS token valid for 30 minutes either side of now
    var sasPolicy = new SharedAccessBlobPolicy
    {
        Permissions = SharedAccessBlobPermissions.Read,
        SharedAccessStartTime = DateTime.Now.AddMinutes(-30),
        SharedAccessExpiryTime = DateTime.Now.AddMinutes(30)
    };
    var sasToken = blob.GetSharedAccessSignature(sasPolicy);
    return new Uri($"{blob.Uri}{sasToken}");
}

public static async Task<string> GetAnalysis(Uri uri)
{
    var client = new HttpClient();
    var data = JsonConvert.SerializeObject(new { url = uri.ToString() });
    var request = new HttpRequestMessage
    {
        RequestUri = new Uri("https://api.projectoxford.ai/emotion/v1.0/recognize"),
        Method = HttpMethod.Post
    };

    request.Headers.Add("Ocp-Apim-Subscription-Key", "e538a0xxfexxxxc6f41965894");
    request.Content = new StringContent(data, Encoding.UTF8, "application/json");

    var response = await client.SendAsync(request);
    return await response.Content.ReadAsStringAsync();
}

Bindings:-

{
  "bindings": [
    {
      "name": "myBlob",
      "type": "blobTrigger",
      "direction": "in",
      "path": "images/{name}",
      "connection": "functionimages_STORAGE"
    }
  ],
  "disabled": false
}

A few other use cases where Azure Functions can be used:

  • Find and clean invalid data from the backend storage table every 15 minutes (timer based)
  • Transform files uploaded to storage, say convert them to a different format, compress, resize, or insert into data rows (event based)
  • Read a sales data Excel file from OneDrive, analyze it, and create new sheets with charts (SaaS event processing)
  • A loaded web page calls a webhook, which creates an ad based on the user profile (serverless web application architectures)
  • A photo taken on mobile triggers a webhook that stores it in Azure Storage, where Azure Functions scales it to different resolutions (async background processing as a mobile backend)
  • Massage streaming data and transform it into structured data for DB insertion
  • A message sent to a chatbot gets answered (real-time bot messaging)
  • ETL loads (Extract, Transform, & Load)
  • Sending an email reminder after some alert/event, say an insurance renewal
  • Automated billing reminders
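For the timer-based scenarios above, the schedule is a six-field CRON expression (seconds first). A function.json sketch for the 15-minute cleanup job might look like the following; the binding name "myTimer" is illustrative.

```
{
  "bindings": [
    {
      "name": "myTimer",
      "type": "timerTrigger",
      "direction": "in",
      "schedule": "0 */15 * * * *"
    }
  ],
  "disabled": false
}
```

Here "0 */15 * * * *" fires at second 0 of every 15th minute; the function body then receives the timer payload the same way the blob demo receives its blob.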

Reference:-

https://functions.azure.com

https://tryappservice.azure.com

https://crontab.guru/  (useful for generating a “Schedule” for timer-based triggers; don’t forget to add the seconds field)

https://www.troyhunt.com/azure-functions-in-practice/ 

http://mikhail.io/2017/03/azure-functions-as-facade-for-azure-monitoring/

Let me know if you have used Azure Functions for your interesting scenario.

2017-03-19 Posted by | .NET, Azure Functions, C#, PaaS, Storage | | 1 Comment
