Azure Solutions

Core Solutions Available in Azure

  • Serverless Computing
    • Azure Functions
    • Logic Apps
  • AI
    • Azure Machine Learning
    • Cognitive Services
    • Azure Bot Service
  • IoT
    • Azure IoT Hub
    • IoT Central
    • Azure Sphere
  • Big Data
    • Azure Synapse Analytics
    • HDInsight
    • Azure Databricks
  • DevOps
    • Azure DevOps
    • GitHub and GitHub Actions
    • Azure DevTest Labs

Serverless Computing Solutions

  • Azure Functions
  • Azure Logic Apps

To recap, serverless computing enables you to run code without deploying and managing a server to host it. Although ultimately there is a server running your code, that server is abstracted, meaning you have no visibility into it and no ability to interact with it directly. Instead, you focus solely on the code and its function. Beyond simplifying management of the solution, Azure handles scaling when needed, and you pay only for the resources your code actually uses.

The term serverless is a misnomer; there are still servers and an infrastructure to execute your code or workflow on. The concept is about not needing to create, manage, or pay for the underlying infrastructure so that you can execute your code or workflow. The term serverless just means that Microsoft abstracts this entirely (removes it from your concern and control) and provides this cloud-hosted serverless code and workflow execution environment to the customer as a service.

In the diagram, the focus has shifted away from infrastructure, compute, and runtimes so that you can concentrate on the code and logic. The lower layers' needs for runtimes, compute, and infrastructure are not your concern and not under your control; they have been abstracted (that is, the requirement is removed from you and provided back as a service by the platform provider).

Serverless positioning

Azure Functions

The Azure Functions service enables you to host a single method or function that runs in response to an event.

Azure Functions is an event-triggered serverless compute service that’s provided as a cloud-hosted development Platform-as-a-Service (PaaS).

In a nutshell, this means that an event such as uploading a file, a web request (HTTP trigger), a schedule being met, and so on can be used as a trigger to execute small pieces of code; we call this a code-first (imperative) development approach.

Generally, an Azure function is stateless, meaning it does not store its state from execution to execution. Instead, it executes the same every time it responds to an event.
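
The event-triggered, stateless idea behind Azure Functions can be sketched in plain Python. This is illustrative only, not the real Azure Functions programming model; the `on` decorator and `dispatch` function are hypothetical stand-ins for the platform's trigger bindings and event routing.

```python
# Illustrative sketch only: mimics the event-trigger model, not the
# actual Azure Functions SDK. All names here are made up.

handlers = {}  # maps an event type to the function that handles it

def on(event_type):
    """Register a handler for an event type (stands in for a trigger binding)."""
    def register(fn):
        handlers[event_type] = fn
        return fn
    return register

@on("http_request")
def hello(payload):
    # Stateless: the output depends only on the incoming event payload.
    return f"Hello, {payload.get('name', 'world')}!"

@on("blob_uploaded")
def process_upload(payload):
    # Another trigger type: react to a file upload event.
    return f"Processing {payload['filename']}"

def dispatch(event_type, payload):
    """The platform's job: route an incoming event to the right function."""
    return handlers[event_type](payload)
```

For example, `dispatch("http_request", {"name": "Azure"})` returns `"Hello, Azure!"`; each call is independent of every previous call, which is what "stateless" means here.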

The following methods can be used to create functions:

  • The Azure portal
  • Microsoft Visual Studio and Visual Studio Code
  • The Azure Command-Line Interface (CLI)

Serverless code execution platform

Azure Logic Apps

Azure Logic Apps is an event-triggered serverless workflow (and orchestration) service that’s provided as a cloud-hosted development PaaS.

In a nutshell, this means that it reacts to an event much like Azure Functions does, but instead of triggering code execution, Azure Logic Apps triggers a workflow. A workflow is a set of conditional task actions (operations) and can act as an integration orchestration service for these business processes.
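
A workflow as "a set of conditional task actions" can be sketched in a few lines of Python. This is illustrative only, not the Logic Apps designer or its connectors; the `run_workflow` helper and the order-approval steps are hypothetical.

```python
# Illustrative sketch only: a workflow as an ordered list of
# (condition, action) steps, not real Azure Logic Apps syntax.

def run_workflow(steps, context):
    """Run each (condition, action) step in order; skip steps whose
    condition is not met. Actions may update the shared context."""
    for condition, action in steps:
        if condition(context):
            action(context)
    return context

# Hypothetical business process: flag large orders for manual approval.
steps = [
    (lambda ctx: ctx["amount"] > 1000,
     lambda ctx: ctx.update(needs_approval=True)),
    (lambda ctx: not ctx.get("needs_approval"),
     lambda ctx: ctx.update(status="auto-approved")),
]

result = run_workflow(steps, {"amount": 1500})
```

A small order (say, `{"amount": 100}`) would skip the first step and be auto-approved by the second, showing how the conditions steer the flow.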

Like Functions, Logic Apps are serverless in that you focus on the app rather than the underlying server or resources needed to maintain and run the app.

Functions and Logic Apps are both priced based on consumption, like most Azure services. Azure Functions is priced on the number of function executions and the running time of each; Azure Logic Apps is priced on the number of executions and the type of connectors the app uses.

Serverless workflow/orchestration platform
Serverless use case scenario

Artificial Intelligence Solutions

  • Azure Machine Learning
  • Cognitive Services
  • Azure Bot Service

Artificial Intelligence (AI) is the ability of a computer to imitate intelligent human behavior. AI allows computers to make predictions based on data, analyze images, take actions, recognize speech and text, and interact naturally.

The following are some important terminologies to understand regarding AI:

  • Artificial Intelligence (AI): This is the broad term given to describe the ability to mimic human intelligence by a computer; it can be applied to any app that reasons, senses, acts, and adapts behaviors or outcomes.
  • Machine Learning (ML): This is the ability for computers to improve at tasks through experience (learning); this is known as training and it will produce a model that can be deployed and is said to be trained. ML is based on algorithms, whose output improves over time as they are fed more data to analyze and learn from; these are known as training datasets. It is a subset of AI.
  • Deep Learning (DL): This is the ability for a computer to train itself to perform a task; this is based on multi-layered neural networks and they can achieve this self-learning from vast amounts of data. It is a subset of ML.

Artificial intelligence inter-relations

No matter which approach we take to creating AI, there are ethical and legal considerations we must understand; there needs to be a governance framework that controls how AI is created and used. Microsoft has six AI principles: Fairness, Reliability and Safety, Privacy and Security, Inclusiveness, Transparency, and Accountability.

Microsoft AI platform services

Azure Machine Learning

In a nutshell, Azure Machine Learning (ML) is Microsoft’s fully managed, cloud-based ML environment for making predictions from data. It comprises a collection of services and tools that allow you to train, deploy, and manage ML models at enterprise scale.

The Azure ML service provides capabilities for building and deploying models while having complete control of the design of the algorithm, training, and your data.

The models can be trained and deployed at scale using web interfaces and Software Development Kits (SDKs). In addition, open source technologies can be used, including Python frameworks such as PyTorch, TensorFlow, and scikit-learn; Azure ML also supports Python and R.

Azure Machine Learning consists of a collection of Azure services and tools that enable you to use data to train and validate models. Through testing, you determine the model that provides the most accurate predictions. Then you can deploy the model for use through a web API endpoint. Azure Machine Learning provides a number of features that enable you to define how to obtain and manage data, train and validate predictive models, manage the process and resources for scoring your algorithms, and deploy the final model to an API endpoint, where it can be used in real time by other applications.
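
The train, validate, deploy loop described above can be shown end to end with a toy model. This is illustrative only: the threshold "model" and the `train`/`validate`/`deploy` helpers are hypothetical stand-ins for a real Azure ML training pipeline and web API endpoint.

```python
# Illustrative sketch only: the train -> validate -> deploy flow,
# using a trivial threshold classifier instead of a real ML framework.

def train(data):
    """'Training': pick the threshold midway between the classes."""
    positives = [x for x, label in data if label == 1]
    negatives = [x for x, label in data if label == 0]
    return (min(positives) + max(negatives)) / 2  # the trained "model"

def validate(model, data):
    """Score the model on held-out data; returns accuracy in [0, 1]."""
    correct = sum(1 for x, label in data if (x >= model) == (label == 1))
    return correct / len(data)

def deploy(model):
    """'Deployment': wrap the model as a scoring endpoint (a callable)."""
    return lambda x: int(x >= model)

train_data = [(0.1, 0), (0.2, 0), (0.8, 1), (0.9, 1)]
test_data = [(0.3, 0), (0.7, 1)]

model = train(train_data)
if validate(model, test_data) >= 0.9:      # only deploy an accurate model
    endpoint = deploy(model)               # ready to serve predictions
```

The gate before `deploy` mirrors the point made above: testing determines which model predicts accurately enough to publish behind an API endpoint.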

Azure Machine Learning solution approach

Azure Cognitive Services

Azure Cognitive Services is a fully managed, cloud-based Microsoft platform of pre-built AI models; it simplifies and speeds up the creation and deployment of AI solutions.

Azure Cognitive Services provides pre-built ML models that allow developers to add speech, hearing, understanding, and reasoning to their apps while also bringing in live data. This differs from Azure ML, which requires you to create and train models on your own data.

Azure Cognitive Services comprises the following:

  • Vision: This service provides identification and recognition capabilities for analyzing pictures, video, and other visual content.
  • Speech: This service provides text-to-speech and speech-to-text conversion, language translation, and speaker recognition and verification.
  • Language: This service provides pre-built scripts for processing natural language, key phrase extraction, sentiment analysis, and user recognition.
  • Decision: This service provides personal recommendations, content moderation, and abnormality detection.

Azure Bot Service

Azure Bot Service enables you to create and use virtual agents to interact with users by answering questions, gathering information, and potentially initiating activities through other Azure services. Azure Bot Service can use other services such as Cognitive Services to understand what users are asking and respond accordingly.

Essentially, a bot is a software program that is designed to perform a particular automated task. Bots use AI to learn human activity and behaviors and interact through speech and text. The most common example of this is a chatbot, which uses conversational AI to help engage with customer service scenarios.

Azure Bot Service is provided as a PaaS service so that it can be quickly adopted with no need to create any underlying compute or infrastructure. Being built on top of Azure App Service, it has built-in capabilities regarding portal UI-driven simplicity, scale, and automation.

Internet of Things (IoT)

IoT is a technology solution in which intelligent devices (things) equipped with sensors collect and send data to a cloud platform for analysis, so that action can be taken based on the resulting insights.

The three core elements of an IoT solution are collecting the data, processing the data, and taking action on the data.

  • Things: The physical things that have embedded sensors that, when connected to the internet, can send telemetry (values) data to a cloud platform for analysis and action.
  • Insights: The results from analyzing the received data (values) from the things.
  • Actions: The responses to the insights. These can be manual or automated. For example, the actions could be to change the device’s settings, update an inventory or metrics dashboard, trigger an intervention such as scheduling a site visit/appointment with a professional, sending a part, ordering an inventory item, and so on.
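
The three elements above can be wired together in a miniature pipeline. This is illustrative only; the device names, the 75-degree threshold, and the maintenance action are all made up for the example.

```python
# Illustrative sketch only: things -> insights -> actions in miniature.
# Device names and the temperature threshold are hypothetical.

def insights(telemetry):
    """Analyze the received values: flag devices running too hot."""
    return [device for device, temp in telemetry.items() if temp > 75]

def actions(hot_devices):
    """Respond to the insights, e.g. schedule an intervention per device."""
    return [f"schedule maintenance visit for {device}" for device in hot_devices]

telemetry = {"pump-1": 71, "pump-2": 82}   # values sent by the "things"
result = actions(insights(telemetry))
```

Here only `pump-2` exceeds the threshold, so only it produces an action, showing how raw telemetry narrows to insights and then to responses.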

Azure IoT solutions

  • Azure IoT Hub: Build your own IoT services.
  • Azure IoT Central: Consume IoT services.
  • Azure Sphere: Secure IoT services.

Azure IoT has a collection of solutions that allow an organization to take actions based on valuable insights that have been captured from data that’s been received from connected devices. These solutions can help address the most common IoT challenges of cost, complexity, and security.

The Azure IoT technology portfolio allows computational sensor devices (things) to send recorded values (transmit telemetry) to the Azure computing platform for processing, where actions can be taken on the received data and visualizations can be presented.

  1. Devices: These register with the Azure IoT platform and send telemetry data.
  2. Cloud gateway: The cloud gateway securely ingests the received data and provides device management capabilities.
  3. Stream processors: These consume the telemetry data that’s sent by the gateway. They integrate with the business processes and will take action based on the insights; they place data into storage.
  4. Visualization interface: Users will interact with this interface to visualize the data; it provides easy device management.

Azure IoT reference architecture

Azure IoT Hub

A PaaS cloud-hosted Azure solution.

Azure IoT Hub is an Azure‐hosted service that functions as a message hub for bidirectional communication between your deployed IoT devices and Azure services.

IoT devices are added to IoT Hub, and you can then manage them, monitor them, and send messages to them, either individually or to groups that you create. You can add up to 1,000,000 IoT devices to a single IoT Hub.

From IoT Hub, you can send messages to devices (called cloud-to-device, or C2D messaging) or from your device to IoT Hub (called device-to-cloud, or D2C messaging). You can also intelligently route messages to Event Hub, Azure Storage, and Service Bus based on the content in the message.
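
The idea of routing messages by their content can be sketched as a list of rules, each pairing a predicate with a destination. This is illustrative only, not the real IoT Hub routing-query syntax; the rule predicates and endpoint names are hypothetical.

```python
# Illustrative sketch only: content-based message routing, not the
# actual IoT Hub routing-rule language. Endpoint names are made up.

routes = [
    # (predicate over the message body, destination endpoint name)
    (lambda msg: msg.get("alert") is True, "service-bus"),
    (lambda msg: msg.get("type") == "telemetry", "event-hub"),
]

def route(msg, default="storage"):
    """Send the message to the first endpoint whose rule matches;
    unmatched messages fall through to the default endpoint."""
    for predicate, endpoint in routes:
        if predicate(msg):
            return endpoint
    return default
```

So an alert lands in the service bus queue, ordinary telemetry goes to the event hub, and anything else falls through to storage, mirroring how routes fan D2C messages out to different Azure services.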

When you add a new IoT device, IoT Hub creates a connection string that uses a shared access key for authentication. This key prevents unauthorized access to your IoT Hub. Once connected, messages between your device and IoT Hub are encrypted for additional security.

In addition to messages, you can also use IoT Hub to send files to your devices. This allows you to easily update the firmware on your devices in a secure way. To update the firmware on an IoT device, you simply copy the firmware to the device. The device will detect the firmware and will reboot and flash the new firmware to the device.

One important concept in IoT Hub is the device twin: every IoT device in IoT Hub has a logical equivalent, stored in IoT Hub in JSON format.

There are two pricing tiers for IoT Hub:

  1. Basic
  2. Standard

Each tier offers multiple editions, priced by the number of messages per day per IoT Hub unit. When you scale an IoT Hub, you add additional units, which adds the ability to handle more messages at an increased price.

For example, assume that ContosoPharm's messaging needs are approximately 5,000,000 messages per day. They would choose the S2 pricing tier and pay $250 US per month if they are running one IoT Hub unit in the North America geography. If the number of messages increases to 8,000,000 (either because of configuration changes or the addition of more IoT devices), they would likely scale to two IoT Hub units. Doing so would give them 12,000,000 messages per day at a cost of $500 US per month.
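
The scaling arithmetic in that example follows directly from the per-unit figures it gives (an S2 unit handles 6,000,000 messages per day at $250 US per month). A small sketch, using only those example numbers; check current Azure pricing before relying on them:

```python
# Capacity/cost arithmetic from the ContosoPharm example above.
# The per-unit figures are taken from that example, not a price sheet.
import math

S2_MSGS_PER_UNIT = 6_000_000   # messages/day per S2 unit (from the example)
S2_PRICE_PER_UNIT = 250        # USD per month per unit (from the example)

def s2_capacity_plan(messages_per_day):
    """Return (units needed, monthly cost in USD) at the S2 tier."""
    units = math.ceil(messages_per_day / S2_MSGS_PER_UNIT)
    return units, units * S2_PRICE_PER_UNIT

print(s2_capacity_plan(5_000_000))   # (1, 250)
print(s2_capacity_plan(8_000_000))   # (2, 500)
```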

Note: You cannot change to a lower pricing tier after you create your IoT Hub. If you create your IoT Hub in the Standard tier, you cannot change it to the Basic tier. If you create an IoT Hub in the Standard tier using the S1, S2, or S3 edition, you cannot change it to the free edition.

IoT Hub does not provide analysis services or dashboards for viewing device state or analyzing data. That is where IoT Central comes in.

Azure IoT Central

IoT Central is an IoT application platform as a service (aPaaS) that reduces the burden and cost of developing, managing, and maintaining enterprise-grade IoT solutions.

IoT Hub is a great way to manage and provision devices, and it provides a robust means of dealing with messages. You can even use Azure Stream Analytics to route messages to Power BI for a near real-time dashboard of device messages, but doing that requires a bit of complex configuration. If you’re looking for a first-class experience in monitoring IoT devices without having to do complex configuration, IoT Central is a good choice. If you choose to build with IoT Central, you’ll have the opportunity to focus time, money, and energy on transforming your business with IoT data, rather than just maintaining and updating a complex and continually evolving IoT infrastructure.

Through IoT Central’s interface, you can easily connect new devices, view telemetry, view overall device performance, create and manage alerts that notify you when a device needs maintenance, and push updates to devices when needed.

Azure Sphere

Azure Sphere is an end-to-end IoT security solution for scenarios where the security of telemetry data and communications is critical.

There are three components to Azure Sphere:

  1. Hardware: This is the Azure Sphere microcontroller unit (MCU), which hosts the OS and receives the sensor data.
  2. OS: This is a customized Linux OS and runs the security service.
  3. Security Service: This ensures that the device has not been compromised. The device must always authenticate before sending data. The security service checks that the device has not been tampered with before providing a secure communication channel that can push any updates to the device. Once Azure Sphere has validated the device and ensured any updates are available, it allows the device to send data and receive instructions for the settings that need to be updated.

Although you can build a complete IoT solution using just IoT Hub and IoT Central, Azure Sphere gives you the ability to create a custom, highly secure IoT solution.

Big Data and Analytics Solutions

  • Azure Synapse Analytics
  • HDInsight
  • Azure Databricks

We should first understand what we mean by big data and analytics. In a nutshell, it is about discovering information hidden in data, surfacing actionable information that helps an organization make informed decisions and, depending on the context, gain a competitive advantage.

The challenge is that traditional Data Warehouse (DW) solutions and technologies cannot handle the massive volumes of complex, unstructured data that characterize big data. This makes the traditional DW approach unsuitable; a cloud-scale mindset and toolset are needed.

There are four types of analytics techniques for getting insights out of data; some are based on traditional business intelligence (BI) analysis techniques, while others are based on AI (machine learning) techniques.

There is a very close relationship between BI and AI. BI uses analytics services, which can be considered a suite of business analytics tools for analyzing data and sharing insights, but what value do they provide? And what do we do with those insights?

This is where AI comes into the picture; AI refers to the ability to analyze large quantities of data, learn from the results, and then use this knowledge (insights) to optimize and change future processes, systems, and so on. It is often said that the most challenging part of AI isn’t AI… it’s data!

It is also said that you should not AI before you BI. BI is often the bridge between AI and the data.

The following are the characteristics you should consider for big data:

  • Quantity (volume) of data arriving (ingesting): This is often on the petabytes scale.
  • Speed (velocity) of data arriving (ingesting): This could be near/real-time streamed data, a time-framed schedule, or batch data.
  • Age (validity) of data arriving (ingesting): This data could have a life cycle that means the data value is no longer valid. This is critical for decisions or actions based on a value.
  • Format (variety) of data arriving (ingesting): This could be structured, semi-structured, or unstructured data.

Traditional database systems and data stores cannot handle data with the characteristics in the preceding list. To move beyond the limitations of the traditional DW toward more intelligent and actionable insights, we need a paradigm shift in both mindset and technology: a modern architecture for ingesting, processing, analyzing, and visualizing data so that we can act on it.

Big data is more than just the volume of data; when we combine the velocity of the data, its complexity, and the format (being unstructured), we get the term big data.

Big data architecture:

  • Data Sources: Databases, log files, file stores, IoT devices, and social media
  • Data Storage: Data lakes and blob storage
  • Real-Time Message Ingestion: Azure Event Hubs, Azure IoT Hubs, and Kafka
  • Batch Processing: HDInsight and Databricks
  • Stream Processing: Azure Stream Analytics and HDInsight
  • Analytical Data Store: Azure Synapse Analytics, SQL Data Warehouse, and HDInsight
  • Analysis and Reporting: Azure Synapse Analytics, Azure Analysis Services, Power BI, and Excel
  • Orchestration: Data Factory

Three Azure big data and analytics services:

  1. Azure Synapse Analytics
  2. Azure HDInsight
  3. Azure Databricks

Big data and analytics architecture
Azure data landscape

Azure Synapse Analytics

PaaS Solution
Analytics service that brings together enterprise data warehousing and Big Data analytics.
It gives you the freedom to query data on your terms, using either serverless on-demand or provisioned resources, at scale.

Big data means more data than you can analyze through conventional means within the desired timeframe. Analyzing big data requires a powerful system for storing data, the ability to query the data in multiple ways, enormous power to execute large queries, assurance that the data is secure, and much more. That’s exactly what Azure Synapse Analytics provides.

Note: While it’s true to say that Azure Synapse is the replacement for SQL Data Warehouse, it’s also important to note that Azure Synapse adds much more functionality.

SQL Data Warehouse was focused primarily on storage of big data (called warehousing), but Azure Synapse provides that functionality in addition to powerful analytics features.

Azure Synapse Analytics is intended for large-scale modern Data Warehouse and analytic scenarios with petabytes of data, with complex queries running against it.

Azure Synapse runs in an Azure Synapse cluster. A cluster is a combination of four different components:

  1. Synapse SQL
  2. Apache Spark integration
  3. Data integration of Spark and Azure Data Lake Storage
  4. A web-based user interface called Azure Synapse Studio

Synapse SQL is the data warehousing portion of Azure Synapse. Using Synapse SQL, you can run powerful queries against your big data.

Many consumers of big data use a third-party big data processing engine called Apache Spark, and Azure Synapse tightly integrates with the Spark engine. Spark features are automatically incorporated into Azure Synapse when you create a cluster.

Azure Synapse integrates Apache Spark functionality with Azure Data Lake Storage. Azure Data Lake Storage is designed for storing large amounts of data that you'd like to analyze, accommodating a wide array of data rather than just relational data. In a data lake, data is stored in containers, each typically holding related data.

Azure Synapse makes it easy to analyze data and manage your data with a web-based portal called Azure Synapse Studio. Once you’ve created your Azure Synapse workspace, you simply click a button to launch Synapse Studio, and from there, you can easily manage and analyze your data.

Azure Synapse Analytics solution architecture

Azure HDInsight

A cloud-based service from Microsoft for big data analytics that helps organizations process large amounts of streaming or historical data.

HDInsight makes it possible to easily create and manage clusters of computers on a common framework designed to perform distributed processing of big data. Essentially, HDInsight is Microsoft’s managed service that provides a cloud-based implementation of a popular data analytics platform called Hadoop. However, it also supports many other cluster types.

Although the clusters primarily run Hadoop as a managed service, there is also cluster support for the HBase, Storm, Spark, Interactive Query, R Server, and Kafka open source technologies.

Hadoop: An open-source framework for storing data and running applications on clusters. It offers massive storage for any kind of data and enormous processing power, and it can handle virtually limitless concurrent tasks. The project is maintained as Apache Hadoop.

HDInsight supported cluster types:

  • Hadoop
  • HBase: Extremely fast and scalable NoSQL database
  • Storm: Fast and reliable processing of unbounded streams of data in real time
  • Spark: Extremely fast analytics using in-memory cache across multiple operations in parallel
  • Interactive Query: In-memory analytics using Hive and LLAP (processes that execute fragments of Hive queries)
  • R Server: Enterprise-level analytics using R, a language that’s specialized for big data analytics
  • Kafka: Extremely fast processing of huge numbers of synchronous data streams, often from IoT devices.

The following are some examples of how HDInsight is being used in the real world:

  • Telecoms: Churn prediction, market offers, pricing call detail records (CDRs), network monitoring, demand provisioning, and optimizations.
  • Financial Services: Customer 360 and fraud detection.
  • Health Care: Clinical trial selection, patient mining, vaccine effectiveness, personalized medication, and health plans.
  • Industry: Predictive maintenance, supply chain, stock control, and inventory optimization.
  • Utility: Demand prediction and smart metering.

Building your own cluster is time-consuming and often difficult unless you have previous experience. With HDInsight, Microsoft does all the heavy lifting on its own infrastructure. You benefit from a secure environment—one that is easily scalable to handle huge data processing tasks.

An HDInsight cluster performs analytics by breaking up large data blocks into segments that are then handed off to nodes within the cluster. The nodes then perform analytics on the data and reduce it down to a result set. All this work happens in parallel so that operations are completed dramatically faster than they would be otherwise. By adding additional nodes to a cluster, you can increase the power of your analytics and process more data even faster.
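
The split, process-in-parallel, reduce pattern described above can be shrunk to a single machine for illustration. This is a sketch only: a real HDInsight cluster distributes segments across separate nodes, whereas here threads stand in for nodes and the per-node "analytics" is just a sum.

```python
# Illustrative sketch only: split -> parallel process -> reduce, with
# threads standing in for cluster nodes.
from concurrent.futures import ThreadPoolExecutor

def analyze(segment):
    """Per-node work: reduce one data segment to a partial result."""
    return sum(segment)

def cluster_sum(data, nodes=4):
    # Break the large data block into one segment per "node".
    size = max(1, len(data) // nodes)
    segments = [data[i:i + size] for i in range(0, len(data), size)]
    # Hand the segments off to the nodes, which work in parallel.
    with ThreadPoolExecutor(max_workers=nodes) as pool:
        partials = list(pool.map(analyze, segments))
    return sum(partials)  # final reduce step to a single result set
```

Adding more workers (nodes) lets the same total data be processed in smaller segments concurrently, which is the scaling behavior the paragraph above describes.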

Azure Databricks

You can think of Azure Databricks as Spark-as-a-Service. Azure Databricks is an Apache Spark-based big data analytics PaaS offering; it provides fully managed Spark clusters. This means there is no need to provision VMs or master and worker nodes; that requirement has been abstracted away and is provided as a managed service.

Spark: A general-purpose distributed data-processing engine that can be used in a wide range of scenarios. It ships with libraries for SQL, machine learning, graph computation, and stream processing. Spark provides only a computation engine, not storage, and it extends the Hadoop MapReduce framework to work in an optimized way.
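
The flavor of Spark's programming model, chained transformations that stay lazy until an action forces evaluation, can be imitated in plain Python. This is a sketch only; no real Spark cluster is involved, and the tiny `Dataset` class is a hypothetical stand-in for Spark's distributed datasets.

```python
# Illustrative sketch only: lazy, chainable transformations in the
# style of Spark, written with ordinary Python generators.

class Dataset:
    def __init__(self, items):
        self._items = items

    # Transformations are lazy: each one just wraps a generator.
    def map(self, fn):
        return Dataset(fn(x) for x in self._items)

    def filter(self, pred):
        return Dataset(x for x in self._items if pred(x))

    # Actions force evaluation, as in Spark.
    def collect(self):
        return list(self._items)

# Nothing is computed until collect() is called on this chain.
evens_squared = Dataset(range(10)).filter(lambda x: x % 2 == 0).map(lambda x: x * x)
```

In real Spark the same chain would be partitioned across the cluster's worker nodes; the laziness is what lets the engine optimize the whole pipeline before running it.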

Databricks: Databricks was founded by the creators of Spark, and the team keeps the Apache Spark engine optimized to run ever faster; the Databricks platform delivers roughly five times the performance of open-source Apache Spark. With Databricks, you get collaborative notebooks, integrated workflows, and enterprise security, all in a fully managed cloud platform.

Data is managed and interacted with through the Databricks workspace, which is a browser-based user interface.

Its design makes it well suited to machine learning solutions that prepare and train on large amounts of data, and it includes the MLlib machine learning library; it also has use cases in IoT solutions and streaming of near-real-time data.

Azure Databricks solution architecture

DevOps Solutions

  • Azure DevOps
  • GitHub / GitHub Actions
  • Azure DevTest Labs

What is DevOps?

DevOps is a software and systems development culture that’s enabled through team collaboration, common processes, and technology to deliver a better aligned and unified outcome.

It was conceived as a mindset more than a toolset; it is as much a cultural shift, breaking down the silos between the Development and Operations teams, as it is a technology. The hardest thing to change in any organization is not the technology but the people and processes.

Microsoft’s definition of DevOps is that “DevOps is the union of people, processes, and technology to enable continuous delivery of value to your end users.”

Development and operations teams can be more effective when collaborating on working toward a common set of aligned goals and automating repetitive tasks. This means systems and software can be created, updated, and deployed much quicker with increased quality, removing cost and adding value at every stage.

A DevOps approach can be achieved by automating each stage of the process as much as possible.

CI/CD is an abbreviation for continuous integration/continuous delivery.

In a nutshell, these are a set of practices and operating principles that form the basis of the DevOps culture. This is to allow Dev and Ops teams to work more collaboratively, align team goals, and release software and systems with greater frequency, control, and quality.

The DevOps process flow

Azure DevOps

Azure DevOps (formerly Visual Studio Team Services (VSTS)) is an integrated code development and deployment platform.

Provided as a Software-as-a-Service (SaaS) platform by Microsoft, it can be accessed and used via the Azure portal; its purpose is to allow software and systems to be created while DevOps practices are being followed.

Azure DevOps is also available as an on-premises solution with Azure DevOps Server.

Azure DevOps provides tools and capabilities to manage complex software projects that require many individuals, often across many teams, and provides planning, tracking, software builds, versioning, testing, deployment, monitoring, and overall management of projects.

Azure DevOps comprises the following five services:

  1. Azure Artifacts: Provides a repository for storing development artifacts such as compiled source code. Artifacts can be used by other services for testing or deployment.
  2. Azure Boards: Provides capabilities for managing development projects and individual items, including user stories, backlog items, tasks, features, and bugs.
  3. Azure Pipelines: Enables you to automatically build and test code projects.
  4. Azure Repos: A source‐code repository for publishing and collaborating on development projects.
  5. Azure Test Plans: Provides an automated testing tool for testing code.

Azure DevOps services

GitHub and GitHub Actions

GitHub is a code hosting platform (repository) for open source software and is provided as an online hosted SaaS tool. Its equivalent Azure DevOps service is Azure Repos.

GitHub Actions is a workflow automation service that lets you automate and orchestrate CI/CD workflows.

GitHub provides more than just a code hosting platform; among other things, it also enables code sharing, development collaboration, review and discussion, documentation, and other collaboration mechanisms for sharing and collaborating on open source code projects. GitHub Actions provides workflow automation services.

GitHub and GitHub Actions offer many of the same functions as Azure DevOps, and these services can integrate. In general, GitHub is the appropriate choice for collaborating on open source projects, and Azure DevOps is the appropriate choice for enterprise/internal projects.

Azure DevTest Labs

Azure DevTest Labs is an environment for automatically creating lab resources from pre-configured base items or ARM templates, all while utilizing the least administrative resources and effort.

This allows VMs and resources for testing and development to be created on demand and then torn down. It integrates with a DevOps CI/CD approach and reduces the administrative effort needed to deploy many machines, even when they have very different configurations and requirements and are needed only for a short duration, such as build testing before releasing code.

Azure DevTest Labs automates the deployment, configuration, and decommissioning of virtual machines and other Azure resources. For example, let’s assume you need to deploy 20 virtual machines of various types in a test subnet with specific network security groups and other resources. You can use DevTest Labs to automate that deployment, and when testing is complete, you can decommission all of those services so that you pay only for the resources you need for testing while you are actually testing them. All of this can be largely automated, greatly simplifying the development testing process.


Internet of Things: The IoT‐related services in Azure enable you to build complex IoT solutions. Azure IoT Hub functions as a hub for bidirectional communication between IoT devices and Azure services. IoT Central builds on IoT Hub to provide visualization, control, and management features for IoT devices. Azure Sphere enables you to deploy and manage IoT solutions through custom devices using a combination of integrated microcontroller units, management software, and certificate‐based security services, ensuring a secure IoT solution tailored to your organization’s needs.

Machine Learning and AI: Azure Machine Learning consists of a collection of Azure services and tools that you use to train and validate AI models. Azure Cognitive Services provides a comprehensive set of machine learning models to execute cognitive functions that humans would normally do, such as recognizing images, performing speech‐to‐text and text‐to‐speech functions, and translating languages. Azure Bot Service enables you to create and use virtual agents to interact with users, answering questions, gathering information, and initiating corresponding activities.

Development and Serverless Computing: With Azure Functions, you can create simple, typically stateless bits of code that execute in response to events. With the addition of a storage account, you can make functions stateful, maintaining their state between executions. Azure Logic Apps, also a serverless computing solution, enables you to build low‐code and no‐code workflow and process solutions using a web‐based design interface.

DevOps features in Azure are designed to simplify code development and collaboration. Azure DevOps Services consists of several services for code management, testing, and deployment. GitHub and GitHub Actions provide many of the same types of features but are targeted at open source projects. Finally, Azure DevTest Labs provides the means for development teams to easily deploy virtual machines and other Azure services for code testing, then quickly and easily decommission those resources when testing is completed.

