Microservice Architecture Demystified

Inspiration for writing this article came after a recent debate with one of my friends about "what is the real meaning of the term microservice architecture". The focus of this article is to demystify the basic concepts and core architectural principles for building microservices. The information is curated and paraphrased from multiple sources. I tried my best to provide references and links to the original sources and authors.

Microservice architecture:

Microservice architecture is a design pattern for building a software system composed of multiple isolated, autonomous, context-bound, reusable services that interact with each other to provide a business capability.
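
To make "one isolated, reusable service per capability" concrete, here is a minimal sketch of a single-capability service. It is a hypothetical example using Flask; the service name, route and data are invented for illustration only.

```python
# Minimal sketch of a single-capability microservice (a hypothetical "pricing" service).
# Assumes Flask is installed; names and data are illustrative only.
from flask import Flask, jsonify

app = Flask(__name__)

# The service owns its own (private) state; no other service touches this store directly.
PRICES = {"sku-1": 9.99, "sku-2": 24.50}

@app.route("/prices/<sku>")
def get_price(sku):
    """Expose the one capability this service provides: price lookup by SKU."""
    if sku not in PRICES:
        return jsonify({"error": "unknown sku"}), 404
    return jsonify({"sku": sku, "price": PRICES[sku]})

if __name__ == "__main__":
    app.run(port=5001)
```

Consumers interact only through the published HTTP interface, never with the service's data store directly.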

Is it the same as SOA or different?

  • Microservice architecture extends core SOA design principles, where a service is defined as loosely coupled, autonomous, composable, reusable, contract-driven, context-bounded and discoverable.

  • It differs from SOA in that it does not carry the misinterpreted concepts associated with SOA, such as ESB, the WS-* vs. REST debate, or building wrappers over legacy systems to expose them as services without re-architecting the application.

  • Check out "SOA is Dead; Long Live Services" by Anne Thomas Manes for a better understanding.


Five Core Principles of a Successful Microservice Architecture:


  • The Single Responsibility Principle (SRP) is THE MOST important principle of microservice architecture. The term "micro" refers to the one and only one capability provided by a service.

  • It promotes high cohesion, i.e. all entities in a microservice are focused on a single task. Each service should have one and only one reason to change.

  • Defining the "scope" or "responsibility" of a microservice is more of an art than science. Domain-Driven Design is a recommended read and an excellent guide to gain more insights on core concepts of microservice architecture.

    Isolation helps in building and maintaining a single capability without impacting other functions. The isolation principle enables:

  • Design, Build, Test and Release isolation - Enables continuous delivery, reduced time to market and time to value as change is isolated to a specific business function.

  • Database isolation - Eliminates integration at the database level. Any change in the database affects only one microservice and thus reduces the surface area for testing, validation and deployment.

  • Failure isolation - Contains the propagation of failures and prevents cascading impact on the whole system.

  • Microservices are autonomous and self-governing; each controls its own destiny. Consumers cannot impose obligations (command and control) on a provider. An autonomous provider service can only promise its own behavior by publishing its interface.

  • A consumer can choose to accept or reject the promise offered by a provider but cannot direct it to behave in a particular way. There is a Google Tech Talk by Mark Burgess on Promise Theory that explains the merits of this principle.

  • A microservice encapsulates data (or state) and behavior as a single unit. Persistent data for each service must be private. Consumers can access data only through the service's published interfaces or API.

  • The distributed monolith is a major anti-pattern in which a bunch of so-called independent services share a common database for persistent state.

  • Isolating state (persistence) and behavior in a microservice opens new avenues for innovation, refactoring and changing a service within its bounded context. Service teams can pick the right tool for the job to solve a specific problem.

  • Polyglot persistence and multiple programming languages can be leveraged within a single system composed of multiple autonomous microservices.

  • Microservices interact with each other to fulfill a user request or perform a business function. Interaction can be synchronous or asynchronous.

  • Synchronous communication blocks the consumer thread until the provider completes processing. In asynchronous communication, the consumer thread does not block or wait for the provider's response; it initiates a request and retrieves the result later using a callback or similar mechanism (a minimal sketch follows this list).

  • Check out "Convenience Over Correctness" to understand the challenges of synchronous message passing for IPC. There is also a great blog post on valid use cases of synchronous messaging.
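
To make the synchronous/asynchronous distinction concrete, here is a minimal sketch using only the Python standard library. The fetch_order function is a hypothetical stand-in for a call to a provider service.

```python
# Sketch: synchronous vs. asynchronous calls to a provider service.
# fetch_order() is a hypothetical stand-in for an HTTP call to another microservice.
import time
from concurrent.futures import ThreadPoolExecutor

def fetch_order(order_id):
    time.sleep(1)                      # simulate the provider's processing / network latency
    return {"order_id": order_id, "status": "SHIPPED"}

# Synchronous: the consumer thread blocks until the provider responds.
order = fetch_order(42)
print("sync result:", order)

# Asynchronous: the consumer initiates the request and picks up the result
# later via a callback, without blocking its own thread.
def on_done(future):
    print("async result:", future.result())

with ThreadPoolExecutor(max_workers=1) as pool:
    future = pool.submit(fetch_order, 43)
    future.add_done_callback(on_done)
    print("consumer keeps working while the provider processes the request")
```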

Summary

Moving from distributed monoliths to true microservices can be challenging due to organizational culture, appetite for innovation, technical skills and understanding of the concepts. Microservice architecture is a discipline. Like any other discipline, it has a set of rules, design patterns and core principles that must be applied for a successful implementation. It is a paradigm shift that requires strict adherence to the core principles of single responsibility, isolation, autonomy, private data and loosely coupled interaction among services.

Cloud Computing:

Cloud computing has helped many enterprises transform themselves over the last five years, but experts agree that the market is entering something of a second wave, both for public cloud and private cloud services built and hosted in corporate data centers. The cloud market will accelerate faster in 2017 as enterprises seek to gain efficiencies as they scale their compute resources to better serve customers, says Forrester Research in a new report.

“The No. 1 trend is here come the enterprises,” says Forrester analyst Dave Bartoletti, primary author of the research. “Enterprises with big budgets, data centers and complex applications are now looking at cloud as a viable place to run core business applications.” Forrester says the first wave of cloud computing was created by Amazon Web Services, which launched with a few simple compute and storage services in 2006. A decade later, AWS is operating at an $11 billion run rate.

Forrester found that 38 percent of 1,000-plus North American and European enterprise infrastructure technology decision-makers said that they are building private clouds, with 32 percent procuring public cloud services and the remainder planning to implement some form of cloud technology in the next 12 months. Also, 59 percent of respondents said they were adopting a hybrid cloud model.

Forrester offered the following cloud predictions for 2017:

1. Regional players complement ‘mega cloud providers’

CIOs who initially elected to build private clouds may find themselves switching to public clouds as they realize just how time-consuming and costly the work will prove. Capital One shuttered private cloud efforts in favor of Amazon Web Services a few years ago, says its CIO Rob Alexander. “We recognized that we were spending a lot of time, energy, effort and management bandwidth to create infrastructure that already exists out there in a much better state and is evolving at a furious pace,” Alexander says.

  • The global public cloud market will top $146 billion in 2017, up from just $87 billion in 2015, and is growing at a 22 percent compound annual growth rate. The lion’s share of this growth will come from Amazon.com, Microsoft, Google and IBM, which have emerged as “mega-cloud providers,” Bartoletti says. They are opening new data centers and making concessions, such as Microsoft’s agreement to have T-Systems manage its cloud in Germany to meet data localization requirements. But the big players won’t be able to service every unique request, which means smaller regional players will see an uptick in adoption in 2017. Bartoletti recommends: “Keep your options open and don’t be afraid to use multiple providers.”

2. Cloud cost containment

One popular theory is that CIOs will save money by investing in public cloud software, but that’s not always the case. The fact that most CIOs leverage multiple cloud providers means enterprises are already waist-deep in complex cloud vendor management. Also, if companies leave public cloud instances running through the weekend when they don’t need them, CIOs can actually spend more money than they did with on-premises solutions.

  • IT executives will get better at containing cloud costs in 2017 as their best practices mature. And it’s already happening. Bartoletti says that a cloud architect for a large software company shaved $300,000 off of a $2.5 million cloud bill by monitoring his consumption. And cost management tools from the likes of AWS, Cloudability and Cloudyn are available. “There’s no reason in 2017 for your cloud costs to grow out of control,” Bartoletti says.

3. Lift and shift those cloud apps

Forrester also recommends that companies refactor apps to run on public cloud systems, leveraging migration services, rather than simply dumping existing apps into a public cloud. The optimal way to move an application is to rewrite it to take advantage of the cloud’s elasticity, although such refactoring is more costly than a simple lift-and-shift migration. “In 2017, lift-and-shift migration tools will accelerate the rate of cloud migration, given their low cost for bulk application migrations,” Bartoletti says.

4. Hyperconverge your private cloud

Bartoletti says that while more Forrester clients are citing security as a reason to shift to public cloud services, not every CIO wants to accept the risks associated with entrusting customer and other sensitive data to a third-party vendor. Like their public cloud counterparts, private cloud services require advanced virtualization, standardization, automation, self-service access and resource monitoring. Stitching these capabilities together into a cohesive system is daunting and expensive.

Hyperconverged infrastructure (HCI) solutions promise to help, offering preintegrated compute and storage resources that help organizations get their cloud implementations running faster. Forrester recommends that organizations consider HCI as the foundation for their private cloud development, particularly for new workloads that demand rapid, automated scale-out. “HCI is quickly becoming the default infrastructure platform upon which to build the private portion of a hybrid cloud,” Bartoletti says.

5. There’s a container for that

Containers enable developers to manage software code, particularly software developed for cloud apps. Forrester says that Linux containers will be available in every major public and private cloud platform by early 2017. Developers will consume them directly and often build their own stacks to power microservices development. But the new paradigm means new challenges; companies will need to grapple with new security, monitoring, storage and networking issues that arise as containers are deployed broadly in production. “Your first step should be to evaluate the pros and cons of on-premises private PaaS versus a managed public cloud development platform; you might need both,” Bartoletti says.


6. Enterprise apps come to public cloud

Several companies are hosting enterprise applications in AWS, suggesting that CIOs have become more comfortable hosting critical software in the public cloud. Dollar Shave Club runs Spark analytics software in AWS. Cardinal Health runs Splunk in AWS. Several others are running business apps, such as SAP, in AWS. Bartoletti says you can expect this trend to continue as CIOs rely more heavily on public cloud providers. “Enterprises are turning great ideas into software and insights faster, and the cloud is the best place to get quick insights out of enterprise data,” Bartoletti says.

Blockchain

A blockchain is a public ledger of all bitcoin transactions that have ever been executed. A block is the “current” part of a blockchain which records some or all of the recent transactions and, once completed, goes into the blockchain as a permanent part of the database. Each time a block gets completed, a new block is generated. Blocks are linked to each other (like a chain) in proper linear, chronological order, with every block containing a hash of the previous block. To use conventional banking as an analogy, the blockchain is like a full history of banking transactions.
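
The "chain of blocks linked by hashes" idea can be sketched in a few lines of Python. This is a toy illustration only, not Bitcoin's actual block format; the field names and transactions are invented.

```python
# Toy sketch of a hash-linked chain of blocks (illustrative only, not Bitcoin's format).
import hashlib
import json

def make_block(transactions, previous_hash):
    """Bundle transactions with the hash of the previous block, then hash the result."""
    block = {"transactions": transactions, "previous_hash": previous_hash}
    block["hash"] = hashlib.sha256(json.dumps(block, sort_keys=True).encode()).hexdigest()
    return block

genesis = make_block(["alice pays bob 5"], previous_hash="0" * 64)
block_2 = make_block(["bob pays carol 2"], previous_hash=genesis["hash"])

# Tampering with an earlier block changes its hash, which breaks the link stored in
# every later block -- this is what makes the ledger tamper-evident.
print(block_2["previous_hash"] == genesis["hash"])  # True
```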


Blockchain Will Play an Imperative Role in Capital Markets:

Blockchain technology has been acknowledged as one of the most disruptive innovations since the advent of the Internet. The financial industry has also started looking at how to leverage it to store and transfer value across financial instruments. Capital markets is one such area in the financial space where industry experts are optimistic about the use of blockchain technology.

How Does Blockchain Work?

  • The record of transactions is referred to as a “ledger” in the cryptocurrency world, and each data exchange is a “transaction”. Every verified transaction is added to the ledger as part of a block.

  • It utilizes a distributed system, a peer-to-peer network of nodes, to verify each transaction.

  • Once signed and verified, the new transaction is added to the blockchain and cannot be altered.

Blockchain Technology In Various Industries:

Blockchain technology can be utilized in multiple industries, including Financial Services, Healthcare, Government, Travel and Hospitality, Retail and CPG.

  • In the financial services sector, blockchain technology has already been implemented in many innovative ways. It simplifies and streamlines the entire process associated with asset management and payments by providing an automated trade lifecycle where all participants have access to the exact same data about a transaction.

  • Blockchain can play a key role in the healthcare sector by increasing the privacy, security and interoperability of healthcare data. It holds the potential to address many interoperability challenges in the sector and enable secure sharing of healthcare data among the various entities and people involved in the process. It eliminates the interference of a third party and also avoids the overhead costs.

  • Blockchain technology holds the power to transform government operations and services. It can play a key role in improving the data transaction challenges in the government sector, which currently works in silos.

  • The proper linking and sharing of data with blockchain enables better management of data between multiple departments. It improves transparency and provides a better way to monitor and audit transactions.

  • There is a huge opportunity for Blockchain technology to be applied in the retail sector.

  • This includes everything from ensuring the authenticity of high-value goods and preventing fraudulent transactions to locating stolen items, enabling virtual warranties, managing loyalty points and streamlining supply chain operations.

  • Hospitality: The application of blockchain can radically change the travel and hospitality industry. It can be applied to money transactions, storing important documents like passports and other identification cards, making reservations, and managing travel insurance, loyalty and rewards.

Summary

The blockchain technology is both intriguing and exciting. Could it be the revolution that gurus predict? Or is it just a speculative bubble based on an impractical idea? After reading a lot on the matter, we couldn't form a definitive opinion.

Big Data

Big data is a term that describes the large volume of data both structured and unstructured that inundates a business on a day-to-day basis. But it’s not the amount of data that’s important. It’s what organizations do with the data that matters. Big data can be analyzed for insights that lead to better decisions and strategic business moves.

Big data refers to larger, more complex data sets, especially from new data sources. These data sets are so voluminous that traditional data processing software just can’t manage them. But these massive volumes of data can be used to address business problems you wouldn’t have been able to tackle before.

Why Is Big Data Important?


  • The importance of big data doesn’t revolve around how much data you have, but what you do with it. You can take data from any source and analyze it to find answers that enable:

  • 1. Cost reductions

    2. Time reductions

    3. New product development and optimized offerings

    4. Smart decision making

  • When you combine big data with high-powered analytics, you can accomplish business-related tasks such as:

  • Determining root causes of failures, issues and defects in near-real time.

  • Generating coupons at the point of sale based on the customer’s buying habits.

  • Recalculating entire risk portfolios in minutes.

  • Detecting fraudulent behaviour before it affects your organization.

Who uses big data?

Big data affects organizations across practically every industry. See how each industry can benefit from this onslaught of information.

  • Banking: With large amounts of information streaming in from countless sources, banks are faced with finding new and innovative ways to manage big data. While it’s important to understand customers and boost their satisfaction, it’s equally important to minimize risk and fraud while maintaining regulatory compliance. Big data brings big insights.

  • Education: Educators armed with data-driven insight can make a significant impact on school systems, students and curriculums. By analyzing big data, they can identify at-risk students, make sure students are making adequate progress, and implement a better system for evaluation.

  • Government: When government agencies are able to harness and apply analytics to their big data, they gain significant ground when it comes to managing utilities, running agencies, dealing with traffic congestion or preventing crime. But while there are many advantages to big data, governments must also address issues of transparency and privacy.

  • Manufacturing: Armed with the insight that big data can provide, manufacturers can boost quality and output while minimizing waste; these capabilities are key in today’s highly competitive market.

  • Retail: Customer relationship building is critical to the retail industry, and the best way to manage that is to manage big data. Retailers need to know the best way to market to customers, the most effective way to handle transactions, and the most strategic way to bring back lapsed business. Big data remains at the heart of all those things.

Big Data Glossary

While we've attempted to define concepts as we've used them throughout the guide, sometimes it's helpful to have specialized terminology available in a single place.

1. Cluster computing:

Clustered computing is the practice of pooling the resources of multiple machines and managing their collective capabilities to complete tasks. Computer clusters require a cluster management layer which handles communication between the individual nodes and coordinates work assignment.

2. Data lake:

Data lake is a term for a large repository of collected data in a relatively raw state. This is frequently used to refer to the data collected in a big data system which might be unstructured and frequently changing.

3. Data mining:

Data mining is a broad term for the practice of trying to find patterns in large sets of data. It is the process of trying to refine a mass of data into a more understandable and cohesive set of information.

4. Data warehouse:

Data warehouses are large, ordered repositories of data that can be used for analysis and reporting. In contrast to a data lake, a data warehouse is composed of data that has been cleaned, integrated with other sources, and is generally well-ordered.

5. ETL:

ETL stands for extract, transform, and load. It refers to the process of taking raw data and preparing it for the system's use. This is traditionally a process associated with data warehouses, but characteristics of this process are also found in the ingestion pipelines of big data systems.
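
As a rough illustration of the extract-transform-load pattern, here is a tiny sketch using only the Python standard library; the file name, columns and target table are hypothetical examples, not part of any particular ETL product.

```python
# Tiny ETL sketch: extract rows from a CSV, transform them, load into SQLite.
# File name, columns and table are hypothetical examples.
import csv
import sqlite3

# Extract: read raw rows from a source file.
with open("orders_raw.csv", newline="") as f:
    rows = list(csv.DictReader(f))

# Transform: clean and normalize the raw values, dropping rows with a missing amount.
cleaned = [
    (row["order_id"], row["customer"].strip().lower(), float(row["amount"]))
    for row in rows
    if row["amount"]
]

# Load: write the prepared rows into the warehouse table.
conn = sqlite3.connect("warehouse.db")
conn.execute("CREATE TABLE IF NOT EXISTS orders (order_id TEXT, customer TEXT, amount REAL)")
conn.executemany("INSERT INTO orders VALUES (?, ?, ?)", cleaned)
conn.commit()
conn.close()
```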

6. Hadoop:

Hadoop is an Apache project that was the early open-source success in big data. It consists of a distributed filesystem called HDFS, with a cluster management and resource scheduler on top called YARN (Yet Another Resource Negotiator). Batch processing capabilities are provided by the MapReduce computation engine. Other computational and analysis systems can be run alongside MapReduce in modern Hadoop deployments.

7. In-memory computing:

In-memory computing is a strategy that involves moving the working datasets entirely within a cluster's collective memory. Intermediate calculations are not written to disk and are instead held in memory. This gives in-memory computing systems like Apache Spark a huge advantage in speed over I/O bound systems like Hadoop's MapReduce.

8. Machine learning:

Machine learning is the study and practice of designing systems that can learn, adjust, and improve based on the data fed to them. This typically involves implementation of predictive and statistical algorithms that can continually zero in on "correct" behavior and insights as more data flows through the system.

9. Map reduce (big data algorithm):

Map reduce (the big data algorithm, not Hadoop's MapReduce computation engine) is an algorithm for scheduling work on a computing cluster. The process involves splitting the problem set up (mapping it to different nodes) and computing over them to produce intermediate results, shuffling the results to align like sets, and then reducing the results by outputting a single value for each set.
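
The map, shuffle and reduce steps described above can be illustrated with a small word-count sketch in plain Python (no Hadoop involved); in a real cluster, the mapped chunks would live on different nodes.

```python
# Word-count sketch of the map / shuffle / reduce steps (single process, illustrative only).
from collections import defaultdict

documents = ["big data is big", "data systems process big data"]

# Map: each chunk of input is turned into intermediate (key, value) pairs.
mapped = [(word, 1) for doc in documents for word in doc.split()]

# Shuffle: group intermediate pairs by key so like keys end up together.
groups = defaultdict(list)
for word, count in mapped:
    groups[word].append(count)

# Reduce: collapse each group to a single output value per key.
result = {word: sum(counts) for word, counts in groups.items()}
print(result)   # {'big': 3, 'data': 3, 'is': 1, 'systems': 1, 'process': 1}
```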

10. NoSQL:

NoSQL is a broad term referring to databases designed outside of the traditional relational model. NoSQL databases have different trade-offs compared to relational databases.

11. Stream processing:

Stream processing is the practice of computing over individual data items as they move through a system. This allows for real-time analysis of the data being fed to the system and is useful for time-sensitive operations using high velocity metrics.

What tools are used to analyze big data?

Perhaps the most influential and established tool for analyzing big data is known as Apache Hadoop. Apache Hadoop is a framework for storing and processing data at a large scale, and it is completely open source. Hadoop can run on commodity hardware, making it easy to use with an existing data center, or even to conduct analysis in the cloud. Hadoop is broken into four main parts.

The Hadoop Distributed File System (HDFS), which is a distributed file system designed for very high aggregate bandwidth.

YARN, a platform for managing Hadoop's resources and scheduling programs that will run on the Hadoop infrastructure. The remaining two parts are the MapReduce processing engine and Hadoop Common, the shared libraries the other modules depend on.

How Big data Works?

To understand how big data can work for your business, you should first know where it comes from. The sources for big data generally fall into one of three categories.

1. Streaming data:

This category includes data that reaches your IT systems from a web of connected devices, often part of the Internet of Things (IoT). You can analyze this data as it arrives and decide what data to keep, what not to keep and what requires further analysis.

2. Social media data:

The data on social interactions is an increasingly attractive set of information, particularly for marketing, sales and support functions. It's often in unstructured or semistructured forms, so it poses a unique challenge when it comes to consumption and analysis.

3. Publicly available sources:

Massive amounts of data are available through open data sources like the US government’s data.gov, the CIA World Factbook or the European Union Open Data Portal.

4. How to store and manage:

Whereas storage would have been a problem several years ago, there are now low-cost options for storing data if that’s the best strategy for your business.

5. How much of it to analyse:

Some organizations don't exclude any data from their analyses, which is possible with today’s high-performance technologies such as grid computing or in-memory analytics. Another approach is to determine upfront which data is relevant before analyzing it.

6. How to use any insights you uncover:

The more knowledge you have, the more confident you’ll be in making business decisions. It’s smart to have a strategy in place once you have an abundance of information at hand.

Conclusion:

Big data is a broad, rapidly evolving topic. While it is not well suited for all types of computing, many organizations are turning to big data for certain types of workloads and using it to supplement their existing analysis and business tools. Big data systems are uniquely suited for surfacing difficult-to-detect patterns and providing insight into behaviors that are impossible to find through conventional means. By correctly implementing systems that deal with big data, organizations can gain incredible value from data that is already available.

Internet of Things

The Internet of Things connects the physical world to the Internet so that you can use data from devices to increase productivity and efficiency. Connecting things to the Internet is possible because different connectivity options are widely available, the cost of connecting is declining, and more devices are capturing data. Consumers expect everything in their home, everything they drive, and even what they wear to have IoT capabilities. For example, Rotimatic, a smart, fully automated flatbread-making device, uses AWS IoT to securely transfer data to and from the cloud. Industrial companies use IoT applications to make industrial equipment and processes smarter and safer.

How does the Internet of Things work?

Today, the Internet of Things connects physical devices embedded with electronics, software, sensors, and actuators to the cloud and to each other. Devices communicate through different protocols, and many, such as MQTT, were designed to tolerate intermittent connections and reduce network bandwidth requirements. All IoT communication must be secured using preventive security mechanisms like device identity management, encryption and access control, as well as device auditing and monitoring.
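
For a sense of what device-side communication over MQTT looks like, here is a minimal publish sketch using the paho-mqtt client library. The broker address and topic are placeholders; a real AWS IoT endpoint would also require TLS and device certificates.

```python
# Minimal MQTT publish sketch (assumes the paho-mqtt package is installed).
# Broker host and topic are placeholders; production IoT endpoints require TLS and device certs.
import json
from paho.mqtt import publish

reading = {"device": "sensor-001", "temperature_c": 21.7}

publish.single(
    topic="factory/line1/temperature",   # hypothetical topic name
    payload=json.dumps(reading),
    qos=1,                               # deliver at least once
    hostname="broker.example.com",       # placeholder broker address
    port=1883,                           # plain-text port, demo only
)
```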

Why is the Internet of Things important?

The Internet of Things is important because it makes previously unusable data available. IoT applications tap into device data and let you visualize, explore, and build sophisticated analytics such as machine learning in the cloud. IoT applications can also run on devices so they can respond in real time as events unfold. IoT applications are deployed across a wide range of use cases including connected homes, connected vehicles, healthcare, industrial, retail, and many more. Connected homes are safer, cleaner, and more energy efficient. Amway has successfully launched its first Internet-connected product, the Atmosphere Sky Air Treatment System, using AWS IoT to build policies and security throughout the entire architecture.

Using AWS for IoT applications:

AWS IoT services enable you to easily and securely connect and manage billions of devices. You can gather data from, run sophisticated analytics on and take actions in real-time on your diverse fleet of IoT devices from edge to the cloud.

1. AWS IoT Core:

AWS IoT Core is a managed cloud service that lets connected devices easily and securely interact with cloud applications and other devices. AWS IoT Core provides secure communication and data processing across different kinds of connected devices and locations so you can easily build IoT applications.
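
On the cloud side, a backend application can publish messages to an AWS IoT Core topic with boto3's IoT data-plane client, as in the sketch below. The topic name and payload are invented, and AWS credentials plus the account's IoT endpoint are assumed to be configured.

```python
# Sketch: publishing a message to an AWS IoT Core topic with boto3.
# Credentials and region are assumed to be configured; topic and payload are hypothetical.
import json
import boto3

iot_data = boto3.client("iot-data")

iot_data.publish(
    topic="devices/sensor-001/commands",   # MQTT topic the device subscribes to
    qos=1,
    payload=json.dumps({"command": "reboot"}),
)
```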

2. AWS IoT Device Management:

As many IoT deployments consist of hundreds of thousands to millions of devices, it is essential to track, monitor, and manage connected device fleets. We need to ensure your scale and diversity of IoT devices work properly and securely. AWS IoT Device Management makes it easy to securely onboard, organize, monitor, and remotely manage IoT devices at scale.

3. Amazon FreeRTOS:

It is an operating system for microcontrollers that makes small, low-power edge devices easy to program, deploy, secure, connect, and manage. Amazon FreeRTOS is based on the FreeRTOS kernel, a popular open source operating system for microcontrollers, and extends it with software libraries that make it easy to securely connect your small, low-power devices to AWS cloud services.

4. AWS Greengrass:

AWS Greengrass is software that lets you run local compute, messaging, data caching, sync, and ML inference capabilities for connected devices in a secure way. With AWS Greengrass, connected devices can run AWS Lambda functions, keep device data in sync, and communicate with other devices securely.

5. AWS IoT Device Defender:

AWS IoT Device Defender is a fully managed service that helps you secure your fleet of IoT devices. An IoT configuration is a set of technical controls you define to help keep information secure when devices are communicating with each other and the cloud. Device Defender makes it easy to maintain and enforce IoT configurations, such as ensuring device identity, authenticating and authorizing devices, and encrypting device data.

6. AWS IoT Analytics:

AWS IoT Analytics is a fully-managed service that makes it easy to run and operationalize sophisticated analytics on massive volumes of IoT data without having to worry about the cost and complexity typically required to build an IoT analytics platform. AWS IoT Analytics automates each of the difficult steps that are required to analyse data from IoT devices. AWS IoT Analytics filters, transforms, and enriches IoT data before storing it in a time-series data store for analysis.

7. Few more IoT Applications:

Home and Security, Energy, IoT Healthcare, Urban Planning, Transportation, Manufacturing etc.

How the Internet of Things (IoT) is Changing Business:

The future is looking pretty exciting thanks to the Internet of Things. The greatest thing IoT will change for consumers is their experience with customer service representatives. Companies, instead of customers, will start initiating customer service interactions, and these exchanges will often begin before a problem has even occurred.

Summary:

With the promise of 31 billion connected devices by 2020, it’s safe to say that the Internet of Things has arrived, and it’s here to stay. And with the rise of IoT comes an onslaught of data. While this data will have a significant impact on marketing as we know it, it will also require unprecedented amounts of security. Without the proper protection in place, individuals or entire infrastructures could be at risk.

Machine Learning

Machine learning is an application of artificial intelligence (AI) that provides systems the ability to automatically learn and improve from experience without being explicitly programmed. Machine learning focuses on the development of computer programs that can access data and use it to learn for themselves. The process of learning begins with observations or data, such as examples, direct experience, or instruction, in order to look for patterns in data and make better decisions in the future based on the examples that we provide. The primary aim is to allow computers to learn automatically without human intervention or assistance and adjust their actions accordingly.

Examples of machine learning:

Machine learning is being used in a wide range of applications today. One of the most well-known examples is Facebook's News Feed. The News Feed uses machine learning to personalize each member's feed. Behind the scenes, the software is simply using statistical analysis and predictive analytics to identify patterns in the user's data and use those patterns to populate the News Feed. Machine learning is also entering an array of enterprise applications, such as customer relationship management (CRM).

Types of machine learning:

Just as there are nearly limitless uses of machine learning, there is no shortage of machine learning algorithms. They range from the fairly simple to the highly complex. Here are a few of the most commonly used models.

Regression models: This class of machine learning algorithm involves identifying a correlation, generally between two variables, and using that correlation to make predictions about future data points.

1. Decision trees:

These models use observations about certain actions and identify an optimal path for arriving at a desired outcome.

2. K-means clustering:

This model groups a specified number of data points into a specific number of groupings based on like characteristics.
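
A quick sketch with scikit-learn shows the idea: we ask for a fixed number of clusters and the algorithm groups the points by similarity. The points below are invented toy data.

```python
# K-means sketch with scikit-learn: group 2-D points into 2 clusters (toy data).
import numpy as np
from sklearn.cluster import KMeans

points = np.array([[1.0, 1.1], [0.9, 1.0], [1.2, 0.8],    # one tight group of points
                   [8.0, 8.2], [7.9, 7.7], [8.3, 8.1]])   # another group, far away

model = KMeans(n_clusters=2, n_init=10, random_state=0).fit(points)
print(model.labels_)           # cluster index assigned to each point
print(model.cluster_centers_)  # coordinates of the two group centers
```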

3. Neural networks:

These deep learning models utilize large amounts of training data to identify correlations between many variables to learn to process incoming data in the future.

4. Supervised machine learning algorithms:

Starting from the analysis of a known training dataset, the learning algorithm produces an inferred function to make predictions about the output values. The system is able to provide targets for any new input after sufficient training. The learning algorithm can also compare its output with the correct, intended output and find errors in order to modify the model accordingly.
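
Below is a minimal supervised-learning sketch with scikit-learn, mirroring the train / predict / compare-with-known-answers loop described above; it uses scikit-learn's bundled iris sample rather than any real business data.

```python
# Supervised learning sketch: train on labeled data, predict, compare with known answers.
from sklearn.datasets import load_iris
from sklearn.metrics import accuracy_score
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeClassifier

X, y = load_iris(return_X_y=True)                      # features and known labels
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.3, random_state=0)

model = DecisionTreeClassifier().fit(X_train, y_train)  # learn from the training set
predictions = model.predict(X_test)                     # infer labels for unseen inputs

# Comparing predictions with the correct outputs tells us how well the model learned.
print("accuracy:", accuracy_score(y_test, predictions))
```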

5. Unsupervised machine learning algorithms:

These are used when the information used to train is neither classified nor labeled. Unsupervised learning studies how systems can infer a function to describe a hidden structure from unlabeled data. The system doesn’t figure out the right output, but it explores the data and can draw inferences from datasets to describe hidden structures in unlabeled data.

6. Semi-supervised machine learning algorithms:

These fall somewhere in between supervised and unsupervised learning, since they use both labeled and unlabeled data for training, typically a small amount of labeled data and a large amount of unlabeled data. The systems that use this method are able to considerably improve learning accuracy. Usually, semi-supervised learning is chosen when the acquired labeled data requires skilled and relevant resources to train from; acquiring unlabeled data generally does not require additional resources.

7. Reinforcement machine learning algorithms:

It is a learning method that interacts with its environment by producing actions and discovers errors or rewards. Trial and error search and delayed reward are the most relevant characteristics of reinforcement learning. This method allows machines and software agents to automatically determine the ideal behaviour within a specific context in order to maximize its performance.
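
A toy epsilon-greedy example gives the flavor of this trial-and-error, reward-driven loop. The reward probabilities are invented, and real reinforcement learning problems also involve states and delayed rewards rather than this simple bandit setting.

```python
# Toy epsilon-greedy sketch: learn by trial and error which action earns the most reward.
import random

reward_probability = {"A": 0.2, "B": 0.8}   # hidden from the agent; invented for the demo
value_estimate = {"A": 0.0, "B": 0.0}
pulls = {"A": 0, "B": 0}
epsilon = 0.1                               # fraction of the time we explore at random

for _ in range(1000):
    if random.random() < epsilon:
        action = random.choice(["A", "B"])                     # explore
    else:
        action = max(value_estimate, key=value_estimate.get)   # exploit the best estimate
    reward = 1.0 if random.random() < reward_probability[action] else 0.0
    pulls[action] += 1
    # Incrementally update the running average reward for the chosen action.
    value_estimate[action] += (reward - value_estimate[action]) / pulls[action]

print(value_estimate)   # estimates converge toward the true reward probabilities
```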

Future of machine learning:

While machine learning algorithms have been around for decades, they've attained new popularity as artificial intelligence has grown in prominence. Deep learning models in particular power today's most advanced AI applications. Machine learning platforms are among enterprise technology's most competitive realms, with most major vendors racing to sign customers up for platform services that cover the spectrum of machine learning activities, including data collection, data preparation, model building, training and application deployment. As machine learning continues to increase in importance to business operations and AI becomes ever more practical in enterprise settings, the machine learning platform wars will only intensify.

Conclusion:

It all starts with asking the right question, which acts as the beginning of the machine learning process. After that, we need the right, structured data to answer the question, and this is the part that takes most of the time in a complete machine learning process. Then a process with a number of iterations starts, until we get the desired predictive model. That model is updated from time to time to adapt to the changes that happen periodically, and finally the model is deployed. In the next article, we'll focus on some terminology and look at the machine learning process more closely.

Business Intelligence

Business intelligence (BI) is a technology driven process for analysing data and presenting actionable information to help executives, managers and other corporate end users make informed business decisions.

Business Intelligence (BI) refers to the tools, technologies, applications and practices used to collect, integrate, analyse and present an organization's raw data in order to create insightful and actionable business information. BI as a discipline and as a technology-driven process is made up of several related activities (data mining, online analytical processing, querying and reporting).

BI encompasses a wide variety of tools, applications and methodologies that enable organizations to collect data from internal systems and external sources, prepare it for analysis, develop and run queries against that data, and create reports, dashboards and data visualizations to make the analytical results available to corporate decision-makers as well as operational workers.

Why is business intelligence important?

The potential benefits of business intelligence tools include accelerating and improving decision making, optimizing internal business processes, increasing operational efficiency, driving new revenues and gaining competitive advantage over business rivals.

Benefits of Using Business Intelligence?

The potential benefits of business intelligence programs include:

  • Accelerating and improving decision making

  • Optimizing internal business processes.

  • Increasing operational efficiency.

  • Driving new revenues.

  • Gaining competitive advantages over business rivals.

  • Identifying market trends.

  • Spotting business problems that need to be addressed.

Why Should we Use Business Intelligence Systems and Tools?

BI Tools are essentially data-driven Decision Support Systems (DSS). BI is sometimes used interchangeably with briefing books, report and query tools and executive information systems. With these tools, business people can start analysing the data themselves, rather than wait for IT to run complex reports. This information access helps users back up business decisions with hard numbers, rather than only gut feelings and anecdotes.

Differentiate Between Business Intelligence and Analytics

Can you use the terms business intelligence and analytics interchangeably? The majority of the time, people are going to know what you are talking about. And regardless of what you call it, your product needs it to stay modern, drive adoption, and reduce ongoing requests to IT for information that will help your customers do their jobs.

But if you really want to differentiate, here is when I think you should use BI vs. analytics: use the term business intelligence if you are producing a report, dashboard, or pivot table for an executive, middle manager, or analyst, and use analytics when you move past basic BI capabilities and use information and data to help your customers become highly effective at getting a job done.

DevOps

DevOps represents a change in IT culture, focusing on rapid IT service delivery through the adoption of agile, lean practices in the context of a system-oriented approach. DevOps emphasizes people and seeks to improve collaboration between operations and development teams. DevOps implementations utilize technology, especially automation tools that can leverage an increasingly programmable and dynamic infrastructure, from a life-cycle perspective.

How Does DevOps Work?

1. Collaboration:

The foundation of DevOps success is how well teams and individuals collaborate across the enterprise to get things done more rapidly, efficiently and effectively.

2. Automation:

The continuous integration principle of agile development has a cultural implication for the development group. Forcing developers to integrate their work with other developers' work frequently, at least daily, exposes integration issues and conflicts much earlier than is the case with waterfall development. Developers have to communicate with each other much more frequently, a process that runs counter to the image of the solitary genius coder working in isolation.

3. Continuous Testing:

Continuous testing is not just a QA function; in fact, it starts in the development environment. The days are over when developers could simply throw the code over the wall to QA and say, “Have at it.” In a DevOps environment, quality is everyone’s job. Developers build quality into the code and provide test data sets. QA engineers configure automation test cases and the testing environment.

4. Continuous Delivery:

The team at Amazon Web Services defines continuous delivery as a DevOps software development practice where code changes are automatically built, tested, and prepared for a release to production. It expands upon continuous integration by deploying all code changes to a testing environment and a production environment after the build stage.

5. Continuous Monitoring:

With continuous monitoring, teams measure the performance and availability of software to improve stability. Continuous monitoring helps identify root causes of issues quickly to proactively prevent outages and minimize user issues. Some monitoring experts even advocate that the definition of a service must include monitoring they see it as integral to service delivery.

Best DevOps Tools

1. Atlassian JIRA:

JIRA is commercial software, and licenses need to be procured for on-premises installation based on the number of users. The tool can be used for agile project management and defect and issue tracking.

As mentioned before, a well-defined process is a prerequisite for DevOps implementation, so project managers can use JIRA to create the product backlog and sprint backlogs and maintain end-to-end traceability, from epics and user stories down to test artifacts like test cases.

2. Eclipse:

Developers typically write code in an IDE such as Eclipse and commit it to a version control repository such as Git/GitHub, which supports team development. Every developer downloads the code from the version control repository, makes changes and commits the code back to Git/GitHub.

3. Git – Version Control Tool:

One of the fundamental building blocks of any CI setup is a strong version control system. Even though there are different version control tools on the market today, like SVN, ClearCase, RTC and TFS, Git fits in very well as a popular, distributed version control system for teams located in different geographical locations.

It is an open source tool and supports most version control features: check-in, commit, branches, merging, labels, and push and pull to/from GitHub.

4. Kiuwan:

Kiuwan adds security into DevOps, understanding how SAST and SCA testing should be implemented. With a unique distributable engine, pipelines are never at risk of creating bottlenecks, and time to market is improved while ensuring the most stringent security checks are in place. With a DevSecOps approach, Kiuwan achieves outstanding benchmark scores (OWASP, NIST, CWE, etc.) and offers a wealth of features that go beyond static analysis, catering to every stakeholder in the SDLC.

5. Jenkins:

Jenkins is a free, open source continuous integration server that helps automate activities such as builds, code analysis and storing artifacts. A schedule for the build is defined in Jenkins to initiate it. This again depends on the completion of tasks assigned to individual developers and the due date for completion; these are all planned in a sprint backlog in JIRA, as discussed earlier.

Typically, running all builds on one single build machine is not a good option; Jenkins provides a master-slave feature to distribute them. Jenkins can also help automate deployments to app servers like Tomcat, JBoss and WebLogic through plugins, and to container platforms like Docker.

Artificial Intelligence

Artificial intelligence (AI) is the ability of a machine or a computer program to think and learn. The concept of AI is based on the idea of building machines capable of thinking, acting, and learning like humans.

Artificial intelligence (AI) is an area of computer science that emphasizes the creation of intelligent machines that work and react like humans. Research associated with artificial intelligence is highly technical and specialized. While people often mistakenly consider AI a single technology, the more accurate approach is to see it as a broad concept in which machines are able to deal with tasks in a way we would call intelligent or smart.

Examples of AI technology:

AI is incorporated into a variety of different types of technology. Here are a few examples:

  • Automation: what makes a system or process function automatically. For example, robotic process automation (RPA) can be programmed to perform high-volume, repeatable tasks that humans normally perform. RPA is different from IT automation in that it can adapt to changing circumstances.

  • Machine learning: the science of getting a computer to act without explicit programming. Deep learning is a subset of machine learning that, in very simple terms, can be thought of as the automation of predictive analytics. There are three types of machine learning: a) supervised learning, b) unsupervised learning and c) reinforcement learning.

  • Machine vision: the science of allowing computers to see. This technology captures and analyzes visual information using a camera, analog-to-digital conversion and digital signal processing.

  • Robotics: a field of engineering focused on the design and manufacturing of robots. Robots are often used to perform tasks that are difficult for humans to perform or to perform consistently. They are used in assembly lines for car production or by NASA to move large objects in space.

  • Self-driving cars: these use a combination of computer vision, image recognition and deep learning to build automated skill at piloting a vehicle while staying in a given lane and avoiding unexpected obstructions, such as pedestrians.

HOW DOES ARTIFICIAL INTELLIGENCE WORK:

Artificial intelligence works by combining high volumes of data with intelligent, fast algorithms and iterative processing, thereby allowing the system to learn automatically from the characteristics of that data. The value of AI applications lies mainly in the fact that computers can do things faster than humans: interpreting images and text, finding patterns in figures, and running intelligent searches.

ARTIFICIAL INTELLIGENCE APPLICATIONS:

1. AI in healthcare:

The biggest bets are on improving patient outcomes and reducing costs. Companies are applying machine learning to make better and faster diagnoses than humans. IBM Watson, for example, understands natural language and is capable of responding to questions asked of it; the system mines patient data and other available data sources to form a hypothesis.

2. AI in business:

Robotic process automation is being applied to highly repetitive tasks normally performed by humans. Machine learning algorithms are being integrated into analytics and CRM platforms to uncover information on how to better serve customers. Chatbots have been incorporated into websites to provide immediate service to customers.

3. AI in education:

AI can automate grading, giving educators more time. AI can assess students and adapt to their needs, helping them work at their own pace. AI tutors can provide additional support to students, ensuring they stay on track.

4. AI in law:

The discovery process, sifting through documents, is often overwhelming for humans in law. Automating this process is a more efficient use of time. Startups are also building question-and-answer computer assistants that can sift through programmed-to-answer questions by examining the taxonomy and ontology associated with a database.

5. AI in finance:

AI in personal finance applications, such as Mint or Turbo Tax, is disrupting financial institutions. Applications such as these collect personal data and provide financial advice. Other programs, such as IBM Watson, have been applied to the process of buying a home.

ARTIFICIAL INTELLIGENCE challenging tasks:

AI successfully diagnoses medical scans, connects trillions of data points to discover new patterns, interprets application letters and CVs, and tracks down fraudsters. The general trend is for AI to replace boring, lower-cognitive and repetitive human tasks. Small time savings can add up to fairly large amounts. AI can realize an efficiency improvement of two to five times for a process step; when fully automating, gains of up to a hundred times can be achieved.

Analytics

Analytics is the scientific process of discovering and communicating the meaningful patterns which can be found in data. It is concerned with turning raw data into insight for making better decisions. Analytics relies on the application of statistics, computer programming and operations research in order to quantify and gain insight into the meaning of data. It is especially useful in areas which record a lot of data or information.

Analytics is an encompassing and multidimensional field that uses mathematics, statistics, predictive modeling and machine learning techniques to find meaningful patterns and knowledge in recorded data.

Why is analytics important?

Analytics can change how we look at the world, usually for the better. Sometimes we think that a process is already working at its best, but the data tells us otherwise, so analytics helps us improve our world. Organizations usually apply analytics to describe, predict and then improve organizational performance. Specifically, it helps in the following areas: web analytics, fraud analysis, risk analysis, advertisement and marketing, enterprise decision management, market optimization and market modelling.

Life cycle of analytics

  • Business units specify the need, scope, market conditions and goals related to the business question they want to solve, which will lead to the selection of one or more modeling techniques.

  • Depending on the business question and proposed analysis methods, this step involves using specialized techniques to locate, access, clean and prepare the data for optimal results. In our multifaceted data world, that could mean data from transactional systems, unstructured text files and data warehouses.

  • Now it's time to explore the data in an interactive and visual fashion to quickly identify relevant variables, trends and relationships. (The shape of the data when variables are plotted out is called the distribution of the data; you can use these shapes to identify patterns.)

  • A skilled analyst or modeler builds the model using statistical, data mining or text mining software, including the critical capability of transforming and selecting key variables. Models need to be built rapidly so modelers can use trial and error to find the model that produces the best results.

  • Once built, the model is registered, tested (or validated), approved and declared ready for use against your data. With a centralized model repository, you can store extensive documentation about the model, scoring code and associated metadata (data about the data) for collaborative sharing and the version control necessary for auditing purposes.

  • When approved for production use, the model is applied to new data to generate predictive insights.

  • The predictive performance of the model is monitored to ensure it is up to date and delivering valid results. If the model's performance degrades, it's time to make changes. When it no longer works or serves a business need, it is retired.

Advanced Analytics

IDC research shows SAS with a commanding 30.8% market share in advanced analytics, more than twice that of the nearest competitor. SAS dominates the market because it's not just how advanced the technology is that matters; it's how far it can advance your organization.

1 Data Mining:

Want to know what will happen in the future? Find the most lucrative opportunities? Get insights into impending outcomes.

a)SAS Econometrics

Analyze complex business and economic scenarios, providing a scientific basis for better decision making.

b)SAS Forecast Server

Produce large numbers of forecasts quickly and automatically to improve planning and decision making.

c)SAS Visual Forecasting

Generate large numbers of reliable forecasts quickly and automatically in an open environment.

2 Statistical Analysis:

Whether you're analysing customer data, crunching sales numbers, monitoring supply chain operations or trying to detect fraud, apply powerful statistical analysis to all your data to get the most accurate answers.

a) SAS Visual Statistics

Create and modify predictive models faster than ever using a visual interface and in memory processing.

b)SAS® In-Memory Statistics

Find insights in big data with a single environment that moves you quickly through each phase of the analytical life cycle.

c) SAS® Analytics Pro

Access, manipulate, analyse and present information with a comprehensive analytical toolset that combines statistical analysis, reporting and high-impact visuals.

3 Forecasting:

Generate large quantities of high-quality forecasts quickly and automatically, with no need for human intervention, and streamline your forecasting processes so you can focus efforts on the most important, high-value decisions.

a)SAS Econometrics

Analyze complex business and economic scenarios, providing a scientific basis for better decision making.

b)SAS Visual Forecasting

Generate large numbers of reliable forecasts quickly and automatically in an open environment.

4 Optimization & Simulation:

Identify the scenarios that will produce the best outcomes.

a)SAS/OR

Optimize business processes and address challenges with enhanced operations research methods.

b)SAS® Optimization

Find optimal solutions to complex business and planning problems faster than ever.

c)SAS Simulation Studio

Build models that mimic complex, real-life systems so you can better understand and optimize them using discrete event simulation.

Gaming

Gaming is the running of specialized applications known as electronic games or video games on game consoles like Xbox and PlayStation or on personal computers (in which case the activity is known as online gaming). The term "gaming" originated as a synonym for "gambling", although most electronic games today do not involve gambling in the traditional sense. Gaming refers to playing electronic games, whether through consoles, computers, mobile phones or another medium altogether. Gaming is a nuanced term that suggests regular gameplay, possibly as a hobby.

The independent game industry has seen a substantial rise in recent years with the growth of new online distribution systems, such as Steam and Uplay, as well as the mobile game market, such as for Android and iOS devices.

Roles and Responsibilities of Development team:

  • A game designer is a person who designs gameplay, conceiving and designing the rules and structure of a game. Development teams usually have a lead designer who coordinates the work of other designers and is the main visionary of the game.

  • A game artist is a person who creates video game art. The artist's job may be 2D-oriented or 3D-oriented. 2D artists may produce concept art, sprites, textures, environmental backdrops or terrain images, and user interface elements. 3D artists may produce models or animations, 3D environments and cinematics. Artists sometimes occupy both roles.

  • A game programmer is a software engineer who develops video games or related software. The game's code base development is handled by programmers. There are usually one to several lead programmers.

  • Quality assurance is carried out by game testers. A game tester analyzes video games to document software defects as part of quality control. Testing is a highly technical field requiring computing expertise and analytic competence.

Tools for game design and development:

The world of gaming is fast paced and hugely exciting, especially with the ongoing developments and projects being created using virtual reality applications. But this can be a daunting environment if you are thinking of building your experience. There are a number of tools and resources around to help make your game or app a success.

1. Visual Studio:

Visual Studio, Microsoft's developer tool suite, has been around for years, which is a clear indication of just how popular the software is among designers and developers. A fully featured integrated development environment (IDE) for Android, iOS, Windows, web and cloud, Visual Studio offers productive development tools and powerful services.

2.Assembla:

A highly efficient and capable project management tool with built-in code repositories, Assembla is quickly becoming the resource of choice for building the latest games. Boasting a number of incredibly handy features, Assembla allows individual developers and teams to manage every aspect of a project, from ideation to production and coding to communication, all in one place. Assembla is also the number one SVN provider in the world and features integration with the leading communication app Slack, meaning the project team and clients can work together out of one platform, helping games launch on time. Assembla provides everything you need to manage all your tasks, team and code in just one place, making it a must-have tool for any game developer.

3. Unreal Engine:

This renowned game development tool has been used to create hit games on just about every platform. Made by game developers, for game developers, Unreal Engine provides everything you need to make your next project a success, whether it is a simple 2D game, a console blockbuster or a virtual reality app. Designed for mobile, now and in the future, Unreal Engine 4's features give you the power to develop your game and seamlessly deploy to iOS and Android devices.

4. Evernote:

It may seem obvious to say, but every designer and developer needs a place to record even the smallest of ideas and be able to access them anywhere. Evernote, one of the most popular note-taking apps around, allows users to do just that, enabling the capture, nurturing and sharing of ideas across any device.

5. Blender:

Blender is an open source 3D content creation suite, available for all major operating systems. Started by Blender Foundation founder Ton Roosendaal back in 2002, Blender is now the largest open source tool for 3D creation. Its makers are constantly working on its development, but you can pretty much do anything 3D-related with this software, including modelling, texturing, animation, rendering and compositing.

6. Photoshop:

Speaking of 3D models, you're going to need to texture those assets, and there's no better program to start with than Photoshop. The go-to tool for creative professionals, Photoshop provides an extensive and dedicated toolset for creating and texturing your game assets.

7. Substance:

3D painting software Substance also offers a way to paint your 3D assets in a procedural and natural way. A popular tool among 3D and digital artists, Substance features a complete and familiar set of texturing tools: PBR and particle painting, procedural effects, smart materials and smart masks, and fast baking.

8. Corona SDK:

Corona SDK is a software development kit that is available on Windows and OS X and uses Lua as its scripting language. Using Corona, one can develop mobile games for free; however, to create a game or app with more elaborate features, you need to opt for the enterprise model, which offers native libraries and APIs. Corona SDK uses OpenGL as its rendering engine, and its built-in scene management and transition library helps adjust and modify your game's graphical qualities. To make game development easier, Corona Labs offers the Corona Editor text plugin, with which you can create a graphical environment to design different levels and understand the interaction between objects.

9. SpriteKit:

Available on iOS and OS X, SpriteKit is Apple's proprietary 2D game development framework that supports both the Swift and Objective-C languages. SpriteKit offers great convenience to developers. With SKView, scene management is made easy, and the SKAction class can be leveraged to move, scale or rotate various game objects. It also supports sound and custom code. SpriteKit offers a scene editor that enables the design of levels, and a particle editor that helps with developing different particle systems. It has Box2D for its physics engine, and a built-in camera through the SKCameraNode class which makes navigation easy.

10. Unity:

Unity is a mobile game development engine that supports C# and UnityScript, Unity's own JavaScript-like language. It comes in free as well as professional editions. It is a cross-platform tool and is deployable to many platforms. Like other tools, its built-in editor allows you to edit images and organize animations from the Animator window.

11. Marmalade:

Marmalade is a fast, high-performance cross-platform engine for the creation of 2D and 3D games. The SDK can be used to code in C++, the Marmalade Quick version supports app development using the Lua scripting language, and Marmalade Web facilitates creating hybrid apps using HTML5, CSS and JavaScript. Marmalade is an award-winning platform that is popular among top game developers for its ability to build native games for both mobile and desktop. Marmalade Juice is another mobile game development tool that supports easy porting of iOS games to the Android ecosystem.

12. NextPeer:

NextPeer is a multiplayer social SDK that addresses the challenge of multiplayer gaming on mobile. It supports both synchronous and asynchronous gameplay. NextPeer enhances the quality of the gaming experience and player engagement, and helps developers achieve maximum user retention. It facilitates real-time interactions and live rankings to make games more interesting and realistic, and it also offers a delayed synchronous technology.

Summary:

These are some of the tools we can adopt in the game development process when it is impractical to build and manage all of the technology ourselves. We explored some of the popular mobile game development tools, engines and frameworks preferred by today's developers. Comparing them gives better insight into which one best fits your development requirements, which will be useful when planning your next game.

Amazon Web Services

Amazon Web Services (AWS) is a secure cloud services platform, offering compute power, database storage, content delivery and other functionality to help businesses scale and grow. Explore how millions of customers are currently leveraging AWS cloud products and solutions to build sophisticated applications with increased flexibility, scalability and reliability.


A Broad IT Infrastructure Platform :

The AWS Cloud provides a broad set of infrastructure services, such as computing power, storage options, networking and databases, delivered as a utility: on-demand, available in seconds, with pay-as-you-go pricing.

  • From data warehousing to deployment tools, directories to content delivery, over 50 services are available in just a few mouse clicks with AWS. New services are quick to provision, without upfront capital expense, allowing enterprises, start-ups and SMBs to access the building blocks they need to respond quickly to changing business requirements.

  • After almost a decade of working closely with organizations as diverse as Pinterest, GE and MLB, the AWS Cloud allows customers to pin, power and play ball in entirely new ways.

  • Security in the cloud is recognized as better than on-premises. Broad security certification and accreditation, data encryption at rest and in-transit, hardware security modules and strong physical security all contribute to a more secure way to manage your business.


Hybrid Capabilities :

Choosing between your existing investment in infrastructure and moving to the cloud is not a binary decision. Deep features, dedicated connectivity, identity federation and integrated tools allow you to run ‘hybrid’ applications across on-premises and cloud services.


Popular Products :

  • Amazon Elastic Compute Cloud (Amazon EC2) is a web service that provides secure, resizable compute capacity in the cloud. It is designed to make web-scale cloud computing easier for developers. Its simple web service interface allows you to obtain and configure capacity with minimal friction. It provides you with complete control of your computing resources and lets you run on Amazon's proven computing environment. Amazon EC2 reduces the time required to obtain and boot new server instances to minutes, allowing you to quickly scale capacity, both up and down, as your computing requirements change. Amazon EC2 changes the economics of computing by allowing you to pay only for capacity that you actually use. A minimal launch-and-terminate sketch appears after this list.

  • Amazon Relational Database Service (Amazon RDS) makes it easy to set up, operate, and scale a relational database in the cloud. It provides cost-efficient and resizable capacity while automating time-consuming administration tasks such as hardware provisioning, database setup, patching and backups. It frees you to focus on your applications so you can give them the fast performance, high availability, security and compatibility they need. A brief provisioning sketch appears after this list.


  • Amazon RDS is available on several database instance types - optimized for memory, performance or I/O - and provides you with six familiar database engines to choose from, including Amazon Aurora, PostgreSQL, MySQL, MariaDB, Oracle Database, and SQL Server.


  • Companies today need the ability to simply and securely collect, store, and analyze their data at a massive scale. Amazon S3 is object storage built to store and retrieve any amount of data from anywhere – web sites and mobile apps, corporate applications, and data from IoT sensors or devices. It is designed to deliver 99.999999999% durability, and stores data for millions of applications used by market leaders in every industry. S3 provides comprehensive security and compliance capabilities that meet even the most stringent regulatory requirements. It gives customers flexibility in the way they manage data for cost optimization, access control, and compliance. S3 provides query-in-place functionality, allowing you to run powerful analytics directly on your data at rest in S3. And Amazon S3 is the most supported cloud storage service available, with integration from the largest community of third-party solutions, systems integrator partners, and other AWS services. A small upload-and-download sketch appears after this list.


  • AWS Lambda lets you run code without provisioning or managing servers. You pay only for the compute time you consume - there is no charge when your code is not running. With Lambda, you can run code for virtually any type of application or backend service - all with zero administration. Just upload your code and Lambda takes care of everything required to run and scale your code with high availability. You can set up your code to automatically trigger from other AWS services or call it directly from any web or mobile app. A minimal handler sketch appears after this list.
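
To make the EC2 description above concrete, here is a minimal launch-and-terminate sketch using boto3, the AWS SDK for Python. The region, AMI ID and instance type are placeholders chosen for illustration, not values taken from this document.

```python
# Minimal EC2 sketch using boto3; the AMI ID, region and instance type are placeholders.
import boto3

ec2 = boto3.client("ec2", region_name="us-east-1")  # assumed region

# Launch a single small instance; booting typically takes on the order of minutes.
response = ec2.run_instances(
    ImageId="ami-0123456789abcdef0",  # hypothetical AMI ID
    InstanceType="t2.micro",
    MinCount=1,
    MaxCount=1,
)
instance_id = response["Instances"][0]["InstanceId"]
print("Launched", instance_id)

# Terminate the instance when it is no longer needed, so you only pay for what you use.
ec2.terminate_instances(InstanceIds=[instance_id])
```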
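
The RDS bullet describes "setting up" a database as a single managed operation; a hedged boto3 sketch of that step might look like the following. The identifier, engine, instance class and credentials are all illustrative assumptions.

```python
# Minimal RDS provisioning sketch using boto3; all values are placeholders.
import boto3

rds = boto3.client("rds", region_name="us-east-1")  # assumed region

rds.create_db_instance(
    DBInstanceIdentifier="demo-db",         # hypothetical identifier
    Engine="mysql",                         # one of the supported engines
    DBInstanceClass="db.t3.micro",          # small instance class for a demo
    AllocatedStorage=20,                    # storage in GiB
    MasterUsername="admin",
    MasterUserPassword="change-me-please",  # placeholder; use a secrets manager in practice
)

# RDS handles provisioning, patching and backups; wait until the instance is available.
waiter = rds.get_waiter("db_instance_available")
waiter.wait(DBInstanceIdentifier="demo-db")
```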
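
For the S3 bullet, storing and retrieving an object is a pair of API calls. The sketch below assumes a hypothetical, globally unique bucket name and the default us-east-1 region.

```python
# Minimal S3 store-and-retrieve sketch using boto3; the bucket name is a placeholder
# and must be globally unique in practice.
import boto3

s3 = boto3.client("s3")  # assumes the default us-east-1 region

bucket = "my-example-bucket-12345"  # hypothetical bucket name
s3.create_bucket(Bucket=bucket)

# Store an object...
s3.put_object(Bucket=bucket, Key="hello.txt", Body=b"Hello, S3!")

# ...and retrieve it from anywhere using the same key.
obj = s3.get_object(Bucket=bucket, Key="hello.txt")
print(obj["Body"].read().decode("utf-8"))
```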
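
Finally, for Lambda, the code you upload is just a handler function. The sketch below shows one possible Python handler; the "name" field in the event is purely illustrative, since the real event shape depends on the trigger you configure.

```python
# Minimal AWS Lambda handler in Python. Lambda invokes this function for each event;
# there are no servers to provision or manage.
def lambda_handler(event, context):
    # 'event' carries the trigger payload; the "name" key is a hypothetical example.
    name = event.get("name", "world") if isinstance(event, dict) else "world"
    return {
        "statusCode": 200,
        "body": f"Hello, {name}!",
    }
```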


Platform Foundations :

    1. Amazon VPC

    Amazon Virtual Private Cloud (Amazon VPC) lets you provision a logically isolated section of the AWS Cloud where you can launch AWS resources in a virtual network that you define. You have complete control over your virtual networking environment, including selection of your own IP address range, creation of subnets, and configuration of route tables and network gateways. You can use both IPv4 and IPv6 in your VPC for secure and easy access to resources and applications. You can easily customize the network configuration for your Amazon VPC. For example, you can create a public-facing subnet for your web servers that has access to the Internet, and place your backend systems such as databases or application servers in a private-facing subnet with no Internet access. A minimal subnet-layout sketch appears after this list.

    2. AWS IAM

    AWS Identity and Access Management (IAM) enables you to manage access to AWS services and resources securely. Using IAM, you can create and manage AWS users and groups, and use permissions to allow and deny their access to AWS resources; a minimal user-and-group sketch appears after this list. IAM is a feature of your AWS account offered at no additional charge. You will be charged only for use of other AWS services by your users.

    3. Amazon CloudWatch

    Amazon CloudWatch is a monitoring and management service built for developers, system operators, site reliability engineers (SRE), and IT managers. CloudWatch provides you with data and actionable insights to monitor your applications, understand and respond to system-wide performance changes, optimize resource utilization, and get a unified view of operational health. CloudWatch collects monitoring and operational data in the form of logs, metrics, and events, providing you with a unified view of AWS resources, applications and services that run on AWS, and on-premises servers. You can use CloudWatch to set high resolution alarms, visualize logs and metrics side by side, take automated actions, troubleshoot issues, and discover insights to optimize your applications, and ensure they are running smoothly. A minimal alarm sketch appears after this list.

    4. Billing Alarm

    You can monitor your estimated AWS charges using Amazon CloudWatch. When you enable the monitoring of estimated charges for your AWS account, the estimated charges are calculated and sent several times daily to CloudWatch as metric data, and you can set an alarm on that metric. A minimal sketch of such a billing alarm appears below, after the other sketches for this list.
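
To illustrate item 1 (Amazon VPC), the following boto3 sketch creates a VPC with a public and a private subnet along the lines described there. The CIDR blocks and region are assumptions, and route-table configuration is omitted for brevity.

```python
# Minimal VPC layout sketch using boto3; CIDR blocks and region are placeholders.
import boto3

ec2 = boto3.client("ec2", region_name="us-east-1")  # assumed region

# A logically isolated network with an address range you choose.
vpc_id = ec2.create_vpc(CidrBlock="10.0.0.0/16")["Vpc"]["VpcId"]

# A public-facing subnet for web servers and a private subnet for backend systems.
public_subnet = ec2.create_subnet(VpcId=vpc_id, CidrBlock="10.0.1.0/24")["Subnet"]["SubnetId"]
private_subnet = ec2.create_subnet(VpcId=vpc_id, CidrBlock="10.0.2.0/24")["Subnet"]["SubnetId"]

# Internet access for the public subnet goes through an internet gateway
# (route-table wiring is omitted in this sketch).
igw_id = ec2.create_internet_gateway()["InternetGateway"]["InternetGatewayId"]
ec2.attach_internet_gateway(InternetGatewayId=igw_id, VpcId=vpc_id)
```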
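
For item 2 (AWS IAM), creating users and groups and granting permissions looks roughly like the sketch below. The group name and user name are illustrative; the ARN shown is the AWS-managed AmazonS3ReadOnlyAccess policy.

```python
# Minimal IAM sketch using boto3: a group, a user, and a managed policy attachment.
import boto3

iam = boto3.client("iam")

iam.create_group(GroupName="developers")          # hypothetical group
iam.create_user(UserName="alice")                 # hypothetical user
iam.add_user_to_group(GroupName="developers", UserName="alice")

# Grant every member of the group read-only access to S3.
iam.attach_group_policy(
    GroupName="developers",
    PolicyArn="arn:aws:iam::aws:policy/AmazonS3ReadOnlyAccess",
)
```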
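
For item 3 (Amazon CloudWatch), setting a metric alarm on an EC2 instance's CPU utilization might look like the sketch below; the instance ID and SNS topic ARN are placeholders.

```python
# Minimal CloudWatch alarm sketch using boto3: alert when an instance's average CPU
# stays above 80% for two consecutive 5-minute periods.
import boto3

cloudwatch = boto3.client("cloudwatch", region_name="us-east-1")  # assumed region

cloudwatch.put_metric_alarm(
    AlarmName="high-cpu-demo",
    Namespace="AWS/EC2",
    MetricName="CPUUtilization",
    Dimensions=[{"Name": "InstanceId", "Value": "i-0123456789abcdef0"}],  # placeholder
    Statistic="Average",
    Period=300,
    EvaluationPeriods=2,
    Threshold=80.0,
    ComparisonOperator="GreaterThanThreshold",
    AlarmActions=["arn:aws:sns:us-east-1:123456789012:ops-alerts"],  # hypothetical topic
)
```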
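
For item 4 (Billing Alarm), once billing alerts are enabled for the account, the estimated-charges metric can be alarmed on like any other CloudWatch metric. The threshold and notification topic below are placeholders; billing metrics are published in the us-east-1 region under the AWS/Billing namespace.

```python
# Minimal billing alarm sketch using boto3; threshold and SNS topic are placeholders.
import boto3

cloudwatch = boto3.client("cloudwatch", region_name="us-east-1")  # billing metrics live here

cloudwatch.put_metric_alarm(
    AlarmName="estimated-charges-over-100-usd",
    Namespace="AWS/Billing",
    MetricName="EstimatedCharges",
    Dimensions=[{"Name": "Currency", "Value": "USD"}],
    Statistic="Maximum",
    Period=21600,            # the metric is updated several times a day
    EvaluationPeriods=1,
    Threshold=100.0,         # alarm when the estimated bill exceeds $100
    ComparisonOperator="GreaterThanThreshold",
    AlarmActions=["arn:aws:sns:us-east-1:123456789012:billing-alerts"],  # hypothetical topic
)
```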