Cloud Computing:

Cloud computing has helped many enterprises transform themselves over the last five years, but experts agree that the market is entering something of a second wave, both for public cloud and private cloud services built and hosted in corporate data centers. The cloud market will accelerate faster in 2020 as enterprises seek to gain efficiencies as they scale their compute resources to better serve customers, says Forrester Research in a new report.

“The No. 1 trend is here come the enterprises,” says Forrester analyst Dave Bartoletti, primary author of the research. “Enterprises with big budgets, data centers and complex applications are now looking at cloud as a viable place to run core business applications.” Forrester says the first wave of cloud computing was created by Amazon Web Services, which launched with a few simple compute and storage services in 2006. A decade later, AWS is operating at an $11 billion run rate.

Forrester found that 38 percent of 1,000-plus North American and European enterprise infrastructure technology decision-makers said that they are building private clouds, with 32 percent procuring public cloud services and the remainder planning to implement some form of cloud technology in the next 12 months. Also, 59 percent of respondents said they were adopting a hybrid cloud model.

Forrester offered the following cloud predictions for 2020:

1.Regional players complement ‘mega-cloud providers’

CIOs who initially elected to build private clouds may find themselves switching to public clouds as they realize just how time-consuming and costly the work will prove. Capital One shuttered private cloud efforts in favor of Amazon Web Services a few years ago, says its CIO Rob Alexander. “We recognized that we were spending a lot of time, energy, effort and management bandwidth to create infrastructure that already exists out there in a much better state and is evolving at a furious pace,” Alexander says.

  • The global public cloud market will top $146 billion in 2020, up from just $87 billion in 2015, growing at a 22 percent compound annual growth rate. The lion’s share of this growth will come from Amazon.com, Microsoft, Google and IBM, which have emerged as “mega-cloud providers,” Bartoletti says. They are opening new data centers and making concessions, such as Microsoft’s agreement to have T-Systems manage its cloud in Germany to meet data localization requirements. But the big players won’t be able to service every unique request, which means smaller regional players will see an uptick in adoption in 2020. Bartoletti recommends: “Keep your options open and don’t be afraid to use multiple providers.”

2.Cloud cost containment

One popular theory is that CIOs will save money by investing in public cloud software, but that’s not always the case. The fact that most CIOs leverage multiple cloud providers means enterprises are already waist-deep in complex cloud vendor management. Also, if companies leave public cloud instances running through the weekend when they don’t need them, CIOs can actually spend more money than they did with on-premises solutions.

  • IT executives will get better at containing cloud costs in 2020 as their best practices mature. And it’s already happening. Bartoletti says that a cloud architect for a large software company shaved $300,000 off of a $2.5 million cloud bill by monitoring his consumption. And cost management tools from the likes of AWS, Cloudability and Cloudyn are available. “There’s no reason in 2020 for your cloud costs to grow out of control,” Bartoletti says.

3.Lift and shift those cloud apps

Forrester also recommends that companies refactor apps to run on public cloud systems, leveraging migration services, rather than simply dumping existing apps into a public cloud. The optimal way to move an application is to rewrite it to take advantage of the cloud’s elasticity, although rewriting can be costly. “In 2020, lift-and-shift migration tools will accelerate the rate of cloud migration, given their low cost for bulk application migrations,” Bartoletti says.

4.Hyperconverge your private cloud

Bartoletti says that while more Forrester clients are citing security as a reason to shift to public cloud services, not every CIO wants to accept the risks associated with entrusting customer and other sensitive data to a third-party vendor. Like their public cloud counterparts, private cloud services require advanced virtualization, standardization, automation, self-service access and resource monitoring. Stitching these capabilities together into a cohesive system is daunting and expensive.

Hyperconverged infrastructure (HCI) solutions promise to help, offering preintegrated compute and storage resources that help organizations get their cloud implementations running faster. Forrester recommends that organizations consider HCI as the foundation for their private cloud development, particularly for new workloads that demand rapid, automated scale-out. “HCI is quickly becoming the default infrastructure platform upon which to build the private portion of a hybrid cloud,” Bartoletti says.

5.There’s a container for that

Containers enable developers to manage software code, particularly software developed for cloud apps. Forrester says that Linux containers will be available in every major public and private cloud platform by early 2020. Developers will consume them directly and often build their own stacks to power microservices development. But the new paradigm means new challenges; companies will need to grapple with new security, monitoring, storage and networking issues that arise as containers are deployed broadly in production. “Your first step should be to evaluate the pros and cons of on-premises private PaaS versus a managed public cloud development platform; you might need both,” Bartoletti says.


6.Enterprise apps come to public cloud

Several companies are hosting enterprise applications in AWS, suggesting that CIOs have become more comfortable hosting critical software in the public cloud. Dollar Shave Club runs Spark analytics software in AWS. Cardinal Health runs Splunk in AWS. Several others are running business apps, such as SAP, in AWS. Bartoletti says you can expect this trend to continue as CIOs rely more heavily on public cloud providers. “Enterprises are turning great ideas into software and insights faster, and the cloud is the best place to get quick insights out of enterprise data,” Bartoletti says.

Blockchain

A blockchain is a public ledger of all bitcoin transactions that have ever been executed. A block is the “current” part of a blockchain, which records some or all of the recent transactions; once completed, it goes into the blockchain as a permanent record. Each time a block is completed, a new block is generated. Blocks are linked to each other (like a chain) in proper linear, chronological order, with every block containing a hash of the previous block. To use conventional banking as an analogy, the blockchain is like a full history of banking transactions.
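To make the hash-linking concrete, here is a minimal Python sketch of how each block can embed the hash of its predecessor; the field names are illustrative rather than those of any real blockchain implementation.

    import hashlib
    import json
    import time

    def block_hash(block):
        # Hash the block's contents deterministically.
        encoded = json.dumps(block, sort_keys=True).encode()
        return hashlib.sha256(encoded).hexdigest()

    def new_block(transactions, previous_hash):
        # Each block stores the hash of the block before it,
        # which is what links the chain together.
        return {
            "timestamp": time.time(),
            "transactions": transactions,
            "previous_hash": previous_hash,
        }

    chain = [new_block(["genesis"], "0" * 64)]  # the first block has no real predecessor
    chain.append(new_block(["alice pays bob 5"], block_hash(chain[-1])))
    chain.append(new_block(["bob pays carol 2"], block_hash(chain[-1])))

    # The chain is intact only if every block still points at its predecessor's hash.
    assert chain[2]["previous_hash"] == block_hash(chain[1])

Because each hash covers a block's full contents, altering an old transaction changes that block's hash and breaks every later previous_hash link, which is what makes the ledger tamper-evident.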


Blockchain Will Play an Imperative Role in Capital Markets:

Blockchain technology has been acknowledged as one of the most disruptive innovations since the advent of the Internet. The financial industry has also started looking to leverage it to store and transfer value across financial instruments. Capital markets is one area of the financial space where industry experts are optimistic about the use of blockchain technology.

How Does Blockchain Work?

  • Blockchain keeps a record of every data exchange; this record is referred to as a “ledger” in the cryptocurrency world, and each data exchange is a “transaction”. Every verified transaction is added to the ledger as a block.

  • It utilizes a distributed system to verify each transaction: a peer-to-peer network of nodes.

  • Once signed and verified, the new transaction is added to the blockchain and cannot be altered.

Blockchain Technology In Various Industries:

Blockchain technology can be utilized in multiple industries, including Financial Services, Healthcare, Government, Travel and Hospitality, and Retail and CPG.

  • In the financial services sector, blockchain technology has already been implemented in many innovative ways. Blockchain technology simplifies and streamlines the entire process associated with asset management and payments by providing an automated trade lifecycle where all participants would have access to the exact same data about a transaction.

  • Blockchain can play a key role in the healthcare sector by increasing the privacy, security and interoperability of healthcare data. It holds the potential to address many interoperability challenges in the sector and enable secure sharing of healthcare data among the various entities and people involved in the process. It eliminates the interference of a third party and also avoids the overhead costs.

  • Blockchain technology holds the power to transform government operations and services. It can play a key role in improving the data transactional challenges in the government sector, which currently works in silos.

  • The proper linking and sharing of data with blockchain enables better management of data between multiple departments. It improves transparency and provides a better way to monitor and audit transactions.

  • There is a huge opportunity for Blockchain technology to be applied in the retail sector.

  • This includes everything from ensuring the authenticity of high-value goods, preventing fraudulent transactions, locating stolen items and enabling virtual warranties to managing loyalty points and streamlining supply chain operations.

  • Hospitality: The application of blockchain can radically change the travel and hospitality industry. It can be applied to money transactions, storing important documents like passports and other identification cards, reservations, and managing travel insurance, loyalty and rewards.

Summary

Blockchain technology is both intriguing and exciting. Could it be the revolution that gurus predict? Or is it just a speculative bubble based on an impractical idea? After reading a lot on the matter, we couldn't form a definitive opinion.

DevOps

DevOps represents a change in IT culture, focusing on rapid IT service delivery through the adoption of agile, lean practices in the context of a system-oriented approach. DevOps emphasizes people and seeks to improve collaboration between operations and development teams. DevOps implementations utilize technology, especially automation tools that can leverage an increasingly programmable and dynamic infrastructure from a life-cycle perspective.

How Does DevOps Work?

1. Collaboration:

The foundation of DevOps success is how well teams and individuals collaborate across the enterprise to get things done more rapidly, efficiently and effectively.

2. Automation:

The continuous integration principle of agile development has a cultural implication for the development group. Forcing developers to integrate their work with other developers’ work frequently, at least daily, exposes integration issues and conflicts much earlier than is the case with waterfall development. Developers have to communicate with each other much more frequently, a process that runs counter to the image of the solitary genius coder working alone.
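As a rough illustration of this loop, the hypothetical Python sketch below polls a shared Git repository and runs the test suite whenever new commits land, surfacing integration problems shortly after each merge; real teams would rely on a CI server such as Jenkins rather than a hand-rolled script like this.

    import subprocess
    import time

    def git_output(*args):
        # Run a git command in the current repository and return its output.
        return subprocess.check_output(["git", *args], text=True).strip()

    def run_tests():
        # Any test runner works here; pytest is just one common choice.
        return subprocess.call(["python", "-m", "pytest", "-q"]) == 0

    last_seen = git_output("rev-parse", "HEAD")
    while True:
        subprocess.check_call(["git", "pull", "--ff-only"])   # integrate latest work
        head = git_output("rev-parse", "HEAD")
        if head != last_seen:                                 # new commits arrived
            print("New commit", head[:8], "- running tests")
            print("PASS" if run_tests() else "FAIL: integration issue caught early")
            last_seen = head
        time.sleep(60)                                        # poll every minute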

3. Continuous Testing:

Continuous testing is not just a QA function; in fact, it starts in the development environment. The days are over when developers could simply throw the code over the wall to QA and say, “Have at it.” In a DevOps environment, quality is everyone’s job. Developers build quality into the code and provide test data sets. QA engineers configure automated test cases and the testing environment.

4. Continuous Delivery:

The team at Amazon Web Services defines continuous delivery as a DevOps software development practice where code changes are automatically built, tested and prepared for release to production. It expands upon continuous integration by deploying all code changes to a testing environment and/or a production environment after the build stage.

5. Continuous Monitoring:

With continuous monitoring, teams measure the performance and availability of software to improve stability. Continuous monitoring helps identify the root causes of issues quickly, to proactively prevent outages and minimize user issues. Some monitoring experts even advocate that the definition of a service must include monitoring; they see it as integral to service delivery.

Best DevOps Tools

1. Atlassian JIRA:

JIRA is commercial software, and licenses need to be procured for on-premises installation based on the number of users. The tool can be used for agile project management and defect and issue tracking.

As mentioned before, a defined process is a prerequisite for DevOps implementation, so project managers can use JIRA to create the product backlog and sprint backlogs and to maintain full traceability from epic to user story down to test artifacts such as test cases.

2. Eclipse:

Developers typically write code in an IDE such as Eclipse and commit it to a version control repository such as Git/GitHub, which supports team development. Every developer downloads the code from the version control repository, makes changes and commits the code back to Git/GitHub.

3. Git – Version Control Tool:

One of the fundamental building blocks of any CI setup is a strong version control system. Even though there are different version control tools in the market today, such as SVN, ClearCase, RTC and TFS, Git fits in very well as a popular, distributed version control system for teams located in different geographical locations.

It is an open-source tool and supports most standard version control features: check-in, commit, branching, merging, labels, and push and pull to and from GitHub.

4. Kiuwan:

Kiuwan adds security into DevOps, understanding how SAST and SCA testing should be implemented. With a unique distributable engine, pipelines are never at risk of bottlenecks, and time to market is improved while the most stringent security checks remain in place. With a DevSecOps approach, Kiuwan achieves outstanding benchmark scores (OWASP, NIST, CWE, etc.) and offers a wealth of features that go beyond static analysis, catering to every stakeholder in the SDLC.

5. Jenkins:

Jenkins is a free, open-source continuous integration tool or server that helps automate the activities mentioned above: build, code analysis and storing the artifacts. A schedule is defined in Jenkins to initiate each build; this in turn depends on the completion of tasks assigned to individual developers and their due dates, all of which are planned in a sprint backlog in JIRA as discussed initially.

Running all builds on one single machine is typically not a good option; Jenkins addresses this with its master-slave feature, which distributes builds across multiple machines. Jenkins can also help automate deployments to app servers such as Tomcat, JBoss and WebLogic through plugins, as well as to container platforms like Docker.

Artificial Intelligence

Artificial intelligence (AI) is the ability of a machine or a computer program to think and learn. The concept of AI is based on the idea of building machines capable of thinking, acting, and learning like humans.

Artificial intelligence (AI) is an area of computer science that emphasizes the creation of intelligent machines that work and react like humans. Research associated with artificial intelligence is highly technical and specialized. While people falsely consider AI a technology, the more accurate approach would be seeing it as a broad concept in which machines are able to deal with tasks in a way we would call intelligent or smart.

Examples of AI technology:

AI is incorporated into a variety of different types of technology. Here are a few examples:

  • Automation: What makes a system or process function automatically. For example, robotic process automation (RPA) can be programmed to perform high-volume, repeatable tasks that humans normally performed. RPA is different from IT automation in that it can adapt to changing circumstances.

  • Machine learning: The science of getting a computer to act without explicit programming. Deep learning is a subset of machine learning that, in very simple terms, can be thought of as the automation of predictive analytics. There are three types of machine learning: a) supervised learning, b) unsupervised learning and c) reinforcement learning.

  • Machine vision: The science of allowing computers to see. This technology captures and analyzes visual information using a camera, analog-to-digital conversion and digital signal processing.

  • Robotics: A field of engineering focused on the design and manufacturing of robots. Robots are often used to perform tasks that are difficult for humans to perform or to perform consistently. They are used in assembly lines for car production or by NASA to move large objects in space.

  • Self-driving cars: These use a combination of computer vision, image recognition and deep learning to build automated skill at piloting a vehicle while staying in a given lane and avoiding unexpected obstructions, such as pedestrians.

HOW DOES ARTIFICIAL INTELLIGENCE WORK:

Artificial intelligence works by combining high volumes of data with intelligent, fast algorithms and iterative processing, thereby allowing the software to learn automatically from the characteristics of that data. The value of AI applications lies mainly in the fact that computers can do things faster than humans: interpreting images and text, finding patterns in figures, and running intelligent searches.

ARTIFICIAL INTELLIGENCE APPLICATIONS:

1. AI in healthcare:

The biggest bets are on improving patient outcomes and reducing costs. Companies are applying machine learning to make better and faster diagnoses than humans. Such a system understands natural language and is capable of responding to questions asked of it; it mines patient data and other available data sources to form a hypothesis.

2. AI in business:

Robotic process automation is being applied to highly repetitive tasks normally performed by humans. Machine learning algorithms are being integrated into analytics and CRM platforms to uncover information on how to better serve customers. Chatbots have been incorporated into websites to provide immediate service to customers.

3. AI in education:

AI can automate grading, giving educators more time. AI can assess students and adapt to their needs, helping them work at their own pace. AI tutors can provide additional support to students, ensuring they stay on track.

4. AI in law:

The discovery process, sifting through documents, in law is often overwhelming for humans. Automating this process is a more efficient use of time. Startups are also building question-and-answer computer assistants that can sift through programmed-to-answer questions by examining the taxonomy and ontology associated with a database.

5. AI in finance:

AI in personal finance applications, such as Mint or TurboTax, is disrupting financial institutions. Applications such as these collect personal data and provide financial advice. Other programs, such as IBM Watson, have been applied to the process of buying a home.

CHALLENGING TASKS FOR ARTIFICIAL INTELLIGENCE:

AI successfully diagnoses medical scans, connects trillions of data points to discover new patterns, interprets application letters and CVs, and tracks down fraudsters. The general trend is for AI to replace boring, lower-cognitive and repetitive human tasks. Small time savings can add up to fairly large amounts: AI can realize an efficiency improvement of two to five times for a given process step, and with full automation, gains of up to a hundred times can be achieved.

Analytics

Analytics is the scientific process of discovering and communicating the meaningful patterns which can be found in data. It is concerned with turning raw data into insight for making better decisions. Analytics relies on the application of statistics, computer programming and operations research to quantify and gain insight into the meaning of data. It is especially useful in areas that record a lot of data or information.

Analytics is an encompassing and multidimensional field that uses mathematics, statistics, predictive modeling and machine learning techniques to find meaningful patterns and knowledge in recorded data.

Why is analytics important?

Analytics can change how we look at the world, usually for the better. Sometimes we think that a process is already working at its best, but the data tells us otherwise, so analytics helps us improve our world. Organizations usually apply analytics to describe, predict and then improve organizational performance. Specifically, it helps in the following areas: web analytics, fraud analysis, risk analysis, advertising and marketing, enterprise decision management, market optimization and market modelling.

Life cycle of analytics

  • Business units specify the need, scope, market conditions and goals related to the business question they want to solve, which will lead to the selection of one or more modeling techniques.

  • Depending on the business question and proposed analysis methods, this step involves using specialized techniques to locate, access, clean and prepare the data for optimal results. In our multifaceted data world, that could mean data from transactional systems, unstructured text files and data warehouses.

  • Now it’s time to explore the data in an interactive and visual fashion to quickly identify relevant variables, trends and relationships. (The shape of the data when variables are plotted out is called the distribution of the data; you can use these shapes to identify patterns.)

  • A skilled analyst or modeler builds the model using statistical, data mining or text mining software, including the critical capability of transforming and selecting key variables. Models need to be built rapidly so modelers can use trial and error to find the model that produces the best results.

  • Once built, the model is registered, tested (or validated), approved and declared ready for use against your data. With a centralized model repository, you can store extensive documentation about the model, scoring code and associated metadata (data about the data) for the collaborative sharing and version control necessary for auditing purposes.

  • When approved for production use, the model is applied to new data to generate predictive insights.
  • The predictive performance of the model is monitored to ensure it is up to date and delivering valid results. If the model’s performance degrades, it’s time to make changes; when it no longer works or serves a business need, it is retired. (A minimal code sketch of these final stages follows this list.)
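As a minimal illustration of those final stages, the Python sketch below builds a model, registers it to a file, scores new data and monitors accuracy; the file name and the 0.8 threshold are invented for the example.

    import pickle
    from sklearn.datasets import make_classification
    from sklearn.linear_model import LogisticRegression
    from sklearn.metrics import accuracy_score
    from sklearn.model_selection import train_test_split

    X, y = make_classification(n_samples=500, random_state=0)
    X_train, X_new, y_train, y_new = train_test_split(X, y, random_state=0)

    # Build and validate the model.
    model = LogisticRegression(max_iter=1000).fit(X_train, y_train)

    # "Register" it: persist the model with metadata for versioning and audit.
    with open("model_v1.pkl", "wb") as f:
        pickle.dump({"version": 1, "model": model}, f)

    # Deploy: apply the registered model to new data for predictive insights.
    with open("model_v1.pkl", "rb") as f:
        registered = pickle.load(f)["model"]
    predictions = registered.predict(X_new)

    # Monitor: if performance degrades below a threshold, retrain or retire.
    score = accuracy_score(y_new, predictions)
    print(f"accuracy={score:.2f}", "- OK" if score > 0.8 else "- needs retraining")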

Advanced Analytics

IDC research shows SAS with a commanding 30.8 percent market share in advanced analytics, more than twice that of our nearest competitor. We dominate the market because we know it's not just how advanced the technology is that matters; it's how far it can advance your organization.

1 Data Mining:

Want to know what will happen in the future? Find the most lucrative opportunities? Get insights into impending outcomes.

a)SAS Econometrics

Analyze complex business and economic scenarios, providing a scientific basis for better decision making.

b)SAS Forecast Server

Produce large numbers of forecasts quickly and automatically to improve planning and decision making.

c)SAS Visual Forecasting

Generate large numbers of reliable forecasts quickly and automatically in an open environment.

2 Statistical Analysis:

Whether you're analysing customer data, crunching sales numbers, monitoring supply chain operations or trying to detect fraud, apply powerful statistical analysis to all your data to get the most accurate answers.

a) SAS Visual Statistics

Create and modify predictive models faster than ever using a visual interface and in-memory processing.

b)SAS® In-Memory Statistics

Find insights in big data with a single environment that moves you quickly through each phase of the analytical life cycle.

c) SAS® Analytics Pro

Access, manipulate, analyse and present information with a comprehensive analytical toolset that combines statistical analysis, reporting and high-impact visuals.

3 Forecasting:

Generate large quantities of high-quality forecasts quickly and automatically, with no need for human intervention. And streamline your forecasting processes so you can focus efforts on the most important, high-value decisions.

a)SAS Econometrics

Analyze complex business and economic scenarios, providing a scientific basis for better decision making.

b)SAS Visual Forecasting

Generate large numbers of reliable forecasts quickly and automatically in an open environment.

4 Optimization & Simulation:

Identify the scenarios that will produce the best outcomes.

a)SAS/OR

Optimize business processes and address challenges with enhanced operations research methods.

b)SAS® Optimization

Find optimal solutions to complex business and planning problems faster than ever.

c)SAS Simulation Studio

Build models that mimic complex, real-life systems so you can better understand and optimize them using discrete event simulation.

Gaming

Gaming is the running of specialized applications known as electronic games or video games on game consoles like Xbox and PlayStation or on personal computers (in which case the activity is known as online gaming). The term "gaming" originated as a synonym for "gambling", although most electronic games today do not involve gambling in the traditional sense. Gaming refers to playing electronic games, whether through consoles, computers, mobile phones or another medium altogether. Gaming is a nuanced term that suggests regular gameplay, possibly as a hobby.

The independent game industry has seen a substantial rise in recent years with the growth of new online distribution systems, such as Steam and Uplay, as well as the mobile game market, such as for Android and iOS devices.

Roles and Responsibilities of Development team:

  • A game designer is a person who designs gameplay, conceiving and designing the rules and structure of a game. Development teams usually have a lead designer who coordinates the work of other designers and is the main visionary of the game.

  • A game artist is a person who creates video game art. The artist's job may be 2D-oriented or 3D-oriented. 2D artists may produce concept art, sprites, textures, environmental backdrops or terrain images, and user interface elements. 3D artists may produce models or animation, 3D environments and cinematics. Artists sometimes occupy both roles.

  • A game programmer is a software engineer who develops video games or related software. The game's code base development is handled by programmers. There are usually one to several lead programmers.

  • Quality assurance is carried out by game testers. A game tester analyzes video games to document software defects as part of quality control. Testing is a highly technical field requiring computing expertise and analytic competence.

Tools for game design and development:

The world of gaming is fast paced and hugely exciting, especially with the ongoing developments and projects being created using virtual reality applications. But this can be a daunting environment if you are thinking of building your experience. There are a number of tools and resources around to help make your game or app a success.

1. Visual Studio:

Visual Studio, Microsoft's developer tool suite, has been around for years, which is a clear indication of just how popular the software is among designers and developers. A fully featured integrated development environment (IDE) for Android, iOS, Windows, web and cloud, Visual Studio offers productive development tools and powerful services.

2.Assembla:

A highly efficient and capable project management tool with built-in code repositories, Assembla is quickly becoming the resource of choice for building the latest games. Boasting a number of incredibly handy features, Assembla allows individual developers and teams to manage every aspect of a project, from ideation to production, coding to communication, all in one place. Assembla is also the number one SVN provider in the world and features integration with the leading communication app Slack, meaning the project team and clients can work together in one platform, helping games launch on time. Assembla provides everything you need to manage all your tasks, team and code in just one place, a must-have tool for any game developer.

3.Unreal Engine:

This renowned game development tool has been used to create hit games on just about every platform. Made by game developers, for game developers, Unreal Engine provides everything you need to make your next project a success, whether it be a simple 2D game, a console blockbuster or a virtual reality app. Designed for mobile, now and in the future, Unreal Engine 4's features give you the power to develop your game and seamlessly deploy to iOS and Android devices.

4.Evernote:

It may seem obvious to say, but every designer and developer needs a place to record even the smallest of ideas and be able to access them anywhere. Evernote, one of the most popular note-taking apps around, allows users to do just that, enabling the capture, nurturing and sharing of ideas across any device.

5.Blender:

Blender is an open-source 3D content creation suite, available for all major operating systems. Started by Blender Foundation founder Ton Roosendaal back in 2002, Blender is now the largest open-source tool for 3D creation. Its makers are constantly working on its development, and you can do pretty much anything 3D-related with this software, including modelling, texturing, animation, rendering and compositing.

Photoshop:

Speaking of 3D models, you're going to need to texture those assets, and there's no better program to start with than Photoshop. The go-to tool for creative professionals, Photoshop provides an extensive, dedicated toolset for creating and texturing your game assets.

Substance:

3D painting software Substance also offers a way to paint your 3D assets in a procedural and natural way. A popular tool among 3D and digital artists, Substance features a complete and familiar set of texturing tools: PBR and particle painting, Substance procedural effects, smart materials and smart masks, and fast baking.

Corona SDK:

It is a software development kit that is available on Windows and OS X and uses Lua as a scripting language. Using Corona, one can develop mobile games for free. However, to create a game or app with more elaborate features, you need to opt for the enterprise model, which offers native libraries and APIs. Corona SDK renders with OpenGL, and its built-in scene management and transition library helps adjust and modify your game's graphical qualities. To make game development easier, Corona Labs offers the Corona Editor and a text plugin; you can create a graphical environment to design different levels and understand the interactions between objects.

SpriteKit:

Available on iOS and OS X, SpriteKit is Apple's proprietary 2D game development framework, supporting both Swift and Objective-C. SpriteKit offers great convenience to developers. With SKView, scene management is made easy, and the SKAction class can be leveraged to move, scale or rotate game objects. It also supports sound and custom code. SpriteKit offers a scene editor that enables the design of levels, and a particle editor that helps with developing different particle systems. It uses Box2D for its physics engine and provides a built-in camera through the SKCameraNode class, which makes navigation easy.

Unity:

It is a mobile game development engine that supports C# and UnityScript, Unity's own JavaScript-like language. It comes in free as well as professional editions. It is a cross-platform tool and is deployable to many platforms. Like other tools, its built-in editor allows you to edit images and organize animations from the animator window.

Marmalade:

It is a fast, high-performance cross-platform engine for the creation of 2D and 3D games. The SDK can be used to code in C++. The Marmalade Quick version supports app development using the Lua scripting language, whereas Marmalade Web facilitates creating hybrid apps using HTML5, CSS and JavaScript. Marmalade is an award-winning platform that is popular among top game developers for its ability to build native games for both mobile and desktop. Marmalade Juice is another mobile game development tool that supports easy porting of iOS games to the Android ecosystem.

NextPeer:

NextPeer is a multiplayer social SDK that addresses the issue of multiplayer gaming on mobile. It supports both types of gameplay: synchronous and asynchronous. NextPeer enhances the quality of the gaming experience and player engagement, and helps developers achieve maximum user retention. It facilitates real-time interactions and live rankings in order to make games more interesting and real, using a delayed synchronous technology.

Summary:

In this section we explored some of the popular game development tools, engines and frameworks preferred by today's developers. Comparing them gives better insight into which one best fits your development requirements, which will be useful for planning your next game development project.

Amazon Web Services

Amazon Web Services (AWS) is a secure cloud services platform, offering compute power, database storage, content delivery and other functionality to help businesses scale and grow. Explore how millions of customers are currently leveraging AWS cloud products and solutions to build sophisticated applications with increased flexibility, scalability and reliability.


A Broad IT Infrastructure Platform :

The AWS Cloud provides a broad set of infrastructure services, such as computing power, storage options, networking and databases, delivered as a utility: on-demand, available in seconds, with pay-as-you-go pricing.

  • From data warehousing to deployment tools, directories to content delivery, over 50 services are available in just a few mouse clicks with AWS. New services are quick to provision, without upfront capital expense, allowing enterprises, start-ups, SMBs and customers in the public sector to access the building blocks they need to respond quickly to changing business requirements.

  • After almost a decade of working closely with organizations as diverse as Pinterest, GE and MLB, the AWS Cloud allows customers to pin, power and play ball in entirely new ways.

  • Security in the cloud is recognized as better than on-premises. Broad security certification and accreditation, data encryption at rest and in-transit, hardware security modules and strong physical security all contribute to a more secure way to manage your business.

Hybrid Capabilities :

Choosing between your existing investment in infrastructure and moving to the cloud is not a binary decision. Deep features, dedicated connectivity, identity federation and integrated tools allow you to run ‘hybrid’ applications across on-premises and cloud services.


Popular Products :

  • Amazon Elastic Compute Cloud (Amazon EC2) is a web service that provides secure, resizable compute capacity in the cloud. It is designed to make web-scale cloud computing easier for developers. Its simple web service interface allows you to obtain and configure capacity with minimal friction. It provides you with complete control of your computing resources and lets you run on Amazon’s proven computing environment. Amazon EC2 reduces the time required to obtain and boot new server instances to minutes, allowing you to quickly scale capacity, both up and down, as your computing requirements change. Amazon EC2 changes the economics of computing by allowing you to pay only for capacity that you actually use. (A brief usage sketch follows this list.)

  • Amazon Relational Database Service (Amazon RDS) makes it easy to set up, operate, and scale a relational database in the cloud. It provides cost-efficient and resizable capacity while automating time-consuming administration tasks such as hardware provisioning, database setup, patching and backups. It frees you to focus on your applications so you can give them the fast performance, high availability, security and compatibility they need.


  • Amazon RDS is available on several database instance types - optimized for memory, performance or I/O - and provides you with six familiar database engines to choose from, including Amazon Aurora, PostgreSQL, MySQL, MariaDB, Oracle Database, and SQL Server.


  • Companies today need the ability to simply and securely collect, store, and analyze their data at a massive scale. Amazon S3 is object storage built to store and retrieve any amount of data from anywhere – web sites and mobile apps, corporate applications, and data from IoT sensors or devices. It is designed to deliver 99.999999999% durability, and stores data for millions of applications used by market leaders in every industry. S3 provides comprehensive security and compliance capabilities that meet even the most stringent regulatory requirements. It gives customers flexibility in the way they manage data for cost optimization, access control, and compliance. S3 provides query-in-place functionality, allowing you to run powerful analytics directly on your data at rest in S3. And Amazon S3 is the most supported cloud storage service available, with integration from the largest community of third-party solutions, systems integrator partners, and other AWS services.


  • AWS Lambda lets you run code without provisioning or managing servers. You pay only for the compute time you consume - there is no charge when your code is not running. With Lambda, you can run code for virtually any type of application or backend service - all with zero administration. Just upload your code and Lambda takes care of everything required to run and scale your code with high availability. You can set up your code to automatically trigger from other AWS services or call it directly from any web or mobile app.
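As a rough sketch of how such services are driven programmatically, the hypothetical Python example below uses boto3, the AWS SDK for Python, to store an object in S3 and invoke a Lambda function; the bucket and function names are placeholders, and AWS credentials are assumed to be configured.

    import json
    import boto3

    # Clients pick up credentials and region from the environment,
    # ~/.aws configuration, or an attached IAM role.
    s3 = boto3.client("s3")
    lam = boto3.client("lambda")

    # Store an object in S3 (the bucket name is a placeholder).
    s3.put_object(
        Bucket="my-example-bucket",
        Key="reports/latest.json",
        Body=json.dumps({"status": "ok"}).encode(),
    )

    # Invoke a Lambda function synchronously (the function name is a placeholder).
    response = lam.invoke(
        FunctionName="my-example-function",
        Payload=json.dumps({"key": "reports/latest.json"}).encode(),
    )
    print(response["Payload"].read().decode())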


Platform Foundations :

    1.Amazon VPC

    Amazon Virtual Private Cloud (Amazon VPC) lets you provision a logically isolated section of the AWS Cloud where you can launch AWS resources in a virtual network that you define. You have complete control over your virtual networking environment, including selection of your own IP address range, creation of subnets, and configuration of route tables and network gateways. You can use both IPv4 and IPv6 in your VPC for secure and easy access to resources and applications. You can easily customize the network configuration for your Amazon VPC. For example, you can create a public-facing subnet for your web servers that has access to the Internet, and place your backend systems such as databases or application servers in a private-facing subnet with no Internet access.

    2.AWS IAM

    AWS Identity and Access Management (IAM) enables you to manage access to AWS services and resources securely. Using IAM, you can create and manage AWS users and groups, and use permissions to allow and deny their access to AWS resources. IAM is a feature of your AWS account offered at no additional charge. You will be charged only for use of other AWS services by your users.

    3.Amazon CloudWatch

    Amazon CloudWatch is a monitoring and management service built for developers, system operators, site reliability engineers (SRE), and IT managers. CloudWatch provides you with data and actionable insights to monitor your applications, understand and respond to system-wide performance changes, optimize resource utilization, and get a unified view of operational health. CloudWatch collects monitoring and operational data in the form of logs, metrics, and events, providing you with a unified view of AWS resources, applications and services that run on AWS, and on-premises servers. You can use CloudWatch to set high resolution alarms, visualize logs and metrics side by side, take automated actions, troubleshoot issues, and discover insights to optimize your applications, and ensure they are running smoothly.

    4.Billing Alarm

    You can monitor your estimated AWS charges using Amazon CloudWatch. When you enable the monitoring of estimated charges for your AWS account, the estimated charges are calculated and sent several times daily to CloudWatch as metric data, and you can set an alarm on that metric (as sketched below).
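    A minimal boto3 sketch of such a billing alarm, assuming credentials are configured; the $100 threshold is purely illustrative, and an SNS topic ARN would normally be supplied in AlarmActions to send the actual notification.

        import boto3

        # Billing metrics are published in us-east-1 regardless of your region.
        cloudwatch = boto3.client("cloudwatch", region_name="us-east-1")

        # Alarm when estimated monthly charges exceed $100.
        cloudwatch.put_metric_alarm(
            AlarmName="billing-over-100-usd",
            Namespace="AWS/Billing",
            MetricName="EstimatedCharges",
            Dimensions=[{"Name": "Currency", "Value": "USD"}],
            Statistic="Maximum",
            Period=21600,                 # billing data arrives every few hours
            EvaluationPeriods=1,
            Threshold=100.0,
            ComparisonOperator="GreaterThanThreshold",
        )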

Software as a Service (SaaS)

Software as a Service (SaaS) is a software distribution model in which a third-party provider hosts applications and makes them available to customers over the Internet. SaaS is one of three main categories of cloud computing, alongside infrastructure as a service (IaaS) and platform as a service (PaaS).

SaaS is closely related to the application service provider (ASP) and on-demand computing software delivery models. The hosted application management model of SaaS is similar to ASP, where the provider hosts the customer’s software and delivers it to approved end users over the Internet. In the software-on-demand SaaS model, the provider gives customers network-based access to a single copy of an application that the provider created specifically for SaaS distribution. SaaS applications are available for fundamental business technologies such as email, sales management, customer relationship management (CRM), financial management, human resource management (HRM), billing and collaboration. Leading SaaS providers include Salesforce, Oracle, SAP, Intuit and Microsoft.

SaaS is a natural evolution of software. The old model of shipping physical DVDs and installing on local servers was the only realistic solution for many years, and the client-server model is still required for many scenarios. That said, in recent years a number of developments have allowed SaaS to become mainstream. One factor is bandwidth; the internet is simply faster than it was a decade ago. Other major factors include the evolution of virtualization and of big data tools. All these advances have made it much easier for providers to scale and manage their own infrastructure and thus provide SaaS solutions.


Advantages of SaaS

SaaS removes the need for organizations to install and run applications on their own computers or in their own data centers. This eliminates the expense of hardware acquisition, provisioning and maintenance, as well as software licensing, installation and support.

  • SaaS solutions reside in cloud environments that are scalable and integrate with other SaaS offerings. Compared with the traditional model, users do not have to buy another server or software; they only need to enable a new SaaS offering, and, in terms of server capacity planning, the SaaS provider owns that. SaaS providers generally offer many subscription options and the flexibility to change subscriptions as and when needed, e.g. when your business grows or more users need to access the service. SaaS, and more widely cloud computing, can help you make the most of a limited IT budget while giving you access to the latest technology and professional support.

  • With the software housed on the server, it can be upgraded centrally, as opposed to the traditional model where the software would need to be upgraded on each machine. In other words, SaaS can easily be maintained with the latest version of the software at all times: customers can rely on a SaaS provider to automatically perform updates and patch management.

  • It isn’t just semantics. The cloud refers to a set of incredibly complex infrastructure technology. At a fundamental level, it’s a collection of computers, servers, and databases that are connected together in a way that users can lease access to share their combined power. The cloud can refer to anything that’s hosted remotely and delivered via the Internet. While all cloud programs are run by underlying software, SaaS refers specifically to business software applications that are delivered via the cloud. Given the widespread growth of cloud accessibility, it’s been easier, faster and less expensive for SaaS developers to roll out applications as compared to traditional on-premise software development. Today, nearly every type of core business function from human resources to enterprise resource planning is available via SaaS.

  • SaaS has a cost advantage since it usually resides in a shared or multitenant environment where the hardware and software license costs are low compared with the traditional model. Maintenance costs are reduced as well, since the SaaS provider owns the environment and it is split among all customers that use that solution.

    Generally, they pay for this service on a monthly basis using a pay-as-you-go model. Transitioning costs to a recurring operating expense allows many businesses to exercise better and more predictable budgeting. Users can also terminate SaaS offerings at any time to stop those recurring costs.

  • In the SaaS model, the software application is already installed and configured. Users can provision the server for the cloud and quickly have the application ready for use. This cuts the time to benefit and allows for rapid demonstrations and prototyping. With many SaaS companies offering free trials, this means a painless proof of concept and discovery phase to prove the benefit to the organization. Whether in demo mode or actually going live, intuitive interfaces for order writing and rapid uploading of electronic catalogs and customer lists give sales people immediate benefits without long wait times or steep learning curves.


Internet of Things

The Internet of Things connects the physical world to the Internet so that you can use data from devices to increase productivity and efficiency. Connecting things to the Internet is possible because different connectivity options are widely available, the cost of connecting is declining, and more devices are capturing data. Consumers expect everything in their home, everything they drive, and even what they wear to have IoT capabilities. For example, Rotimatic, a smart, fully automated flatbread-making device, uses AWS IoT to securely transfer data to and from the cloud. Industrial companies use IoT applications to make industrial equipment and processes smarter and safer.

How does the Internet of Things work?

Today, the Internet of Things connects physical devices embedded with electronics, software, sensors, and actuators to the cloud and to each other. Devices communicate through different protocols, and many, such as MQTT, were designed to tolerate intermittent connections and reduce network bandwidth requirements. All IoT communication must be secured using preventive security mechanisms such as device identity management, encryption, and access control, as well as device auditing and monitoring.
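As a small illustration of device-side messaging, the sketch below publishes a sensor reading over MQTT using the paho-mqtt Python library; the endpoint, certificate paths and topic are placeholders for whatever your broker (for example, AWS IoT Core) requires.

    import json
    import paho.mqtt.publish as publish  # pip install paho-mqtt

    # One QoS-1 publish ("at least once" delivery), suited to the intermittent
    # links common in IoT. Brokers such as AWS IoT Core require mutual TLS
    # on port 8883; all names and paths below are placeholders.
    reading = {"device": "sensor-001", "temperature_c": 21.7}

    publish.single(
        topic="sensors/room1/temperature",
        payload=json.dumps(reading),
        qos=1,
        hostname="example-endpoint.iot.us-east-1.amazonaws.com",
        port=8883,
        tls={
            "ca_certs": "root-ca.pem",
            "certfile": "device-cert.pem",
            "keyfile": "device-key.pem",
        },
    )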

Why is the Internet of Things important?

The Internet of Things is important because it makes previously unusable data available. IoT applications tap into device data and let you visualize, explore, and build sophisticated analytics such as machine learning in the cloud. IoT applications can also run on devices so they can respond in real time as events unfold. IoT applications are deployed across a wide range of use cases, including connected homes, connected vehicles, healthcare, industrial, retail, and many more. Connected homes are safer, cleaner, and more energy efficient. Amway has successfully launched its first Internet-connected product, the Atmosphere Sky Air Treatment System, using AWS IoT to build policies and security throughout the entire architecture.

Using AWS for IoT applications:

AWS IoT services enable you to easily and securely connect and manage billions of devices. You can gather data from, run sophisticated analytics on, and take actions in real time on your diverse fleet of IoT devices, from the edge to the cloud.

1. AWS IoT Core:

AWS IoT Core is a managed cloud service that lets connected devices easily and securely interact with cloud applications and other devices. AWS IoT Core provides secure communication and data processing across different kinds of connected devices and locations so you can easily build IoT applications.

2. AWS IoT Device Management:

As many IoT deployments consist of hundreds of thousands to millions of devices, it is essential to track, monitor, and manage connected device fleets and to ensure they work properly and securely. AWS IoT Device Management makes it easy to securely onboard, organize, monitor, and remotely manage IoT devices at scale.

3. Amazon FreeRTOS:

It is an operating system for microcontrollers that makes small, low-power edge devices easy to program, deploy, secure, connect, and manage. Amazon FreeRTOS is based on the FreeRTOS kernel, a popular open source operating system for microcontrollers, and extends it with software libraries that make it easy to securely connect your small, low-power devices to AWS cloud services.

4. AWS Greengrass:

AWS Greengrass is software that lets you run local compute, messaging, data caching, sync, and ML inference capabilities for connected devices in a secure way. With AWS Greengrass, connected devices can run AWS Lambda functions, keep device data in sync, and communicate with other devices securely.

5. AWS IoT Device Defender:

AWS IoT Device Defender is a fully managed service that helps you secure your fleet of IoT devices. A device configuration is a set of technical controls you set to help keep information secure when devices communicate with each other and with the cloud. Device Defender makes it easy to maintain and enforce IoT configurations, such as ensuring device identity, authenticating and authorizing devices, and encrypting device data.

6. AWS IoT Analytics:

AWS IoT Analytics is a fully-managed service that makes it easy to run and operationalize sophisticated analytics on massive volumes of IoT data without having to worry about the cost and complexity typically required to build an IoT analytics platform. AWS IoT Analytics automates each of the difficult steps that are required to analyse data from IoT devices. AWS IoT Analytics filters, transforms, and enriches IoT data before storing it in a time-series data store for analysis.

7. Few more IoT Applications:

Home and Security, Energy, IoT Healthcare, Urban Planning, Transportation, Manufacturing etc.

How the Internet of Things (IoT) is Changing Business:

The future is looking pretty exciting thanks to the Internet of Things. The greatest thing IoT will change for consumers is their experience with customer service representatives. Companies, instead of customers, will start initiating customer service interactions, and these exchanges will often begin before a problem has even occurred.

Summary:

With the promise of 31 billion connected devices by 2020, it's safe to say that the Internet of Things has arrived, and it's here to stay. And with the rise of IoT comes an onslaught of data. While this data will have a significant impact on marketing as we know it, it will also require unprecedented amounts of security. Without the proper protection in place, individuals or entire infrastructures could be at risk.

Machine Learning

Machine learning is an application of artificial intelligence (AI) that provides systems the ability to automatically learn and improve from experience without being explicitly programmed. Machine learning focuses on the development of computer programs that can access data and use it to learn for themselves. The process of learning begins with observations or data, such as examples, direct experience, or instruction, in order to look for patterns in data and make better decisions in the future based on the examples that we provide. The primary aim is to allow computers to learn automatically, without human intervention or assistance, and adjust actions accordingly.

Examples of machine learning:

Machine learning is being used in a wide range of applications today. One of the most well-known examples is Facebook's News Feed, which uses machine learning to personalize each member's feed. Behind the scenes, the software is simply using statistical analysis and predictive analytics to identify patterns in the user's data and using those patterns to populate the News Feed. Machine learning is also entering an array of enterprise applications, such as customer relationship management (CRM).

Types of machine learning:

Just as there are nearly limitless uses of machine learning, there is no shortage of machine learning algorithms. They range from the fairly simple to the highly complex. Here are a few of the most commonly used models.

This class of machine learning algorithm, regression, involves identifying a correlation, generally between two variables, and using that correlation to make predictions about future data points.
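A minimal regression sketch in Python with scikit-learn, using invented numbers purely for illustration:

    import numpy as np
    from sklearn.linear_model import LinearRegression

    # Toy data: advertising spend vs. sales (illustrative numbers only).
    spend = np.array([[1.0], [2.0], [3.0], [4.0], [5.0]])
    sales = np.array([2.1, 3.9, 6.2, 8.0, 9.9])

    model = LinearRegression().fit(spend, sales)

    # The learned correlation is used to predict a future data point.
    print(model.predict([[6.0]]))   # roughly 12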

1. Decision trees:

These models use observations about certain actions and identify an optimal path for arriving at a desired outcome.
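A tiny decision-tree sketch with scikit-learn; the observations are invented for the example:

    from sklearn.tree import DecisionTreeClassifier, export_text

    # Toy observations: [hours_studied, classes_attended] -> pass (1) / fail (0).
    X = [[1, 2], [2, 1], [6, 8], [7, 9], [3, 3], [8, 7]]
    y = [0, 0, 1, 1, 0, 1]

    tree = DecisionTreeClassifier(max_depth=2).fit(X, y)
    print(export_text(tree, feature_names=["hours", "classes"]))  # the decision path
    print(tree.predict([[5, 6]]))   # predicted outcome for a new student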

2. K-means clustering:

This model groups a specified number of data points into a specific number of groupings based on like characteristics.
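A short scikit-learn sketch with made-up points:

    import numpy as np
    from sklearn.cluster import KMeans

    # Toy 2-D points forming two loose groups (values are illustrative).
    points = np.array([[1, 1], [1.5, 2], [1, 0.5],
                       [8, 8], [8.5, 9], [9, 8]])

    kmeans = KMeans(n_clusters=2, n_init=10, random_state=0).fit(points)
    print(kmeans.labels_)           # the grouping assigned to each point
    print(kmeans.cluster_centers_)  # the centre of each group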

3. Neural networks:

These deep learning models utilize large amounts of training data to identify correlations between many variables to learn to process incoming data in the future.
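A small neural-network sketch using scikit-learn's MLPClassifier on a classic non-linear toy dataset:

    from sklearn.datasets import make_moons
    from sklearn.neural_network import MLPClassifier

    # Training data: two interleaved half-circles, a standard non-linear problem.
    X, y = make_moons(n_samples=200, noise=0.2, random_state=0)

    # A small multi-layer perceptron learns the non-linear decision boundary.
    net = MLPClassifier(hidden_layer_sizes=(16,), max_iter=2000, random_state=0)
    net.fit(X, y)
    print(f"training accuracy: {net.score(X, y):.2f}")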

4. Supervised machine learning algorithms:

Starting from the analysis of a known training dataset, the learning algorithm produces an inferred function to make predictions about the output values. The system is able to provide targets for any new input after sufficient training. The learning algorithm can also compare its output with the correct, intended output and find errors in order to modify the model accordingly.
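A minimal supervised-learning sketch in scikit-learn, training on labeled data and then comparing the model's predictions with the correct, intended outputs:

    from sklearn.datasets import load_iris
    from sklearn.metrics import accuracy_score
    from sklearn.model_selection import train_test_split
    from sklearn.neighbors import KNeighborsClassifier

    # A known, labeled training dataset (the classic iris flowers).
    X, y = load_iris(return_X_y=True)
    X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

    model = KNeighborsClassifier().fit(X_train, y_train)

    # Compare predicted outputs with the correct outputs to measure error.
    predictions = model.predict(X_test)
    print(f"accuracy: {accuracy_score(y_test, predictions):.2f}")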

5. Unsupervised machine learning algorithms:

These are used when the information used to train is neither classified nor labeled. Unsupervised learning studies how systems can infer a function to describe a hidden structure from unlabeled data. The system doesn't figure out the right output; instead it explores the data and draws inferences from datasets to describe the hidden structures (the k-means sketch above is one example).

6. Semi-supervised machine learning algorithms:

These fall somewhere in between supervised and unsupervised learning, since they use both labeled and unlabeled data for training, typically a small amount of labeled data and a large amount of unlabeled data. The systems that use this method are able to considerably improve learning accuracy. Usually, semi-supervised learning is chosen when the acquired labeled data requires skilled and relevant resources in order to train from it, whereas acquiring unlabeled data generally does not require additional resources.
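A brief semi-supervised sketch using scikit-learn's LabelPropagation, where -1 marks an unlabeled sample; the 10 percent labeling rate is arbitrary:

    import numpy as np
    from sklearn.datasets import load_iris
    from sklearn.semi_supervised import LabelPropagation

    X, y = load_iris(return_X_y=True)

    # Pretend most labels are unknown: -1 marks an unlabeled sample.
    rng = np.random.default_rng(0)
    y_partial = y.copy()
    y_partial[rng.random(len(y)) < 0.9] = -1   # keep only ~10% of the labels

    model = LabelPropagation().fit(X, y_partial)

    # Labels are propagated from the few labeled samples to the rest.
    print(f"accuracy on all samples: {model.score(X, y):.2f}")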

7. Reinforcement machine learning algorithms:

It is a learning method that interacts with its environment by producing actions and discovers errors or rewards. Trial and error search and delayed reward are the most relevant characteristics of reinforcement learning. This method allows machines and software agents to automatically determine the ideal behaviour within a specific context in order to maximize its performance.
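A toy reinforcement-learning sketch: an epsilon-greedy agent learns by trial and error which of two actions yields the higher reward; the payoff probabilities are invented.

    import random

    # Two-armed bandit: each action pays off with a different hidden probability.
    payoff = {"A": 0.3, "B": 0.7}
    value = {"A": 0.0, "B": 0.0}    # the agent's estimated value of each action
    counts = {"A": 0, "B": 0}

    for step in range(1000):
        if random.random() < 0.1:
            action = random.choice(["A", "B"])      # explore
        else:
            action = max(value, key=value.get)      # exploit the best estimate
        reward = 1.0 if random.random() < payoff[action] else 0.0
        counts[action] += 1
        # Incrementally update the action's estimated value from the reward.
        value[action] += (reward - value[action]) / counts[action]

    print(value)   # the agent learns that action "B" pays off more often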

Future of machine learning:

Machine learning algorithms have been around for decades, but they've attained new popularity as artificial intelligence has grown in prominence. Deep learning models in particular power today's most advanced AI applications. Machine learning platforms are among enterprise technology's most competitive realms, with most major vendors racing to sign customers up for platform services that cover the spectrum of machine learning activities, including data collection, data preparation, model building, training and application deployment. As machine learning continues to increase in importance to business operations and AI becomes ever more practical in enterprise settings, the machine learning platform wars will only intensify.

Conclusion:

It all starts with asking the right question, which is the beginning of the machine learning process. After that, we need the right, structured data to answer that question; this is the part that takes most of the time in a complete machine learning process. Then an iterative process starts and continues until we get the desired predictive model. That model is updated from time to time to adapt to changes that happen periodically, and finally the model is deployed.

Business Intelligence

Business intelligence (BI) is a technology-driven process for analysing data and presenting actionable information to help executives, managers and other corporate end users make informed business decisions.

Business Intelligence (BI) refers to the tools, technologies, applications and practices used to collect, integrate, analyse and present an organization's raw data in order to create insightful and actionable business information. BI as a discipline and as a technology-driven process is made up of several related activities: data mining, online analytical processing, querying and reporting.

BI encompasses a wide variety of tools, applications and methodologies that enable organizations to collect data from internal systems and external sources, prepare it for analysis, develop and run queries against that data, and create reports, dashboards and data visualizations that make the analytical results available to corporate decision makers as well as operational workers.
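
As a small illustration of that workflow, here is a hedged Python sketch using pandas; the data, column names and the aggregation are invented stand-ins for a real BI query and report:

    import pandas as pd

    # Raw data "collected" from an internal system.
    sales = pd.DataFrame({
        "region":  ["North", "North", "South", "South"],
        "product": ["A", "B", "A", "B"],
        "revenue": [120.0, 80.0, 150.0, 95.0],
    })

    # A query-style aggregation: total and average revenue per region.
    report = sales.groupby("region")["revenue"].agg(["sum", "mean"])
    print(report)   # this table could feed a dashboard or visualization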

Why is business intelligence important?

The potential benefits of business intelligence tools include accelerating and improving decision making, optimizing internal business processes, increasing operational efficiency, driving new revenues and gaining competitive advantage over business rivals.

Benefits of Using Business Intelligence

The potential benefits of business intelligence programs include:

  • Accelerating and improving decision making.

  • Optimizing internal business processes.

  • Increasing operational efficiency.

  • Driving new revenues.

  • Gaining competitive advantages over business rivals.

  • Identifying market trends.

  • Spotting business problems that need to be addressed.

Why Should We Use Business Intelligence Systems and Tools?

BI Tools are essentially data-driven Decision Support Systems (DSS). BI is sometimes used interchangeably with briefing books, report and query tools and executive information systems. With these tools, business people can start analysing the data themselves, rather than wait for IT to run complex reports. This information access helps users back up business decisions with hard numbers, rather than only gut feelings and anecdotes.

Differentiate Between Business Intelligence and Analytics

Can you use the terms business intelligence and analytics interchangeably? The majority of the time, people are going to know what you are talking about. And regardless of what you call it, your product needs it to stay modern, drive adoption, and reduce ongoing requests to IT for information that will help your customers do their jobs.

But if you really want to differentiate, here is when I think you should use BI vs. analytics: use the term business intelligence if you are producing a report, dashboard, or pivot table for an executive, middle manager, or analyst. And use analytics when you move past basic BI capabilities and use information and data to help your customers become highly effective at getting a job done.

DevOps

DevOps represents a change in IT culture, focusing on rapid IT service delivery through the adoption of agile, lean practices in the context of a system-oriented approach. DevOps emphasizes people and seeks to improve collaboration between operations and development teams. DevOps implementations utilize technology, especially automation tools that can leverage an increasingly programmable and dynamic infrastructure, from a life-cycle perspective.

How Does DevOps Work?

1. Collaboration:

The foundation of DevOps success is how well teams and individuals collaborate across the enterprise to get things done more rapidly, efficiently and effectively.

2. Automation:

The continuous integration principle of agile development has a cultural implication for the development group. Forcing developers to integrate their work with other developers' work frequently, at least daily, exposes integration issues and conflicts much earlier than is the case with waterfall development. Developers have to communicate with each other much more frequently, a process that runs counter to the image of the solitary genius coder working alone.

3. Continuous Testing:

Continuous testing is not just a QA function; in fact, it starts in the development environment. The days are over when developers could simply throw the code over the wall to QA and say, “Have at it.” In a DevOps environment, quality is everyone's job. Developers build quality into the code and provide test data sets, while QA engineers configure automation test cases and the testing environment.
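
A minimal sketch of the kind of automated test that runs on every change in such an environment; the function under test is invented, and pytest (which discovers test_* functions) is an assumed choice of runner:

    def apply_discount(price: float, percent: float) -> float:
        """Business logic under test (hypothetical)."""
        return round(price * (1 - percent / 100), 2)

    def test_apply_discount():
        # Quality built in by the developer, runnable by anyone in the pipeline.
        assert apply_discount(100.0, 20) == 80.0
        assert apply_discount(19.99, 0) == 19.99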

4. Continuous Delivery:

The team at Amazon Web Services defines continuous delivery as a DevOps software development practice where code changes are automatically built, tested, and prepared for a release to production. It expands upon continuous integration by deploying all code changes to a testing environment and a production environment after the build stage.

5. Continuous Monitoring:

With continuous monitoring, teams measure the performance and availability of software to improve stability. Continuous monitoring helps identify the root causes of issues quickly, to proactively prevent outages and minimize user issues. Some monitoring experts even advocate that the definition of a service must include monitoring; they see it as integral to service delivery.
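
A hedged sketch of a tiny availability probe of the kind a monitoring loop might run; the URL, latency threshold and polling interval are invented for illustration:

    import time
    import requests

    def check(url: str, timeout: float = 2.0) -> bool:
        """Return True if the service answers quickly with HTTP 200."""
        try:
            response = requests.get(url, timeout=timeout)
            return response.status_code == 200 and response.elapsed.total_seconds() < 1.0
        except requests.RequestException:
            return False

    while True:
        print("UP" if check("https://example.com/health") else "DOWN")
        time.sleep(60)   # poll once a minute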

Best DevOps Tools

1. Atlassian JIRA:

JIRA is commercial software, and licenses need to be procured for installing it on-premises, based on the number of users. The tool can be used for agile project management and for defect and issue tracking.

As mentioned before, a defined process is a prerequisite for DevOps implementation, so project managers can use JIRA to create the Product Backlog and Sprint Backlogs and maintain end-to-end traceability, starting from Epics and User Stories down to test artifacts like test cases.

2. Eclipse:

Eclipse is a widely used IDE in which developers typically write code and commit it to a version control repository such as Git/GitHub, which supports team development. Every developer downloads the code from the version control repository, makes changes and commits the code back to Git/GitHub.

3. Git – Version Control Tool:

One of the fundamental building blocks of any CI setup is a strong version control system. Even though there are different version control tools in the market today, such as SVN, ClearCase, RTC and TFS, Git fits in very well as a popular, distributed version control system for teams located in different geographical locations.

It is an open-source tool and supports most version control features: check-in, commit, branching, merging, labels, and push/pull to and from GitHub.
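
The commit-and-push cycle described above can be scripted; this hedged sketch drives the real git commands from Python and assumes it runs inside a cloned repository (the branch name and commit message are examples):

    import subprocess

    def git(*args: str) -> None:
        subprocess.run(["git", *args], check=True)

    git("checkout", "-b", "feature/login")    # work on a separate branch
    git("add", ".")                           # stage local changes
    git("commit", "-m", "Add login form")     # commit to local history
    git("push", "origin", "feature/login")    # share the branch via GitHub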

4. Kiuwan:

Kiuwan adds security into DevOps, understanding how SAST and SCA testing should be implemented. With a unique distributable engine, pipelines are never at risk of creating bottlenecks, and time to market (TTM) is improved while ensuring the most stringent security checks are in place. With a DevSecOps approach, Kiuwan achieves outstanding benchmark scores (OWASP, NIST, CWE, etc.) and offers a wealth of features that go beyond static analysis, catering to every stakeholder in the SDLC.

5. Jenkins:

Jenkins is a free, open-source continuous integration tool (or server) that helps automate activities such as builds, code analysis and the storing of artifacts. A schedule for the build is defined in Jenkins to initiate it; this again depends on the completion of the tasks assigned to individual developers and their due dates, all of which are planned in a Sprint Backlog in JIRA as discussed earlier.

Running all builds on one single build machine is typically not a good option, so Jenkins provides a master-slave feature to distribute them. Jenkins can also help automate deployments to app servers like Tomcat, JBoss and WebLogic through plugins, as well as to container platforms like Docker.
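
Jenkins jobs can also be triggered remotely over its REST API (a POST to /job/<name>/build); in this hedged sketch the server URL, job name and credentials are placeholders:

    import requests

    JENKINS = "https://jenkins.example.com"          # hypothetical server
    response = requests.post(
        f"{JENKINS}/job/my-app-build/build",
        auth=("ci-user", "api-token"),               # placeholder credentials
    )
    print(response.status_code)                      # 201 means the build was queued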

Artificial Intelligence

Artificial intelligence (AI) is the ability of a machine or a computer program to think and learn. The concept of AI is based on the idea of building machines capable of thinking, acting, and learning like humans.

Artificial intelligence (AI) is an area of computer science that emphasizes the creation of intelligent machines that work and react like humans. Research associated with artificial intelligence is highly technical and specialized. While people falsely consider AI a technology, the more accurate approach would be seeing it as a broad concept in which machines are able to deal with tasks in a way we would call intelligent or smart.

Examples of AI technology:

AI is incorporated into a variety of different types of technology. Here are a few examples:

  • Automation: what makes a system or process function automatically. For example, robotic process automation (RPA) can be programmed to perform high-volume, repeatable tasks that humans normally performed. RPA is different from IT automation in that it can adapt to changing circumstances.

  • Machine learning: the science of getting a computer to act without explicit programming. Deep learning is a subset of machine learning that, in very simple terms, can be thought of as the automation of predictive analytics. There are three types of machine learning: a) supervised learning, b) unsupervised learning and c) reinforcement learning.

  • Machine vision: the science of allowing computers to see. This technology captures and analyzes visual information using a camera, analog-to-digital conversion and digital signal processing. (A short machine-vision sketch follows this list.)

  • Robotics: a field of engineering focused on the design and manufacturing of robots. Robots are often used to perform tasks that are difficult for humans to perform or to perform consistently. They are used in assembly lines for car production or by NASA to move large objects in space.

  • Self-driving vehicles: these use a combination of computer vision, image recognition and deep learning to build automated skill at piloting a vehicle while staying in a given lane and avoiding unexpected obstructions, such as pedestrians.
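
The machine vision item above can be illustrated with a short sketch, assuming OpenCV as the library; the input file name and edge-detection thresholds are placeholders:

    import cv2

    image = cv2.imread("street_scene.jpg")            # hypothetical captured image
    gray = cv2.cvtColor(image, cv2.COLOR_BGR2GRAY)    # reduce to one channel
    edges = cv2.Canny(gray, threshold1=100, threshold2=200)  # extract structure
    cv2.imwrite("edges.png", edges)                   # the analyzed output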

HOW DOES ARTIFICIAL INTELLIGENCE WORK:

Artificial intelligence works by combining high volumes of data with intelligent, fast algorithms and iterative processing, thereby allowing the system to learn automatically from the characteristics of that data. The value of AI applications lies mainly in the fact that computers can do things faster than humans: interpreting images and text, finding patterns in figures, and running intelligent searches.

ARTIFICIAL INTELLIGENCE APPLICATIONS:

1. AI in healthcare:

The biggest bets are on improving patient outcomes and reducing costs. Companies are applying machine learning to make better and faster diagnoses than humans. Such systems understand natural language and are capable of responding to questions asked of them; they mine patient data and other available data sources to form a hypothesis.

2. AI in business:

Robotic process automation is being applied to highly repetitive tasks normally performed by humans. Machine learning algorithms are being integrated into analytics and CRM platforms to uncover information on how to better serve customers. Chatbots have been incorporated into websites to provide immediate service to customers.

3. AI in education:

AI can automate grading, giving educators more time. AI can assess students and adapt to their needs, helping them work at their own pace. AI tutors can provide additional support to students, ensuring they stay on track.

4. AI in law:

The discovery process, the sifting through of documents, is often overwhelming for humans in law, and automating it is a more efficient use of time. Startups are also building question-and-answer computer assistants that can answer questions by examining the taxonomy and ontology associated with a database.

5. AI in finance:

AI in personal finance applications, such as Mint or TurboTax, is disrupting financial institutions. Applications such as these collect personal data and provide financial advice. Other programs, such as IBM Watson, have been applied to the process of buying a home.

CHALLENGING TASKS FOR ARTIFICIAL INTELLIGENCE:

AI successfully diagnoses medical scans, connects trillions of data points to discover new patterns, interprets application letters and CVs, and tracks down fraudsters. The general trend is for AI to replace boring, lower-cognitive and repetitive human tasks. Small time savings can add up to fairly large amounts: AI can deliver a two- to five-fold efficiency improvement for a process step, and when a process is fully automated, gains of up to a hundred times can be achieved.

Analytics

Analytics is the scientific process of discovering and communicating the meaningful patterns which can be found in data. It is concerned with turning raw data into insight for making better decisions. Analytics relies on the application of statistics, computer programming and operations research in order to quantify and gain insight into the meanings of data. It is especially useful in areas which record a lot of data or information.

Analytics is an encompassing and multidimensional field that uses mathematics, statistics, predictive modeling and machine learning techniques to find meaningful patterns and knowledge in recorded data.

Why is analytics important?

Analytics can change how we look at the world, and usually for the better. Sometimes we think that a process is already working at its best, but sometimes the data tells us otherwise, so analytics helps us improve our world. Organizations usually apply analytics in order to describe, predict and then improve their performance. Specifically, it helps in the following areas: web analytics, fraud analysis, risk analysis, advertisement and marketing, enterprise decision management, market optimization and market modelling.

Life cycle of analytics

  • Business units specify the need, scope, market conditions and goals related to the business question they want to solve, which will lead to the selection of one or more modeling techniques.

  • Depending on the business question and proposed analysis methods, this step involves using specialized techniques to locate, access, clean and prepare the data for optimal results. In our multifaceted data world, that could mean data from transactional systems, unstructured text files and data warehouses.

  • Now it's time to explore the data in an interactive and visual fashion to quickly identify relevant variables, trends and relationships. (The shape of the data when variables are plotted out is called the distribution of the data; you can use these shapes to identify patterns.)

  • A skilled analyst or modeler builds the model using statistical, data mining or text mining software, including the critical capability of transforming and selecting key variables. Models need to be built rapidly so modelers can use trial and error to find the model that produces the best results.

  • Once built, the model is registered, tested (or validated), approved and declared ready for use against your data. With a centralized model repository, you can store extensive documentation about the model, scoring code and associated metadata (data about the data) for the collaborative sharing and version control necessary for auditing purposes. (A compressed sketch of these lifecycle steps follows this list.)

  • When approved for production use the model is applied to new data to generate predictive insights.
  • The predictive performance of the model is monitored to ensure it is up to date and delivering valid results. If the model's performance degrades, it's time to make changes. When it no longer works or serves a business need, it is retired.
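
The lifecycle above compresses into a few lines of Python; scikit-learn, the synthetic data and the 10-point degradation threshold are illustrative assumptions:

    from sklearn.datasets import make_classification
    from sklearn.linear_model import LogisticRegression
    from sklearn.model_selection import train_test_split

    X, y = make_classification(n_samples=1000, random_state=0)
    X_train, X_new, y_train, y_new = train_test_split(X, y, random_state=0)

    model = LogisticRegression(max_iter=1000).fit(X_train, y_train)  # build
    baseline = model.score(X_train, y_train)                         # validate
    production = model.score(X_new, y_new)           # monitor on new data
    if baseline - production > 0.10:                 # degradation check
        print("performance degraded: retrain or retire the model")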

Advanced Analytics

IDC research shows SAS with a commanding 30.8 percent market share in advanced analytics, more than twice that of our nearest competitor. We dominate the market because we know it's not just how advanced the technology is that matters; it's how far it can advance your organization.

1. Data Mining:

Want to know what will happen in the future? Find the most lucrative opportunities? Get insights into impending outcomes.

a) SAS Econometrics

Analyze complex business and economic scenarios, providing a scientific basis for better decision making.

b) SAS Forecast Server

Produce large numbers of forecasts quickly and automatically to improve planning and decision making.

c) SAS Visual Forecasting

Generate large numbers of reliable forecasts quickly and automatically in an open environment.

2. Statistical Analysis:

Whether you're analysing customer data, crunching sales numbers, monitoring supply chain operations or trying to detect fraud, apply powerful statistical analysis to all your data to get the most accurate answers.

a) SAS Visual Statistics

Create and modify predictive models faster than ever using a visual interface and in-memory processing.

b) SAS® In-Memory Statistics

Find insights in big data with a single environment that moves you quickly through each phase of the analytical life cycle.

c) SAS® Analytics Pro

Access, manipulate, analyse and present information with a comprehensive analytical toolset that combines statistical analysis, reporting and high-impact visuals.

3. Forecasting:

Generate large quantities of high-quality forecasts quickly and automatically, with no need for human intervention, and streamline your forecasting processes so you can focus efforts on the most important, high-value decisions. (A minimal forecasting sketch follows the product list below.)

a) SAS Econometrics

Analyze complex business and economic scenarios, providing a scientific basis for better decision making.

b) SAS Visual Forecasting

Generate large numbers of reliable forecasts quickly and automatically in an open environment.
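
In the same spirit, automatic forecasting can be sketched in a few lines; this is simple exponential smoothing with an invented sales series and smoothing factor, not an implementation of any SAS product:

    def forecast_next(series, alpha=0.3):
        """One-step-ahead forecast via simple exponential smoothing."""
        level = series[0]
        for value in series[1:]:
            level = alpha * value + (1 - alpha) * level
        return level

    monthly_sales = [112, 118, 132, 129, 121, 135, 148, 148, 136, 119]
    print("next month's forecast:", round(forecast_next(monthly_sales), 1))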

4. Optimization & Simulation:

Identify the scenarios that will produce the best outcomes. (A small worked example follows the product list below.)

a) SAS/OR

Optimize business processes and address challenges with enhanced operations research methods.

b) SAS® Optimization

Find optimal solutions to complex business and planning problems faster than ever.

c) SAS Simulation Studio

Build models that mimic complex, real-life systems so you can better understand and optimize them using discrete event simulation.
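
As a worked optimization example in this spirit (not of any SAS product), here is a tiny production-planning linear program solved with SciPy; the profits and capacities are invented:

    from scipy.optimize import linprog

    # Maximize profit 3x + 5y; linprog minimizes, so negate the objective.
    c = [-3, -5]
    A_ub = [[1, 2],    # machine hours:  x + 2y <= 14
            [3, 1]]    # labor hours:   3x +  y <= 18
    b_ub = [14, 18]

    result = linprog(c, A_ub=A_ub, b_ub=b_ub, bounds=[(0, None), (0, None)])
    print("production plan:", result.x, "profit:", -result.fun)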

Gaming

Gaming is the running of specialized applications known as electronic games or video games on game consoles like Xbox and PlayStation, or on personal computers (in which case the activity is known as online gaming). The term "gaming" originated as a synonym for "gambling", although most electronic games today do not involve gambling in the traditional sense. Gaming refers to playing electronic games, whether through consoles, computers, mobile phones or another medium altogether; it is a nuanced term that suggests regular gameplay, possibly as a hobby.

The independent game industry has seen a substantial rise in recent years with the growth of new online distribution systems, such as Steam and Uplay, as well as the mobile game market, such as for Android and iOS devices.

Roles and Responsibilities of Development team:

  • A game designer is a person who designs game play conceiving and designing the rules and structure of a game. Development teams usually have a lead designer who coordinates the work of other designers. They are the main visionary of the game.

  • A game artist is a visual artist who creates video game art. The artist's job may be 2D-oriented or 3D-oriented. 2D artists may produce concept art, sprites, textures, environmental backdrops or terrain images, and user interface art; 3D artists may produce models or animation, 3D environments and cinematics. Artists sometimes occupy both roles.

  • A game programmer is a software engineer who develops video games or related software. The game's code base development is handled by programmers. There are usually one to several lead programmers.

  • Quality assurance is carried out by game testers. A game tester analyzes video games to document software defects as part of a quality control process. Testing is a highly technical field requiring computing expertise and analytic competence.

Tools for game design and development:

The world of gaming is fast-paced and hugely exciting, especially with the ongoing developments and projects being created using virtual reality applications. But this can be a daunting environment if you are thinking of building your own experience. There are a number of tools and resources around to help make your game or app a success.

1. Visual Studio:

Visual Studio, Microsoft's developer tool suite, has been around for years, which is a clear indication of just how popular the software is among designers and developers. A fully featured integrated development environment (IDE) for Android, iOS, Windows, web and cloud, Visual Studio offers productive development tools and powerful services.

2. Assembla:

A highly efficient and capable project management tool with built-in code repositories, Assembla is quickly becoming the resource of choice for building the latest games. Boasting a number of incredibly handy features, Assembla allows individual developers and teams to manage every aspect of a project, from ideation to production and coding to communication, all in one place. Assembla is also the number one SVN provider in the world and features integration with the leading communication app Slack, meaning the project team and clients can work together on one platform, helping games launch on time. Assembla provides everything you need to manage all your tasks, team and code in just one place: a must-have tool for any game developer.

3. Unreal Engine:

This renowned game development tool has been used to create hit games on just about any platform. Made by game developers, for game developers, Unreal Engine provides everything you need to make your next project a success, whether it be a simple 2D game, console blockbuster or virtual reality app. Designed for mobile, now and in the future, Unreal Engine 4's features give you the power to develop your game and seamlessly deploy to iOS and Android devices.

4. Evernote:

It may seem obvious to say, but every designer and developer needs a place to record even the smallest of ideas and be able to access them anywhere. Evernote, one of the most popular note-taking apps around, allows users to do just that, enabling the capture, nurture and sharing of ideas across any device.

5. Blender:

Blender is an open-source 3D content creation suite, available for all major operating systems. Started by Blender Foundation founder Ton Roosendaal back in 2002, Blender is now the largest open-source tool for 3D creation. Its makers are constantly working on its development, but you can pretty much do anything 3D-related with this software, including modelling, texturing, animation, rendering and compositing.

6. Photoshop:

Speaking of 3D models, you're going to need to texture those assets, and there's no better program to start with than Photoshop. The go-to tool for creative professionals, Photoshop provides an extensive and dedicated toolset for the creation and texturing of your game assets.

7. Substance:

3D painting software Substance also offers a way to paint your 3D assets in a procedural and natural way. A popular tool among 3D and digital artists, Substance features a complete and familiar set of texturing tools: PBR and particle painting, Substance procedural effects, smart materials and smart masks, and fast baking.

8. Corona SDK:

Corona SDK is a software development kit that is available on Windows and OS X and uses Lua as a scripting language. Using Corona, one can develop mobile games for free. However, to create a game or app with more elaborate features, you need to opt for the enterprise model, which offers native libraries and APIs. Corona SDK uses the OpenGL rendering engine, and its built-in scene management and transition library helps adjust and modify your game's graphical qualities. To make game development easier, Corona Labs offers the Corona Editor text plugin, with which you can create a graphical environment to design different levels and understand the interaction between objects.

9. SpriteKit:

Available on iOS and OS X, SpriteKit is Apple's proprietary 2D game development framework and supports both the Swift and Objective-C languages. SpriteKit offers great convenience to developers: with SKView, scene management is made easy, and the SKAction class can be leveraged to move, scale or rotate game objects. It also supports sound and custom code. SpriteKit offers a scene editor that enables the designing of levels, and its particle editor helps with developing different particle systems. It uses Box2D for its physics engine and provides a built-in camera through the SKCameraNode class, which makes navigation easy.

10. Unity:

Unity is a mobile game development engine that supports C# and UnityScript, Unity's own JavaScript-like language. It comes in free as well as professional editions. It is a cross-platform tool and is deployable to many platforms. Like other tools, its built-in editor allows you to edit images and organize animations from the animator window.

11. Marmalade:

Marmalade is a fast, high-performance cross-platform engine for the creation of 2D and 3D games. The SDK can be used to code in C++, the Marmalade Quick version supports app development using the Lua scripting language, and Marmalade Web facilitates creating hybrid apps using HTML5, CSS and JavaScript. Marmalade is an award-winning platform that is popular among top game developers for its ability to build native games for both mobile and desktop. Marmalade Juice is another mobile game development tool that supports easy porting of iOS games to the Android ecosystem.

12. NextPeer:

NextPeer is a multiplayer social SDK that addresses the issue of multiplayer gaming on mobile. It supports both synchronous and asynchronous gameplay. NextPeer enhances the quality of the gaming experience and player engagement, and helps developers achieve maximum user retention. It facilitates real-time interactions and live rankings in order to make games more interesting and real, using a delayed synchronous technology.

Summary:

We explored some of the popular mobile game development tools, engines and frameworks preferred by today's developers, along with points we can adopt in the game development process. With this comparison, we can gain better insight into which gaming tool best fits our development requirements, which will be useful for future game development plans.

Amazon Web Services

Amazon Web Services (AWS) is a secure cloud services platform, offering compute power, database storage, content delivery and other functionality to help businesses scale and grow. Explore how millions of customers are currently leveraging AWS cloud products and solutions to build sophisticated applications with increased flexibility, scalability and reliability.


A Broad IT Infrastructure Platform :

The AWS Cloud provides a broad set of infrastructure services, such as computing power, storage options, networking and databases, delivered as a utility: on-demand, available in seconds, with pay-as-you-go pricing.

  • From data warehousing to deployment tools, directories to content delivery, over 50 services are available in just a few mouse clicks with AWS. New services are quick to provision, without upfront capital expense, allowing enterprises, start-ups, SMBs and public-sector customers to access the building blocks they need to respond quickly to changing business requirements.

  • After almost a decade of working closely with organizations as diverse as Pinterest, GE and MLB, the AWS Cloud allows customers to pin, power and play ball in entirely new ways.

  • Security in the cloud is recognized as better than on-premises. Broad security certification and accreditation, data encryption at rest and in-transit, hardware security modules and strong physical security all contribute to a more secure way to manage your business.


Hybrid Capabilities :

Choosing between your existing investment in infrastructure and moving to the cloud is not a binary decision. Deep features, dedicated connectivity, identity federation and integrated tools allow you to run ‘hybrid’ applications across on-premises and cloud services.


Popular Products :

  • Amazon Elastic Compute Cloud (Amazon EC2) is a web service that provides secure, resizable compute capacity in the cloud. It is designed to make web-scale cloud computing easier for developers. Its simple web service interface allows you to obtain and configure capacity with minimal friction. It provides you with complete control of your computing resources and lets you run on Amazon's proven computing environment. Amazon EC2 reduces the time required to obtain and boot new server instances to minutes, allowing you to quickly scale capacity, both up and down, as your computing requirements change. Amazon EC2 changes the economics of computing by allowing you to pay only for the capacity that you actually use. (Minimal boto3 sketches of EC2, S3 and Lambda follow this list.)

  • Amazon Relational Database Service (Amazon RDS) makes it easy to set up, operate, and scale a relational database in the cloud. It provides cost-efficient and resizable capacity while automating time-consuming administration tasks such as hardware provisioning, database setup, patching and backups. It frees you to focus on your applications so you can give them the fast performance, high availability, security and compatibility they need.


  • Amazon RDS is available on several database instance types - optimized for memory, performance or I/O - and provides you with six familiar database engines to choose from, including Amazon Aurora, PostgreSQL, MySQL, MariaDB, Oracle Database, and SQL Server.


  • Companies today need the ability to simply and securely collect, store, and analyze their data at a massive scale. Amazon S3 is object storage built to store and retrieve any amount of data from anywhere: websites and mobile apps, corporate applications, and data from IoT sensors or devices. It is designed to deliver 99.999999999% durability, and stores data for millions of applications used by market leaders in every industry. S3 provides comprehensive security and compliance capabilities that meet even the most stringent regulatory requirements. It gives customers flexibility in the way they manage data for cost optimization, access control, and compliance. S3 provides query-in-place functionality, allowing you to run powerful analytics directly on your data at rest in S3. And Amazon S3 is the most supported cloud storage service available, with integration from the largest community of third-party solutions, systems integrator partners, and other AWS services.


  • AWS Lambda lets you run code without provisioning or managing servers. You pay only for the compute time you consume - there is no charge when your code is not running. With Lambda, you can run code for virtually any type of application or backend service - all with zero administration. Just upload your code and Lambda takes care of everything required to run and scale your code with high availability. You can set up your code to automatically trigger from other AWS services or call it directly from any web or mobile app.
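
To make the products above concrete, here are minimal Python sketches using the boto3 SDK; the region, AMI ID, bucket and function names are placeholders, and configured AWS credentials are assumed:

    import boto3

    # Amazon EC2: launch one small instance with a single API call.
    ec2 = boto3.client("ec2", region_name="us-east-1")
    ec2.run_instances(ImageId="ami-0123456789abcdef0",   # placeholder AMI
                      InstanceType="t2.micro", MinCount=1, MaxCount=1)

    # Amazon S3: store an object, retrievable from anywhere.
    s3 = boto3.client("s3")
    s3.upload_file("report.csv", "my-example-bucket", "reports/report.csv")

    # AWS Lambda: invoke code that runs with no servers to manage.
    lam = boto3.client("lambda")
    lam.invoke(FunctionName="my-example-function", Payload=b"{}")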


Platform Foundations :

    1.Amazon VPC

    Amazon Virtual Private Cloud (Amazon VPC) lets you provision a logically isolated section of the AWS Cloud where you can launch AWS resources in a virtual network that you define. You have complete control over your virtual networking environment, including selection of your own IP address range, creation of subnets, and configuration of route tables and network gateways. You can use both IPv4 and IPv6 in your VPC for secure and easy access to resources and applications. You can easily customize the network configuration for your Amazon VPC; for example, you can create a public-facing subnet for your web servers that has access to the Internet, and place your backend systems, such as databases or application servers, in a private-facing subnet with no Internet access.

    2.AWS IAM

    AWS Identity and Access Management (IAM) enables you to manage access to AWS services and resources securely. Using IAM, you can create and manage AWS users and groups, and use permissions to allow and deny their access to AWS resources. IAM is a feature of your AWS account offered at no additional charge. You will be charged only for use of other AWS services by your users.

    3.AWS CloudWatch

    Amazon CloudWatch is a monitoring and management service built for developers, system operators, site reliability engineers (SRE), and IT managers. CloudWatch provides you with data and actionable insights to monitor your applications, understand and respond to system-wide performance changes, optimize resource utilization, and get a unified view of operational health. CloudWatch collects monitoring and operational data in the form of logs, metrics, and events, providing you with a unified view of AWS resources, applications and services that run on AWS, and on-premises servers. You can use CloudWatch to set high resolution alarms, visualize logs and metrics side by side, take automated actions, troubleshoot issues, and discover insights to optimize your applications, and ensure they are running smoothly.

    4.Billing Alarm

    You can monitor your estimated AWS charges using Amazon CloudWatch. When you enable the monitoring of estimated charges for your AWS account, the estimated charges are calculated and sent several times daily to CloudWatch as metric data.
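
    A hedged boto3 sketch of such a billing alarm; the threshold, alarm name and SNS topic ARN are placeholders (billing metrics are published to the AWS/Billing namespace in us-east-1):

        import boto3

        cloudwatch = boto3.client("cloudwatch", region_name="us-east-1")
        cloudwatch.put_metric_alarm(
            AlarmName="monthly-bill-over-100-usd",
            Namespace="AWS/Billing",
            MetricName="EstimatedCharges",
            Dimensions=[{"Name": "Currency", "Value": "USD"}],
            Statistic="Maximum",
            Period=21600,                    # the metric arrives a few times daily
            EvaluationPeriods=1,
            Threshold=100.0,
            ComparisonOperator="GreaterThanThreshold",
            AlarmActions=["arn:aws:sns:us-east-1:123456789012:billing-alerts"],
        )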

Software as a Service (SaaS)

Software as a Service (SaaS) is a software distribution model in which a third-party provider hosts applications and makes them available to customers over the Internet. SaaS is one of three main categories of cloud computing, alongside infrastructure as a service (IaaS) and platform as a service (PaaS). SaaS is closely related to the application service provider (ASP) and on-demand computing software delivery models. The hosted application management model of SaaS is similar to ASP, where the provider hosts the customer's software and delivers it to approved end users over the internet. In the software-on-demand SaaS model, the provider gives customers network-based access to a single copy of an application that the provider created specifically for SaaS distribution. SaaS applications are used for fundamental business technologies, such as email, sales management, customer relationship management (CRM), financial management, human resource management (HRM), billing and collaboration. Leading SaaS providers include Salesforce, Oracle, SAP, Intuit and Microsoft. SaaS is a natural evolution of software. The old model of getting physical DVDs and installing them on local servers was the only realistic solution for many years; in fact, the client-server model is still required for many scenarios. That said, in recent years a number of developments have allowed SaaS to become mainstream. One factor is bandwidth: the internet is simply faster than it was a decade ago. Other major factors include the evolution of both virtualization and tools in big data. All these advances have made it much easier for providers to scale and manage their own infrastructure and thus provide SaaS solutions.


Advantages of SaaS

SaaS removes the need for organizations to install and run applications on their own computers or in their own data centers. This eliminates the expense of hardware acquisition, provisioning and maintenance, as well as software licensing, installation and support.

  • SaaS solutions reside in cloud environments that are scalable and integrate with other SaaS offerings. Compared with the traditional model, users do not have to buy another server or software; they only need to enable a new SaaS offering and, in terms of server capacity planning, the SaaS provider will own that. SaaS providers generally offer many subscription options and the flexibility to change subscriptions as and when needed, e.g. when your business grows or more users need to access the service. SaaS, and more widely cloud computing, can help you make the most of a limited IT budget while giving you access to the latest technology and professional support.

  • With the software housed on the server, it can be upgraded centrally, as opposed to the traditional model where the software would need to be upgraded on each machine. In other words, SaaS can easily be maintained with the latest version of the software at all times, and customers can rely on a SaaS provider to automatically perform updates and patch management.

  • It isn’t just semantics. The cloud refers to a set of incredibly complex infrastructure technology. At a fundamental level, it’s a collection of computers, servers, and databases that are connected together in a way that users can lease access to share their combined power. The cloud can refer to anything that’s hosted remotely and delivered via the Internet. While all cloud programs are run by underlying software, SaaS refers specifically to business software applications that are delivered via the cloud. Given the widespread growth of cloud accessibility, it’s been easier, faster and less expensive for SaaS developers to roll out applications as compared to traditional on-premise software development. Today, nearly every type of core business function from human resources to enterprise resource planning is available via SaaS.

  • SaaS has a cost advantage, since it usually resides in a shared or multi-tenant environment where the hardware and software license costs are low compared with the traditional model. Maintenance costs are reduced as well, since the SaaS provider owns the environment and the cost is split among all customers that use that solution.

    Generally, they pay for this service on a monthly basis using a pay-as-you-go model. Transitioning costs to a recurring operating expense allows many businesses to exercise better and more predictable budgeting. Users can also terminate SaaS offerings at any time to stop those recurring costs.

  • In the SaaS model, the software application is already installed and configured. Users can provision the server for the cloud and quickly have the application ready for use. This cuts the time to benefit and allows for rapid demonstrations and prototyping. With many SaaS companies offering free trials, this means a painless proof of concept and discovery phase to prove the benefit to the organization. Whether in demo mode or actually going live, intuitive interfaces for order writing and rapid uploading of electronic catalogs and customer lists give sales people immediate benefits without long wait times or steep learning curves.


    Work anywhere & Easy to use and perform proof of concepts

    Since the software is hosted in the cloud and accessible over the internet, users can access it via mobile devices wherever they are connected. This includes checking customer order histories prior to a sales call, as well as having access to real-time data and real-time order taking with the customer. For road warriors, the ability to access the software and data when they need it can change the nature of a sale. With the rise of the internet, SaaS saw deployment in enterprises for employees to access company resources, including software located on the company's central server. In the consumer space, SaaS was also deployed with popular software including webmail and photo-sharing services.

    SaaS offerings are easy to use since they already come with best practices and samples built in. Users can do proof of concepts and test the software functionality or a new release feature in advance. Also, they can run more than one instance with different versions and do a smooth migration.

    Balancing pros and cons

    For many companies, the pluses have outweighed the minuses, and there has been a trend towards running more software via SaaS with the cloud computing model, and away from hosting it locally. The SaaS market continues to expand, with growth expected to surpass $112.8 billion by 2020. While the internet has made this possible, SaaS is hardly a new phenomenon, and its origins can be traced back to the 1960s, with IBM and other corporations using distributed software packages to connect users with mainframes to provide utility computing.

    Slacking on

    While office suites have been around for decades at this point, a newer type of SaaS focuses on office communication. Slack bills itself as a collaboration hub and provides a single place for messaging, tools and files. Rather than have communication between team members distributed across multiple platforms, such as shared storage, emails, texts and instant messages, Slack provides a unified platform for communication and file exchange between team members working on projects. Also attractive is that new channels can be created, each focused on its own project, topic or whatever is needed, to neatly organize the communication, all in a searchable database secured with two-factor authentication. Slack plans kick off at a useful free tier that limits your account to the most recent 10,000 messages; the next-tier Standard plan costs $8, removes that restriction, and is discounted when paid for annually.

Quantum computing

Quantum computers could spur the development of new breakthroughs in science, medications to save lives, machine learning methods to diagnose illnesses sooner, materials to make more efficient devices and structures, financial strategies to live well in retirement, and algorithms to quickly direct resources such as ambulances. We experience the benefits of classical computing every day. However, there are challenges that today’s systems will never be able to solve. For problems above a certain size and complexity, we don’t have enough computational power on Earth to tackle them. To stand a chance at solving some of these problems, we need a new kind of computing. Universal quantum computers leverage the quantum mechanical phenomena of superposition and entanglement to create states that scale exponentially with number of qubits, or quantum bits.


How do quantum computers work?

In quantum computing, a qubit (short for quantum bit) is a unit of quantum information—similar to a classical bit. Where classical bits hold a single binary value such as a 0 or 1, a qubit can hold both values at the same time in what's known as a superposition state. When multiple qubits act coherently, they can process multiple options simultaneously. This allows them to process information in a fraction of the time it would take even the fastest nonquantum systems. There are a few different ways to create a qubit. One method uses superconductivity to create and maintain a quantum state. To work with these superconducting qubits for extended periods of time, they must be kept very cold. Any heat in the system can introduce error, which is why quantum computers operate at temperatures close to absolute zero, colder than the vacuum of space.
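
In standard textbook notation (our formulation, not from the source), a qubit in superposition is written

    \lvert \psi \rangle = \alpha \lvert 0 \rangle + \beta \lvert 1 \rangle,
    \qquad \lvert \alpha \rvert^{2} + \lvert \beta \rvert^{2} = 1,

and a register of n coherent qubits occupies a state space of dimension 2^n, which is the exponential scaling mentioned earlier: each added qubit doubles the space of possibilities the machine can process at once.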

  • Quantum computing can provide solutions to challenges that are out of reach for today's fastest computers. This new paradigm creates endless possibilities across a variety of quantum computing applications, provided the quantum computer has enough error-corrected qubits to complete algorithms successfully. Microsoft is developing a topological qubit to create a scalable quantum system that can complete the algorithms for the solutions the world needs most. While quantum computers can offer an exponential boost in computational power, they can’t be programmed in the same way as a classical computer. The instruction set and algorithms change, and the resulting output is different as well. On a classical computer, the solution is found by checking possibilities one at a time. Depending upon the problem, this can take too long. A quantum computer can explore all possibilities at the same time, but there are a few challenges. Getting the right answer out of the computer isn’t easy, and because the answers are probabilistic, you may need to do extra work to uncover the desired answer.

  • Finding solutions to challenges like global warming and world hunger may require a quantum system with thousands or millions of qubits. Microsoft is pursuing a topological qubit for its ability to scale—allowing us to solve more complex problems with fewer numbers of qubits overall. Paired with our full-stack solution, the topological qubit will help Microsoft offer a quantum system that scales to greater complexity, bringing solutions to some of the world's greatest challenges within reach. Microsoft is achieving its vision for scalable quantum computing through topological qubits. Qubits are fragile by nature, easily collapsing from outside interference. Topological qubits feature increased stability and resistance to interference, a performance improvement that allows the quantum computer to scale.

  • With the aid of quantum computers, chemists can work to identify a new catalyst for fertiliser to help reduce greenhouse emissions and improve global food production. This solution requires the ability to model molecular interactions which are too complex for classical computers, but well-suited for quantum computers. The field of chemistry is an area in which quantum computers will have significant impact.

  • Quantum computers will help advance materials science, creating superior new alternatives and greener technologies. One potential quantum computing application is the development of high-temperature superconductors, which could enable lossless transmission of energy. New discoveries enabled by quantum computers will help identify materials with properties suitable for high-temperature superconductivity, a level of complexity that is out of reach for the computers we use today.

  • Quantum computing can bring speed and efficiency to complex optimisation problems in machine learning. For example, large factories aiming to maximise output require optimisation of each individual process, as well as all participating components. Quantum computers can help deliver optimisation insights for streamlined output, reduced waste, and lowered costs.


    Quantum outperforms classical

    Quantum computers are expected to be better at solving certain computational problems than classical computers. This expectation is based on conjectures in computational complexity theory, but rigorous comparisons between the capabilities of quantum and classical algorithms are difficult to perform. Bravyi et al. proved theoretically that whereas the number of steps needed by parallel quantum circuits to solve certain linear algebra problems was independent of the problem size, this number grew logarithmically with size for analogous classical circuits. This so-called quantum advantage stems from the quantum correlations present in quantum circuits that cannot be reproduced in analogous classical circuits.

Augmented reality(AR)/virtual reality(VR)

Augmented reality is the integration of digital information with the user's environment in real time. Unlike virtual reality, which creates a totally artificial environment, augmented reality uses the existing environment and overlays new information on top of it. One of the first commercial applications of AR technology was the yellow "first down" line that began appearing in televised football games sometime in 1998. Today, Google Glass and heads-up displays in car windshields are perhaps the most well-known consumer AR products, but the technology is used in many industries, including healthcare, public safety, gas and oil, tourism and marketing. Augmented reality apps are written in special 3D programs that allow the developer to tie animation or contextual digital information in the computer program to an augmented reality "marker" in the real world. When a computing device's AR app or browser plug-in receives digital information from a known marker, it begins to execute the marker's code and layer the correct image or images.


Virtual Reality (VR) is the use of computer technology to create a simulated environment. Unlike traditional user interfaces, VR places the user inside an experience. Instead of viewing a screen in front of them, users are immersed and able to interact with 3D worlds. By simulating as many senses as possible, such as vision, hearing, touch, even smell, the computer is transformed into a gatekeeper to this artificial world. The only limits to near-real VR experiences are the availability of content and cheap computing power. Virtual Reality’s most immediately-recognizable component is the head-mounted display (HMD). Human beings are visual creatures, and display technology is often the single biggest difference between immersive Virtual Reality systems and traditional user interfaces. For instance, CAVE automatic virtual environments actively display virtual content onto room-sized screens. While they are fun for people in universities and big labs, consumer and industrial wearables are the wild west.


How Augmented Reality Works?

The idea of augmented reality is to superimpose graphics, audio and other sensory enhancements over a real-world environment in real time. Sounds pretty simple. Besides, haven't television networks been doing that with graphics for decades? However, augmented reality is more advanced than any technology you've seen in television broadcasts, although some new TV effects come close, such as RACEf/x and the super-imposed first down line on televised U.S. football games, both created by Sportvision.

  • Marker-based augmented reality uses a camera and some type of visual marker, such as a QR/2D code, to produce a result only when the marker is sensed by a reader. Marker-based applications use a camera on the device to distinguish a marker from any other real-world object. Distinct but simple patterns are used as the markers, because they can be easily recognized and do not require a lot of processing power to read. The position and orientation are also calculated, and some type of content or information is then overlaid on the marker.

  • As one of the most widely implemented applications of augmented reality, markerless (also called location-based, position-based, or GPS) augmented reality, uses a GPS, digital compass, velocity meter, or accelerometer which is embedded in the device to provide data based on your location. A strong force behind markerless augmented reality technology is the wide availability of smartphones and location detection features they provide. It is most commonly used for mapping directions, finding nearby businesses, and other location-centric mobile applications.

  • Projection-based augmented reality works by projecting artificial light onto real-world surfaces. Projection-based augmented reality applications allow for human interaction by sending light onto a real-world surface and then sensing the human interaction (i.e. touch) with that projected light. Detecting the user's interaction is done by differentiating between an expected (or known) projection and the altered projection caused by the user's interaction. Another interesting application of projection-based augmented reality utilizes laser plasma technology to project a three-dimensional (3D) interactive hologram into mid-air.

  • Superimposition-based augmented reality either partially or fully replaces the original view of an object with a newly augmented view of that same object. In superimposition-based augmented reality, object recognition plays a vital role, because the application cannot replace the original view with an augmented one if it cannot determine what the object is. A strong consumer-facing example of superimposition-based augmented reality can be found in the IKEA augmented reality furniture catalogue: by downloading an app and scanning selected pages in the printed or digital catalogue, users can place virtual IKEA furniture in their own home.

How Virtual Reality Works?

Unsurprisingly, the video games industry is one of the largest proponents of Virtual Reality. Support for the Oculus Rift headsets has already been jerry-rigged into games like Skyrim and Grand Theft Auto, but newer games like Elite: Dangerous come with headset support built right in. Many tried-and-true user interface metaphors in gaming have to be adjusted for VR (after all, who wants to have to pick items out of a menu that takes up your entire field of vision?), but the industry has been quick to adapt as the hardware for true Virtual Reality gaming has become more widely available. Scientific and engineering data visualization has also benefited for years from Virtual Reality, and recent innovation in display technology has generated interest in everything from molecular visualization to architecture to weather models.

  • For the complete VR experience, we need three things. First, a plausible and richly detailed virtual world to explore; a computer model or simulation, in other words. Second, a powerful computer that can detect what we're doing and adjust our experience accordingly, in real time, so what we see or hear changes as fast as we move, just like in real reality. Third, hardware linked to the computer that fully immerses us in the virtual world as we roam around. Usually, we'd need to put on a head-mounted display with two screens and stereo sound, and wear one or more sensory gloves. Alternatively, we could move around inside a room fitted out with surround-sound loudspeakers, onto which changing images are projected from outside. We'll explore VR equipment in more detail in a moment.

  • A highly realistic flight simulator on a home PC might qualify as nonimmersive virtual reality, especially if it uses a very wide screen, with headphones or surround sound, and a realistic joystick and other controls. Not everyone wants or needs to be fully immersed in an alternative reality. An architect might build a detailed 3D model of a new building to show to clients that can be explored on a desktop computer by moving a mouse. Most people would classify that as a kind of virtual reality, even if it doesn't fully immerse you. In the same way, computer archaeologists often create engaging 3D reconstructions of long-lost settlements that you can move around and explore.

  • What about virtual world games like Second Life and Minecraft? Do they count as virtual reality? Although they meet the first four of our criteria (believable, interactive, computer-created and explorable), they don't really meet the fifth: they don't fully immerse you. But one thing they do offer that cutting-edge VR typically doesn't is collaboration: the idea of sharing an experience in a virtual world with other people, often in real time or something very close to it. Collaboration and sharing are likely to become increasingly important features of VR in the future.

  • Virtual reality was one of the hottest, fastest-growing technologies in the late 1980s and early 1990s, but the rapid rise of the World Wide Web largely killed off interest after that. Even though computer scientists developed a way of building virtual worlds on the Web (using a technology analogous to HTML called Virtual Reality Markup Language, VRML), ordinary people were much more interested in the way the Web gave them new ways to access real reality: new ways to find and publish information, shop, and share thoughts, ideas and experiences with friends through social media. With Facebook's growing interest in the technology, the future of VR seems likely to be both Web-based and collaborative.

  • Mobile devices like smartphones and tablets have put what used to be supercomputer power in our hands and pockets. If we're wandering round the world, maybe visiting a heritage site like the pyramids or a fascinating foreign city we've never been to before, what we want is typically not virtual reality but an enhanced experience of the exciting reality we can see in front of us. That's spawned the idea of augmented reality (AR), where, for example, you point your smartphone at a landmark or a striking building and interesting information about it pops up automatically.

    Difference Between AR and VR

    Virtual reality and augmented reality accomplish two very different things in two very different ways, despite the similar designs of the devices themselves. VR replaces reality, taking you somewhere else. AR adds to reality, projecting information on top of what you're already seeing. They're both powerful technologies that have yet to make their mark with consumers, but show a lot of promise. They can completely change how we use computers in the future, but whether one or both will succeed is anyone's guess right now.

Microservices

Microservice architecture, or simply microservices, is a distinctive method of developing software systems that focuses on building single-function modules with well-defined interfaces and operations. The trend has grown popular in recent years as enterprises look to become more agile and move towards DevOps and continuous testing. Microservices, also known as the microservice architecture, is an architectural style that structures an application as a collection of services that are highly maintainable and testable, loosely coupled, independently deployable, and organized around business capabilities. It enables the continuous delivery/deployment of large, complex applications, and it also enables an organization to evolve its technology stack.
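
As an illustrative sketch of a single-function module with a well-defined interface (the framework, route and port are our assumptions, not from the text), here is a minimal Python microservice using Flask:

    from flask import Flask, jsonify

    app = Flask(__name__)

    @app.route("/prices/<item_id>")
    def get_price(item_id: str):
        # One narrow business capability: price lookup. Other concerns
        # (inventory, orders) would live in services of their own.
        return jsonify({"item": item_id, "price_usd": 9.99})

    if __name__ == "__main__":
        app.run(port=5001)   # independently deployable and replaceable

Each such service can be built, tested, deployed and scaled on its own, which is exactly the decoupling described later in this section.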


Microservices have many benefits for Agile and DevOps teams - as Martin Fowler points out, Netflix, eBay, Amazon, Twitter, PayPal, and other tech stars have all evolved from monolithic to microservices architecture. Unlike microservices, a monolithic application is built as a single, autonomous unit. This makes changes to the application slow, because any change affects the entire system. A modification made to a small section of code might require building and deploying an entirely new version of the software. Scaling specific functions of an application also means you have to scale the entire application.


Future of Microservices

Whether or not microservices become the preferred style of developers in the future, it's clearly a potent idea that offers serious benefits for designing and implementing enterprise applications. Many developers and organizations, without ever using the name or even labeling their practice as SOA, have been using an approach toward leveraging APIs that could be classified as microservices. We've seen a number of existing technologies try to address parts of the segmentation and communication problems that microservices aim to resolve. SOAP does well at describing the operations available on a given endpoint and where to discover it via WSDLs. UDDI is theoretically a good step toward advertising what a service can do and where it can be found. But these technologies have been compromised by relatively complex implementations, and they tend not to be adopted in newer projects.

  • Services within a system are largely decoupled, so the application as a whole can be easily built, altered, and scaled. Microservices are treated as independent components that can be easily replaced and upgraded (a companion sketch after this list shows two such services communicating only through an HTTP contract).

  • Microservices are very simple and focus on a single capability. Developers and teams can work independently of each other, which increases speed.

  • Microservices allow frequent releases of software through systematic automation of software creation, testing, and approval. Rather than focusing on applications as projects, teams treat applications as products for which they are responsible.

  • The focus is on using the right tool for the right job: there is no standardized pattern or mandated technology stack, and developers have the freedom to choose the tools best suited to their problems. Microservices also support agile development, so any new feature can be quickly developed and discarded again.
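
To illustrate the decoupling described in the first bullet above, here is a hypothetical companion to the catalog sketch earlier: a second service that consumes the catalog purely through its HTTP contract. The service name, URL, and use of the requests library are assumptions for illustration.

    # pricing_service.py -- illustrative consumer of the catalog service sketched earlier.
    # Assumes the requests library is installed: pip install requests
    import requests

    CATALOG_URL = "http://localhost:5001"  # hypothetical address of the catalog service

    def price_with_tax(sku: str, tax_rate: float = 0.2) -> float:
        # This service knows nothing about how the catalog stores its data;
        # it depends only on the documented HTTP contract, so the catalog can
        # be rewritten, rescaled, or replaced without any change here.
        response = requests.get(f"{CATALOG_URL}/products/{sku}", timeout=2)
        response.raise_for_status()
        product = response.json()
        return round(product["price"] * (1 + tax_rate), 2)

    if __name__ == "__main__":
        print(price_with_tax("sku-1"))  # 59.99 with the sample catalog data

Either side of this contract can be upgraded or swapped independently, which is exactly the replaceability the bullets describe.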

How Do Microservices Work?

Microservice architecture has now found its way into large company systems. The companies have subsequently been able to fix certain problems or optimize their processes. Examples like Netflix, Spotify, and eBay show large companies with established monolithic systems changing to a microservice model. Other major IT companies like Google and Amazon also work like this. Some of them were already using modular systems when there was no term for them.

Netflix used to be based on a monolithic system (back when Netflix was not an online streaming service, but only sent DVDs through the mail). In 2008, an error in a database caused the entire service to fail for four days. The decision was then taken to break up the old system and split it into microservices. The result was that the company was able to make live changes much faster, and repairs were carried out much more quickly. Since the Netflix system is enormously extensive, a separate program was developed to coordinate the individual microservices.

The streaming service Spotify also relies on microservices. Spotify's main daily development challenge is keeping ahead of strong competition: the audio streaming market counts some of the largest IT companies in the world, such as Amazon, Apple, and Google, among its main players. Due to the increasing number of users, Spotify developers constantly have to meet higher demands and comply with certain business rules. Microservices are a good solution for Spotify, allowing them to react quickly to new developments from their competitors and to publish their own developments faster, forcing the competitors to react in turn.

The fact that eBay and other companies have successfully gone from a monolithic to microservice architecture is a clear sign of the benefits of a more modern approach. While the monolith structure is perfectly sufficient in the early days of a website, with a small number of active users and a manageable range of products, it can become growth-inhibiting when demands start to increase.

  • Let's explore Conway's Law, which states: "Organizations which design systems are constrained to produce designs which are copies of the communication structures of these organizations." Instinctively, we separate out the high-risk activities; it's only difficult to decide responsibilities for tasks like customer refunds. Consider how we might answer questions like "Does the Accounting team have enough people to process both customer refunds and credits?" or "Wouldn't it be a better outcome to have our Support people apply credits and deal with frustrated customers?"

  • How do we go from monoliths to services? One way is through service objects. Without removing code from your application, you effectively begin to structure it as though it were completely external. To do that, you first need to differentiate the actions that can be done from the data that is present as inputs and outputs of those actions. We end up with two distinct classes: one that models the data, and one that performs the operations (one plausible shape for this is sketched after this list). Importantly, our JobService class has little or no state: you can call the same actions over and over, changing only the data, and expect to get consistent results. If JobService somehow started taking place over a network, our otherwise monolithic application wouldn't care. Shifting these types of classes into a library, and substituting a network client for the previous implementation, would allow you to transform the existing code into a scalable external service.

  • Perhaps the most important distinction is side effects: microservices avoid them. Think of Unix pipelines such as ls | wc -l and ls | less. Composing small pieces of functionality relies on repeatable results, a standard mechanism for input and output, and an exit code for a program to indicate success or lack thereof. We know this works from observational evidence, and we also know that a Unix pipe is a "dumb" interface because it has no control statements. The pipe applies the single responsibility principle (SRP) by pushing data from A to B, and it's up to members of the pipeline to decide if the input is unacceptable. In the same way, if the invoice calculation steps are idempotent, it's trivial to compose a draft invoice or preview the amounts payable by the customer by leveraging dedicated microservices.
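
Since the text names a JobService but shows no code, here is one plausible, purely illustrative shape for it in Python, combining the last two bullets: a class that models the data, a stateless class that performs the operations, and idempotent steps composed into a draft invoice. All names and fields are assumptions.

    # service_objects.py -- a hypothetical sketch of the JobService idea; not
    # the article's actual implementation.
    from dataclasses import dataclass

    @dataclass(frozen=True)
    class Job:
        # Models the data: pure inputs and outputs, no behaviour.
        job_id: str
        hours: float
        hourly_rate: float

    class JobService:
        # Performs the operations. Little or no state: the same call with the
        # same data always returns the same result, so this class could later
        # sit behind a network client without its callers noticing.

        @staticmethod
        def cost(job: Job) -> float:
            return round(job.hours * job.hourly_rate, 2)

        @staticmethod
        def draft_invoice(jobs: list) -> dict:
            # Composing idempotent steps, pipe-style: each step only transforms
            # its input, so previewing amounts payable is always safe to re-run.
            lines = [{"job_id": j.job_id, "amount": JobService.cost(j)} for j in jobs]
            return {"lines": lines, "total": round(sum(l["amount"] for l in lines), 2)}

    jobs = [Job("j1", 10, 80.0), Job("j2", 2.5, 120.0)]
    print(JobService.draft_invoice(jobs))
    # {'lines': [{'job_id': 'j1', 'amount': 800.0}, {'job_id': 'j2', 'amount': 300.0}], 'total': 1100.0}

Swapping the static methods for calls to a remote JobService over HTTP would leave this calling code unchanged, which is exactly the migration path the service-object bullet describes.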

    SOA vs Microservices

    Service Oriented Architecture (SOA) sprang up during the first few years of this century, and microservice architecture (abbreviated by some as MSA) bears a number of similarities. Traditional SOA, however, is a broader framework and can mean a wide variety of things. Some microservices advocates reject the SOA tag altogether, while others consider microservices to be simply an ideal, refined form of SOA. In any event, we think there are clear enough differences to justify a distinct microservice concept. The SOA model, for example, usually depends more heavily on ESBs, while microservices use faster messaging mechanisms. SOA also focuses on imperative programming, whereas microservices architecture favors a responsive-actor programming style. Moreover, SOA models tend to have an outsized relational database, while microservices frequently use NoSQL or micro-SQL databases (which can be connected to conventional databases). But the real difference has to do with the architecture methods used to arrive at an integrated set of services in the first place.