SM 6th-SEM CSE Cloud Computing
Lecture Note
On
Cloud Computing
Prepared by
Amalendu Kumar Pradhan, Lecturer
Department of Computer Science & Engineering
Email- [email protected]
CONTENTS
1 UNIT-1
INTRODUCTION TO CLOUD COMPUTING
2 UNIT-2
CLOUD COMPUTING ARCHITECTURE
3 UNIT-3
SCALABILITY AND FAULT TOLERANCE
4 UNIT-4
CLOUD MANAGEMENT AND VIRTUALISATION TECHNOLOGY
5 UNIT-5
VIRTUALISATION
6 UNIT-6
CLOUD SECURITY
7 UNIT-7
CLOUD COMPUTING SECURITY ARCHITECTURE
8 UNIT-8
MARKET BASED MANAGEMENT OF CLOUDS
9 UNIT-9
HADOOP
UNIT-1
Introduction to Cloud Computing
Cloud computing is the on-demand delivery of IT resources over the Internet. It is a technology
for manipulating, configuring, and accessing hardware and software resources remotely, and it
offers online data storage, infrastructure, and applications.
Instead of buying, owning, and maintaining physical data centres and servers, one can access
technology services, such as computing power, storage, and databases, on an as-needed basis
from a cloud provider. Cloud computing offers platform independence, which means software
does not need to be installed on a local PC.
In a cloud computing system, remote servers are responsible for running everything from
e-mail to word processing to complex data analysis programs for the client users, and all of the
computing takes place on machines owned by another company.
Cloud can provide services over public and private networks, i.e., WAN, LAN or VPN.
Applications like e-mail, web conferencing, and customer relationship management (CRM) execute
on the cloud.
Advantages of Cloud Computing
• Back-up and restore data: Once data is stored in the cloud, it is easy to back it up and
restore it using cloud services (see the backup sketch after this list).
• Improved collaboration: Cloud applications have improved collaboration by allowing
groups of people to share information in the cloud quickly and easily.
• Excellent accessibility: It allows us to quickly and easily store and access data from
anywhere, at any time, using an internet connection. Ultimately, this increases
the productivity and efficiency of the organization.
• Low maintenance cost: Cloud computing reduces both hardware and software
maintenance costs for an organization.
• Mobility: Cloud computing allows us to easily access all cloud data while on the move.
• Unlimited storage capacity: Cloud offers us a huge amount of storage capacity for
storing our data such as documents, images, audio, video, etc. in one place.
• Data security: Data security is one of the biggest advantages of cloud computing. Cloud
offers many advanced features related to security and ensures that the data is safe.
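As a concrete illustration of the backup-and-restore point above, here is a minimal sketch using the AWS SDK for Python (boto3); the bucket name and file names are hypothetical, and the calls assume valid AWS credentials are configured.

```python
import boto3

s3 = boto3.client("s3")  # client for S3 object storage

# Back up: copy a local file into cloud object storage
s3.upload_file("report.docx", "my-backup-bucket", "backups/report.docx")

# Restore: pull the stored copy back down to the local machine
s3.download_file("my-backup-bucket", "backups/report.docx", "report-restored.docx")
```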
Disadvantages of Cloud Computing
• Internet Connectivity: A cloud server can be accessed only through the internet, so if
internet connectivity is poor or absent, the data cannot be accessed properly.
• Vendor lock-in: Vendor lock-in is the biggest disadvantage of cloud computing.
Organizations may face problems when transferring their services from one vendor to
another. Because different vendors provide different platforms, it can be difficult to move
data from one cloud to another.
• Limited Control: As we know, cloud infrastructure is completely owned, managed, and
monitored by the service provider, so the cloud users have less control on the cloud
servers.
• Security: Although cloud service providers implement the best security standards,
before adopting cloud technology an organization must be aware that it is
handing over all of its sensitive information to a third party, namely the
cloud computing service provider. While sending data to the cloud, there is a
chance that the organization's information could be intercepted by hackers.
Types of Cloud
The following four types of cloud can be deployed, according to the
organization's requirements.
Public Cloud
Public cloud is open to all to store and access information through Internet using the pay-per-
usage method. In public cloud, computing resources are managed and operated by the Cloud
Service Provider (CSP).
o Public cloud costs less to own than private and hybrid clouds.
o Public cloud is maintained by the cloud service provider, so consumers do not need
to worry about maintenance.
o Public cloud is easier to integrate, and hence offers better flexibility to
consumers.
o Public cloud is location independent because its services are delivered through the
internet.
o Performance depends upon the high-speed internet network link to the cloud
provider.
Private Cloud
o Private cloud provides a high level of security and privacy to the users.
o Private cloud offers better performance with improved speed and space capacity.
o It allows the IT team to quickly allocate and deliver on-demand IT resources.
o The organization has full control over the cloud because it is managed by the
organization itself, so there is no need for the organization to depend on anyone.
o It is suitable for organizations that require a separate cloud for their own use and
for which data security is the first priority.
Hybrid Cloud
Hybrid cloud is a combination of public and private clouds: non-critical activities are
typically performed using the public cloud, while critical activities are performed using the
private cloud.
Community Cloud
o Community cloud is cost-effective because the whole cloud is being shared by several
organizations or communities.
o Community cloud is suitable for organizations that want to have a collaborative cloud
with more security features than the public cloud.
o It provides better security than the public cloud.
o It provides collaborative and distributive environment.
o Community cloud allows us to share cloud resources, infrastructure, and other
capabilities among various organizations.
Historical development
Before cloud computing emerged, there was client/server computing, which is basically
centralized storage in which all the software applications, all the data and all the controls
reside on the server side. If a single user wants to access specific data or run a program,
he or she needs to connect to the server, gain appropriate access, and can then carry out the
work.
The concept of cloud computing came into existence in the 1950s with the implementation of
mainframe computers, accessible via thin/static clients. Since then, cloud computing has
evolved from static clients to dynamic ones, and from software to services.
Vision of Cloud Computing
The vision of cloud computing includes the following:
1. Cloud computing provides virtual hardware, runtime environments and services to an
individual or an organization.
2. The services of a cloud can be used for as long as the user needs them; there is no
requirement for any upfront commitment.
3. The long-term vision of cloud computing is that IT services and business can be traded
as utilities in an open market without any technological or legal barriers.
4. The existence of a global platform for trading cloud services will also help service
providers to potentially increase their revenue.
5. A cloud provider can also become a consumer of a competitor's service in order to fulfil
its promises to customers.
Characteristics of Cloud Computing
1. On-demand self-service: Users get on-demand computing services, such as email and
applications, without interacting with the service provider. Some cloud service providers are
Amazon Web Services, Microsoft, IBM and Salesforce.com.
2. Broad network access: Cloud services are available over the network and can be accessed by
different clients through cell phones, tablets, laptops, etc.
3. Resource pooling: The same resources can be used by more than one customer at the same
time. For example, storage and network bandwidth can be used by any number of customers,
without them knowing the exact location of those resources.
4. Rapid elasticity: Cloud services can be provisioned and released on the user's demand. To
the user, cloud capabilities appear unlimited and can be accessed at any time.
5. Measured service: The resources used by users can be monitored and controlled, and reports
are available to both cloud providers and consumers. On the basis of these measured reports,
the cloud system automatically controls and optimizes resources based on the type of service.
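To make the measured-service, pay-per-use idea concrete, here is a toy billing sketch in Python; the service names, per-unit rates, and usage figures are all hypothetical.

```python
# Hypothetical per-unit prices for three metered services
RATES = {"compute_hours": 0.05, "storage_gb_month": 0.02, "egress_gb": 0.09}

def monthly_bill(usage):
    """Sum metered usage multiplied by the per-unit rate."""
    return sum(RATES[item] * amount for item, amount in usage.items())

print(monthly_bill({"compute_hours": 720, "storage_gb_month": 100, "egress_gb": 50}))
# 720*0.05 + 100*0.02 + 50*0.09 = 42.5
```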
Cloud Computing Reference Model
The cloud computing reference model is a conceptual model that characterizes and standardizes
the functions of a cloud computing environment by partitioning it into conceptual layers and
cross-layer functions. This reference model groups the cloud computing functions and activities
into five logical layers and three cross-layer functions.
Physical Layer
• It is the foundation layer of the cloud infrastructure.
• It specifies the entities that operate at this layer: compute systems, network devices and
storage devices, along with the operating environment, protocols, tools and processes.
• It executes the requests generated by the virtualization and control layers.
Virtual Layer
• Specifies the entities that operate at this layer: virtualization software and resource pools.
• It abstracts physical resources and makes them appear as virtual resources (e.g., virtual
machines, virtual networks and virtual storage).
Control Layer
• Specifies the entities that operate at this layer i.e. Orchestration software.
• It provides workflows for executing automated tasks.
Service Layer
• Specifies the entities that operate at this layer: the service catalogue and the self-service
portal.
• It presents cloud services to consumers and enables them to request those services.
Cross-layer functions
Business continuity
It is responsible for handling faults of any kind, as well as for data replication and
backup.
Security
It provides secure data transmission between the cloud and the consumer, and it protects the
consumer's information.
Service Management
It specifies the adoption of activities related to service portfolio management and service
operation management.
In a cloud environment, consumers can deploy and run their software applications on a
sophisticated infrastructure that is owned and managed by cloud provider (e.g., Amazon Web
Services, Microsoft Azure, and Google Cloud Platform). The main benefits of such cloud
computing environments are listed below.
1. Efficiency / cost reduction: By using cloud infrastructure, you don't have to spend huge
amounts of money on purchasing and maintaining equipment.
2. Data security: Cloud offers many advanced security features that secure stored
data. Cloud storage providers implement baseline protections for their platforms, such as
authentication, access control, and encryption.
3. Disaster recovery: Data loss is a major concern for all organizations, along with data
security. Storing data in the cloud guarantees that it is always available, even if
client equipment such as laptops or PCs is damaged. Cloud-based services provide
quick data recovery in all kinds of emergency situations.
4. Control: Cloud gives you complete visibility and control over your data. One can easily
decide which users can access which data.
5. Market reach: The development of cloud technology enables new IT companies to reach the
market easily and quickly.
6. Energy efficiency: Because cloud data centres are energy efficient, they are less likely
to damage or affect other parts of the IT infrastructure or organization.
7. Security: The cloud infrastructure is responsible for risk management, which covers the
risks involved in the services being provided by the cloud service providers.
8. Resilience (flexibility): Due to this flexibility, the infrastructure is protected on all
sides and IT operations are not easily affected.
Cloud Adoption
Cloud adoption means adopting a service or technology from a cloud service provider.
• It supports interactive applications that combine two or more data sources. For example,
if a company needs to grow its business across the whole country in a short span of time,
it will need a quick promotion across the country, which it can achieve by adopting cloud
technology.
• Cloud adoption is useful when recovery management and backup-recovery-based
implementations are required.
• It works well with research and development projects, i.e., the testing of new services
and design models, as well as applications that can be adjusted to run on small servers.
Cloud applications
Cloud computing has applications in almost all fields, such as business, entertainment, data
storage, social networking, management, education, art and GPS (Global Positioning System).
Some of the most widely known cloud computing applications are:
• Business Applications
Cloud computing has made business easier and more collaborative by incorporating
various apps such as MailChimp, Chatter, Google Apps for Business, and QuickBooks.
• MailChimp: MailChimp is an email publishing platform which provides various
options to design, send, and save templates for emails.
• Data Storage and Backup Applications
Cloud computing allows us to store information (data, files, images, audio, and video) on
the cloud and access it using an internet connection. As the cloud provider is responsible
for providing security, it offers various backup and recovery applications for retrieving
lost data. A list of data storage and backup applications in the cloud is given below:
• Mozy: Mozy provides powerful online backup solutions for our personal and
business data. It automatically schedules a backup each day at a specific time.
• Joukuu: Joukuu provides the simplest way to share and track cloud-based backup
files. Many users use Joukuu to search files and folders and to collaborate on documents.
• Management Applications
Cloud computing offers various cloud management tools which help admins manage
all types of cloud activity, such as resource deployment, data integration, and disaster
recovery. These management tools also provide administrative control over the
platforms, applications, and infrastructure. Some important management applications
are:
• Toggl: Toggl helps users to track allocated time period for a particular project.
• Evernote: Evernote allows you to sync and save your recorded notes, typed
notes, and other notes in one convenient place. It is available in both free and
paid versions, and it runs on platforms such as Windows, macOS, Android, iOS,
browsers, and Unix.
• GoToMeeting: GoToMeeting is a video-conferencing application that helps users perform
management-related tasks such as joining meetings in seconds, viewing
presentations on a shared screen, and getting alerts for upcoming meetings.
• Social Applications
Social cloud applications allow a large number of users to connect with each other using
social networking applications such as Facebook, Twitter, LinkedIn, etc.
• Yammer: Yammer is the best team collaboration tool that allows a team of
employees to chat, share images, documents, and videos.
• Art Applications
Cloud computing offers various art applications for quickly and easily designing attractive
cards, booklets, and images.
Some of the most commonly used cloud art applications are given below:
• Moo: Moo is one of the best cloud art applications. It is used for designing and
printing business cards, postcards, and mini cards.
• Adobe Creative Cloud: Adobe Creative Cloud is made for designers, artists,
filmmakers, and other creative professionals. It is a suite of apps which includes
the Photoshop image-editing program, Illustrator, InDesign, Typekit,
Dreamweaver, XD, and Audition.
• Education Applications
Cloud computing has become very popular in the education sector. It offers various online
distance-learning platforms and student information portals to students. The advantages of
using the cloud in the field of education are strong virtual classroom environments, ease of
accessibility, secure data storage, scalability, greater reach for students, and minimal
hardware requirements for the applications.
• Google Apps for Education: Google Apps for Education is the most widely used
platform for free web-based email, calendar, documents, and collaborative study.
• Tablets with Google Play for Education: It allows educators to quickly implement
the latest technology solutions into the classroom and make it available to their
students.
• Entertainment Applications
• Online games: Today, cloud gaming has become one of the most important
entertainment media. It offers various online games that run remotely from the
cloud. The best cloud gaming services are GeForce Now, Vortex, Project xCloud,
and PlayStation Now.
UNIT-2
Cloud Computing Architecture
Front End
The front end is used by the client. It contains client-side interfaces and applications that are
required to access the cloud computing platforms. The front end includes web browsers
(including Chrome, Firefox, Internet Explorer, etc.), thin and fat clients, tablets, and mobile
devices.
Back End
The back end is used by the service provider. It manages all the resources that are required to
provide cloud computing services. It includes huge amounts of data storage, security
mechanisms, virtual machines, deployment models, servers, traffic control mechanisms, etc.
1. Client Infrastructure: Client infrastructure is a front-end component. It provides a
graphical user interface (GUI) through which end users interact with the cloud.
2. Application: The application may be any software or platform that a client wants to access.
3. Service: A cloud service manages which type of service you access according to the
client's requirement. Cloud computing offers the following three types of services:
i. Software as a Service (SaaS) – It is also known as cloud application services. SaaS
applications run directly through the web browser, so they do not need to be
downloaded or installed. Example: Google Apps, Salesforce, Dropbox.
ii. Platform as a Service (PaaS) – It is also known as cloud platform services. It is quite
similar to SaaS, but the difference is that PaaS provides a platform for software creation,
whereas using SaaS we can access software over the internet without the need for any
platform. Example: Windows Azure, Force.com, Magento Commerce Cloud, OpenShift.
iii. Infrastructure as a Service (IaaS) – It is also known as cloud infrastructure services.
It manages application data, middleware, and runtime environments. Example: Amazon Web
Services (AWS) EC2, Google Compute Engine (GCE), Cisco Metapod.
4. Runtime Cloud: Runtime Cloud provides the execution and runtime environment to the
virtual machines.
5. Storage: Storage is one of the most important components of cloud computing. It provides a
huge amount of storage capacity in the cloud to store and manage data.
6. Infrastructure: It provides services on the host level, application level, and network level.
Cloud infrastructure includes hardware and software components such as servers, storage,
network devices, virtualization software, and other storage resources that are needed to
support the cloud computing model.
7. Management: Management is used to manage back-end components such as the application,
service, runtime cloud, storage, and infrastructure, and to coordinate between them.
8. Security: Security is an in-built back-end component of cloud computing; it implements
security mechanisms in the back end.
9. Internet: The internet is the medium through which the front end and back end interact
and communicate with each other.
The following scenarios require portability and interoperability between clouds:
Switching cloud service providers: the customer wants to move an application and data from
Cloud 1 to Cloud 2;
Use of multiple cloud service providers: the customer subscribes to the same or different
services from two or more clouds (Clouds 1 and 2);
Directly linked cloud services: the customer needs Cloud 1 to be linked to Cloud 3 to make use
of its services;
Hybrid cloud configuration: the customer connects legacy systems to an internal private cloud
(e.g., Cloud 1) which is linked to a public cloud service (e.g., Cloud 3); and
Cloud migration: the customer moves one or more in-house applications and/or data to Cloud 1.
Infrastructure as a Service (IaaS)
Benefits
IaaS allows the cloud provider to freely locate the infrastructure over the Internet in a cost-
effective manner. Some of the key benefits of IaaS are listed below:
• Full control over computing resources: IaaS allows the customer to access computing
resources with administrative rights over the virtual machines.
• Flexible and efficient renting of computer hardware: IaaS resources such as virtual machines,
storage devices, bandwidth, IP addresses, firewalls, etc. are made available to the customers
on rent. Also with administrative access to virtual machines, the customer can run any type
of software.
Issues
• Data erase practices: The customer uses virtual machines that in turn use common
disk resources provided by the cloud provider. When the customer releases a
resource, the cloud provider must ensure that the next customer to rent that resource
does not observe data residue from the previous customer.
Platform as a Service (PaaS)
Benefits
• More current system software: It is the responsibility of the cloud provider to maintain
software versions and patch installations.
Issues
PaaS places significant burdens on customers' browsers to maintain reliable and secure
connections to the provider's systems. Some specific issues associated with
PaaS are:
• Lack of portability between PaaS clouds: Although standard languages are used, the
implementations of platform services may vary. For example, the file, queue, or hash table
interfaces of one platform may differ from those of another, making it difficult to transfer
workloads from one platform to another.
• Event based processor scheduling: The PaaS applications are event-oriented i.e., they have
to answer a request in a given interval of time.
Software as a Service (SaaS)
Characteristics
• SaaS applications are cost-effective since they do not require any maintenance at end
user side.
• They are available on demand.
• They can be scaled up or down on demand.
• They are automatically upgraded and updated.
• SaaS offers a shared data model; therefore, multiple users can share a single instance of
the infrastructure. It is not required to hard-code functionality for individual users.
• All users run the same version of the software.
Benefits
Using SaaS has proved to be beneficial in terms of scalability, efficiency and performance. Some
of the benefits are listed below:
• Modest software tools: The SaaS application deployment requires little or no client-
side software installation, which results in the following benefits:
o No requirement for complex software packages at client side
o Little or no risk of configuration at client side
o Low distribution cost
• Efficient use of software licenses: The customer can have a single license for multiple
computers running at different locations, which reduces the licensing cost. Also, there is
no requirement for license servers because the software runs in the provider's
infrastructure.
• Centralized management and data: The cloud provider stores data centrally. However,
the cloud providers may store data in a decentralized manner for the sake of
redundancy and reliability.
• Platform responsibilities managed by providers: All platform responsibilities such as
backups, system maintenance, security, hardware refresh, power management, etc. are
performed by the cloud provider. The customer does not need to bother about them.
Issues
There are several issues associated with SaaS, some of them are listed below:
Browser-based risks: If the customer visits a malicious website and the browser becomes
infected, subsequent access to the SaaS application might compromise the customer's data. To
avoid such risks, the customer can use a dedicated browser to access SaaS applications or can
use a virtual desktop while accessing them.
Network dependence: The SaaS application can be delivered only when the network is
continuously available. The network should also be reliable, but network reliability cannot be
guaranteed by either the cloud provider or the customer.
Lack of portability between SaaS clouds: Transferring workloads from one SaaS cloud to
another is not easy, because workflows, business logic, user interfaces, and support scripts
can be provider-specific.
UNIT-3
Scalability and Fault Tolerance
Introduction
Fault tolerance in cloud computing is very important for continuing a service whenever a few
devices or components are down or unavailable. It helps the service provider to evaluate
infrastructure requirements and to provide services even when the associated devices are
unavailable for some reason.
• Cloud scalability is the ability to scale facilities and services on demand, as and when
they are required by the user (see the autoscaling sketch after this list).
• Cloud fault tolerance is the ability of the cloud to keep its services running correctly
even when faults occur, including those caused by user mistakes.
• Cloud middleware is designed with the principle of scalability along different dimensions
in mind, e.g., performance, size and load.
• The cloud middleware manages a huge number of resources and users that depend on
the cloud.
• So in this overall scenario, the ability to tolerate failures is expected, but it sometimes
becomes more important than providing an efficient and optimized system.
• The overall conclusion is that "it is a challenging task for cloud providers to develop
highly scalable, fault-tolerant systems while at the same time providing
competitive performance."
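The sketch below illustrates the scalability idea with a toy autoscaling rule in Python, similar in spirit to the proportional rule used by autoscalers such as the Kubernetes Horizontal Pod Autoscaler; the target utilization and node limits are hypothetical.

```python
import math

def desired_nodes(current, avg_util, target_util=0.6, min_n=2, max_n=50):
    """Scale the cluster so that average utilization moves toward the target."""
    needed = math.ceil(current * avg_util / target_util)
    return max(min_n, min(max_n, needed))   # clamp to the allowed range

print(desired_nodes(10, 0.9))  # load high -> scale out to 15 nodes
print(desired_nodes(10, 0.3))  # load low  -> scale in to 5 nodes
```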
Replication: A fault-tolerant system runs several replicas of each and every service. Thus, if
one part of the system goes wrong, other instances can take its place and keep the service
running (see the sketch below).
Redundancy: When any part of the system fails or moves towards a down state, it is important
to have backup systems in place.
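A minimal sketch of how replication masks failures, assuming the service is replicated behind the hypothetical URLs below: the client simply tries each replica in turn and succeeds as long as at least one is alive.

```python
import requests

REPLICAS = ["https://svc-a.example.com", "https://svc-b.example.com",
            "https://svc-c.example.com"]   # hypothetical replica endpoints

def fetch(path):
    for base in REPLICAS:
        try:
            resp = requests.get(base + path, timeout=2)
            resp.raise_for_status()
            return resp.json()          # first healthy replica answers
        except requests.RequestException:
            continue                    # replica down -> try the next one
    raise RuntimeError("all replicas failed")
```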
System failure: This may be either a software or a hardware issue. A software failure results
in a system crash, which may be due to data overflow or other reasons. Improper maintenance of
the physical hardware machines will result in hardware system failure.
Security breach occurrences: There are several security-related reasons why fault tolerance is
needed. The hacking of a server negatively impacts it and can result in data loss. Other
security cracks that create a need for fault tolerance include phishing, virus attacks, etc.
Cloud solutions
Any cloud-based solution provides application software, storage space, on-demand services,
computer networks, and other resources that are associated with cloud computing.
• Cloud providers use a pay-as-you-go model, so the client pays the cloud provider only for
what is required. This is very helpful for start-ups.
• For end users, cloud computing means they can access everything, such as files, emails and
business applications, from any device and from anywhere, as long as there is an internet
connection.
• As cloud-based technology grows and Software as a Service (SaaS) solutions become available
at affordable prices, small and medium-sized business (SMB) clients are increasingly
interested in cloud computing.
Cloud Ecosystem
A cloud ecosystem is a complex system of interdependent components that all work together
to enable cloud services. In cloud computing, the ecosystem consists of hardware and
software as well as cloud customers, cloud engineers, consultants, integrators and partners.
A robust ecosystem provides a cloud provider's customers with an easy way to find and purchase
business applications and respond to changing business needs. When the apps are sold through
a provider’s app store such as AWS (Amazon Web Services) Marketplace, Microsoft Azure
Marketplace (for cloud software) or Microsoft AppSource (for business applications), the
customer can access a catalogue of different vendors' software and services that have already
been scrutinized and reviewed for security, risk and cost.
• Companies can use a cloud ecosystem to build new business models. They can promote
their business using the cloud ecosystem and then sell their products to customers,
especially in fields such as medical equipment.
• In a cloud ecosystem, it is also easier to review data and analyse how each part of
the system affects the other parts. For example, a doctor can examine a patient over the
cloud, because all of the patient's previous data and present problems are available in
the cloud.
• A cloud ecosystem is helpful because it is a complex system of interdependent components
that work together to enable cloud services.
• The centre of a cloud ecosystem is a public cloud provider. It might be an IaaS provider
such as Amazon Web Services (AWS) or a SaaS vendor such as Salesforce.
• There is no vendor lock-in in the cloud ecosystem, meaning a client can move its business
from one cloud to another without any restriction, much like mobile number portability.
Cloud Business Process Management (BPM)
Anywhere, anytime access: Cloud BPM stores information in a centralized database, thereby
making access possible at any time from any location. Further, stakeholders can access the
application from any device.
Secure data: Data security is an essential factor for any organization. Cloud BPM applications
come with a wide range of security features such as role-based access, conditional visibility,
data encryption, and more (see the encryption sketch below).
Reputed cloud business process management service providers host their applications on reliable
platforms such as Amazon Web Services or Google Cloud Platform, which in turn improves the
security of sensitive information.
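As a minimal illustration of the data-encryption feature mentioned above, the sketch below uses the symmetric Fernet scheme from Python's cryptography package; in a real cloud BPM platform the key would live in a key-management service, not in the program.

```python
from cryptography.fernet import Fernet

key = Fernet.generate_key()   # in practice, kept in a key-management service
f = Fernet(key)

token = f.encrypt(b"confidential process record")   # ciphertext stored in the cloud
plain = f.decrypt(token)                            # readable only with the key
assert plain == b"confidential process record"
```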
Reliable, consistent experience: In older client-server systems, users were constantly
threatened by the possibility of server downtime and virus or malware attacks. With cloud BPM,
vendors provide ample backup to ensure that there is minimal downtime, and they protect data
using built-in firewalls.
Better collaboration: Collaboration is incredibly easy with cloud BPM, irrespective of whether
the users are in the same office or at different offices. Centralized documentation, digital
checklists, and automated process flows make it possible for stakeholders to access
information whenever the need arises.
Improved insights: Cloud BPM applications have the capability to store all information in a
central database, which makes it simpler to monitor and analyse the data.
The goal of cloud portability and interoperability is to enable cloud service users to avoid
vendor lock-in and to allow customers to make the best use of multiple cloud services.
Basic scenarios
The Cloud Standards Customer Council (CSCC) guide to cloud portability and interoperability
has identified five major scenarios requiring interoperability and portability:
Switching cloud service providers: the customer can move an application and data from one
cloud to another.
Use of multiple cloud service providers: the customer subscribes to the same or different
services from two or more cloud service providers.
Directly linked cloud services: the customer needs Cloud 1 to be linked to Cloud 3 to make use
of its services.
Hybrid cloud configuration: the customer connects traditional systems to an internal private
cloud which is linked to a public cloud service.
Cloud migration: the customer moves one or more in-house applications and/or data to the cloud.
Cloud portability is the ability to transfer applications between cloud environments without losing
any data. Several cloud providers have portability facility.
Cloud interoperability refers to the ability of customers to use the same management tools, server
images and other software with a variety of cloud computing providers and platforms.
• Data portability is the ability to easily transfer data from one cloud service to another
cloud service.
Cloud Management Tasks
Data flow of the system: The managers are responsible for developing a description of the data
flow, i.e., how data moves between the organization and the cloud server.
Vendor lock-in awareness and solutions: The managers must know the procedure for exiting the
services of a particular cloud provider. Procedures must be defined that enable the cloud
managers to export an organization's data from their system to another cloud provider.
Knowing Provider’s Security Procedures: The managers should know the security plans of the
provider for the following services:
• Multi users
• E-commerce processing
• Employee screening
• Encryption policy
Monitor audit log use: In order to identify errors in the system, managers must audit the logs
on a regular basis.
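A toy sketch of such an audit-log check in Python; the log format (comma-separated "user,action,result" records) is hypothetical.

```python
from collections import Counter

def suspicious_users(lines, threshold=3):
    """Count failed logins per user and flag repeat offenders."""
    failures = Counter()
    for line in lines:
        user, action, result = line.strip().split(",")
        if action == "login" and result == "FAIL":
            failures[user] += 1
    return [u for u, n in failures.items() if n >= threshold]

log = ["alice,login,OK", "bob,login,FAIL", "bob,login,FAIL", "bob,login,FAIL"]
print(suspicious_users(log))  # ['bob']
```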
Solution Testing and Validation: When the cloud provider offers a solution, it is essential to
test it in order to ensure that it gives the correct result and it is error-free. This is necessary for
a system to be robust and reliable.
Cloud Offerings
Cloud computing offers various servers, storage, databases, networking, software, analytics,
and intelligence over the internet ("the cloud") to the client in an innovative, faster and
more flexible way. The various offerings are:
• Hypervisor: Through hardware virtualization, the time required to access and
terminate a server is reduced.
• MapReduce: Large data sets to be processed are divided into smaller data chunks
and distributed among multiple machines; the individual results are later consolidated
(see the word-count sketch after this list).
• Blob Storage: A large amount of data can be stored just as in a file system; that is,
data can be stored in a specified folder assigned to a particular type of file, such as
an audio folder, video folder, or image folder.
• Virtual Networking: It determines how physical networking resources, such as network
interface cards, switches and routers, can be used in a virtual mode. These virtual
networking resources may share the same physical networking resources.
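Here is the word-count sketch promised above: a single-machine Python illustration of the MapReduce idea, with the map, shuffle and reduce phases written out explicitly (a real deployment distributes these phases across many machines).

```python
from collections import defaultdict

def map_phase(chunk):
    """Map: emit a (word, 1) pair for every word in the chunk."""
    return [(word, 1) for word in chunk.split()]

def shuffle(pairs):
    """Shuffle: group all values by key."""
    groups = defaultdict(list)
    for key, value in pairs:
        groups[key].append(value)
    return groups

def reduce_phase(groups):
    """Reduce: consolidate each group into a single count."""
    return {key: sum(values) for key, values in groups.items()}

chunks = ["the cloud stores data", "the data cloud"]   # pre-split input chunks
pairs = [p for c in chunks for p in map_phase(c)]
print(reduce_phase(shuffle(pairs)))
# {'the': 2, 'cloud': 2, 'stores': 1, 'data': 2}
```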
Load and performance testing is conducted on the applications and services provided via cloud
computing in order to ensure maximum performance and scalability under a wide variety of
conditions. Testing under the cloud decreases the manual intervention of technical persons
needed to test network conditions.
• Reduces capital investment and operational costs without affecting the business
targets.
• Offers new and attractive services to the clients and provides an opportunity to speed
cycles of innovations and improve the solution quality.
For example, Google's VPC Service Controls act as a firewall for Google Cloud Storage (GCS),
BigQuery, Bigtable and other supported services. They give information security teams peace of
mind that no one can access data contained in these services from unauthorized networks.
Virtual Desktop Infrastructure (VDI) is a concept in which a server-based computing model is
used to deliver applications to remote users.
Virtual Desktop Infrastructure or VDI is the name given to a collection of technologies and
processes that extends the concept of a remote desktop.
The idea behind virtual desktop infrastructure is that companies can virtualize their desktop
operating systems, such as Windows XP or Vista, and run those same desktop OSs from within the
secured data centre.
UNIT-4
Cloud Management and Virtualisation Technology
Virtualization is a technique that allows a single physical instance of a resource
or an application to be shared among multiple customers
and organizations.
Virtualization is commonly hypervisor-based. The hypervisor isolates operating systems and
applications from the underlying computer hardware so that the host machine can run multiple
virtual machines.
Data Centre
A virtual data centre is a huge cloud infrastructure designed for enterprise business needs.
Virtual data centres are hosted in the public cloud and provide full compatibility with any
environment.
A virtualized data centre is a logical software abstraction of a physical data centre that
provides a collection of cloud infrastructure components, including servers, storage clusters,
and other networking components, to business enterprises.
Resilience
Resilient means "having the ability to spring back." Resiliency is the ability of a server,
network, storage system, or an entire data centre to recover quickly and continue operating
even when there has been an equipment failure, power outage or other disruption.
Data centre resiliency is a planned part of a cloud architecture and is usually associated
with other disaster planning and data centre disaster-recovery practices, such as data
protection.
Agility
Cloud agility refers to the addition of business value. When it comes to the cloud context, agility is all
about the ability of an organization to rapidly develop, test, and launch software applications that drive
business growth.
Greater Business Continuity and Flexibility: Due to agility, cloud services can be scaled up
or down as per business requirements without adding a pile of IT equipment. For example, you
can start with a 10-node cluster and then easily increase it to 50 nodes as your requirements
change.
Infrastructure Agility: Cloud allows companies to significantly decrease the time it takes to provision and
de-provision IT infrastructure.
Storage
Cloud storage is a service that allows data to be saved on an offsite storage system managed
by a third party and made accessible through a web-based API.
Storage Devices
Block Storage Devices: Block storage devices offer raw storage to the clients. This raw
storage is partitioned to create volumes.
File Storage Devices: The file Storage Devices offer storage to clients in the form of files,
maintaining its own file system. This storage is in the form of Network Attached Storage (NAS).
Unmanaged Cloud Storage: Unmanaged cloud storage means the storage is preconfigured for
the customer. The customer can neither format it, nor install his own file system, nor change
drive properties.
Managed Cloud Storage: Managed cloud storage offers online storage space on-demand. The
managed cloud storage system appears to the user to be a raw disk that the user can partition
and format.
Provisioning
Cloud provisioning is the allocation of a cloud provider's resources and services to a
customer. It is a key feature of the cloud computing model, relating to how a customer
procures cloud services and resources from a cloud provider. Cloud provisioning covers
infrastructure as a service (IaaS), software as a service (SaaS) and platform as a service
(PaaS).
• Dynamic provisioning: With dynamic provisioning, cloud resources are deployed flexibly
to match a customer's fluctuating demands (see the API sketch after this list).
• User self-provisioning: With user self-provisioning, also called cloud self-service, the
customer buys resources from the cloud provider through a web interface or portal. This
usually involves creating a user account and paying for resources with a credit card.
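The API sketch referenced above: dynamic provisioning ultimately comes down to calls like the following, shown here with the AWS SDK for Python (boto3); the AMI ID is a hypothetical placeholder, and the call assumes valid credentials and permissions.

```python
import boto3

ec2 = boto3.client("ec2", region_name="us-east-1")

# Provision one small virtual machine on demand
result = ec2.run_instances(
    ImageId="ami-0123456789abcdef0",   # hypothetical machine image
    InstanceType="t3.micro",
    MinCount=1,
    MaxCount=1,
)
print(result["Instances"][0]["InstanceId"])   # ID of the newly provisioned VM
```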
Asset Management
Cloud asset management (CAM) is a component of cloud management services focused on managing
the business's assets in the cloud environment, such as the products or services used in the
cloud. Cloud asset management delivers visibility and control of all the assets and
infrastructure that make up your cloud environment. It is a crucial first step towards a
better-optimised, more secure cloud.
Uses of MapReduce:
o It can be used in various applications such as document clustering, distributed storage and
web-link analysis.
o It can be used for distributed pattern-based searching.
o We can also use MapReduce in machine learning.
o It was used by Google to regenerate Google's index of the World Wide Web.
o It can be used in multiple computing environments, such as multi-cluster, multi-core, and
mobile environments.
Cloud Governance
Cloud governance is a set of rules. It applies specific policies or principles to the use of
cloud computing services. This model aims to secure applications and data even if they are
located remotely. The best cloud governance solutions encompass people, processes, and
technology. Cloud governance basically refers to the decision-making processes, criteria, and
policies involved in the planning, architecture, acquisition, deployment, implementation,
operation, and management of a cloud computing capability. Cloud governance best practices
help to optimize the organization's use of cloud services.
Load Balancing
Cloud load balancing is the process of distributing workloads and computing resources in a cloud
computing environment. Load balancing allows enterprises to manage application or workload
demands by allocating resources among multiple computers, networks or servers. Cloud load
balancing involves hosting the distribution of workload traffic and demands that reside over the
Internet. Cloud load balancing helps enterprises achieve high performance levels for potentially
lower costs than traditional on-premises load balancing technology. Cloud load balancing takes
advantage of the cloud's scalability and agility to meet rerouted workload demands and to
improve overall availability. In addition to workload and traffic distribution, cloud load balancing
technology can provide health checks for cloud applications.
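A toy Python sketch of the simplest load-balancing policy, round robin: each incoming request is handed to the next backend in rotation. The backend addresses are hypothetical.

```python
import itertools

BACKENDS = ["10.0.0.1", "10.0.0.2", "10.0.0.3"]   # hypothetical server pool
_rotation = itertools.cycle(BACKENDS)

def route(request_id):
    """Assign the request to the next backend in round-robin order."""
    backend = next(_rotation)
    return f"request {request_id} -> {backend}"

for i in range(5):
    print(route(i))   # requests spread evenly: .1, .2, .3, .1, .2
```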
High Availability
High availability is a property of computing infrastructure that allows a computer system to
continue functioning even when some of its components fail. This is very important for cloud
customers who cannot tolerate interruption in service, where any downtime can cause damage
or result in financial loss.
High Availability in the cloud is achieved by creating clusters. A high availability cluster is a group
of servers that act as a single server to provide continuous service. These servers have common
access to the same shared storage space for data. So if a server is unavailable, the other servers
pick up the load. A high availability cluster can be anything from two to dozens of servers.
As well as providing failover, high availability clusters also allow load balancing of
workloads, so that no single server within the cluster gets overloaded and performance is
more consistent.
• Failover—a mechanism that can switch automatically from the currently active
component to a redundant component, if monitoring shows a failure of the active
component.
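A minimal sketch of that failover mechanism in Python: a health check probes the active node and promotes the standby when the active one stops answering. The health-check URLs are hypothetical.

```python
import requests

def healthy(url):
    """Health check: does the node answer HTTP 200 within the timeout?"""
    try:
        return requests.get(url, timeout=2).status_code == 200
    except requests.RequestException:
        return False

def pick_active(active="https://a.example.com/health",
                standby="https://b.example.com/health"):
    if healthy(active):
        return active
    if healthy(standby):
        return standby                 # automatic failover to the redundant node
    raise RuntimeError("no healthy node available")
```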
Disaster Recovery
Cloud disaster recovery (cloud DR) is a combination of strategies and services intended to
back up data, applications and other resources to a public cloud or to dedicated service
providers. When disaster occurs, the affected data, applications and other resources can be
restored to the local data centre or to a cloud provider, and the enterprise can resume normal
operation.
Cloud disaster recovery is primarily an infrastructure-as-a-service (IaaS) solution that backs
up designated system data on a remote offsite cloud server. It provides an updated recovery
point objective (RPO) and recovery time objective (RTO) in case of a disaster or system
restore (see the sketch below).
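As a small worked example of the RPO idea, the sketch below treats the time since the last successful backup as the worst-case data loss and checks it against an agreed objective; the timestamps and the four-hour target are hypothetical.

```python
from datetime import datetime, timedelta

def rpo_met(last_backup, rpo=timedelta(hours=4), now=None):
    """True if the worst-case data loss (time since last backup) is within the RPO."""
    now = now or datetime.utcnow()
    return (now - last_backup) <= rpo

print(rpo_met(datetime(2024, 1, 1, 2, 0), now=datetime(2024, 1, 1, 5, 0)))   # True  (3 h)
print(rpo_met(datetime(2024, 1, 1, 2, 0), now=datetime(2024, 1, 1, 12, 0)))  # False (10 h)
```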
Cisco is the only vendor that delivers a complete architecture with advanced services, support,
and industry-leading products. Cisco can help design the optimal end-state data centre
architecture and meet each tactical deployment phase of network evolution with the best
products and services to achieve it.
Benefits
• Lower-priced server and storage infrastructure
• Increased business agility and adaptability
• Ability to meet regulatory compliance standards with integrated network security and
support for business continuance
• Tested and verified design and extensive service offerings for lower implementation costs
and reduced risk
• Investment protection for core data centre platforms offering multiyear deployment
lifecycles
• Rapid application development and time to market of business-critical services
UNIT-5
Virtualisation
Virtualization in cloud computing means creating a virtual platform of the server operating
system and storage devices. This helps the user by providing multiple machines at the same
time; it also allows a physical resource and an application to be shared among multiple users.
Cloud virtualization also manages the workload by transforming traditional computing to make
it more scalable, economical and efficient. One of the important features of virtualization
is that it allows applications to be shared with multiple customers and companies. The various
types of virtualization are:
Network Virtualisation: Network virtualization helps to manage and monitor the entire
computer network as a single administrative entity. Admins can keep a track of various
components of network infrastructure such as routers and switches through a single software-
based administrator’s console. Network virtualization helps the network for transferring data
perfectly, flexibly, reliably and securely. It improves the overall network’s productivity and
efficiency. It becomes easier for administrators to allocate and distribute resources conveniently
and ensure high and stable network performance.
Desktop Virtualisation: Desktop virtualization is when the host server can run virtual machines
using a hypervisor (a software program). A hypervisor can directly be installed on the host
machine or over the operating system (like Windows, Mac, and Linux). Virtualized desktops
don’t use the host system’s hard drive; instead, they run on a remote central server. This type
of virtualization is useful for development and testing teams who need to develop or test
applications on different operating systems.
Local desktop Virtualisation : Local desktop virtualization means the operating system runs on
a client device using local hardware virtualization. This type of desktop virtualization works well
when users do not need a continuous network connection and can meet application computing
requirements with local system resources. However, this technique can be implemented locally
only.
Application Virtualisation: The process of installing an application on a central server that can
virtually be operated on multiple systems is known as application virtualization. For end users,
the virtualized application works exactly like an original application installed on a physical
machine. With application virtualization, it’s easier for organizations to update, maintain, and
fix applications centrally. Admins can control and modify access permissions to the application
without logging in to the user’s desktop. Another benefit of application virtualization is
portability. It allows users to access virtualized applications even on non-Windows devices, such
as iOS or Android.
Server Virtualisation: Server virtualization is a process of partitioning the resources of a single server
into multiple virtual servers. These virtual servers can run as separate machines. Server virtualization
allows businesses to run multiple independent tasks with different configurations using a single (host)
server. The process also saves the hardware cost involved in keeping a host of physical servers.
Block and File Level Storage Virtualisation: Storage virtualization is an array of servers
that are managed by a virtual storage system. The servers aren't aware of exactly where their
data is stored. This technology manages the storage of data from multiple users and presents
it as a single storage system. Storage virtualization software maintains smooth operations and
consistent performance despite changes, breakdowns and differences in the connected
equipment.
Data virtualization: This is a kind of virtualization in which data is collected from various
sources and managed on a single server, without users needing technical information such as
how the data is collected, stored and formatted. The stored data can then be arranged in such
a way that its virtual view can be accessed remotely by interested users through various cloud
services. Many large companies provide data virtualization services, such as Oracle, IBM,
AtScale and CData.
DaaS (Desktop as a Service) delivers cloud-hosted virtual desktops in two forms:
1. Persistent desktop: Users have the ability to customize and save the desktop so that it
will look the same way each time a particular user logs in. A persistent desktop
requires more storage than a non-persistent desktop.
2. Non-persistent desktop: Desktops are wiped each time the user logs out—they are
merely a way to access shared cloud services.
DaaS advantages:
• Easy platform migration
• Total cost reduction
• Minimized complexity
• Disaster recovery
• Uninterrupted connectivity
• Increased performance
• Personalization
• Reliability
• Data security
VMs are made possible through virtualization technology. Virtualization uses software to
simulate virtual hardware, which allows multiple VMs to run on a single machine. The physical
machine is known as the host, while the VMs running on it are called guests.
This process is managed by software known as a hypervisor. The hypervisor is responsible for
managing and provisioning resources like memory and storage from the host to guests.
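A toy Python model of the hypervisor's provisioning role described above; real hypervisors (e.g., KVM, Hyper-V, ESXi) are far more involved, but the resource-bookkeeping idea is the same.

```python
class Hypervisor:
    """Toy model: carves host RAM and vCPUs into guest VMs."""
    def __init__(self, ram_gb, vcpus):
        self.free_ram, self.free_vcpus = ram_gb, vcpus
        self.guests = {}

    def create_vm(self, name, ram_gb, vcpus):
        # refuse to overcommit the host in this simple model
        if ram_gb > self.free_ram or vcpus > self.free_vcpus:
            raise RuntimeError("insufficient host resources")
        self.free_ram -= ram_gb
        self.free_vcpus -= vcpus
        self.guests[name] = {"ram_gb": ram_gb, "vcpus": vcpus}

host = Hypervisor(ram_gb=64, vcpus=16)        # the physical host
host.create_vm("guest-1", ram_gb=8, vcpus=2)  # a guest VM carved from it
print(host.free_ram, host.free_vcpus)         # 56 14
```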
Infrastructure Requirements
In virtualization, the servers and the software applications required by the cloud provider
are maintained by a third party, and the cloud provider pays that third party for this
service.
Virtualisation benefits
• Security: Security is one of the important factors during the process of virtualization.
It can be provided with the help of firewalls, which help to prevent unauthorized access
and keep the data confidential. Moreover, with the help of firewalls and other security
measures, the data can be protected from harmful viruses, malware and other cyber
threats.
• Flexible operations: With the help of a virtual network, the work of IT professionals
becomes more efficient and agile. The network switches implemented today are very easy
to use, flexible and time-saving.
With the help of virtualization in cloud computing, technical problems in physical
systems can be solved. It eliminates the problem of recovering data from crashed or
corrupted devices and hence saves time.
• Economical: Virtualization in cloud computing saves the cost of physical systems such
as hardware and servers. It stores all the data on virtual servers, which are quite
economical. It reduces wastage and decreases electricity bills along with maintenance
costs. Because of this, a business can run multiple operating systems and applications
on a single server.
• Eliminates the risk of system failure: While performing a task, there is a chance that
the system might crash at the wrong moment. Such a failure can cause damage to the
company, but virtualization lets you perform the same task on multiple devices at the
same time.
This is possible because the data is stored in the cloud and can be retrieved at any
time from any device. Moreover, two servers work side by side, which makes the data
accessible at all times; even if one server crashes, the customer can still access the
data through the second server.
• Flexible transfer of data: Data can be transferred to a virtual server and retrieved at
any time. Customers and cloud providers don't have to waste time searching hard drives
for data. With the help of virtualization, it is very easy to locate the required data
and transfer it to the relevant authorities.
Virtual LAN (VLAN)
A virtual local area network (VLAN) is a logical group of workstations, servers and network
devices that appear to be on the same LAN despite their geographical distribution. A VLAN
allows a network of
computers and users to communicate in a simulated environment as if they exist in a single LAN and are
sharing a single broadcast and multicast domain. VLANs are implemented to achieve scalability, security
and ease of network management and can quickly adapt to changes in network requirements and
relocation of workstations and server nodes.
A VLAN allows several networks to work virtually as one LAN. One of the most beneficial elements of a
VLAN is that it removes latency in the network, which saves network resources and increases network
efficiency. In addition, VLANs are created to provide segmentation and assist in issues like security,
network management and scalability. Data Traffic can also easily be controlled by using VLANs.
Disadvantages of VLAN:
• High risk of virus issues, because one infected system may spread a virus through the whole
logical network
• A need for additional routers in very large networks to control the network and workload
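For a concrete feel of VLAN configuration, the hedged sketch below tags a VLAN sub-interface on a Linux host using the standard iproute2 commands via Python; "eth0" and VLAN ID 10 are example values, and the commands require root privileges and 802.1Q support.

```python
import subprocess

# Create a tagged sub-interface eth0.10 carrying VLAN ID 10 on physical port eth0
subprocess.run(["ip", "link", "add", "link", "eth0",
                "name", "eth0.10", "type", "vlan", "id", "10"], check=True)

# Bring the new VLAN interface up
subprocess.run(["ip", "link", "set", "eth0.10", "up"], check=True)
```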
Virtual SAN (VSAN)
A virtual storage area network (VSAN) is a logical partitioning created within a physical storage area
network. This implementation model of a storage virtualization technique divides and allocates some
or an entire storage area network into one or more logical SANs to be used by internal or external IT
services and solutions.
A virtual storage area network (VSAN) is primarily implemented in cloud computing and virtualization
environments. A VSAN allows end users to create a logical storage area network in the physical SAN
(Storage Area Network) through storage virtualization.
A VSAN provides similar services and features as a typical SAN, but because it is virtualized, it allows
for the addition and relocation of subscribers without changing the network's physical layout. It also
provides flexible storage capacity that can be increased or decreased over time.
UNIT-6
Cloud Security
Cloud security refers to the technologies, policies, controls, and services that protect cloud data,
applications, and infrastructure from hackers and threats. Cloud security is essential for many
users who are concerned about the safety of the data they store in the cloud. Data stored
in the cloud is more secure because cloud service providers have superior security measures
and employ dedicated security experts.
• Integrity: Data integrity in the cloud means that the cloud service provider guarantees
that data transmission between the user and the server is secure and unaltered. Integrity
can extend to how data is stored, processed, and retrieved by cloud services and cloud-
based IT resources.
• Risk: Risk is the possibility of loss or harm arising while performing an activity. Risk is
typically measured according to its threat level and the number of possible or known
vulnerabilities.
Auditing: To maintain operational security in the cloud, organizations use two basic methods:
system audits and monitoring. These methods can be employed by the cloud customer, the cloud
provider, or both, depending on the architecture and deployment of the cloud.
• A system audit is a one-time or periodic event to evaluate security.
• An information technology (IT) audit is often divided into two types: internal and
external. Internal auditors typically perform their task inside the organization, whereas
external auditors audit the external network infrastructure.
Accountability: Accountability is the ability to determine the actions and behaviour of a
single individual within a cloud system. Accountability can be fixed on an individual
employee, and the employee's performance can be tracked and judged through it.
Design Principles
The NCSC (National Cyber Security Centre) published some cloud security principles. These
principles are designed to give guidance to cloud service providers in order to protect their
customers.
Data in transit protection: User data which is transitioning between networks should be
protected against any interference.
Asset protection and resilience: User data, and the assets storing or processing it, should be
protected against physical tampering, loss, damage or seizure.
Operational security: In order to prevent and detect attacks, the service must be operated
securely.
Personnel security: Service provider personnel should be thoroughly screened, followed by in-
depth training to reduce the possibility of accidental or malicious compromise.
Supply chain security: The service provider should ensure that their supply chain adheres to all
of the same security principles.
Secure user management: The service provider should ensure that clients have the relevant
tools to securely manage their use of the services.
Identity and authentication: Access to service interfaces should only be granted to specific
individuals and should be guarded by adequate authentication measures, using two-factor
authentication if possible (see the TOTP sketch below).
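The TOTP sketch referenced above: time-based one-time passwords are a common second factor, shown here with the pyotp package; the secret would normally be generated once during enrolment and stored server-side, not in the script.

```python
import pyotp

secret = pyotp.random_base32()   # shared once with the user's authenticator app
totp = pyotp.TOTP(secret)

code = totp.now()                # what the user's app displays right now
print(totp.verify(code))         # True: server-side check of the submitted code
```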
External interface protection: Any external or less trustworthy service interfaces must be
identified and defended appropriately.
Audit information for users: A service provider should supply customers with the audit
records needed to monitor the service and to see who has accessed their data. This is vital,
as it gives you a means to identify inappropriate or malicious activity.
Secure use of the service: You have a responsibility to use the service properly, so that your
data is kept safe and protected.
The requirements for secure cloud software are concerned with non-functional issues such as
minimizing or eliminating vulnerabilities and ensuring that the software will perform as
required, even under attack.
• It must be trustworthy in its own behaviour, and it should be able to withstand outside
attacks.
Figure: the major elements of the software requirements engineering process.
Policy Implementation
Security policies are the foundation of a sound cloud system security implementation.
According to the Data and Analysis Centre for Software (DACS), there are three main objectives
common to all system security policies and to the mechanisms and countermeasures used to
enforce those policies:
• They must allow authorized persons to connect to and access the system while preventing
unauthorized access or connections, especially by unknown or suspicious users.
• They must allow authorized users to read, modify, destroy or delete data while preventing
unauthorized users from doing so.
• They must block the entry of content (user input, executable code, system commands, etc.)
suspected of containing attack patterns or malicious logic that could threaten the system's
ability to operate according to its security policy and its ability to protect the
information.
Implementation issues: Before implementing a security policy, it is very important to consider
the following security issues.
• Access controls
• Data protection
• Confidentiality
• Integrity
• Identification and authentication
• Communication security and Accountability
Data Loss: Data loss is one of the most common security risks of cloud computing. It is also
known as data leakage. Data loss is the process in which data is deleted or corrupted, or
becomes unreadable, by a user, software, or application. In a cloud computing environment,
data loss occurs when sensitive data falls into somebody else's hands, when one or more data
elements cannot be used by the data owner, when a hard disk is not working properly, or when
software is not updated.
Hacked Interfaces and Insecure APIs (Application Program Interface): Cloud computing depends completely on the Internet, so it is essential to protect the interfaces and APIs that are used by external users. APIs are the easiest way to communicate with most cloud services. In cloud computing, some services are available in the public domain and can be accessed by third parties, so there is a risk that these services can be harmed or hijacked by hackers.
An API (application program interface) allows the end user to interact with a cloud provider's service.
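A common defence is to require every API request to carry a signature derived from a shared secret, so that tampered or replayed requests can be rejected. The sketch below shows the idea with HMAC; the header names and secret are invented for illustration (real providers, such as AWS with Signature Version 4, define their own signing formats).

```python
import hashlib
import hmac
import time

SECRET_KEY = b"shared-secret-issued-by-provider"  # hypothetical credential

def sign_request(method: str, path: str, body: bytes) -> dict:
    """Build headers carrying a timestamp and an HMAC over the request."""
    timestamp = str(int(time.time()))
    message = f"{method}\n{path}\n{timestamp}\n".encode() + body
    signature = hmac.new(SECRET_KEY, message, hashlib.sha256).hexdigest()
    # The server recomputes the HMAC with its copy of the secret and rejects
    # requests whose signature mismatches or whose timestamp is stale.
    return {"X-Timestamp": timestamp, "X-Signature": signature}

headers = sign_request("POST", "/v1/storage/objects", b'{"name": "report"}')
```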
Data Breach: A data breach is an incident in which confidential data is viewed, accessed, or stolen by a third party without authorization, exposing the organization's data to hackers.
Vendor lock-in: Vendor lock-in is one of the biggest security risks in cloud computing. Organizations may face problems when transferring their services from one vendor to another. Because different vendors provide different platforms, moving data from one cloud to another can be difficult.
Account hijacking: Account hijacking is a serious security risk in cloud computing. It occurs when an individual user's or organization's cloud account (a bank account, e-mail account, or social media account) is stolen by hackers, who then use the stolen account to perform unauthorized activities.
UNIT – 7
Cloud Computing Security Architecture
Cloud security architecture describes all the hardware and technologies designed to protect data, workloads, and systems within cloud platforms. The security architecture design may vary from company to company based on their requirements. Nowadays several enterprises are adopting hybrid cloud security architecture, an advanced version of cloud security architecture that helps to reduce workload and data exposure.
Architectural Considerations
A variety of factors affect the implementation and performance of cloud security architecture. There are general issues involving regulatory requirements, adherence to standards, security management, information classification, and security awareness. Among the topics that directly influence the architecture are compliance, security management, controls, and security awareness.
Compliance: In a public cloud environment, the provider does not normally inform the clients
about the storage location of their data. In fact, the distribution of processing and data storage
is one of the cloud’s fundamental characteristics. However, the cloud provider should cooperate
to consider the client’s data location requirements. In addition, the cloud vendor should provide
transparency to the client by supplying information about storage used, processing
characteristics, and other relevant account information.
Controls: The objective of cloud security controls is to reduce vulnerabilities to a tolerable level and to minimize the effects of an attack.
Security Awareness: The purpose of computer security awareness, training, and education is to enhance security by making personnel aware of threats and of their responsibilities under the organization's security policy.
Information Classification:
The information that an organization processes must be classified according to its sensitivity to loss or disclosure. The information system owner is responsible for defining the
sensitivity level of the data. Classification according to a defined classification scheme enables
security controls to be properly implemented. The information classification process also
supports disaster recovery planning and business continuity planning.
Classification Criteria
Several criteria may be used to determine the classification of an information object:
• Value — Value is the most commonly used criterion for classifying data in the private sector. If the information is valuable to an organization or its competitors, then it needs to be classified.
• Useful life — If the information has been made obsolete due to new information,
substantial changes in the company, or other reasons, the information can often be
declassified.
Typical steps in implementing a classification scheme include:
1. Identify the appropriate administrator and data custodian. The data custodian is responsible for protecting the information, running backups, and performing data restoration.
5. Specify the termination procedures for declassifying the information or for transferring custody of the information to another entity.
Features of VPN
A virtual private network (VPN) creates an encrypted connection over a public network, letting users send and receive data as if they were on a private network. Its key features include:
• Encryption of IP address: The primary job of a VPN is to hide the IP address from the
ISP and other third parties. This allows you to send and receive information online
without the risk of anyone but you and the VPN provider seeing it.
• Encryption of protocols: A VPN should also prevent you from leaving traces, in the form of internet history, search history, and cookies. The encryption of cookies is especially important because it prevents third parties from gaining access to confidential information such as personal data, financial information, and other content on websites.
• Kill switch: If your VPN connection is suddenly interrupted, your secure connection is interrupted with it. A good VPN can detect this sudden downtime and terminate network activity so that no data leaves the device unprotected.
• Secure encryption: With the help of a VPN, online activities can be hidden even on
public networks.
• Disguising your whereabouts: VPN servers essentially act as your proxies on the internet, so your actual location cannot be determined. In addition, most VPN services do not store logs of your activities.
• Access to regional content: Regional web content is not always accessible from
everywhere. Services and websites often contain content that can only be accessed
from certain parts of the world. Standard connections use local servers in the country
to determine your location. This means that you cannot access content at home while
traveling, and you cannot access international content from home.
• Secure data transfer: If you work remotely, you may need to access important files on
your company’s network. For security reasons, this kind of information requires a secure
connection. To gain access to the network, a VPN connection is often required. VPN
services connect to private servers and use encryption methods to reduce the risk of
data leakage.
Key management
Key management refers to managing cryptographic keys within a cryptosystem. It deals with generating, exchanging, storing, using, and replacing keys as needed at the user level. A key management system also includes key servers, user procedures, and protocols, including cryptographic protocol design. The security of the cryptosystem depends upon successful key management.
The designer of a key management system may not necessarily be a member of the organization that will use the system, and therefore may not have access to that organization's policies. Often the designer will create a set of policies and features that are commonplace in the organization's market, and will then provide documentation explaining how those policies and features are applied within the security policy.
• Physical security – the most visible form of compliance, which may include locked doors
to secure system equipment and surveillance cameras. These safeguards can prevent
unauthorized access to printed copies of key material and computer systems that run
key management software.
• Logical security – protects the organization against the theft or unauthorized access of
information. This is where the use of cryptographic keys comes in by encrypting data,
which is then rendered useless to those who do not have the key to decrypt it.
Managing keys can be a challenge, especially for larger organizations that rely upon cryptography for various applications. The primary problems associated with managing cryptographic keys concern securely generating, exchanging, storing, and replacing keys at scale.
Public key
Public-key cryptography is a class of cryptographic protocols that requires two separate keys: one private (secret) and one public. The pair of keys is used to encrypt and decrypt data to protect it against unauthorized access or use. Network users receive a public and private key pair from certification authorities. If a user wants to encrypt data, they get the intended recipient’s public key from a public directory, use it to encrypt the message, and send the message to the recipient. When the message arrives, the recipient decrypts it using the private key, to which no one else has access.
Public-key cryptography is considered more secure than private-key (symmetric) cryptography for exchanging information because users never need to transmit or reveal their private keys to anyone, which lessens the chance of cyber criminals discovering an individual’s secret key in transit. The sketch below illustrates the encrypt/decrypt flow.
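The minimal sketch below shows this encrypt-with-public-key, decrypt-with-private-key flow using the third-party Python cryptography package (pip install cryptography). The key size and padding follow common recommendations; in practice the key pair would be issued through a certification authority as described above.

```python
from cryptography.hazmat.primitives import hashes
from cryptography.hazmat.primitives.asymmetric import rsa, padding

# Generate a key pair (normally obtained via a certification authority).
private_key = rsa.generate_private_key(public_exponent=65537, key_size=2048)
public_key = private_key.public_key()

oaep = padding.OAEP(mgf=padding.MGF1(algorithm=hashes.SHA256()),
                    algorithm=hashes.SHA256(), label=None)

# Anyone holding the public key can encrypt ...
ciphertext = public_key.encrypt(b"confidential message", oaep)
# ... but only the holder of the private key can decrypt.
assert private_key.decrypt(ciphertext, oaep) == b"confidential message"
```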
Encryption key management is administering the full lifecycle of cryptographic keys: generating, using, storing, archiving, and deleting keys. Protecting the encryption keys includes limiting access to them physically, logically, and through user/role access, and developing and implementing policies, systems, and standards that govern the key management process. One such routine task, key rotation, is sketched below.
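As one hedged example of such a policy in practice, this sketch rotates a symmetric data key with Fernet and MultiFernet from the same cryptography package: tokens encrypted under the old key are re-encrypted under a newly generated key, after which the old key can be retired.

```python
from cryptography.fernet import Fernet, MultiFernet

old_key = Fernet(Fernet.generate_key())
token = old_key.encrypt(b"customer record")

# MultiFernet decrypts with any listed key but encrypts with the first,
# so rotate() migrates old tokens to the new key.
new_key = Fernet(Fernet.generate_key())
keyring = MultiFernet([new_key, old_key])
rotated = keyring.rotate(token)

assert keyring.decrypt(rotated) == b"customer record"
```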
Digital certificates
Digital certificates are used in public key cryptography functions. They are most commonly used
for initializing secure SSL (Secure Sockets Layer) connections between web browsers and web
servers. Digital certificates are also used for sharing keys to be used for public key encryption
and authentication of digital signatures.
SSL (Secure Sockets Layer): Secure Sockets Layer (SSL) is a networking protocol designed for securing connections between web clients and web servers over an insecure network, such as the Internet. (Its modern successor is TLS, Transport Layer Security.)
Digital certificates are used by all major web browsers and web servers to provide assurance
that published content has not been modified by any unauthorized actors, and to share keys for
encrypting and decrypting web content. Digital certificates are also used in other contexts, both
online and offline, for providing cryptographic assurance and privacy of data.
1. SSL/TLS Server Certificate: Used to establish secure SSL/TLS connections between
web browsers and web servers, as described above (a small retrieval example
follows this list).
2. Code Signing Certificate: Code signing certificates are used to sign software or
files that are downloaded over the internet. They are signed by the
developer/publisher of the software, and their purpose is to guarantee that the
software or file is genuine and comes from its publisher.
3. Client Certificate: Client certificates, or digital IDs, are used to identify one user
to another, a user to a machine, or a machine to another machine. One common
example is email, where the sender digitally signs the communication and the
recipient verifies the signature. Client certificates thus authenticate the sender and
the recipient.
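As a small illustration of the first certificate type, Python's standard ssl module can retrieve the certificate that a web server presents during the SSL/TLS handshake; the host name below is only an example.

```python
import ssl

# Fetch the server's certificate in PEM form via a TLS handshake.
pem_cert = ssl.get_server_certificate(("www.google.com", 443))
print(pem_cert[:120])  # beginning of the CA-issued certificate
```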
Memory Cards
Memory cards provide non-volatile storage of information, but they do not have any processing capability. A memory card stores encrypted passwords and other related identifying information. A prepaid telephone calling card and an ATM card are examples of memory cards.
Identification and authentication are the keystones of most access control systems.
Identification is the act of a user professing an identity to a system, usually in the form of a
username or user logon ID. Authentication is the process that verifies the user’s identity,
typically by checking a password at logon. Authentication is based on the following three
factors (a sketch combining the first two follows the list):
• Type 1 — Something you know, such as a password or PIN
• Type 2 — Something you have, such as a smart card or token
• Type 3 — Something you are (physically), such as a fingerprint or retina scan
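The sketch below combines the first two factors: a password (something you know) checked against a stored hash, and a time-based one-time password, or TOTP (something you have, generated by a token or phone app), computed as in RFC 6238. The user-record format and the plain SHA-256 password hash are simplified assumptions; real systems use salted, slow password hashes.

```python
import base64, hashlib, hmac, struct, time

def totp(secret_b32: str, period: int = 30, digits: int = 6) -> str:
    """Derive the current time-based one-time password (RFC 6238)."""
    key = base64.b32decode(secret_b32)
    counter = struct.pack(">Q", int(time.time()) // period)
    digest = hmac.new(key, counter, hashlib.sha1).digest()
    offset = digest[-1] & 0x0F
    code = struct.unpack(">I", digest[offset:offset + 4])[0] & 0x7FFFFFFF
    return str(code % 10 ** digits).zfill(digits)

user = {"pw_hash": hashlib.sha256(b"hunter2").hexdigest(),   # Type 1 factor
        "totp_secret": "JBSWY3DPEHPK3PXP"}                   # Type 2 factor

def authenticate(password: str, code: str) -> bool:
    knows = hashlib.sha256(password.encode()).hexdigest() == user["pw_hash"]
    has = hmac.compare_digest(code, totp(user["totp_secret"]))
    return knows and has  # both factors must pass

print(authenticate("hunter2", totp(user["totp_secret"])))  # True
```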
Controls
Controls are implemented to manage the risk factor. Controls provide accountability for
individuals who are accessing sensitive information in a cloud environment. This accountability
is accomplished through access control mechanisms that require identification and
authentication, and through the audit function. These controls must be in accordance with and
accurately represent the organization’s security policy. Control measures can be administrative,
logical (also called technical), or physical in their implementation.
• Administrative controls include policies, procedures, personnel screening, and
security awareness training.
• Logical or technical controls involve the restriction of access to systems and the
protection of information. Examples of these types of controls are encryption, smart
cards, access control lists, and transmission protocols.
• Physical controls incorporate guards and building security in general, such as the locking
of doors, the securing of server rooms or laptops, the protection of cables, the
separation of duties, and the backing up of files.
Autonomic Systems
The autonomic computing system has the goal of performing self-management to maintain
correct operation despite disturbances to the system. Such a system requires sensor inputs,
decision-making capability, and the ability to carry out remedial actions to maintain normal
operation. Conditions and events that an autonomic system must cope with include:
• Malicious attacks
• Hardware or software faults
• Excessive CPU utilization
• Power failures
• Organizational policies
• Inadvertent operator errors
• Interaction with other systems
• Software updates
UNIT-8
Market Based Management of Clouds
The real potential of cloud computing is that it facilitates the establishment of a market for
trading IT utilities. Market-oriented cloud computing is a virtual marketplace where IT services
can be traded. The main participants and components of such a marketplace include:
• Auctioneer: The auctioneer is in charge of keeping track of the running auctions in the
marketplace and of verifying that auctions for services are conducted properly and that
participants are prevented from performing illegal activities.
• Bank: The bank is the component that takes care of the financial aspects of all the
operations happening in the virtual marketplace.
• Accounting: It is responsible for recording the actual resources used by each user so that the
final cost can be charged to them (a small illustration follows this list).
• VM monitor: It keeps track of the availability of VMs and their resources.
• Dispatcher: The dispatcher starts the execution of accepted requests on allocated VMs.
• Cloud request monitor: It tracks the execution of requests against the SLA (service-level agreement).
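A hypothetical sketch of how the accounting component might compute a user's bill from the usage reported by the VM monitor; the rates and record format are invented for illustration.

```python
RATES = {"vm_hours": 0.05, "gb_storage": 0.02}  # hypothetical price per unit

usage = [  # usage records as the VM monitor might report them
    {"user": "alice", "vm_hours": 40, "gb_storage": 100},
    {"user": "bob",   "vm_hours": 10, "gb_storage": 500},
]

def bill(record: dict) -> float:
    """Sum rate * quantity over every metered resource in the record."""
    return sum(RATES[k] * v for k, v in record.items() if k in RATES)

for r in usage:
    print(f"{r['user']}: ${bill(r):.2f}")  # alice: $4.00, bob: $10.50
```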
Cloud Security is the set of policies and technologies designed to protect data and
infrastructure involved in a cloud computing environment. The top concerns that cloud
security companies address are identity, access management, and data privacy.
• FireEye: In October 2019, FireEye announced its FireEye Cloud Security Solution, which
includes cloud versions of FireEye Network Security, Detection On Demand security
scanning, and the FireEye Helix security operations platform.
• Lacework: Lacework is a cloud workload security and compliance solution that is well
suited for organizations looking for a visual approach to cloud security.
• McAfee: McAfee has a broad set of cloud security capabilities, including CASB, data
loss prevention (DLP) and threat prevention.
• Palo Alto Networks: Palo Alto Networks has one of the most comprehensive cloud
native security platforms in the market, with deep capabilities to help organizations
with workload security.
• Symantec: Symantec has multiple cloud security functions within its portfolio,
including workload protection and CASB.
1. Cloud Exchange
The Cloud Exchange acts as a mediator between the cloud coordinator and the cloud broker: it matches the demands of the cloud broker to the available services offered by the cloud coordinator.
2. Cloud Coordinator
The cloud coordinator assigns the resources of
the cloud to the remote users based on the quality of service they demand and the credits they
have in the cloud bank. The cloud enterprises and their membership are managed by the cloud
controller.
3. Cloud Broker
The cloud broker interacts with the cloud coordinator, analyses the service-level agreements and the resources offered by several cloud providers in the cloud exchange, and finalizes the most suitable deal for its client.
Benefits of Federated Cloud:
1. It minimizes the consumption of energy.
2. It increases reliability.
3. It connects various cloud service providers globally, allowing providers to buy and
sell services on demand.
Cloud federation offers two substantial benefits to cloud providers. First, it allows providers to earn revenue from computing resources that would otherwise be idle or underutilized.
Second, cloud federation enables cloud providers to expand their geographic area and
accommodate more users without establishing a new cloud server.
Advantages:
1. Maintenance and support: If something goes wrong it is the duty of the provider to
ensure the problem is fixed.
2. Security benefit: Many companies feel more secure putting their data in the hands of
an experienced cloud computing provider rather than jumping into the unknown and
trying to manage the security of their important data themselves.
3. Cost advantages: Third party clouds are particularly advantageous for SMBs (Small and
Medium Business) because they do not require huge investments.
Disadvantages:
1. Security worries: You remain ultimately responsible for the security of your own data, even though it is held by a third party.
2. Lack of control: With third-party cloud computing you have minimal control over the cloud and its management.
3. Potential cost drawbacks: If you use a third-party cloud over a long period, say five years or more, it may work out more expensive than owning the infrastructure. A rough comparison is sketched below.
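A rough worked comparison of that trade-off, with all figures invented for illustration:

```python
monthly_cloud_fee = 400.0      # third-party cloud subscription (assumed)
on_prem_capex = 12000.0        # one-time purchase of own servers (assumed)
on_prem_monthly_opex = 150.0   # power, admin, maintenance (assumed)

for years in (1, 3, 5):
    months = years * 12
    cloud = monthly_cloud_fee * months
    on_prem = on_prem_capex + on_prem_monthly_opex * months
    cheaper = "cloud" if cloud < on_prem else "on-premises"
    print(f"{years} yr: cloud ${cloud:,.0f} vs on-prem ${on_prem:,.0f} -> {cheaper}")

# With these numbers the cloud wins early on, but on-premises becomes
# cheaper by year five -- the drawback described above.
```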
Case study
• Google App Engine provides web app developers and enterprises with access to
Google's scalable hosting and tier 1 Internet service.
• Google App Engine provides a scalable runtime based on the Java and Python
programming languages.
• Google App Engine also removes some system administration and development
tasks to make it easier to write scalable applications.
Google App Engine allows you to build web applications on the same stable and scalable
platform that supports Google's own large-scale applications.
1. Google App Engine lets you run applications in Google's data centres.
2. Google App Engine's languages, Java and Python, are easy to understand and
implement.
3. The platform is free to start with; you can purchase additional resources if needed.
4. You can use Google App Engine's services with an ordinary Google account.
5. It is easy to scale up as your data storage and traffic needs grow over time.
6. Users can easily write application code, test it on their own local systems, and
upload it to Google at the click of a button or with a few lines of command script
(a minimal sketch follows this list).
7. Google takes care of all application maintenance, allowing users/developers to focus on
the features of the application.
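A minimal sketch of such an application for the App Engine standard environment is shown below. Flask is a common choice but an assumption here; deployment also needs an app.yaml file declaring the runtime (for example, "runtime: python39") and is done with the gcloud app deploy command.

```python
# main.py -- minimal App Engine (standard environment) web application.
from flask import Flask

app = Flask(__name__)  # App Engine looks for a WSGI app named "app"

@app.route("/")
def hello():
    return "Hello from App Engine!"

if __name__ == "__main__":
    # Local testing, as in point 6 above; in production App Engine's
    # own frontend serves the app instead.
    app.run(host="127.0.0.1", port=8080, debug=True)
```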
UNIT-9
Hadoop
Introduction
Hadoop is an open-source, Java-based software framework sponsored by the Apache Software
Foundation for distributed storage and distributed processing of very large data sets on
computer clusters built from commodity hardware. It provides storage for big data at
reasonable cost.
Hadoop solves two key challenges of traditional databases:
Capacity: Hadoop stores large volumes of data. Using a distributed file system called HDFS
(Hadoop Distributed File System), data is split into chunks and saved across clusters of
servers. Because these servers are built with simple, commodity hardware configurations,
they are economical and easily scalable as the data grows.
Speed: Hadoop stores and retrieves data faster. It uses the MapReduce functional programming
model to perform parallel processing across data sets: when a query is sent to the database,
instead of handling data sequentially, tasks are split and run simultaneously across the
distributed servers, and the outputs of all tasks are finally collated and sent back to the user.
This drastically improves processing speed.
• HDFS is the file system, or storage layer, of Hadoop. It can store and handle very
large amounts of data.
• When a file is too large for a single machine, it must be partitioned; file systems that
manage storage across a network of machines are called distributed file systems.
• Hadoop keeps data safe by replicating it across nodes.
MapReduce: MapReduce is the processing layer of Hadoop. A map function processes input records in parallel across the cluster, and a reduce function aggregates the intermediate results into the final output (the classic word-count example is sketched below).
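The sketch below simulates the model in plain Python with the classic word count; on a real cluster, Hadoop runs the same map and reduce logic in parallel across many nodes (for example via Hadoop Streaming or the Java API).

```python
from collections import defaultdict

def map_phase(line: str):
    for word in line.split():
        yield (word, 1)                 # emit intermediate (key, value) pairs

def reduce_phase(word: str, counts: list) -> tuple:
    return (word, sum(counts))          # aggregate all values for one key

lines = ["big data is big", "hadoop stores big data"]

groups = defaultdict(list)              # shuffle: group values by key
for line in lines:
    for word, one in map_phase(line):
        groups[word].append(one)

results = [reduce_phase(w, c) for w, c in sorted(groups.items())]
print(results)  # [('big', 3), ('data', 2), ('hadoop', 1), ('is', 1), ('stores', 1)]
```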
Characteristics of Hadoop:
1. Hadoop provides reliable shared storage (HDFS) and an analysis system (MapReduce).
2. Hadoop is highly scalable: a cluster can contain thousands of servers.
3. Hadoop works on the principle of write once, read many times.
4. Hadoop is highly flexible and can process structured as well as unstructured data.
Hadoop: It is an open-source software framework used for storing data and running
applications on clusters of commodity hardware. It offers large storage capacity and high
processing power, and it can manage many concurrent tasks. It is used in predictive
analytics, data mining, and machine learning. It can handle both structured and
unstructured data, and it is more flexible in storing, processing, and managing data than a
traditional RDBMS. Unlike traditional systems, Hadoop enables multiple analytical processes
to run on the same data at the same time, and it scales very flexibly.
8. Data schema: The data schema of an RDBMS is static; the data schema of Hadoop is dynamic.
9. Data integrity: An RDBMS provides high data integrity; Hadoop provides lower data integrity than an RDBMS.
10. Cost: An RDBMS incurs licensing costs; Hadoop is free of cost, as it is open-source software.