Tuesday, February 16, 2016

Cloud computing


The practice of using a network of remote servers hosted on the Internet to store, manage, and process data, rather than a local server or a personal computer.

or

In its simplest description, cloud computing is taking services ("cloud services") and moving them outside an organization's firewall onto shared systems. Applications and services are accessed via the Web instead of from your hard drive.


Cloud computing enables companies to consume compute resources as a utility -- just like electricity -- rather than having to build and maintain computing infrastructures in-house.

Cloud computing promises several attractive benefits for businesses and end users. Three of the main benefits of cloud computing include:

Self-service provisioning: End users can spin up computing resources for almost any type of workload on-demand.
Elasticity: Companies can scale up as computing needs increase and then scale down again as demands decrease.
Pay per use: Computing resources are measured at a granular level, allowing users to pay only for the resources and workloads they use.

Cloud computing services can be private, public or hybrid.

Private cloud services are delivered from a business' data center to internal users. This model offers versatility and convenience, while preserving management, control and security. Internal customers may or may not be billed for services through IT chargeback.

In the public cloud model, a third-party provider delivers the cloud service over the Internet. Public cloud services are sold on-demand, typically by the minute or the hour. Customers only pay for the CPU cycles, storage or bandwidth they consume.  Leading public cloud providers include Amazon Web Services (AWS), Microsoft Azure, IBM/SoftLayer and Google Compute Engine.

Hybrid cloud is a combination of public cloud services and on-premises private cloud – with orchestration and automation between the two. Companies can run mission-critical workloads or sensitive applications on the private cloud while using the public cloud for bursty workloads that must scale on-demand. The goal of hybrid cloud is to create a unified, automated, scalable environment which takes advantage of all that a public cloud infrastructure can provide, while still maintaining control over mission-critical data.


Although cloud computing has changed over time, it has always been divided into three broad service categories: infrastructure as a service (IaaS), platform as a service (PaaS) and software as a service (SaaS).

IaaS providers such as AWS supply a virtual server instance and storage, as well as application program interfaces (APIs) that let users migrate workloads to a virtual machine (VM). Users have an allocated storage capacity and start, stop, access and configure the VM and storage as desired. IaaS providers offer small, medium, large, extra-large, and memory- or compute-optimized instances, in addition to customized instances, for various workload needs.
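To make the IaaS idea concrete, here is a minimal sketch of provisioning and stopping a virtual server instance through a provider's API. It assumes the third-party boto3 library for AWS EC2 and valid credentials; the region, machine image ID, and instance type are placeholders chosen for illustration, not recommendations.

```python
# Minimal sketch: spinning up and stopping an IaaS virtual server via an API.
# Assumes the third-party boto3 library and valid AWS credentials; the AMI ID,
# region, and instance type are placeholders.
import boto3

ec2 = boto3.client("ec2", region_name="us-east-1")

# Provision a small virtual server instance on demand (self-service provisioning).
response = ec2.run_instances(
    ImageId="ami-12345678",    # placeholder machine image ID
    InstanceType="t2.micro",   # a small, general-purpose instance size
    MinCount=1,
    MaxCount=1,
)
instance_id = response["Instances"][0]["InstanceId"]
print("Launched instance:", instance_id)

# Stop the instance when it is no longer needed, so billing stops too (pay per use).
ec2.stop_instances(InstanceIds=[instance_id])
```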

In the PaaS model, providers host development tools on their infrastructures. Users access those tools over the Internet using APIs, Web portals or gateway software. PaaS is used for general software development and many PaaS providers will host the software after it's developed. Common PaaS providers include Salesforce.com's Force.com, Amazon Elastic Beanstalk and Google App Engine.
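For a sense of what a PaaS actually hosts, the sketch below is the sort of small web application a developer writes locally and then hands to a platform such as App Engine or Elastic Beanstalk to run and scale. It uses only Python's standard library and illustrates the workflow; it is not a deployment recipe for any particular provider.

```python
# A minimal WSGI web application of the kind a PaaS provider would host and scale.
# Uses only the Python standard library; each platform supplies its own runtime
# configuration (not shown) describing how to launch this app.
from wsgiref.simple_server import make_server

def application(environ, start_response):
    # Answer every request with a short plain-text greeting.
    start_response("200 OK", [("Content-Type", "text/plain")])
    return [b"Hello from a platform-hosted application\n"]

if __name__ == "__main__":
    # Run locally while developing; in production the platform runs it for you.
    with make_server("", 8000, application) as server:
        server.serve_forever()
```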

SaaS is a distribution model that delivers software applications over the Internet; these are often called Web services. Microsoft Office 365 is a SaaS offering for productivity software and email services. Users can access SaaS applications and services from any location using a computer or mobile device that has Internet access. 
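Many SaaS products also expose their functionality as web services that other programs can call over HTTP. The sketch below shows that pattern with Python's standard urllib; the endpoint URL and access token are hypothetical placeholders, not any real provider's API.

```python
# Sketch of a program consuming a SaaS offering exposed as a web service over HTTP.
# The endpoint URL and access token are hypothetical placeholders.
import json
import urllib.request

request = urllib.request.Request(
    "https://saas.example.com/api/v1/documents",           # hypothetical endpoint
    headers={"Authorization": "Bearer PLACEHOLDER_TOKEN"},  # hypothetical credential
)
with urllib.request.urlopen(request) as response:
    documents = json.loads(response.read().decode("utf-8"))
print("Documents available in the SaaS application:", len(documents))
```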



Advantages and disadvantages of cloud computing

The pros of cloud computing are obvious and compelling. If your business is selling books or repairing shoes, why get involved in the nitty gritty of buying and maintaining a complex computer system? If you run an insurance office, do you really want your sales agents wasting time running anti-virus software, upgrading word-processors, or worrying about hard-drive crashes? Do you really want them cluttering your expensive computers with their personal emails, illegally shared MP3 files, and naughty YouTube videos—when you could leave that responsibility to someone else? Cloud computing allows you to buy in only the services you want, when you want them, cutting the upfront capital costs of computers and peripherals. You avoid equipment going out of date and other familiar IT problems like ensuring system security and reliability. You can add extra services (or take them away) at a moment's notice as your business needs change. It's really quick and easy to add new applications or services to your business without waiting weeks or months for the new computer (and its software) to arrive.

Cons

Instant convenience comes at a price. Instead of purchasing computers and software, cloud computing means you buy services, so one-off, upfront capital costs become ongoing operating costs instead. That might work out much more expensive in the long term.
If you're using software as a service (for example, writing a report using an online word processor or sending emails through webmail), you need a reliable, high-speed, broadband Internet connection functioning the whole time you're working. That's something we take for granted in countries such as the United States, but it's much more of an issue in developing countries or rural areas where broadband is unavailable.
If you're buying in services, you can buy only what people are providing, so you may be restricted to off-the-peg solutions rather than ones that precisely meet your needs. Not only that, but you're completely at the mercy of your suppliers if they suddenly decide to stop supporting a product you've come to depend on. (Google, for example, upset many users when it announced in September 2012 that its cloud-based Google Docs would drop support for old but de facto standard Microsoft Office file formats such as .DOC, .XLS, and .PPT, giving a mere one week's notice of the change—although, after public pressure, it later extended the deadline by three months.) Critics charge that cloud computing is a return to the bad old days of mainframes and proprietary systems, where businesses are locked into unsuitable, long-term arrangements with big, inflexible companies. Instead of using "generative" systems (ones that can be added to and extended in exciting ways the developers never envisaged), you're effectively using "dumb terminals" whose uses are severely limited by the supplier. Good for convenience and security, perhaps, but what will you lose in flexibility? And is such a restrained approach good for the future of the Internet as a whole? (To see why it may not be, take a look at Jonathan Zittrain's eloquent book The Future of the Internet—And How to Stop It.)
Think of cloud computing as renting a fully serviced flat instead of buying a home of your own. Clearly there are advantages in terms of convenience, but there are huge restrictions on how you can live and what you can alter. Will it automatically work out better and cheaper for you in the long term?



In summary

Pros

  • Lower upfront costs and reduced infrastructure costs.
  • Easy to grow your applications.
  • Scale up or down at short notice.
  • Only pay for what you use.
  • Everything managed under SLAs.
  • Overall environmental benefit (lower carbon emissions) of many users efficiently sharing large systems.



Cons

  • Higher ongoing operating costs. Could cloud systems work out more expensive?
  • Greater dependency on service providers. Can you get problems resolved quickly, even with SLAs?
  • Risk of being locked into proprietary or vendor-recommended systems? How easily can you migrate to another system or service provider if you need to?
  • What happens if your supplier suddenly decides to stop supporting a product or system you've come to depend on?
  • Potential privacy and security risks of putting valuable data on someone else's system in an unknown location?
  • If lots of people migrate to the cloud, where they're no longer free to develop neat and whizzy new things, what does that imply for the future development of the Internet?
  • Dependency on a reliable Internet connection.


Thursday, February 11, 2016

Email servers


With the click of a mouse button, you can send an email from one point of the globe to another in a matter of seconds. Most of us take this process for granted, giving little thought to how it actually works. It's easy to understand how standard snail mail gets from point A to point B - but how does an email message make its way from a sender to a recipient? The answer to that question revolves around something called a mail server. You can learn more about the role that mail servers play in email delivery by reading on below.

What is a Mail Server?

A mail server is the computerized equivalent of your friendly neighborhood mailman. Every email that is sent passes through a series of mail servers along its way to its intended recipient. Although it may seem like a message is sent instantly - zipping from one PC to another in the blink of an eye - the reality is that a complex series of transfers takes place. Without this series of mail servers, you would only be able to send emails to people whose email address domains matched your own - i.e., you could only send messages from one example.com account to another example.com account.

Types of Mail Servers

Mail servers can be broken down into two main categories: outgoing mail servers and incoming mail servers. Outgoing mail servers are known as SMTP (Simple Mail Transfer Protocol) servers. Incoming mail servers come in two main varieties. POP3 (Post Office Protocol version 3) servers are best known for storing sent and received messages on PCs' local hard drives. IMAP (Internet Message Access Protocol) servers always store copies of messages on the server. Most POP3 servers can store messages on the server too, which is a lot more convenient.
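To show the difference in practice, here is a minimal sketch that checks a mailbox over each incoming protocol using Python's standard poplib and imaplib modules. The host names, account, and password are placeholders for whatever your provider actually uses.

```python
# Illustrative sketch of the two incoming-mail protocols, using the standard
# library. Host names, account name, and password are placeholders.
import poplib
import imaplib

# POP3: the client typically downloads messages for local storage.
pop = poplib.POP3_SSL("pop.example.com", 995)
pop.user("alice@example.com")
pop.pass_("secret")
count, total_bytes = pop.stat()      # number of messages and total mailbox size
print("POP3 mailbox holds", count, "messages")
pop.quit()

# IMAP: messages stay on the server; the client works with server-side folders.
imap = imaplib.IMAP4_SSL("imap.example.com", 993)
imap.login("alice@example.com", "secret")
status, data = imap.select("INBOX")  # open the server-side inbox
print("IMAP inbox holds", data[0].decode(), "messages")
imap.logout()
```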

The Process of Sending an Email

Now that you know the basics about incoming and outgoing mail servers, it will be easier to understand the role that they play in the emailing process. The basic steps of this process are outlined below for your convenience.

Step #1: After composing a message and hitting send, your email client - whether it's Outlook Express or Gmail - connects to your domain's SMTP server. This server can be named many things; a standard example would be smtp.example.com.

Step #2: Your email client communicates with the SMTP server, giving it your email address, the recipient's email address, the message body and any attachments.
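Steps 1 and 2 are easy to see in code. The sketch below hands a message to an SMTP server with Python's standard smtplib and email modules; the server name, port, addresses, and password are placeholders, and real servers may require different authentication settings.

```python
# Sketch of steps 1-2: an email client connects to its domain's SMTP server and
# hands over the sender, recipient, and message body. Host, port, addresses,
# and password below are placeholders.
import smtplib
from email.message import EmailMessage

msg = EmailMessage()
msg["From"] = "sender@example.com"
msg["To"] = "recipient@example.org"
msg["Subject"] = "Test message"
msg.set_content("Hello from an email client talking to an SMTP server.")

with smtplib.SMTP("smtp.example.com", 587) as smtp:
    smtp.starttls()                                   # encrypt before logging in
    smtp.login("sender@example.com", "app-password")  # placeholder credentials
    smtp.send_message(msg)                            # the server handles delivery from here
```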

Step #3: The SMTP server processes the recipient's email address - especially its domain. If the domain name is the same as the sender's, the message is routed directly over to the domain's POP3 or IMAP server - no routing between servers is needed. If the domain is different, though, the SMTP server will have to communicate with the other domain's server.

Step #4: In order to find the recipient's server, the sender's SMTP server has to query DNS, the Domain Name System. DNS takes the recipient's email domain name and translates it into an IP address. The sender's SMTP server cannot route an email properly with a domain name alone; an IP address is a unique number assigned to every computer connected to the Internet, and knowing it lets the outgoing mail server deliver the message to the right place.
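In practice this lookup asks DNS for the domain's MX (mail exchanger) records. A minimal sketch using the third-party dnspython package is shown below; the domain is a placeholder, and older dnspython versions spell resolve() as query().

```python
# Sketch of step 4: ask DNS which servers accept mail for the recipient's domain.
# Requires the third-party dnspython package (pip install dnspython).
import dns.resolver

domain = "example.org"                        # recipient's email domain (placeholder)
answers = dns.resolver.resolve(domain, "MX")  # 'resolve' is 'query' in older versions

# Each answer names a mail exchanger and its preference (lower is tried first).
for record in sorted(answers, key=lambda r: r.preference):
    print(record.preference, record.exchange)
```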

Step #5: Now that the SMTP server has the recipient's IP address, it can connect to its SMTP server. This isn't usually done directly, though; instead, the message is routed along a series of unrelated SMTP servers until it arrives at its destination.

Step #6: The recipient's SMTP server scans the incoming message. If it recognizes the domain and the user name, it forwards the message along to the domain's POP3 or IMAP server. From there, it is placed in a sendmail queue until the recipient's email client allows it to be downloaded. At that point, the message can be read by the recipient.
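From the recipient's side, step 6 ends with the email client downloading the waiting message. Here is a hedged sketch of that download over IMAP with Python's standard imaplib and email modules; the host and credentials are placeholders, and it assumes the inbox is not empty.

```python
# Sketch of step 6 from the recipient's side: the client fetches the message that
# the incoming mail server has been holding. Host and credentials are placeholders.
import email
import imaplib

with imaplib.IMAP4_SSL("imap.example.org", 993) as imap:
    imap.login("recipient@example.org", "app-password")
    imap.select("INBOX")

    # Find the most recently delivered message and download it in full.
    status, ids = imap.search(None, "ALL")
    latest_id = ids[0].split()[-1]            # assumes at least one message exists
    status, parts = imap.fetch(latest_id, "(RFC822)")

    message = email.message_from_bytes(parts[0][1])
    print("From:", message["From"])
    print("Subject:", message["Subject"])
```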

How Email Clients are Handled

Many people use web-based email clients, like Yahoo Mail and Gmail. Those who require a lot more space - especially businesses - often have to invest in their own servers. That means they also need a way of receiving and transmitting emails, which means setting up their own mail servers. To that end, Postfix and Microsoft Exchange are two of the most popular options. Such programs facilitate the preceding process behind the scenes. Those who send and receive messages across those mail servers, of course, generally only see the "send" and "receive" parts of the process.


At the end of the day, a mail server is a computer that helps move files along to their intended destinations. In this case, of course, those files are email messages. As easy as they are to take for granted, it's smart to have a basic grasp of how mail servers work.

Top 10 Server Technology Trends for the New Decade

Mobility and agility are the two key concepts for the new decade of computing innovation. At the epicenter of this new enabled computing trend is cloud computing. Virtualization and its highly scaled big brother, cloud computing, will change our technology-centered lives forever. These technologies will enable us to do more; more communicating, more learning, more global business and more computing with less — less money, less hardware, less data loss and less hassle. During this decade, everything you do in the way of technology will move to the data center, whether it's an on-premises data center or a remote cloud architecture data center thousands of miles away.


Ten trends for the next 10 years. An era of agile computing is upon us. Keep an eye on these 10 server-oriented technology trends.

1.            Mobile Computing
As more workers report to their virtual offices from remote locations, computer manufacturers must supply this new breed of on-the-go worker with sturdier products loaded with the ability to connect to, and use, any available type of Internet connectivity. Mobile users look for lightweight, durable, easy-to-use devices that "just work," with no lengthy or complex configuration and setup. This agility will come from these smart devices' ability to pull data from cloud-based applications. Your applications, your data and even your computing environment (formerly known as the operating system) will live comfortably in the cloud to allow for maximum mobility.

2.            Virtualization
By the end of this decade, virtualization technology will touch every data center in the world. Companies of all sizes will either convert their physical infrastructures to virtual hosts and guests or they'll move to an entirely hosted virtual infrastructure. As more business owners attempt to extend their technology refresh cycle, virtualization's seductive money-saving promise brings new hope to stressed budgets as we collectively pull out of the recession. The global move to virtualization will also put pressure on computer manufacturers to deliver greener hardware for less green.

3.            Cloud Computing
Cloud computing, closely tied to virtualization and mobile computing, is the technology that industry observers view as "marketing hype" or old technology repackaged for contemporary consumption. Beyond the hype and relabeling, savvy technology companies will leverage cloud computing to present their products and services to a global audience at a fraction of the cost of current offerings. Cloud computing also protects online ventures with an "always on" philosophy, guaranteeing their services will never suffer an outage. Entire business infrastructures will migrate to the cloud during this new decade, making every company a globally accessible one.

4.            Web-based Applications
Heavy, locally installed applications will cease to exist by the end of the decade. This move will occur ahead of the move to virtual desktops. The future of client/server computing is server-based applications and thin clients. Everything, including the client software, will remain on a remote server. Your client device (e.g., cell phone, computer, ebook reader) will call applications to itself much like the X Terminals of yesteryear.

5.            Libraries
By the end of this decade, printed material will all but disappear in favor of its digital counterpart. Digitization of printed material will be the swan song for libraries, as all but the most valuable printed manuscripts will head to the world's recycling bins. Libraries, as we know them, will cease operation and likely reopen as book museums where schoolchildren will see how we used physical books back in the old days.

6.            Open Source Migration
Why suffer under the weight of license fees when you can reclaim those lost dollars with a move to open source software? Companies that can't afford to throw away money on licensing fees will move to open source software including Linux, Apache, Tomcat, PostgreSQL and MariaDB. This decade will prove that the open source model works, and the proprietary software model does not.

7.            Virtual Desktops
Virtual Desktop Infrastructure (VDI) has everyone's attention these days and will continue to hold it for the next few years as businesses move away from local desktop operating systems to virtual ones housed in data centers. This concept ties into mobile computing, virtualization and cloud computing. Desktops will likely reside in all three locations (PC, data center, cloud) for a few more years, but the transition will approach 100 percent non-local by the end of the decade. Moving away from localized desktop computing will result in lowering maintenance bills and alleviating much of the user error associated with desktop operating systems.

8.            Internet Everywhere
You've heard of the Internet, haven't you? Do you remember when it was known as The Information Superhighway and all of the discussions and predictions about how it would change our lives forever? The future is here and the predictions came true. The next step in the evolution of the Internet is to have it available everywhere: supermarket, service station, restaurant, bar, mall and automobile. Internet access will exist everywhere by the end of this new decade. Every piece of electronic gadgetry (yes, even your toaster) will have some sort of Internet connectivity due in part to the move to IPv6.

9.            Online Storage
Currently, online storage is still a geek thing with limited appeal. So many of us have portable USB hard drives, flash drives and DVD burners that online storage is more of a luxury than a necessity. However, the approaching mobile computing tsunami will require you to have access to your data on any device with which you're working. Even the most portable storage device will prove unwieldy for the user who needs her data without fumbling with an external hard drive and USB cable. Much like cell phones and monthly minutes plans, new devices will come bundled with an allotment of online storage space.

10.          Telephony
As dependence on cell phones increases, manufacturers will create new phones that will make the iPod look like a stone tool. They won't resemble current phones in appearance or function. You'll have one device that replaces your phone, your computer, your GPS and your ebook reader. Yet another paradigm shift brought about by the magic of cloud computing. Telephony, as we know it, will fall away into the cloud as Communication as a Service (CaaS). Moving communications to the data center with services such as Skype and other VoIP offerings is a current reality, and large-scale migrations will soon follow.


