The challenge of managing IT environments lies in staying abreast of their many facets. No matter your role, some aspect requires monitoring, from the manager overseeing operational performance to the service desk agent keeping a handle on open problem tickets. In most cases, the items requiring monitoring are neither few nor convenient.
Most companies invest large portions of their IT budget in creating reports and monitoring interfaces for their systems. However, many of these efforts proceed independently of each other and are rarely considered as a collective. As a result, the people managing the environment often have to use several distinct tools to obtain a full picture.
Mashups can provide an effective answer to this challenge. Using web technologies to access the business logic that controls and supports the IT environment, a presentation interface can be created. Do this for every system in the environment and you create a “catalog” of available monitoring tools.
Mashups, however, offer greater capabilities still. By identifying the tools required, an individual can create a single interface that allows them to view several systems simultaneously from a web browser. Compiling a single interface for multiple perspectives is the core ability of a mashup.
But it doesn’t end there. Some business functions may need additional logic when presenting information: for instance, applying financial performance to the number of problem tickets resolved to identify the true cost of services. Unfortunately, that information is often found in separate systems. Mashup components can be created to apply business logic to such situations and make the result available for monitoring.
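A minimal sketch of such a mashup component, assuming two hypothetical back-end systems: a ticketing tool reporting resolved problem tickets per service, and a finance system reporting operating cost per service. The dictionaries stand in for responses that a real mashup would fetch over each system's web API; the service names and figures are invented for illustration.

```python
def cost_per_resolved_ticket(tickets, finance):
    """Join ticket counts with operating cost to get cost per resolved ticket, per service."""
    report = {}
    for service, resolved in tickets.items():
        cost = finance.get(service, 0.0)          # missing finance data counts as zero cost
        report[service] = cost / resolved if resolved else None
    return report

# Stubbed responses; in a real mashup these would come from HTTP calls
# to the ticketing and finance systems.
tickets = {"email": 40, "storage": 25}
finance = {"email": 12000.0, "storage": 10000.0}

print(cost_per_resolved_ticket(tickets, finance))
# {'email': 300.0, 'storage': 400.0}
```

The presentation layer of the mashup would then render this combined view alongside the raw feeds from each system.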
For viewing multiple perspectives of the environment, mashups prove an effective management tool. The ability to customize allows managers to build their own “monitoring tool,” and gives the business an inexpensive way to empower its managers.
Many companies are looking for ways to manage their IT environments, and the fastest-growing approach is ITIL®. The Information Technology Infrastructure Library (ITIL®) is a series of books covering each major topic of IT management. It serves as a framework of concepts and policies considered “best practices” for managing the infrastructure, development, and operations of information technology. Companies implementing ITIL® in their processes can have their employees certified and their business accredited. Still, why adopt ITIL®?
The reasons for ITIL® are numerous, but one in particular is often overlooked: the adoption of a common language. Especially for IT businesses working with other IT businesses, using similar frameworks allows for common terminology, concepts, and processes. Two organizations using tools based on ITIL® practices have the greatest opportunity to merge systems during alliances and partnerships. For instance, many IT organizations have a single problem management process for resolving disruptions in service. Within the ITIL® framework, two processes actually exist: incident management and problem management. The first handles the disruption if a resolution is readily available, or applies a workaround if it is not. Problem management handles finding a resolution when one is not available. Two companies attempting to merge IT operations, as in the case of acquisitions, partnerships, or service support agreements, have a step up if they are using the same framework.
Some relationships with other companies require ITIL® compliance, particularly in Europe, among government contractors, and in the financial industry, where ITIL® is growing strongly. The growth of the framework has encouraged its recognition as an IT management standard.
Virtualization is a technique used in network environments to provide an abstract rendering of the physical attributes of a system. Within this abstraction, the capabilities of the system can be expanded, manipulated, and dissected without impacting or changing the physical environment. For instance, with virtualization, several storage devices can be rendered as a single device to be utilized by the entire network, allowing greater utilization of the network’s total storage capacity. The technique also works in the other direction: a rendered volume can be partitioned on the fly without any impact to the physical device or the data found on it. The total volume can be increased without adding any more physical devices. An advantage of virtualization is that changes like these can be made instantly within the rendering.
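The idea can be sketched with a simplified model, not a real storage API: several physical devices are pooled into one logical volume, the volume can be partitioned on the fly, and the pool can grow without touching existing partitions. All class and method names below are illustrative.

```python
class VirtualVolume:
    """Toy model of storage virtualization: one logical volume over many devices."""

    def __init__(self, device_sizes_gb):
        self.devices = list(device_sizes_gb)   # physical layer, never modified by partitioning
        self.partitions = {}                   # logical layer, carved out on the fly

    @property
    def capacity(self):
        return sum(self.devices)

    @property
    def free(self):
        return self.capacity - sum(self.partitions.values())

    def add_device(self, size_gb):
        # Growing the pool requires no change to existing partitions.
        self.devices.append(size_gb)

    def partition(self, name, size_gb):
        # Carve out logical space without touching the physical devices.
        if size_gb > self.free:
            raise ValueError("not enough free capacity")
        self.partitions[name] = size_gb

vol = VirtualVolume([500, 500, 250])   # three devices rendered as one volume
vol.partition("projects", 800)         # spans physical devices transparently
vol.add_device(1000)                   # total capacity grows instantly
print(vol.capacity, vol.free)          # 2250 1450
```

Note that the 800 GB partition is larger than any single device; the abstraction hides the physical boundaries entirely.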
Though this technique is used to manage the technical aspects of an IT organization, virtualization has applications elsewhere, particularly in process management. Within any organization, a number of processes exist, along with the people and tools used to execute them. Many of these tools are managed separately by different process teams within different systems. The people executing their portion of a process may not even be part of the functional team responsible for it. The resulting segregation across processes can lead to miscommunication and missteps in process flow. Virtualization can end this segregation.
By creating an abstract rendering of the processes, virtualization can provide a basis for controlling, monitoring, and improving them. Technically, the ability to virtualize is as simple as extracting information from multiple systems and creating relationships in that information at a layer of business logic just above the process systems. Virtualization then allows work to move through the processes and their corresponding systems automatically and simultaneously when required. Through the virtualization layer, the functionality of the systems can be utilized and expanded to allow for richer, more robust process capabilities. Additionally, performance impacts between processes can be readily explored, encouraging improvements in the interactions between processes as well as within them.
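One way to picture that virtualization layer is as an abstract record that links the same piece of work across separate process systems, so a single state change propagates to each system at once. This is a sketch under assumptions: the system names, the work item identifier, and the update hooks are all hypothetical stand-ins for calls to each process system's API.

```python
class ProcessView:
    """Toy process-virtualization layer sitting above separate process systems."""

    def __init__(self):
        self.systems = {}            # system name -> callable that updates that system

    def register(self, name, update_fn):
        self.systems[name] = update_fn

    def transition(self, work_id, new_state):
        # Move the work item through every linked system simultaneously.
        return {name: fn(work_id, new_state) for name, fn in self.systems.items()}

# Stub update functions standing in for each system's API; the logs let us
# observe that both systems received the same state change.
incident_log, problem_log = [], []
view = ProcessView()
view.register("incident", lambda wid, state: incident_log.append((wid, state)) or state)
view.register("problem",  lambda wid, state: problem_log.append((wid, state)) or state)

view.transition("WO-1042", "resolved")   # hypothetical work item id
print(incident_log, problem_log)
# [('WO-1042', 'resolved')] [('WO-1042', 'resolved')]
```

The layer itself holds no process data; it only relates identifiers and pushes transitions, which is what keeps the underlying systems unchanged.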
With greater focus on process control in the business sector, virtualization can provide a holistic perspective on the strengths and weaknesses of the operations.
Managing a business can be a difficult undertaking. Daily issues and operational controls provide plenty to tie up a full day’s schedule. For this reason, the idea of complying with some external regulation or standard can seem an unwelcome intrusion, especially when a pending audit is announced. The scramble to ensure that everything is in place pulls focus away from maintaining operations.
In truth, compliance with a standard is not an intrusion but an opportunity. Whether referring to ISO standards, process frameworks like ITIL®, quality management programs like Six Sigma, or governmental regulations for industries, compliance is still an opportunity. How? Because these statutes provide a tool with which to rate your organization against other organizations and against past performance. As a tool, they highlight points of concern on which to focus business improvements. Using standards and regulations as a tool for improvement brings greater benefits to your operations.
Compliance also encourages business. Customers are becoming savvier in looking for products and services. Discerning customers realize that compliant companies are proficient in receiving and delivering on requirements. In some instances, customers are only willing to work with companies compliant with specific regulations and standards, to protect their own interests.
The opportunity of being compliant can also be seen in building teams. Compliance cannot be achieved by one person, nor in a short period of time. It requires a team performing as a team at all times within the guidelines set down. So the next time an audit is announced, remember the opportunity: for improvement, for increased business, for building teamwork.
With the increased focus on IT Service Management, many companies are striving both to know more about their IT services and to prove that those services are well managed. The best way to do both is to conform to a standard recognized by the majority of IT professionals. Currently, that is ISO/IEC 20000, the first worldwide standard for IT Service Management. The standard is aligned with the process approach defined by the Office of Government Commerce (OGC) in the United Kingdom, and is complementary to ITIL®.
The standard consists of two parts. The first is the formal Specification of the processes that make up the framework of the standard. The Specification defines the requirements for delivering managed services of an acceptable quality. The scope of the standard includes:
- Requirements for a management system
- Planning and implementing service management
- Planning and implementing new or changed services
- Service delivery processes
- Relationship processes
- Resolution processes
- Control processes
- Release processes
The second part of the standard, the Code of Practice, describes best practices for Service Management.
Certification for ISO/IEC 20000 was created and is managed by itSMF. Any organization can claim to be compliant with the standard; however, certification requires verification of compliance by an independent assessor. Formal certification is granted when all requirements of the standard have been met. To find an itSMF Registered Certification Body (RCB), visit the website www.isoiec20000certification.com.
Over the years, any number of solutions have been used to reduce the cost of IT services in an organization, or even to remove responsibility for such services from normal business operations. A traditional in-house infrastructure can consume a large portion of an organization’s operational budget. A popular choice became outsourcing IT to a service provider: the infrastructure was still owned or leased by the organization, but the maintenance and services it required were managed by the provider.
The newest trend is a utility-based service, Platform-as-a-Service (PaaS). Ideal for organizations developing and offering web applications, a PaaS offers an environment where applications can be developed, tested, deployed, and maintained. To support the effort, service providers typically supply an IDE and integration with web services and databases.
These platforms are designed on a multi-tenant architecture, with which the provider can share resources among multiple organizations. For the customer, this sharing brings reduced costs, faster development cycles, and transfer of infrastructure responsibility. As a utility-based service, the organization pays only for what it uses, typically based on storage, bandwidth, and/or users. The service model is a great option for start-up companies or organizations looking to offset their current IT solutions.
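The pay-for-what-you-use model can be sketched as a simple metered-billing calculation. The rates and usage figures below are invented for illustration; real PaaS pricing varies by provider.

```python
# Hypothetical per-unit rates for the three metered dimensions mentioned above.
RATES = {"storage_gb": 0.25, "bandwidth_gb": 0.50, "users": 2.00}

def monthly_charge(usage):
    """Sum metered usage against per-unit rates; unmetered items cost nothing."""
    return sum(RATES.get(item, 0.0) * amount for item, amount in usage.items())

usage = {"storage_gb": 200, "bandwidth_gb": 100, "users": 15}
print(monthly_charge(usage))
# 200*0.25 + 100*0.50 + 15*2.00 = 130.0
```

Under this model the bill tracks actual consumption, which is what makes PaaS attractive to a start-up with little usage today but room to grow.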