Mainframe Migration: Challenges of the Process

Introduction

Mainframes have long been the most popular form of data storage. This is largely because, compared to other systems, mainframes have proven to be secure and remain stable even under the harshest conditions. They can also handle large volumes of data without suffering the strain that other systems face. The major drawback that has seen most organizations migrate from mainframes to open systems is the hefty maintenance cost, which presents itself in heavy labor demands and expensive software upgrades. Institutions and organizations with mainframe systems that attempt the crossover to newer systems will inevitably meet many challenges, especially regarding how to transfer the large volumes of data accumulated over time.

This essay shall provide an analysis of the challenges that the process of mainframe migration brings. The various problems will be explained with the support of cross-references from secondary sources, which will be used to give substance to the main ideas. The paper will also employ the interpretive paradigm to illustrate the impact of these challenges on the process. In addition to explaining the challenges, the essay will provide a brief outline of the steps involved in the migration. This will mainly serve to illustrate the strategic complications that management has to work through when planning the shift. Finally, a review of the discussion and a summary of the key points will be given alongside the conclusion.

Background

Problem Scene

For a company to remain competitive, it must ensure that its systems are constantly upgraded to withstand the strain occasioned by the passage of time. Mainframe systems have by far been the most stable of all technological systems, particularly in the field of information technology. However, even they can only last so long. Business enterprises have had to conform to changes in technology and have therefore been forced to retire mainframes which, though expensive to maintain, have served them well for many years.

The major advantage of mainframe migration, and the one that has attracted the attention of most businesses, is the reduction in operating costs in the long run. For instance, when the Shared Data Center set out to implement its two-year migration from mainframe to open source starting in 2006, it had projected a 30-70% reduction in operating costs.

The problems with migrating off mainframes to open systems present themselves in the prohibitive costs involved in the transfer, alongside the issue of maintaining data integrity during the shift. The switch from mainframe (legacy) systems to open systems such as Oracle is not something that can be achieved at the click of a button. It is an extensive process calling for the collaboration of various professionals over extended periods of time. In the larger systems, it may take up to fifteen years to complete the switch, requiring the enterprise to retain full-time staff to work on the system throughout. Even when the switch has been completed, the mainframe may have to be kept partially running in case some cross-referencing has to be done.

The maintenance of data integrity is by far the most challenging of all the issues that come with migration, because of the problem of software compatibility. As a consequence, the migration needs a great many technicians to analyze the system track by track and ensure that the transferred data is in perfect condition. A survey conducted by Softek in 2005 on the impact of migration on 700 end users revealed that cost overruns, usually presenting in the form of hidden costs, weighed heavily on the enterprises. The results also found that 83 per cent of the companies studied experienced some problems with the migration.

Problem

Word has long circulated in professional circles supporting the notion that migration from mainframe systems to open systems translates directly into savings in a company's expenditure. What has been ignored is the fact that the costs involved in the transfer could literally bring the enterprise down.

Delimitation

The research will be delimited to cover both the technical and financial problems of migration. To a great extent, the process will be informed by pre-existing literature on the topic, though a field study will be carried out to clearly show the extent of the problem.

Literature Review

Reasons for migration off of mainframes

Several reasons have been advanced to support the decision to migrate from mainframes to open systems. First and foremost is the merging of companies that have been operating as different entities under one major organization, or the coming together of various small companies to form one large company (Peter & Nemes 1995). Consolidation can also come about when one company buys a smaller company and acquires the right to have the acquired company operate under its roof. For instance, in 2009 Microsoft bought the software companies Opalis Software and Sentillion. Whenever mergers and consolidations occur, the major problem facing such companies is how to integrate the communication systems of the two organizations quickly and easily so that a harmonious system is developed. This is where the aspect of timing comes in. For many institutions that use mainframe systems, transactions are constantly happening and there is no way the systems can be shut down to accommodate the restructuring process once a merger has taken place. This makes installed mainframes a huge liability as far as upgrading of the systems is concerned.

The second reason that most companies give for migrating is the aspect of time (Jesus et al. 1999). Information technology is one of the most rapidly evolving fields of this century. The features that sold a computer two years ago are considered irrelevant today. As newer advancements are made in the field, older systems are abandoned just as rapidly, and the number of professionals conversant with these older technologies is dwindling. The few who are available are generally inclined to overcharge for their services, which is understandable given the rules of demand and supply. When it comes to maintenance, the person who originally designed the system is generally required to be around to offer guidance whenever restructuring is needed. In his or her absence, the original documentation, such as the system plan, can be used. Unfortunately, for one reason or another, the designer may no longer be with the company when restructuring takes place, and sometimes the original plans cannot be traced. Hiring individuals to analyze the system and come up with a new plan for the restructuring demands a huge amount of money, with figures bordering on the obscene. Like most other computer-dependent systems, mainframe software is under constant attack from bugs and viruses. These can end up destroying the software and sometimes crashing the hardware as well. Requests for replacement software are usually accompanied by the need for a large sum of money, because software is developed on a daily basis, either as upgrades or as alternatives to the original software.

The third factor that has made most companies shift from mainframes to open systems is security (Peter & Nemes 1995). Mainframe servers tend to be insecure, largely because the servers do not come with added security features; additional security measures have to be taken through the installation of appropriate hardware and its corresponding software. Much as mainframes can operate for eons without constant maintenance, they are still machines and at some point succumb to the pressures of age and degradation. If proper monitoring and regular replacement of both hardware and software are not implemented, the organization's management can be caught off guard with the system crashing right before their eyes.

Another major challenge to mainframe security is complacency (Paul 1994). When mainframes were introduced into the market, banks and other large institutions which relied on the constant movement of data from one center to another took to them with zeal. The importance of the mainframe at that time could not be overstated, and the institutions which used them undertook proper security measures to keep the systems secure. In most cases, the machines were encased in huge glass rooms and only selected professionals were allowed access. At that time the major security issue was people physically getting to the system and sabotaging it. With the passage of time and the advent of the internet, the glass housings for the mainframes have become irrelevant, since individuals or companies intent on bringing down their competitors can easily hire master hackers to penetrate the opponent's mainframe system without having to physically invade the location. Aside from this, in-house sabotage, whether intentional or accidental, can occur. Back in the day, individuals were specifically trained to handle mainframes (Alon et al. 2005). These people were involved in the installation and maintenance of the systems for particular organizations, and only they could rectify mistakes that occurred in the systems. With the cross-generational changes happening at the moment, individuals going into the field of information technology are no longer interested in studying the maintenance of mainframes. Most younger individuals would rather keep up with trends and study the computers of the day. As a result, when the 'mainframe professionals' retire, they have no one to pass the baton to. Institutions are forced, for lack of options, to hire people who are not well trained to handle the systems. These inexperienced professionals generally learn the job in training, and it is in this period that they can make disastrous mistakes which can lead to the destruction of the entire system. The most common of these accidental mistakes include the changing of codes and the creation of big 'loopholes' in the systems which could easily let in unauthorized access, and hence malicious damage.

With the temptation of money, individuals can do almost anything. Mainframe technicians are not superhuman and can easily fall to this temptation and commit in-house sabotage of their firms' mainframe systems on behalf of competitors. In most cases, this kind of sabotage is responsible for the majority of security breaches on mainframes. The situation is worse if the traitor cannot be identified, to the extent that the company may even be required to invest in new systems and new staff (Jeffrey 1997).

Challenges of mainframe migration

Business challenges

Institutions that have mainframes installed have generally had them for decades; the decision to migrate to other systems therefore involves a lot of business considerations (Peter 1996). First, the new system must be able to handle the bulk of data that the mainframe was carrying and pick up the processing of information where the mainframe left off. All the information and data that was in the mainframe has to be synchronized with the information in the new system (Daniel 1997). The major problem in the early stages of migration from mainframes is that the shift is usually carried out one application at a time. Inasmuch as applications in mainframe systems tend to be stand-alone, this is done so that risks are kept to a bare minimum (Erik 2004). Unfortunately, most organizations with these systems have up to hundreds of such applications to transfer, which makes the process time-intensive. Organizations usually have difficulty setting the protocol for identifying which applications are of absolute importance, and hence in devising a strategy that ensures the migration makes business sense while also reducing the risks to the company's operations (Michael & Michael 1993).

Once a given application has been transferred, the next challenge, at least business-wise, comes in deciding which types of programs have to be moved as well. Business analysts have studied patterns and concluded that the process of migrating off mainframes is usually made even more costly by the introduction of new processes into the migration plan. This has been found to cost the company even more than what is paid for the technology installation (John, Dennis & Nelson 1999). A company's business processes generally illustrate how its systems operate. As a result, it is advisable not to change, move or introduce new processes during the migration. Most of the processes hosted by the mainframe will have to be moved in their original condition, and those that need changes will require only very minor adjustments in order to be compatible with the new system (Surajit & Gerhard 2000).

When making decisions regarding the migration process, companies usually treat the migration as a chance to analyze the business strategies they have used in the past and to adjust them to suit the current times, as well as ensure that the company's systems will survive well into the future (Peter et al. 1996). This is because of the metamorphic nature of technology: the technological support which used to be important to the company in the past is usually not very relevant a few decades down the line (Cheong & Cyril 1992). In this way, the migration from mainframes gives the institution the chance to seal loopholes in its business operations, especially technology-wise.

In modern times, information is regarded as the mainstay of any business operation. Therefore, during the migration, it is of utmost importance that the transfer of data be implemented in a way that causes the least possible damage to it (Rateb, James & Patrick 1994). The new system should handle the data as stably as the mainframe from which the shift is being made.

Technical challenges

The major technical challenges that come with migration from mainframes usually present in the form of the huge volumes of software and business information, as well as in the complications of systems occasioned by evolutionary changes that have taken place over time (Andreas, Andreas & Klaus 1997). In mainframe systems, a majority of applications organize information in ways that are not relational, including maintaining many record types in single files. The major challenges in this regard come in the following ways (Jesus et al. 1999).

First is the incompatibility of structures and systems (Matteo, Stefano & Juris 2004). Because mainframe systems are typically non-relational in their operation, it becomes very challenging, and sometimes impossible, to extract sets of data from the system and convert them to suit relational system functionality (Peter & Nemes 1995). The real problems in this scenario lie in the differences between the architectural structures of the two systems. Converting data that has accumulated for decades in a non-relational system is a major challenge because it requires extensive knowledge of the inner workings of relational systems alongside that of legacy processes. A huge amount of manpower is required for the migration process (Jeffrey 1997). Luckily, even with the compatibility challenges, it is usually possible to reconcile the differences and have a new system up and running. In this case the company will need a setup which obtains information from both the new and the old system and streamlines it so that it flows seamlessly within the enterprise (Kent et al. 2001).
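
By way of illustration, the following Python sketch shows the kind of extraction work this implies: a legacy fixed-width file mixing master and detail record types in one physical file is split into named, relational-style rows. The record layouts, field names and sample data are all assumed for illustration rather than drawn from any real system.

```python
# A minimal sketch, assuming invented record layouts: "01" marks a master
# record and "02" a detail record; each line is sliced into named fields.

LAYOUTS = {
    "01": [("account_id", 0, 10), ("holder_name", 10, 40)],
    "02": [("account_id", 0, 10), ("txn_date", 10, 18), ("amount", 18, 30)],
}

def parse_legacy_file(lines):
    """Yield (record_type, fields) pairs from raw fixed-width lines."""
    for line in lines:
        rtype = line[:2]                      # leading type code
        layout = LAYOUTS.get(rtype)
        if layout is None:
            continue                          # unknown type: flag for analysis
        body = line[2:]
        yield rtype, {name: body[lo:hi].strip() for name, lo, hi in layout}

sample = [
    "010000012345JANE DOE",
    "02000001234520240101     1500.00",
]
for rtype, fields in parse_legacy_file(sample):
    print(rtype, fields)
```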

Referential integrity between a non-relational system and a relational system usually poses a serious challenge during the migration process (Alon et al. 2005). The institution has to make sure that all cross-generation relations, such as the parent-child relations within the mainframe files, are adequately mapped out. A method of extracting non-relational data from the mainframe without damaging it, and putting it in a graphical format, should also be designed (Maurizio 2002). This comes in handy as it offers the technicians the chance to isolate the relational links in the mainframe and consequently maintain them in the new relational system. As a result, the migration process is simplified and information can be transferred across the systems without causing damage (Paul 1994).

The migration process

In the process of migration from a mainframe to an open source system, it is important to have a keying schema (Chikofsky & Cross 1990). This comes in handy when the time comes to map out links in the mainframe and relate them to the keys in the new system in order to enhance performance. The organization of data on the new system should make the same logical sense it made on the mainframe. The challenge especially comes when trying to isolate the records that the mainframe hosted in a single file and present them individually in the new system.
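
A minimal sketch of such a keying schema, with every field name assumed: surrogate primary keys are generated for parent records and propagated to their children as foreign keys, so that the parent-child links from the mainframe file survive as relational references.

```python
# Sketch under assumed names: records arrive as dicts keyed by the legacy
# "account_id"; new surrogate keys replace it as the join column.
from itertools import count

def build_keyed_tables(parents, children):
    surrogate = count(start=1)
    key_map = {}                      # legacy account_id -> new surrogate key
    parent_rows, child_rows = [], []
    for p in parents:
        pk = next(surrogate)
        key_map[p["account_id"]] = pk
        parent_rows.append({"pk": pk, **p})
    for c in children:
        fk = key_map.get(c["account_id"])
        if fk is None:
            raise ValueError(f"orphan child record: {c!r}")  # broken link
        child_rows.append({"fk": fk, **c})
    return parent_rows, child_rows
```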

Migration approaches

Because of the technicalities involved in mainframe migration, business enterprises need to come up with plans that effectively address all of the issues involved (Daniel 1997). There are numerous approaches that can be used to implement a successful migration off the mainframe. In this section, some of the most commonly used approaches are analyzed. It should be noted, however, that these strategies can be modified as necessary to suit the situation at hand, since each organization has its own specific needs and requirements (Erik 2004).

One-time offload

This strategy is commonly implemented to help try out the environment to which the system is migrating. Information is transferred from the mainframe all in one go, and the process is started and completed at a time when business is slow in the company, such as over weekends or very early in the morning (Cheong & Cyril 1992). Careful planning is required to implement this strategy successfully, because the full load of data has to be shifted within a very small window of time. This approach is used for trying out the migration strategy; once all tests have been confirmed, the entire databank can be transferred from the mainframe.
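
The following sketch illustrates the shape of such a one-time offload, assuming hypothetical extract_all() and bulk_load() helpers for the source and target systems; the job aborts if it overruns the agreed low-traffic window.

```python
# A minimal sketch: the whole databank moves in one pass, batched for
# efficiency, within a fixed maintenance window.
import time

MAX_WINDOW_SECONDS = 4 * 3600            # e.g. an early-morning slot

def one_time_offload(extract_all, bulk_load, batch_size=10_000):
    started = time.monotonic()
    batch = []
    for record in extract_all():          # full read of the source data
        batch.append(record)
        if len(batch) >= batch_size:
            bulk_load(batch)
            batch = []
        if time.monotonic() - started > MAX_WINDOW_SECONDS:
            raise RuntimeError("window exceeded: abort and reschedule")
    if batch:
        bulk_load(batch)                  # flush the final partial batch
```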

Incremental Offload

In this strategy, information is moved from the mainframe to the new open source system in small packages. Once the trial transfer of data has been completed, changes that were carried out in the mainframe environment are analyzed and then shifted depending on the period in which they were made. For instance, some data is moved according to its weekly change classification, while other data belongs to the category of monthly changes (Paul 1994). Selective isolation of changes is then carried out after extraction from the mainframe.
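
A hedged sketch of the idea, with invented fetch_changes() and load() helpers: each run moves only the records changed since the previous checkpoint, so weekly and monthly change classes can be shipped on their own schedules.

```python
# Incremental offload driven by a change timestamp; helpers are assumed.
from datetime import datetime, timezone

def incremental_offload(fetch_changes, load, checkpoint):
    """checkpoint: UTC datetime of the last successful run."""
    run_started = datetime.now(timezone.utc)
    for batch in fetch_changes(since=checkpoint):   # extracted change sets
        load(batch)
    return run_started          # persist this as the next run's checkpoint
```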

Replication and Synchronization

In this strategy, the two systems, that is, the mainframe and the open system (such as Oracle) to which the migration is taking place, are run concurrently, with one system holding the authoritative data while the other maintains a copy (Patrick & Klaus 2004). This is a very complex arrangement because the approach is designed to support both bi-directional and incremental data offload. In a majority of cases, the two systems will have to run alongside each other for a long time, sometimes even for decades, before the mainframe is ready to shut down (Maurizio 2002). Business decisions will have to be made in good time before putting this replication system into practice; this helps identify how the master-slave relationship between the mainframe and the open system will work. If proper preparations are not made, errors could get into the system and interfere with the data on either side.
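
The following is an illustrative sketch of such a replication loop, with all helper functions hypothetical: the mainframe remains the master, and entries from its change log are applied to the replica in sequence order.

```python
# Simple master-to-replica replication loop (helpers assumed).
import time

def replicate(read_change_log, apply_to_replica, poll_seconds=60):
    position = 0                                 # last applied sequence number
    while True:
        for entry in read_change_log(after=position):
            apply_to_replica(entry)              # apply should be idempotent
            position = entry["sequence"]
        time.sleep(poll_seconds)                 # wait before the next poll
```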

Physical Federation

This strategy involves a number of sources which must be effectively analyzed, with selective pieces of data identified and joined in order to come out with one functional data set in the new open source system (Van der Hoeven, Brian & Remco 2007). The original information is still stored in the storage systems of the mainframe, but the data in the open source system, say Oracle, is regarded as the functional set of information. This strategy is very common when the new system replacing the mainframe does not have all the functional potential of the mainframe. The approach may also be used when some aspects of the mainframe are simply too complex to be replaced in a short time. This strategy enables the migration of the bits of information from the mainframe which are vital to the implementation of the migration strategy (Muira 2007). The mainframe systems are maintained, but the bits of functionality that have been crossed over to the new system also enable the institution to keep up with the demands of the day.
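
As a rough sketch of physical federation, assuming invented reader functions: selective fields from two mainframe sources are joined into one functional record set, which the new system stores as the working copy while the originals remain on the mainframe.

```python
# Physical federation sketch: join selective data from two sources and
# persist the merged result in the new system (store() is assumed).
def federate(accounts_reader, balances_reader, store):
    balances = {b["account_id"]: b["balance"] for b in balances_reader()}
    for acct in accounts_reader():
        store({
            "account_id": acct["account_id"],
            "holder_name": acct["holder_name"],
            "balance": balances.get(acct["account_id"], 0),
        })
```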

Virtual Federation (Enterprise Information Integration)

This strategy is in more ways than one similar to the physical federation approach (Jeffrey 1997). The difference, however, is that instead of loading all the information into the open source system, the data received from mainframe storage is combined virtually, so that it comes out as one usable stream that can be consumed to ensure the proper running of operations in the enterprise.
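
A minimal sketch of the virtual variant, again with assumed reader functions: the combined record is assembled on demand at query time and is never materialized in the new system's storage.

```python
# Virtual federation: merge source records per query instead of loading.
def virtual_customer_view(read_mainframe, read_open_system, customer_id):
    legacy = read_mainframe(customer_id)      # e.g. historical fields
    current = read_open_system(customer_id)   # e.g. live fields
    return {**legacy, **current}              # current values win on overlap
```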

Transactions on Mainframe

In this strategy, the new open source system to which the migration is taking place becomes the key store for the records required in the implementation of business processes (Alon 2001). However, some of the important business functionality is still retained in the mainframe. Processes requiring handling by the new open source system are processed on it, and when completed the feedback is relayed to the mainframe, from which the appropriate response is sent back to the monitor (Alon et al. 2005). This information is usually packaged in batches and may include such aspects as the control of customer information.

Transactions on the new system

In this strategy, the functionality of the mainframe is gradually transferred to the new open source system (Kent et al. 2001). The mainframe, however, still maintains the vital role of record storage, especially for the data that is critical to the implementation of business strategies. In this approach, migration is not entirely complete for a few years, and more often than not reference will have to be made to the mainframe regarding some of the business processes (Charles 1992).

Issues associated with mainframe migration

There are six main challenges which come with the migration of data from mainframe systems to open systems. These are detailed below.

Analyzing data from the mainframe

In a majority of cases there is a very limited understanding of the data and its sources. The information required for the migration is usually scattered among very many systems, and in many cases it is in the wrong format and of a quality below the acceptable standard (Fred 2002). In a number of cases, data critical to the implementation of strategies is either very poorly stored, with missing documentation, or completely missing. Identifying information at its source in mainframes is very challenging because mainframes usually store applications that have been developed in-house by the company over a long period of time (Peter 1996). This essentially means that during the migration the technicians have to go through thousands of applications and sort them out according to their level of importance.

Accessing data from the mainframe

An average business enterprise will usually depend on more than 50 vital business applications, and institutions that make more than $1 billion per year use up to 500 systems. However, irrespective of the number of systems from which information has to be transferred, the major question is always about the strategy that will be used to accomplish this (Peter & Nemes 1995). Companies need to identify how data will be obtained from the mainframe before it is transferred to the new system. It is not really possible to access data from the mainframe and just move it directly to the open source system, because the volume of source data is usually quite high. This is also challenging because the quality of the data has to be maintained. In addition, the format of the data in the mainframe should match that in the new system (Matteo, Stefano & Juris 2004).

The issue of data quality in the mainframe storage

The technicians involved in the migration of data from mainframe systems to new systems must understand that, more often than not, there are instances of corrupt information hosted in the mainframe (Surajit & Gerhard 2000). The quality of the data will usually depend on how it was initially fed into the system; if adequate measures are not taken, the data may end up being compromised. In addition, data that is of vital importance in the mainframe may not necessarily be of importance in the new open source system. The technicians will therefore have to be extremely careful to ensure that they do not transfer corrupt information to the new system, and to take care not to damage the information that is left in the mainframe.
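
The following sketch illustrates this kind of pre-load screening; the field names and validation rules are invented. Records failing any check are quarantined for review rather than transferred silently.

```python
# Pre-load data quality screen with assumed fields and rules.
def validate(record):
    errors = []
    if not record.get("account_id", "").strip():
        errors.append("missing account_id")
    try:
        float(record.get("amount", ""))
    except ValueError:
        errors.append("non-numeric amount")
    return errors

def screen(records):
    clean, quarantined = [], []
    for record in records:
        errs = validate(record)
        if errs:
            quarantined.append((record, errs))   # hold back for review
        else:
            clean.append(record)                 # safe to transfer
    return clean, quarantined
```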

Preparation and loading of data into the new system

When migrating from a mainframe to open source systems, the new system is usually being developed as the migration takes place (Chikofsky & Cross 1990). During this phase of the migration, some requirements need to be modified and sometimes completely changed (John, Dennis & Nelson 1999). The data being transferred from the mainframe has to undergo rigorous tests and analysis to ensure that it is compatible with the target system.

Maintaining and sustaining the migration cycle

Migration off the mainframe can never be a one-time effort. The mainframe systems have to be run for a while even after the new open source systems have been launched (Michael & Michael 1993), and the data in the mainframe and the new system has to be kept synchronized during this period. In addition, regulations require enterprises to be able to substantiate the migration long after it is complete. The mainframe system can therefore not simply be shut down at random.
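
One way to keep the two systems honest during this period is a periodic reconciliation check. The sketch below is assumption-heavy (the row-reading helpers are hypothetical) and compares order-independent digests of the two copies; a mismatch means the synchronization pipeline needs attention before the mainframe can retire.

```python
# Hash-based reconciliation of two data copies (helpers assumed).
import hashlib

def fingerprint(rows):
    digest = hashlib.sha256()
    for row in sorted(repr(r) for r in rows):   # sort for order independence
        digest.update(row.encode())
    return digest.hexdigest()

def reconciled(read_mainframe_rows, read_new_system_rows):
    return fingerprint(read_mainframe_rows()) == fingerprint(read_new_system_rows())
```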

Factoring in behavioral changes of staff

Since mainframe systems are usually in operation for decades, changes, especially technical ones, cannot be made without prior planning (Jesus et al. 1999). These changes affect the individuals who work in the institution. As a result, the migration strategy has to factor in the adjustments that employees and other individuals who interact with the system will have to make. This is where the synchronization of both the mainframe and the new system comes into play, as it allows individuals time to adapt to the new system. In addition, the importance of proper staff training and of providing information on the operation of the new systems cannot be overemphasized.

Migration Methodologies

There are four main stages that govern the migration from the mainframe to open source systems (Andreas, Andreas & Klaus 1997). These are:

  1. Analysis of the data in the mainframe
  2. Transformation of the data to suit the new system
  3. Cleaning up the data before the transfer into new system
  4. The actual transfer of data into the open source system

The major issue with the four-step method of migration listed above is that it does not address the unpredictable nature of data transfers (Erik 2004). There is also the issue of insufficient technology for the migration process: in most cases, the technology required will largely comprise general-purpose components which can be modified to fit each of the aforementioned stages (Rateb, James & Patrick 1994). The most recommended strategy for properly implementing a migration proceeds in a cyclical manner, rotating through the stages listed above. This gives the technicians the opportunity to study the information contained in the mainframe and analyze it adequately. Afterwards, the information can be extracted from the mainframe, cleaned up and validated before it is actually transferred to the open source system (Peter 1996). The process then repeats itself until there is satisfaction that the migration is done.
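
The cyclical strategy can be summarized in code. In the sketch below all stage functions are assumed rather than real, and load() is taken to be idempotent so that a repeated cycle cannot double-insert.

```python
# Cyclical migration: each slice of data repeats the analyze/cleanse/
# transform/load cycle until validation passes for that slice.
def migrate_cyclically(slices, analyze, cleanse, transform, load, validated):
    for data_slice in slices:
        while True:
            profile = analyze(data_slice)           # study the source data
            cleaned = cleanse(data_slice, profile)  # fix quality issues
            prepared = transform(cleaned)           # fit the target schema
            load(prepared)                          # the actual transfer
            if validated(data_slice):
                break                               # move to the next slice
```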

Summary of Literature Review

Migration off mainframes is an extremely intensive and demanding process. It should not be regarded as a one-time event, because the challenges that come with it cannot be underestimated. The six most common issues that arise with mainframe data migration are (Daniel 1997):

  1. Analyzing data from the mainframe
  2. Accessing data from the mainframe
  3. The issue of data quality in the mainframe storage
  4. Preparation and loading of data into the new system
  5. Maintaining and sustaining the migration cycle
  6. Factoring in behavioral changes of the staff

This section has detailed the above six issues, aside from illustrating the major challenges that come with migration. It has been shown that the challenges arising from the decision to migrate from a mainframe to newer systems are numerous and require extensive managerial decisions to address. Data from secondary sources has been used to substantiate the discussion as well as to offer solutions for circumventing the challenges noted. The literature review has revealed that data migration is indeed a very costly and time-intensive affair, largely because mainframes are rapidly heading toward extinction, so access to replacement parts, software and technical support is gradually becoming unavailable. The section has also detailed the processes involved in migration off mainframes.

Methodology

Methodology Statement

The research will be based on secondary data collection. Data will be extracted from various journals, articles and books. The criteria for selecting the literature will be relevance to the research topic as well as year of publication. Public, private and online libraries will be visited in order to access the data. This research will be partly evidence-based and partly founded on work by professionals in the field. Various articles will be studied to provide background information, which will give credibility to the final essay. Mainframe migration, being a modern-day reality, cannot be effectively analyzed without obtaining information from real case scenarios. Institutions that have made the shift from mainframe to open source will be studied in order to compare the types of installations that have gained prominence in recent times. This will make for some interesting research, and much as most of the information will only be used for reference purposes, it will effectively come round to form the backbone of the paper.

Information from the books will serve to explain the internal machinations of mainframe systems. This will be crucial information that will make the research report appeal to both professionals and the general public. For the latter, some of the information obtained from the books and other publications may need to be broken down into simple language, with illustrations drawn from the commonly applied functionalities of mainframe and open source systems.

Empirical data will be collected from recent studies, with numbers and figures used to show the costs and economic impact of mainframe migration in a particular institution and how it could serve as a guide for other organizations that would like to make the switch. As with any other professional field of study, computer technology research has to be conducted in a way that offers credibility to the practitioner. In such a scientific field, the strength lies in the figures, particularly the numbers obtained from real-life scenarios, used to support collected evidence. With this knowledge in mind, effort will be made to obtain information relevant to the particular topic in question, accompanied by proper citation.

Reasons for Selecting the Above Methodology

For any professional topic, chances are that extensive research has already been carried out by professionals in the field. Consequently, in order to establish the backbone of a given research project, it is necessary that an extensive review of literature be carried out before seeking first-hand information from the field. The latter, i.e. information collected from the field, is also necessary since it helps give professional credibility to the project. Combining results from both sources fosters a symbiotic relationship, with one offering background information and the other presenting up-to-date information on the topic.

Research Process Plan

The first step in conducting the research will be an extensive review of literature from various secondary sources. Information on the topic of mainframe migration and integration processes will be collected from books, journals, magazines, conference proceedings and websites.

The second step in the process is the collection of data directly from the field. This will to a large extent depend on various forms of interviews, including one-to-one interviews as well as the use of questionnaires. The people expected to be targeted for this part of the process include information technology specialists, financial analysts, and administrators of enterprises which have shifted, or intend to shift, from mainframe to open source systems.

The above two steps will make it easy to come up with survey questions which will guide the third step of the process. In this stage, an analysis of the data obtained shall be carried out, and the negative issues raised regarding the process of mainframe migration shall be picked out and compared with the positive ones. In this phase, an evaluation of the financial burden that mainframe migration places on a company will also be carried out.

Conclusion

The outcome expected from this research process is a set of results that would make professionals and organizations involved in mainframe migration re-evaluate the benefits of the procedure against its demerits. The research and subsequent paper will show the various loopholes which, if sealed, can make migration more affordable, as well as the steps that can be taken to make mainframes last longer with minimal operating costs.

Research Time Plan

STEP 1: Literature review (1 month)
STEP 2: Interviews
  a) Developing questions (1 to 1½ months)
  b) Conducting the interviews (2 weeks)
  Analysis of the interviews (2 months)
STEP 3: Survey
  a) Developing questions (1 month)
  b) Conducting the survey (2 months)
  Analysis of the survey (2 months)
Final report (2 months)

Reference List

Alon, Y.H., et al. 2005. Enterprise information integration: successes, challenges and controversies. New York, Association for Computing Machinery.

Alon, Y.H., 2001. ‘Answering queries using views: A survey’, The VLDB Journal, pp. 270–294.

Andreas, B., Andreas, G. & Klaus R.D., 1997. On the migration of relational schemas and data to object-oriented database systems. Proceedings of the 5th International Conference on Re-Technologies in Information Systems, Washington DC, IEEE Computer Society.

Charles, J.P., 1992. Enterprise Integration Modeling: Proceedings of the First International Conference. Massachusetts, MIT Press.

Cheong, Y. & Cyril, S.K., 1992. Data Migration. In: Systems, Man and Cybernetics, IEEE International Conference. Washington DC, Institute of Electrical and Electronics Engineers.

Chikofsky, E.J. & Cross, J.H., 1990. Reverse Engineering and Design Recovery: A Taxonomy. IEEE Software. Washington DC, IEEE Computer Society.

Daniel, A., 1997. A Process Model for Re-Engineering, Migration and Multi-Use of Business Data. In: Proceedings of the 1st Euromicro Working Conference on Software Maintenance and Reengineering (CSMR ’97). Washington DC, IEEE Computer Society Press.

Erik, P.B., 2004. Database Migration: A Literature Review and Case Study. Diploma Thesis. School of Information and Library Science, University of North Carolina.

Fred, A.C., 2002. Enterprise Integration: An Architecture for Enterprise Application and Systems Integration. New Jersey, John Wiley & Sons.

Jeffrey, D.U., 1997. “Information Integration Using Logical Views”. ICDT, pp. 19–40.

Jesus, B., Deirdre, L., Bing, W. & Jane G., 1999. Legacy Information System Migration: A Brief Review of Problems, Solutions and Research Issues. Dublin, Trinity College.

John, B., Dennis, S. & Nelson, W., 1999. DoD Legacy System Migration Guidelines. Technical Report, SEI/CMU. Pennsylvania, Carnegie Mellon University, Software Engineering Institute.

Kent, S., Gail, C., Raymond, B. & Aditya, S., 2001. Enterprise Integration. New Jersey, John Wiley & Sons.

Matteo, G., Stefano, R. & Juris, C., 2004. Beyond Data Warehousing: What’s Next in Business Intelligence? In: DOLAP ’04. New York, Association for Computing Machinery.

Maurizio, L., 2002. “Data Integration: A Theoretical Perspective”. PODS 2002, pp. 233–246.

Michael, L.B. & Michael, S., 1993. DARWIN: On the Incremental Migration of Legacy Information Systems. Technical Report. Berkeley, University of California, Berkeley.

Muira, G., 2007. “Pushing the Boundaries of Traditional Heritage Policy: maintaining long-term access to multimedia content.” IFLA Journal, Vol. 33, pp. 323–326.

Patrick, Z. & Klaus, R.D., 2004. “Three Decades of Data Integration – All Problems Solved?” WCC 2004, pp. 3–12.

Paul, J., 1994. A Method for Transforming Relational Schemas into Conceptual Schemas. In: Data Engineering, 1994. Proceedings of the 10th International Conference. Washington DC, IEEE Computer Society Press.

Peter, B. & Nemes, L., 1995. Modelling and Methodologies for Enterprise Integration: Proceedings of the IFIP TC5 Working Conference on Models and Methodologies for Enterprise Integration, Queensland, Australia, November 1995. London, Chapman & Hall.

Peter, B. et al., 1996. Architectures for Enterprise Integration. United Kingdom, Springer.

Peter, H.A., 1996. Data Reverse Engineering: Slaying the Legacy Dragon. New York, McGraw-Hill.

Rateb, A.H., James, C. & Patrick, M., 1994. Schema Translation Using Structural Transformation. In: Proceedings of the 1994 Conference of the Centre for Advanced Studies on Collaborative Research. Ontario, Centre for Advanced Studies.

Surajit, C. & Gerhard, W., 2000. Rethinking Database System Architecture: Towards a Self-tuning RISC-style Database System. In: Proceedings of the 26th International Conference on Very Large Databases. Massachusetts, Morgan Kaufmann.

Van der Hoeven, J., Brian, L. & Remco, V., 2007. “Emulation for Digital Preservation in Practice: The Results.” The International Journal of Digital Curation, Vol. 2, pp. 123–132.