ITAMS also offers complete software licensing services relating to Oracle, though we never have and never will supply Oracle software. We offer similar services for a wide range of other Independent Software Vendors (ISVs) too.
In audit and contract renewal situations, Oracle regularly requires the deployment of their ‘LMS Review Lite scripts’ for Oracle Database servers.
These scripts must be deployed on each Oracle Database server and produce a number of separate files containing reference information on the database instances, their underlying virtual/physical CPU infrastructure and the database option packs deployed.
The files are effectively not human-readable without deep knowledge of Oracle licensing and the scripts themselves. By default, they are deposited on the local machine, outside the Oracle security perimeter.
Collectively, the script outputs contain all the information needed to assess licensing demand for Oracle Database products.
There are several substantial challenges in using these scripts, though, quite aside from the change-control and security approvals needed to get them deployed in the first place.
1) You may not want to send the files directly to Oracle, as this gives them complete control over the interpretation of your demand position.
Yet you have to send them, as this is a requirement of the audit.
2) Considerable licensing and technical experience is required to interpret the files manually.
3) A large amount of analyst labour is required to interpret them, especially for larger estates.
4) It takes longer to interpret them than the time Oracle allows in an audit situation.
5) You have no ability to optimise your position prior to submission.
Finally, there is an easy solution to these problems.
ITAMS has developed in-house technology to process any volume of LMS script outputs into a human-readable demand statement, quickly and cost-effectively (a rough sketch of such a roll-up follows the list below).
We maintain this technology to enable the interpretation of all versions of the Review Lite scripts. The solution is low cost, highly secure and completely confidential.
You can use this capability as a one-off service, or extend it to run regular reports from new datasets on an ongoing basis.
This means that you will know in advance what information Oracle has to work with in preparing their audit position.
If you execute Oracle LMS Scripts on an ongoing basis as part of the service, you can:
a) use Oracle’s own audit process again and again to monitor your demand position, without Oracle involvement, in the lead-up to contract renewals;
b) react quickly with highly accurate data in future audit situations;
c) spot and remediate deployment errors where option packs have been left on or where servers have been deployed on inappropriate infrastructure;
d) produce trending information about your Oracle estate and predict future demand.
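As a very rough sketch of the kind of roll-up involved in producing such a demand statement (the directory layout, file names, column headings and counting rules below are hypothetical placeholders, not the actual Review Lite output format or Oracle’s licensing metrics), the consolidation step might look something like this:

```python
import csv
from collections import defaultdict
from pathlib import Path

def build_demand_statement(output_root: Path) -> dict:
    """Roll per-server script outputs up into a simple demand summary.

    Assumes a hypothetical layout: one sub-directory per database server,
    each holding 'hardware.csv' (core counts) and 'options.csv' (option
    pack usage). The real Review Lite output format differs; these names
    and columns are illustrative only.
    """
    demand = defaultdict(lambda: {"cores": 0, "options": set()})
    for server_dir in output_root.iterdir():
        if not server_dir.is_dir():
            continue
        server = server_dir.name

        hw_file = server_dir / "hardware.csv"
        if hw_file.exists():
            with hw_file.open(newline="") as f:
                for row in csv.DictReader(f):
                    demand[server]["cores"] += int(row.get("PHYSICAL_CORES") or 0)

        opt_file = server_dir / "options.csv"
        if opt_file.exists():
            with opt_file.open(newline="") as f:
                for row in csv.DictReader(f):
                    if (row.get("IN_USE") or "").upper() == "TRUE":
                        demand[server]["options"].add(row["OPTION_NAME"])
    return demand

if __name__ == "__main__":
    for server, info in sorted(build_demand_statement(Path("script_outputs")).items()):
        opts = ", ".join(sorted(info["options"])) or "none detected"
        print(f"{server}: {info['cores']} cores; options in use: {opts}")
```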
In our three-part whitepaper series on ‘IT Asset Data Quality’, we highlighted the issues that inaccurate IT asset data can cause across multiple areas of the organisation, such as outsourcer billing optimisation, CMDB and ITSM controls, licence management optimisation, security and more.
For organisations to have complete confidence in their IT asset data we recommended creating a ‘Golden Record’.
A customer of ours recently asked for a brief explanation of what a Golden Record is and how it is created. This is how our consultant explained it:
A Golden Record is a single, well-defined version of specific data held within a company about a device or asset. It is also known as the ‘single version of the truth’: it is where you go to ensure you have the correct version of a piece of information.
A Golden Record can be created by reconciling several sources of data that may each have a different ‘view’ of the asset. The aim is to have a set of records that the company knows are accurate and can rely on. The Golden Record can be held in a separate repository, but it will not be the master: the master data will always sit within the source in which it was created, and the Golden Record will be updated each time a reconciliation is carried out.
The data taken from the sources will need to be optimised first, to give the best possible chance of matching the records, and a repository will need to be created to accept all the prepared data. Data must then be extracted from the sources within the same time period, to reduce the risk of time affecting the field values.
The analysis involved in identifying which assets are the same and creating the Golden Record is very time-consuming. The best matches come from identifying the key fields. Even once a match has been made, the remaining fields must be analysed to see which values should go through to the Golden Record and which are incorrect.
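As a minimal sketch of the matching and merging step described above (the source names, key field and precedence rules are illustrative assumptions, not a prescribed method), the reconciliation might look something like this:

```python
from collections import defaultdict

# Hypothetical extracts from three sources, keyed on a serial number.
# In a real estate the sources, key fields and attributes will differ.
sources = {
    "cmdb":      [{"serial": "ABC123", "hostname": "srv01", "status": "retired"}],
    "discovery": [{"serial": "ABC123", "hostname": "SRV01", "status": "live",
                   "last_seen": "2018-06-01"}],
    "ad":        [{"serial": "ABC123", "hostname": "srv01.corp.local"}],
}

# Assumed precedence: which source is trusted most for each field.
FIELD_PRECEDENCE = {
    "hostname":  ["discovery", "ad", "cmdb"],
    "status":    ["discovery", "cmdb"],
    "last_seen": ["discovery"],
}

def build_golden_records(sources):
    """Group records on the key field, then take each attribute from the
    most trusted source that actually supplies a value."""
    by_key = defaultdict(dict)          # serial -> {source name: record}
    for name, records in sources.items():
        for rec in records:
            by_key[rec["serial"]][name] = rec

    golden = {}
    for serial, per_source in by_key.items():
        merged = {"serial": serial}
        for field, trusted_order in FIELD_PRECEDENCE.items():
            for src in trusted_order:
                value = per_source.get(src, {}).get(field)
                if value:
                    merged[field] = value
                    break
        golden[serial] = merged
    return golden

print(build_golden_records(sources))
```

In practice, records that cannot be matched on the key field, and fields where trusted sources disagree, would typically be flagged for manual analysis rather than merged automatically.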
Once the analysis is complete, you can investigate where the inaccuracies lie and what needs to be done to correct them. The processes that populate each source will need to be reviewed and adjusted if necessary. For historically incorrect data, manual work may be required to correct it, and this may also involve changing roles and responsibilities in some teams to ensure the work gets done.
Improving data quality should be an ongoing process. The initial reconciliation will generate a great deal of work and raise many questions, but it will also deliver the largest immediate benefits. Repeating these activities will show trends over time and demonstrate that the work being done is producing improvements. New, or enhanced, processes will ensure the data does not deteriorate.
Where an estate is outsourced, the service provider will be creating bills based on their view of the client’s assets. Either they will own and manage their own tools and databases for this, or they will use those belonging to the client. Either way, it is necessary to be able to validate the information. If an asset is not correctly billed for, it could be that it is not registered and therefore not supported, or that it is marked as decommissioned and still being billed for. Both scenarios are equally bad.
The IT asset data used for billing will also support a multitude of other functions within the business, such as refresh, service desk, procurement and software asset management/audits. It is therefore vital that there is complete confidence that the data upon which the billing is based is true and accurate. Using data reconciliation techniques across multiple sources will ultimately lead to improved data integrity, processes and confidence, further supporting the business and saving time and money.
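As a minimal sketch of that billing validation (the statuses, field names and the idea of keying both datasets on a serial number are assumptions for illustration):

```python
# Hypothetical reconciled asset records and an outsourcer billing extract,
# both keyed on serial number. Statuses and field names are illustrative.
assets = {
    "ABC123": {"status": "live"},
    "DEF456": {"status": "decommissioned"},
    "GHI789": {"status": "live"},
}
billed_serials = {"ABC123", "DEF456"}   # serials appearing on the bill

# Live assets missing from the bill: possibly unregistered, so unsupported.
live_not_billed = [s for s, a in assets.items()
                   if a["status"] == "live" and s not in billed_serials]

# Billed assets that the reconciled data says are decommissioned.
billed_decommissioned = [s for s in billed_serials
                         if assets.get(s, {}).get("status") == "decommissioned"]

print("Live but not billed:", live_not_billed)
print("Billed but decommissioned:", billed_decommissioned)
```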
Case Study
A London-based company was taking back ownership of the server and end-user computing assets from their existing IT outsourcers. To date, there had been little ongoing validation of the service billing figures or of the physical estate numbers in general. The discovery and operational tools already deployed (e.g. AD, AV, CMDB) had also been under the management of the outsourcer, so there had been limited day-to-day involvement with the asset figures.
An initial consultative assessment was proposed to determine the scope of the transfer and to discuss its impact on the company in terms of the required refresh, software licensing, service design, roles and responsibilities, and how the future mode of operation would work. In this case, the billing figures would have been used to understand the estate and plan the above activities. However, a lack of confidence in these figures, and in the data held within the sources, meant that further action was required. This work would have been equally important had the outsourcer continued billing with no transfer under way. It is also worth noting that a change in outsourcer agreements will often attract greater attention from software publishers, compounding the issues already being faced.
The sources needed to validate billing were identified and used to populate a "Data Hub", and a targeted reconciliation was carried out. Further analysis identified where data was incorrect and in which sources the processes needed to change, both to remediate and update existing information and to put actions in place so the data did not deteriorate; support was provided in this area. Benefits realised included the correction of asset statuses, the removal of duplicate and invalid assets, and the population of missing fields. This supported more accurate billing and the transfer of asset ownership. The client had more confidence in the figures and would be able to predict and manage the refresh more accurately. The transfer of ownership is now in the final stage of completion, and the client is actively considering an ongoing data quality process to perform regular checkpoints and maintain a high degree of data accuracy.
IT Configuration Management is a process for establishing and maintaining information about the performance, functional and physical attributes of hardware assets, e.g. servers and desktops.
This information is commonly stored in a central database tool, providing support to other processes such as the service desk, refresh, transformation, break-fix, billing and patching. The data is updated and maintained in a number of ways, and if it becomes inaccurate, the validity of the processes that use it is reduced.
Understanding whether your data is complete and accurate can be very time-consuming, as the most effective way is to compare several data sources and establish the ‘true’ picture of any particular asset. Using this information correctly will lead to improvements in the processes that feed and maintain the data, which in turn has a positive effect on the other processes that depend on it.
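One illustrative way to use that comparison (the sources, fields and the use of the reconciled record as the reference point are assumptions) is to count how often each source disagrees with the reconciled view, which points at the feeding processes most in need of attention:

```python
from collections import Counter

# Hypothetical reconciled ('golden') view and two source extracts.
golden = {
    "ABC123": {"hostname": "srv01", "status": "live"},
    "DEF456": {"hostname": "srv02", "status": "decommissioned"},
}
sources = {
    "cmdb": {
        "ABC123": {"hostname": "srv01", "status": "decommissioned"},
        "DEF456": {"hostname": "srv02", "status": "decommissioned"},
    },
    "discovery": {
        "ABC123": {"hostname": "srv01", "status": "live"},
        # DEF456 missing from discovery entirely.
    },
}

# Count field-level disagreements per source against the reconciled view.
mismatches = Counter()
for serial, truth in golden.items():
    for source, records in sources.items():
        rec = records.get(serial, {})
        for field, value in truth.items():
            if rec.get(field) != value:
                mismatches[(source, field)] += 1

for (source, field), count in sorted(mismatches.items()):
    print(f"{source}: disagrees on '{field}' for {count} asset(s)")
```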
A well-proven method of optimising Configuration Management is to engage with a supplier who can provide a data reconciliation service coupled with advisory support. The case study below shows an example of how this can progress.
Case Study
This customer engagement started with an initial advisory project to evaluate the existing ITAM maturity. The customer is a multi-national banking sector corporation with approximately 10,000 seats. During this engagement, it was found that a large transformation project was planned for the majority of the company’s IT hardware.
Phase 1 included delivery of a workshop detailing a Service Design Overview (with descriptions and a RACI), identification of the data sources supporting the operational configuration management function, and in-depth analysis of how the transformation would be run and its effect on the current mode of operation. Data quality was clearly an area for improvement, to ensure that the assets to be supported were known and that the new configuration database could be populated with accurate, fit-for-purpose information.
In Phase 2, the data sources identified in Phase 1 (e.g. AD, CMDB, AV and Discovery) were fed into a ‘Data Hub’, where a reconciliation exercise was carried out. The analysis identified where data was incorrect and in which sources the processes needed to change to update existing information and to put actions in place so the data did not deteriorate; support was provided in this area. An advisory group was set up with all the stakeholders involved, so that the optimisation would be felt across all the operational areas and their buy-in would support the actions. Benefits realised included the correction of asset statuses, the removal of duplicates and the population of missing fields, increasing the overall accuracy of the data. Ongoing full technical and data services were provided for the customer in support of this work. Tangible savings will be realised when maintenance and support contracts are based on more accurate data.