Report 14

Information Systems Audit Report

General Computer Controls and Capability Assessments


We reported 455 general computer controls issues to the 54 agencies audited in 2013.

Of the 42 agencies that had capability assessments conducted, only eight were meeting our expectations for managing their environments effectively. More than half of the agencies were not meeting our benchmark expectations in three or more categories. Nevertheless, the overall result was a slight improvement on the prior year.

While Management of Changes and Physical Security were being managed effectively by most agencies, the Management of IT Risks, Information Security, Business Continuity and Operations need much greater focus.


The objective of our general computer controls (GCC) audits is to determine whether the computer controls effectively support the confidentiality, integrity, and availability of information systems. General computer controls include controls over the information technology (IT) environment, computer operations, access to programs and data, program development and program changes. In 2013 we focused on the following control categories:

  • Management of IT risks
  • Information security
  • Business continuity
  • Change control
  • Physical security
  • IT operations

We use the results of our GCC work to inform our capability assessments of agencies. Capability maturity models are a way of assessing how well developed and capable the established IT controls are and how well developed or capable they should be. The models provide a benchmark for agency performance and a means for comparing results from year to year.

The models we developed use accepted industry good practice as the basis for assessment. Our assessment of the appropriate maturity level for an agency’s general computer controls is influenced by various factors. These include: the business objectives of the agency; the level of dependence on IT; the technological sophistication of their computer systems; and the value of information managed by the agency.

What did we do?

We conducted GCC audits at 54 agencies and completed capability assessments at 42 of them. This is the sixth year we have been assessing agencies against globally recognised good practice.

We provided the 42 selected agencies with capability assessment forms and asked them to complete and return the forms at the end of the audit. We then met with each agency to compare its assessment with ours, which was based on the results of our GCC audits. The agreed results are reported below.

We use the 0-5 rating scale listed below to evaluate each agency’s capability and maturity levels in each of the GCC audit focus areas. The models provide a baseline for comparing results for these agencies from year to year. Our intention is to increase the number of agencies assessed each year.

Table 1: GCCCA Rating Criteria


The graphs and tables that follow show in green the percentage of agencies that attained at least a level three in the rating criteria. Red indicates that they were below level three.
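To illustrate how those green and red percentages could be derived from the 0-5 ratings, the sketch below computes the share of agencies at or above the level three benchmark in each category. The agency names and ratings are invented for the example and do not reproduce the report’s data.

```python
# Hypothetical illustration only: agency names and ratings are invented.
BENCHMARK = 3  # level three is the benchmark band

ratings = {
    "Agency A": {"IT risks": 3, "Information security": 2},
    "Agency B": {"IT risks": 4, "Information security": 3},
    "Agency C": {"IT risks": 2, "Information security": 2},
}

def percent_at_benchmark(ratings, category):
    """Percentage of agencies rated at least level three in a category."""
    scores = [agency[category] for agency in ratings.values()]
    at_or_above = sum(1 for s in scores if s >= BENCHMARK)
    return round(100 * at_or_above / len(scores))

for category in ["IT risks", "Information security"]:
    print(category, percent_at_benchmark(ratings, category))
```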

What did we find?

Our capability maturity model assessments show that agencies need to establish better controls to manage their IT operations, IT risks, information security and business continuity. Figure 1 below summarises the results of the capability assessments across all categories for the 42 agencies we audited. We expect agencies to be at least within the level three band across all categories.

Fig 1: GCCCA Capability Maturity Model Assessment Results

The model shows that the categories with the most weakness were management of IT risks, information security and business continuity.

The percentage of agencies reaching level three or above for individual categories was as follows:

Table 2: GCCCA Percentage of agencies attaining at least level three







There was an improvement in four areas from the previous year. Information security declined by four per cent and Change control remained the same.

Eight of the 42 agencies were assessed as level three or above across all categories, an improvement on the previous year when only three agencies achieved this. More than half of the agencies did not achieve a level three rating in three or more categories.

Seventeen agencies made improvements in at least one of the categories without regressing in any category. Five agencies showed no change. Eight agencies moved up in one category but went down in another. Five agencies regressed in at least one area without making any improvements.

Seven agencies were assessed for the first time this year. These agencies generally performed no better or worse than those that have had ongoing assessments. The results of our work show that some agencies have implemented better controls in their computing environments; however, most still need to do more to meet good practice.

IT Operations

This is the third year we have assessed IT Operations for agencies. There was a six per cent improvement in agencies’ IT practices and the service level performance provided to meet their business requirements (Figure 2).

Effective management of IT Operations is a key element for maintaining data integrity and ensuring that IT infrastructure can resist and recover from errors and failures.

We assessed whether agencies had adequately defined their requirements for IT service levels and allocated resources according to these requirements. We also tested whether service and support levels within agencies were adequate and met good practice. Our tests included whether:

  • policies and plans are implemented and effectively working
  • repeatable functions are formally defined, standardised, documented and communicated
  • effective preventative and monitoring controls and processes have been implemented to ensure data integrity and segregation of duties.

Fig 2: IT Operations

Examples of findings:

  • Several agencies that require staff to sign a confidentiality declaration or non-disclosure agreement had failed to collect or retain these documents.
  • A number of agencies do not have adequate processes in place to review security logs generated by core systems and infrastructure. Examples include:
    • logs for remote access systems
    • modifications made to databases containing confidential information
    • alerts from automated security systems
  • A number of agencies have either no, incomplete or out-dated Information Security Policies
  • One agency had inconsistent and incorrect incurring limits within their expense management system.

The following section highlights trends over the last five years for the remaining five GCC categories.

Management of IT risks

Fifty per cent of agencies met our expectations for managing IT risks, a six per cent improvement on the previous year.

Fig 3: Management of IT Risks

Examples of findings:

  • a number of agencies did not have a risk management process for identifying, assessing and treating IT and related risks. Many agencies also still did not have a risk register for ongoing monitoring and mitigation of identified risks
  • one agency had insufficiently or inaccurately recorded IT risks in its risk register. Key details such as the level of risk and compensating controls were misrepresented
  • one agency had not reviewed the IT risks identified within its risk register since 2009 to ensure the relevance of the risks and associated plans.

All agencies are required to have risk management policies and practices that identify, assess and treat risks that affect key business objectives. IT is one of the key risk areas that should be addressed. We therefore expect agencies to have IT specific risk management policies and practices established such as risk assessments, registers and treatment plans.

Without appropriate IT risk policies and practices, threats may not be identified and treated within reasonable timeframes, thereby increasing the likelihood that agency objectives will not be met.
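As an illustration of the ongoing monitoring a risk register supports, the sketch below flags entries whose last review is older than a chosen interval, in the spirit of the finding about risks unreviewed since 2009. The field names and the one-year interval are assumptions for the example, not part of the audit methodology.

```python
from datetime import date

# Illustrative sketch only: field names and review interval are assumptions.
REVIEW_INTERVAL_DAYS = 365

register = [
    {"risk": "Data centre power failure", "last_reviewed": date(2009, 3, 1)},
    {"risk": "Unpatched web server",      "last_reviewed": date(2013, 8, 1)},
]

def stale_entries(register, today):
    """Entries whose last review is older than the review interval."""
    return [e["risk"] for e in register
            if (today - e["last_reviewed"]).days > REVIEW_INTERVAL_DAYS]

print(stale_entries(register, date(2013, 12, 31)))
```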

Information security

Only 40 per cent of agencies met our benchmark for effectively managing information security, down four per cent from the previous year. It is clear from the basic security weaknesses we identified that many agencies have not implemented fundamental security controls to secure their systems and information.

Fig 4: Information Security

Examples of findings:

  • we found weak password settings, with one agency allowing network user accounts with passwords such as ‘aaaaaa’
  • agencies did not have effective processes in place to identify potential security vulnerabilities across their IT infrastructure in a timely manner. We ran our own vulnerability scans and identified critical and moderate security issues at multiple agencies
  • a number of agencies did not have good processes in place to review application and network accounts. We found one example of an agency with over 2000 generic user accounts.
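The kinds of basic checks involved can be illustrated in code. The sketch below flags trivially weak passwords (such as the ‘aaaaaa’ example above) and generic account names; the specific rules and name patterns are assumptions for the example, not the audit’s actual criteria.

```python
import re

# Illustrative checks only: rules and patterns here are assumptions,
# not the audit's published methodology.
def is_weak(password, min_length=8):
    """Flag passwords that fail basic length and complexity rules."""
    if len(password) < min_length:
        return True
    if len(set(password)) == 1:          # e.g. 'aaaaaa'
        return True
    classes = [r"[a-z]", r"[A-Z]", r"[0-9]", r"[^A-Za-z0-9]"]
    return sum(bool(re.search(c, password)) for c in classes) < 3

def looks_generic(username):
    """Flag accounts not tied to an individual (shared/service style names)."""
    return bool(re.match(r"^(test|temp|admin|user|guest)\d*$", username, re.I))

print(is_weak("aaaaaa"))         # True
print(is_weak("C0rrect!Horse"))  # False
print(looks_generic("user042"))  # True
```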

Information security is critical to maintaining data integrity and protecting key financial and operational systems from accidental or deliberate threats and vulnerabilities. We examined what controls were established and whether they were administered and configured to appropriately restrict access to programs, data, and other information resources.

Business continuity

To ensure business continuity, agencies should have in place a business continuity plan (BCP), a disaster recovery plan (DRP) and an incident response plan (IRP). The BCP defines and prioritises business critical operations and therefore determines the resourcing and focus areas of the DRP. The IRP needs to consider potential incidents and detail the immediate steps to ensure a timely, appropriate and effective response.

These plans should be tested on a periodic basis. Such planning and testing is vital for all agencies as it provides for the rapid recovery of computer systems in the event of an unplanned disruption affecting business operations and services.

We examined whether plans have been developed and tested. We found an 11 per cent improvement from last year but 64 per cent of the agencies still did not have adequate business continuity arrangements.

Fig 5: Business Continuity

Examples of findings

  • a number of agencies did not have a BCP, or had one that was still in draft or had not been reviewed for a number of years
  • while some agencies had extensive and detailed DRPs for systems and infrastructure, these plans had not been updated to reflect the current environment and had not been tested since their creation
  • one DRP was last reviewed in 2006 and did not support the current IT environment
  • many agency DRPs had never been tested or approved, and in one case the DRP did not reflect the agency’s environment, referring to infrastructure, key personnel and contacts that were no longer applicable.

Change control

We examined whether changes were appropriately authorised, implemented, recorded and tested. We reviewed any new applications acquired or developed and evaluated their consistency with management’s intentions. We also tested whether existing data converted to new systems was complete and accurate.

Agencies’ change control practices showed no movement from 2012.

Fig 6: Change Control

Examples of findings:

  • we found many agencies had no formal change management policies in place to ensure all changes to IT systems and applications are handled in a standardised manner
  • one agency did not comply with its internal Change Management Policy for record keeping and documentation. Records corresponding to major system changes could not be located, and the current change control procedures were failing to reliably capture:
    • complete details of the change
    • who approved the change
    • implementation dates, times and duration
    • risks posed by the change, with evidence of adequate testing and back-out contingency plans.
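The record-keeping gap above can be illustrated with a minimal validation sketch that checks a change record for the details listed. The field names are assumptions for the example; real change-management tools define their own schemas.

```python
# Minimal sketch: field names are assumptions, not a real tool's schema.
REQUIRED_FIELDS = {"description", "approver", "implemented_at", "duration",
                   "risk_assessment", "test_evidence", "backout_plan"}

def missing_fields(record):
    """Return the required details absent or empty in a change record."""
    return sorted(f for f in REQUIRED_FIELDS if not record.get(f))

change = {
    "description": "Patch finance application to v2.3",
    "approver": "",                       # approval not recorded
    "implemented_at": "2013-05-04 22:00",
}

print(missing_fields(change))
```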

An overarching change control framework is essential to ensure a uniform standard change control process is followed, achieve better performance, reduce time and staff impacts, and increase the reliability of changes. When examining change control, we expect defined procedures to be used consistently for changes to IT systems. The objective of change control is to facilitate the appropriate handling of all changes.

There is a risk that without adequate change control procedures, systems will not process information as intended and an agency’s operations and services will be disrupted. There is also a greater chance that information will be lost and access given to unauthorised persons.

Physical security

We examined whether computer systems were protected against environmental hazards and related damage. We also determined whether physical access restrictions were implemented and administered to ensure that only authorised individuals could access or use computer systems.

We found an 11 per cent improvement from last year with 86 per cent of agencies now meeting our benchmark for management of physical security.

Fig 7: Physical Security

Examples of findings:

  • a number of agencies could not provide reports on the maintenance and testing of their Uninterruptible Power Supply (UPS) and air conditioning
  • power generators to be used in the event of power failure had not been tested
  • no fire suppression system was installed within the server room
  • a number of agencies were found not to have temperature or humidity monitoring configured to alert on events in their server rooms
  • some agencies continue to inadequately restrict access to their computer rooms, with staff, contractors and maintenance people having unauthorised access to server rooms. For example, approximately 150 people across one organisation had access to the computer rooms, while the log detailing that access was not reviewed on a regular basis.

Inadequate protection of IT systems against various physical and environmental threats increases the potential risk of unauthorised access to systems and information and system failure.

The majority of our findings require prompt action

Figure 8 provides a summary of the distribution of the significance of our findings. It shows that the majority of our findings at agencies were rated as moderate. This means that the finding is of sufficient concern to warrant action being taken by the agency as soon as possible. However, it should be noted that combinations of issues can leave agencies with serious exposure to risk.

The diagram on the next page represents the distribution of ratings for the findings in each area we reviewed.

Fig 8: Distribution of ratings of findings across each area



Management of IT operations

Agencies should ensure that they have appropriate policies and procedures in place for key areas such as IT risk management, information security, business continuity and change control. IT strategic plans and objectives should support the business strategies and objectives. We recommend the use of standards and frameworks as references to assist agencies with implementing good practices.

Management of IT risks

Agencies need to ensure that IT risks are identified, assessed and treated within appropriate timeframes and that these practices become a core part of business activities.

Information security

Agencies should ensure good security practices are implemented, up to date, regularly tested and enforced for key computer systems. Agencies must conduct ongoing reviews of user access to systems to ensure it remains appropriate at all times.

Business continuity

Agencies should have a business continuity plan, a disaster recovery plan and an incident response plan. These plans should be tested on a periodic basis.

Change control

Change control processes should be well developed and consistently followed for changes to computer systems. All changes should be subject to thorough planning and impact assessment to minimise the likelihood of problems. Change control documentation should be current, and approved changes formally tracked.

Physical security

Agencies should develop and implement physical and environmental control mechanisms to prevent unauthorised access or accidental damage to computing infrastructure and systems.

Page last updated: July 1, 2014
