Report 11

Information Systems Audit Report

General computer controls and capability assessments

We reported 454 general computer controls (GCC) issues to the 45 agencies audited in 2015, compared with 398 issues at 42 agencies in 2014.

Only 10 agencies met our expectations for managing their environments effectively, compared with 11 in 2014. More than half of the agencies are not meeting our benchmark expectations in 3 or more categories and the overall result showed a 3% decline on the prior year.

Change controls and physical security are managed effectively by most agencies, but the management of IT risks, information security, business continuity and IT operations needs a much greater focus.

Background

The objective of our GCC audits is to determine whether the computer controls effectively support the confidentiality, integrity, and availability of information systems. General computer controls include controls over the information technology (IT) environment, computer operations, access to programs and data, program development and program changes. In 2015 we focused on the following control categories:

  • management of IT risks
  • information security
  • business continuity
  • change control
  • physical security
  • IT operations.

We use the results of our GCC work to inform our capability assessments of agencies. Capability maturity models are a way of assessing how well developed and capable the established IT controls are and how well developed or capable they should be. The models provide a benchmark for agency performance and a means for comparing results from year to year.

The models we developed use accepted industry good practice as the basis for assessment. Our assessment of the appropriate maturity level for an agency’s general computer controls is influenced by various factors. These include: the business objectives of the agency; the level of dependence on IT; the technological sophistication of its computer systems; and the value of information managed by the agency.

What did we do?

We conducted GCC audits and capability assessments at 45 agencies. This is the eighth year we have assessed agencies against globally recognised good practice.

We provided the 45 selected agencies with capability assessment forms and asked them to complete and return the forms at the end of the audit. We then met with each agency to compare their assessment with ours, which was based on the results of our GCC audits.

We use a 0-5 rating scale[1] to evaluate each agency’s capability and maturity levels in each of the GCC audit focus areas. The models provide a baseline for comparing results for agencies from year to year.

Table 1 - Rating criteria

[1] The information within this maturity model assessment is based on the criteria defined within the Control Objectives for Information and related Technology (COBIT) manual.
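As a rough illustration of how a 0-5 rating scale of this kind can be applied against the level 3 benchmark, the sketch below scores a single hypothetical agency. The category names are the report's; the ratings themselves are invented example values, not audit results:

```python
# Illustrative sketch only: the six GCC categories are from the report,
# but the ratings below are invented example values.

BENCHMARK = 3  # we expect agencies to rate level 3 or better in every category

ratings = {
    "management of IT risks": 3,
    "information security": 2,
    "business continuity": 2,
    "change control": 4,
    "physical security": 4,
    "IT operations": 3,
}

meets_all = all(r >= BENCHMARK for r in ratings.values())
shortfalls = [c for c, r in ratings.items() if r < BENCHMARK]

print(f"Meets benchmark in all categories: {meets_all}")
print(f"Categories below benchmark: {shortfalls}")
```

In this invented example the agency would not meet our overall expectation, because two of the six categories fall below level 3.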

What did we find?

Our capability maturity model assessments show that agencies need to establish better controls to manage IT operations, IT risks, information security and business continuity. Figure 1 summarises the results of the capability assessments across all categories for the 45 agencies we audited. We expect agencies to rate a level 3 or better across all the categories.

Figure 1 - Capability maturity model assessment results

The percentage of agencies reaching level 3 or above for individual categories was as follows:

Table 2 - Percentage of agencies at level 3 or above

The 2015 results were disappointing, with a 3% average decline across all areas when compared with 2014. The proportion of assessments rated level 3 to level 5 fell to 61.5%, compared with 69% in the previous year. However, this figure in 2012 was 53%, which demonstrates a general improvement over 3 years.

Ten of the 45 agencies were level 3 or above across all categories in 2015 compared to 11 in 2014. Thirty-four agencies were able to achieve level 3 or higher in at least 3 categories compared to only 14 agencies in 2014.

Nine agencies made improvements in at least 1 category without regressing in any other category. Thirteen agencies showed no change. Nine agencies moved up 1 category but went down in another. Eight agencies regressed in at least 1 category without making any improvements.
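The four movement groups above can be expressed as a simple classification over two years of per-category ratings. This is an illustrative sketch of the idea, not the method used in the audit:

```python
def classify_movement(prev, curr):
    """Classify an agency's year-over-year movement across rating categories.

    prev and curr map category name -> 0-5 maturity rating for the two years.
    Returns one of: 'improved', 'no change', 'mixed', 'regressed'.
    """
    deltas = [curr[c] - prev[c] for c in prev]
    went_up = any(d > 0 for d in deltas)
    went_down = any(d < 0 for d in deltas)
    if went_up and went_down:
        return "mixed"       # up in one category but down in another
    if went_up:
        return "improved"    # improved somewhere, regressed nowhere
    if went_down:
        return "regressed"   # regressed somewhere, improved nowhere
    return "no change"
```

For example, an agency that moved from level 2 to 3 in one category while dropping from 3 to 2 in another would be classified as 'mixed'.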

IT operations

The rating for ‘performance in IT practices and the service level performance provided to meet their agency’s business’ fell 3% in 2015 compared to the previous year. However, there has been overall improvement of 23% since 2011.

Effective management of IT operations is a key element for maintaining data integrity and ensuring that IT infrastructure can resist and recover from errors and failures.

We assessed whether agencies have adequately defined their requirements for IT service levels and allocated resources according to these requirements. We also tested whether service and support levels within agencies are adequate and meet good practice. Other tests included checking that:

  • policies and plans are implemented and effectively working
  • repeatable functions are formally defined, standardised, documented and communicated
  • effective preventative and monitoring controls and processes have been implemented to ensure data integrity and segregation of duties.

Figure 2 - IT operations

Note: Green represents the percentage of agencies that met the benchmark and red represents the percentage that did not.

Weaknesses we found included:

  • information and communication technology strategies not in place
  • no logging of user access and activity to critical systems or sensitive data
  • network logs kept for short periods, e.g. 1 hour to 4 days
  • former staff with access to agency networks and applications years after termination
  • unauthorised devices, such as USB drives and portable hard drives, able to connect to networks
  • no reviews of security logs for critical systems, including remote access and changes to databases containing confidential information
  • no follow-ups to automated alerts from security devices and applications
  • several agencies are running unsupported operating systems
  • no user education on security policy and security-related responsibilities, and induction processes not implemented or followed
  • unsupported databases for critical systems
  • background checks for key staff not undertaken
  • no incident management procedure
  • sensitive information stored in Excel spreadsheets and widely accessible
  • asset registers not maintained and ICT equipment unable to be located.

The above types of findings can mean that service levels from computer environments may not meet business requirements or expectations. Without appropriate ICT strategies and supporting procedures, ICT operations may not be able to respond to business needs and recover from errors or failures.

Management of IT risks

Sixty-four percent of agencies met our expectations for managing IT risks, a 28% improvement since the first assessment in 2008, with agencies showing improved management controls over risks.

Figure 3 - Management of IT risks

Weaknesses we found included:

  • risk management policies in draft or not developed
  • inadequate processes for identifying, assessing and treating IT and related risks
  • no risk registers
  • risk registers not maintained for ongoing monitoring and mitigation of identified risks.

All agencies are required to have risk management policies and practices that identify, assess and treat risks that affect key business objectives. IT is one of the key risk areas that should be addressed. We therefore expect agencies to have IT specific risk management policies and practices established such as risk assessments, registers and treatment plans.

Without appropriate IT risk policies and practices, threats may not be identified and treated within reasonable timeframes, thereby increasing the likelihood that agency objectives will not be met.

Information security

Only 40% of agencies met our benchmark for effectively managing information security, up 2% from the previous year. It is clear from the basic security weaknesses we identified that many agencies are lacking some important and fundamental security controls needed to protect systems and information. The trend across the last 8 years shows no change to information security controls.

We assessed whether agency controls were administered and configured to appropriately restrict access to programs, data, and other information resources.

Figure 4 - Information security

Weaknesses we found included:

  • information security policies did not exist, were out of date or not approved
  • easy to guess passwords for networks, applications and databases, e.g. Password1, guest
  • applications without critical patches applied (thousands of critical and high-severity patches outstanding)
  • operating systems missing critical patches (thousands of critical and high-severity patches outstanding)
  • highly privileged generic accounts shared with many staff and contractors, some accounts exist without agency knowledge
  • lack of processes to identify security vulnerabilities within IT infrastructure
  • no review of application and network accounts
  • weak password controls such as complexity, length, history, expiry, lock out
  • firewalls and intrusion detection/prevention systems not configured correctly leaving exposures
  • unknown accounts accessing firewalls and accounts using insecure access methods
  • anti-virus software not installed or out of date
  • default database accounts remain unchanged with credentials widely known and published on the internet
  • terminated staff used remote access accounts
  • unauthorised software installations on servers and staff computers
  • local administrator privileges granted to allow any activity.

Information security is critical to maintaining the integrity and reliability of key financial and operational systems, and to protecting them from accidental or deliberate threats and vulnerabilities.
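One of the weaknesses listed above, weak password controls, can be screened for with a simple complexity check. The sketch below is illustrative only: the minimum length and character-mix rules are example thresholds, not a policy from the audit.

```python
import re

# Illustrative password-complexity check. MIN_LENGTH and the character-class
# rules below are example thresholds, not requirements from the report.

MIN_LENGTH = 10

def meets_complexity(password: str) -> bool:
    """Return True if the password meets basic length and character-mix rules."""
    if len(password) < MIN_LENGTH:
        return False
    required_patterns = [
        r"[a-z]",          # at least one lower-case letter
        r"[A-Z]",          # at least one upper-case letter
        r"[0-9]",          # at least one digit
        r"[^A-Za-z0-9]",   # at least one symbol
    ]
    return all(re.search(p, password) for p in required_patterns)

print(meets_complexity("Password1"))    # → False (an easy-to-guess password noted in our findings)
print(meets_complexity("T7#qkL2!mZ"))   # → True
```

Even a basic automated check of this kind would catch passwords such as 'Password1' and 'guest' before they reach production networks, applications and databases.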

Business continuity

To ensure business continuity, agencies should have in place a business continuity plan (BCP), a disaster recovery plan (DRP) and an incident response plan (IRP). The BCP defines and prioritises business critical operations and therefore determines the resourcing and focus areas of the DRP. The IRP needs to consider potential incidents and detail the immediate steps to ensure timely, appropriate and effective response.

These plans should be tested on a periodic basis. Such planning and testing is vital for all agencies as it provides for the rapid recovery of computer systems in the event of an unplanned disruption affecting business operations and services.

We examined whether plans have been developed and tested. We found a 9% reduction from last year, with 64% of the agencies still not having adequate business continuity and disaster recovery arrangements in place. The trend over the last 8 years has shown no notable improvement. This may mean that agencies do not afford this area proper priority.

Figure 5 - Business continuity

Weaknesses we found included:

  • no BCPs
  • BCPs in draft or not reviewed for many years
  • tolerable outages for critical systems not defined
  • no DRPs
  • old and redundant DRPs with some not reflecting current ICT infrastructure
  • DRPs never tested
  • backups never tested and not stored securely
  • uninterruptible power supplies not tested or not functional.

Without appropriate continuity planning there is an increased risk that key business functions and processes will fail and not be restored in a timely manner after a disruption. Disaster recovery planning will help enable the effective and timely restoration of systems supporting agency operations and business functions.

Change control

We examined whether changes are appropriately authorised, implemented, recorded and tested. We reviewed any new applications acquired or developed to evaluate consistency with management’s intentions. We also tested whether existing data converted to new systems was complete and accurate.

Change control practices have slowly been improving since 2008, with almost 3 in 4 agencies achieving a level 3 or higher rating.

Figure 6 - Change control

Weaknesses we observed included:

  • no formal change management policies in place
  • changes to critical systems not logged or approved
  • no documentation regarding changes made to systems and critical devices
  • risk assessments for major changes to infrastructure not performed
  • individuals are able to request and approve their own changes
  • change control groups exist but have never met to manage or consider changes
  • changes affecting staff are not communicated.

An overarching change control framework is essential to maintaining a uniform standard change control process and to achieving better performance, reduced time and staff impact and increased reliability of changes. When examining change control, we expect defined procedures are used consistently for changes to IT systems. The objective of change control is to facilitate appropriate handling of all changes.

There is a risk that without adequate change control procedures, systems will not process information as intended and agencies’ operations and services will be disrupted. There is also a greater chance that information will be lost and access given to unauthorised persons.

Physical security

We examined whether computer systems were protected against environmental hazards and related damage. We also determined whether physical access restrictions are implemented and administered to ensure that only authorised individuals have the ability to access or use computer systems.

Six of the 45 agencies fell below our expectations for the management of physical security.

Figure 7 - Physical security

Weaknesses we observed included:

  • backup power generators for use in the event of power failure not tested
  • no fire suppression system installed within the server room
  • no temperature or humidity monitoring for server rooms
  • no restricted access to computer rooms for staff, contractors and maintenance.

Inadequate protection of IT systems against various physical and environmental threats increases the potential risk of unauthorised access to systems and information and system failure.

The majority of our findings require prompt action

Figure 8 provides a summary of the distribution of significance of our findings. It shows that the majority of our findings at agencies are rated as moderate. This means that the finding is of sufficient concern to warrant action being taken by the entity as soon as possible. However, it should be noted that combinations of issues can leave agencies with more serious exposure to risk.

Figure 8 - Distribution of ratings for the findings

Recommendations

Management of IT operations

Agencies should ensure that they have appropriate policies and procedures in place for key areas such as IT risk management, information security, business continuity and change control. IT strategic plans and objectives should support the business strategies and objectives. We recommend the use of standards and frameworks as references to assist agencies with implementing good practices.

Management of IT risks

Agencies need to ensure that IT risks are identified, assessed and treated within appropriate timeframes and that these practices become a core part of business activities.

Information security

Agencies should ensure good security practices are implemented, up-to-date and regularly tested and enforced for key computer systems. Agencies must conduct ongoing reviews for user access to systems to ensure they are appropriate at all times.

Business continuity

Agencies should have a business continuity plan, a disaster recovery plan and an incident response plan. These plans should be tested on a periodic basis.

Change control

Change control processes should be well developed and consistently followed for changes to computer systems. All changes should be subject to thorough planning and impact assessment to minimise the likelihood of problems. Change control documentation should be current, and approved changes formally tracked.

Physical security

Agencies should develop and implement physical and environmental control mechanisms to prevent unauthorised access or accidental damage to computing infrastructure and systems.

 
Page last updated: June 22, 2016
