To begin adequately securing their databases, agencies should assess the value of data stored within them. This data may be personal, commercially sensitive, a target for fraud, or protected under legislation. Agencies should fully understand the consequences of data disclosure, theft or tampering to ensure they expend appropriate effort on security controls.
When applying data security controls, it is important to consider where else production data may reside. Locations may include:
- Test, Development or Training environments
Where possible, these systems should use anonymised or masked data. If it is essential that production data is used, even if it is old data, the databases should be secured to the same level as the Production system. Development or Test systems may be less hardened and host potentially insecure services, so additional network segregation may be required.
- Other systems such as reporting or staging databases
Other systems within an agency may need to extract and process sensitive data for a variety of purposes. In these instances, the minimum required data should be extracted to address requirements. Access restrictions, password security and account settings present in the primary database should be applied consistently across these additional systems.
- Backups and disaster recovery images of the production database
The production database should be backed up and replicated to allow for recovery in the event of a disaster. Any replicated copies or versions of the database should be secured to the same level as the production system. Agencies should conduct a risk assessment on the storage of backup images or media. This should include the physical storage of external media, and may include encrypting backups.
The accounts used to access databases must be well controlled and secured. All database user accounts, regardless of their purpose, require a strong password. The passwords used by general users should conform to good practice for complex, hard to guess passwords. These passwords should be set in line with agency password policies and standards. Passwords for administrators and other highly privileged users should be more complex, to reflect the risk presented by a compromised account.
Service and System accounts should be used only by automated services and processes, not by individuals. Agencies should consider changing these passwords periodically, and when administrators with access to these accounts leave the agency.
To enable accountability and auditing, administrators should utilise their personal accounts where possible. The passwords for user and administrator accounts should expire periodically.
Systems should be configured to automatically lock out user accounts after a set number of incorrect login attempts. This can defeat attempts by an attacker to guess the password, or to crack it by ‘brute force’ using an automated tool.
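As a sketch, a lockout policy of this kind can be expressed as follows, assuming a hypothetical in-memory attempt counter and a threshold of five attempts (both assumptions; real systems should use the DBMS's built-in lockout controls, such as failed-login profile limits):

```python
# Illustrative account lockout sketch. The threshold and in-memory store
# are assumptions for demonstration, not a production mechanism.
MAX_ATTEMPTS = 5  # assumed threshold; set per agency policy

failed_attempts = {}
locked_accounts = set()

def record_failed_login(username):
    """Count a failed login and lock the account at the threshold."""
    failed_attempts[username] = failed_attempts.get(username, 0) + 1
    if failed_attempts[username] >= MAX_ATTEMPTS:
        locked_accounts.add(username)

def record_successful_login(username):
    """Reset the counter on success, unless the account is locked."""
    if username in locked_accounts:
        raise PermissionError(f"account '{username}' is locked")
    failed_attempts[username] = 0
```

Resetting the counter only on a successful login means a slow, distributed guessing campaign still accumulates towards the lockout threshold.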
A hard to guess password:
- is of sufficient length
- contains a mix of alphanumeric and ‘special’ characters
- does not contain the username, the name of the application or the agency name
- does not contain common dictionary words (e.g. password, test, welcome) or patterns (e.g. qwerty, 12345, abcde)
- has not been used on that system previously by that user.
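The attributes above can be combined into a single check, sketched below. The minimum length, word lists and function name are illustrative assumptions rather than agency policy:

```python
import re

# Hedged sketch of a password check covering the attributes listed above.
# The word and pattern lists are illustrative, not exhaustive.
COMMON_WORDS = {"password", "test", "welcome"}
COMMON_PATTERNS = {"qwerty", "12345", "abcde"}

def is_acceptable(password, username, app_name, agency_name, history):
    lowered = password.lower()
    if len(password) < 12:          # assumed minimum; set by agency policy
        return False
    if not (re.search(r"[A-Za-z]", password)
            and re.search(r"[0-9]", password)
            and re.search(r"[^A-Za-z0-9]", password)):
        return False                # requires letters, digits and 'special' characters
    for banned in (username, app_name, agency_name):
        if banned.lower() in lowered:
            return False            # no username, application or agency name
    for word in COMMON_WORDS | COMMON_PATTERNS:
        if word in lowered:
            return False            # no common words or keyboard patterns
    return password not in history  # no reuse on this system
```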
Database accounts should be periodically audited and examined to see if they:
- are still required
- are assigned appropriate access rights and privileges.
This process ensures that:
- access is revoked for users that have left the agency
- access is modified or revoked for users that have changed roles within the agency
- accounts belonging to obsolete services, or allocated for short-term initiatives, are removed
- accounts that have not been used for extended periods are assessed
- possible ‘rogue’ accounts that have been created without authorisation (possibly by an attacker) are discovered and removed.
These alterations should be made at the time of the role change or termination, with audits acting as a follow-up assurance process. An attacker may also seek to create a new account to maintain access to a database without disrupting a normal user. Regular audits will aid in the detection of any ‘rogue’ accounts.
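An audit of this kind can be sketched against a hypothetical account table, as below. The table name, columns and dates are assumptions for illustration; real DBMSs expose this information through their own catalogues (e.g. Oracle's DBA_USERS):

```python
import sqlite3
from datetime import datetime, timedelta

# Sketch of a periodic account audit. The schema and data are hypothetical.
conn = sqlite3.connect(":memory:")
conn.execute("""CREATE TABLE db_accounts (
    username TEXT, last_login TEXT, authorised INTEGER)""")
conn.executemany("INSERT INTO db_accounts VALUES (?, ?, ?)", [
    ("alice",   "2015-06-01", 1),
    ("svc_old", "2014-01-15", 1),   # unused for an extended period
    ("mystery", "2015-06-20", 0),   # no authorisation record: possible rogue account
])

# Accounts unused for more than 180 days (assumed threshold).
cutoff = (datetime(2015, 7, 1) - timedelta(days=180)).strftime("%Y-%m-%d")
stale = [r[0] for r in conn.execute(
    "SELECT username FROM db_accounts WHERE last_login < ?", (cutoff,))]

# Accounts with no matching authorisation record.
rogue = [r[0] for r in conn.execute(
    "SELECT username FROM db_accounts WHERE authorised = 0")]
```

The two result lists feed the follow-up actions described above: stale accounts are assessed, and unauthorised accounts are investigated and removed.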
Database software, and its supporting operating systems, should be patched and upgraded regularly. Agencies should include databases in their wider patching and vulnerability management programs.
Patches, service packs and upgrades issued by the vendor should be risk assessed and installed on Test or other Pre-Production environments before installation on live systems. This reduces the risk of any complications or issues arising from a patch. This process should be handled as per agency Change Control procedures.
Agencies may also wish to use a Vulnerability Scanning tool to gain assurance that their patching program is effective. These tools also help to ensure that systems remain patched after restores from backup or major configuration changes, and to identify exposure to newly disclosed vulnerabilities.
Database software and operating systems will also undergo significant version changes and upgrades over time. Agencies should keep pace with version changes to ensure that their implementations are supported by the software vendor. Over time, vendors may stop supporting software, meaning that patches (including security patches) will not be released and technical support will not be offered. While it may be possible to enter into a custom support arrangement, this will come at a heavy financial cost.
Where databases cannot be quickly patched or upgraded to new versions, compensating security controls should be applied. These may include additional physical or logical network segregation and increased monitoring. It is important that the risks posed by out-of-date systems are assessed on a regular basis. Examples of compensating controls are included in this document under Attack Surface and System Hardening.
Additional advice on system patching and vulnerability management is available from the ASD, both in the ASD ISM and at a high level from the Cyber Security Operations Centre.
Minimising the available Attack Surface reduces the opportunities for an attacker to exploit weaknesses in a database. Any unused or unconfigured database schemas, features or services should be removed.
Database servers should be segregated from the rest of the network, with rules allowing only necessary services and ports to be exposed to end users. This reduces the exposure of other management services and interfaces that may be vulnerable to attack.
Segregation of the internal network (beyond just segregation from the internet) will increase the difficulty, or ‘cost’, for an attacker seeking to access sensitive data and systems.
Some administrators will require additional access to databases from their workstations to perform administrative tasks and troubleshooting. These workstations should be limited in number and have additional security controls, such as monitoring, applied. Agencies may consider the use of ‘jump servers’ or ‘jump hosts’ that give administrators remote access to a general use system within the same network segment as the database.
The configuration of a newly installed database can be insecure and should be hardened.
Post-installation tasks include:
- restrict default access rights and permissions, particularly ‘PUBLIC’ access
- change default passwords
- remove built-in user accounts that are not required
- align configuration settings with the IT environment and system requirements.
At the database level, data protection can be achieved by:
- encryption of the database, or certain records within it
- use of Virtual Private Databases to apply explicit security restrictions within database tables
- redaction of sensitive data. This may include highly sensitive personal information or credit card numbers.
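As an illustration of redaction, the sketch below masks anything resembling a credit card number before data leaves the primary database. The 13-16 digit pattern and fixed mask are assumptions; production redaction is usually performed by the DBMS itself or an ETL layer:

```python
import re

# Matches 13-16 digits, optionally separated by spaces or hyphens
# (an illustrative approximation of card number formats).
CARD_RE = re.compile(r"\b(?:\d[ -]?){12}\d{1,4}\b")

def redact_cards(text):
    """Replace anything that looks like a card number with a fixed mask."""
    return CARD_RE.sub("[REDACTED]", text)
```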
As mentioned above, any data protection controls applied within the primary database should be replicated within any other copies or instances of the data.
If databases and their underlying operating systems are misconfigured, sensitive data or system settings can be exposed to unauthorised users. Before deploying a new system into a production environment, the configuration should be checked against best practice recommendations and tailored to the purpose of the system.
Agencies may wish to engage specialists to review and test database configurations before deploying a system or after significant changes occur. A robust Change Control process, supported by policy and documented procedures, should be used to ensure that any modifications to the production environment are properly planned, tested and endorsed.
An attacker may also alter the system configuration to open a security hole, or back door, allowing them to maintain access to data during a prolonged attack. Such an alteration could be considered an unauthorised change. Agencies that apply suitable detective controls, such as auditing and logging, may be able to detect these changes.
Databases should be configured to log and store sensitive actions performed by users, and the system itself. The nature and detail of this logging will differ from system to system, depending on agency requirements.
Processes should be established to monitor and audit logs for suspicious behaviour or other anomalies. Logging should capture, amongst other items:
- successful and rejected login attempts
- account lockouts
- account administration tasks:
- account creation and deletion
- password changes
- user rights and role changes
- account locks and unlocks
- execution of queries (SELECT, UPDATE, DELETE and INSERT) for sensitive data
- changes to system configuration.
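As an illustration, some of these events can be captured with database triggers. The sketch below uses SQLite triggers on a hypothetical salaries table; production DBMSs provide native audit facilities (including auditing of SELECT queries, which triggers cannot capture) that should be preferred:

```python
import sqlite3

# Illustrative audit trail via triggers; table names are assumptions.
conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE salaries (employee TEXT, amount INTEGER);
CREATE TABLE audit_log (action TEXT, employee TEXT,
                        at TEXT DEFAULT CURRENT_TIMESTAMP);

CREATE TRIGGER log_update AFTER UPDATE ON salaries
BEGIN
    INSERT INTO audit_log (action, employee) VALUES ('UPDATE', NEW.employee);
END;
CREATE TRIGGER log_delete AFTER DELETE ON salaries
BEGIN
    INSERT INTO audit_log (action, employee) VALUES ('DELETE', OLD.employee);
END;
""")

conn.execute("INSERT INTO salaries VALUES ('alice', 50000)")
conn.execute("UPDATE salaries SET amount = 55000 WHERE employee = 'alice'")
conn.execute("DELETE FROM salaries WHERE employee = 'alice'")
events = [row[0] for row in conn.execute("SELECT action FROM audit_log")]
```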
Log data should be adequately protected against unauthorised deletion and modification to ensure its integrity and reliability. Automatically duplicating or sending logs to a separate logging system may be advisable in some environments. This allows more stringent access controls to be applied, particularly for accounts that have high levels of privilege to a database and/or operating system. Storing logs in a secondary location also provides a level of redundancy if the primary system is compromised or destroyed.
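Duplicating log records to a second destination can be sketched with Python's standard logging module. The temporary paths below stand in for a real local log file and a remote log server, which would typically be reached via syslog or a SIEM agent rather than a second file:

```python
import logging
import os
import tempfile

# Stand-in destinations; real deployments would use a remote collector.
log_dir = tempfile.mkdtemp()
primary = os.path.join(log_dir, "audit.log")    # local copy
replica = os.path.join(log_dir, "replica.log")  # stand-in for a remote log server

logger = logging.getLogger("db_audit")
logger.setLevel(logging.INFO)
for path in (primary, replica):
    logger.addHandler(logging.FileHandler(path))

# Every record is written to both destinations.
logger.info("login rejected for user 'alice' from 10.0.2.15")
for handler in logger.handlers:
    handler.flush()
```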
Processes should be developed to review and audit logging data once it has been collected. Agencies should use their prior assessments of sensitive data and high risk activities to guide log reviews. If possible, segregation of duties should be introduced to ensure that administrators are not the only party to audit systems under their control.
References:
- ASD Information Security Manual 2015 Controls, pages 189 to 193: http://www.asd.gov.au/publications/Information_Security_Manual_2015_Controls.pdf
- ASD, Assessing Security Vulnerabilities and Patches: http://www.asd.gov.au/publications/protect/Assessing_Security_Vulnerabilities_and_Patches.pdf
- CIS Security Benchmarks: https://benchmarks.cisecurity.org/
- ISO/IEC 27002:2013, section 12.4: http://www.iso.org/iso/catalogue_detail?csnumber=54533