(1) Computing and information systems underpin the University's activities and are essential to the teaching, learning, research and administration functions of Charles Sturt University (the University). (2) The University is a critical education asset under the Security of Critical Infrastructure Act 2018 and subject to mandatory reporting under that Act. (3) The University acknowledges the requirement to manage cyber risk arising from criminal activities, internal threats, and local and foreign interference. (4) These guidelines support the Information Technology Policy and set out the University's information security practices regarding the confidentiality, integrity, and availability of all information and communication technology (ICT) infrastructure, systems and processes. (5) The purpose of these guidelines is to: (6) Appropriate security standards and measures must be established, implemented, monitored, reviewed and improved as required, to ensure that the University's Information Security Management System (ISMS) documentation framework and business objectives are met. (7) It is the intent of the University that the Information Security Management System (ISMS) documentation framework be implemented and appropriate security measures are: (8) These guidelines apply to all authorised users who own, manage, access or use the University's Information and Communication Technology (ICT) services. (9) These guidelines cover all: (10) These guidelines are a component of the University’s Information Security Management System (ISMS) documentation framework and should be read in conjunction with, but not limited to: (11) See the Information Technology Policy. (12) Nil. (13) To establish a management framework for the implementation and operation of information security, and to assign roles and responsibilities for the management of information security within the University. (14) Information security responsibilities should be defined and allocated in accordance with the Information Technology Policy. Responsibilities for the protection of individual assets and for carrying out specific information security processes should be identified. Responsibilities for information security risk management activities and for acceptance of residual risks are defined in the Information and Communication Technology (ICT) Information Security and Risk Management Policies. (15) Executive management of the Division of Information Technology have the overall responsibility for information security within the University. Guidelines on roles and responsibilities have been outlined in Appendix B. (16) Roles and areas of responsibility must be segregated to reduce opportunities for unauthorised modification or misuse of information or services. Whenever a computer-based process involves sensitive, valuable, or critical information, the respective system should include controls that ensure that no one individual has exclusive control over information. Furthermore, there should be, where practical, separation of duties between authorised users assigned to the development/test environment(s) and to those assigned to the production environment. (17) The University aims for compliance with the NSW Cyber Security Policy as maintained by the NSW Department of Finance Services and Innovation. (18) The University aims for compliance with the Notifiable Data Breaches scheme as maintained by the Office of the Australian Information Commissioner. 
Refer to the University Information Technology Procedure - Personal Data Breach. (19) The University will comply with all registration, reporting and operational requirements of the Security of Critical Infrastructure Act 2018. (20) The Division of Information Technology ICT Security Team should maintain appropriate contacts with special interest groups or other specialist security forums and professional associations including: (21) Information security must be integrated into the University's project management methods to ensure that information security risks are identified and addressed throughout all stages of a project. Refer to the DIT Project Security Considerations Guide. (22) To ensure that all authorised users: (23) All authorised users should understand their responsibilities towards handling the University's information. Information security responsibilities should be addressed prior to, or on commencement of, employment through adequate job descriptions and in the terms and conditions of employment. (24) Background verification checks should be carried out on all candidates for employment, including contractors, volunteers and third-party users. The checks should be proportional to business requirements, the classification of the information to be accessed, and the perceived risks. Applications for employment involving access to the University's facilities require a background screening process that is commensurate with the sensitivity of the information to be accessed during employment, and the system privileges required for the job function. (25) Screening for potential and current employees will be in accordance with the Employment Screening Procedure. (26) Screening procedures should also be carried out for agents, contractors, volunteers, and third-party users where appropriate. (27) Where contractors are provided through a recruitment agency, the contract with the recruitment agency should clearly specify the agency's responsibilities for candidate screening, as well as the notification procedures to be followed if screening has not been completed, or if results give cause for doubt or concern. In the same way, the agreement with the third party should clearly specify all responsibilities and notification procedures for screening. (28) As part of contractual obligations, authorised users must sign the terms and conditions of their employment contract. (29) Contractors, volunteers, and other authorised third-party users must sign a contractual agreement or an equivalent privacy and security agreement. The agreement should state the individual and University responsibilities for information security. See also the Finance Procedure - Contractors and Consultants. (30) The terms and conditions of employment or engagement should reflect the requirement to comply with all University policies and procedures. (31) Individuals that are not engaged via a University employment process (e.g. engaged via a third-party organisation) are subject to the same information security procedures as defined for employees. This must be followed prior to the granting of access to the University's ICT systems. (32) Confidentiality agreements with the individuals may only be omitted if an existing contract with the third party specifies appropriate confidentiality requirements.
(33) Authorised users must be aware of: (34) Management responsibilities should also be defined in position descriptions to ensure information security is applied throughout an individual's employment within the University. (35) An adequate level of awareness, education, and training in security procedures and the correct use of information processing facilities should be provided to all authorised users to minimise possible information security risks. (36) Management should require that authorised users apply information security in accordance with established University policies, standards, and procedures. Management responsibilities should include ensuring that authorised users: (37) Access rights should be reviewed by system custodians on a regular basis to ensure that permissions reflect the activities being carried out by the individual. It is also recommended that ICT security responsibilities be added to job descriptions for management positions where relevant. (38) Training should be provided to managers with ICT security responsibilities where required. (39) A formal information security awareness program should be implemented to give all authorised users an awareness of the importance of information security, cardholder data security and privacy. The program should educate users on information security problems and incidents and how to respond to them according to the needs of their work role. (40) The University's security awareness program should include multiple methods of communication (e.g. posters, emails, memos, web-based training, meetings and promotions) to educate authorised users. All authorised users should receive appropriate information security awareness training and regular updates as relevant for their job function. (41) Information security awareness training must be conducted during employee induction and annually thereafter as referenced in the Information Technology Procedure - Acceptable Use and Access. This training should be incorporated into standard employee orientation and education processes. (42) Information security awareness, education and training activities should: (43) Authorised users should receive training in the correct use of ICT facilities and information on disciplinary processes, prior to the granting of access to ICT services. Refer to the Information Technology Access and Induction Procedure for further information. (44) All authorised users departing the University or changing roles should be governed by termination and change of employment processes conducted in an orderly manner. Changes to roles and responsibilities within the University should also be managed, with the termination of the respective responsibility or employment handled in line with this document. (45) Responsibilities should be in place to ensure that the authorised user's exit from the University is managed and that the return of all equipment and information assets and the removal of all access rights are completed. The departing authorised user should be reminded of continuing non-disclosure obligations after leaving. (46) The University's Division of People and Culture (DPC) is generally responsible for the overall termination process, working together with the supervising manager of the authorised user leaving to manage the information security aspects of the relevant procedures.
(47) Supervisors must co-ordinate exits and changes with DPC to ensure that appropriate computer system and building access is maintained or revoked. (48) In the case of a contractor, this termination responsibility process may be undertaken by the agency responsible for the contractor, and in the case of other users (e.g. service providers), this might be handled by their respective organisation. (49) The communication of exit responsibilities should include: (50) Changes of responsibility or employment should be managed as the termination of the respective responsibility or employment, and the new responsibility or employment should be controlled. (51) To ensure the integrity of the University's information assets and that data confidentiality is maintained when equipment or services are established, replaced, decommissioned or serviced. This section also includes the handling, control and disposal of storage media. (52) Business critical assets should be identified and maintained in the appropriate asset register – Applications Portfolio, Data Assets Register or Infrastructure Assets Register. (53) Asset custodians must be assigned to all assets identified and be recorded within the respective asset register. Asset custodians should work with the Division of Information Technology to determine appropriate classification levels for their information assets and make decisions about authorised users permitted to access and use the information. (54) The responsibility of an asset custodian is to: (55) The Information Technology Procedure - Acceptable Use and Access defines: (56) All authorised users must return all University assets in their possession upon termination of their employment, contract or agreement. The termination process should be formalised in accordance with Part B and should include the return of all previously issued software, corporate documents and ICT equipment. This includes: (57) In cases where an authorised user has University operational knowledge, that information should be documented and transferred to the University. Where the authorised user plays a role in information security plans (e.g. incident response procedure or contingency plan), respective plans must be updated accordingly. (58) Sensitive information must be removed from any information system equipment that has been used for University business prior to its disposal, donation or re-use. (59) Disposal of equipment should be undertaken as per the Information Technology Procedure – Purchasing and Disposal. (60) Classifications and associated protective controls for information and/or information assets must be in accordance with the business requirements for sharing and/or restricting information, including the business impacts associated with such requirements. Classification guidelines have been developed, implemented and communicated to all University information asset custodians. Information and/or information assets must be classified in terms of value, legal requirements, sensitivity and criticality to the University. (61) The University's Data security classification scheme requires information assets to be protectively marked into one of four classifications. The way data is handled, published, moved and stored depends on this classification. (62) Internal procedures exist for the handling, processing, storing and communicating of information and/or information assets in accordance with the Data security classification scheme adopted by the University. Refer to the Data Access Form.
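To illustrate clauses (60) to (62), the Python sketch below shows one way a handling rule could be looked up from a protective marking before data is published, moved or stored. The four labels and the handling attributes used here are placeholder assumptions for demonstration only; the authoritative labels and rules are those of the University's Data security classification scheme.

    # Illustrative sketch only: the four classification labels and the handling
    # rules below are assumptions for demonstration; the University's actual
    # scheme is defined in its Data security classification scheme.
    from dataclasses import dataclass

    @dataclass(frozen=True)
    class HandlingRule:
        encrypt_at_rest: bool
        encrypt_in_transit: bool
        may_leave_university: bool

    # Hypothetical four-level scheme (labels are placeholders, not official names).
    HANDLING = {
        "PUBLIC": HandlingRule(False, False, True),
        "INTERNAL": HandlingRule(False, True, False),
        "CONFIDENTIAL": HandlingRule(True, True, False),
        "HIGHLY_CONFIDENTIAL": HandlingRule(True, True, False),
    }

    def handling_for(classification: str) -> HandlingRule:
        """Look up the handling rule for a protectively marked asset."""
        try:
            return HANDLING[classification.upper()]
        except KeyError:
            raise ValueError(f"Unknown classification: {classification!r}")

    if __name__ == "__main__":
        print(handling_for("confidential"))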
(63) The management of computer media must be controlled. (64) Individuals must take appropriate steps to ensure the security of any computer media removed from the University. All media, in accordance with the classification of the data stored on the media, must be stored in a safe and secure environment. (65) Remote users who may or may not have direct access to the University network must ensure that data, which is backed up to removable media, uses strong encryption methods and is stored in a secure manner. (66) Classification of such data should be determined in collaboration with data and/or business custodian and the Enterprise Architect, Information. (67) All computer media must be disposed of securely and safely when no longer required. Refer to the Information Technology Procedure - Purchasing and Disposal. (68) Disposal of University information must be in accordance with the Records Management Policy and Records Management Procedure. (69) Physical media containing data that is no longer required must be either: (70) For hard-copy materials, acceptable methods of disposal include: (71) To detail access control requirements for information and/or information processing facilities. (72) Business requirements for access control must be defined and documented with access to system components and/or sensitive information restricted to only those individuals whose job requires such access. All authorised users with provisioned access should be given a clear statement of the business requirements which must be met by the access controls. (73) All authorised users must have a single unique user ID and a password for access to any aspect of the University’s ICT systems. (74) Shared, generic or group user IDs and/or passwords must not be created nor used. Conference accounts must be restricted to internet access only. (75) Access to privileged accounts must be restricted to the least privilege level necessary to perform job responsibilities. (76) Assignment of privileges must always be based on each individual’s job classification and function as per the Delegations and Authorisations Policy. Levels of privileges required for each role (e.g. user, administrator) for accessing resources must be defined. (77) Documented approval by system custodians is required for all access, specifying required privileges. Access controls should be implemented via automated access control systems (e.g. Identity and Access Management systems) with access control roles (e.g. access request, access authorisation, access administration) implemented with appropriate separation of duties. (78) Each user must be allocated access rights and permissions to computer systems and data that correspond with the tasks they are expected to perform in accordance with the Information Technology Procedure - Acceptable Use and Access. This should be role-based (e.g. a user account will be added to a group that has been created with access permissions required by that job role). (79) A request for temporary access via the Temporary Access Administration System to the University’s network and/or computer systems must include a declaration by the account supervisor that appropriate checks have been carried out and correct authorisation obtained prior to temporary access account creation. 
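Clauses (76) to (78) require access to be granted on a role basis, with a user account added to a group holding the permissions that the job role needs. The Python sketch below shows the general shape of such a role-to-permission mapping; the role names and permissions are illustrative assumptions, not University-defined values.

    # Minimal sketch of role-based access checks in the spirit of clauses (76) to (78):
    # a user account is added to a group created with the permissions required by the
    # job role. The role names and permissions below are illustrative assumptions only.
    ROLE_PERMISSIONS = {
        "student": {"read_own_records"},
        "lecturer": {"read_own_records", "read_class_lists", "enter_grades"},
        "sysadmin": {"read_own_records", "manage_servers"},
    }

    def user_permissions(roles: set) -> set:
        """Union of the permissions granted by each role the user holds."""
        granted = set()
        for role in roles:
            granted |= ROLE_PERMISSIONS.get(role, set())
        return granted

    def can(user_roles: set, permission: str) -> bool:
        """True if any of the user's roles grants the requested permission."""
        return permission in user_permissions(user_roles)

    if __name__ == "__main__":
        print(can({"lecturer"}, "enter_grades"))   # True
        print(can({"student"}, "manage_servers"))  # False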
(80) When an employee departs the University under normal circumstances, the termination of access shall be performed in accordance with the standard process as triggered by the human resources management system and implemented by the Identity and Group Management System (IGMS). (81) For services not managed by IGMS, it is the responsibility of the systems custodian to ensure suspension of access for authorised users that have ceased employment with the University. (82) In exceptional circumstances where there is perceived to be a risk that such employee may take action that will harm the University prior to or upon termination, a request to remove access may be approved and actioned by senior management (including Directors, Executive Directors, Deans and Heads of School) in advance of notice of termination being given. This precaution should especially apply in the case where the individual concerned has privileged access rights (e.g. domain administrator rights). (83) User accounts should be initially suspended or disabled only and not deleted. User account names should not be reused as this may cause confusion in the event of a later investigation. (84) The University must provide secure remote access services to enable authorised users to work from a remote worksite. Appropriate controls must be implemented to authorise and control remote work activities. (85) Any remote access for work purposes should employ multifactor authentication mechanisms (e.g. 2-Step Verification, SMS codes) where provided. (86) Authorised users must structure their remote working environment so that it is compliant with the Employment Conditions Procedure - Workplace Attendance. (87) When using direct remote access services (e.g. virtual private network – VPN), authorised users should use managed University devices that are kept up to date with current operating system and application software updates, and a current anti-virus solution. (88) Remote access using personal computing devices should use gateway services (e.g. virtual desktop infrastructure – VDI). (89) Direct remote access using personal computing devices must be approved by the Manager, Infrastructure and provisioned via the IT Service Desk. (90) Remote access service usage will be logged and recorded, including user name and logon/off times. (91) Vendor access to the University's computer systems is granted solely for the work commissioned and for no other purposes. (92) Vendors must comply with all applicable University policies, standards and agreements. (93) Vendor agreements and contracts should specify: (94) Approval for vendor remote access should be sought via the system custodian or relevant manager. (95) Privileged level access will be monitored and logged as per ‘Management of privileged access rights’ below. (96) Before accessing University information systems and, unless covered by an existing contract or agreement, an authorised representative of the vendor must sign the Division of Information Technology Vendor Security, Privacy, Copyright and Confidentiality Agreement Form. (97) For vendors using a generic University account for remote access, contracts or agreements must include a requirement to inform the University of vendor staff moves and changes. Passwords must be changed as per the Information Technology Procedure - Passwords. (98) On a regular basis (at least twice a year), asset and system custodians will be required to review and document who has access to their areas of responsibility and the level of access in place. 
This identifies: (99) As part of the evaluation process for new or significantly changed systems, requirements for effective access control should be addressed and appropriate measures implemented. (100) These should consist of a comprehensive security model that includes support for, but not limited to: (101) As part of the selection of cloud service providers specifically, the following access-related considerations must be observed: (102) Addressing these requirements as part of the selection process will ensure that the provisions of this document can be met in the cloud, as well as within on-premise systems. (103) Privileged access rights such as those associated with administrator-level accounts must be identified for each system or network and tightly controlled. In general, ICT support staff and other technical users should not make day-to-day use of user accounts with privileged access; rather, a separate "admin" user account should be created and used only when additional privileges are required. These accounts should be specific to an individual (e.g. "John Smith Admin"). Generic admin accounts must not be used as they provide insufficient identification of the user. (104) Access to admin level permissions should only be allocated to individuals whose roles require them and who have received sufficient training to understand the implications of their use. (105) User accounts must not be used for privileged access in automated routines such as batch or interface jobs or as service accounts. (106) Approval for the granting of privileged access rights, and the authorisation level of those rights, is at the discretion of the identified system or application custodian. (107) Day-to-day management of privileged access rights is the responsibility of the delegated system or application administrator. (108) The activity of privileged accounts should be monitored and logged including but not limited to: (109) Logs should be protected from unauthorised access and modification. (110) Users are bound by the Information Technology Procedure - Passwords. The quality and complexity of passwords created by users must be enforced by controls in the password management system. (111) Access to information and application system functions must be restricted in accordance with the University Information Technology Procedure - Acceptable Use and Access. All information stored on a computer that is sensitive, critical, and/or valuable must be protected by system access controls to ensure that it is not improperly disclosed, modified, deleted, or rendered unavailable. (112) User privileges must be defined so ordinary users cannot gain access to, or otherwise interfere with, either the individual activities or private data of other users. (113) Access to operating systems must use a secure logon process, with physical access to business information system hardware restricted. (114) If any part of the logon sequence for a computer or data communications system is entered incorrectly, the user must only be given feedback that the logon attempt as a whole was unsuccessful. (115) Every logon screen for multi-user computers must include a special banner stating that: (116) A formal password management system must be enforced. This password management system must include various password controls such as, but not limited to: (117) Information systems must not use vendor supplied defaults for system passwords and other security passwords.
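Clauses (110) and (116) require password quality and complexity to be enforced by controls in the password management system. The Python sketch below shows the shape such a control might take; the minimum length and character-class rules are assumptions for illustration, as the authoritative rules are set in the Information Technology Procedure - Passwords.

    import re

    # Minimal sketch of password-quality checks a password management system might
    # enforce. The specific thresholds below are illustrative assumptions; the
    # authoritative rules are in the Information Technology Procedure - Passwords.
    MIN_LENGTH = 14

    def password_problems(candidate: str) -> list:
        """Return a list of reasons the candidate password fails the (assumed) rules."""
        problems = []
        if len(candidate) < MIN_LENGTH:
            problems.append(f"shorter than {MIN_LENGTH} characters")
        if not re.search(r"[a-z]", candidate):
            problems.append("no lowercase letter")
        if not re.search(r"[A-Z]", candidate):
            problems.append("no uppercase letter")
        if not re.search(r"\d", candidate):
            problems.append("no digit")
        if not re.search(r"[^A-Za-z0-9]", candidate):
            problems.append("no symbol")
        return problems

    if __name__ == "__main__":
        # Long passphrase: passes length, fails only the uppercase and digit rules.
        print(password_problems("correct horse battery staple"))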
(118) Users are bound by the University Information Technology Procedure - Passwords requiring users to follow industry best security practices in the selection and usage of passwords. (119) All information system tools and utilities that may be used to either cause significant damage and/or override systems must automatically be restricted to authorised users for intended usage purposes. (120) Access to program source code must be restricted to authorised employees and agents and on a need-to-know basis only. (121) Only managed personal computing devices are to be given privileged network access to critical University information systems. (122) Non-managed mobile devices are not given privileged network access unless exemption has been granted by the Chief Information and Digital Officer or nominee. (123) Policies and supporting security measures should be adopted to manage the risks introduced by using managed devices. Appropriate controls must also be implemented to protect against the risks of working with mobile computing facilities used in unprotected environments. (124) All managed and non-managed mobile devices containing sensitive information must employ storage encryption for all files. (125) The University provides selected authorised users with portable computer equipment so that they may perform their jobs at remote locations. Information stored in University portable computer equipment is University property and may be inspected or analysed in any manner at any time by the University. (126) Similar to University owned equipment, such equipment must be returned to the University on cessation of employment with the University. (127) All authorised users must keep all portable devices containing University information in their possession, unless stored and/or deposited in a secure location. (128) To ensure proper and effective use of cryptography to protect the confidentiality of information in the event of unauthorised access or interception. (129) Data classified as highly confidential and/or confidential/private should be encrypted in transport over the internet as well as in storage. However deciding as to whether a cryptographic solution is appropriate must be part of the wider process of risk assessment and selection of controls. This assessment can then be used to determine: (130) Specialist advice should be sought from the Division of Information Technology ICT Security Team. This is to: (131) If encryption technology is in use, key management must be in place to support the University's usage of cryptographic techniques. Key management policies and procedures must address that all cryptographic keys be protected against: (132) Key management policies and procedures specified to protect keys used for the encryption of sensitive and/or critical data against disclosure and misuse, should include, but not be limited to: (133) To detail the physical security requirements for the University such as computer room requirements, guarding, physical locks, and the security structure of all relevant premises within the offices of the University. (134) The University must define and use an appropriate security perimeter to protect areas such as data centres, which contain information processing facilities. Perimeter security barriers such as walls, card controlled entry gates and/or manned reception desks, should be utilised dependent on the level of physical security required. (135) Data centre physical security must be reviewed at least annually. 
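Clauses (124) and (129) call for sensitive or highly confidential data to be encrypted in storage (and, for transmission over the internet, in transit), with keys managed as described in clauses (131) and (132). As a minimal sketch, assuming the third-party cryptography package is installed, the Python fragment below encrypts a file with the Fernet recipe (AES-based authenticated encryption); the file names are placeholders and the ad hoc key generation is for demonstration only, since real keys must be generated, stored and rotated under the University's key management procedures.

    # Illustrative only: encrypt a file at rest using the third-party "cryptography"
    # package (pip install cryptography). File names are placeholders; the key is
    # generated ad hoc for the demo, whereas real keys must be protected and managed
    # under the key management procedures referred to in Part E.
    from pathlib import Path
    from cryptography.fernet import Fernet

    def encrypt_file(source: Path, destination: Path, key: bytes) -> None:
        """Write an authenticated, encrypted copy of source to destination."""
        destination.write_bytes(Fernet(key).encrypt(source.read_bytes()))

    def decrypt_file(source: Path, key: bytes) -> bytes:
        """Return the decrypted contents of an encrypted file."""
        return Fernet(key).decrypt(source.read_bytes())

    if __name__ == "__main__":
        key = Fernet.generate_key()
        encrypt_file(Path("research-data.csv"), Path("research-data.csv.enc"), key)
        print(decrypt_file(Path("research-data.csv.enc"), key)[:40])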
(136) Remote data centre physical entry controls are governed by the data centre operator contracted to the University to provide data centre services. (137) On-campus physical entry controls are governed by the Facilities and Premises Procedure - Access, Use and Security. (138) Secure areas must be protected by appropriate entry controls to ensure that only authorised users are allowed access. CCTV cameras and/or other access control mechanisms must be used to monitor individual physical access to sensitive areas. Such mechanisms must be protected from tampering and/or disabling. (139) Access logs must be stored for at least three months, unless otherwise either restricted by law or increased by a subsequent standard or guideline. (140) Access to office, computer room, or work areas containing sensitive and/or critical information must be physically restricted, with access only provided to those with a valid business need. Authorised user access lists must be periodically reviewed with access revoked for individuals no longer requiring access. (141) Documented processes and/or procedures for assigning identification cards (e.g. badges) to onsite authorised users and visitors must exist. Such processes and/or procedures must define: (142) Authorised users who can access the identification card generation and/or access control system(s) must be documented and periodically reviewed. (143) Visitor logs for data centre and secure areas must be retained for at least three months and contain the: (144) All authorised users must wear an identification badge on their outer clothing when in data centres and/or facilities that store sensitive and/or critical University data. This identification must be clearly visible and distinguishable between onsite authorised users and visitors. (145) Employees must not permit unknown or unauthorised users access through doors, gates, and/or other entrances to restricted and/or sensitive areas. (146) Controlled areas should be created to protect offices, rooms, and facilities that should not be open to general or public access. All employees must ensure doors to sensitive areas, rooms, and/or information processing facilities are locked, preventing unauthorised access when not in use. Physical and/or logical controls must be implemented, restricting access to publicly accessible network connection points. (147) Information systems must be housed in a secure manner, protected from external and environmental threats to the premises. Such threats include, but are not limited to: (148) Data centres must have: (149) Additional controls and guidelines for working in sensitive areas must be used to enhance the security provided by the physical controls protecting the secure areas. Access to sensitive areas must be authorised and based on individual job functions with access revoked immediately upon termination. (150) Delivery and loading areas should be controlled, and where possible, isolated from information processing facilities to avoid unauthorised access. (151) On-premise equipment must be located and/or protected to reduce the risks from environmental threats, hazards, and opportunities for unauthorised access. (152) Physical access to networking and/or communications hardware must be restricted by appropriate physical controls. All business critical production computer systems including, but not limited to servers, firewalls, proximity access control, systems, and/or voice mail systems must be physically located within a secure data centre. 
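Clauses (139) and (143) require physical access logs and visitor logs to be retained for at least three months. The Python sketch below shows a simple retention guard that refuses to purge entries younger than that window; the 92-day figure and the entry format are simplifying assumptions for illustration.

    from datetime import datetime, timedelta
    from typing import Optional

    # Sketch of a retention guard for physical access / visitor log entries.
    # The guideline requires these logs to be kept for at least three months;
    # the 92-day figure and the entry format are illustrative assumptions.
    MINIMUM_RETENTION = timedelta(days=92)

    def safe_to_purge(entry_timestamp: datetime, now: Optional[datetime] = None) -> bool:
        """An entry may be purged only once it is older than the minimum retention period."""
        return ((now or datetime.now()) - entry_timestamp) > MINIMUM_RETENTION

    if __name__ == "__main__":
        print(safe_to_purge(datetime(2024, 1, 5, 9, 30), now=datetime(2024, 2, 1)))  # False: too recent
        print(safe_to_purge(datetime(2023, 9, 1, 8, 0), now=datetime(2024, 2, 1)))   # True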
(153) Authorised users should be encouraged to report detection of tampering and/or the substitution of devices. Training should include, but not be limited to: (154) Key ICT equipment must be protected from power failures and surges and other electrical anomalies. Uninterruptible power supply (UPS) systems, line conditioners, electrical power filters, and/or surge suppressors must be used for business critical ICT infrastructure. (155) Critical supporting utilities must be tested on a regular basis to ensure equipment has adequate capacity, in accordance with the manufacturer's recommendations. (156) Power and telecommunications cabling carrying data and/or supporting information services must be protected from interception or damage. Installation and maintenance of power and telecommunication cabling must follow current industry security standards. (157) Equipment sent off-site for maintenance purposes must have any sensitive or confidential information erased to ensure the confidentiality and integrity of information. (158) Users must ensure that unattended equipment has appropriate protection and/or security controls in place. If the computer system to which an authorised user is connected contains sensitive information, the authorised user must not leave their personal computer, workstation, or terminal unattended without locking or logging out. (159) Unless information is in active use by authorised users, desks must be clear and clean during non-working hours with sensitive information locked away. (160) Operations security aims to: (161) Operating procedures for information systems must be documented and maintained. Operating procedures must include, but not be limited to: (162) Non-standard changes to information processing facilities and systems must be documented and controlled via the University Change Advisory Board (CAB). Extensions, modifications, and/or replacements to production software and hardware must be performed only when approval from CAB has been received prior to the proposed change window start time. (163) A change control procedure for all changes is defined and includes the: (164) Risk assessments and/or vulnerability assessments must be conducted when implementing new systems or making significant changes to existing systems. (165) Adequate rollback procedures must be developed for all changes to production systems, ensuring that in the case of a change failure information processing can be promptly restored to its state prior to the most recent change. (166) All changes to information processing facilities must be communicated to all relevant authorised users. Changes to the environment may also trigger the requirement to perform specific security tests, including, but not limited to, vulnerability assessments and penetration tests confirming that changes made have not inadvertently degraded the security profile of the University. (167) Capacity demands must be monitored with projections of future capacity requirements made to ensure that adequate processing power, storage and other required resources are available. (168) Facilities and functions used in the development and testing of computing solutions must be kept strictly separate from production systems. This reduces the likelihood of accidental and/or unauthorised changes to production systems that could create operational problems and/or compromise the University's related information.
(169) Separation can be achieved through physical or logical separation, appropriate to the sensitivity of the information and/or functions of the system concerned. (170) Detection and prevention controls to protect against malware and appropriate user awareness procedures must be implemented. Approved anti-malware software must be deployed across the University network to all systems, remain enabled, and contain regular definition updates and scanning. (171) Anti-malware solutions and/or other appropriate controls should also be implemented and configured to prevent or detect the use of: (172) Anti-malware mechanisms must be confirmed as actively running and cannot be disabled or altered by users unless specifically authorised by management on a case-by-case basis for a limited period. (173) Systems that are malware-infected must be disconnected from the network until such time when the anti-malware software has been updated and all malware eradicated. (174) Appropriate content filtering mechanisms must be deployed to protect all user initiated connections. Formal measurement and reporting procedures must also be implemented to record the number and severity of actual or suspected malicious code incidents. (175) The University must ensure backup facilities are provided and used. Backup strategies are developed in collaboration with system custodians and contain copies of essential business information and software. All sensitive, valuable, and/or critical information recorded on backup computer media and stored outside University offices must be given an appropriate level of physical and environmental protection. Backup and restore procedures must be securely and adequately documented. (176) Critical business information and critical software archived on computer storage media for prolonged periods must be tested at least annually providing assurance that such data can be completely and efficiently recovered. (177) Refer to Appendix C for backup details. (178) Audit logs must be produced for business critical systems, showing: (179) Audit trails must be retained for at least one year. Audit trails must be implemented to link all system component access to each individual user. For event reconstruction, audit trails should contain event information that may include, but not limited to: (180) Audit trail entries recorded for all system components for each event must contain, but not be limited to: (181) All system and application logs must be maintained in a form such that logs cannot be accessed by unauthorised users and are stored in a secure manner. Authorised users must have a readily demonstrable need for such access to perform their regular duties. All other authorised users seeking access to these logs must first obtain approval. (182) Audit trails must be secured and promptly backed up to a centralised log server or to media that is difficult to alter. Only individuals with job-related needs may have access to view audit trail files. (183) Log information should be exported to a security information and event monitoring solution for automated analysis, alerting and actioning. (184) Activities of administrators and operational staff must be logged. These logs must be kept for at least one year with a minimum of three months available online and not be altered by anyone. These logs must be subject to regular and independent checks. (185) Time synchronisation technology such as the Network Time Protocol (NTP) must be used to synchronise all critical system clocks, dates, and times. 
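Clauses (178) to (183) require audit trail entries for business critical systems to identify the user and the event and to be forwarded to a protected, centralised log server or security information and event monitoring solution. The Python sketch below builds one structured log line of that kind; the particular fields shown are common examples rather than the mandated set, which is defined in the lists referenced above.

    import json
    from datetime import datetime, timezone

    # Sketch of a structured audit trail entry for a business critical system.
    # The exact fields required by the guideline are listed elsewhere in the ISMS
    # documentation; the fields shown here are common examples, not the mandated set.
    def audit_entry(user_id: str, event: str, success: bool, origin: str, resource: str) -> str:
        """Build a single JSON log line suitable for forwarding to a central log server or SIEM."""
        return json.dumps({
            "timestamp": datetime.now(timezone.utc).isoformat(),  # system clock synchronised via NTP
            "user_id": user_id,
            "event": event,
            "success": success,
            "origin": origin,
            "resource": resource,
        })

    if __name__ == "__main__":
        print(audit_entry("jsmith-admin", "privileged_logon", True, "10.0.0.15", "finance-db"))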
(186) Updating of operational program libraries must only be performed by nominated administrators or system custodians following appropriate authorisation. Configuration management processes should be used to track and control all implemented software as well as related system documentation. Changes to operational systems must undergo the formal change management processes. (187) Systems must be in place for the timely gathering of information about technical vulnerabilities of information systems. Processes must be established to identify new security vulnerabilities using reputable outside sources for security vulnerability information. (188) The University's exposure to these vulnerabilities must be evaluated with appropriate measures taken to address the associated risk across the University. (189) All patches and security updates should be pushed out in a formalised and secure manner, with all critical or high-ranking patches installed within one month of release by the vendor or other approved third party. All remaining applicable vendor-supplied security patches should be applied within an appropriate period (e.g. within three months). (190) For internet-accessible web applications, new threats and vulnerabilities must be addressed on an ongoing basis to ensure such applications are protected against known attacks. The installation of an automated technical solution that detects and prevents web-based attacks (e.g. a Web Application Firewall (WAF)) in front of public-facing web applications will allow continual checks of all traffic and generate alerts where applicable. (191) A documented process to review the security of public-facing web applications using either manual or automated tools or methods must exist. This process must be reviewed: (192) Such web applications must be re-evaluated after remediation actions have been completed. (193) An authorised user shall not introduce software or technology designed to disrupt, corrupt or destroy programs and/or data, or sabotage University ICT facilities as per the Information Technology Procedure - Acceptable Use and Access. (194) Access rights for the installation of software should follow the principle of least privilege. (195) Audits of operational systems must be planned and agreed upon to minimise the risk of disruptions to business processes. Audit requirements, scope and access other than read-only must be authorised by the University with adequate resources provided. Procedures, requirements and responsibilities must be documented with all access monitored and logged. (196) To ensure the protection of information in networks and its supporting information processing facilities. (197) Controls must be implemented to achieve and maintain performance, reliability and security in networks, inclusive of information in transit. Firewalls must be installed at each internet connection, between any de-militarised zone (DMZ) and/or intranets, and between any wireless network. (198) Configuration standards must be documented, implemented, updated and referenced for the installation and administration of all firewalls and routers. Rule sets and/or access control lists (ACLs) of firewalls and/or routers must be reviewed at least every six months. Configuration standards must include a description of groups, roles, and responsibilities for management of network components. They must also include a list of all services, protocols and ports necessary for business, including business justifications for protocols considered to be insecure.
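Clause (189) sets the patch installation windows: critical or high-ranking patches within one month of release and the remainder within an appropriate period such as three months. The Python sketch below computes due dates and flags overdue patches; treating the windows as 30 and 90 days is a simplifying assumption for illustration.

    from datetime import date, timedelta

    # Sketch of the patch installation windows described in clause (189): critical or
    # high-ranking patches within one month of release, the remainder within an
    # appropriate period (e.g. three months). Using 30/90 days is a simplification.
    WINDOWS = {"critical": timedelta(days=30), "high": timedelta(days=30), "other": timedelta(days=90)}

    def patch_due_date(released: date, severity: str) -> date:
        """Date by which the patch should be installed, based on its severity."""
        return released + WINDOWS.get(severity.lower(), WINDOWS["other"])

    def is_overdue(released: date, severity: str, today: date) -> bool:
        return today > patch_due_date(released, severity)

    if __name__ == "__main__":
        print(patch_due_date(date(2024, 3, 1), "critical"))             # 2024-03-31
        print(is_overdue(date(2024, 3, 1), "other", date(2024, 5, 1)))  # False: still inside the 90-day window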
(199) All data transmitted over open public networks must be secured using strong cryptography and security protocols including, but not limited to TLS (transport layer security), and/or IPsec. A process should also be specified for: (200) Configurations and related parameters on all hosts attached to the University network must comply with current policies and standards. Security risk assessments must be conducted as a part of network design processes with such assessments carried out when introducing new network services or making significant changes to existing services. (201) All administrative access must be encrypted using security protocols, including but not limited to SSH (Secure Shell), VPN, or TLS. This applies to both web-based management and other administrative access. Responsibilities and procedures for the management of the network must also be established and documented. (202) Network diagrams and configurations including connections to other systems and networks must be maintained and kept current. Network diagrams must identify all connections between the environments containing critical and/or sensitive data and other networks including any wireless networks. (203) For critical business systems, methods to obscure IP addressing must be in place to prevent the disclosure of the private IP addresses and routing information from internal networks to the Internet. Such methods may include, but are not limited to: (204) Personal firewall software must be installed and active on any managed or non-managed devices used to access University’s network infrastructure that also connects to the Internet when outside of the University network. Personal security software must not be alterable by users of managed or non-managed devices. Security policies and operational procedures for managing firewalls should be documented, in use, and known to all affected parties. (205) Clear descriptions of security attributes, service levels and management requirements for all network services used by the University must be provided, inclusive of service level agreements and monitoring for services provided in-house or outsourced. (206) Controls must be implemented in networks to segregate groups of information services, users and information systems. Security risk assessments must be conducted as a part of network design processes with such assessments carried out regularly on data networks. (207) When transferring confidential or private information outside of the University, procedures and controls must be developed and implemented to protect the exchange, confidentiality and integrity of information. (208) Formal agreements must be established for the electronic and/or manual exchange of information between the University and other organisations, third parties or clients. (209) All employees of the University with access to email facilities are bound by the Information Technology Procedure - Acceptable Use and Access. (210) Where no contractual agreement exists, confidentiality and/or non-disclosure agreements reflecting the needs of the University for the protection of information should be used, regularly reviewed and documented. (211) To detail the specific criteria around the acquisition, development and maintenance of information systems. (212) University system custodians and project managers must take security into consideration during all stages of system application development for both in-house and outsourced software development. 
The Division of Information Technology must be consulted on security requirements and specifications in the initial stages of project development. All software development efforts are to follow the defined system and/or software development life cycle (SDLC) processes, with security measures defined at every stage of the process. (213) Statements of business requirements for new information systems or enhancements to existing information systems must specify the requirements for information security controls. Risk assessment and risk management must be the base framework for analysing information security requirements and control identification. Security requirements and controls should reflect the business value of the information assets involved and the potential business impact or loss that may result from a failure or absence of security. (214) Information involved in application services traversing public networks should be protected from fraudulent activity and unauthorised disclosure and modification using strong encryption methods. (215) Information involved in application integration services should be protected to prevent: (216) Rules for the development of software and systems should be established and applied to all developments within the University. (217) The implementation of changes must be controlled using the University Information Technology Infrastructure Library (ITIL) change management procedures. The change management procedures must also be used for the testing and implementation of security patches and software modifications. (218) When hosting environments and/or operating systems are significantly changed, business critical applications must be reviewed and tested to ensure there are no adverse impacts or information security risks to the application. (219) Modifications to vendor-supplied software packages must be discouraged and/or limited to necessary changes. All changes must be strictly controlled, documented and approved via change control procedures. (220) Guidelines for engineering of secure systems should be documented, maintained, reviewed and applied to any information system implementation efforts. (221) Virtualisation of systems can deliver increased operational efficiency in terms of hardware, network, storage and utilities usage. The security requirements of all virtualisation components must be considered. (222) Most security vulnerabilities and threats apply equally to virtualised and physical environments; however, virtualisation may introduce additional security implications. (223) All elements of a virtualisation solution must be secured and security maintained through software updates, configuration reviews and security testing. (224) Administrator access to the hypervisor must be restricted, managed and monitored. (225) Hypervisors and guest operating systems should be monitored for indicators of compromise. (226) The University should establish and appropriately protect secure development environments for system development and integration efforts, encompassing the entire system development lifecycle. (227) Privileged access to development and test environments should use different credentials from those used to access production environments. (228) When outsourcing software development projects, the University should consider: (229) Testing of application security functionality should be integrated into development processes.
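Clauses (199) and (214) require information traversing public networks to be protected with strong cryptography such as TLS. The sketch below opens a TLS connection using Python's standard library ssl module with certificate verification and a minimum protocol version; the host name is only an example, and a real application service would normally sit behind an HTTPS client library rather than a raw socket.

    import socket
    import ssl

    # Sketch of opening a TLS-protected connection for information traversing a
    # public network, using the standard library's default certificate verification.
    # The host name is an example only.
    def open_tls_connection(host: str, port: int = 443) -> ssl.SSLSocket:
        context = ssl.create_default_context()             # verifies certificates and host names
        context.minimum_version = ssl.TLSVersion.TLSv1_2   # refuse legacy protocol versions
        sock = socket.create_connection((host, port), timeout=10)
        return context.wrap_socket(sock, server_hostname=host)

    if __name__ == "__main__":
        with open_tls_connection("www.csu.edu.au") as tls:
            print(tls.version())  # e.g. "TLSv1.2" or "TLSv1.3"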
(230) Acceptance criteria for new information systems, upgrades and new versions must be established with suitable tests of the system carried out prior to acceptance. Requirements and criteria for acceptance of new systems must be clearly defined, agreed, documented and tested. (231) Test data should be selected carefully and protected. Data masking or obfuscation should be used if using production data in development or test environments. (232) Test data, applications and related systems must be protected from unauthorised access and modifications. The use of operational databases containing sensitive and/or critical production data/information for testing purposes must be avoided. (233) Confidential information such as Personally Identifiable Information (PII) used for testing purposes should be protected by the removal and/or modification of all sensitive details and content. Production data must not be used for testing or development. (234) To ensure protection of the University's assets accessible by suppliers. (235) Information security requirements for mitigating the risks associated with supplier access to University assets should be agreed with the supplier and documented. (236) Any authorised users contemplating the use of cloud-based services, or the transfer or storage of University information externally, will first engage the services of the Division of Information Technology to ensure that the resulting solution is viable and secure. (237) The procurement of cloud services should be undertaken in accordance with the NSW Government Cloud Policy and the Division of Information Technology Project Security Considerations. (238) All relevant information security requirements should be established and agreed upon with each supplier that may access, process, store, communicate, and/or provide ICT infrastructure components for the information of the University. (239) Contractual agreements should consider and include the security considerations as described in the NSW Government Cloud Policy and the Division of Information Technology Project Security Considerations. (240) An established process for engaging service providers, including proper due diligence prior to engagement, should exist. The University must maintain a list of service providers along with a written agreement that includes an acknowledgement by the service provider of their responsibility for securing critical and/or sensitive data that the service provider possesses, or otherwise stores, processes, or transmits on behalf of the University. (241) A formal review of service provider contracts and the compliance status of their security standards should be conducted when significant change occurs or on a regular basis as determined necessary for each provider. The frequency and depth of the review is based on the level of risk, the services provided and the classification of the information handled by the third party, ensuring that secure management practices are in place. (242) Service providers should provide regular reports on the status of the services delivered to the University. (243) The University should review these reports regularly to ensure adherence to agreements. (244) When required, service providers should allow the University, or an entity on its behalf, to audit the service provider's facilities, networks, computer systems and procedures for compliance in accordance with the agreed information security policies and standards. (245) Changes to the services provided by third parties are required to be managed.
These include but are not limited to: (246) The management of changes should be in line with formal change control procedures. This includes consideration of business system criticality and processes involved for the re-assessment of risks. (247) To detail clear definitions of the types of incidents that are likely to be encountered and document a plan for corrective action. (248) A consistent and effective approach should be applied to the management of information security incidents. Incident management responsibilities and procedures must be established to ensure quick, effective and orderly responses to security incidents and software malfunctions. (249) Individuals responsible for handling information security incidents must provide accelerated problem notification, damage control, and problem correction services in the event of computer related emergencies such as virus outbreaks and intrusions. (250) Individuals responsible for handling information systems security incidents must have clearly defined responsibilities and be provided the authority to handle incidents and create security incident reports. (251) The Division of Information Technology (DIT) is responsible for defining and operating a critical incident response process. (252) Information security events associated with information systems must be reported to the IT Service Desk. (253) A formal reporting and incident response procedure must be established for all breaches of information security, actual or suspected. (254) All authorised users must be made aware of the procedure for reporting security incidents and the need to report incidents immediately. (255) The DIT's critical incident response process must be tested at least annually and testing procedures must be in place. (256) Any observed or suspected security weaknesses in, or threats to, systems or services must be reported to the IT Service Desk. All authorised users must be made aware of this and be instructed not to attempt to deliberately exploit suspected vulnerabilities. (257) Information security events should be assessed to determine whether they are to be classified as information security incidents. (258) Information security incidents should be responded to by the DIT ICT Security Team including other relevant authorised users of the University and/or external parties. (259) Mechanisms must be in place to enable the types, volumes and costs of incidents and malfunctions to be quantified and monitored. An annual analysis of reported information security problems and violations must be prepared. Knowledge gained from analysing and resolving information security incidents should be used to reduce the likelihood or impact of future incidents. (260) Where action against an individual or organisation involves the law, either civil or criminal, the evidence presented must conform to the rules for evidence laid down in the relevant law or in the rules of the specific court in which the case will be heard. This must include compliance with any published current standard and/or code of practice to produce admissible evidence. (261) Information systems continuity should be embedded within the University's business continuity management systems. (262) A managed process regarding information systems availability requirements must be in place for the development and maintenance of business continuity throughout the University. Plans based on appropriate risk assessments must be developed as part of the overall approach to the University’s business continuity. 
This is described in the Information Technology Policy. The ICT Security Team in conjunction with the Division of Information Technology leadership oversee the implementation of this. The extent of such plans is dependent on the delivery of a business impact analysis (BIA). (263) Such BIA must result in the specification of the: (264) Plans must be developed to maintain or restore business operations in a timely manner following a disaster or crisis. (265) The Information Technology Service Continuity Management (ITSCM) team must prepare and update a crisis management plan, covering topics such as: (266) The ITSCM team will develop, implement and test business continuity and/or disaster recovery plans. Such plans must specify how alternative facilities such as, but not limited to telephones, systems, and networks, will be provided for authorised users to continue operations in the event of an interruption to, or failure of, critical business processes. (267) All business continuity plans must consider the information security requirements of the University. (268) The University should verify established and implemented information security continuity controls at reoccurring intervals ensuring such controls are valid and effective during adverse situations. Business continuity plans must be tested regularly by respective teams as outlined in the ITSCM and undergo regular reviews ensuring such plans are up to date and effective. (269) Appropriate redundancy, as determined by required service levels, must be in place for critical systems and assets ensuring the availability of information processing facilities. (270) The University should identify business requirements for the availability of information systems. Where the availability cannot be guaranteed using existing system architecture, redundant components and/or architectures should be considered. (271) Where applicable, redundant information systems should be tested ensuring successful failover between components and/or other functions as intended. (272) To avoid breaches of legal, statutory, regulatory, and/or contractual obligations related to information security and/or other security requirements. (273) For every University production information system, all relevant statutory, regulatory and contractual requirements must be identified. This includes but is not limited to: (274) Appropriate procedures must be implemented to ensure compliance with legal restrictions on the use of material in respect of intellectual property rights and on the use of proprietary software products. (275) All computer programs and program documentation owned by the University must include appropriate copyright notices. (276) University records must be protected from loss, destruction, and falsification. Refer to the Records Management Policy, Privacy Management Plan and the Information Technology Procedure - Personal Data Breach. (277) The University must conduct information security reviews to ensure that information security controls are implemented and operated in accordance with the University’s policies and procedures. (278) The University's approach to managing information security and its implementation (e.g. control objectives, controls, policies, processes and procedures) should be reviewed independently at planned intervals or when significant changes occur. (279) The University must ensure that all security procedures within their areas of responsibility are carried out correctly. 
(272) To avoid breaches of legal, statutory, regulatory and/or contractual obligations related to information security and/or other security requirements.
(273) For every University production information system, all relevant statutory, regulatory and contractual requirements must be identified. These include but are not limited to:
(274) Appropriate procedures must be implemented to ensure compliance with legal restrictions on the use of material in respect of intellectual property rights and on the use of proprietary software products.
(275) All computer programs and program documentation owned by the University must include appropriate copyright notices.
(276) University records must be protected from loss, destruction and falsification. Refer to the Records Management Policy, Privacy Management Plan and the Information Technology Procedure - Personal Data Breach.
(277) The University must conduct information security reviews to ensure that information security controls are implemented and operated in accordance with the University's policies and procedures.
(278) The University's approach to managing information security and its implementation (e.g. control objectives, controls, policies, processes and procedures) should be reviewed independently at planned intervals or when significant changes occur.
(279) The University must ensure that all security procedures within its areas of responsibility are carried out correctly. Areas within the University that must be subject to regular reviews to ensure compliance with security policies, procedures and standards include, but are not limited to:
(280) Variances from generally accepted information system control practices must be noted and corrective action promptly initiated.
(281) Technical compliance should be reviewed with the assistance of automated tools that generate technical reports for subsequent interpretation by technical specialists. Alternatively, manual reviews by experienced system engineers, supported by appropriate software tools, may be performed.
(282) Any technical compliance reviews, such as penetration tests or vulnerability assessments, should only be carried out by competent authorised users and/or under the supervision of such users.
(283) The operation of the ISMS will be monitored in accordance with the University ISMS performance monitoring process.
(284) Cyber security benchmarking should be conducted to ensure that the University's cyber security posture remains aligned with the evolving threat landscape and sector opportunities.
(285) With reference to industry benchmarks, frameworks, standards and best practices, the University will identify gaps and vulnerabilities in deployed security controls and develop remediation plans to address these issues as part of an ongoing continuous improvement program.
(286) Benchmarking reviews will be carried out by competent authorised parties and will include the following elements:
(287) This guideline uses terms defined in the Information Technology Policy, as well as the following: