Introduction
An assessment's planning and scoping stages are essential for structuring the process and ensuring that the security insights gained accurately represent the environment being assessed. During these stages, it’s vital to take into account the context in which an organisation operates, including its threat landscape. Here, the threat landscape refers to the primary threats the organisation faces, such as insider or external threats, and the level of sophistication behind them.
Assessors should examine the organisation’s policies and procedures, conduct comprehensive tests of technical controls pertinent to each strategy, and assess their effectiveness. Determining the organisation's desired maturity level is key to guiding the assessment and establishing the appropriate scope and methods.
The type and quality of evidence collected will influence the assessment outcomes, so it’s critical to ensure that the evidence gathered is of high quality and reliability. This will underpin the report’s conclusions and recommendations.
When there are mandatory requirements for implementing the Essential Eight, an assessment is needed to attest to the level of maturity of the organisation’s cyber security controls. The assessment process, however, is intended to provide an organisation with actionable insights. For this reason, organisations that do not have a mandated requirement will still find regular assessment helpful as a way to identify improvements.
Non-corporate entities within the Australian Government are typically required to achieve Maturity Level Two within the broader context of the mandatory Protective Security Policy Framework (PSPF).
The following sections describe the four assessment stages.
Stage 1 - Assessment planning and preparation
During this stage, pre-planning is undertaken to build a contextual overview of the organisation and the threat landscape in which it operates. The assessor will aim to gain an understanding of the infrastructure, the teams the assessor will need to interact with, and the skills required to complete the assessment.
As part of planning, the assessor should discuss the following with the asset owner:
Asset classification and the assessment scope.
Requirements around access to low- and high-privileged user accounts, devices, documentation, personnel, and facilities.
Any approvals required to run scripts and tools within the environment.
Evidence collection and protection requirements, including after the conclusion of the assessment.
Final approval to use tools and scripts on sample systems, servers and networks.
Requirements for where the assessment report will be developed (e.g. on the organisation’s system or externally).
How stakeholder engagement and consultation will be approached, including confirming key points of contact.
Whether any managed service providers support or manage aspects of the system(s) under assessment and, if so, the appropriate points of contact.
Copies of any previously completed assessment reports for the system.
Agreement on the appropriate use, retention and marking of the assessment report by both parties.
At the end of this stage, the assessor should have developed the assessment test plan.
Stage 2 - Assessment scoping
Different target maturity levels affect different aspects and components of the assessment. During this stage, the assessor should become familiar with the requirements for the target maturity level so that the assessment approach and test plan can be adjusted accordingly.
The Essential Eight should be implemented and assessed as a package. If a system has not previously been assessed and demonstrated to meet Maturity Level One, that system should not be assessed for Maturity Level Two. Likewise, a system should be assessed and demonstrated to meet Maturity Level Two before being assessed for Maturity Level Three.
As part of determining the appropriate assessment approach, the assessor should conduct the following activities:
Make use of asset registers that describe the environment to determine the applicability of the Essential Eight.
Conduct workshops with the system owners to identify and agree on the precise assessment scope, including out-of-scope items.
Agree with system owners on the assessment duration and milestones.
Obtain an approximate breakdown of the operating systems used within the environment.
Determine the sample size needed to accurately represent all in-scope assets and asset types (a worked sketch follows this list).
Document any assessment limitations, including sample sizes and constraints, in the assessment report.
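No particular sampling method is prescribed here; the sketch below shows one way an assessor might derive an operating system breakdown and a proportionally stratified sample from an asset register. It assumes a hypothetical export named assets.csv with hostname and operating_system columns; the file name, column names and 10% sample fraction are illustrative assumptions only.

```python
import csv
import math
from collections import Counter

# Hypothetical asset register export; the file and column names are
# assumptions for illustration, not a prescribed format.
ASSET_REGISTER = "assets.csv"   # columns: hostname, operating_system
SAMPLE_FRACTION = 0.10          # assessor-chosen coverage; at least one asset per OS type

def os_breakdown(path: str) -> Counter:
    """Count in-scope assets per operating system."""
    with open(path, newline="") as f:
        return Counter(row["operating_system"] for row in csv.DictReader(f))

def stratified_sample_sizes(breakdown: Counter, fraction: float) -> dict[str, int]:
    """Proportional sample per OS type, never fewer than one asset per type."""
    return {os_name: max(1, math.ceil(count * fraction))
            for os_name, count in breakdown.items()}

if __name__ == "__main__":
    breakdown = os_breakdown(ASSET_REGISTER)
    samples = stratified_sample_sizes(breakdown, SAMPLE_FRACTION)
    for os_name, count in breakdown.most_common():
        print(f"{os_name}: {count} assets, sample {samples[os_name]}")
```

Whatever method is used, the resulting sample sizes and any sampling constraints should be documented in the assessment report, as noted above.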
Evidence quality
Assessments should strive to gather and use the highest quality evidence available to support conclusions about the effectiveness of controls. Evidence quality requirements should be considered and discussed at this stage.
It’s important to use a mix of qualitative and quantitative techniques, as these will often complement each other and allow for cross-referencing. Qualitative techniques may include reviewing documentation and interviewing system administrators. Quantitative techniques could include reviewing system configurations or utilising tools and scripts.
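Tool output and configuration exports collected as quantitative evidence are easier to rely on later if their integrity is preserved. The sketch below is one possible approach, not part of the assessment process itself: it records a SHA-256 hash and UTC timestamp for each artefact in a hypothetical evidence folder (the folder layout and file names are assumptions).

```python
import csv
import hashlib
from datetime import datetime, timezone
from pathlib import Path

EVIDENCE_DIR = Path("evidence")           # hypothetical folder of collected artefacts
MANIFEST = EVIDENCE_DIR / "manifest.csv"  # manifest supporting evidence integrity

def sha256(path: Path) -> str:
    """Hash an artefact so later tampering or corruption is detectable."""
    digest = hashlib.sha256()
    with path.open("rb") as f:
        for chunk in iter(lambda: f.read(65536), b""):
            digest.update(chunk)
    return digest.hexdigest()

def write_manifest() -> None:
    """Record the name, hash and collection time of every artefact."""
    with MANIFEST.open("w", newline="") as f:
        writer = csv.writer(f)
        writer.writerow(["file", "sha256", "collected_utc"])
        for artefact in sorted(EVIDENCE_DIR.glob("*")):
            if artefact == MANIFEST or not artefact.is_file():
                continue
            writer.writerow([artefact.name, sha256(artefact),
                             datetime.now(timezone.utc).isoformat()])

if __name__ == "__main__":
    write_manifest()
```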
When conducting assessments, the quality of evidence can typically be categorised as follows:
Stage 3 - Assessment of controls
At this stage, the effectiveness of the controls within the Essential Eight is tested against the target maturity level.
The ACSC has developed standardised assessment outcomes, which must be used.
Each control can be assessed as follows:
Effective: The organisation effectively meets the control’s objective.
Ineffective: The organisation is not adequately meeting the control’s objective.
Alternate control: The organisation effectively meets the control’s objective through an alternate control.
Not assessed: The control has not yet been assessed.
Not applicable: The control does not apply to the system or environment.
No visibility: The assessor was unable to obtain adequate visibility of a control’s implementation.
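Where findings are captured in tooling or a working spreadsheet, the standardised outcomes above can be recorded consistently. The sketch below is one possible data model, written in Python; the class and field names are illustrative and not an ACSC-prescribed format.

```python
from dataclasses import dataclass
from enum import Enum

class Outcome(Enum):
    """The standardised assessment outcomes listed above."""
    EFFECTIVE = "Effective"
    INEFFECTIVE = "Ineffective"
    ALTERNATE_CONTROL = "Alternate control"
    NOT_ASSESSED = "Not assessed"
    NOT_APPLICABLE = "Not applicable"
    NO_VISIBILITY = "No visibility"

@dataclass
class ControlFinding:
    """A single finding against one control for the target maturity level."""
    control_id: str           # identifier from the assessment test plan (illustrative)
    mitigation_strategy: str  # e.g. "Application control"
    outcome: Outcome
    evidence_summary: str     # what evidence supports the outcome, and its quality

# Hypothetical example:
finding = ControlFinding(
    control_id="AC-01",
    mitigation_strategy="Application control",
    outcome=Outcome.EFFECTIVE,
    evidence_summary="Assessor attempted to run an unapproved executable; execution was blocked.",
)
```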
Importantly, the Essential Eight Maturity Model makes no provision for accepting risk without commensurate compensating controls. If a system owner has accepted a risk without implementing compensating controls, the associated mitigation strategy must not be considered implemented.
When evaluating the efficacy of compensating controls, the assessor should verify that the level of protection they offer is commensurate with that prescribed by the Essential Eight for the level of adversarial tradecraft associated with the target maturity level.
Stage 4 - Development of the assessment report
In the final stage, the assessor will develop the security assessment report.
Understanding maturity levels
The report will contextualise the assessment against the Maturity Model. The Maturity Model contains four levels that provide a way for an organisation to measure its progress in implementing the Essential Eight while also identifying areas for improvement.
There are three target maturity levels, based on increasingly sophisticated levels of adversarial tradecraft. Maturity Level Zero designates instances where the requirements of Maturity Level One are not met.
At Maturity Level Zero, weaknesses exist in the organisation’s overall cyber security posture. This is also the default starting position where no previous assessment has been conducted.
At Maturity Level One, the focus is on protection against malicious actors who are content to simply leverage widely available tradecraft. This level of maturity does not offer protection against advanced persistent threat (APT) tradecraft or other persistent threats, including insider threats.
At Maturity Level Two, a level of protection is reached that is sufficient to mitigate threats from malicious actors willing to invest more time in a target and in the effectiveness of their tools.
At Maturity Level Three, the focus is on threats that are more adaptive and much less reliant on public tools and techniques, such as state-sponsored actors, military operations, and other APTs.
Report validity
The assessment report has no expiry date. Theoretically, an assessment could remain valid indefinitely, but assessors should be cautious about relying on older reports and should consider performing a gap analysis to identify any deviations arising from subsequent changes to the Essential Eight or from changes within the environment itself.
Treatment and exceptions
The use of exceptions for a system needs to be documented and approved by an appropriate authority through a formal process. The appropriate authority may be defined in the broader PSPF for government entities.
Documentation for exceptions should include the scope and justification for the exception, as well as the following details of the compensating controls:
Reason, scope, and justification for compensating controls.
Anticipated implementation lifetime of the compensating control(s).
The review schedule for the compensating control(s).
The system risk rating before and after the compensating control was implemented.
Any caveats around the use of the system because of the exception.
The formal acceptance from the appropriate authority of any residual risk for the system.
The date when the need for the exception will next be considered by the appropriate authority, noting that exceptions should not be approved beyond one year.
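Where exceptions are tracked in a register, the details above can be captured as a structured record. The sketch below assumes a simple Python data model; the field names are illustrative and not a PSPF- or ACSC-prescribed format.

```python
from dataclasses import dataclass
from datetime import date, timedelta

@dataclass
class ExceptionRecord:
    """One approved exception and its compensating controls, mirroring the details above."""
    system: str
    exception_scope: str              # scope and justification for the exception
    compensating_controls: list[str]  # reason, scope and justification for each control
    implementation_lifetime: str      # anticipated lifetime of the compensating control(s)
    review_schedule: str              # how often the compensating control(s) are reviewed
    risk_rating_before: str           # system risk rating before the compensating control(s)
    risk_rating_after: str            # system risk rating after the compensating control(s)
    usage_caveats: str                # caveats on use of the system because of the exception
    residual_risk_accepted_by: str    # appropriate authority accepting the residual risk
    next_review_date: date            # when the exception will next be considered

    def review_within_a_year(self, approved_on: date) -> bool:
        """Exceptions should not be approved beyond one year."""
        return self.next_review_date <= approved_on + timedelta(days=365)
```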