Checklist metadata validation and checklist mapper severities #2750

Merged Jul 31, 2024

Commits (37):
0e7a273  input validation for checklist metadata (kemley76, Jun 6, 2024)
0d5fd2c  use hdf-converters in hdf2ckl (kemley76, Jun 7, 2024)
f0a2dac  updated hdf2ckl tests (kemley76, Jun 7, 2024)
71d53c7  update tests based on changes to ckl mapper (kemley76, Jun 28, 2024)
f1d09e3  update ckl metadata validation to use hdf-converters helper function (kemley76, Jul 1, 2024)
dd3fbd3  added ability to use local install of inspecjs (kemley76, Jul 9, 2024)
1999bd5  update checklist commands and tests (kemley76, Jul 9, 2024)
0ff6be4  ensure threshold counts stay based off impact (kemley76, Jul 9, 2024)
69e94c0  added tests to ensure that converting with invalid metadata display a… (kemley76, Jul 10, 2024)
5d2ffda  use checklist types from hdf-converters (kemley76, Jul 10, 2024)
16e1bc6  remove redundant code in hdf2ckl command (kemley76, Jul 10, 2024)
4cfe734  use inspecJS to convert impact to severity (kemley76, Jul 10, 2024)
4482231  use checklist types from hdf-converters (kemley76, Jul 11, 2024)
0711ff6  Merge branch 'hdf2ckl-severity-update' into update-hdf-converters (kemley76, Jul 15, 2024)
6eaf79e  fix test data (kemley76, Jul 15, 2024)
302e731  Merge branch 'main' into update-hdf-converters (kemley76, Jul 15, 2024)
1da2b0f  enforce enum matching for user input in generate ckl_metadata command (kemley76, Jul 15, 2024)
c4de62d  add backwards compatibility for old checklist metadata format (kemley76, Jul 16, 2024)
b3d4724  Merge branch 'main' into update-hdf-converters (kemley76, Jul 23, 2024)
72c8f39  remove debug statement (kemley76, Jul 23, 2024)
02b21d2  fix code smells (kemley76, Jul 23, 2024)
11991ca  linting (kemley76, Jul 23, 2024)
5a091f4  format every output json file with 2 space indent (kemley76, Jul 23, 2024)
e540f79  add flags for all metadata fields on hdf2ckl command (kemley76, Jul 24, 2024)
c531d2b  clarify instructions on ckl metadata generation (kemley76, Jul 24, 2024)
83c98f1  change formating from 4 to 2 space indent (kemley76, Jul 24, 2024)
14aa7be  make version and release number optional in checklist metadata genera… (kemley76, Jul 24, 2024)
9500d89  update tests to reflect better formatted error messages (kemley76, Jul 24, 2024)
a84c21a  update markdown summary table to include row for severity: none (kemley76, Jul 25, 2024)
4de13d1  update code and tests to count N/A controls with severity other than … (kemley76, Jul 25, 2024)
81a36bb  Merge branch 'main' into update-hdf-converters (kemley76, Jul 25, 2024)
b4fa9f6  fix code smells (kemley76, Jul 26, 2024)
7ad5e57  revert addition of severity-none row to markdown summary table (kemley76, Jul 29, 2024)
be94295  Merge branch 'main' into update-hdf-converters (Amndeep7, Jul 31, 2024)
61e1dff  remove heimdall version when running checklist tests (kemley76, Jul 31, 2024)
a6b99b5  change return type of string | undefined to string | null (kemley76, Jul 31, 2024)
2f5f496  refactor to avoid while true loops (kemley76, Jul 31, 2024)
47 changes: 37 additions & 10 deletions README.md
@@ -383,20 +383,47 @@ convert hdf2ckl Translate a Heimdall Data Format JSON file into a
DISA checklist file

USAGE
$ saf convert hdf2ckl -i <hdf-scan-results-json> -o <output-ckl> [-h] [-m <metadata>] [-H <hostname>] [-F <fqdn>] [-M <mac-address>] [-I <ip-address>]
$ saf convert hdf2ckl -i <hdf-scan-results-json> -o <output-ckl> [-h] [-m <metadata>] [--profilename <value>] [--profiletitle <value>] [--version <value>] [--releasenumber <value>] [--releasedate <value>] [--marking <value>] [-H <value>] [-I <value>] [-M <value>] [-F <value>] [--targetcomment <value>] [--role Domain Controller|Member Server|None|Workstation] [--assettype Computing|Non-Computing] [--techarea |Application Review|Boundary Security|CDS Admin Review|CDS Technical Review|Database Review|Domain Name System (DNS)|Exchange Server|Host Based System Security (HBSS)|Internal Network|Mobility|Other Review|Releasable Networks (REL)|Releaseable Networks (REL)|Traditional Security|UNIX OS|VVOIP Review|Web Review|Windows OS] [--stigguid <value>] [--targetkey <value>] [--webdbsite <value> --webordatabase] [--webdbinstance <value>] [--vulidmapping gid|id]

FLAGS
-F, --fqdn=<fqdn> FQDN for CKL metadata
-H, --hostname=<hostname> Hostname for CKL metadata
-I, --ip=<ip-address> IP address for CKL metadata
-M, --mac=<mac-address> MAC address for CKL metadata
-h, --help Show CLI help.
-i, --input=<hdf-scan-results-json> (required) Input HDF file
-m, --metadata=<metadata> Metadata JSON file, generate one with "saf generate ckl_metadata"
-o, --output=<output-ckl> (required) Output CKL file
-h, --help Show CLI help.
-i, --input=<value> (required) Input HDF file
-o, --output=<value> (required) Output CKL file

CHECKLIST METADATA FLAGS
-F, --fqdn=<value> Fully Qualified Domain Name
-H, --hostname=<value> The name assigned to the asset within the network
-I, --ip=<value> IP address
-M, --mac=<value> MAC address
-m, --metadata=<value> Metadata JSON file, generate one with "saf generate ckl_metadata"
--assettype=<option> The category or classification of the asset
<options: Computing|Non-Computing>
--marking=<value> A security classification or designation of the asset, indicating its sensitivity level
--profilename=<value> Profile name
--profiletitle=<value> Profile title
--releasedate=<value> Profile release date
--releasenumber=<value> Profile release number
--role=<option> The primary function or role of the asset within the network or organization
<options: Domain Controller|Member Server|None|Workstation>
--stigguid=<value> A unique identifier associated with the STIG for the asset
--targetcomment=<value> Additional comments or notes about the asset
--targetkey=<value> A unique key or identifier for the asset within the checklist or inventory system
--techarea=<option> The technical area or domain to which the asset belongs
<options: |Application Review|Boundary Security|CDS Admin Review|CDS Technical Review|Database Review|Domain Name System (DNS)|Exchange Server|Host Based System Security (HBSS)|Internal Network|Mobility|Other Review|Releasable Networks (REL)|Releaseable Networks (REL)|Traditional Security|UNIX OS|VVOIP Review|Web Review|Windows OS>
--version=<value> Profile version number
--vulidmapping=<option> Which type of control identifier to map to the checklist ID
<options: gid|id>
--webdbinstance=<value> The specific instance of the web application or database running on the server
--webdbsite=<value> The specific site or application hosted on the web or database server
--webordatabase Indicates whether the STIG is primarily for either a web or database server

DESCRIPTION
Translate a Heimdall Data Format JSON file into a DISA checklist file

EXAMPLES
$ saf convert hdf2ckl -i rhel7-results.json -o rhel7.ckl --fqdn reverseproxy.example.org --hostname reverseproxy --ip 10.0.0.3 --mac 12:34:56:78:90
$ saf convert hdf2ckl -i rhel7-results.json -o rhel7.ckl --fqdn reverseproxy.example.org --hostname reverseproxy --ip 10.0.0.3 --mac 12:34:56:78:90:AB

$ saf convert hdf2ckl -i rhel8-results.json -o rhel8.ckl -m rhel8-metadata.json
```
[top](#convert-hdf-to-other-formats)
#### HDF to CSV
56 changes: 56 additions & 0 deletions pack-inspecjs.bat
@@ -0,0 +1,56 @@
ECHO OFF

SET CYPRESS_INSTALL_BINARY=0
SET PUPPETEER_SKIP_CHROMIUM_DOWNLOAD=true

SET original_dir=%cd%
ECHO %original_dir%

IF DEFINED npm_config_heimdall (
CD %npm_config_heimdall%/libs/inspecjs/
) ELSE (
CD ../heimdall2/libs/inspecjs/
)

IF DEFINED npm_config_branch (
CALL git switch %npm_config_branch% || EXIT /B %ERRORLEVEL%
) ELSE (
CALL git switch master || EXIT /B %ERRORLEVEL%
)

ECHO Executing - git fetch ...
CALL git fetch || EXIT /B %ERRORLEVEL%

ECHO Executing - git pull ...
CALL git pull || EXIT /B %ERRORLEVEL%

ECHO Executing - yarn install ...
CALL yarn install || EXIT /B %ERRORLEVEL%

ECHO Executing - yarn pack ...
CALL yarn pack || EXIT /B %ERRORLEVEL%

ECHO Finished generating the tarball

CD %original_dir%

ECHO Executing - npm install remote ...
CALL npm i || EXIT /B %ERRORLEVEL%

ECHO Executing - npm install local ...

IF DEFINED npm_config_heimdall (
FOR /f "tokens=*" %%a IN ('dir /b %npm_config_heimdall%\libs\inspecjs\inspecjs-v*.tgz') DO (
SET THIS_TAR_ZIP=%npm_config_heimdall%\libs\inspecjs\%%a
)
) ELSE (
FOR /f "tokens=*" %%a IN ('dir /b ..\heimdall2\libs\inspecjs\inspecjs-v*.tgz') DO (
SET THIS_TAR_ZIP=..\heimdall2\libs\inspecjs\%%a
)
)
CALL npm i %THIS_TAR_ZIP% || EXIT /B %ERRORLEVEL%

ECHO Executing - npm run prepack ...
CALL npm run prepack || EXIT /B %ERRORLEVEL%

ECHO Install of local inspecjs complete.
40 changes: 40 additions & 0 deletions pack-inspecjs.sh
@@ -0,0 +1,40 @@
#!/bin/bash

set -o errexit # abort on nonzero exitstatus
set -o nounset # abort on unbound variable
set -o pipefail # don't hide errors within pipes

ORIGINAL=$PWD
echo $ORIGINAL

cd "${npm_config_heimdall:-../heimdall2}"
cd libs/inspecjs

git switch "${npm_config_branch:-master}"

echo "Executing - git fetch ..."
git fetch

echo "Executing - git pull ..."
git pull

echo "Executing - yarn install ..."
CYPRESS_INSTALL_BINARY=0 PUPPETEER_SKIP_CHROMIUM_DOWNLOAD=true yarn install

echo "Executing - yarn pack ..."
yarn pack

echo "Finished generating the tarball"

cd "$ORIGINAL"

echo "Executing - npm install remote ..."
npm i

echo "Executing - npm install local ..."
npm i "${npm_config_heimdall:-../heimdall2}/libs/inspecjs/inspecjs-v"*".tgz"

echo "Executing - npm run prepack ..."
npm run prepack

echo "Install of local inspecjs complete."
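A note on how these scripts pick up their options: npm exposes flags passed as `npm run pack-inspecjs --heimdall=<path> --branch=<name>` to the script as `npm_config_heimdall` and `npm_config_branch` environment variables, and the `${var:-default}` expansions above supply the fallbacks when the flags are omitted. A minimal sketch of that default-expansion idiom (the paths and branch name are illustrative):

```shell
#!/bin/sh
# npm turns `npm run pack-inspecjs --heimdall=../hm --branch=dev` into
# npm_config_* environment variables for the script it invokes.
npm_config_heimdall="../hm"
npm_config_branch="dev"

# The same default-expansion idiom the scripts use:
echo "heimdall dir: ${npm_config_heimdall:-../heimdall2}"   # ../hm
echo "branch: ${npm_config_branch:-master}"                 # dev

# With the variables unset, the fallbacks apply:
unset npm_config_heimdall npm_config_branch
echo "heimdall dir: ${npm_config_heimdall:-../heimdall2}"   # ../heimdall2
echo "branch: ${npm_config_branch:-master}"                 # master
```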
5 changes: 4 additions & 1 deletion package.json
@@ -197,7 +197,10 @@
"prepack:darwin:linux": "rm -rf lib && tsc",
"pack-hdf-converters": "run-script-os",
"pack-hdf-converters:win32": "pack-hdf-converters.bat",
"pack-hdf-converters:darwin:linux": "./pack-hdf-converters.sh"
"pack-hdf-converters:darwin:linux": "./pack-hdf-converters.sh",
"pack-inspecjs": "run-script-os",
"pack-inspecjs:win32": "pack-inspecjs.bat",
"pack-inspecjs:darwin:linux": "./pack-inspecjs.sh"
},
"types": "lib/index.d.ts",
"jest": {
2 changes: 1 addition & 1 deletion src/commands/convert/asff2hdf.ts
@@ -196,7 +196,7 @@ export default class ASFF2HDF extends Command {
_.forOwn(results, (result, filename) => {
fs.writeFileSync(
path.join(flags.output, checkSuffix(filename)),
JSON.stringify(result),
JSON.stringify(result, null, 2),
)
})
}
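The one-line change repeated across these converter commands swaps compact JSON output for 2-space-indented output via `JSON.stringify`'s third (`space`) argument. A minimal sketch of the difference:

```typescript
// JSON.stringify's third argument ("space") controls output indentation.
const result = {profiles: [{name: "example"}]};

// Old behavior: compact, single-line output.
const compact: string = JSON.stringify(result);
console.log(compact); // {"profiles":[{"name":"example"}]}

// New behavior: 2-space indent, one key per line; friendlier to read and diff.
const pretty: string = JSON.stringify(result, null, 2);
console.log(pretty);
```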
2 changes: 1 addition & 1 deletion src/commands/convert/aws_config2hdf.ts
@@ -57,6 +57,6 @@ export default class AWSConfig2HDF extends Command {
region: flags.region,
}, !flags.insecure, flags.certificate ? fs.readFileSync(flags.certificate, 'utf8') : undefined) : new Mapper({region: flags.region}, !flags.insecure, flags.certificate ? fs.readFileSync(flags.certificate, 'utf8') : undefined)

fs.writeFileSync(checkSuffix(flags.output), JSON.stringify(this.ensureRefs(await converter.toHdf())))
fs.writeFileSync(checkSuffix(flags.output), JSON.stringify(this.ensureRefs(await converter.toHdf()), null, 2))
}
}
2 changes: 1 addition & 1 deletion src/commands/convert/burpsuite2hdf.ts
@@ -25,6 +25,6 @@ export default class Burpsuite2HDF extends Command {
checkInput({data, filename: flags.input}, 'burp', 'BurpSuite Pro XML')

const converter = new Mapper(data, flags['with-raw'])
fs.writeFileSync(checkSuffix(flags.output), JSON.stringify(converter.toHdf()))
fs.writeFileSync(checkSuffix(flags.output), JSON.stringify(converter.toHdf(), null, 2))
}
}
8 changes: 6 additions & 2 deletions src/commands/convert/ckl2hdf.ts
@@ -23,7 +23,11 @@ export default class CKL2HDF extends Command {
const data = fs.readFileSync(flags.input, 'utf8')
checkInput({data, filename: flags.input}, 'checklist', 'DISA Checklist')

const converter = new Mapper(data, flags['with-raw'])
fs.writeFileSync(checkSuffix(flags.output), JSON.stringify(converter.toHdf()))
try {
const converter = new Mapper(data, flags['with-raw'])
fs.writeFileSync(checkSuffix(flags.output), JSON.stringify(converter.toHdf(), null, 2))
} catch (error) {
console.error(`Error converting to hdf:\n${error}`)
}
}
}
2 changes: 1 addition & 1 deletion src/commands/convert/conveyor2hdf.ts
@@ -29,7 +29,7 @@ export default class Conveyor2HDF extends Command {
for (const [filename, result] of Object.entries(results)) {
fs.writeFileSync(
path.join(flags.output, checkSuffix(filename)),
JSON.stringify(result),
JSON.stringify(result, null, 2),
)
}
}
2 changes: 1 addition & 1 deletion src/commands/convert/dbprotect2hdf.ts
@@ -25,6 +25,6 @@ export default class DBProtect2HDF extends Command {
checkInput({data, filename: flags.input}, 'dbProtect', 'DBProtect report in "Check Results Details" XML format')

const converter = new Mapper(data, flags['with-raw'])
fs.writeFileSync(checkSuffix(flags.output), JSON.stringify(converter.toHdf()))
fs.writeFileSync(checkSuffix(flags.output), JSON.stringify(converter.toHdf(), null, 2))
}
}
2 changes: 1 addition & 1 deletion src/commands/convert/fortify2hdf.ts
@@ -25,6 +25,6 @@ export default class Fortify2HDF extends Command {
checkInput({data, filename: flags.input}, 'fortify', 'Fortify results FVDL file')

const converter = new Mapper(data, flags['with-raw'])
fs.writeFileSync(checkSuffix(flags.output), JSON.stringify(converter.toHdf()))
fs.writeFileSync(checkSuffix(flags.output), JSON.stringify(converter.toHdf(), null, 2))
}
}
2 changes: 1 addition & 1 deletion src/commands/convert/gosec2hdf.ts
@@ -24,6 +24,6 @@ export default class GoSec2HDF extends Command {
checkInput({data, filename: flags.input}, 'gosec', 'GoSec results JSON')

const converter = new Mapper(fs.readFileSync(flags.input, 'utf8'), flags.name)
fs.writeFileSync(checkSuffix(flags.output), JSON.stringify(converter.toHdf()))
fs.writeFileSync(checkSuffix(flags.output), JSON.stringify(converter.toHdf(), null, 2))
}
}
4 changes: 2 additions & 2 deletions src/commands/convert/hdf2asff.ts
@@ -45,11 +45,11 @@ export default class HDF2ASFF extends Command {
fs.mkdirSync(outputFolder)
if (convertedSlices.length === 1) {
const outfilePath = path.join(outputFolder, convertFullPathToFilename(checkSuffix(flags.output)))
fs.writeFileSync(outfilePath, JSON.stringify(convertedSlices[0]))
fs.writeFileSync(outfilePath, JSON.stringify(convertedSlices[0], null, 2))
} else {
convertedSlices.forEach((slice, index) => {
const outfilePath = path.join(outputFolder, `${convertFullPathToFilename(checkSuffix(flags.output || '')).replace('.json', '')}.p${index}.json`)
fs.writeFileSync(outfilePath, JSON.stringify(slice))
fs.writeFileSync(outfilePath, JSON.stringify(slice, null, 2))
})
}
}