Heat Validation via Command Line

Note

This section assumes you have already acquired and set up the validation scripts as described in the VVP Installation section.

Basic Script Execution

  1. The scripts must be executed from the ice_validator directory. Navigate to the directory first:

    > cd <vvp-directory>/ice_validator
    
  2. To run the scripts with default options, issue the following command, where <Directory> is the directory containing the Heat templates (a concrete example follows this list):

    > pytest tests --template-directory=<Directory>
    
  3. Test results will be written to the console, describing any failures that are encountered. If no failures are found, then a success message will be displayed. Additional reports can be found in the <vvp-directory>/ice_validator/output directory. See VVP Reports for more details.
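
For example, a complete run against a hypothetical template directory, writing a JSON report to a custom location, might look like the following (the paths shown are placeholders; substitute your own):

    > cd /home/user/vvp/ice_validator
    > pytest tests --template-directory=/home/user/heat/my_vnf \
        --report-format=json --output-directory=/home/user/vvp_reports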

Command Line Options

Additional options can be specified at the command line to customize output and execution. VVP uses the pytest framework for test execution, and therefore all pytest options are available.

Please refer to the pytest documentation for information on the command-line options that framework provides. We will only document the specific options that are used by VVP:

--template-directory=TEMPLATE_DIR
                      Directory which holds the templates for validation

--category=CATEGORY_NAME
                      Optional category of additional validations to execute.
                      The only supported value at this time is
                      environment_file, which enforces that specific
                      parameter values are excluded from the environment
                      file per ONAP Heat requirements.

--self-test           Test the unit tests against their fixtured data

--report-format=REPORT_FORMAT
                      Format of output report (html, csv, excel, json)

--continue-on-failure
                      Continue validation even when structural errors exist
                      in input files

--output-directory=OUTPUT_DIR
                      Alternate directory for report output.

--env-directory=ENV_DIR
                      optional directory of .env files for preload
                      generation

--preload-format=PRELOAD_FORMATS
                      Preload format to create (multiple allowed). If not
                      provided then all available formats will be created:
                      GR-API, VNF-API
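
As an illustration, the following sketch (with placeholder paths) combines several of these options to run the additional environment_file validations, produce a CSV report, and generate only GR-API preloads:

    > pytest tests --template-directory=/home/user/heat/my_vnf \
        --category=environment_file \
        --env-directory=/home/user/heat/my_vnf/env \
        --report-format=csv \
        --preload-format=GR-API \
        --output-directory=/home/user/vvp_reports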

VVP Reports

After completion of the validations, several reports will be written to the output directory (the directory specified via --output-directory, or <vvp-directory>/ice_validator/output by default).

report.{html,csv,xlsx,json}
    Full report of validation results showing the pass or failure of the
    validation and details on all errors. See Validation Report for
    information on the HTML, CSV, and Excel versions of the report, or
    JSON Report for details on the machine-readable results.

traceability.csv
    A mapping from each Heat requirement to the test case(s) that validate it.

mapping_errors.csv
    This file should be empty; if entries are present, they identify tests
    that are mapped to requirements that no longer exist.

traceability.rst
    Similar information to traceability.csv, but in reStructuredText for use
    in the VNF Requirements documentation.

failures
    Deprecated JSON version of test failures. Use report.json instead.

preloads/<format>/*
    A blank preload will be created for every VF Module in the template
    directory. The <format> will be based on the preload format(s) selected.
    See Preload Generation for more detail.
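
For example, after a default run the output directory might contain files along these lines (the exact set depends on the report and preload formats selected):

    > ls <vvp-directory>/ice_validator/output
    failures            preloads            report.html
    mapping_errors.csv  traceability.csv    traceability.rst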

Validation Report

If a report format of html (the default), excel, or csv is requested via the --report-format option, then a report file will be written to the output directory. Regardless of format, the file will contain a header section that summarizes the results and the files scanned, and an error section with a row for each failure containing the columns described below.

Header Information

Categories Selected
    Any additional categories selected via --category

Tool Version
    Version of the tool that produced the report

Report Generated At
    The timestamp of when the report was generated

Directory Validated
    Absolute path to the directory that was validated

Checksum
    Unique MD5 hash of the contents of the template directory

Total Errors
    Number of errors/violations found

Error Information

If any violations are found, then there will be one row for each violation with the following columns:

Files
    The file or files that were scanned as part of the test.

Tests
    Name of the test case (not shown in the HTML version).

Error Message
    The test and a brief error message from the test that failed. This will
    contain details about the element that triggered the violation, such as
    the parameter name, resource ID, etc.

    In the HTML version of the report, this column also shows the test case
    name and provides a link to the full details (the raw output of the test).

Requirements
    The requirement ID and text that was violated.

Resolution Steps
    For some violations, there are pre-defined resolution steps that indicate
    what action the user should take to resolve the violation.

    Note: Not all violations have resolution steps; in most cases the error
    message and requirement are sufficient.

Raw Test Output
    Full output from the pytest test case. This is not a dedicated column in
    the HTML version of the report.

JSON Report

This report provides a machine-readable version of the test execution and the most comprehensive summary of the execution and its results.

File Header/Top Level

The top level will include a summary of available execution metadata.

NOTE: The tests and requirements entries are elided in the example below.

Example Header:

{
  "version": "dublin",
  "template_directory": "/path/to/template",
  "timestamp": "2019-01-21T02:11:07.305000",
  "checksum": "6296aa211870634f9b4a23477c5eab28",
  "profile": "",
  "outcome": "FAIL",
  "tests": [],
  "requirements": [],
}

Header Definition:

checksum
    Required, string.
    MD5 hash of all file contents in the template_directory.

outcome
    Required, string. One of the following values:

      • PASS - All tests passed successfully (some may have been skipped as
        not applicable based on the contents of the template)

      • FAIL - At least one test failed. In this scenario the templates will
        need to be corrected to comply with the requirements.

      • ERROR - An unexpected error occurred during test setup. Some or all
        tests may not have executed. The issue should be referred to the VVP
        team for investigation.

categories
    Optional, array of string.
    Categories selected via the --category option.

template_directory
    Required, string.
    Absolute path of the directory containing the Heat templates that were
    validated.

version
    Required, string.
    Version of the validation scripts that produced the report.

timestamp
    Required, string.
    ISO 8601 timestamp in UTC of when the report was generated.

tests
    Required, list of Test Result.
    See Test Result.

requirements
    Required, list of Requirement Result.
    See Requirement Result.
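
Because the report is plain JSON, these top-level fields can be inspected with standard tools. As a quick sketch, assuming jq is installed and a JSON report was written to the output directory, the overall outcome can be read directly; for the example header above this would print FAIL:

    > jq -r '.outcome' output/report.json
    FAIL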

Test Result

For each test executed, a JSON object will be provided that informs the consumer what test was run, its result, and the requirements it validated.

Example Test Result:

{
  "files": [
    "/Users/username/Desktop/stark_template2/STARKDB-nested-1.yaml",
    "/Users/username/Desktop/stark_template2/base_starkdb.yaml",
  ],
  "test_module": "test_resource_indices",
  "test_case": "test_indices_start_at_0_increment",
  "result": "FAIL",
  "error": " Index values associated with resource ID prefix STARKDB_server_ do not start at 0\n",
  "requirements": [
    {
      "id": "R-11690",
      "text": "When a VNF's Heat Orchestration Template's Resource ID contains an\n``{index}``, the ``{index}`` is a numeric value that **MUST** start at\nzero and **MUST** increment by one.\n\nAs stated in R-16447,\n*a VNF's <resource ID> MUST be unique across all Heat\nOrchestration Templates and all HEAT Orchestration Template\nNested YAML files that are used to create the VNF*.  While the ``{index}``\nwill start at zero in the VNF, the ``{index}`` may not start at zero\nin a given Heat Orchestration Template or HEAT Orchestration Template\nNested YAML file.",
      "keyword": "MUST"
    }
  ]
}

Test Result Definition:

files
    Required, list of string.
    List of files that were passed to the test case.

    NOTE: If result is ERROR, this may be an empty list.

test_module
    Required, string.
    Name of the module/file that contains the test case.

test_case
    Required, string.
    Name of the test case.

result
    Required, string. One of the following values:

      • PASS - The test case passed with no violations

      • SKIP - The test case was deemed not applicable

      • FAIL - The test case completed, but a violation was found

      • ERROR - An unexpected error was found while setting up the test case

error
    Required, string.
    If the test failed or encountered an error, then this will be a message
    summarizing the error. If the test passed or was skipped, then this will
    be an empty string.

requirements
    Required, list of Requirement Metadata.
    See Requirement Metadata.
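
For example, a consumer could list every failing test case together with its error message using a jq filter along these lines (assuming jq is available and the JSON report is in the output directory):

    > jq -r '.tests[] | select(.result == "FAIL")
             | "\(.test_module).\(.test_case): \(.error)"' output/report.json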

Requirement Metadata

For each test case, the following requirement metadata will be reported.

Example Requirement Metadata:

{
  "id": "R-11690",
  "text": "When a VNF's Heat Orchestration Template's Resource ID contains an\n``{index}``, the ``{index}`` is a numeric value that **MUST** start at\nzero and **MUST** increment by one.\n\nAs stated in R-16447,\n*a VNF's <resource ID> MUST be unique across all Heat\nOrchestration Templates and all HEAT Orchestration Template\nNested YAML files that are used to create the VNF*.  While the ``{index}``\nwill start at zero in the VNF, the ``{index}`` may not start at zero\nin a given Heat Orchestration Template or HEAT Orchestration Template\nNested YAML file.",
  "keyword": "MUST"
}

Requirement Metadata Definition:

id
    Required, string.
    Requirement ID from the VNFRQTS project.

text
    Required, string.
    Full text of the requirement.

keyword
    Required, string. Valid values: MUST, MUST NOT, MAY, SHOULD, SHOULD NOT.
    RFC 2119 keyword of the requirement.

Requirement Result

The file also includes an aggregated view of adherence to the VNF Requirements validated by the validation scripts. Since some requirements have multiple test cases, these results are rolled up into a single result per requirement. This section does not include detailed test results; if you require detailed error information, refer to the tests section of the report.

Example Requirement Result:

{
  "id": "R-16447",
  "text": "A VNF's <resource ID> **MUST** be unique across all Heat\nOrchestration Templates and all HEAT Orchestration Template\nNested YAML files that are used to create the VNF.",
  "keyword": "MUST",
  "result": "FAIL"
  "errors": [
     "The error message"
  ]
}

Requirement Result Definition:

id
    Required, string.
    Requirement ID from the VNFRQTS project.

    NOTE: A requirement ID of "Unmapped" may be included if one or more tests
    are not mapped to a requirement.

text
    Required, string.
    Full text of the requirement.

keyword
    Required, string. Valid values: MUST, MUST NOT, MAY, SHOULD, SHOULD NOT.
    RFC 2119 keyword associated with the requirement.

result
    Required, string. One of the following values:

      • PASS - The test case(s) passed with no violations

      • SKIP - The test case(s) were skipped because they were deemed not
        applicable

      • FAIL - The test case(s) completed, but a violation was found

      • ERROR - An unexpected error was found while setting up the test
        case(s)

errors
    Required, list of string.
    Error messages associated with this requirement. This will be an empty
    list if the result is PASS or SKIP.
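
As with the individual test results, a quick summary of which requirements were violated can be pulled from the report with a jq filter such as the following (assuming jq is available), which prints the failing requirement IDs one per line:

    > jq -r '.requirements[] | select(.result == "FAIL") | .id' output/report.json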

Docker Execution

A version of VVP is also provided as a Docker image. If your environment supports Docker, then this eliminates the need to set up and install the application from source code.

To execute from Docker, issue the following command where <Local Template Directory> is where the Heat templates are located and <Local Report Directory> is where you would like the reports to be written on your local machine:

docker run --rm -i -v ~/<Local Template Directory>/:/template \
-v ~/<Local Report Directory>:/reports \
onap/vvp/validation-scripts --template-directory=/template \
--output-directory=/reports

The same command-line options that are used with the version run from source can also be used with the Docker image.
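
For example, a run that also requests a JSON report and the additional environment_file validations might look like this (the local paths are placeholders):

docker run --rm -i -v /home/user/heat/my_vnf:/template \
-v /home/user/vvp_reports:/reports \
onap/vvp/validation-scripts --template-directory=/template \
--output-directory=/reports --report-format=json \
--category=environment_file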