VVP Documentation

The VNF Validation Platform (VVP) is an application that validates that OpenStack Heat templates comply with the ONAP requirements and guidelines documented in the Heat section of ONAP’s VNF Requirements and Guidelines documentation.

Adherence to these guidelines ensures that a VNF can be successfully onboarded, modeled, instantiated, and orchestrated by ONAP to the fullest extent possible.

VVP is a utility written in Python that can be executed via a command-line script, Docker container, or a native Desktop GUI application to analyze and report on the compliance of a given set of Heat templates to the ONAP requirements.

This guide provides instructions on how to acquire, set up, and execute the validations.

Installation

The VNF Validation Platform (VVP) can be run from source as a normal Python executable, or from a provided Docker image. This section describes how to set up and install vvp/validation-scripts and run the tool from source.

Installation and configuration of Docker is beyond the scope of this document, but you can refer to the Docker Execution instructions for more details on running the validations from the Docker image.

Pre-requisites

This document assumes you have the system-level utilities required by the steps below, notably Python 3 and git, installed.

Please refer to the respective sites for these tools for the appropriate installation instructions for your given operating system.

Setup

The source code for VVP can be obtained from the ONAP Gerrit site or its GitHub mirror.

  1. Clone the source from your desired repository host:

    Choose one of the following git clone commands:

    > git clone https://github.com/onap/vvp-validation-scripts.git
    > git clone https://gerrit.onap.org/r/vvp/validation-scripts
    
  2. (Optional) If desired, you can create a virtual Python environment to avoid installing VVP’s dependencies in your system level installation of Python:

    > python -m venv vvp
    > source vvp/bin/activate
    
  3. Install the required dependencies with the following command:

    > python -m pip install -r requirements.txt
    
  4. If you plan to make code changes, then initialize the standard git commit hooks by initializing pre-commit:

    > pre-commit install
    

Heat Validation via Command Line

Note

This section assumes you have already acquired and setup the validation scripts as described in the VVP Installation section.

Basic Script Execution

  1. The scripts must be executed from the ice_validator directory. Navigate to the directory first:

    > cd <vvp-directory>/ice_validator
    
  2. To run the script with default options, issue the following command where <Directory> is the directory containing the Heat templates:

    > pytest tests --template-directory=<Directory>
    
  3. Test results will be written to the console, describing any failures that are encountered. If no failures are found, then a success message will be displayed. Additional reports can be found in the <VVP Directory>/ice_validator/output directory. See Reports for more details.

Command Line Options

Additional options can be specified at the command line to customize output and execution. VVP uses the pytest framework for test execution, and therefore all pytest options are available.

Please refer to the pytest documentation for information on the command-line options that framework provides. We will only document the specific options that are used by VVP:

--template-directory=TEMPLATE_DIR
                      Directory which holds the templates for validation

--category=CATEGORY_NAME
                      Additional validations that can be selected.
                      The only supported value at this time is
                      environment_file which will enforce that specific
                      parameter values are excluded from the environment
                      file per ONAP Heat requirements.

--self-test           Test the unit tests against their fixtured data

--report-format=REPORT_FORMAT
                      Format of output report (html, csv, excel, json)

--continue-on-failure
                      Continue validation even when structural errors exist
                      in input files

--output-directory=OUTPUT_DIR
                      Alternate directory for report output.

--env-directory=ENV_DIR
                      optional directory of .env files for preload
                      generation

--preload-format=PRELOAD_FORMATS
                      Preload format to create (multiple allowed). If not
                      provided then all available formats will be created:
                      GR-API, VNF-API

VVP Reports

After completion of the validations, several reports will be written to the output directory (specified via --output-directory, or <vvp-directory>/ice_validator/output by default).

report.{html,csv,xlsx,json}
                      Full report of validation results showing the pass or
                      failure of the validation and details on all errors.
                      See Validation Report for information on the HTML, CSV,
                      and Excel versions of the report, or JSON Report for
                      details on the machine-readable results.

traceability.csv      A mapping from each Heat requirement to its test
                      case(s).

mapping_errors.csv    Should be empty; if populated, it lists tests that are
                      mapped to requirements that no longer exist.

traceability.rst      The same information as traceability.csv, but in
                      reStructuredText for use in the VNF Requirements
                      documentation.

failures              Deprecated JSON version of test failures. Use
                      report.json instead.

preloads/<format>/*   A blank preload template is created for every VF Module
                      in the template directory. The <format> is based on the
                      preload format(s) selected. See Preload Generation for
                      more detail.

Validation Report

If a report format of html (the default), excel, or csv is requested via the --report-format option, then a report file will be written to the output directory. Regardless of format, the file will contain a header section that summarizes the results and the files scanned, and an error section with a row for each failure.

Header Information

Categories Selected   Any additional categories selected via --category

Tool Version          Version of the tool that produced the report

Report Generated At   Timestamp of when the report was generated

Directory Validated   Absolute path to the directory validated

Checksum              Unique MD5 hash of the contents of the template
                      directory

Total Errors          Number of errors/violations found
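The header's checksum can be approximated with a short Python sketch. Only the general MD5-over-directory-contents approach comes from the description above; the traversal order and file-inclusion rules VVP actually uses are assumptions here:

```python
import hashlib
from pathlib import Path

def template_checksum(template_dir: str) -> str:
    """Hypothetical sketch: MD5 over the contents of every file in the
    template directory, visited in sorted order. VVP's exact traversal
    and inclusion rules may differ."""
    digest = hashlib.md5()
    for path in sorted(Path(template_dir).rglob("*")):
        if path.is_file():
            digest.update(path.read_bytes())
    return digest.hexdigest()
```

Because the files are visited in a deterministic order, the same template directory always produces the same 32-character hex digest.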

Error Information

If any violations are found, then there will be one row for each violation with the following columns:

Files                 The file or files that were scanned as part of the
                      test.

Tests                 Name of the test case (not shown in the HTML version).

Error Message         The test and a brief error message from the test that
                      failed. This will contain details about the element
                      that triggered the violation such as the parameter
                      name, resource ID, etc.

                      In the HTML version of the report, this column also
                      shows the test case name and provides a link to the
                      raw output of the test under Full Details.

Requirements          The requirement ID and text that was violated.

Resolution Steps      For some violations, there are pre-defined resolution
                      steps that indicate what action the user should take
                      to resolve the violation. Not all violations have
                      resolution steps; for the rest, the error message and
                      requirement are sufficient.

Raw Test Output       Full output from the pytest test case. This is not a
                      dedicated column in the HTML version of the report.

JSON Report

This report is intended to provide a machine-readable version of the test execution, and provides the most comprehensive summary of the test execution and results.

File Header/Top Level

The top level includes a summary of available execution metadata.

NOTE: The tests and requirements entries are elided in the example below.

Example Header:

{
  "version": "dublin",
  "template_directory": "/path/to/template",
  "timestamp": "2019-01-21T02:11:07.305000",
  "checksum": "6296aa211870634f9b4a23477c5eab28",
  "profile": "",
  "outcome": "FAIL",
  "tests": [],
  "requirements": []
}

Header Definition:

checksum (Required, string)
                      MD5 hash of all file contents in the
                      template_directory

outcome (Required, string; one of PASS, FAIL, ERROR)
                      One of the valid values:

                        • PASS - All tests passed successfully (some may
                          have been skipped as not applicable based on the
                          contents of the template)

                        • FAIL - At least one test failed. In this scenario
                          the templates will need to be corrected to comply
                          with the requirements

                        • ERROR - An unexpected error occurred during test
                          setup. Some or all tests may not have executed.
                          The issue should be referred to the VVP team for
                          investigation.

categories (Optional, array of string)
                      Categories selected via the --category option.

template_directory (Required, string)
                      Absolute path of the directory containing the Heat
                      templates that were validated

version (Required, string)
                      Version of the validation scripts that produced the
                      report

timestamp (Required, string)
                      ISO 8601 timestamp in UTC of when the report was
                      generated

tests (Required, list of Test Result)
                      See Test Result

requirements (Required, list of Requirement Result)
                      See Requirement Result

Test Result

For each test result, a JSON object will be provided that informs the consumer what test was run, its result, and the requirements it validated.

Example Test Result:

{
  "files": [
    "/Users/username/Desktop/stark_template2/STARKDB-nested-1.yaml",
    "/Users/username/Desktop/stark_template2/base_starkdb.yaml",
  ],
  "test_module": "test_resource_indices",
  "test_case": "test_indices_start_at_0_increment",
  "result": "FAIL",
  "error": " Index values associated with resource ID prefix STARKDB_server_ do not start at 0\n",
  "requirements": [
    {
      "id": "R-11690",
      "text": "When a VNF's Heat Orchestration Template's Resource ID contains an\n``{index}``, the ``{index}`` is a numeric value that **MUST** start at\nzero and **MUST** increment by one.\n\nAs stated in R-16447,\n*a VNF's <resource ID> MUST be unique across all Heat\nOrchestration Templates and all HEAT Orchestration Template\nNested YAML files that are used to create the VNF*.  While the ``{index}``\nwill start at zero in the VNF, the ``{index}`` may not start at zero\nin a given Heat Orchestration Template or HEAT Orchestration Template\nNested YAML file.",
      "keyword": "MUST"
    }
  ]
}
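A consumer of report.json can filter for actionable results in a few lines of Python. This is a hypothetical helper built only from the field names documented in this section; it is not part of VVP itself:

```python
import json  # used in the typical-usage snippet below

def failing_tests(report: dict) -> list:
    """Return (test_module, test_case, error) tuples for every test in a
    parsed report.json that finished FAIL or ERROR (hypothetical helper)."""
    return [
        (t["test_module"], t["test_case"], t["error"].strip())
        for t in report.get("tests", [])
        if t["result"] in ("FAIL", "ERROR")
    ]

# Typical usage against the default command-line output location:
# with open("output/report.json") as f:
#     for module, case, error in failing_tests(json.load(f)):
#         print(f"{module}::{case}: {error}")
```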

Test Result Definition:

files (Required, list of string)
                      List of files that were passed to the test case.

                      NOTE: If result is ERROR, this may be an empty list.

test_module (Required, string)
                      Name of the module/file that contains the test case

test_case (Required, string)
                      Name of the test case

result (Required, string; one of PASS, SKIP, FAIL, ERROR)
                      One of the valid values:

                        • PASS - The test case passed with no violations

                        • SKIP - The test case was deemed not applicable

                        • FAIL - The test case completed, but a violation
                          was found

                        • ERROR - An unexpected error was found while
                          setting up the test case

error (Required, string)
                      If the test failed or encountered an error, then this
                      will be a message summarizing the error. If the test
                      passed or was skipped, then this will be an empty
                      string.

requirements (Required, list of Requirement Metadata)
                      See Requirement Metadata

Requirement Metadata

For each test case, the following requirement metadata will be reported.

Example Requirement Metadata:

{
  "id": "R-11690",
  "text": "When a VNF's Heat Orchestration Template's Resource ID contains an\n``{index}``, the ``{index}`` is a numeric value that **MUST** start at\nzero and **MUST** increment by one.\n\nAs stated in R-16447,\n*a VNF's <resource ID> MUST be unique across all Heat\nOrchestration Templates and all HEAT Orchestration Template\nNested YAML files that are used to create the VNF*.  While the ``{index}``\nwill start at zero in the VNF, the ``{index}`` may not start at zero\nin a given Heat Orchestration Template or HEAT Orchestration Template\nNested YAML file.",
  "keyword": "MUST"
}

Requirement Metadata Definition:

id (Required, string)
                      Requirement ID from the VNFRQTS project

text (Required, string)
                      Full text of the requirement

keyword (Required, string)
                      RFC 2119 keyword of the requirement: MUST, MUST NOT,
                      MAY, SHOULD, or SHOULD NOT

Requirement Result

The file also includes an aggregated view of adherence to the VNF Requirements validated by the validation scripts. Since some requirements have multiple test cases, these results roll up the individual test results into a single aggregated result per requirement. This section does not include detailed test results; if you need detailed error information, refer to the tests section of the results.

Example Requirement Result:

{
  "id": "R-16447",
  "text": "A VNF's <resource ID> **MUST** be unique across all Heat\nOrchestration Templates and all HEAT Orchestration Template\nNested YAML files that are used to create the VNF.",
  "keyword": "MUST",
  "result": "FAIL",
  "errors": [
     "The error message"
  ]
}

Requirement Result Definition:

id (Required, string)
                      Requirement ID from the VNFRQTS project.

                      NOTE: a requirement ID of “Unmapped” may be included
                      if one or more tests are not mapped to a requirement.

text (Required, string)
                      Full text of the requirement.

keyword (Required, string)
                      RFC 2119 keyword associated with the requirement:
                      MUST, MUST NOT, MAY, SHOULD, or SHOULD NOT

result (Required, string; one of PASS, SKIP, FAIL, ERROR)
                      One of the valid values:

                        • PASS - All test cases for the requirement passed
                          with no violations

                        • SKIP - The test cases were skipped because they
                          were deemed not applicable

                        • FAIL - At least one test case completed and found
                          a violation

                        • ERROR - An unexpected error was found while
                          setting up a test case

errors (Required, list of string)
                      Error messages associated with this requirement. This
                      will be an empty list if the result is PASS or SKIP.
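The roll-up from individual test results to a single requirement result can be sketched as follows. The exact precedence VVP applies is an assumption here, chosen so that any ERROR or FAIL dominates the aggregate:

```python
def requirement_result(test_results: list) -> str:
    """Roll up per-test results into one requirement result.

    Assumed precedence: any ERROR wins, then any FAIL; PASS if at least
    one test actually ran; SKIP only when every test was skipped."""
    if "ERROR" in test_results:
        return "ERROR"
    if "FAIL" in test_results:
        return "FAIL"
    if "PASS" in test_results:
        return "PASS"
    return "SKIP"
```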

Docker Execution

A version of VVP is also provided as a Docker image. If your environment supports Docker, then this eliminates the need to setup and install the application from source code.

To execute from Docker, issue the following command where <Local Template Directory> is where the Heat templates are located and <Local Report Directory> is where you would like the reports to be written on your local machine:

docker run --rm -i -v ~/<Local Template Directory>/:/template \
-v ~/<Local Report Directory>:/reports \
onap/vvp/validation-scripts --template-directory=/template \
--output-directory=/reports

The same command line options can be used with the Docker image that are used with the version from source.

Graphical User Interface

If desired, a graphical user interface is also provided when the application has been installed from source. This provides a convenient wrapper that enables users to perform validations without extensive command-line usage.

At this time the application can only be run from source, but in the future a packaged version may be provided.

How to Start the Application

  1. Ensure you have installed the application and its dependencies from source as described in the Installation chapter

  2. Navigate to the ice_validator directory:

    > cd <vvp-directory>/ice_validator
    
  3. Launch the GUI using the vvp.py script:

    > python vvp.py
    

How to Use the Tool

Note

The look-and-feel of the application will vary slightly depending on the Operating System of the host machine. The screenshot below is how the application looks on a Windows machine.

Sample Screenshot of VVP GUI Application


All configuration options available to the command-line version of the application are exposed as options on the left-hand side of the GUI.

Additional Validation Categories

This allows the base set of tests to be extended by selecting additional categories of tests. At this time, only one additional category is supported.

This maps to the --category command-line option.

Category

Description

Environment File Compliance

When selected, VVP will flag parameters in environment files that should be excluded per the ONAP Heat requirements. De-selecting this can be useful when testing instantiation directly in OpenStack without ONAP orchestration. This is equivalent to specifying --category=environment_file from the command-line.

OpenStack Heat Testing

When selected, VVP will also validate that the Heat templates are valid per OpenStack specifications (using the latest version of OpenStack available). If not selected, then VVP will only validate that the Heat is compliant with the ONAP rules.

Settings

Preload Template - Determines the format of the preload template that will be generated by VVP. The preload template can be completed to load per-instance values into SDNC. This allows the SDC model to be generic and re-used across environments. There are currently two formats supported by ONAP (VNF-API and GR-API)

Report Format - Controls the format of the output report generated after validation. The options are: HTML (the default), Excel, and CSV. This is equivalent to the --report-format command-line option, with the exception that the JSON format is not supported via the GUI.

Input Format - Controls the expected format of the template files. This can either be a ZIP file containing the Heat templates or a directory containing the Heat templates. There is no ZIP file option for the command-line script at this time.

Halt on Basic Failures - VVP deems certain tests as “base tests” which if failed have the potential to generate a large number of other errors. This would include tests such as validating that the Heat templates are valid YAML. If checked, the tool will immediately stop all other tests and show a report of this single failure. This can be useful in reducing the number of errors to sift through in these situations. De-selecting this option is the equivalent of specifying --continue-on-failure as a command-line option.

Create Preloads from Env Files - When selected, the Env Files file selection box will be enabled, allowing the user to select a directory of .env file definitions that can be used to populate a preload template. See Preload Generation for more details.

Running Validations

  1. Select the desired preload format (VNF-API or GR-API)

  2. Select the desired input format in the settings (ZIP or Directory)

  3. Select the […] button next to the Template Location input box

  4. Select the directory or ZIP file containing the Heat templates, and then click Open

  5. Once the input is selected, select the “Validate Templates” button to start the validation process. The white box to the right will display output as the validations are executed.

  6. Once validation is complete a summary of pass and fail will be written to the output window, and a “View Report” option will appear on the left-hand control panel.

  7. Select the “View Report” option, and the report will be opened in the appropriate application based on the report format.

  8. If you have questions about report output, please refer to the Validation Report reference material for more information.

Customizing the GUI

The VVP GUI offers an ONAP Operator a number of ways to configure the GUI text and behavior via a configuration file called vvp-config.yaml located in the ice_validator directory. Customizing the GUI would require packaging the VVP GUI with a modified configuration file.

Basic Customizations

This section covers the standard configuration settings that can tweak the display and the options shown in the default GUI.

Here is a sample of the current configuration file. Please note that some of these settings may not be used in the default ONAP configuration.

namespace: org.onap.vvp
owner: ONAP
ui:
  app-name: VNF Validation Tool
  disclaimer-text: This is a legal disclaimer in the footer
  requirement-link-text: ONAP VNF Heat Requirements (master)
  requirement-link-url: https://onap.readthedocs.io/en/latest/submodules/vnfrqts/requirements.git/docs/Chapter5/Heat/index.html
categories:
  - name: Environment File Compliance. (Required to Onboard)
    category: environment_file
    description:
      Checks certain parameters are excluded from the .env file, per HOT Requirements.
      Required for ASDC onboarding, not needed for manual Openstack testing.
  - name: OpenStack Heat Testing (Beta)
    category: openstack
    description:
      Uses the latest OpenStack Heat community version available to validate that
      a heat template is valid OpenStack Heat. This testing is equivalent to using
      heat template-validate from the command line.
settings:
  polling-frequency: 1000
  default-verbosity: Standard

Available Configuration Settings

  • namespace (required)

    • Use: Users’ prior selections in the GUI are saved in an OS-specific application directory for per-user settings. This namespace is used to segregate these cached settings from other versions of the validation tool.

    • When to Modify: If you are packaging a new version of the application, then this field should be changed to ensure a user running the standard ONAP version and your custom version do not encounter conflicts in saved settings.

  • owner (required)

    • Use: Similar to namespace this is used to segregate the cached application setting selections.

    • When to Modify: If you are packaging a new version of the application, then this field should be changed to ensure a user running the standard ONAP version and your custom version do not encounter conflicts in saved settings.

  • ui.app-name: (optional - Default is VNF Validation Tool)

    • Use: Controls the name of the application displayed in the title bar

    • When to Modify: When you want the application to display a different name. Please note that the version is read from the version.py file and will be displayed in the title bar regardless of this field’s setting.

  • ui.disclaimer-text: (optional - no disclaimer footer if omitted)

    • Use: If present the text will be displayed as a message in the footer of the application. At some point this text may also be included on the footer of the reports as well.

    • When to Modify: Provide this if you need to provide any persistent messaging to your users such as a legal disclaimer (not currently used by ONAP)

  • ui.requirement-link-text: (optional - no requirement link in footer if omitted)

    • Use: If present this will describe the requirements that are validated by the validation tool prepended by the word “Validating:”. Example: Validating: ONAP VNF Provider Heat Requirements (master)

    • When to Modify: If you have modified or extended the tests and you want the GUI to reference a different document than the ONAP requirements. Alternatively, you can remove the setting and not show a link to the requirements in the footer.

  • ui.requirement-link-url: (optional - no requirement link in footer if omitted)

    • Use: If present this will be the link to the requirement text. It should be a full URL with protocol (ex: http://url.com ).

    • When to Modify: If you have modified or extended the tests and you want the GUI to reference a different document than the ONAP requirements. Alternatively, you can remove the setting and not show a link to the requirements in the footer.

  • categories: (optional)

    • Use: This section allows operators to customize the validation categories that can be selected by the end users. Individual tests can be decorated with the category decorator to mark them as distinct categories of tests. These tests will not be executed unless they are specifically requested using the --category command line option.

    • When to Modify: If you have packaged additional test cases under the ice_validator/tests directory with category decorators, then you can define them in this section to make them accessible via the GUI.

    • Fields:

      • name: Descriptive name to display to the user

      • category: Name used in the category decorator

      • description: Additional help text that will be displayed upon hovering over the label in the GUI

  • settings.polling-frequency: (optional - default is 1 second)

    • Use: The validations are run in a separate process. This setting defines how frequently (in milliseconds) the GUI is updated from the background process.

    • When to Modify: It’s unlikely this would need to be modified, but this could be tweaked to change the frequency of update if there are performance or latency concerns

  • settings.default-verbosity: (optional - default is Standard)

    • Use: Controls the default level of verbosity in the pytest output.

    • When to Modify: Change this if you want to increase or decrease the default level of verbosity. Please note that once the user changes this setting, the GUI will use the user’s last selection over this value.

    • Available Value:

      • Less - corresponds to pytest’s non verbose option

      • Standard - corresponds to pytest’s -v option

      • More - corresponds to pytest’s -vv option
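As an illustration, the verbosity-to-flag mapping above could be expressed in Python as below. The dictionary and function are hypothetical; only the flag mapping itself comes from the list above:

```python
# Assumed mapping from the GUI verbosity names to pytest flags.
VERBOSITY_FLAGS = {
    "Less": [],          # pytest's default, non-verbose output
    "Standard": ["-v"],
    "More": ["-vv"],
}

def pytest_args(verbosity: str, template_dir: str) -> list:
    """Build an illustrative pytest argument list (hypothetical helper)."""
    return ["tests", f"--template-directory={template_dir}"] + VERBOSITY_FLAGS[verbosity]
```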

Enabling Terms and Conditions Acceptance

There may be scenarios where an ONAP Operator wishes to gather end user approval or consent to specific terms prior to allowing the end user to use the validation tool. This could be in the form of an End User License Agreement, Terms and Conditions, or some other artifact.

This is also enabled through the vvp-config.yaml. This is configured by defining a terms section in the file as follows:

terms:
    version: <version of terms being accepted>
    path: <relative path or url to the terms>
    popup-link-text: <message for link to terms>
    popup-msg-text: <message in the main body of the pop-up>
    popup-title: <text displayed in the title bar of pop-up dialog>

When this is enabled, a pop-up will be displayed blocking the users progress until the terms are reviewed (link clicked), and either accepted or declined.

If the user accepts, then their acceptance of that version of the terms will be recorded, and they will not need to re-accept unless the version is changed.

If the user declines, then the validation tool will immediately exit.


The actual terms and conditions will be opened in the user’s default browser. The path can be a local file path or an HTTP URL. If you wish for the content to be rendered in the browser, then it is recommended that the content be stored as HTML.

Preload Generation

Overview

To maximize the value of ONAP, the ONAP Heat requirements ensure that configuration that varies per environment/instantiation is defined as input to the Heat templates via parameters and excluded from .env files. This ensures that when a VNF described by a set of Heat templates is onboarded into SDC for modeling, the model remains generic and reusable across multiple environments.

In order to instantiate a VNF and its VF Modules, ONAP must be able to supply the instance specific configurations when a module is instantiated. The most common method to achieve this is to register a preload for your VF module into SDNC. The preload is a JSON artifact that:

  1. Provides identifiers that map it to a specific VF Module in SDC

  2. Provides values that will be passed into the Heat templates’ parameters

When ONAP instantiates the VF Module, SDNC will find the preload based on the SDC Model Identifiers, and then map the parameter values in the preload to the appropriate parameters in the Heat template.

The validation tool/scripts will generate a “blank” preload template in the requested format (VNF-API or GR-API) that the end user will complete prior to registering the preload to SDNC.

Optionally, the end user can specify a directory containing .env files and other special files (defaults.yaml, CSAR package) that can generate not just blank templates, but fully or partially completed preload templates.

This section will describe how to perform these actions.

Blank Preload Template Generation

As part of normal validation, VVP will generate a blank preload template for the user to complete.

If validating via the command line, then the resulting blank templates will be in the <output-directory>/preloads/<preload-format>/ directory, where output-directory is either the default output directory under ice_validator or a custom directory if the default was overridden by the --output-directory parameter. If --preload-format was specified, then only that format will be produced; otherwise all available formats will be produced.

If validating via the GUI, then a link to the preload templates will be provided at the end of the validation. The preload format will be in the format selected in the GUI’s settings.

Populating the Preload Template

Note

The resulting JSON file(s) should be copied to a new directory to avoid being over-written by subsequent validations.

The preload template will be pre-populated based on the VMs, networks, sub-networks, IPs, availability zones, and parameters in the Heat templates. Every value that must be provided will be prefixed by VALUE FOR: and will include either the parameter name that the value will map to or will describe the value that should be provided if it is not related to a Heat parameter.

Example Partial VNF-API Preload
{
    "availability-zones": [
        {
            "availability-zone": "VALUE FOR: availability_zone_0"
        },
        {
            "availability-zone": "VALUE FOR: availability_zone_1"
        }
    ],
    "vnf-networks": [
        {
            "network-role": "private",
            "network-name": "VALUE FOR: network name for private_net_name"
        },
        {
            "network-role": "oam",
            "network-name": "VALUE FOR: network name for oam_net_id"
        }
    ],
    "vnf-vms": ["OMITTED FOR THIS EXAMPLE"]
}

There are instances where a parameter is defined as a comma-delimited-list in Heat. In these instances, the parameter name will be repeated for each value that needs to be provided. For example, if there are three virtual machines defined in the template and the names are pulled from a parameter called {vm-type}_names, then VALUE FOR: {vm-type}_names will be repeated three times in the template.

Example: comma-delimited-list parameters
{
    "vm-type": "admin",
    "vm-count": 3,
    "vm-names": {
        "vm-name": [
            "VALUE FOR: admin_names",
            "VALUE FOR: admin_names",
            "VALUE FOR: admin_names"
        ]
    }
}
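The repetition shown above can be sketched in Python. The function name is hypothetical; the output structure follows the VNF-API example:

```python
def vm_name_placeholders(vm_type: str, vm_count: int) -> dict:
    """Build the vm-names block of a VNF-API preload, repeating the
    VALUE FOR: placeholder once per VM instance (hypothetical helper;
    structure follows the example above)."""
    return {
        "vm-type": vm_type,
        "vm-count": vm_count,
        "vm-names": {
            "vm-name": [f"VALUE FOR: {vm_type}_names" for _ in range(vm_count)]
        },
    }
```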

Special Values

Some values are not supplied to the Heat template directly, but instead are used either to map the preload template to the SDC model or to provide instance-specific identifiers such as a module or VNF name. These will still be prefixed with VALUE FOR:.

Note

Refer to the section below on Preload Template Population for alternate ways to populate this information.

The values are:

VNF Name (VNF-API: vnf-name; GR-API: vnf-name)
                      Name of the VNF as it will appear in A&AI. This is
                      user defined and does not need to match a value in
                      SDC.

Virtual Function Instance (VNF-API: generic-vnf-type; GR-API: vnf-type)
                      Maps the preload to a specific instance of the VF
                      model from SDC. This field must be in the format
                      <Service Name>/<VF Instance Name> and must match SDC.

VF Module Model (VNF-API: vnf-type; GR-API: vf-module-type)
                      Maps the preload to the vfModuleModelName from the
                      VSP CSAR or the value from SDC.

Preload Template Population

Basic Usage - Single Environment

VVP can also generate fully or partially populated preload templates if an optional environment directory is provided as a source for the parameter values.

When using the command line, the environment directory is provided using the --env-directory parameter.

When using the GUI, the environment directory is provided by first selecting the Create Preloads from Env Files option in Settings and then providing the directory in the Env Files field.

A template for the environment directory is created in the output directory under preloads/<preload-format>/preload_env. There will be an *.env file for every VF module in the Heat template and a defaults.yaml. This directory can be copied and updated to serve as a data source for populating the preload templates. Every value that needs to be updated will be set to CHANGEME.

Example Partial env file (base.env)
parameters:
  ctrl_net_id: CHANGEME
  ctrl_subnet_id: CHANGEME
  db_ha_floating_ip: CHANGEME
  db_ha_floating_v6_ip: CHANGEME
  svc_flavor_name: svc_flavor
  svc_image_name: svc_image
  db_name_0: CHANGEME

Each environment directory consists of three file types:

  1. Environment files (.env) - One file per VF module with the base name of the env file matching the base name of the heat template it corresponds to.

  2. defaults.yaml - Values specified in here will be used in all modules. This is a useful place to put values that will be the same in every VF module. It is important to note that the environment files take precedence so if you specify a value in defaults.yaml, then that value should be removed from the environment files.

  3. VSP CSAR - Optionally, the CSAR can be downloaded from the SDC Artifacts section and put into the directory. If present, the SDC Model Identifiers will be pulled from the CSAR instead of being manually specified.

After template processing completes, the blank preload templates will still be generated to the output directory. The populated templates will be generated in the <env-directory>/preloads/<preload-format> directory. If a template was not fully populated, then it will be suffixed with _incomplete (ex: base_incomplete.json).

Advanced Usage - Multiple Environments

Using a single environment directory is the most basic use case, but you may need to generate preloads for multiple environments where some values do not change by environment while others do.

To enable this capability, environment directories can be nested and inherit values from their parent directories. The parent directory’s environment files and defaults.yaml files should contain the default values for every environment.

These can be overridden in the child environment directories. If you want to force each environment to provide a value, then do not specify that value in any parent directory.

Example Nested Environment Directory
env_directory/
|--- vsp_name.csar          <-- Global CSAR for all environments (assumes shared SDC instance)
|--- defaults.yaml          <-- Global defaults for all env and modules
|--- base.env               <-- Global defaults for base modules
|--- mod_one.env            <-- Global defaults for mod_one modules
|--- env_one/
|    |--- defaults.yaml     <-- env_one specific defaults for all modules
|    |--- base.env          <-- env_one values for the base module
|    |--- mod_one.env       <-- env_one values for mod_one
|--- env_two/
     |--- defaults.yaml     <-- env_two specific defaults for all modules
     |--- base.env          <-- env_two values for the base module
     |--- mod_one.env       <-- env_two values for mod_one

Preload templates will be generated in the leaf directories of the environment directory. In this example, preload templates will be generated in env_one and env_two, but not env_directory.

The value supplied to the preload follows this order of precedence, with the value looked up from each source in turn and the search halting once the value is found:

  1. Corresponding .env file for the module in the environment specific directory (ex: env_one/base.env)

  2. The defaults.yaml file in the environment specific directory (ex: env_one/defaults.yaml)

  3. The CSAR file in the environment-specific directory (NOTE: only the special values are looked up from the CSAR)

  4. If the value is not found in the environment-specific directory, then the parent directory will be searched using the same precedence (1-3).

This lookup chain continues until the root environment directory is reached. If no value is found, then the template will revert to the value from the blank template (i.e. VALUE FOR: {parameter-name}).
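The precedence rules above can be sketched roughly as follows. This is an illustration of the lookup order only, not VVP's actual code: the function name is made up, the CSAR lookup (step 3) is omitted, and it assumes .env files nest values under a top-level parameters key as shown in the earlier example.

```python
# Illustrative sketch only; VVP's real implementation differs.
import os

import yaml


def lookup_param(param, module, env_dir, root_dir):
    """Resolve a parameter by walking from env_dir up to root_dir.

    At each level, the module's .env file is checked before defaults.yaml,
    matching the precedence described above.
    """
    current = env_dir
    while True:
        for candidate in ("{}.env".format(module), "defaults.yaml"):
            path = os.path.join(current, candidate)
            if os.path.isfile(path):
                with open(path) as fh:
                    data = yaml.safe_load(fh) or {}
                # .env files nest values under a "parameters" key
                params = data.get("parameters", data)
                if param in params:
                    return params[param]
        # A CSAR lookup for the special values would occur here (step 3)
        if os.path.samefile(current, root_dir):
            return None  # caller falls back to VALUE FOR: {parameter-name}
        current = os.path.dirname(current)
```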

The environment directories can be nested as deeply as needed to match your needs. For example, you could have a global directory, production and test environments, and then multiple sub-environments in each test and production directory.

Example:

Example Nested Environment Directory
env_directory/
|--- vsp_name.csar
|--- defaults.yaml
|--- base.env
|--- mod_one.env
|--- production/
|    |--- defaults.yaml
|    |--- base.env
|    |--- mod_one.env
|    |--- prod_one/
|    |   |--- defaults.yaml
|    |   |--- base.env
|    |   |--- mod_one.env
|    |--- prod_two/
|        |--- defaults.yaml
|        |--- base.env
|        |--- mod_one.env
|--- test/
     |--- defaults.yaml
     |--- base.env
     |--- mod_one.env
     |--- test_one/
     |   |--- defaults.yaml
     |   |--- base.env
     |   |--- mod_one.env
     |--- test_two/
         |--- defaults.yaml
         |--- base.env
         |--- mod_one.env

Alternate Method for Specifying Special Values

If you wish to populate the special values without providing a CSAR, then add the following parameters and values to either your defaults.yaml file or the appropriate .env file.

Registering Preload with SDNC

At the time of this writing, the ONAP documentation does not provide a good source of information on how to provide preloads to SDNC. The known options are to use a REST POST call to the following APIs, based on format:

Note

Host and port numbers may vary in your ONAP environment

The ONAP-CLI project could also be used, but it only supports the VNF-API as of the writing of this document.
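As an illustration, the REST POST can be issued with any HTTP client. The sketch below builds such a request; the RESTCONF operation names, port, and credentials are assumptions based on common SDNC deployments and must be verified against your ONAP environment before use.

```python
# Assumption: standard SDNC RESTCONF operation paths; verify against your
# ONAP deployment before use.
import base64
import json
import urllib.request

OPERATIONS = {
    "VNF-API": "VNF-API:preload-vnf-topology-operation",
    "GR-API": "GENERIC-RESOURCE-API:preload-vnf-topology-operation",
}


def build_preload_request(host, port, payload, api="VNF-API",
                          user="admin", password="CHANGEME"):
    """Build (but do not send) the RESTCONF POST request for a preload."""
    url = "https://{}:{}/restconf/operations/{}".format(
        host, port, OPERATIONS[api]
    )
    token = base64.b64encode(
        "{}:{}".format(user, password).encode()
    ).decode()
    return urllib.request.Request(
        url,
        data=json.dumps(payload).encode(),
        headers={
            "Content-Type": "application/json",
            "Authorization": "Basic " + token,
        },
        method="POST",
    )


# Sending is then: urllib.request.urlopen(build_preload_request(...))
```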

Customizing Preload Generation

VVP’s preload generation capability leverages a plugin mechanism to enable additional preload formats to be added. This can be useful if you define an intermediary format such as a spreadsheet to capture the preload information.

Preload plugins are discovered using the following method.

  1. The sys.path is scanned to find any top level modules that begin with preload_

  2. If found, then any implementations of AbstractPreloadGenerator are registered as available formats

  3. You can selectively disable formats by specifying them as excluded in the vvp-config.yaml file.

Please refer to the preload_vnfapi.VnfApiPreloadGenerator and preload_grapi.GrApiPreloadGenerator for examples of how to implement a generator.
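The discovery steps can be sketched roughly as follows. This is a simplification with a made-up function name; it assumes only the preload_ module prefix and AbstractPreloadGenerator base class described above, and omits the vvp-config.yaml exclusion step.

```python
# Rough sketch of plugin discovery; not VVP's actual implementation.
import importlib
import inspect
import pkgutil


def discover_preload_generators(base_class):
    """Return generator classes found in top-level preload_* modules."""
    generators = []
    for module_info in pkgutil.iter_modules():  # step 1: scan sys.path
        if not module_info.name.startswith("preload_"):
            continue
        try:
            module = importlib.import_module(module_info.name)
        except ImportError:
            continue
        for _, obj in inspect.getmembers(module, inspect.isclass):
            if issubclass(obj, base_class) and obj is not base_class:
                generators.append(obj)  # step 2: register the format
    return generators
```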

How to Contribute

Overview

This section will provide details on how to contribute changes to the project covering both the mechanics of how to contribute new code as well as how to adhere to the code quality and coding practices of the project.

Prerequisites

As this project is part of the overall ONAP project, there are some common guidelines and activities you will need to adhere to:

Other useful links:

  • All work is documented and tracked via the VVP Project in the ONAP JIRA instance. Login is via your Linux Foundation ID

  • Proposals for new features, general information about the projects, meeting minutes, and ONAP process information is located on the ONAP Wiki

  • The VVP project hosts a weekly meeting to plan upcoming work, discuss open issues, and align on priorities. Please consider attending if possible if you intend to contribute to the project. Refer to the ONAP Calendar https://wiki.onap.org/pages/viewpage.action?pageId=6587439 for scheduling details

Objective

The primary focus of VVP is ensuring that a VNF that is described using OpenStack Heat complies with the ONAP Heat requirements specified in the VNF Requirements (VNFRQTS) project. If a VNF does not comply with these rules, then it may fail to be modeled in SDC, fail to instantiate, be improperly inventoried in A&AI, or fail orchestration.

The project aims to validate every mandatory requirement in the VNF Requirements related to Heat (i.e. all requirements with a MUST or MUST NOT keyword).

Heat templates are validated using tests written in pytest. Each test will validate one or more requirements. Typically we strive to have 1 test per requirement, but there are situations where it is easiest and clearest to validate multiple, tightly related requirements with a single test.

Every test MUST have a corresponding requirement in the ONAP VNF Requirements project. If your contribution is a test and there is not an appropriate requirement, then please consider making a contribution to that project first.

Writing Tests

Coding Conventions

  • Follow PEP-8 conventions
    • NOTE: The only variation is that the line-length can be 88 characters vs. 80

  • All code must be formatted using the Black code formatter. VVP uses the pre-commit library to automatically format code at check-in. After running pip install, run pre-commit install to initialize the git hook.

  • Familiarize yourself with the utilities that exist in the following utility modules and leverage them to avoid duplication.

    • ice_validator/tests/helpers.py

    • ice_validator/tests/structures.py

    • ice_validator/tests/utils/**

  • Ensure all source files include the standard Apache License 2.0 text and appropriate copyright statement (see other source files for example)

  • All code must pass standard quality checks prior to submission which can be executed via tox or by running checks.py

  • When parsing YAML, always use the tests/cached_yaml module versus the default yaml module. This will greatly improve performance due to the large number of yaml files that must be parsed.

  • In an effort to keep the number of dependencies down, please favor using the Python standard library unless using an external library significantly improves or reduces the needed code to implement the functionality.

  • For security purposes the following hardening activities are used:

    • Avoid usage of yaml.load and always use yaml.safe_load (Note: if you use cached_yaml as instructed above, then this is covered automatically)

    • Docker containers must not be run as root (see current Dockerfile for an example)

    • Inspect and resolve all findings by the bandit scans

File Name

Test files are written in Python, and should go into the /validation-scripts/ice_validator/tests/ directory. They should be prefixed with test_. If not, pytest will not discover your test. The file name should reflect what is being tested.

Test Name

Tests are functions defined in the test file, and also must be prefixed with test_. If not, pytest will not collect them during execution. For example:

test_my_new_requirement_file.py

def test_my_new_requirement():

Requirement Decorator

Each test function must be decorated with a requirement ID from the VNF Requirements project. The following is required to be imported at the top of the test file:

from tests.helpers import validates

Then, your test function should be decorated like this:

@validates("R-123456",
           "R-123457") # these requirement IDs should come from the VNFRQTS project
def test_my_new_requirement():

This decorator is used at the end of the test suite execution to generate a report that includes the requirements that were violated. If a test is not decorated it is unclear what the reason for a failure is, and the implication is that the test is not needed.

The validation reports will show the requirement text that was violated and it will be pulled from the heat_requirements.json file. This file is published by the VNFRQTS project, and VVP maintains a copy of the file. Your requirement should be present in this file. The update_reqs.py command can be used to re-synchronize the VVP copy with VNFRQTS master.

Test Parameters

There are several dynamic fixtures that can be injected into a test based on what the test is attempting to validate. Each test should be parameterized based on what artifact is being validated.

Available parameters are enumerated in /validation-scripts/ice_validator/tests/parameterizers.py. Below is a description of the most commonly used:

  • heat_template: parameter is the full path name for a file with the extension .yaml or .yml, if the file also has a corresponding file with the same name but the extension .env.

  • yaml_file: parameter is the full path name for a file with the extension .yaml or .yml.

  • yaml_files: parameter is a list of all files with the extension .yaml or .yml.

  • volume_template: parameter is the full path name for a file whose name ends with _volume and has the extension .yaml or .yml.

There are many others that can also be used, check parameterizers.py for the full list.

The parameter that you decide to use determines how many times a test is executed, and what data is available to validate. For example, if the test suite is executed against a directory with 10 .yaml files, and a test is using the parameter yaml_file, the test will be executed once for each file, for a total of 10 executions. If the parameter yaml_files (note the plural) is used instead, the test will only execute once.

Here’s an example for how to parameterize a test:

@validates("R-123456",
           "R-123457")
def test_my_new_requirement(yaml_file): # this test will execute for each .yaml or .yml

Collecting Failures

To raise a violation to pytest to be collected and included on the final violation report, use the assert statement. Example:

@validates("R-123456",
           "R-123457")
def test_my_new_requirement(yaml_file):
  # my test logic
  ...
  ...
  ...

  assert not failure_condition, error_message

As one of the VVP priorities is User Comprehension, the error_message should be readable and include helpful information for triaging the failure, such as the yaml_file, the parameter the test was checking, etc…

If the assert statement fails, the failure is collected by pytest, and the decorated requirements and error_message are included in the final report.

Optional: Pytest Markers and Validation Categories

The VVP test suite has the concept of a base test. These are used as sanity tests and are executed before the other tests, and if they fail the test suite execution is halted. A test should be annotated with base if the failure is likely to generate many subsequent failures (ex: improperly formatted YAML). If you are writing a base test, mark your test like this:

import pytest

@pytest.mark.base # this is the base test marker
@validates("R-123456")
def test_my_new_requirement():

The VVP test suite also has the concept of a category to define what additional set of optional tests to execute when requested by the end user. The way it works is by applying the categories decorator to the test.

By default, all base tests and tests with no category are executed. If you want an additional category to run, pass the command line argument:

--category=<category>

This will extend the default set of tests to also include tests marked with the requested category like the following:

import pytest

@categories("<category>") # substitute <category> with the category name
@validates("R-123456")
def test_my_new_requirement():

This should be used sparingly, and in practice consider reviewing a requirement with the VNF Requirements team before adding a test to a category.

Testing your Test

Every Heat validation test must have a unit test that validates the test is working as expected. This is handled by creating one or more “fixtures” that will exercise the test and validate the expected result.

The fixtures are stored in the ice_validator/tests/fixtures directory under a directory that matches the test file name exactly.

For example, if your test is named test_neutron_ports.py, then the test fixtures must be in the ice_validator/tests/fixtures/test_neutron_ports/ directory.

At minimum, each test must have one example of heat templates/files that pass (stored in the pass subdirectory), and one example that fails (stored in the fail subdirectory). These templates do not need to be complete, valid Heat templates - they only need to include the minimum content to validate the test.

If you need to test multiple conditions or branches of your test, then you can nest other directories under your test’s fixture directory. Each nested directory must, in turn, have a pass and fail subdirectory.

ice_validator/
|--- tests/
     |--- fixtures/
          |--- test_neutron_ports/
               |--- scenario_one/
               |    |--- pass/
               |    |--- fail/
               |--- scenario_two/
                    |--- pass/
                    |--- fail/
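For example, the minimal pass/fail fixture layout for a new test (without scenario subdirectories) can be created from the ice_validator directory like this:

```shell
mkdir -p tests/fixtures/test_neutron_ports/pass
mkdir -p tests/fixtures/test_neutron_ports/fail
```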

To execute all tests for the entire suite, issue the following command from the ice_validator directory:

pytest --self-test

If you wish to selectively execute your test against one of the fixtures, then issue the following command from the ice_validator directory:

pytest tests/<test_file>.py --template-directory=tests/fixtures/<test_file>/<scenario>

If you have contributed code outside of a tests_*.py file, then you should create suitable tests for that functionality in the app_tests directory. The tests should be compatible with pytest, but these tests do not use the fixtures mechanism.

Submitting Your Changes For Review

Once you have completed your changes and tested that they work as expected, the next step is to validate they are ready for submission. The checks.py module in the root directory contains a variety of code quality checks that the build server will execute. These can be executed locally using tox or by simply running checks.py.

At the time of this writing, the following checks will be performed:

  • Executing the full test suite (app_tests and --self-test)

  • flake8 code style validation

  • Ensuring the heat_requirements.json file is up-to-date with VNFRQTS (run update_reqs.py if this check fails)

  • Ensures all mandatory tests from VNFRQTS have tests in VVP

  • Security checks via bandit

Once all tests are passed, then refer to Pushing Changes Using Git for details on how to submit your change.

Once your change has been submitted, please add the following individuals as reviewers at minimum:

  • Steven Stark

  • Trevor Lovett

  • Steven Wright

VVP Project Release Notes

Version: 8.0.0

Release Date

2020-03-23

Removed Features - None

New Features

  • Performance improvements for test-engine. This reduces runtime for OVP VNF Life-Cycle validation tests. (VVP-503 VVP-504)

Bug Fixes

  • None

Known Issues

  • None

Security Notes

VVP code has been formally scanned during build time using NexusIQ and no Critical vulnerability was found.

VVP code also is passing the mandatory test coverage percentage (55%).

Additionally, all VVP code is still scanned using the Bandit library. All potential issues reported by this scanning process have been addressed or marked as non-issues using the # nosec marker in the source code.

Quick Links:

Version: 7.0.0

Release Date

2020-11-18

Removed Features - None

New Features

  • Added additional test for resource group parameters. (VVP-438)

  • Updated validation for R-610030: An incremental module must have a server or volume. (VVP-451)

  • Various enhancements for performance and stability of onap-client. (VVP-487)

Bug Fixes

  • Fixed the error message returned for test_get_attr_usage.py. (VVP-420)

  • Fixed false positives when testing port resource IDs. (VVP-346)

  • Various fixes for preload generation. (VVP-440)

Known Issues

  • None

Security Notes

VVP code has been formally scanned during build time using NexusIQ and no Critical vulnerability was found.

VVP code also is passing the mandatory test coverage percentage (55%).

Additionally, all VVP code is still scanned using the Bandit library. All potential issues reported by this scanning process have been addressed or marked as non-issues using the # nosec marker in the source code.

Quick Links:

Version: 6.0.0

Release Date

2020-05-14

Removed Features - None

New Features

  • Added plugin capability to preload template generation. End users can now create their own python plugins for VVP to generate preload templates in the method and format of their choosing. (VVP-339)

  • The vvp/test-engine repository has been revived and repurposed, and is now used to maintain the code for the following two enhancements:

    • Enhanced OVP VNF Heat validation to execute stand-alone, and moved into the VVP test-engine repository. (VVP-381)

    • Created onap-client python api client to interact with various ONAP applications. (VVP-381)

  • Added validation test for new VNF Heat Template requirement R-55307 . (VVP-354)

  • Enhanced validation for nested resources R-17528. (VVP-357)

  • Updated test_02_no_duplicate_keys_in_file to check environment files for duplicate keys. (VVP-284)

  • Enhanced validation for R-90279 based on updated VNF Heat Template requirements. (VVP-360)

  • Enhanced resiliency of preload template generation to support more general case VNF Heat Templates. (VVP-335)

Bug Fixes

  • Resolved false negatives for internal network floating IP parameter format checks. (VVP-340)

  • Resolved false negatives checking required sections of a VNF Base Template module. (VVP-365)

Known Issues

  • None

Security Notes

VVP code has been formally scanned during build time using NexusIQ and no Critical vulnerability was found.

VVP code also is passing the mandatory test coverage percentage (55%).

Additionally, all VVP code is still scanned using the Bandit library. All potential issues reported by this scanning process have been addressed or marked as non-issues using the # nosec marker in the source code.

Quick Links:

Version: 5.0.1

Release Date

2019-09-30

Removed Features - None

New Features

  • OpenStack Heat Validation - VVP now includes the latest version of OpenStack Heat, and can be used to validate that the Heat not only complies with ONAP rules, but is also valid Heat (similar to stack-validate) (VVP-218)

  • Preload Template Generation - VVP will now create preload templates based on the Heat template being validated. The user can optionally populate the template by specifying environment files (VVP-227, VVP-277)

  • Added checks.py to consolidate various quality checks that can now be performed cross-platform and consistently between the local and build environment.

    • Added quality check to ensure VVP includes the latest version of Heat requirements from the VVP project

    • All code is now scanned for security issues using the Bandit library (VVP-244)

  • Updated Availability Zone tests to align with latest VNF Requirements (VVP-226)

  • Performance enhancements - improved performance of validation of large templates by 30-70% (VVP-225)

  • VVP GUI can be customized to display configurable disclaimer text, and also allow the acceptance of terms-and-condition, or other legal agreements before allowing the user to use the tool (VVP-195)

  • Enhanced report readability by removing unnecessary columns and other enhancements (VVP-184)

  • Removed dependency on yamllint library to remove dependency on L/GPL code (VVP-201)

  • Allow error messages with line breaks (VVP-225)

  • Various enhancements to remove redundant tests or improve error messages

Bug Fixes

  • Fixed errors in test_environment_file_parameters where wrong variables were being checked (VVP-267)

  • VVP GUI fails to open reports when tools is launched from a network share (VVP-266)

  • Escape error messages before display in HTML report (VVP-159)

  • Improved error message when Heat archives included nested directories which are not allowed (VVP-217)

  • Relaxed validation of get_param usage to better comply with SDC implementation (VVP-220)

Known Issues

  • None

Security Notes

VVP code has been formally scanned during build time using NexusIQ and no Critical vulnerability was found.

Additionally, all VVP code is now scanned using the Bandit library. All potential issues reported by this scanning process have been addressed or marked as non-issues using the # nosec marker in the source code.

Quick Links:

Version: 4.0.0

Release Date

2019-05-10

Removed Features

  • The VVP web application has been deprecated and is no longer supported as of the Dublin release. The validation scripts continue to be supported and enhanced, but contributions to the web-related repositories are now locked and VVP will no longer be supported for deployment via ONAP Operations Manager (OOM).

    The following repositories are now locked as of this release:

    • vvp/ansible-ice-bootstrap

    • vvp/cms

    • vvp/devkit

    • vvp/engagementmgr

    • vvp/gitlab

    • vvp/image-scanner

    • vvp/jenkins

    • vvp/portal

    • vvp/postgresql

    • vvp/test-engine

New Features

  • A new GUI application has been contributed and can be used to execute validations in a user-friendly way without using complex command line options.

  • VVP is now packaged as a Docker container eliminating the need to run the application from source code. See the Docker Execution instructions for more details.

  • VVP Validation Scripts now cover all mandatory, testable HOT requirements from VNFRQTS

Known Issues

  • None

Security Notes

VVP code has been formally scanned during build time using NexusIQ and no Critical vulnerability was found.

Quick Links:

Version: 3.0.0

Release Date

2018-11-30

New Features

  • Created mapping of validation scripts to VNF Guidelines

  • Increase validation script test coverage

  • Created HTML report generation in validation scripts repository

Security Notes

VVP code has been formally scanned during build time using NexusIQ and no Critical vulnerability was found.

Quick Links:

Version: 2.0.0

Release Date

2018-06-07

New Features

  • Initial release of VNF Validation Program (VVP) for Open Network Automation Platform (ONAP).

  • This initial release is based on seed documents that came from Open-O and Open ECOMP.

  • This release provides a process to allow VNFs to be incubated and validated against the ONAP Heat Requirements.

Bug Fixes - None

Known Issues

  • As of now, the VVP Project has been created to check the validity of VNFs using Heat Orchestration Templates.

  • Only deployable using OOM; it will become a standalone toolkit in the future.

  • UWSGI webserver dependencies.

Security Notes

VVP code has been formally scanned during build time using NexusIQ and no Critical vulnerability was found.

Quick Links:

Upgrade Notes

  • Initial release - none

Deprecation Notes

  • Initial release - none

Other

NA


End of Release Notes