Redgate Test Data Manager

Data Generation command-line reference

This section is a reference sheet for the commands provided by the data generator CLI tool.

Exit codes

The data generator CLI will return the following exit codes when running commands.

Anything other than SUCCESS (0) will indicate a negative outcome.

Exit code | Designation | Outcome status | When
0 | Success | OK | Used when a data generator command completed successfully.
1 | GenericFailure | FAILURE | Catch-all for general errors that don't have special meaning.
2 | UnhandledException | FAILURE | Used when no other error code is suitable. It's the default error code.
3 | FailedInitialization | FAILURE | Used when initialization of services and tools fails.
4 | CliInvokedIncorrectly | FAILURE | Used when provided with missing or mismatched command line parameters.
5 | InvalidConfiguration | FAILURE | Used when the provided configuration does not meet the data generation requirements.
6 | FailedTableExtraction | FAILURE | Used when the SQL tables to generate into could not be retrieved from the target database.
7 | InsufficientRowsInserted | FAILURE | Used when the data generator couldn't generate as many rows as were requested, due to constraints in the target database.
8 | NoRowsInserted | FAILURE | Used when the data generator couldn't generate any valid rows, due to constraints in the target database.

Please check our Data generation troubleshooting section for some known error use cases (and workarounds).
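Because each failure condition maps to a distinct exit code, scripts and CI pipelines can branch on the result of a run. Below is a minimal sketch, assuming a PowerShell session with the DataGenerator executable in the current directory (the connection string is taken from the examples later on this page):

# Run the generator, then branch on the documented exit codes
.\DataGenerator --database-engine=sqlserver --target-connection-string="server=SomeSqlServer;database=SomeTargetDatabase;trusted_connection=yes;TrustServerCertificate=true" --rows-to-generate 1000

if ($LASTEXITCODE -eq 0) {
    Write-Output "Data generation succeeded"
} elseif ($LASTEXITCODE -eq 7) {
    Write-Output "Fewer rows than requested were inserted (InsufficientRowsInserted)"
} else {
    Write-Error "Data generation failed with exit code $LASTEXITCODE"
}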

CLI logging

In terms of observability, the data generator CLI has two main types of logging:

File
  Location: the default location varies per OS:
    • Windows: C:\ProgramData\Red Gate\Logs\TDM\DataGenerator
    • Linux/Mac OSX (central location): /var/log/Red Gate/Logs/TDM/DataGenerator
    • Linux/Mac OSX (local location): ./Logs
  The default log folder can be overridden on the command line by using the CLI parameter --log-folder to point to a locally accessible location.
  Default minimum log level: DEBUG
  Purpose: logs client-side application-level information about data generation execution and outcome. By default, this includes additional contextual debug execution information. The minimum log level when running commands can be changed with the global --file-log-level parameter (see global flags and parameters).

Console (standard out)
  Location: stdout
  Default minimum log level: INFORMATION
  Purpose: displays non-error client-side application-level information about data generation execution and outcome. This excludes Error and Fatal log events, which are written to the standard error stream instead. The minimum log level when running commands can be changed with the global --console-log-level parameter (see global flags and parameters).

Console (standard error)
  Location: stderr
  Default minimum log level: ERROR (cannot be changed)
  Purpose: displays all Error and above client-side application-level messages for issues during execution of the data generator. This will lead to a non-successful exit code.
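Since Error and Fatal events go to the standard error stream, the two console streams can be captured separately when running unattended. A minimal sketch, assuming a PowerShell session and hypothetical output file names:

# Send normal progress output and error output to separate files
.\DataGenerator --database-engine=sqlserver --target-connection-string="server=SomeSqlServer;database=SomeTargetDatabase;trusted_connection=yes;TrustServerCertificate=true" 1> generation-output.log 2> generation-errors.log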

Log format

The log file name uses the following date-sortable format: DataGenerator-YYYYMMDD.json (e.g. DataGenerator-20230503.json).

The files are written in Compact Log Event Format, a JSON-based format. This means each log message has structured data associated with it. The open source Compact Log Viewer tool is a convenient way to view the logs.

(There are other tools for working with Compact Log Event Format files: see the list on the CLEF webpage.)

Log files are limited to 20 MB each, and the data generator rotates log files daily.

There is no built-in automatic purging of old log files yet, so you may want to keep an eye on the log folder and delete older files from time to time.
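Until automatic purging is available, a scheduled script can remove files that match the log file naming pattern described above. A minimal sketch, assuming PowerShell, the default Windows log location and a hypothetical 30-day retention period:

# Delete data generator log files older than 30 days (the retention period is an example value)
$logFolder = "C:\ProgramData\Red Gate\Logs\TDM\DataGenerator"
Get-ChildItem -Path $logFolder -Filter "DataGenerator-*.json" |
    Where-Object { $_.LastWriteTime -lt (Get-Date).AddDays(-30) } |
    Remove-Item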

Global flags and parameters

While every command has a set of dedicated key-value parameters and flags that drive its behavior, the data generator also has a set of global flags and parameters that can be applied to all commands (or a subset of commands).

--version
  Show version information.
  Example: DataGenerator --version

-?, -h, --help
  Show help and usage information.
  Example: DataGenerator --help

--console-log-level <Debug|Error|Fatal|Information|None|Verbose|Warning>
  Minimum log level for the terminal's standard output and error. [default: Information]
  When running on Docker, it can be useful to increase the console verbosity (e.g. --console-log-level=Debug) so that debug messages go to the console.
  Example: DataGenerator --console-log-level=Warning

--file-log-level <Debug|Error|Fatal|Information|None|Verbose|Warning>
  Minimum log level for the data generator's log file. [default: Debug]
  Example: DataGenerator --file-log-level=Information

--log-folder <log-folder>
  Optional directory path to use when storing log files. If not specified, the default location varies per host operating system (OS) running the tool.
  Example: DataGenerator --log-folder=C:\Users\***\Documents
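Global flags can be combined with any command in a single invocation. For example, a run that raises console verbosity and redirects the log files to a custom folder might look like the sketch below (the folder path is illustrative):

# Increase console verbosity and write log files to a custom folder
.\DataGenerator
--database-engine=sqlserver
--target-connection-string="server=SomeSqlServer;database=SomeTargetDatabase;trusted_connection=yes;TrustServerCertificate=true"
--console-log-level=Debug
--log-folder=C:\Temp\DataGeneratorLogs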

Running the CLI

  1. Follow the data generation installation guide.
  2. Call the data generator CLI with command-line parameters, as shown in the examples below.

Commands

Activating your license

You can use the command line to activate your Test Data Manager license. For details see activating your license.

Running the data generator

Please check the installation requirements and known limitations listed in our troubleshooting guide.

Via the command line parameters

The following parameters configure the data generator:

--database-engine <MySql|Oracle|PostgreSql|SqlServer>
  The type of the database. The data generator currently supports four database engines: MySql, Oracle, PostgreSql and SqlServer.
  Mandatory: YES

--target-connection-string <target-connection-string>
  Connection string for the target database. Must be present if not supplied via the environment variable REDGATE_DATAGENERATOR_TARGET_CONNECTION_STRING (which takes precedence). See also the requirements for the target database on the installation requirements page.
  Mandatory: YES

--rows-to-generate <rows-to-generate>
  The number of rows to generate for each table in the target database. If omitted, it defaults to 1000.
  Mandatory: NO

For example:

# Using Windows authentication against a local SQL Server database
.\DataGenerator
--database-engine=sqlserver
--target-connection-string="server=SomeSqlServer;database=SomeTargetDatabase;trusted_connection=yes;TrustServerCertificate=true"
--rows-to-generate 1000

# Using SQL authentication against a local SQL Server database
.\DataGenerator
--database-engine=sqlserver
--target-connection-string="server=SomeSqlServer;database=SomeTargetDatabase;Uid=sa;Pwd=123;TrustServerCertificate=true"
--rows-to-generate 1000

# Using SQL authentication against a local PostgreSQL database
.\DataGenerator
--database-engine=postgresql
--target-connection-string="Server=127.0.0.1;Port=5432;Database=SomeTargetDatabase;User Id=SomeUser;Password=SomePassword;"
--rows-to-generate 1000
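If you prefer to keep the connection string out of the command line (for example, in a CI pipeline), it can be supplied through the REDGATE_DATAGENERATOR_TARGET_CONNECTION_STRING environment variable instead, as noted in the parameter table above. A minimal sketch, assuming a PowerShell session:

# Supply the connection string via the environment variable instead of --target-connection-string
$env:REDGATE_DATAGENERATOR_TARGET_CONNECTION_STRING = "server=SomeSqlServer;database=SomeTargetDatabase;trusted_connection=yes;TrustServerCertificate=true"
.\DataGenerator
--database-engine=sqlserver
--rows-to-generate 1000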

This is not an exhaustive list of the command-line possibilities. Please run DataGenerator --help to discover the other optional configuration parameters.

