You can run the SQLines Data tool from the command line, and this article describes the command-line parameters. Additionally, you can specify options in the configuration file.
To run SQLines Data from the command line, run ./sqldata on Linux and sqldata.exe on Windows.
The following parameters are available:
| Option | Description | Default Value |
| --- | --- | --- |
| -sd | Source database connection string | Option must be specified, no default value |
| -td | Target database connection string | Option must be specified, no default value |
| -t | List of tables (wildcards *.* are allowed) | No default value |
| -tf | A file with a list of tables | No file used |
| -texcl | Tables to exclude (wildcards *.* are allowed) | No tables excluded |
| -qf | A file with SQL SELECT queries | No file used |
| -cmd | Command to perform | transfer |
| -topt | Transfer options | recreate tables in the target database |
| -vopt | Validation options | rowcount to compare the row count for tables |
| -smap | Schema name mapping | No schema mapping is performed |
| -cmapf | Column name and data type mapping file | sqlines_cmap.txt file |
| -dtmapf | Global data type mapping file | sqlines_dtmap.txt file |
| -tmapf | Table name mapping file | sqlines_tmap.txt file |
| -tself | Table select expressions file | sqlines_tsel.txt file |
| -twheref | File with WHERE conditions | sqlines_twhere.txt file |
| -out | Output directory for logs and reports (created if it does not exist) | The current directory |
| -ss | Number of concurrent sessions | 4 concurrent sessions |
| -log | Log file name | sqldata.log in the output directory |
| -dlog | Log file for executed DDL statements | sqldata_ddl.sql in the output directory |
| -cfg | Configuration file name (a directory can also be specified) | sqldata.cfg in the current directory |
| -? | Print how to use the tool and exit | |
Note. Parameters are case-insensitive, and you can specify them in any order.
-sd and -td options specify connection strings for the source and target database, respectively.
The special value -td=stdout writes the table or query content to the standard output; you can use this feature, for example, to compress the data and send it to HDFS/S3.
For more information about connection string formats, see SQLines Data Connection String Formats.
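For example, a minimal sketch of a transfer invocation (the connection strings below are illustrative placeholders, not exact syntax; see the formats page above):

# connection strings are placeholders; see SQLines Data Connection String Formats
./sqldata -sd=oracle,scott/tiger@orcl -td=sql,sa/pwd@localhost,targetdb -t=scott.*

Similarly, with -td=stdout you can pipe the extracted data to another tool, for example to compress it before uploading to HDFS/S3:

./sqldata -sd=oracle,scott/tiger@orcl -td=stdout -t=scott.orders | gzip > scott_orders.gz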
-t option specifies tables to transfer or validate.
In the -t option you can specify a single table or table template, or a comma-separated list of tables and table templates.
For example, -t=cities specifies to process a single table cities, while -t=scott.* specifies to process all tables in the schema scott.
-t=cities, countries specifies to process two tables cities and countries.
-t=* specifies to migrate all tables owned by the user connected to the database, while -t=*.* specifies to migrate all tables from all schemas accessible to the user.
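For instance, a sketch combining a schema wildcard and an individual table in one run (the connection strings are placeholders):

./sqldata -sd=<source> -td=<target> -t=scott.*,hr.employees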
-tf option specifies an ASCII file containing a list of tables to transfer or validate. Note that this option is ignored if -t option is set. The file must contain one table name (schema.table or table) per line.
-tf=c:\project\tables.txt specifies to read the list of tables from the c:\project\tables.txt file. If the directory is not specified, the current directory is used.
You can use -- to comment a line in the file. An example of a file containing a list of tables:
-- Comment line
contacts
employees
dept
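A sketch of using such a file (the connection strings are placeholders, and tables.txt is assumed to be in the current directory):

./sqldata -sd=<source> -td=<target> -tf=tables.txt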
-texcl option specifies which tables to exclude from transfer or validation. Sometimes it is easier to specify which tables you do not need to migrate, rather than list all tables you need to migrate.
In the -texcl option you can specify a single table or table template, or a comma-separated list of tables and table templates.
For example, -t=* -texcl=temp* specifies to migrate all tables except those whose names start with temp.
-qf option allows you to specify one or more queries to extract data from the source database. If multiple queries are defined, they will be executed in parallel (the number of concurrent sessions is defined by -ss option).
-- Syntax:
-- target_table1, query1; (use ; to terminate the query, you can use newlines and tabs)
-- target_table2, query2;
-- ...
ORDERS, SELECT * FROM orders WHERE created_dt >= CURRENT_DATE;
For an example, see sqlines_qf.txt located in the tool directory.
Note: -qf option is ignored if either -t or -tf option is specified.
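As an illustrative sketch, a queries file with two hypothetical queries (the table and column names are invented) that can be executed in parallel sessions:

-- queries.txt (hypothetical example)
ORDERS_2023, SELECT * FROM orders WHERE created_dt >= DATE '2023-01-01';
ACTIVE_CUSTOMERS, SELECT id, name FROM customers WHERE status = 'ACTIVE';

./sqldata -sd=<source> -td=<target> -qf=queries.txt -ss=2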
-cmd option specifies which command(s) to perform: transfer, validate or assess. For example, to perform transfer specify:
-cmd=transfer
-topt option specifies how to handle tables during the data transfer command (see -cmd option): create, recreate, truncate or none:
| Value | Description |
| --- | --- |
| create | Create tables in the target database if they do not exist, otherwise load data into the existing tables |
| recreate | Drop tables if they exist, and use the source metadata to create tables in the target database before loading the data (this is the default) |
| truncate | Truncate tables (perform the TRUNCATE TABLE command) to remove data from the existing tables before the data transfer |
| none | Just start the data transfer into existing tables |
Please note that when truncate or none is set the tool does not create tables, table constraints and indexes in the target database.
You can set multiple values separated by a comma. For example, to create tables if they do not exist in the target database, or truncate them if they already exist, specify:
-topt=create,truncate
none option is similar to create option, except that SQLines Data does not check whether a table exists before the data transfer, which can increase the performance of the transfer operation.
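Put together, a sketch of a transfer run that loads into existing tables without any table checks (the connection strings are placeholders):

./sqldata -sd=<source> -td=<target> -t=scott.* -cmd=transfer -topt=none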
-vopt option specifies validation options for the validate command (see -cmd option): rowcount and rows:
| Value | Description |
| --- | --- |
| rowcount | Compare the count of rows in the source and target tables (this is the default) |
| rows | Compare data in rows ordered by a primary key, unique constraint or all columns |
For example, to compare rows specify:
-cmd=validate -vopt=rows
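A sketch of a complete validation run (the connection strings are placeholders):

./sqldata -sd=<source> -td=<target> -t=scott.* -cmd=validate -vopt=rows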
-smap option specifies the schema name mapping. For example, if you are migrating all tables from the scott schema in Oracle and want to move them to the dbo schema in SQL Server, specify -smap=scott:dbo
-smap option allows you to map multiple schemas; for example, to map the scott schema to dbo and hr to humanresources, specify -smap=scott:dbo,hr:humanresources
If you want to skip the schema name, leave the second part empty. For example, -smap=scott: specifies to move tables from the scott schema to the default schema in the target database.
-smap=*: specifies to move all selected objects from any source schema to the default schema in the target database.
-smap=*:schema or just -smap=schema specifies to move all selected objects from any source schema to the schema in the target database.
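For instance, a sketch that migrates all tables of the scott schema into dbo (the connection strings are placeholders):

./sqldata -sd=<source> -td=<target> -t=scott.* -smap=scott:dbo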
-cmapf option specifies an ASCII file containing a column name and data type mapping. The default mapping file is sqlines_cmap.txt located in the current directory.
You can use -- to comment a line in the file. For example, to rename column DESCRIPTION in SALES.CONTACTS table to DESC_NOTE with target data type VARCHAR(300) specify:
-- schema.table, source_column, target_column, target_datatype
SALES.CONTACTS, DESCRIPTION, DESC_NOTE, VARCHAR(300)
You can also specify *, *.*, schema.* or *.table as the object name to denote all matching objects in a single mapping:
-- Rename column UPDATED in all tables of SALES schema
SALES.*, UPDATED, UPDATED_DT, DATETIME
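Assuming the -cmapf option accepts a full path like the other file options, a sketch of pointing the tool at a mapping file outside the current directory (the path is hypothetical):

./sqldata -sd=<source> -td=<target> -t=SALES.* -cmapf=c:\project\sqlines_cmap.txt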
-dtmapf option allows you to redefine the global data type mapping that is used for all tables. The default mapping file is sqlines_dtmap.txt located in the current directory.
You can use -- to comment a line in the file. For example, to map NUMERIC(n, 0) data type to INTEGER specify:
-- source_datatype(length, scale), target_datatype
NUMERIC(*, 0), INTEGER
Note that you can redefine the data type mapping for each column using -cmapf option.
-tmapf option allows you to map table names between the source and target databases. The default table name mapping file is sqlines_tmap.txt located in the current directory.
You can use -- to comment a line in the file. For example:
-- Syntax (one mapping per line):
-- schema.table, target_schema.target_table
SALES.CONTACTS, CRM.SALES_CONTACTS
-tself option allows you to define a file with the SELECT expressions for extracting data from source tables. Using this option you can change the order of columns, exclude some columns, add new columns based on specified SQL expressions and so on.
For example:
-- Syntax:
-- source_table, colexpr1, ...; (use ; to terminate the list, you can use newlines and tabs in expression items)
SALES.CONTACTS, NAME, SUBSTR(CREATED_DT, 1, 10) AS CREATED_DT, 'HQ OFFICE' AS OFFICE;
SALES.ORDERS, *, 'NA REGION' AS REGION; -- adds a new column with computed value
The default SELECT expressions file is sqlines_tsel.txt located in the current directory.
-twheref option allows you to define a file with the WHERE conditions for extracting data from source tables. Using this option you can specify row filters for specific tables.
For example:
-- Syntax:
-- source_table1, condition1; (do not specify the WHERE keyword, use ; to terminate the WHERE clause)
SALES.ORDERS, created_dt >= TO_DATE('31-DEC-2022', 'DD-MON-YYYY');
The default WHERE conditions file is sqlines_twhere.txt located in the current directory.
-log option specifies the location and name of the log file. The default log file is sqldata.log located in the directory defined by -out option, or the current directory if -out option is not set.
If you specify a path and file name, the log file is stored at the specified location, and -out option has no effect.
Note that if you specify both directory and file name in -log option, the directory must exist. If you want to automatically create the directory use -out option to specify the directory and -log option to specify the file name only.
Example:
-log=sqldata_transfer1.log
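For instance, a sketch that lets the tool create the log directory automatically by combining -out and -log (the connection strings and paths are hypothetical):

./sqldata -sd=<source> -td=<target> -t=scott.* -out=c:\project\logs -log=sqldata_transfer1.log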
-dlog option specifies the location and name of the log file for executed DDL statements. The default DDL log file is sqldata_ddl.sql located in the directory defined by -out option, or the current directory if -out option is not set.
If you specify a path and file name, the DDL log file is stored at the specified location, and -out option has no effect.
Note that if you specify both directory and file name in -dlog option, the directory must exist. If you want to automatically create the directory use -out option to specify the directory and -dlog option to specify the file name only.
Example:
-dlog=sqldata_transfer_ddl.sql