From b14b74c743dba7a1280f7581a76aabcd6e1f5c36 Mon Sep 17 00:00:00 2001
From: sfc-gh-ibarquero <127419946+sfc-gh-ibarquero@users.noreply.github.com>
Date: Wed, 28 Feb 2024 09:53:50 -0600
Subject: [PATCH] [Oracle] Add output messages for Oracle during extraction (#27)

* Flag version added
* Oracle paths fixed
* "Mobilize" removed
---
 BigQuery/README.md             |  4 +-
 BigQuery/bin/create_ddls.sh    | 10 ++++
 DB2/README.md                  |  5 +-
 DB2/bin/create_ddls.sh         | 11 ++++
 Hive/README.md                 |  6 ++-
 Hive/exp_ddl.sh                | 10 ++++
 Oracle/README.md               |  6 ++-
 Oracle/bin/create_ddls.sh      | 25 ++++++++-
 Oracle/scripts/create_ddls.sql | 93 +++++++++++++++++++++++++++++++++-
 Redshift/README.md             |  4 +-
 Teradata/README.md             |  4 +-
 Teradata/bin/create_ddls.sh    | 12 +++++
 12 files changed, 179 insertions(+), 11 deletions(-)

diff --git a/BigQuery/README.md b/BigQuery/README.md
index 3fede19..efe22ca 100644
--- a/BigQuery/README.md
+++ b/BigQuery/README.md
@@ -4,7 +4,7 @@ This repository offers a collection of straightforward scripts designed to facil
 ## Version
-Version 1.0 Release 2021-12-02
+Release 2024-02-28
 ## Usage
@@ -35,6 +35,8 @@ The following are the steps to execute the DDL Code Generation. They can be exec
 - Finally, run `create_ddls.sh` to extract the DDLs from BigQuery
 - After a successful run, remove region information from the top line of `create_ddls.sh`.
+3. Run `create_ddls.sh --version` to check the current version of the extraction scripts.
+
 ### DDL Files
 These files will contain the definitions of the objects specified by the file name.
diff --git a/BigQuery/bin/create_ddls.sh b/BigQuery/bin/create_ddls.sh
index 1fb5e11..c128cb9 100644
--- a/BigQuery/bin/create_ddls.sh
+++ b/BigQuery/bin/create_ddls.sh
@@ -1,5 +1,15 @@
 #!/bin/bash
+#This version should match the README.md version. Please update this version on every change request.
+VERSION="Release 2024-02-28"
+
+export versionParam=$1
+
+if [ "$versionParam" = "--version" ]; then
+ echo "You are using the $VERSION of the extraction scripts"
+ exit 1
+fi
+
 REGION='us'
 echo " "
diff --git a/DB2/README.md b/DB2/README.md
index acada83..19f9a47 100644
--- a/DB2/README.md
+++ b/DB2/README.md
@@ -4,8 +4,7 @@ This repository provides some simple scripts to help exporting your DB2 code so
 ## Version
-Version 1.2
-Release 2022-05-27
+Release 2024-02-28
 ## Usage
@@ -25,6 +24,8 @@ That variable will determine if there are any database that you want to exclude
 2 - After modifying, the `create_ddls.sh` file can be run from the command line to execute the extract. The following files will be created in the directory `/object_extracts/DDL`:
+3 - Run `create_ddls.sh --version` to check the current version of the extraction scripts.
+
 ## **For Windows:**
 1 - Modify `create_ddls.ps1` located in the `bin` folder.
diff --git a/DB2/bin/create_ddls.sh b/DB2/bin/create_ddls.sh
index 3ce8201..2197792 100755
--- a/DB2/bin/create_ddls.sh
+++ b/DB2/bin/create_ddls.sh
@@ -1,4 +1,15 @@
 #!/bin/bash
+
+#This version should match the README.md version. Please update this version on every change request.
+VERSION="Release 2024-02-28"
+
+export versionParam=$1
+
+if [ "$versionParam" = "--version" ]; then
+ echo "You are using the $VERSION of the extraction scripts"
+ exit 1
+fi
+
 echo "DB2 DDL Export script"
 echo "Getting list of databases"
 OUTPUTDIR="../object_extracts"
diff --git a/Hive/README.md b/Hive/README.md
index 36fe6bc..1fa8a47 100644
--- a/Hive/README.md
+++ b/Hive/README.md
@@ -3,9 +3,9 @@ This repository provides some simple scripts to help exporting your Hive code so
 it can be migrated to [Snowflake](https://www.snowflake.com/) using [SnowConvert](https://docs.snowconvert.com/snowconvert/apache-hive/introduction)
-## Version 1.1
+## Version
-Release 2021-12-03
+Release 2024-02-28
 ## Usage
@@ -18,6 +18,8 @@ The following are the steps to execute the DDL Code Generation. They can be exec
 2 - After modifying, the `exp_ddl.sh` file can be run from the command line to execute the extract. The following files will be created in the current directory under `ddl_extract`:
+3 - Run `exp_ddl.sh --version` to check the current version of the extraction scripts.
+
 `./exp_ddl.sh`
 ## Reporting issues and feedback
diff --git a/Hive/exp_ddl.sh b/Hive/exp_ddl.sh
index 35c78bb..284acae 100644
--- a/Hive/exp_ddl.sh
+++ b/Hive/exp_ddl.sh
@@ -3,6 +3,16 @@
 #use one of the 2 CLI clients below to connect
 #adjust arguments below to connect to your environment, using username,password or keytab
+#This version should match the README.md version. Please update this version on every change request.
+VERSION="Release 2024-02-28"
+
+export versionParam=$1
+
+if [ "$versionParam" = "--version" ]; then
+ echo "You are using the $VERSION of the extraction scripts"
+ exit 1
+fi
+
 HOST=localhost
 PORT=10000
diff --git a/Oracle/README.md b/Oracle/README.md
index dd62ab2..70fe21a 100644
--- a/Oracle/README.md
+++ b/Oracle/README.md
@@ -4,7 +4,7 @@ This repository offers a collection of straightforward scripts designed to facil
 ## Version
-Release 2024-02-23
+Release 2024-02-28
 ## Prerequisites
@@ -24,7 +24,7 @@ You will need the conection string to your database. You can use a connection st
 For example, if your database is hosted on AWS, you will need to use the connection format provided in this link: [AWS conection string](https://docs.aws.amazon.com/AmazonRDS/latest/UserGuide/USER_ConnectToOracleInstance.SQLPlus.html). Therefore, a valid example connection string would be as follows:
- `TEST_USER@(DESCRIPTION=(ADDRESS=(PROTOCOL=TCP)(HOST=url.amazonaws.com)(PORT=1521))(CONNECT_DATA=(SID=orcl)))`
+ `TEST_USER/PASSWORD@(DESCRIPTION=(ADDRESS=(PROTOCOL=TCP)(HOST=url.amazonaws.com)(PORT=1521))(CONNECT_DATA=(SID=orcl)))`
 The link contains detailed instructions on how to connect to an Oracle instance on AWS using SQL\*Plus or SQLcl. It provides the necessary format for specifying the connection details, such as the hostname, port, and SID. By following the instructions in the link, you will be able to establish a successful connection to your Oracle database hosted on AWS.
@@ -51,6 +51,8 @@ To obtain the necessary files for executing the DLL code generation, follow thes
 7. Extract the contents of the .zip file by right-clicking on it and selecting the "Extract All" or similar option. Choose a destination folder where you want to extract the files.
+8. Run `create_ddls.sh --version` to check the current version of the extraction scripts.
+
 You are now ready to proceed with executing the DLL code generation using the files found in the "bin" and "script" folders.
 In the "bin" folder, you will find the bash scripts for Unix/Linux environments or the batch scripts for Windows. These scripts are designed to facilitate the DLL code generation process, ensuring compatibility across different operating systems:
diff --git a/Oracle/bin/create_ddls.sh b/Oracle/bin/create_ddls.sh
index 9586f11..cd54a34 100644
--- a/Oracle/bin/create_ddls.sh
+++ b/Oracle/bin/create_ddls.sh
@@ -1,6 +1,22 @@
 #!/bin/bash
 #GENERAL INSTRUCTIONS: This script is used to extract object DDL from your Oracle Database. Please adjust the variables below
 # to match your environment. Once completed, your extracted DDL code will be stored in the object_extracts folder.
+
+
+#Version 2024-02-28: Added flag to display version. Update output text with more detailed information about the execution.
+
+#This version should match the README.md version. Please update this version on every change request.
+VERSION="Release 2024-02-28"
+
+export versionParam=$1
+
+if [ "$versionParam" = "--version" ]; then
+ echo "You are using the $VERSION of the extraction scripts"
+ exit 1
+fi
+
+echo "[$(date '+%Y/%m/%d %l:%M:%S%p')] Info: Execute Oracle extraction scripts: Started"
+
 export ORACLE_SID=
 export CONNECT_STRING=system/oracle
 export SCRIPT_PATH=
@@ -19,6 +35,7 @@ if [ ! -e "$SQLCL_PATH" ]; then
 exit 1
 fi
+echo "[$(date '+%Y/%m/%d %l:%M:%S%p')] Info: Step 1/4 - Creating Directories: Started"
 #Path to where object extracts are written
 mkdir -p $OUTPUT_PATH/object_extracts
@@ -26,6 +43,8 @@ mkdir -p $OUTPUT_PATH/object_extracts/DDL
 mkdir -p $OUTPUT_PATH/object_extracts/STORAGE
 touch -- "${OUTPUT_PATH}/object_extracts/DDL/.sc_extracted"
+echo "[$(date '+%Y/%m/%d %l:%M:%S%p')] Info: Step 1/4 - Creating Directories: Completed"
+
 if [ ! -e "$OUTPUT_PATH" ]; then
 echo "The output path does not exist."
@@ -47,4 +66,8 @@ export EXCLUDE_CONDITION="('SYSMAN')"
 # Modify this JAVA variable to asign less or more memory to the JVM
 # export JAVA_TOOL_OPTIONS=-Xmx4G
-$SQLCL_PATH/sql $CONNECT_STRING @$SCRIPT_PATH/create_ddls.sql $INCLUDE_OPERATOR $INCLUDE_CONDITION $EXCLUDE_OPERATOR $EXCLUDE_CONDITION $OUTPUT_PATH
\ No newline at end of file
+echo "[$(date '+%Y/%m/%d %l:%M:%S%p')] Info: Step 2/4 - Extracting DDLs: Started"
+
+$SQLCL_PATH/sql $CONNECT_STRING @$SCRIPT_PATH/create_ddls.sql $INCLUDE_OPERATOR $INCLUDE_CONDITION $EXCLUDE_OPERATOR $EXCLUDE_CONDITION $OUTPUT_PATH
+
+
diff --git a/Oracle/scripts/create_ddls.sql b/Oracle/scripts/create_ddls.sql
index b1fe212..f40c759 100644
--- a/Oracle/scripts/create_ddls.sql
+++ b/Oracle/scripts/create_ddls.sql
@@ -14,7 +14,7 @@ SET SHOWMODE OFF
 --
 spool &5/object_extracts/extract_info.txt
-select 'Snowflake/Mobilize.Net SnowConvert Oracle Extraction Scripts 0.0.18.01' || CHR(10) || 'Date: ' || sysdate || CHR(10) || 'Oracle Version: ' || BANNER from V$VERSION;
+select 'Snowflake SnowConvert Oracle Extraction Scripts 0.0.18.01' || CHR(10) || 'Date: ' || sysdate || CHR(10) || 'Oracle Version: ' || BANNER from V$VERSION;
 spool off
 --
@@ -29,6 +29,13 @@ execute dbms_metadata.set_transform_param (DBMS_METADATA.session_transform,'PRET
 --
+SET TERMOUT ON
+
+ SELECT '[' || TO_CHAR(SYSDATE, 'YYYY/MM/DD HH:MI:SSAM') || '] Info: Step 3/4 - Extracting DDLs: Started' from dual;
+ SELECT '[' || TO_CHAR(SYSDATE, 'YYYY/MM/DD HH:MI:SSAM') || '] Info: Start extracting tables' from dual;
+
+SET TERMOUT OFF
+
 spool &5/object_extracts/DDL/DDL_Tables.sql
 SELECT '/* ' || owner || '.'
 || object_name || ' */', DBMS_METADATA.get_ddl(object_type, object_name, owner)
@@ -45,6 +52,10 @@ AND (owner, object_name) not in (select owner, table_name from dba_nested_tables
 AND (owner, object_name) not in (select owner, table_name from dba_tables where iot_type = 'IOT_OVERFLOW');
 spool off
+SET TERMOUT ON
+ SELECT '[' || TO_CHAR(SYSDATE, 'YYYY/MM/DD HH:MI:SSAM') || '] Info: Extracted tables' from dual;
+ SELECT '[' || TO_CHAR(SYSDATE, 'YYYY/MM/DD HH:MI:SSAM') || '] Info: Start extracting views' from dual;
+SET TERMOUT OFF
 --
@@ -66,6 +77,11 @@ AND OWNER NOT LIKE 'SQLT%';
 spool off
+SET TERMOUT ON
+ SELECT '[' || TO_CHAR(SYSDATE, 'YYYY/MM/DD HH:MI:SSAM') || '] Info: Extracted views' from dual;
+ SELECT '[' || TO_CHAR(SYSDATE, 'YYYY/MM/DD HH:MI:SSAM') || '] Info: Start extracting functions' from dual;
+SET TERMOUT OFF
+
 --
 spool &5/object_extracts/DDL/DDL_Functions.sql
@@ -84,6 +100,11 @@ AND OWNER NOT LIKE 'SQLT%';
 spool off
+SET TERMOUT ON
+ SELECT '[' || TO_CHAR(SYSDATE, 'YYYY/MM/DD HH:MI:SSAM') || '] Info: Extracted functions' from dual;
+ SELECT '[' || TO_CHAR(SYSDATE, 'YYYY/MM/DD HH:MI:SSAM') || '] Info: Start extracting procedures' from dual;
+SET TERMOUT OFF
+
 --
 spool &5/object_extracts/DDL/DDL_Procedures.sql
@@ -103,6 +124,11 @@ AND OWNER NOT LIKE 'SQLT%';
 spool off
+SET TERMOUT ON
+ SELECT '[' || TO_CHAR(SYSDATE, 'YYYY/MM/DD HH:MI:SSAM') || '] Info: Extracted procedures' from dual;
+ SELECT '[' || TO_CHAR(SYSDATE, 'YYYY/MM/DD HH:MI:SSAM') || '] Info: Start extracting packages' from dual;
+SET TERMOUT OFF
+
 --
 spool &5/object_extracts/DDL/DDL_Packages.sql
@@ -123,6 +149,11 @@ AND OWNER NOT LIKE 'SQLT%';
 spool off
+SET TERMOUT ON
+ SELECT '[' || TO_CHAR(SYSDATE, 'YYYY/MM/DD HH:MI:SSAM') || '] Info: Extracted packages' from dual;
+ SELECT '[' || TO_CHAR(SYSDATE, 'YYYY/MM/DD HH:MI:SSAM') || '] Info: Start extracting synonyms' from dual;
+SET TERMOUT OFF
+
 --
 spool &5/object_extracts/DDL/DDL_Synonyms.sql
@@ -147,6 +178,11 @@ AND OWNER NOT LIKE 'SQLT%'
 spool off
+SET TERMOUT ON
+ SELECT '[' || TO_CHAR(SYSDATE, 'YYYY/MM/DD HH:MI:SSAM') || '] Info: Extracted synonyms' from dual;
+ SELECT '[' || TO_CHAR(SYSDATE, 'YYYY/MM/DD HH:MI:SSAM') || '] Info: Start extracting types' from dual;
+SET TERMOUT OFF
+
 --
 spool &5/object_extracts/DDL/DDL_Types.sql
@@ -168,6 +204,11 @@ AND OWNER NOT LIKE 'SQLT%';
 spool off
+SET TERMOUT ON
+ SELECT '[' || TO_CHAR(SYSDATE, 'YYYY/MM/DD HH:MI:SSAM') || '] Info: Extracted types' from dual;
+ SELECT '[' || TO_CHAR(SYSDATE, 'YYYY/MM/DD HH:MI:SSAM') || '] Info: Start extracting indexes' from dual;
+SET TERMOUT OFF
+
 --
 spool &5/object_extracts/DDL/DDL_Indexes.sql
@@ -189,6 +230,11 @@ AND OWNER NOT LIKE 'SQLT%';
 spool off
+SET TERMOUT ON
+ SELECT '[' || TO_CHAR(SYSDATE, 'YYYY/MM/DD HH:MI:SSAM') || '] Info: Extracted indexes' from dual;
+ SELECT '[' || TO_CHAR(SYSDATE, 'YYYY/MM/DD HH:MI:SSAM') || '] Info: Start extracting triggers' from dual;
+SET TERMOUT OFF
+
 --
 spool &5/object_extracts/DDL/DDL_Triggers.sql
@@ -210,6 +256,11 @@ AND OWNER NOT LIKE 'SQLT%';
 spool off
+SET TERMOUT ON
+ SELECT '[' || TO_CHAR(SYSDATE, 'YYYY/MM/DD HH:MI:SSAM') || '] Info: Extracted triggers' from dual;
+ SELECT '[' || TO_CHAR(SYSDATE, 'YYYY/MM/DD HH:MI:SSAM') || '] Info: Start extracting sequences' from dual;
+SET TERMOUT OFF
+
 --
 spool &5/object_extracts/DDL/DDL_Sequences.sql
@@ -229,6 +280,11 @@ AND OWNER NOT LIKE 'SQLT%';
 spool off
+SET TERMOUT ON
+ SELECT '[' || TO_CHAR(SYSDATE, 'YYYY/MM/DD HH:MI:SSAM') || '] Info: Extracted sequences' from dual;
+ SELECT '[' || TO_CHAR(SYSDATE, 'YYYY/MM/DD HH:MI:SSAM') || '] Info: Start extracting DBlink' from dual;
+SET TERMOUT OFF
+
 --
 spool &5/object_extracts/DDL/DDL_DBlink.sql
@@ -246,6 +302,11 @@ AND OWNER NOT LIKE 'SQLT%';
 spool off
+SET TERMOUT ON
+ SELECT '[' || TO_CHAR(SYSDATE, 'YYYY/MM/DD HH:MI:SSAM') || '] Info: Extracted DBlink' from dual;
+ SELECT '[' || TO_CHAR(SYSDATE, 'YYYY/MM/DD HH:MI:SSAM') || '] Info: Start extracting queue tables' from dual;
+SET TERMOUT OFF
+
 --
 spool &5/object_extracts/DDL/DDL_QUEUE_TABLES.sql
@@ -266,6 +327,11 @@ AND (owner, queue_table) not in (select owner, table_name from dba_tables where
 spool off
+SET TERMOUT ON
+ SELECT '[' || TO_CHAR(SYSDATE, 'YYYY/MM/DD HH:MI:SSAM') || '] Info: Extracted queue tables' from dual;
+ SELECT '[' || TO_CHAR(SYSDATE, 'YYYY/MM/DD HH:MI:SSAM') || '] Info: Start extracting olap cubes' from dual;
+SET TERMOUT OFF
+
 --
 spool &5/object_extracts/DDL/DDL_OLAP_CUBES.sql
@@ -282,6 +348,11 @@ AND OWNER NOT LIKE 'SQLT%';
 spool off
+SET TERMOUT ON
+ SELECT '[' || TO_CHAR(SYSDATE, 'YYYY/MM/DD HH:MI:SSAM') || '] Info: Extracted olap cubes' from dual;
+ SELECT '[' || TO_CHAR(SYSDATE, 'YYYY/MM/DD HH:MI:SSAM') || '] Info: Start extracting materialized views' from dual;
+SET TERMOUT OFF
+
 --
 spool &5/object_extracts/DDL/DDL_MATERIALIZED_VIEWS.sql
@@ -298,6 +369,11 @@ AND OWNER NOT LIKE 'SQLT%';
 spool off
+SET TERMOUT ON
+ SELECT '[' || TO_CHAR(SYSDATE, 'YYYY/MM/DD HH:MI:SSAM') || '] Info: Extracted materialized views' from dual;
+ SELECT '[' || TO_CHAR(SYSDATE, 'YYYY/MM/DD HH:MI:SSAM') || '] Info: Start extracting queues' from dual;
+SET TERMOUT OFF
+
 --
 spool &5/object_extracts/DDL/DDL_QUEUES.sql
@@ -316,6 +392,11 @@ AND OWNER NOT LIKE 'SQLT%';
 spool off
+SET TERMOUT ON
+ SELECT '[' || TO_CHAR(SYSDATE, 'YYYY/MM/DD HH:MI:SSAM') || '] Info: Extracted queues' from dual;
+ SELECT '[' || TO_CHAR(SYSDATE, 'YYYY/MM/DD HH:MI:SSAM') || '] Info: Start extracting analytic views' from dual;
+SET TERMOUT OFF
+
 --
 spool &5/object_extracts/DDL/DDL_ANALYTIC_VIEWS.sql
@@ -332,6 +413,11 @@ AND OWNER NOT LIKE 'SQLT%';
 spool off
+SET TERMOUT ON
+ SELECT '[' || TO_CHAR(SYSDATE, 'YYYY/MM/DD HH:MI:SSAM') || '] Info: Extracted analytic views' from dual;
+ SELECT '[' || TO_CHAR(SYSDATE, 'YYYY/MM/DD HH:MI:SSAM') || '] Info: Start extracting operators' from dual;
+SET TERMOUT OFF
+
 --
 spool &5/object_extracts/DDL/DDL_OPERATORS.sql
@@ -348,6 +434,11 @@ AND OWNER NOT LIKE 'SQLT%';
 spool off
+SET TERMOUT ON
+ SELECT '[' || TO_CHAR(SYSDATE, 'YYYY/MM/DD HH:MI:SSAM') || '] Info: Extracted operators' from dual;
+ SELECT '[' || TO_CHAR(SYSDATE, 'YYYY/MM/DD HH:MI:SSAM') || '] Info: Step 4/4 - Writing Storage Tables report' from dual;
+SET TERMOUT OFF
+
 --
 --STORAGE
diff --git a/Redshift/README.md b/Redshift/README.md
index 1b2c404..56ab5e5 100644
--- a/Redshift/README.md
+++ b/Redshift/README.md
@@ -4,7 +4,7 @@ This repository provides some simple scripts to help exporting your Redshift Cod
 ## Version
-Release 2021-03-24
+Release 2024-02-28
 ## Usage
@@ -76,6 +76,8 @@ MAX_ITERATIONS|AWS handles requests asynchronously, therefore we need to perform
 * After modifying these variables, execute the scripts and your DDL Code should be extracted into the path you specified.
+* Run `create_ddls.sh --version` to check the current version of the extraction scripts.
+
 #### Manual
 * Open the queries located in `Redshift/scripts` in your preferred SQL Editor and replace the `{schema_filter}` line with the desired filter for your needs.
 If you need all schemas to be pulled, you could either input `lower(schemaname) like '%'` or remove the entire `WHERE`.
diff --git a/Teradata/README.md b/Teradata/README.md
index 69e70ac..6652056 100644
--- a/Teradata/README.md
+++ b/Teradata/README.md
@@ -4,7 +4,7 @@ This repository provides some simple scripts to help exporting your Teradata cod
 ## Version
-Release 2023-02-27
+Release 2024-02-28
 ## Usage
@@ -48,6 +48,8 @@ These files will contain the definitions of the objects specified by the file na
 * `DDL_Macros.sql`
 * `DDL_Procedures.sql`
+3 - Run `create_ddls.sh --version` to check the current version of the extraction scripts.
+
 ## Reporting issues and feedback
 If you encounter any bugs with the tool please file an issue in the
diff --git a/Teradata/bin/create_ddls.sh b/Teradata/bin/create_ddls.sh
index d6502a2..f19dc2e 100644
--- a/Teradata/bin/create_ddls.sh
+++ b/Teradata/bin/create_ddls.sh
@@ -5,8 +5,20 @@
 #Version 2024-02-01: Add spliting mechanism for output code.
 #Version 2024-02-23: Remove spliting mechanism for output code.
 #Version 2024-02-27: Update output text with more detailed information about the execution.
+#Version 2024-02-28: Added flag to display version.
+
+#This version should match the README.md version. Please update this version on every change request.
+VERSION="Release 2024-02-28"
+
+versionParam=$1
+
+if [ "$versionParam" = "--version" ]; then
+ echo "You are using the $VERSION of the extraction scripts"
+ exit 1
+fi
 ##### PARAMETERS
+
 ##### Modify the connection information
 connection_string="dbc,dbc"
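
As a quick reference for reviewers, the sketch below shows how the new `--version` flag and the timestamped progress output behave once this patch is applied. The Oracle script is used as the example; the working directory, timestamps, and the precondition that the connection variables are already configured are illustrative assumptions, not part of the patch.

```sh
# Print the script version added by this patch. The same pattern exists in the
# BigQuery, DB2, Hive, and Teradata scripts (Hive uses exp_ddl.sh instead).
cd Oracle/bin
./create_ddls.sh --version
# Prints, per the VERSION variable in the script
# (note: the script then exits with status 1):
#   You are using the Release 2024-02-28 of the extraction scripts

# A normal run (after setting CONNECT_STRING, SCRIPT_PATH, SQLCL_PATH, and
# OUTPUT_PATH in the script) now reports each step with a timestamp, e.g.:
#   [2024/02/28  9:53:50AM] Info: Execute Oracle extraction scripts: Started
#   [2024/02/28  9:53:50AM] Info: Step 1/4 - Creating Directories: Started
#   [2024/02/28  9:53:50AM] Info: Step 1/4 - Creating Directories: Completed
#   [2024/02/28  9:53:50AM] Info: Step 2/4 - Extracting DDLs: Started
./create_ddls.sh
```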