SQL2DB User Manual

Complete guide covering all SQL2DB features from installation to advanced functionality

🎯 Overview

What is SQL2DB?

SQL2DB is a Node.js-based tool for efficiently migrating data between Microsoft SQL Server instances. It provides large-scale batch processing, real-time monitoring, and resume functionality to support stable and reliable data migration.

Key Features

  • Interactive Interface (v0.8.1): User-friendly menu system for easy operations
  • Standalone Executable (v0.8.1): Run immediately without Node.js installation
  • Progress Monitoring (v0.8.1): Detailed migration history and progress tracking
  • Batch Data Migration: Optimized large-scale data processing
  • Flexible Configuration: XML or JSON-based configuration
  • Column Override: Modify/add specific column values during migration
  • Pre/Post Processing: Execute SQL scripts before/after migration
  • Dynamic Variables: Extract and utilize data at execution time
  • Resume Migration: Resume an interrupted migration from the last completed point

📊 Source DB → ⚙️ SQL2DB → 🎯 Target DB

🛠️ Installation & Setup

1. System Requirements

For Standalone Executable Users

  • Windows 7 or higher (64-bit)
  • SQL Server 2012 or higher (source/target)
  • Appropriate database permissions
  • No Node.js required

For Node.js Source Users (Developers)

  • Node.js 14.0 or higher (18.x recommended)
  • npm 6.0 or higher
  • SQL Server 2012 or higher (source/target)
  • Appropriate database permissions

2. Installation

Option 1: Standalone Executable (Recommended)

bash
# 1. Download sql2db-v0.8.1-win-x64.zip
# 2. Extract to desired location
# 3. Navigate to extracted folder

# Run English version
run.bat

# Run Korean version
실행하기.bat

# Or run directly
sql2db.exe --lang=en
sql2db.exe --lang=kr

Option 2: Node.js Source (For Developers)

bash
# Clone repository
git clone https://github.com/happysoft2018/sql2db.git
cd sql2db

# Install dependencies
npm install

# Run
npm start              # English
npm run start:kr       # Korean

3. Database Configuration

Create a config/dbinfo.json file and configure the database connection information:

json
{
  "dbs": {
    "sourceDB": {
      "server": "source-server.com",
      "port": 1433,
      "database": "source_database",
      "user": "username",
      "password": "password",
      "isWritable": false,
      "description": "Source database",
      "options": {
        "encrypt": true,
        "trustServerCertificate": true
      }
    },
    "targetDB": {
      "server": "target-server.com", 
      "port": 1433,
      "database": "target_database",
      "user": "username",
      "password": "password",
      "isWritable": true,
      "description": "Target database"
    }
  }
}
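
As a quick sanity check before running SQL2DB, you can verify that a configuration object contains the fields shown above. The sketch below is a standalone illustration (it is not part of SQL2DB itself):

```javascript
// Standalone sanity check for a dbinfo.json-style object
// (illustrative only; not part of SQL2DB itself).
const REQUIRED_FIELDS = ['server', 'port', 'database', 'user', 'password'];

function validateDbConfig(config) {
  if (!config || typeof config.dbs !== 'object') {
    return ['missing top-level "dbs" object'];
  }
  const errors = [];
  for (const [name, db] of Object.entries(config.dbs)) {
    // Every connection entry needs the core connection fields.
    for (const field of REQUIRED_FIELDS) {
      if (db[field] === undefined) errors.push(`${name}: missing "${field}"`);
    }
    // isWritable guards against accidentally writing to the source DB.
    if (typeof db.isWritable !== 'boolean') {
      errors.push(`${name}: "isWritable" should be true or false`);
    }
  }
  return errors;
}

// Example: one connection entry that is missing its password.
const sample = {
  dbs: {
    sourceDB: {
      server: 'source-server.com', port: 1433, database: 'src',
      user: 'username', isWritable: false
    }
  }
};
console.log(validateDbConfig(sample)); // → [ 'sourceDB: missing "password"' ]
```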

🚀 Basic Usage

Interactive Interface (v0.8.1)

SQL2DB provides a user-friendly menu-based interface. You can perform all tasks with simple menu selections without memorizing complex commands.

How to Run

# Standalone Executable
run.bat              # English version
실행하기.bat         # Korean version
sql2db.exe --lang=en # Direct run (English)
sql2db.exe --lang=kr # Direct run (Korean)

# Node.js Source
npm start            # English
npm run start:kr     # Korean

Menu Options

  • 1. Validate Query Definition File: Validate syntax and attributes of XML/JSON query definition files
  • 2. Test Database Connection: Test database connection (displays list of connectable databases)
  • 3. Execute Data Migration: Execute actual data migration
  • 4. Check Migration Progress: View progress and detailed information of recent migrations
  • 5. Show Help: Display available commands and feature descriptions
  • 0. Exit: Exit program

Selecting Query Files

Most operations require a query definition file. Enter the displayed number instead of typing the full path:

Available query definition files:
1. migration-queries.xml [XML]
2. test-migration.json [JSON]

Select file number (1-2): 1
Selected file: D:\sql2db\queries\migration-queries.xml

Progress Monitoring

Select menu option 4 to view progress of recent migrations:

  • Shows only the 3 most recent migrations by default
  • Enter 'A' to view full migration history
  • Select number to view detailed information:
    • Overall migration status (completed/failed/running)
    • Start/end time and duration
    • Query-level progress (processed rows, batches)
    • Error information (if occurred)

Basic CLI Command Usage

Validate Configuration

bash
node src/migrate-cli.js validate --query ./queries/migration-queries.xml

List Databases

bash
node src/migrate-cli.js list-dbs

Execute Data Migration

bash
node src/migrate-cli.js migrate --query ./queries/migration-queries.xml

Simulation Run (DRY RUN)

bash
node src/migrate-cli.js migrate --query ./queries/migration-queries.xml --dry-run

Resume Interrupted Migration

bash
node src/migrate-cli.js migrate --query ./queries/migration-queries.xml --resume

XML Configuration File Structure

xml
<?xml version="1.0" encoding="UTF-8"?>
<migrations>
  <!-- Global settings -->
  <global>
    <batchSize>1000</batchSize>
    <timeout>30000</timeout>
    <retryCount>3</retryCount>
  </global>
  
  <!-- Dynamic variable definitions -->
  <dynamicVars>
    <dynamicVar name="currentDate">
      <![CDATA[
        SELECT GETDATE() as CurrentDate
      ]]>
    </dynamicVar>
  </dynamicVars>
  
  <!-- Migration task definition -->
  <migration name="User_Data_Migration" use="true">
    <source db="sourceDB">
      <![CDATA[
        SELECT UserID, UserName, Email, CreatedDate
        FROM Users 
        WHERE CreatedDate >= '2024-01-01'
      ]]>
    </source>
    
    <target db="targetDB" table="Users">
      <columns>
        <column name="UserID" type="int" />
        <column name="UserName" type="varchar(100)" />
        <column name="Email" type="varchar(255)" />
        <column name="CreatedDate" type="datetime" />
        <column name="MigratedDate" type="datetime" override="true">${currentDate.CurrentDate}</column>
      </columns>
    </target>
    
    <!-- Pre-processing script -->
    <preProcess db="targetDB">
      <![CDATA[
        DELETE FROM Users WHERE CreatedDate >= '2024-01-01'
      ]]>
    </preProcess>
    
    <!-- Post-processing script -->
    <postProcess db="targetDB">
      <![CDATA[
        UPDATE Users SET Status = 'Migrated' WHERE MigratedDate IS NOT NULL
      ]]>
    </postProcess>
  </migration>
</migrations>
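
In the example above, `${currentDate.CurrentDate}` refers to the `currentDate` dynamic variable and its `CurrentDate` result column. Conceptually, substitution of the `${variable.Column}` pattern works like the sketch below (an illustration only, not SQL2DB's actual implementation):

```javascript
// Illustrative sketch of ${varName.ColumnName} substitution
// (SQL2DB's real implementation may differ).
function substituteVars(text, vars) {
  return text.replace(/\$\{(\w+)\.(\w+)\}/g, (match, varName, colName) => {
    const row = vars[varName];
    if (!row || row[colName] === undefined) return match; // leave unknown refs as-is
    return String(row[colName]);
  });
}

// vars holds one result row per dynamic variable, keyed by variable name.
const vars = { currentDate: { CurrentDate: '2024-06-01 10:30:00' } };
console.log(substituteVars('${currentDate.CurrentDate}', vars));
// → 2024-06-01 10:30:00
```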

🎨 Advanced Features

🔄 Dynamic Variable System

Extract values from the database at execution time and utilize them in the migration process.

  • Real-time data extraction
  • Variable substitution and utilization
  • Complex query support
  • Debug mode support

📊 Real-time Monitoring

Track migration progress in real-time through keyboard interactive monitoring and charts.

  • Real-time progress display
  • Chart-based visualization
  • Keyboard interaction
  • Performance metrics display

🔄 Resume Migration

Resume a migration interrupted by network errors or other issues from the last completed point.

  • Progress state saving
  • Automatic restart support
  • Data consistency guarantee
  • Error recovery functionality
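
Conceptually, resume support amounts to persisting a checkpoint after each committed batch and skipping already-completed batches on restart. A minimal in-memory sketch (illustrative only; this is not SQL2DB's actual progress-state format):

```javascript
// Conceptual checkpoint logic for resume support (illustrative only).
function makeCheckpointStore() {
  const state = {}; // in-memory stand-in for a progress-state file
  return {
    save: (queryName, lastBatch) => { state[queryName] = lastBatch; },
    lastCompleted: (queryName) => state[queryName] ?? -1,
  };
}

// On restart, only batches after the last completed one need to run.
function batchesToRun(totalBatches, store, queryName) {
  const done = store.lastCompleted(queryName);
  const remaining = [];
  for (let b = done + 1; b < totalBatches; b++) remaining.push(b);
  return remaining;
}

const store = makeCheckpointStore();
store.save('User_Data_Migration', 4); // batches 0-4 committed before interruption
console.log(batchesToRun(10, store, 'User_Data_Migration')); // → [ 5, 6, 7, 8, 9 ]
```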

⚙️ Pre/Post Processing

Execute SQL scripts before and after migration to perform data cleanup and validation.

  • Data cleanup scripts
  • Validation and verification
  • Index management
  • Statistics updates

🎯 Column Override

Modify or add specific column values during migration.

  • Default value setting
  • Dynamic value assignment
  • Data transformation
  • Metadata addition

📈 Batch Processing Optimization

Provides a batch processing system for efficient handling of large-scale data.

  • Memory efficiency
  • Performance optimization
  • Transaction management
  • Error handling
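
The core idea can be sketched as splitting rows into fixed-size chunks so memory stays bounded and each chunk can be committed (and retried) independently. A simplified illustration, not SQL2DB's actual code:

```javascript
// Simplified illustration of batched processing (not SQL2DB's actual code).
function toBatches(rows, batchSize) {
  const batches = [];
  for (let i = 0; i < rows.length; i += batchSize) {
    batches.push(rows.slice(i, i + batchSize));
  }
  return batches;
}

// insertBatch is supplied by the caller, e.g. one INSERT per chunk
// inside its own transaction.
async function migrateInBatches(rows, batchSize, insertBatch) {
  let processed = 0;
  for (const batch of toBatches(rows, batchSize)) {
    await insertBatch(batch);
    processed += batch.length;
    console.log(`progress: ${processed}/${rows.length}`);
  }
  return processed;
}

// Usage with a stubbed insert function:
const rows = Array.from({ length: 2500 }, (_, i) => ({ id: i }));
migrateInBatches(rows, 1000, async () => {}).then(n => console.log(`done: ${n} rows`));
```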

📋 Practical Examples

📊 User Data Migration

Basic example of migrating user table data from source to target.

xml
<migration name="User_Migration" use="true">
  <source db="sourceDB">
    <![CDATA[
      SELECT UserID, UserName, Email, CreatedDate, Status
      FROM Users 
      WHERE Status = 'Active'
    ]]>
  </source>
  
  <target db="targetDB" table="Users">
    <columns>
      <column name="UserID" type="int" />
      <column name="UserName" type="varchar(100)" />
      <column name="Email" type="varchar(255)" />
      <column name="CreatedDate" type="datetime" />
      <column name="Status" type="varchar(20)" />
      <column name="MigratedDate" type="datetime" override="true">${currentDate.CurrentDate}</column>
    </columns>
  </target>
</migration>

🔄 Dynamic Variable Usage

Example of extracting data at execution time and using it as migration conditions.

xml
<dynamicVars>
  <dynamicVar name="lastSyncDate">
    <![CDATA[
      SELECT MAX(LastModifiedDate) as LastSyncDate 
      FROM TargetDB.dbo.Products
    ]]>
  </dynamicVar>
</dynamicVars>

<migration name="Product_Migration">
  <source db="sourceDB">
    <![CDATA[
      SELECT ProductID, ProductName, Price, LastModifiedDate
      FROM Products 
      WHERE LastModifiedDate > ${lastSyncDate.LastSyncDate}
    ]]>
  </source>
</migration>

⚙️ Pre/Post Processing Usage

Example of performing data cleanup and validation before and after migration.

xml
<migration name="Order_Migration">
  <!-- Pre-processing: Clean existing data -->
  <preProcess db="targetDB">
    <![CDATA[
      DELETE FROM Orders WHERE OrderDate >= '2024-01-01'
    ]]>
  </preProcess>
  
  <source db="sourceDB">
    <![CDATA[SELECT * FROM Orders WHERE OrderDate >= '2024-01-01']]>
  </source>
  
  <target db="targetDB" table="Orders" />
  
  <!-- Post-processing: Recreate indexes and update statistics -->
  <postProcess db="targetDB">
    <![CDATA[
      CREATE INDEX IX_Orders_OrderDate ON Orders(OrderDate)
      UPDATE STATISTICS Orders
    ]]>
  </postProcess>
</migration>

🔧 Troubleshooting

❌ Connection Error

Issue: Database connection fails.

Solution:

  • Check connection information in config/dbinfo.json file
  • Verify SQL Server service is running
  • Check firewall settings
  • Verify user permissions

⏱️ Timeout Error

Issue: Query execution times out.

Solution:

  • Increase timeout value in global settings
  • Decrease batch size (batchSize)
  • Optimize queries
  • Utilize indexes
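
For example, the first two adjustments map directly to the query definition file's global settings (the values below are illustrative starting points, not recommendations):

```xml
<global>
  <batchSize>500</batchSize>    <!-- smaller batches reduce per-statement work -->
  <timeout>120000</timeout>     <!-- raised from the 30000 ms used in the earlier example -->
  <retryCount>3</retryCount>
</global>
```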

🔄 Resume Migration Issue

Issue: Failed to resume interrupted migration.

Solution:

  • Check progress state file
  • Verify database status
  • Analyze log files
  • Use --resume option

📊 Insufficient Memory

Issue: Out-of-memory errors occur while processing large datasets.

Solution:

  • Decrease batch size
  • Optimize queries
  • Utilize indexes
  • Check system resources

❓ Frequently Asked Questions

Q: What databases does SQL2DB support?

A: Currently supports Microsoft SQL Server 2012 or higher. Both source and target must be SQL Server.

Q: How is performance with large-scale data processing?

A: Large-scale data can be processed efficiently through batch processing and real-time monitoring. Memory usage can be controlled by adjusting batch size.

Q: What happens if an error occurs during migration?

A: You can resume migration from the completed point using the resume functionality. Use the --resume option.

Q: How do I use real-time monitoring?

A: The monitoring screen is automatically displayed when migration is executed. You can check progress in real-time through keyboard interaction.

Q: What are dynamic variables used for?

A: You can extract values from the database at execution time and use them as migration conditions or column values. For example, you can perform incremental migration based on the last sync date.

📞 Support & Contact

📧 Email Support

For technical issues or inquiries, please contact us via email.

🐛 Issue Report

Report bugs or request features through GitHub issues.

📚 Documentation & Examples

For more detailed information and examples, visit the GitHub repository.