Wednesday, 31 March 2021

Loading the CSV file data in the table using script task in SSIS

We receive a CSV file and need to load it into a SQL Server table. There are multiple ways to do this in SSIS.

In this demo, we will see how to load the CSV file using the Script Task.

Below is the file. 
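Based on the column mappings used later in the script, the header row of the CSV is expected to look like the line below (each following line in the file is one sales record):

SalesOrderNumber,SalesAmount,UnitPrice,ExtendedAmount,TaxAmt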

We need to load it into the below table.  
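The destination is the sales_Stg staging table in the Test database. A minimal T-SQL sketch of such a table, inferred from the column mappings in the script (the actual data types in the table may differ), would be:

CREATE TABLE dbo.sales_Stg
(
    SalesOrderNumber NVARCHAR(50),
    SalesAmount      DECIMAL(18, 4),
    UnitPrice        DECIMAL(18, 4),
    ExtendedAmount   DECIMAL(18, 4),
    TaxAmt           DECIMAL(18, 4),
    Created_Date     DATETIME,
    created_by       NVARCHAR(50)
);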

Let’s see how to load this file.

Add a Script Task to the package.

In the Script Task, we write the code below.

// Note: the script uses StreamReader, DataTable, SqlConnection and SqlBulkCopy,
// so the following namespaces must be referenced at the top of ScriptMain.cs
// (in addition to the ones added by the Script Task template):
// using System.IO;
// using System.Data;
// using System.Data.SqlClient;

public void Main()
{
    try
    {
        char[] delimiters = new char[] { ',' };  // CSV file is comma separated
        DataTable dt = new DataTable();

        // Reading the CSV file
        using (StreamReader reader = new StreamReader("H:\\SSIS1\\Source\\Sales.csv"))
        {
            // Adding the header row columns into the data table
            string[] headers = reader.ReadLine().Split(',');
            foreach (string header in headers)
            {
                dt.Columns.Add(header);
            }
            dt.Columns.Add("Created_Date");  // Adding Created_Date column to the data table
            dt.Columns.Add("Created_by");    // Adding Created_by column to the data table

            // Reading the data rows and adding them to the data table
            while (!reader.EndOfStream)
            {
                string[] rows = reader.ReadLine().Split(',');
                DataRow dr = dt.NewRow();
                for (int i = 0; i < headers.Length; i++)
                {
                    dr[i] = rows[i];
                }
                dr["Created_Date"] = DateTime.Now;  // Populating Created_Date
                dr["Created_by"] = "Bagesh";        // Populating Created_by
                dt.Rows.Add(dr);
            }
        }

        // Creating the connection to the database
        string connection = "Data Source=BAGESH-PC\\BAGESHDB;database=Test;Integrated Security=SSPI;";
        SqlConnection con = new SqlConnection(connection);

        // Creating the SqlBulkCopy object that performs the insert
        SqlBulkCopy obj = new SqlBulkCopy(con);

        // Assigning the destination table name
        obj.DestinationTableName = "sales_Stg";

        // Mapping the source columns to the destination columns.
        // Keep in mind we need to provide the correct mapping, otherwise it will throw an error.
        obj.ColumnMappings.Add("SalesOrderNumber", "SalesOrderNumber");
        obj.ColumnMappings.Add("SalesAmount", "SalesAmount");
        obj.ColumnMappings.Add("UnitPrice", "UnitPrice");
        obj.ColumnMappings.Add("ExtendedAmount", "ExtendedAmount");
        obj.ColumnMappings.Add("TaxAmt", "TaxAmt");
        obj.ColumnMappings.Add("Created_Date", "Created_Date");
        obj.ColumnMappings.Add("Created_by", "created_by");

        // Opening the connection
        con.Open();

        // Bulk inserting the records into the database
        obj.WriteToServer(dt);

        // Closing the connection
        con.Close();

        Dts.TaskResult = (int)ScriptResults.Success;
    }
    // If any error occurs, log it and fail the task
    catch (Exception ex)
    {
        Dts.Events.FireError(0, "Fire Error", "An error occurred: " + ex.Message, "", 0);

        Dts.TaskResult = (int)ScriptResults.Failure;
    }
}

Build the code, close the script editor window, and click OK.

Now our package is ready to run.

Before running the package, check the records in the table.
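For example, a quick count query can be run (assuming the table lives in the dbo schema of the Test database):

SELECT COUNT(*) AS RecordCount
FROM dbo.sales_Stg;

Running the same query again after the package completes confirms that the rows were inserted.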

 

Now run the package.

Package executed successfully.

See the records in the table.

 

Data loaded successfully.
