SQL Server - Dynamic number of columns in Insert statement

I will not go into the reasons for this requirement (too long and not relevant).



I have a rather wide table Table_1 with 256 columns, all NVARCHAR(512) and nullable.



On the other hand, I have a string that holds a dynamically built SELECT which, depending on the case, returns a different number of columns each time.



The results of the query need to be inserted into Table_1.



The problem is that I must either list the target columns explicitly in the INSERT statement, or the query's result set must match the structure of the target table exactly.



For instance, let's have:



DECLARE @Table_1 TABLE (Field_1 NVARCHAR(512),
                        Field_2 NVARCHAR(512),
                        Field_3 NVARCHAR(512),
                        Field_4 NVARCHAR(512),
                        Field_5 NVARCHAR(512),
                        Field_6 NVARCHAR(512)) ;


And a query like:



DECLARE @_My_Query NVARCHAR(1000) = '
    SELECT Name   Field_1 ,
           Street Field_2 ,
           Phone  Field_3
      FROM My_Table
    ';


Now, I can't do:



INSERT INTO @Table_1 
EXECUTE sp_executesql @_My_Query ;


because SQL Server would complain about a column-count mismatch. Nor can I hardcode (Field_1, Field_2, Field_3), because other queries need a different number of columns.
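One possible workaround, sketched here only as an illustration: since (per the edit below the question was written with) the calling layer knows how many columns the query returns, the INSERT itself can be built as a string with an explicit column list and executed in one go. The names Table_1, @_My_Query and the Field_N convention come from the examples above; @_Col_Count is an assumed input, not something the original code defines. Note that a table variable such as @Table_1 would not be visible inside sp_executesql, so this sketch targets the permanent Table_1.

```sql
DECLARE @_Col_Count INT = 3 ;          -- assumed: supplied by the higher layer
DECLARE @_Col_List  NVARCHAR(MAX) = '' ;
DECLARE @_i         INT = 1 ;

-- Build "Field_1,Field_2,...,Field_n"
WHILE @_i <= @_Col_Count
BEGIN
    SET @_Col_List += CASE WHEN @_i > 1 THEN ',' ELSE '' END
                    + 'Field_' + CAST(@_i AS NVARCHAR(10)) ;
    SET @_i += 1 ;
END ;

-- Prefix the dynamic query with an INSERT that names exactly those columns
DECLARE @_Insert NVARCHAR(MAX) =
    'INSERT INTO Table_1 (' + @_Col_List + ') ' + @_My_Query ;
EXECUTE sp_executesql @_Insert ;
```

With this shape the column-count mismatch disappears, because the column list is generated to match whatever the dynamic SELECT returns.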



Edit



Answers to the comments posted by KrazzyNefarious, Jeroen Mostert and Damien_The_Unbeliever:



I do have a clear view of the fields returned by the query as well as their count.



The functionality I'm trying to implement is as follows:




  1. The system I'm working on should support the generation of CSV files with a variable number of columns and contents.


  2. The idea is that the query that generates the result data is prepared at a higher layer and sent to a Stored Procedure (as a string), along with a table that describes all the columns of the result (e.g. ordinal position, field name, data type, the title for the file's first row, etc.).


  3. The Procedure I'm working on would take the query string and run a combined INSERT INTO Table_1 EXECUTE sp_executesql @_My_Query. Once this is done, the mechanism that examines each and every field (for instance, to add quotation marks (") wherever needed) can handle the contents of Table_1 without any concern about the origin of the data.
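The workflow in steps 2 and 3 could be sketched roughly as follows. Everything here is hypothetical: the procedure name, the Column_Description table type, and its fields are illustrative stand-ins for the description table mentioned in step 2, not an existing API. STRING_AGG requires SQL Server 2017 or later; older versions would need a FOR XML PATH concatenation instead.

```sql
-- Hypothetical table type describing the result columns (step 2)
CREATE TYPE Column_Description AS TABLE (
    Ordinal_Position INT           NOT NULL ,
    Field_Name       NVARCHAR(128) NOT NULL ,
    Data_Type        NVARCHAR(128) NOT NULL ,
    Header_Title     NVARCHAR(512) NOT NULL ) ;
GO

-- Hypothetical procedure (step 3): runs the query into Table_1.
-- Only Ordinal_Position is used here; the other fields would feed
-- the later CSV-formatting stage.
CREATE PROCEDURE Run_Dynamic_Export
    @_My_Query NVARCHAR(MAX) ,
    @_Columns  Column_Description READONLY
AS
BEGIN
    DECLARE @_Col_List NVARCHAR(MAX) ;

    -- Map the described columns onto Table_1's generic Field_N columns
    SELECT @_Col_List =
           STRING_AGG('Field_' + CAST(Ordinal_Position AS NVARCHAR(10)), ',')
           WITHIN GROUP (ORDER BY Ordinal_Position)
    FROM   @_Columns ;

    DECLARE @_Insert NVARCHAR(MAX) =
        'INSERT INTO Table_1 (' + @_Col_List + ') ' + @_My_Query ;
    EXECUTE sp_executesql @_Insert ;
END ;
```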



I would prefer to avoid temporary tables since, besides being ugly, the approach hurts performance and brings additional issues (records must be deleted after use, collisions between concurrent users, etc.).
  • What is the maximum number of columns that the dynamic query may return?

    – KrazzyNefarious
    Nov 13 '17 at 11:49













  • How do you know how to map fields? Are the names at least in correspondence?

    – Jeroen Mostert
    Nov 13 '17 at 11:49











  • "... I have a string into which a dynamic select is built and ..." - are you able to make changes to the process of constructing that string, such that you can track how many columns have been added? Starting from just a string containing query text and trying to parse it to determine the column count seems like too much complexity here.

    – Damien_The_Unbeliever
    Nov 13 '17 at 11:49











  • Describe the issue. This looks like a solution you (or someone else) found and are now trying to implement. How is the select generated? Do you have control over the creation of the dynamic select? What is the requirement behind a dynamic select inserting into a bunch of nullable columns?

    – Paolo
    Nov 13 '17 at 11:59











  • @KrazzyNefarious, please see the EDIT.

    – FDavidov
    Nov 13 '17 at 12:09
sql-server dynamic-sql

asked Nov 13 '17 at 11:40 by FDavidov
edited Dec 24 '18 at 23:41 by halfer

1 Answer
This:



The system I'm working on should support the generation of CSV files with a variable number of columns and contents.



is the crux of your problem. You have chosen a path that doesn't really match this requirement. You have ruled out temp tables, so the only other option is to create a (temp or permanent) table for each export "situation", which is ugly and prone to lots of other issues, because a table in an RDBMS has a fixed structure. Beyond just a different number of columns, you also need to consider different datatypes. Are you ignoring that issue as well, or did you make a very simplifying assumption? If you only look at one situation, you can lose sight of the complexity you face.



In addition, you can search for the many discussions about exporting CSV data from a stored procedure (or script). It is simply a nest of problems and constraints even without the variability of the source data structure.



I suggest you are using the wrong tool for your requirement. T-SQL is simply not intended for, and does not really support, ETL. If you intend to go down this path, you will need some advanced skills.
  • Thank you @Smor for your post, though I can't really see in which way it is of any help. Some comments: (1) It would be more useful if you assume that whoever posts a question may have at least some basic understanding of what he/she is talking about (not always true, but good as a starting point). (2) It would appear reasonable to assess that there will be thousands of different results structures and many users concurrently, making the option of temp or permanent table prone to collisions, load => irrelevant. (3) Advanced Skills: Well, I think I found a solution. Will post it and you judge.

    – FDavidov
    Nov 14 '17 at 5:45

answered Nov 13 '17 at 14:47 by SMor