I've a DataTable with 100 rows and 100 columns. Then I'm updating each
column in each row. When I call the Update method on the DataAdapter, this
generates 10000 UPDATE statements.
Is there a better solution that reduces the number of UPDATE statements? Or
how can I handle such big updates with SQL Server?
Thanks
Klaus Aschenbrenner
MVP Visual C#
www.csharp.at,www.anecon.com
http://weblogs.asp.net/klaus.aschenbrenner
Use a stored procedure to do the update. This sounds very much like you
are using a table as an array though, which isn't generally a good way
to model data in SQL.
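For example, a procedure that updates a whole row at a time can be wired up
as the adapter's UpdateCommand. Roughly like this (just a sketch; the
procedure, table and column names are invented, and "connection" and
"dataTable" are assumed to exist already):

using System.Data;
using System.Data.SqlClient;

// "dbo.UpdatePosition" and the column names are placeholders for your schema.
SqlCommand update = new SqlCommand("dbo.UpdatePosition", connection);
update.CommandType = CommandType.StoredProcedure;

// One parameter per column, each bound to the matching DataTable column.
update.Parameters.Add("@PositionID", SqlDbType.Int, 4, "PositionID");
update.Parameters.Add("@Col1", SqlDbType.Decimal, 0, "Col1");
update.Parameters.Add("@Col2", SqlDbType.Decimal, 0, "Col2");
// ...and so on for the remaining columns.

SqlDataAdapter adapter = new SqlDataAdapter("SELECT * FROM Position", connection);
adapter.UpdateCommand = update;

// Update now makes one procedure call per changed row.
adapter.Update(dataTable);

That way the server handles the whole row in a single call.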
David Portas
SQL Server MVP
|||I've already tried it with stored procedures and the performance isn't
better.
The problem with the data model is that the 100 rows and 100 columns
represent a finance plan, so each column must also store additional
information (like formats, formulas, ...).
So I have a table for the rows (called "Position"), and this table references
another table (called "PosVal") which stores the columns of each row. With
this data model the "PosVal" table can reference other tables which contain
the formats, formulas, and so on.
Or is there any other way to model this?
Thanks
Klaus Aschenbrenner
MVP Visual C#
www.csharp.at,www.anecon.com
http://weblogs.asp.net/klaus.aschenbrenner
"David Portas" <REMOVE_BEFORE_REPLYING_dportas@.acm.org> wrote in message
news:1109333983.806317.283120@.g14g2000cwa.googlegroups.com...
> Use a stored procedure to do the update. This sounds very much like you
> are using a table as an array though, which isn't generally a good way
> to model data in SQL.
> --
> David Portas
> SQL Server MVP
> --
>
|||It seems like you are trying to model an abstraction ("rows", "columns"
and "formulae" from a hypothetical spreadsheet) instead of modelling
the actual data. Isn't your metadata static enough to create a proper
relational representation of it? If not then I suggest you need a
middle tier to present this data. The back end may be largely
irrelevant - I'm not sure just what benefit you are hoping to get from
using SQL Server as the data store for this.
If you do have some real data to model, then A) Normalize your tables,
B) post a CREATE TABLE statement and your stored proc. Since your proc
can update an entire row at a time I would have expected 100 updates to
outperform 10,000 but that largely depends on how you are doing the
updates and what your data looks like.
David Portas
SQL Server MVP
|||Klaus Aschenbrenner wrote:
> Hi!
> I've a DataTable with 100 rows and 100 columns. Then I'm updating each
> column in each row. When I call the Update method on the DataAdapter,
> this generates 10000 UPDATE statements.
> Is there a better solution that reduces the number of UPDATE statements?
> Or how can I handle such big updates with SQL Server?
>
Are you using the CommandBuilder to generate the code? It probably makes
more sense to write the code yourself.
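For instance, something along these lines (a rough sketch; "PosVal" and its
columns are invented names, and the connection, adapter and dataTable are
assumed to be set up already). A hand-written command also lets you keep the
WHERE clause down to the key, whereas the CommandBuilder puts every column
into the WHERE clause for its concurrency check:

using System.Data;
using System.Data.SqlClient;

SqlCommand update = new SqlCommand(
    "UPDATE PosVal SET Amount = @Amount, Format = @Format " +
    "WHERE PosValID = @PosValID",
    connection);

update.Parameters.Add("@Amount", SqlDbType.Decimal, 0, "Amount");
update.Parameters.Add("@Format", SqlDbType.VarChar, 50, "Format");
update.Parameters.Add("@PosValID", SqlDbType.Int, 4, "PosValID");

adapter.UpdateCommand = update;
adapter.Update(dataTable);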
You stated in a subsequent reply that you created a stored procedure to do
the update but that it did not improve performance. Could you elaborate on
what the procedure did? I'm assuming you created a procedure that accepted
parameters for each of the 100 columns and did the update for an entire row
at a time, requiring 100 calls to the procedure instead of 10000 calls to a
procedure that did 1 column at a time...
Is that correct?
Bob Barrows
Microsoft MVP -- ASP/ASP.NET
Please reply to the newsgroup. The email account listed in my From
header is my spam trap, so I don't check it very often. You will get a
quicker response by posting to the newsgroup.
|||Klaus,
You can use SQLXML that comes with MDAC to reduce the number of hits going
to your database. You can then send an updategram. With 2.0, you have a
graceful upgrade to the Managed SQLXML driver, so it's not a dead end.
This is obviously SQL Server specific, but Oracle has its own equivalent
solutions.
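In case it helps, an updategram looks roughly like this. This is an untested
sketch from memory, so treat the table/column names, connection string and
exact syntax as placeholders and check them against the SQLXML documentation:

using Microsoft.Data.SqlXml;   // managed SQLXML classes

// Minimal updategram: the <before> image identifies the row, the <after>
// image carries the new values. You can put many rows into one sync block,
// so a whole batch of changes goes to the server in a single round trip.
string updg =
    "<ROOT xmlns:updg='urn:schemas-microsoft-com:xml-updategram'>" +
    " <updg:sync>" +
    "  <updg:before><PosVal PosValID='1'/></updg:before>" +
    "  <updg:after><PosVal PosValID='1' Amount='42'/></updg:after>" +
    " </updg:sync>" +
    "</ROOT>";

SqlXmlCommand cmd = new SqlXmlCommand(
    "Provider=SQLOLEDB;Data Source=(local);Initial Catalog=Finance;Integrated Security=SSPI");
cmd.CommandType = SqlXmlCommandType.UpdateGram;
cmd.CommandText = updg;
cmd.ExecuteNonQuery();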
- Sahil Malik
http://codebetter.com/blogs/sahil.malik/
"Klaus Aschenbrenner" <Klaus.Aschenbrenner@.anecon.com> wrote in message
news:#hBwlNzGFHA.2976@.TK2MSFTNGP15.phx.gbl...
> Hi!
> I've a DataTable with 100 rows and 100 columns. Then I'm updating each
> column in each row. When I call the Update method on the DataAdapter, this
> generates 10000 UPDATE statements.
> Is there a better solution that reduces the number of UPDATE statements?
> Or how can I handle such big updates with SQL Server?
> Thanks
> Klaus Aschenbrenner
> MVP Visual C#
> www.csharp.at,www.anecon.com
> http://weblogs.asp.net/klaus.aschenbrenner
>
|||First and foremost, an RDBMS is NOT FOR PERFORMANCE. You gain performance
by using the tool correctly and using the native enhancements to boost
performance. That is not to say that an RDBMS can not be fast, on the
contrary; however, that is not its chief purpose.
If performance is your ONLY concern, use a flat file or an XML file.
You use an RDBMS to MODEL THE DATA, so that others can query it in a myriad
of ways and guarantee that their results are accurate. Therefore, you have
to use the RELATIONAL rules to model your data before you build the physical
database and constrain it in order to provide DATA INTEGRITY. It is this
integrity that you build on an RDBMS system. The system is optimized for
performance, but only after providing the foundation, a relational database
properly constrained.
Sincerely,
Anthony Thomas
"Klaus Aschenbrenner" <Klaus.Aschenbrenner@.anecon.com> wrote in message
news:%23hBwlNzGFHA.2976@.TK2MSFTNGP15.phx.gbl...
Hi!
I've a DataTable with 100 rows and 100 columns. Then I'm updating each
column in each row. When I call the Update method on the DataAdapter, this
generates 10000 UPDATE statements.
Is there a better solution that reduces the number of UPDATE statements? Or
how can I handle such big updates with SQL Server?
Thanks
Klaus Aschenbrenner
MVP Visual C#
www.csharp.at,www.anecon.com
http://weblogs.asp.net/klaus.aschenbrenner
|||You must be a consultant.

- Sahil Malik
http://codebetter.com/blogs/sahil.malik/
"Anthony Thomas" <ALThomas@.kc.rr.com> wrote in message
news:OIk#AyDHFHA.3352@.TK2MSFTNGP10.phx.gbl...
> First and foremost, an RDBMS is NOT FOR PERFORMANCE. You gain performance
> by using the tool correctly and using the native enhancements to boost
> performance. That is not to say that an RDBMS can not be fast, on the
> contrary; however, that is not its chief purpose.
> If performance is your ONLY concern, use a flat file or an XML file.
> You use an RDBMS to MODEL THE DATA, so that others can query it in a
> myriad of ways and guarantee that their results are accurate. Therefore,
> you have to use the RELATIONAL rules to model your data before you build
> the physical database and constrain it in order to provide DATA INTEGRITY.
> It is this integrity that you build on an RDBMS system. The system is
> optimized for performance, but only after providing the foundation, a
> relational database properly constrained.
> Sincerely,
>
> Anthony Thomas
>
>
> --
> "Klaus Aschenbrenner" <Klaus.Aschenbrenner@.anecon.com> wrote in message
> news:%23hBwlNzGFHA.2976@.TK2MSFTNGP15.phx.gbl...
> Hi!
> I've a DataTable with 100 rows and 100 columns. Then I'm updating each
> column in each row. When I call the Update method on the DataAdapter, this
> generates 10000 UPDATE statements.
> Is there a better solution that reduces the number of UPDATE statements?
> Or how can I handle such big updates with SQL Server?
> Thanks
> Klaus Aschenbrenner
> MVP Visual C#
> www.csharp.at,www.anecon.com
> http://weblogs.asp.net/klaus.aschenbrenner
>
|||Klaus,
The rows are updated depending on their RowState.
It can be that when you start, the RowState of every row is set to "Added"
even though you don't want to update them all. (DataAdapter.Fill sets them
automatically to "Unchanged" unless you set the AcceptChangesDuringFill
property to false; however, when you load the rows by hand, for instance with
a DataReader, they are all set to "Added".)
In that case you can set the RowState of all rows back to "Unchanged" by
calling AcceptChanges on the DataSet.
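Roughly like this (a small sketch, assuming the DataSet and adapter already
exist):

// After filling the DataSet by hand (for instance with a DataReader),
// every row is in the "Added" state, so first mark everything "Unchanged":
dataSet.AcceptChanges();

// From now on only the rows you really change become "Modified", and
// adapter.Update() only sends statements for those rows.
adapter.Update(dataSet);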
Maybe this helps?
Cor
|||If only consultants would be so highly critical.
Sincerely,
Anthony Thomas
"Sahil Malik" <contactmethrumyblog@.nospam.com> wrote in message
news:%23Phk7IcHFHA.2984@.TK2MSFTNGP15.phx.gbl...
You must be a consultant.

- Sahil Malik
http://codebetter.com/blogs/sahil.malik/
"Anthony Thomas" <ALThomas@.kc.rr.com> wrote in message
news:OIk#AyDHFHA.3352@.TK2MSFTNGP10.phx.gbl...
> First and foremost, an RDBMS is NOT FOR PERFORMANCE. You gain performance
> by using the tool correctly and using the native enhancements to boost
> performance. That is not to say that an RDBMS can not be fast, on the
> contrary; however, that is not its chief purpose.
> If performance is your ONLY concern, use a flat file or an XML file.
> You use an RDBMS to MODEL THE DATA, so that others can query it in a
> myriad of ways and guarantee that their results are accurate. Therefore,
> you have to use the RELATIONAL rules to model your data before you build
> the physical database and constrain it in order to provide DATA INTEGRITY.
> It is this integrity that you build on an RDBMS system. The system is
> optimized for performance, but only after providing the foundation, a
> relational database properly constrained.
> Sincerely,
>
> Anthony Thomas
>
>
> --
> "Klaus Aschenbrenner" <Klaus.Aschenbrenner@.anecon.com> wrote in message
> news:%23hBwlNzGFHA.2976@.TK2MSFTNGP15.phx.gbl...
> Hi!
> I've a DataTable with 100 rows and 100 columns. Then I'm updating each
> column in each row. When I call the Update method on the DataAdapter, this
> generates 10000 UPDATE statements.
> Is there a better solution that reduces the number of UPDATE statements?
> Or how can I handle such big updates with SQL Server?
> Thanks
> Klaus Aschenbrenner
> MVP Visual C#
> www.csharp.at,www.anecon.com
> http://weblogs.asp.net/klaus.aschenbrenner
>