Guest
I have a 20 MB in-memory stack-based array and I'd like to normalise it in the
client's memory... then add foreign keys, and display the results on a datagrid.
I discovered that converting the stack to a DataTable increases my memory
utilisation to 120 MB (lots of overhead).
A couple of questions:
1)- Is that memory increase typical? All I want to do is bind it to a
datagrid.
2)- Suppose I dump it to a CSV file or a SQL database. Is it possible to
retrieve only a subset of all that data? ...And page through the data when the
user scrolls (given that this is not an ASP application)?
3)- Lastly, and **perhaps most importantly**: how do people in the real
world access and update normalised data in large databases like this one
from C#? Is the data joined by a stored procedure, by in-memory DataTable
FKs, or by a Transact-SQL command issued by the client?
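To make the scenario concrete, here's roughly the kind of thing I mean by "in-memory FKs" and by paging a subset from SQL (all table, column, and query names below are illustrative, not my actual schema):

```csharp
using System.Data;

class Sketch
{
    static void Main()
    {
        // Question 3, client-side option: normalise on the client with a
        // DataSet and relate the tables via an in-memory foreign key.
        var ds = new DataSet();

        var parents = ds.Tables.Add("Parent");
        parents.Columns.Add("Id", typeof(int));
        parents.PrimaryKey = new DataColumn[] { parents.Columns["Id"] };

        var children = ds.Tables.Add("Child");
        children.Columns.Add("Id", typeof(int));
        children.Columns.Add("ParentId", typeof(int));

        // DataRelation enforces the FK in memory and lets the grid
        // navigate parent -> child rows.
        ds.Relations.Add("FK_Child_Parent",
            parents.Columns["Id"], children.Columns["ParentId"]);

        // Question 1: binding is then just
        //   dataGridView.DataSource = children;

        // Question 2, server-side option: fetch only one page at a time
        // instead of the whole 20 MB, e.g. (SQL Server 2005+ syntax):
        string pagedQuery =
            @"SELECT Id, ParentId FROM
                (SELECT Id, ParentId,
                        ROW_NUMBER() OVER (ORDER BY Id) AS rn
                 FROM Child) AS t
              WHERE rn BETWEEN @first AND @last;";
    }
}
```

The trade-off I'm asking about is between that first approach (everything client-side, with the DataTable overhead I'm seeing) and the second (keep the data in SQL and page it in as the user scrolls).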