Here is my issue:
We have two types of reports on our site: data shown in a grid, and data downloaded directly as a report file.
These reports can contain a few years of data (1+ million rows). We have been allowing our customers to download the data for a date range, but we have started limiting how large a range they can view to prevent performance issues on our site. However, now that our customers are expanding, the data is getting pretty large even for a small date range, and if they download too much, our memory usage spikes to several gigabytes and we run out of memory.
My question: I'd rather not limit their data, so I'm trying to find a good solution that lets them download as much as they want.
For the grid I can limit what they see by only returning one page of data at a time, so there are no performance issues there; downloading is always the problem.
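For reference, the grid paging is roughly like this (simplified sketch; the stored procedure name and parameters are placeholders, and I'm assuming SQL Server 2012+ so the proc can use OFFSET/FETCH):

```csharp
// Hypothetical paged fetch - "MyPagedProc", @Offset and @PageSize are made-up names
using (SqlConnection con = new SqlConnection("MyConnectionString"))
using (SqlCommand cmd = new SqlCommand("MyPagedProc", con))
{
    cmd.CommandType = CommandType.StoredProcedure;
    // The proc ends with: ORDER BY ... OFFSET @Offset ROWS FETCH NEXT @PageSize ROWS ONLY
    cmd.Parameters.AddWithValue("@Offset", pageIndex * pageSize);
    cmd.Parameters.AddWithValue("@PageSize", pageSize);
    con.Open();

    DataTable page = new DataTable();
    new SqlDataAdapter(cmd).Fill(page); // only one page ever lives in memory
    grid.DataSource = page;
    grid.DataBind();
}
```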
I've looked into async but haven't been able to get that to work, because the memory still spikes while I'm loading the data.
Ideas? Thoughts? Suggestions?
Code Example:
// Get Data
SqlConnection con = new SqlConnection();
SqlCommand cmd = new SqlCommand();
SqlDataAdapter da;
DataSet ds = new DataSet();

con.ConnectionString = "MyConnectionString";
con.Open();
cmd.Connection = con;
cmd.CommandType = CommandType.StoredProcedure;
cmd.CommandText = "MyStoredProc";
da = new SqlDataAdapter(cmd);
da.Fill(ds);   // entire result set is pulled into memory here
con.Close();

// Render the full result set as an HTML table
StringWriter sw = new StringWriter();
HtmlTextWriter htw = new HtmlTextWriter(sw);
DataGrid dg = new DataGrid();
dg.DataSource = ds.Tables[0];
dg.DataBind();
dg.RenderControl(htw);

// Send it as an "Excel" download
Response.ClearContent();
Response.ContentType = "application/vnd.ms-excel";
Response.AddHeader("Content-Disposition", "attachment;filename=Report.xls");
Response.Write(sw.ToString());
Response.End();
When I run this with my data (approximately 800k rows), memory spikes and I get an out-of-memory error. To make things worse, it hangs at RenderControl until it's done.
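Is something like the following streaming approach the right idea? This is a rough, untested sketch of what I mean: skip the DataSet/DataGrid entirely, read rows one at a time with a SqlDataReader, and flush the response periodically so nothing accumulates in memory (I've switched to CSV instead of the fake-Excel HTML, and left out CSV quoting/escaping for brevity):

```csharp
Response.Clear();
Response.ContentType = "text/csv";
Response.AddHeader("Content-Disposition", "attachment;filename=Report.csv");
Response.BufferOutput = false; // stream to the client instead of buffering the whole file

using (SqlConnection con = new SqlConnection("MyConnectionString"))
using (SqlCommand cmd = new SqlCommand("MyStoredProc", con))
{
    cmd.CommandType = CommandType.StoredProcedure;
    con.Open();

    // SequentialAccess avoids buffering each row; columns must be read in order
    using (SqlDataReader rdr = cmd.ExecuteReader(CommandBehavior.SequentialAccess))
    {
        // Header row
        for (int i = 0; i < rdr.FieldCount; i++)
            Response.Write((i > 0 ? "," : "") + rdr.GetName(i));
        Response.Write("\r\n");

        int rowCount = 0;
        while (rdr.Read())
        {
            for (int i = 0; i < rdr.FieldCount; i++)
                Response.Write((i > 0 ? "," : "") + rdr[i]); // NOTE: no CSV escaping here
            Response.Write("\r\n");

            if (++rowCount % 1000 == 0)
                Response.Flush(); // push this chunk to the client and release the buffer
        }
    }
}
Response.Flush();
```

The idea being that only one row (plus a small output buffer) is ever in memory, instead of 800k rows plus the rendered HTML string.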