4 Replies - 9173 Views - Last Post: 01 October 2012 - 07:45 AM

#1 November-06

  • D.I.C Regular

Reputation: 46
  • Posts: 407
  • Joined: 04-January 11

System Out Of Memory Exception

Posted 26 September 2012 - 09:25 PM

I am generating a report containing a summary of sales. I get the SUM of sales, grouped by organization. No problem there. The data needed for the summary is stored in a table; let's call it tblSales.

During testing, I was asked to give the testers an Excel file containing the raw data used as the basis for the summary, so I needed to hand over the whole of tblSales for them to check.

I displayed the data using this simple query:
SELECT * FROM tblSales


The query ran for almost 35 minutes and displayed 830,000+ rows.

I wanted to copy the displayed rows and paste them into the Excel file, but after copying, a System Out of Memory exception was displayed in a message box. I tried copying three or four times and got the same error each time. Then my computer crashed, SQL Server Management Studio closed, and the result set from the 35-minute query was lost, so I had to run the query all over again.

So what should I do to transfer the data to Excel successfully, without hitting this error? Any advice?


Replies To: System Out Of Memory Exception

#2 thava

  • D.I.C Lover

Reputation: 180
  • Posts: 1,606
  • Joined: 17-April 07

Re: System Out Of Memory Exception

Posted 26 September 2012 - 09:31 PM

Save the results in CSV format from SQL Server Management Studio, then open the saved file in Excel.
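
If you'd rather not go through the results grid at all, the bcp command-line utility can write the table straight to a CSV file. A minimal sketch, assuming Windows authentication and placeholder server and database names:

rem -c = character mode, -t, = comma field terminator, -T = trusted (Windows) connection
bcp "SELECT * FROM SalesDB.dbo.tblSales" queryout "C:\export\tblSales.csv" -c -t, -S YOURSERVER -T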

#3 e_i_pi

  • = -1

Reputation: 801
  • Posts: 1,700
  • Joined: 30-January 09

Re: System Out Of Memory Exception

Posted 27 September 2012 - 07:32 PM

As thava said, saving to a CSV file will be much less intensive than trying to use the clipboard. You need to right-click in the results area to access the 'Save Results As...' functionality:
[Attached image: screenshot of the results grid's right-click menu showing the 'Save Results As...' option]

EDIT: Alternatively, you can skip generating the result set to the window altogether by right-clicking the database in SSMS and selecting 'Tasks > Export Data...'. The database should then be automatically selected as the source, and the next step of the wizard is where you define the target: choose 'Microsoft Excel' from the Destination dropdown. From there you should be able to get through the rest of it easily enough. Unfortunately, it doesn't look like you can select which columns you want, in case you only wanted a subset of the table.

This post has been edited by e_i_pi: 27 September 2012 - 07:46 PM
Reason for edit: More info


#4 BBeck

  • Here to help.

Reputation: 591
  • Posts: 1,318
  • Joined: 24-April 12

Re: System Out Of Memory Exception

Posted 01 October 2012 - 07:05 AM

If it took 35 minutes to display less than a million rows, it sounds like the rows are rather wide and you are dealing with a massive amount of data here.

It's a little questionable that they even need this quantity of data in Excel. It's SQL Server's job to sort through the data and give answers, not to hand over all the data and let them work out their own answers. A database isn't just a file for storing data someplace; it's there to return answers, not raw data.
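
In your case, the "answer" is the summary itself. As a rough sketch, the report query presumably looks something like this (the Organization and SalesAmount column names are placeholders; the thread doesn't show the actual schema):

-- one answer row per organization, instead of 830,000 raw rows
SELECT Organization,
       SUM(SalesAmount) AS TotalSales
FROM tblSales
GROUP BY Organization
ORDER BY Organization;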

Nonetheless, sometimes you need to give them data in Excel for whatever reason. Everyone's got you on the right track here. You shouldn't be using the clipboard for 35 minutes' worth of data; I'm not at all surprised that it crashed your computer, although we like to think that Windows should be more "robust" than to just crash. Copy it to a disk-based file like a .CSV instead, whether you use the Export Wizard, BCP, or just redirect your output in Management Studio to a file rather than to the grid.
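
Redirecting to a file can be scripted, too. A minimal sketch with sqlcmd, again assuming Windows authentication and placeholder server and database names:

rem -E = Windows auth, -s"," = comma separator, -W = trim trailing spaces, -o = output file
rem SET NOCOUNT ON keeps the "(830000 rows affected)" footer out of the CSV
sqlcmd -S YOURSERVER -d SalesDB -E -W -s"," -Q "SET NOCOUNT ON; SELECT * FROM tblSales" -o "C:\export\tblSales.csv"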

The next problem you may have is getting that much data into Excel, because Excel isn't going to like it. If you have an older version of Excel, it will simply tell you to "take a long walk off a short pier": up through Excel 2003, a worksheet only holds 65,536 rows. Excel 2007 and later will take far more than that (1,048,576 rows), but just because you can load the data in Excel doesn't mean Excel or the computer are going to be happy about it. If they are putting it in Excel, it's likely to be on a client machine that was given the lowest specs they could get away with; in other words, the machine is likely to have very little RAM. So you may get it loaded and wish that you hadn't, with crashing and disk thrashing and just an all-around painful process.
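
If you're stuck with an older Excel's row limit, one workaround is to export the table in worksheet-sized slices. A sketch using ROW_NUMBER(), assuming tblSales has a unique key column (SalesID here is a placeholder):

-- number every row by the key, then pull one worksheet-sized window at a time
WITH numbered AS
(
    SELECT *, ROW_NUMBER() OVER (ORDER BY SalesID) AS rn
    FROM tblSales
)
SELECT *
FROM numbered
WHERE rn BETWEEN 1 AND 65000;   -- next file: 65001-130000, and so on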

Ideally, they would start over at the beginning and look at what "question" they are trying to ask (the problem they are trying to solve). Then they would ask SQL Server the question (assign the problem to SQL Server), rather than asking for all the data. SQL Server is designed to crunch through those massive amounts of data; Excel is not. SQL Server is also not designed to hand over those quantities of data, and it will perform exceedingly poorly when you constantly ask it to hand over all the data.

People often think they are somehow making it easier on the server by asking for all the data so that they can do the processing themselves. In fact, that's exactly the worst thing you can do to SQL Server; you're not helping it, you are doing the number one thing you can possibly do to kill SQL Server's performance and lock up the server so that no one can use it. That's because it's designed to answer questions, not stream data. It's designed for a client-server environment where SQL Server does all the work and the client is nothing more than a pretty display (think of a webpage with nothing but standard HTML).

SQL Server likes answering queries that return a limited result set that a human can read. What kills its performance is when you ask it for more data than one person can read with their eyes. If you can't read all of it with your eyes, then you are doing SQL Server's job for it rather than letting SQL Server do its job.
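
For eyeballing the data, a slice like this is all anyone can actually read anyway (SalesID is again a placeholder key):

-- a human-readable sample for spot-checking, instead of the whole table
SELECT TOP (100) *
FROM tblSales
ORDER BY SalesID;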

No person can actually consume a million rows of data, so "almost" any time you are asking for that many rows, you are doing something wrong. Often I see people do this because they don't know how to use SQL Server to answer their questions, so they import the data into a product they do know how to use and do SQL Server's job themselves. In those cases, they need to stop using SQL Server and possibly find something that's designed to stream data, or just use a standard file server.

On the other hand, sometimes you want to move large chunks of data around for data warehousing or some such thing. That's a job for replication.

Anyway, I apologize for the lecture; you guys may have a good reason for putting it into Excel. But I would say it's worth rethinking why you are putting it in Excel, if you haven't already. Excel is not designed to deal with millions of records. Generally, you want SQL Server to deal with the millions of records and return one screen full of data to Excel as needed.

And if you haven't seen Analysis Services with Excel Power Pivot, it's pretty awesome and allows users to work with absolutely massive amounts of data in Excel where they can do their own business analysis on the data.

I hope that's helpful.

P.S. I went back and re-read the original post. It sounds like you're just using Excel to let the testers cross-check that the report is running correctly. I can understand that, and certainly their desire to do it. Excel still may choke on the data, though. You may be better off giving the testers database access so that they can query the database themselves to cross-check the report. Hopefully this is all running in a test environment, where any query they run won't hurt production. Anyway, what I said may or may not apply to your particular situation, but it's something to think about.
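
Granting that kind of read-only access might look roughly like this; every name and the password below are placeholders, and this assumes a test server:

-- server-level login for the testers (placeholder name and password)
CREATE LOGIN TesterLogin WITH PASSWORD = 'ChangeMe!123';
GO
USE SalesDB;  -- placeholder database name
CREATE USER TesterUser FOR LOGIN TesterLogin;
-- db_datareader members can SELECT from every table, and nothing more
EXEC sp_addrolemember N'db_datareader', N'TesterUser';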

This post has been edited by BBeck: 01 October 2012 - 07:18 AM


#5 BBeck

  • Here to help.

Reputation: 591
  • Posts: 1,318
  • Joined: 24-April 12

Re: System Out Of Memory Exception

Posted 01 October 2012 - 07:45 AM

Incidentally, since we're on the subject: a million rows may or may not be a lot of data, depending on how wide each row is.

If the rows are 2 bytes wide, then a million rows of data is 2MB, which is nothing on today's machines. However, if you max out the row size without going to cross-page records (which would allow you to exceed the roughly 8,060-byte in-row width limit), then you have about 8GB of data. (Under the hood in SQL Server, your problem is actually probably quite a bit worse, because empty pages from page splits and such can cause what "should" be 8GB to become 12 or 16GB or some size larger than 8. Lesson learned: be careful how wide you make your rows.)
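
Rather than estimating from row width, you can also just ask SQL Server what the table actually occupies, page bloat and all:

-- reports rows, reserved, data, index_size, and unused space for the table
EXEC sp_spaceused N'dbo.tblSales';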

8GB may not sound like a lot of data by today's standards, but it is, especially if your hardware isn't brand new.

In order for a program to deal with that data, it has to pull all of it off disk and into memory.

On a 32-bit computer, the maximum memory available to your program is roughly 1.6GB. Yes, that's right. You may have heard 4GB is the max. Your machine may have 4GB installed (essentially the maximum you can physically install in a 32-bit machine). But even with 4GB of RAM, 100% of it will not go to your app, whether your app is SQL Server running on your machine, or Management Studio, or Excel, or any other app. Even assuming your app is the only thing running, Windows itself will use 2GB of that 4GB. In the real world, video drivers and other things eat up about another 0.4GB, leaving you with roughly 1.6GB to run whatever it is you're trying to run, assuming you've got the full 4GB of RAM installed. It only gets worse if you have less physical memory. This applies to servers as well, although on a SQL Server with more than 4GB of memory installed you can use PAE and AWE to skirt the issue; that's not much of an option on a client machine, however.

So now, that 8GB is VERY significant, because you have only about a fifth of the memory you need to do something even as simple as summing a value for every row. This is a very ugly problem even for SQL Server, although it's designed to deal with this sort of thing in the best possible way (which still may not be enough; go buy a 64-bit server and some memory). But client apps aren't nearly as smart about dealing with such a severe memory shortage.

And remember, we're just talking about one table here, with a million rows and the width maxed out (assuming you haven't exceeded the max, which is possible nowadays with off-page data). A database "should" have more than one table.

So this is both a call to realize that your SQL Server needs to be 64-bit with as much RAM as you can give it, and a call to understand that client machines are still often 32-bit with 2GB of RAM, meaning you're trying to squeeze 8GB of data into something like 0.5GB of usable physical memory. (I don't know exactly how much physical memory your app would be given on a machine with 2GB of RAM, but Windows will have no choice but to share that 2GB with your app and anything else running on the machine; 0.5GB sounds about right to me.)

Anyway, that's something to think about any time you are getting out-of-memory errors related to SQL Server data.

This post has been edited by BBeck: 01 October 2012 - 07:48 AM

