3 Replies - 934 Views - Last Post: 04 December 2011 - 08:49 PM

#1 ellexy

  • New D.I.C Head
  • Reputation: 0
  • Posts: 5
  • Joined: 27-November 11

Question: dealing with database memory

Posted 01 December 2011 - 06:11 AM

Hello, I am still a student building a loan system. The system handles about a thousand customers every day, whose payments need to be entered. My question concerns the amount of data my database will have to hold. (Assume my database is already properly normalized.)

I have a normalized table that stores each day's payment amount, the customer_no, and the balance remaining for that day. Since I deal with a lot of customers every day (say 1,000 customers' payments entered into that table in a single day), I will be producing about 1,000 rows per day. :blink:

Will my MS SQL database be able to support that number of rows, which in a year could reach well over a hundred thousand?

The alternative I have been considering is to back up that table at the end of each day and truncate it for the next day's input, but the problem is that this table has relationships with other tables (something like the table of target performance, and so on). :huh2:
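For reference, here is a rough sketch of the kind of table I mean. The table name, column names, and types are placeholders for illustration, not my actual schema, and it assumes a customer table (dbo.Customer here) already exists:

    -- Hypothetical schema for the daily payments table described above.
    -- Names and types are placeholders only; dbo.Customer is assumed to exist.
    CREATE TABLE dbo.DailyPayment
    (
        payment_id    INT IDENTITY(1,1) PRIMARY KEY,
        customer_no   INT           NOT NULL,  -- references the customer table
        payment_date  DATE          NOT NULL,
        amount_paid   DECIMAL(12,2) NOT NULL,
        balance       DECIMAL(12,2) NOT NULL,  -- balance remaining that day
        CONSTRAINT FK_DailyPayment_Customer
            FOREIGN KEY (customer_no) REFERENCES dbo.Customer (customer_no)
    );

A row like this is only a few dozen bytes of actual data, if that helps put the question in perspective.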


Replies To: Question: dealing with database memory

#2 Atli

  • Enhance Your Calm
  • Reputation: 4240
  • Posts: 7,216
  • Joined: 08-June 10

Re: Question: dealing with database memory

Posted 01 December 2011 - 07:20 AM

ellexy, on 01 December 2011 - 01:11 PM, said:

Will my MS SQL database be able to support that number of rows, which in a year could reach well over a hundred thousand?

Yes, easily, provided the server computer was built sometime in the last decade and there aren't other processes eating up the server's resources.

A properly set up MSSQL server can potentially handle more than a thousand transactions per second. Not that I can say whether your server can do that; a lot depends on your hardware and server configuration. But 1,000 inserts spread over a day works out to roughly one insert every 86 seconds... my phone has the hardware to run that server :)

Also, a hundred thousand rows, each with a handful of fields like you described, is nothing to be worried about, really. I doubt any of the mainstream databases would have trouble dealing with that, least of all MSSQL and Oracle (the "Enterprise" end of the RDBMS selection).

#3 ellexy

  • New D.I.C Head
  • Reputation: 0
  • Posts: 5
  • Joined: 27-November 11

Re: Question: dealing with database memory

Posted 04 December 2011 - 08:35 PM

Can Microsoft SQL Server Express Edition hold this large an amount of data?

#4 Atli

  • Enhance Your Calm
  • Reputation: 4240
  • Posts: 7,216
  • Joined: 08-June 10

Re: Question: dealing with database memory

Posted 04 December 2011 - 08:49 PM

Looking through the SQL Server comparison chart from Microsoft, it looks like the Express edition of 2008 R2 is limited to 10GB per database, and a maximum of 1GB of RAM usage.

Assuming 10,000 rows, that still leaves roughly 1MB per row; even at a few hundred thousand rows a year, you would have tens of kilobytes per row to play with. And at a rate of 1,000 inserts per day, the RAM usage will barely even register.

So yea, the Express edition should also be able to handle this with ease.
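If you do go with Express and want to keep an eye on how close the database is getting to that 10GB cap, the built-in sp_spaceused procedure reports the current size. A minimal check might look like this; the database name LoanSystem is just a placeholder:

    -- Report the current size of the database and how much of it is unallocated.
    -- "LoanSystem" is a hypothetical database name used only for illustration.
    USE LoanSystem;
    GO
    EXEC sp_spaceused;
    GO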

Edit:
Note, however, that the Express edition is not a server edition. It is targeted at stand-alone desktop applications, meant to be shipped as part of the application to manage individual local database files (like the popular SQLite, if you are familiar with that).

If you are creating a server, I would recommend sticking with one of the server editions instead.
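If you are not sure which edition a given instance is actually running, a quick way to find out is to ask the server itself; this is standard T-SQL, shown here only as a convenience:

    -- Ask the running instance which edition and version it is.
    SELECT
        SERVERPROPERTY('Edition')        AS Edition,        -- e.g. "Express Edition"
        SERVERPROPERTY('ProductVersion') AS ProductVersion;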

This post has been edited by Atli: 04 December 2011 - 08:53 PM

