Hi,
We need to manage about 900 million rows/year of data of one kind. I am
looking for suggestions on how best to design a database table (or
tables) to handle this.
The data is essentially information across various markets and time. We
need the ability to search across markets/time and to update across
markets and time.
I did some basic tests, and extrapolating from them leads me to believe
that if I were to have this as a single table, the table size would be
over 100 GB.
Thanks|||* Get a fast machine with a lot of RAM.
* Have much patience!
* Look into partitioned tables (rough sketch below).
* Look into OLAP cubes.
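As a rough sketch of the partitioned-table idea (this assumes SQL Server
2005's table partitioning; the table and column names here are made up,
not anything from your schema):

-- Hypothetical monthly partitioning of a market/time table.
CREATE PARTITION FUNCTION pfMonthly (datetime)
AS RANGE RIGHT FOR VALUES ('20070201', '20070301', '20070401');

CREATE PARTITION SCHEME psMonthly
AS PARTITION pfMonthly ALL TO ([PRIMARY]);

CREATE TABLE dbo.MarketData (
    TradeDate datetime       NOT NULL,  -- partitioning column
    MarketId  int            NOT NULL,
    Price     decimal(18, 4) NOT NULL
) ON psMonthly (TradeDate);

Queries and purges that filter on the partitioning column can then touch
only the relevant partitions instead of the whole 100 GB.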
Which is only to say, as the scale of your app challenges the hardware,
be very sure you know what your real requirements are. A minor glitch in
design can cost you big when your data is big, but what's a glitch and
what's a feature depends on the situation.
Sounds like fun anyway, good luck!
Josh
"shikarishambu" wrote:
> Hi,
> We need to manage about 900 million rows/year of data of one kind. I am
> looking for suggestions on how best to design a database table (or
> tables) to handle this.
> The data is essentially information across various markets and time. We
> need the ability to search across markets/time and to update across
> markets and time.
> I did some basic tests, and extrapolating from them leads me to believe
> that if I were to have this as a single table, the table size would be
> over 100 GB.
> Thanks
>
>|||Hire a pro to do the design and spec work. To do anything else is to
set yourself up for extreme disappointment. The cost now will be a LOT
less than in the future when your system is live and non-performant!! :-)
Kevin G. Boles
TheSQLGuru
Indicium Resources, Inc.
"JRStern" <JRStern@.discussions.microsoft.com> wrote in message
news:E4D50946-48C9-4264-B344-E35F8E757FAF@.microsoft.com...
> * Get a fast machine with a lot of RAM.
> * Have much patience!
> * Look into partitioned tables.
> * Look into OLAP cubes.
> Which is only to say, as the scale of your app challenges the hardware,
> be very sure you know what your real requirements are. A minor glitch in
> design can cost you big when your data is big, but what's a glitch and
> what's a feature depends on the situation.
> Sounds like fun anyway, good luck!
> Josh
>
> "shikarishambu" wrote:
>|||Indexing and partitioning will be your friends with a huge table such
as this. Also, if you find a need to update large amounts of data,
consider doing it in batches to avoid lock-escalation issues.
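A minimal sketch of the batching idea (table and column names are
hypothetical; UPDATE TOP requires SQL Server 2005 or later):

-- Hypothetical batched update; loops until no qualifying rows remain.
DECLARE @rows int;
SET @rows = 1;
WHILE @rows > 0
BEGIN
    UPDATE TOP (4000) dbo.MarketData
    SET    AdjustedPrice = Price * 1.01
    WHERE  AdjustedPrice IS NULL      -- only rows not yet updated
      AND  MarketId = 42;
    SET @rows = @@ROWCOUNT;
END

Keeping each statement under roughly 5,000 locks stays below the point
where SQL Server escalates row locks to a table lock.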
Kevin G. Boles
TheSQLGuru
Indicium Resources, Inc.
"shikarishambu" <shikarishambu70@.hotmail.com> wrote in message
news:%23hJvuDOPIHA.1208@.TK2MSFTNGP05.phx.gbl...
> Hi,
> We need to manage about 900 million rows/year of data of one kind. I am
> looking for suggestions on how best to design a database table (or
> tables) to handle this.
> The data is essentially information across various markets and time. We
> need the ability to search across markets/time and to update across
> markets and time.
> I did some basic tests, and extrapolating from them leads me to believe
> that if I were to have this as a single table, the table size would be
> over 100 GB.
> Thanks
>|||On Wed, 12 Dec 2007 14:49:29 -0600, "TheSQLGuru"
<kgboles@.earthlink.net> wrote:
>Hire a pro to do the design and spec work. To do anything else is to set
>yourself up for extreme disappointment. The cost now will be a LOT less
>than in the future when your system is live and non-performant!! :-)
Never time to do it right, always time to do it over.
J.|||Well, up to the point where it stops functioning or meeting SLAs. Then they
call for me (or another performance consultant) in a panic!! ;)
Kevin G. Boles
TheSQLGuru
Indicium Resources, Inc.
kgboles a earthlink dt net
"JXStern" <JXSternChangeX2R@.gte.net> wrote in message
news:sls1m3liavtoscl3l9doi5ajrg2ak2rs7j@.4ax.com...
> On Wed, 12 Dec 2007 14:49:29 -0600, "TheSQLGuru"
> <kgboles@.earthlink.net> wrote:
>
> Never time to do it right, always time to do it over.
> J.
>|||On Dec 12, 11:03 pm, "shikarishambu" <shikarishamb...@.hotmail.com>
wrote:
> Hi,
> We need to manage about 900 million rows/year of data of one kind. I am
> looking for suggestions on how best to design a database table (or
> tables) to handle this.
> The data is essentially information across various markets and time. We
> need the ability to search across markets/time and to update across
> markets and time.
> I did some basic tests, and extrapolating from them leads me to believe
> that if I were to have this as a single table, the table size would be
> over 100 GB.
> Thanks
Maybe you can design your database so it horizontally partitions your
data into 90 tables with 10 million records per table.
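If you go that route, a partitioned view is the usual way to make the
member tables look like one; a hedged sketch with illustrative names (the
CHECK constraints are what let the optimizer skip irrelevant tables):

-- Two illustrative member tables; a real design would have one per period.
CREATE TABLE dbo.MarketData_200701 (
    TradeDate datetime NOT NULL
        CHECK (TradeDate >= '20070101' AND TradeDate < '20070201'),
    MarketId  int NOT NULL,
    Price     decimal(18, 4) NOT NULL
);
CREATE TABLE dbo.MarketData_200702 (
    TradeDate datetime NOT NULL
        CHECK (TradeDate >= '20070201' AND TradeDate < '20070301'),
    MarketId  int NOT NULL,
    Price     decimal(18, 4) NOT NULL
);
GO
CREATE VIEW dbo.MarketData AS
    SELECT TradeDate, MarketId, Price FROM dbo.MarketData_200701
    UNION ALL
    SELECT TradeDate, MarketId, Price FROM dbo.MarketData_200702;

Note that making such a view updatable adds requirements, e.g. the
partitioning column has to be part of each member table's primary key.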