Is Postgres appropriate for this application? (1 billion inserts a year)
0 votes · 0 answers · 227 views
I am currently working on a drug record management system that needs to record and retain insurance billing information for patients. All the typical information (patient profiles, prescriber profiles, drug information, etc.) can easily be handled by a single Postgres server, since none of those tables grow past the tens of millions of rows, but I am worried about the billing information, as that enters billion-row territory.
Some back-of-the-napkin math:
Each of the ~2,000 locations performs about 300 billing transactions a day, which translates to about 1,000 inserts per location (each transaction writes multiple rows). So that's about 2,000 * 1,000 * 365 ≈ 730 million inserts a year.
There will also be a migration of historic billing data going back about a decade, so that's another ~7 billion records already there. Since this is medical information, **we're not allowed to delete any of it**, including all new insertions going forward.
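For concreteness, here is roughly the layout I have in mind: a single table range-partitioned on the billing date, so each year of data lives in its own child table and the never-delete retention is just a matter of keeping old partitions around. All table and column names below are placeholders, not the actual schema:

```sql
-- Hypothetical parent table, range-partitioned by billing date.
-- The decade of migrated history becomes ~10 yearly partitions
-- instead of one multi-billion-row heap.
CREATE TABLE billing (
    billing_id     bigint GENERATED ALWAYS AS IDENTITY,
    patient_id     bigint NOT NULL,
    billing_number text   NOT NULL,
    billed_at      date   NOT NULL,
    details        jsonb,                -- other insurance-related fields
    PRIMARY KEY (billing_id, billed_at)  -- partition key must be part of the PK
) PARTITION BY RANGE (billed_at);

-- One partition per year; the historic migration fills the older ones.
CREATE TABLE billing_2022 PARTITION OF billing
    FOR VALUES FROM ('2022-01-01') TO ('2023-01-01');
```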
Each billing row has a patient ID, a billing number, a date, and other insurance-related information. The table only needs to support three types of lookup (see the sketch after this list):
1) Finding all the billing rows for a particular patient
2) Finding all the billing rows for a particular billing number
3) Finding all the rows within a certain time period
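Assuming the partitioned layout sketched above, I would expect one plain b-tree index per lookup (again, names are placeholders; as I understand it, an index declared on the partitioned parent is created on each partition automatically):

```sql
-- 1) All billing rows for a particular patient.
CREATE INDEX billing_patient_idx ON billing (patient_id);

-- 2) All billing rows for a particular billing number.
CREATE INDEX billing_number_idx ON billing (billing_number);

-- 3) Rows within a time period: range partitioning on billed_at already
--    lets the planner prune to the matching partitions; an index on the
--    date narrows the scan inside each partition.
CREATE INDEX billing_date_idx ON billing (billed_at);

-- Example of lookup 3: only the 2022 partition should be scanned.
SELECT *
FROM billing
WHERE billed_at BETWEEN '2022-01-01' AND '2022-06-30';
```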
Is Postgres appropriate for this type of workload?
Asked by d124y (1 rep) on Aug 27, 2022, 12:18 AM