Hardware
The following hardware is required to run the AveriSource Platform™️. Requirements scale with the number of lines of code.
| Lines of Code | Number of Servers | Configuration |
| --- | --- | --- |
| Up to 1 million | Processing Server: 1<br>DB Server: 1<br>API Server (Linux): 1<br>GenAI Server (Linux): 1 | Processing Server:<br>vCPUs: 8 (per server)<br>Memory: 64 GB+ (per server)<br>Storage: 500 GB (per server)<br>DB Server:<br>vCPUs: 4 (per server)<br>Memory: 16 - 32 GB+ (per server)<br>Storage: 500 GB (per server)<br>API Server:<br>vCPUs: 4 - 8 (per server)<br>Memory: 16 - 32 GB (per server)<br>Storage: 84 GB (per server)<br>GenAI Server:*<br>GPU: Minimum 4 x NVIDIA A100 (70B model requirement)<br>vCPUs: 32 - 96<br>Memory: 440 - 880 GB<br>Storage: 768 GB |
| 1-5 million | Processing Server: 1<br>DB Server: 1<br>API Server (Linux): 1+<br>GenAI Server (Linux): 1 | Processing Server:<br>vCPUs: 8 (per server)<br>Memory: 128 GB+ (per server)<br>Storage: 500 GB (per server)<br>DB Server:<br>vCPUs: 8 (per server)<br>Memory: 128 GB+ (per server)<br>Storage: 500 GB (per server)<br>API Server:<br>vCPUs: 4 - 8 (per server)<br>Memory: 16 - 32 GB (per server)<br>Storage: 84 GB (per server)<br>GenAI Server:*<br>GPU: Minimum 4 x NVIDIA A100 (70B model requirement)<br>vCPUs: 64 - 96<br>Memory: 440 - 880 GB<br>Storage: 768 GB |
| 5-10 million | Processing Server: 1 or 2 (depending on business demand)<br>DB Server: 1 or 2<br>API Server (Linux): 1+<br>GenAI Server (Linux): 1 | Processing Server:<br>vCPUs: 16 (per server)<br>Memory: 128 GB+ (per server)<br>Storage: 1 TB (per server)<br>DB Server:<br>vCPUs: 16 (per server)<br>Memory: 128 GB+ (per server)<br>Storage: 1 TB (per server)<br>API Server:<br>vCPUs: 8 (per server)<br>Memory: 32 GB (per server)<br>Storage: 84 GB (per server)<br>GenAI Server:*<br>GPU: Minimum 4 x NVIDIA A100 (70B model requirement)<br>vCPUs: 64 - 96<br>Memory: 440 - 880 GB<br>Storage: 768 GB |
| 10 million+ | Processing Server: 2 or 3 (depending on business demand)<br>DB Server: 2 or 3<br>API Server (Linux): 1+<br>GenAI Server (Linux): 1 | Processing Server:<br>vCPUs: 16 (per server)<br>Memory: 128 GB+ (per server)<br>Storage: 1 TB (per server)<br>DB Server:<br>vCPUs: 16 (per server)<br>Memory: 128 GB+ (per server)<br>Storage: 1.5 TB (per server)<br>API Server:<br>vCPUs: 4 - 8 (per server)<br>Memory: 16 - 32 GB (per server)<br>Storage: 84 GB (per server)<br>GenAI Server:*<br>GPU: Minimum 4 x NVIDIA A100 (70B model requirement)<br>vCPUs: 64 - 96<br>Memory: 440 - 880 GB<br>Storage: 768 GB |

\*The GenAI Server is not required if leveraging AWS Bedrock.
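For illustration only, the sketch below shows one way a deployment script might pick a sizing tier from the table above based on the lines of code to be processed. The `Tier` dataclass, `TIERS` list, and `recommend` function are hypothetical names, not part of the AveriSource Platform, and only the Processing Server and DB Server columns are transcribed.

```python
"""Hypothetical sizing helper based on the hardware table above.

This is a sketch under stated assumptions, not an AveriSource API.
Values mirror the table; only Processing/DB Server figures are included.
"""

from dataclasses import dataclass


@dataclass
class Tier:
    """Server counts and per-server Processing Server specs for one lines-of-code band."""
    max_loc: float           # upper bound of the lines-of-code band
    processing_servers: str  # e.g. "1", "1 or 2"
    db_servers: str
    vcpus: int               # Processing Server vCPUs (per server)
    memory: str              # Processing Server memory (per server)
    storage: str             # Processing Server storage (per server)


# Tiers transcribed from the table above.
TIERS = [
    Tier(1_000_000,      "1",      "1",      8,  "64 GB+",  "500 GB"),
    Tier(5_000_000,      "1",      "1",      8,  "128 GB+", "500 GB"),
    Tier(10_000_000,     "1 or 2", "1 or 2", 16, "128 GB+", "1 TB"),
    Tier(float("inf"),   "2 or 3", "2 or 3", 16, "128 GB+", "1 TB"),
]


def recommend(lines_of_code: int) -> Tier:
    """Return the first tier whose upper bound covers the given LOC count."""
    for tier in TIERS:
        if lines_of_code <= tier.max_loc:
            return tier
    return TIERS[-1]


if __name__ == "__main__":
    t = recommend(7_500_000)
    print(f"Processing servers: {t.processing_servers}, "
          f"vCPUs: {t.vcpus}, memory: {t.memory}, storage: {t.storage}")
```

Running the example with 7,500,000 lines of code selects the 5-10 million tier (1 or 2 Processing Servers, 16 vCPUs, 128 GB+ memory, 1 TB storage per server), matching the table.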