This is probably a very basic question for most users of this forum, but I am new to the Linux/CentOS world, so please bear with a beginner.
I work in science, and I currently have to prepare a computer running CentOS 7 for new data-processing software. These are the specifications of the machine I will use:
- 2 × 2.4 GHz quad-core Intel Xeon
- 8 cores, 256 KB L2 cache per core, 12 MB L3 cache per processor
- 32 GB 1066 MHz DDR3 (8 slots × 4 GB)
- 250 GB SSD for the operating system
- 2 × 2 TB HDD in a RAID for data storage
These are the software developer's recommendations for the installation:
- Supported on 64 bit Linux using Intel and AMD processors
- Disk space required for installation: 91,345,683 Bytes
- "The amount of memory needed for compute nodes is highly dependent on the type and size of data being processed and the algorithms being applied. A traditional rule of thumb has been to configure compute nodes with at least 2 Gb of RAM per core with a 32 Gb minimum. This configuration has worked well for systems with fewer than 32 cores and that are primarily focused on time processing."
- "For login nodes we recommend at least 64 GB of RAM if being utilized by a single user."
- "Swap should equal 2x physical RAM for up to 2 GB of physical RAM, and then an additional 1x physical RAM for any amount above 2 GB, but never less than 32 MB. For systems with really large amounts of RAM (more than 32 GB) you can likely get away with a smaller swap partition (around 1x, or less, of physical RAM)"
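To make sure I understand the guidance, here is my own reading of the two sizing rules applied to this machine (the variables and arithmetic are mine, not the vendor's):

```shell
# My interpretation of the developer's sizing rules (not vendor code).
cores=8     # this machine: 2 quad-core Xeons
ram_gb=32   # physical RAM in GB

# RAM rule: at least 2 GB per core, with a 32 GB floor.
min_ram_gb=$(( cores * 2 ))
[ "$min_ram_gb" -lt 32 ] && min_ram_gb=32

# Swap rule: 2x RAM for the first 2 GB of RAM, then 1x RAM for the remainder.
if [ "$ram_gb" -le 2 ]; then
    swap_gb=$(( ram_gb * 2 ))
else
    swap_gb=$(( 4 + ram_gb - 2 ))   # 2x the first 2 GB + 1x the rest
fi

echo "Minimum RAM:    ${min_ram_gb} GB"
echo "Suggested swap: ${swap_gb} GB"
```

By this reading the machine just meets the 32 GB RAM minimum, and the strict rule suggests 34 GB of swap; since the quote also says systems above 32 GB can get away with roughly 1× RAM or less, I assume a ~32 GB swap partition would also be acceptable.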
The team will only use this machine to process data (volumes of around 100 GB), so I need just a very basic desktop environment, with the installation optimized for the best performance when handling this data.
My question is: which base environment and which add-ons should I select during installation to best meet these needs?
Thank you very much