Splice Machine Requirements
This topic summarizes the hardware and software requirements for Splice Machine running on a cluster or on a standalone computer, in these sections:
- Cluster Node Requirements summarizes requirements for running Splice Machine on a cluster.
- Standalone Version Prerequisites summarizes requirements for running the standalone version of Splice Machine.
- Java JDK Requirements summarizes the Java JDK requirements for running Splice Machine.
Cluster Node Requirements
The following table summarizes the minimum requirements for the nodes in your cluster:
| Component | Requirement |
| --------- | ----------- |
| Cores | Splice Machine recommends that each node in your cluster have 8-12 hyper-threaded cores (16-32 hyper-threads) for optimum throughput and concurrency. |
| Memory | We recommend that each machine in your cluster have at least 64 GB of available memory. |
| Disk Space | Your root drive needs at least 100 GB of free space. Splice Machine recommends separate data drives on each cluster node to keep the operating system separate from your database data. You need capacity for at least three times the size of the data you intend to load; the typical recommended configuration is 2 TB or more of attached storage per node. Data disks should be set up with a single partition and formatted with an ext4 file system. |
| Hadoop Ecosystem | The table in the next section, Hadoop Ecosystem Requirements, summarizes the specific Hadoop component versions that we support in each of our product releases. |
| Software Tools and System Settings | The [Linux Configuration](#LinuxConf) section below summarizes the software tools and system settings required for your cluster machines. |
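The data-disk guidance above (single partition, ext4) might be applied with a sketch like the following; the device name `/dev/sdb` and mount point `/data` are assumptions for illustration, not part of the requirements:

```shell
# Hedged sketch: partition a hypothetical data disk, format it ext4,
# and mount it. Destructive -- verify the device name first with lsblk.
parted -s /dev/sdb mklabel gpt
parted -s /dev/sdb mkpart primary ext4 0% 100%
mkfs.ext4 /dev/sdb1
mkdir -p /data
mount /dev/sdb1 /data
echo '/dev/sdb1 /data ext4 defaults,noatime 0 0' >> /etc/fstab
```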
Amazon Web Services (AWS) Requirements
If you’re running on AWS, your cluster must meet these minimum requirements:
| Requirement | Details |
| ----------- | ------- |
| Minimum Cluster Size | The minimum cluster size on AWS is 5 nodes. |
| Minimum Node Size | The minimum recommended size of each node is m4.4xlarge. |
| Minimum Storage Space | The required number of data drives per node depends on your use case. |
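As an illustration only, a 5-node cluster of m4.4xlarge instances could be launched with the AWS CLI; the AMI ID and key name below are placeholders (assumptions), and data drives must still be attached per node as your use case requires:

```shell
# Hedged sketch: launch five m4.4xlarge instances. All identifiers
# below are placeholders; substitute your own AMI and key pair.
aws ec2 run-instances \
  --image-id ami-xxxxxxxx \
  --instance-type m4.4xlarge \
  --count 5 \
  --key-name my-key
```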
Hadoop Ecosystem Requirements
The following table summarizes the required Hadoop ecosystem components for your platform:
| Platform | Supported Versions |
| -------- | ------------------ |
| CDH | 5.14.2, 5.14.0, 5.13.3, 5.13.2, 5.12.0 |
| HDP | 2.6.4, 2.6.3, 2.6.1, 2.5.5 |
Linux Configuration Requirements
The following table summarizes Linux configuration requirements for running Splice Machine on your cluster:
| Configuration | Details |
| ------------- | ------- |
| Configure SSH access | Configure the user account that you're using for cluster administration for password-free access, to simplify installation. |
| Minimize swapping | Run: `echo 'vm.swappiness = 0' >> /etc/sysctl.conf` |
| If you're using Ubuntu | Run: `rm /bin/sh ; ln -sf /bin/bash /bin/sh` |
| If you're using CentOS or RHEL | Run: `sed -i '/requiretty/ s/^/#/' /etc/sudoers` |
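The password-free SSH step can be sketched as follows, assuming an admin node that can reach each cluster node; the host names `node1`-`node3` are placeholders:

```shell
# Hedged sketch: generate a key pair (no passphrase) and copy the
# public key to each node so the admin account can log in without a
# password. Host names below are placeholders; substitute your own.
ssh-keygen -t rsa -N '' -f ~/.ssh/id_rsa
for host in node1 node2 node3; do
  ssh-copy-id "$host"
done
```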
Verify that the following software packages are available on each node in your cluster:

If you're running on CentOS or RHEL, you also need this additional software available on each node:

Make sure that the following services are enabled and started:
Make sure all nodes in your cluster are set to the same time zone.
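On systemd-based distributions, the time zone can be set consistently with `timedatectl`; UTC below is just an example choice, not a Splice Machine requirement:

```shell
# Hedged sketch: set the node's time zone and confirm it. UTC is an
# example; any zone works as long as every node uses the same one.
timedatectl set-timezone UTC
timedatectl | grep 'Time zone'
```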
Standalone Version Prerequisites
You can use the standalone version of Splice Machine on MacOS and Linux computers that meet these basic requirements:
- Mac OS X, version 10.8 or later.
- CentOS 6.4 or equivalent.
| Component | Requirement |
| --------- | ----------- |
| CPU | Splice Machine recommends 2 or more multiple-core CPUs. |
| Memory | At least 16 GB RAM, of which at least 10 GB is available. |
| Disk Space | At least 100 GB of disk space available for Splice Machine software, plus as much space as your data requires; for example, if you have a 1 TB dataset, you need at least 1 TB of available data space. |
You must have a JDK installed on your computer; see Java JDK Requirements for the supported versions.
Java JDK Requirements
Splice Machine supports the following versions of the Java JDK:
- Oracle JDK 1.8, update 60 or higher (we recommend that you do not use JDK 1.8.0_40)

Splice Machine does not test its releases with OpenJDK, so we recommend against using it.
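As a sketch, the version constraint above can be checked by parsing an Oracle-style `1.8.0_NN` version string; `check_jdk` is a hypothetical helper, and the string format assumed is the one printed by Oracle's `java -version`:

```shell
# Hedged sketch: print OK for Oracle-style JDK 1.8 version strings at
# update 60 or higher, UNSUPPORTED otherwise.
check_jdk() {
  ver=$1
  major=${ver%%_*}     # e.g. 1.8.0
  update=${ver##*_}    # e.g. 152
  if [ "$major" = "1.8.0" ] && [ "$update" -ge 60 ]; then
    echo "OK"
  else
    echo "UNSUPPORTED"
  fi
}

# Example use against the installed JDK (assumes `java` is on PATH):
# check_jdk "$(java -version 2>&1 | awk -F '"' '/version/ {print $2}')"
```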