Splice Machine Requirements

This topic summarizes the hardware and software requirements for Splice Machine running on a cluster or on a standalone computer.

Cluster Node Requirements

The following table summarizes the minimum requirements for the nodes in your cluster:

Cores: Splice Machine recommends that each node in your cluster have 8-12 hyper-threaded cores (16-32 hyper-threads) for optimal throughput and concurrency.

Memory: We recommend that each machine in your cluster have at least 64 GB of available memory.

Disk Space: Your root drive needs at least 100 GB of free space. Splice Machine recommends separate data drives on each cluster node to keep the operating system separate from your database data. You need capacity for at least three times the size of the data you intend to load; the typical recommended configuration is 2 TB or more of attached storage per node. Your data disks should be set up with a single partition and formatted with an ext4 file system.

Hadoop Ecosystem: The table in the Hadoop Ecosystem Requirements section below summarizes the specific Hadoop component versions that we support in each of our product releases.

Software Tools and System Settings: The Linux Configuration topic in the section of our Installation Guide that applies to your installation summarizes the software tools and system settings required for your cluster machines.
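The single-partition ext4 layout described above can be prepared along these lines. This is a sketch only: the device /dev/sdb and mount point /data1 are placeholders for illustration (substitute your actual data drive and mount point), and the commands must run as root.

```shell
# Placeholder device and mount point -- replace with your own.
# Create a single partition spanning the whole data disk.
parted --script /dev/sdb mklabel gpt mkpart primary ext4 0% 100%

# Format the new partition with an ext4 file system.
mkfs.ext4 /dev/sdb1

# Mount it and persist the mount across reboots.
mkdir -p /data1
mount /dev/sdb1 /data1
echo '/dev/sdb1  /data1  ext4  defaults,noatime  0 0' >> /etc/fstab
```

Repeat for each data drive on each node, keeping database data off the root drive.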

Amazon Web Services (AWS) Requirements

If you’re running on AWS, your cluster must meet these minimum requirements:

Minimum Cluster Size: The minimum cluster size on AWS is 5 nodes:

  • 1 master node
  • 4 worker nodes

Minimum Node Size: The minimum recommended size for each node is m4.4xlarge.

Disk Space: Minimum recommended storage:

  • 100 GB EBS root drive
  • 4 EBS data drives per node

Note that the required number of data drives per node depends on your use case.

Hadoop Ecosystem Requirements

The following table summarizes the required Hadoop ecosystem components for your platform:

Hadoop platform                       Linux           Hadoop   HBase   ZooKeeper
CDH 5.14.0, CDH 5.13.2, CDH 5.12.0    CentOS/RHEL 6   2.6.0    1.0.0   3.4.5
HDP 2.6.3, HDP 2.5.5                  CentOS/RHEL 6   2.7.1    1.1.2   3.4.5
MapR 5.2.0                            CentOS/RHEL 6   2.7.0    1.1.1   3.4.5

Java JDK Requirements

Splice Machine supports the following versions of the Java JDK:

  • Oracle JDK 1.8, update 60 or higher

    We recommend that you do not use JDK 1.8.0_40.

Splice Machine does not test its releases with OpenJDK, so we recommend against using it.
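As a quick sanity check before installing, a script along these lines can verify that a JDK version string falls in the supported range. The jdk_ok helper is our own illustration, not a Splice Machine tool; it assumes the Oracle "1.8.0_NN" version string format.

```shell
# Illustrative helper (our own sketch, not a Splice Machine tool): checks
# whether a JDK version string such as "1.8.0_152" is a JDK 1.8 build at
# update 60 or higher, per the requirements above.
jdk_ok() {
  case "$1" in
    1.8.0_*) [ "${1##*_}" -ge 60 ] ;;   # extract the update number after "_"
    *) return 1 ;;                      # not a JDK 1.8 build
  esac
}

# Example: inspect the JDK on the PATH (uncomment if java is installed).
# ver=$(java -version 2>&1 | awk -F '"' '/version/ {print $2}')
# jdk_ok "$ver" && echo "supported JDK: $ver"
jdk_ok "1.8.0_152" && echo "1.8.0_152 is supported"
```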

Standalone Version Prerequisites

You can use the standalone version of Splice Machine on macOS and Linux computers that meet these basic requirements:

Operating System:

  • Mac OS X, version 10.8 or later
  • CentOS 6.4 or equivalent

CPU: Splice Machine recommends two or more multi-core CPUs.

Memory: At least 16 GB of RAM, of which at least 10 GB is available.

Disk Space: At least 100 GB of disk space available for the Splice Machine software, plus as much space as your data requires; for example, if you have a 1 TB dataset, you need at least 1 TB of available data space.

Software: You must have the JDK installed on your computer.
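A pre-install check of the disk-space requirement above can be sketched as follows. This is our own illustration, not a Splice Machine tool; it assumes POSIX df output and uses the current directory as a stand-in for your intended install location.

```shell
# Illustrative pre-install check (our own sketch): verify that a target
# directory has enough free space. The 100 GB figure is the software
# footprint from the table above; add your expected dataset size to it.
required_gb=100
target_dir="."    # placeholder: the directory you plan to install into

# POSIX df -P keeps each filesystem on one line; field 4 is available 1K blocks.
avail_kb=$(df -Pk "$target_dir" | awk 'NR==2 {print $4}')
avail_gb=$((avail_kb / 1024 / 1024))

if [ "$avail_gb" -ge "$required_gb" ]; then
  echo "OK: ${avail_gb} GB free in ${target_dir}"
else
  echo "Insufficient: ${avail_gb} GB free, ${required_gb} GB required"
fi
```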