Managing Queries with the DB Console

The Splice Machine Database Console allows you to view queries that are currently running or have completed running in your database. You typically start at the top level, viewing jobs, and then drill down into individual job details, job stages, and task details, as described in these sections:

Viewing Summary Pages

The console includes five summary pages, each of which can be accessed from the tab bar at the top of the console window:

The Jobs Summary page

The Jobs Summary Page is the top-level view in the Splice Machine Database Console. It shows you a summary of any currently active jobs and all completed jobs.

You land on this page when you first view the Database Console in your browser, and you can view it at any time by clicking the Jobs tab in the tab bar at the top of the page.
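Because the console is built on the Spark UI, the same job information shown on this page is also available programmatically through Spark's monitoring REST API. The sketch below is a hedged example: the base URL (the Spark UI usually listens on port 4040) and the application id are assumptions for your deployment, and the sample JSON is abridged and illustrative.

```python
import json

# The Jobs page data is also exposed by Spark's monitoring REST API.
# Base URL and application id here are assumptions for your deployment.
def jobs_endpoint(base_url, app_id):
    """Build the URL for the jobs list of one application."""
    return f"{base_url}/api/v1/applications/{app_id}/jobs"

def summarize_jobs(payload):
    """Group jobs from a /jobs response by status (RUNNING, SUCCEEDED, ...)."""
    summary = {}
    for job in payload:
        summary.setdefault(job["status"], []).append(job["jobId"])
    return summary

# Abridged, illustrative sample of the JSON the endpoint returns.
sample = json.loads("""[
  {"jobId": 3, "name": "count at ...", "status": "RUNNING"},
  {"jobId": 2, "name": "collect at ...", "status": "SUCCEEDED"}
]""")

print(jobs_endpoint("http://localhost:4040", "app-20240101-0001"))
print(summarize_jobs(sample))
```

Fetching the URL with any HTTP client returns the same active/completed breakdown you see in the Active Jobs and Completed Jobs sections.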

Query Management Overview Screen

A stage is shown as skipped when its data was fetched from a cache and the stage did not need to be re-executed; this typically happens with shuffle data, because the Spark engine automatically caches the data it generates for shuffles.

You can click a job's description (in blue) in the Active Jobs or Completed Jobs sections to view that job's details.

The Stages Summary Page

The Stages Summary Page shows you the available scheduling pools and a summary of the stages for all active and completed jobs. You can access this page by clicking the Stages tab in the tab bar at the top of the window.

Spark UI stages summary for all jobs

You can click the descriptive name of a stage (in blue) to view the stage details.

The Fair Scheduler Pools section at the top of the page shows the name and weighting value for each of the scheduler pools that have been defined for your database jobs.
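The pools listed in this section come from Spark's fair scheduler configuration. As a hedged illustration, a pool definition file might look like the fragment below; the pool names, weights, and shares are assumptions, not values your deployment necessarily uses. The file is enabled by setting `spark.scheduler.mode` to `FAIR` and pointing `spark.scheduler.allocation.file` at it.

```xml
<!-- fairscheduler.xml: pool names, weights, and minShares below are
     illustrative assumptions, not defaults for your deployment. -->
<allocations>
  <pool name="interactive">
    <schedulingMode>FAIR</schedulingMode>
    <weight>2</weight>
    <minShare>1</minShare>
  </pool>
  <pool name="batch">
    <schedulingMode>FIFO</schedulingMode>
    <weight>1</weight>
    <minShare>0</minShare>
  </pool>
</allocations>
```

The weight value shown in the Fair Scheduler Pools section corresponds to the `<weight>` element here: a pool with weight 2 receives roughly twice the scheduling share of a pool with weight 1.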

The Storage Summary Page

The Storage Summary Page displays information about any RDDs that are currently persisted or cached. You can access this page by clicking the Storage tab in the tab bar at the top of the window:

The Environment Summary Page

The Environment Summary Page displays information about which software versions you’re using, and shows the values of the Spark-related environment variables. You can access this page by clicking the Environment tab in the tab bar at the top of the window:

Spark UI environment summary page

The Executors Summary Page

The Executors Summary Page shows you the Spark executors that are currently running. You can access this page by clicking the Executors tab in the tab bar at the top of the window:

Spark UI executors summary page

You can click Thread Dump to display a thread dump for an executor, or you can click a log name to see the contents of the log.
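The executor information on this page is also exposed by Spark's monitoring REST API. The sketch below is hedged: the base URL and application id are assumptions for your deployment, and the sample response is abridged to the fields the example uses.

```python
import json

# The Executors page data is also available from Spark's REST API;
# base URL and application id are assumptions for your deployment.
def executors_endpoint(base_url, app_id):
    """Build the URL for the executors list of one application."""
    return f"{base_url}/api/v1/applications/{app_id}/executors"

def total_active_tasks(payload):
    """Sum activeTasks across all executors in an /executors response."""
    return sum(e["activeTasks"] for e in payload)

# Abridged, illustrative sample of the endpoint's JSON.
sample = json.loads("""[
  {"id": "driver", "activeTasks": 0},
  {"id": "1", "activeTasks": 4},
  {"id": "2", "activeTasks": 3}
]""")

print(total_active_tasks(sample))
```

This is useful for monitoring scripts that need the same at-a-glance numbers the Executors tab shows without opening the console.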

Viewing Job Details

If you click a job to see its details, a screen like the following displays, showing the stages of the job:

Splice Database Console Spark Job Details screen

You can expand the job detail display by clicking the Event Timeline and/or DAG Visualization buttons.

Job Details Event Timeline View

The job details Event Timeline view looks like the following screenshot:

Event timeline view of a Spark job


Job Details Graphical Visualization View

The DAG Visualization view for a job looks like this:

DAG view of a Spark job

Some key things to know about the DAG view are:

  • You can click in the box representing a stage to view the detailed tasks within that stage. For an example, see Graphical View of the Tasks in a Stage, in the next section.

  • You can hover over any of the black dots inside a task box to display information about the task. For example:

    Spark UI hovering over a task node

Viewing Stage Details

Viewing stage details is very much the same as viewing job details. If you click the name of a stage on another page, the detailed view of that stage displays:

Details of a job stage

The Event Timeline View of a Stage

The Event Timeline view of a stage looks like this:

Timeline view of a Spark job stage

Graphical View of the Tasks in a Stage

The DAG Visualization view of a stage looks like this:

Graphical view of a Spark job stage

Terminating a Stage

If you conclude that an active job stage is not performing as it should, you can terminate it by clicking the Kill button shown in the description of every active stage. The following image highlights the Kill buttons that you'll find in the console display:

Finding the Kill button in the Spark UI

You’ll be prompted to verify that you want the stage terminated:

Verifying that a stage should be terminated in the Spark UI

You can access the Kill button by drilling down into a job’s stages, or by selecting the Stages tab in the tab bar, which displays all stages for all jobs.

See Also