How to Install Apache Spark on Mac (2021) – Sanyo Digital


By David

How to Install Apache Spark on Mac

If your work involves Python development or data analytics, PySpark is likely an integral part of your daily toolkit. Whether you are analyzing data, working with machine learning, or using Python in other areas of development, you need the prerequisites in place first. Installing Apache Spark on a Mac involves more than a single package: it takes several steps and a few prior checks. The steps below show how to install Apache Spark on your Mac.

Spark on Mac

  • Checking prerequisites:

Before installing Apache Spark on your Mac, Java and Homebrew are required. Homebrew comes first, since it is used to install Java:

    • Homebrew Installation: Before installing Java you need to install Homebrew. Visit the Homebrew homepage; the page shows a command to copy and paste into the terminal:

      /bin/bash -c "$(curl -fsSL"

    • Then update Homebrew and upgrade any installed formulae:

      brew update && brew upgrade

    • Now we’re ready to install Java. Start by checking which Java version, if any, is already installed on your Mac.
    • Then install the latest Java 8 package (Spark 2.x runs on Java 8), or a more recent Java package if you prefer.
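The Java check-and-install steps above can be sketched as follows. The adoptopenjdk tap and cask names are an assumption reflecting the Homebrew packages available around the time this guide was written; check `brew search java` for the current names before running:

```shell
# check which Java version, if any, is already installed
java -version

# install Java 8 via Homebrew (cask names are era-specific assumptions;
# verify with `brew search java` first)
brew tap adoptopenjdk/openjdk
brew install --cask adoptopenjdk8

# or, for a more recent Java package:
brew install --cask adoptopenjdk
```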
  • Install XCode:

XCode provides the complete Mac developer tooling that several packages depend on. Install the XCode command line tools before continuing.
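One way to install the XCode command line tools from the terminal:

```shell
# install the XCode command line tools (a dialog will prompt to confirm)
xcode-select --install

# verify the tools are present; prints the active developer directory
xcode-select -p
```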

  • Installing Scala and other prerequisite packages:

Follow the steps below, entering the commands one by one in the terminal:

    • Scala Installation:
    • Apache Spark Installation:

      brew install apache-spark

    • Spark shell: run spark-shell to open the Scala-based Spark shell.
    • To check that the installation is active, look for the version banner the shell prints on startup.
    • Run pyspark to start the PySpark shell.
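Collected in one place, the Scala and Spark steps above look like this (spark-shell and pyspark are the shells that ship with the apache-spark formula):

```shell
# install Scala, then Apache Spark, via Homebrew
brew install scala
brew install apache-spark

# open the Scala-based Spark shell; the startup banner shows the
# Spark version, confirming the install is active
spark-shell

# or start the PySpark shell instead
pyspark
```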
  • Adding Spark to bash:

Open your shell profile, add the Spark paths on separate lines, then reload the profile:

nano ~/.bash_profile

export SPARK_HOME=/usr/local/Cellar/apache-spark/2.4.4/libexec
export PYTHONPATH=/usr/local/Cellar/apache-spark/2.4.4/libexec/python/:$PYTHONPATH
source ~/.bash_profile

Then change into Spark’s sbin directory, which contains the start and stop scripts:

cd /usr/local/Cellar/apache-spark/2.4.4/libexec/sbin
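After sourcing the profile, you can confirm the variables took effect. A quick check, assuming the Spark 2.4.4 Cellar path used above (adjust the version to match your install):

```shell
# set the variables as in the profile above (path assumes Spark 2.4.4
# installed via Homebrew; adjust the version to match your install)
export SPARK_HOME=/usr/local/Cellar/apache-spark/2.4.4/libexec
export PYTHONPATH="$SPARK_HOME/python:$PYTHONPATH"

# verify the variable is set
echo "$SPARK_HOME"
```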

  • Finalize installation:

Now that you have run all the required scripts and installed the prerequisite packages, you can start all the Spark services.
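A sketch of starting the bundled daemons, assuming you are in the sbin directory from the previous step (start-all.sh launches a standalone master and worker on the local machine):

```shell
# from /usr/local/Cellar/apache-spark/2.4.4/libexec/sbin
./start-all.sh    # starts a standalone master and worker

# stop them again when you are finished
./stop-all.sh
```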

  • Using Spark:

Now you can reach Spark’s web interfaces in your browser:

    • Spark Master UI: http://localhost:8080/
    • Spark Application UI: http://localhost:4040/
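With the daemons running, the same pages can also be opened from the terminal using macOS’s open command (this assumes the default ports shown above and a running Spark master/application):

```shell
open http://localhost:8080/   # standalone Master UI
open http://localhost:4040/   # Application UI (only while a shell or app is running)
```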


After running all the commands and installing the prerequisite packages, verify that the versions are correct and that everything still works. If packages are still missing, we recommend installing PySpark through the Anaconda Python distribution instead. Also remember to enter the commands line by line rather than pasting them all on one line: Python and the Mac terminal do not compile what you write ahead of time; they execute each command on demand in the current environment.
