Install Zeppelin 0.7.3 in Windows

Raymond Tang


This post summarizes the steps to install Zeppelin 0.7.3 in a Windows environment.

Tools and Environment

  • GIT Bash
  • Command Prompt
  • Windows 10

Download Binary Package

Download the latest binary package from the Apache Zeppelin download page: https://zeppelin.apache.org/download.html

In my case, I saved the file to the folder F:\DataAnalytics

Unzip Binary Package

Open Git Bash, change directory (cd) to the folder where you saved the binary package, and then extract it:

$ cd /f/DataAnalytics

fahao@Raymond-Alienware MINGW64 /f/DataAnalytics
$ tar -xvzf zeppelin-0.7.3-bin-all.tgz

After running the above commands, the package is extracted to the folder F:\DataAnalytics\zeppelin-0.7.3-bin-all

Run Zeppelin

Before starting Zeppelin, make sure the JAVA_HOME environment variable is set.

JAVA_HOME environment variable

The value of the JAVA_HOME environment variable should be the path to your Java JRE installation.
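For example, in Command Prompt you can check the current value and, if needed, set it for the current session (the Java path below is only an illustration; use your own installation folder):

rem Print the current value; it should be your Java installation folder
echo %JAVA_HOME%

rem Set it for the current session only (example path; adjust to your own Java installation)
set "JAVA_HOME=C:\Program Files\Java\jre1.8.0_161"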


Start Zeppelin

Run the following commands in Command Prompt (remember to change the path to your own Zeppelin folder):

cd /D F:\DataAnalytics\zeppelin-0.7.3-bin-all\bin
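The bin folder contains a Windows start script, zeppelin.cmd; run it to start the server in the current console window:

zeppelin.cmd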


Wait until the Zeppelin server is started:



In any browser, navigate to http://localhost:8080/

The UI should look like the following screenshot:


Create Notebook

Create a simple note using markdown and then run it:
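For example, a paragraph that starts with the %md interpreter binding renders its body as formatted text when you run it:

%md
## Hello Zeppelin
This paragraph is rendered by the markdown interpreter.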



If you get an error when using Spark as the interpreter, please refer to the following pages for details:

Basically, even if you configure the Spark interpreter not to use Hive, Zeppelin still tries to locate winutils.exe through the HADOOP_HOME environment variable.

Thus, to resolve the problem, you need to install Hadoop on your local system and then add one environment variable:
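A minimal sketch in Command Prompt, assuming Hadoop is installed under F:\DataAnalytics\hadoop (an example path; point it at your own installation, whose bin folder must contain winutils.exe):

rem Persist HADOOP_HOME for the current user (takes effect in newly opened console windows)
setx HADOOP_HOME "F:\DataAnalytics\hadoop"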


After the environment variable is added, restart the whole Zeppelin server; you should then be able to run Spark successfully.
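As a quick check, a paragraph bound to the Spark interpreter (the %spark prefix) should now run without the winutils error, for example by printing the Spark version:

%spark
sc.version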


You should also be able to run the tutorials provided as part of the installation.

