Tuesday, March 15, 2016

Run a Simple App with Spark


1. Install sbt


echo "deb https://dl.bintray.com/sbt/debian /" | sudo tee -a /etc/apt/sources.list.d/sbt.list
sudo apt-key adv --keyserver hkp://keyserver.ubuntu.com:80 --recv 642AC823
sudo apt-get update
# the line below may fail; if it does, just retry it a few times
sudo apt-get install sbt

2. Prepare a simple project

like this one at my repo
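A minimal project of this kind, following the Spark 1.6.0 quick start (the file names and the log file path below are the quick start's; adjust them to your own setup), consists of a build definition and a single source file. build.sbt:

```
name := "Simple Project"

version := "1.0"

scalaVersion := "2.10.6"

libraryDependencies += "org.apache.spark" %% "spark-core" % "1.6.0"
```

and src/main/scala/SimpleApp.scala:

```scala
/* SimpleApp.scala */
import org.apache.spark.SparkConf
import org.apache.spark.SparkContext

object SimpleApp {
  def main(args: Array[String]) {
    // any text file will do; the quick start uses Spark's own README
    val logFile = "YOUR_SPARK_HOME/README.md"
    val conf = new SparkConf().setAppName("Simple Application")
    val sc = new SparkContext(conf)
    val logData = sc.textFile(logFile, 2).cache()
    val numAs = logData.filter(line => line.contains("a")).count()
    val numBs = logData.filter(line => line.contains("b")).count()
    println("Lines with a: %s, Lines with b: %s".format(numAs, numBs))
    sc.stop()
  }
}
```

(This only compiles with the spark-core dependency on the classpath, so build it with sbt package rather than with scalac directly.)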

3. Package the project


# go to the project folder and package it
# note: this may fail several times, saying it
# could not resolve the dependency sbt 0.13.11;
# for me it succeeded after I viewed the error log and
# emptied the error log file
# (not sure whether that is actually a required step...)
cd /home/benbai/Things/github/Spark/projects/SimplTesteApp
sbt package

4. Run the application with spark-submit

It will display a lot of Spark log output. If you want to turn off the INFO messages logged by Spark, you can add a log4j.properties file to your SPARK_HOME/conf folder.
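For example: copy conf/log4j.properties.template to conf/log4j.properties and lower the root logger level. The one line that matters is the rootCategory setting (WARN shown here; the template ships with INFO):

```
# SPARK_HOME/conf/log4j.properties
# log only warnings and errors to the console
log4j.rootCategory=WARN, console
```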


$ YOUR_SPARK_HOME/bin/spark-submit \
  --class "SimpleApp" \
  --master local[4] \
  target/scala-2.10/<your-packaged-jar>.jar

Lines with a: 58, Lines with b: 26
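The two counts are simply the number of lines in the input file containing the letter "a" and the letter "b". The same filter-and-count logic in plain Scala, without Spark (the sample lines here are invented for illustration):

```scala
object LineCount {
  // count the lines containing "a" and the lines containing "b",
  // mirroring SimpleApp's two filter(...).count() passes
  def counts(lines: Seq[String]): (Int, Int) =
    (lines.count(_.contains("a")), lines.count(_.contains("b")))

  def main(args: Array[String]): Unit = {
    val sample = Seq("apache spark", "big data", "scala and sbt")
    val (numAs, numBs) = counts(sample)
    println(s"Lines with a: $numAs, Lines with b: $numBs")
    // prints: Lines with a: 3, Lines with b: 2
  }
}
```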


Installing sbt on Linux

Spark Quick Start

Monday, March 14, 2016

Install Spark 1.6.0 on Ubuntu 14.04


Ubuntu 14.04 installed in VirtualBox


1. Install JDK 8

# install oracle jdk 8
sudo apt-get -y install python-software-properties
sudo add-apt-repository ppa:webupd8team/java
sudo apt-get -y update
sudo apt-get -y install oracle-java8-installer

# find out the path of your Java installation:
sudo update-alternatives --config java

# for me it returns
# There is only one alternative in link group java (providing /usr/bin/java): /usr/lib/jvm/java-8-oracle/jre/bin/java

# edit /etc/environment
sudo gedit /etc/environment

# add this line into it, save and close
# JAVA_HOME="/usr/lib/jvm/java-8-oracle"
# (JAVA_HOME should point to the installation directory,
#  not to the java binary itself)
# then reload /etc/environment
source /etc/environment

2. Install Scala 2.10

Download Scala 2.10.6 from http://www.scala-lang.org/download/2.10.6.html

Extract it somewhere (I extracted it to ~/Things/scala/scala-2.10.6).

Again edit /etc/environment: add a SCALA_HOME entry and put Scala's bin directory on the PATH.
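Assuming Scala was extracted to ~/Things/scala/scala-2.10.6 as above (expand ~ to your real home directory, since /etc/environment does not), the relevant lines would look roughly like:

```
PATH="/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/home/benbai/Things/scala/scala-2.10.6/bin"
JAVA_HOME="/usr/lib/jvm/java-8-oracle"
SCALA_HOME="/home/benbai/Things/scala/scala-2.10.6"
```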


Reload /etc/environment and check Scala:

source /etc/environment
scala -version
# for me it returns
# Scala code runner version 2.10.6 -- Copyright 2002-2013, LAMP/EPFL

3. Get Spark 1.6.0

Download Spark from http://spark.apache.org/downloads.html

I chose the 1.6.0 release, pre-built for Hadoop 2.6 and later.

Extract it somewhere (I extracted it to ~/Things/spark/spark-1.6.0-bin-hadoop2.6).

Go to the Spark directory and test it:

cd ~/Things/spark/spark-1.6.0-bin-hadoop2.6
# then try some commands
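For example, the Spark quick start suggests opening the interactive shell with ./bin/spark-shell; the SparkContext is pre-defined there as `sc`, so this is a shell transcript rather than a standalone program:

```
$ ./bin/spark-shell

scala> val textFile = sc.textFile("README.md")
scala> textFile.count()   // the number of lines in README.md
scala> textFile.first()   // the first line of the file
```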

Testing result:


How To Install Java on Ubuntu with Apt-Get

Scala Download Page

Scala Getting Started

Spark Quick Start

Sunday, March 13, 2016

Front-End Interview Questions and Answers

I have finished answering the Front-end-Developer-Interview-Questions; please refer to the answers at my git repo.

In my opinion, they are valuable beyond interview preparation: they contain a lot of useful information and knowledge for Front-End/Full-Stack developers.

Hope this helps.

Monday, March 7, 2016

I've joined the ZK Programmers Network at toptal.com

As the title says: I was a developer/consultant of ZK at zkoss.org (2012 ~ 2015), and I have now joined the ZK Programmers Network as a freelancer at toptal.com. I hope I can help people who are currently using ZK, and make ZK better known.

Honestly, it is because I have so much free time in my current job that I need to find something to do; currently I am answering the Front-end-Developer-Interview-Questions, but that is almost done, so....

I am able to help with customizing ZK components, building new components, building app/environment architecture, debugging, and performance tuning.

Grab me if needed, I believe I can give you a powerful hand :D

Monday, February 16, 2015

Add SyntaxHighlighter to Blog Post in One Minute

1. Edit your blog template.

2. Find the closing </head> tag in the template

3. Paste the fragment below before the tag you found above
(NOTE: this is my custom helper and will load js from my github cdn,
 you can modify it to create your own as needed)

4. Save template


(It seems there is an autoloader feature, but that page does not work well, so
I wrote my own.)


My testing files



Friday, February 13, 2015