The WebSphere Notes

The WebSphere Notes is a blog that has my study notes about WebSphere Application Server administration and WebSphere Portal Server development and administration certification.


Sunil has been in the IT industry for 10 years, worked with IBM Software Labs, where he was part of the WebSphere Portal Server development team for 4 years, and is now working for Ascendant Technology. Sunil has been working with WebSphere Portal since 2003. He is the author of the "Java Portlets 101" book and more than 25 articles, and has a popular blog about portlet development and administration.

Latest Blog Entries

  • Problem with scala version mismatch in Spark application

    Saturday, January 16, 2016

    I was developing a Spark program on my machine and it worked fine. But when I tried to deploy it to Spark running in my Hadoop sandbox, I started getting this error

    java.lang.NoSuchMethodError: scala.runtime.IntRef.create(I)Lscala/runtime/IntRef;
        at com.spnotes.enrich.CSVFieldEnricher.enrich(CSVFieldEnricher.scala:31)
        at com.spnotes.PMDriver$$anonfun$1$$anonfun$apply$2.apply(PMDriver.scala:59)
        at com.spnotes.PMDriver$$anonfun$1$$anonfun$apply$2.apply(PMDriver.scala:58)
        at scala.collection.immutable.List.foreach(List.scala:318)
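
    This kind of NoSuchMethodError on a scala.runtime class typically means the application was compiled against one Scala major version (for example 2.11, where scala.runtime.IntRef.create exists) while the cluster's Spark build ships another (for example 2.10). A sketch of the fix in a Maven build file, with example version numbers that you should replace with the ones your cluster actually uses:

    ```xml
    <!-- Sketch: pin the Scala version the application is compiled against
         to the Scala version of the cluster's Spark build (2.10.4 and
         Spark 1.5.2 below are example values, not from the original post). -->
    <properties>
      <scala.version>2.10.4</scala.version>
      <scala.binary.version>2.10</scala.binary.version>
    </properties>
    <dependencies>
      <dependency>
        <groupId>org.scala-lang</groupId>
        <artifactId>scala-library</artifactId>
        <version>${scala.version}</version>
      </dependency>
      <dependency>
        <groupId>org.apache.spark</groupId>
        <artifactId>spark-core_${scala.binary.version}</artifactId>
        <version>1.5.2</version>
        <scope>provided</scope>
      </dependency>
    </dependencies>
    ```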
  • Spark error class "javax.servlet.FilterRegistration"'s signer information does not match signer information of other classes in the same package

    Wednesday, January 6, 2016

    I had a Spark program that was working both from the IDE and when I built a .jar file and deployed it in Spark. But it suddenly stopped working in the IDE; whenever I tried executing it there, I was getting the following error

    16/01/05 14:34:50 INFO SparkEnv: Registering OutputCommitCoordinator
    Exception in thread "main" java.lang.SecurityException: class "javax.servlet.FilterRegistration"'s signer information does not match signer information of other classes in the same package
        at java.lang.ClassLoader.checkCerts(
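
    This SecurityException usually happens when a signed servlet-api jar and an unsigned copy of the same javax.servlet package both end up on the classpath (often one pulled in by Hadoop, the other by Spark or Jetty). One common fix, sketched here with example dependency coordinates, is to exclude the signed copy in the build file:

    ```xml
    <!-- Sketch (artifact names and version are examples): exclude the signed
         servlet-api jar pulled in transitively so only one copy of the
         javax.servlet package remains on the IDE classpath. -->
    <dependency>
      <groupId>org.apache.hadoop</groupId>
      <artifactId>hadoop-client</artifactId>
      <version>2.6.0</version>
      <exclusions>
        <exclusion>
          <groupId>javax.servlet</groupId>
          <artifactId>servlet-api</artifactId>
        </exclusion>
      </exclusions>
    </dependency>
    ```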
  • How to use Hadoop's InputFormat and OutputFormat in Spark

    Monday, January 4, 2016

    One of the things that I like about Spark is that it lets you use your MapReduce-based InputFormats and OutputFormats for reading and writing data. I wanted to try this, so I built the InputFormatOutputDriver class, which uses TextInputFormat for reading a file.
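    The original driver is in Scala; a minimal sketch of the same idea using Spark's Java API, assuming Spark 1.x and Hadoop's "new" mapreduce API (class and argument names here are illustrative, not the post's code):

    ```java
    import org.apache.hadoop.conf.Configuration;
    import org.apache.hadoop.io.LongWritable;
    import org.apache.hadoop.io.Text;
    import org.apache.hadoop.mapreduce.lib.input.TextInputFormat;
    import org.apache.hadoop.mapreduce.lib.output.TextOutputFormat;
    import org.apache.spark.SparkConf;
    import org.apache.spark.api.java.JavaPairRDD;
    import org.apache.spark.api.java.JavaSparkContext;

    public class InputFormatOutputExample {
        public static void main(String[] args) {
            JavaSparkContext sc = new JavaSparkContext(new SparkConf().setAppName("if-of"));
            // Read a text file through Hadoop's TextInputFormat:
            // keys are byte offsets, values are lines.
            JavaPairRDD<LongWritable, Text> lines = sc.newAPIHadoopFile(
                    args[0], TextInputFormat.class,
                    LongWritable.class, Text.class, new Configuration());
            // Write the pairs back out through a Hadoop OutputFormat.
            lines.saveAsNewAPIHadoopFile(args[1], LongWritable.class, Text.class,
                    TextOutputFormat.class);
            sc.stop();
        }
    }
    ```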

  • How to use ZooInspector

    Monday, January 4, 2016

    ZooKeeper comes with a ZooInspector GUI that you can use for inspecting your zNode structure. You can use it with these steps

    1. First go to the ZooInspector directory (I am assuming that you already have ZooKeeper on your machine; if not, download it from the ZooKeeper home page)

    cd <ZOOKEEPER_HOME>/contrib/ZooInspector
    2. Start ZooInspector by using the following command, which makes sure that the necessary jars are on the classpath
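    The exact command got cut off here; a hedged sketch of what it looks like, where the jar names depend on your ZooKeeper release (3.4.6 is used as an example):

    ```shell
    # Example only: jar file names vary by ZooKeeper version.
    cd <ZOOKEEPER_HOME>/contrib/ZooInspector
    java -cp zookeeper-3.4.6-ZooInspector.jar:lib/*:../../zookeeper-3.4.6.jar \
         org.apache.zookeeper.inspector.ZooInspector
    ```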
  • How to read and write JSON files with Spark

    Thursday, December 31, 2015

    I wanted to build a Spark program that would read a text file where every line was a complex JSON object like this. I wanted to parse the file, filter out a few records, and write the output back as a file. You can download the project from GitHub

    {
      "first": "Sachin",
      "last": "Tendulkar",
      "address": {
        "line1": "1 main street",
        "city": "mumbai",
        "state": "ms",
        "zip": "12345"
      }
    }
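    A minimal sketch of the read–filter–write flow using Spark SQL's built-in JSON reader (this is an assumption about the approach; Spark 1.5-era Java API, with an illustrative filter on the city field):

    ```java
    import org.apache.spark.SparkConf;
    import org.apache.spark.api.java.JavaSparkContext;
    import org.apache.spark.sql.DataFrame;
    import org.apache.spark.sql.SQLContext;

    public class JsonReadWriteExample {
        public static void main(String[] args) {
            JavaSparkContext sc = new JavaSparkContext(new SparkConf().setAppName("json"));
            SQLContext sqlContext = new SQLContext(sc);
            // Each line of the input file is one JSON object like the record above.
            DataFrame people = sqlContext.read().json(args[0]);
            // Filter out a few records, e.g. keep only addresses in mumbai.
            DataFrame filtered = people.filter(people.col("address.city").equalTo("mumbai"));
            filtered.write().json(args[1]);
            sc.stop();
        }
    }
    ```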
  • Hello Spark Streaming

    Tuesday, December 29, 2015

    In the WordCount program built using Apache Spark in Java, I built a simple Spark program that takes the name of a file as input, reads the file, and performs a word count on it. Spark also has the concept of Spark Streaming, which allows you to read a file as a stream of real-time events instead of a one-time load of the input file. The API for transforming the data stays similar, though: in both cases Spark converts the input into RDDs.
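    A sketch of the streaming variant of word count, assuming the Spark 1.x Java streaming API and watching a directory for new text files (directory path and batch interval are illustrative):

    ```java
    import java.util.Arrays;
    import org.apache.spark.SparkConf;
    import org.apache.spark.api.java.function.FlatMapFunction;
    import org.apache.spark.streaming.Durations;
    import org.apache.spark.streaming.api.java.JavaDStream;
    import org.apache.spark.streaming.api.java.JavaStreamingContext;

    public class StreamingWordCount {
        public static void main(String[] args) throws Exception {
            // Micro-batch every 10 seconds instead of a one-time textFile() load.
            JavaStreamingContext ssc = new JavaStreamingContext(
                    new SparkConf().setAppName("streaming-wc"), Durations.seconds(10));
            // Watch a directory; each new file becomes a stream of lines.
            JavaDStream<String> lines = ssc.textFileStream(args[0]);
            JavaDStream<String> words = lines.flatMap(new FlatMapFunction<String, String>() {
                public Iterable<String> call(String line) {
                    return Arrays.asList(line.split(" "));
                }
            });
            // Count occurrences of each word within every batch and print them.
            words.countByValue().print();
            ssc.start();
            ssc.awaitTermination();
        }
    }
    ```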

  • How to create Spark Scala fat jar with dependencies using Maven

    Tuesday, December 29, 2015

    If you are developing a Spark application in Scala, or a standalone Scala application, and you want to create a fat jar that includes dependencies, you can use the following Maven script as a template for your build file. A couple of things are different here: you must include scala-library as one of the libraries, and also include the maven-scala-plugin, which takes care of compiling the Scala code. The value of sourceDirectory specifies the directory that contains your Scala code.
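    The template itself didn't survive the aggregation; a sketch of the relevant pom.xml sections (versions are examples, and the maven-assembly-plugin is one common way to build the jar with dependencies):

    ```xml
    <dependencies>
      <dependency>
        <groupId>org.scala-lang</groupId>
        <artifactId>scala-library</artifactId>
        <version>2.10.4</version>
      </dependency>
    </dependencies>
    <build>
      <!-- Directory that contains your Scala code. -->
      <sourceDirectory>src/main/scala</sourceDirectory>
      <plugins>
        <!-- Compiles the Scala sources. -->
        <plugin>
          <groupId>org.scala-tools</groupId>
          <artifactId>maven-scala-plugin</artifactId>
          <executions>
            <execution>
              <goals>
                <goal>compile</goal>
                <goal>testCompile</goal>
              </goals>
            </execution>
          </executions>
        </plugin>
        <!-- Packages a fat jar that includes the dependencies. -->
        <plugin>
          <artifactId>maven-assembly-plugin</artifactId>
          <configuration>
            <descriptorRefs>
              <descriptorRef>jar-with-dependencies</descriptorRef>
            </descriptorRefs>
          </configuration>
          <executions>
            <execution>
              <phase>package</phase>
              <goals><goal>single</goal></goals>
            </execution>
          </executions>
        </plugin>
      </plugins>
    </build>
    ```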

  • Configuring Flume to use Twitter as Source

    Tuesday, December 29, 2015

    I wanted to figure out how to configure Twitter as a source for Flume, so I tried these steps

    1. First go to the Twitter Application Management page and configure an application. This should give you a consumerKey, consumerSecret, accessToken and accessTokenSecret
    2. Next, create a Flume configuration file that looks like this. You should create a source of the org.apache.flume.source.twitter.TwitterSource type and use the 4 values you got in the last step to configure access to Twitter
      agent1.sources = twitter1
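    Only the first line of the configuration survived above; a hedged sketch of a full agent definition (channel/sink names, and the placeholder credentials, are examples you must replace with your own values):

    ```properties
    agent1.sources = twitter1
    agent1.channels = memchannel1
    agent1.sinks = logger1

    # Twitter source, configured with the 4 values from Application Management.
    agent1.sources.twitter1.type = org.apache.flume.source.twitter.TwitterSource
    agent1.sources.twitter1.consumerKey = YOUR_CONSUMER_KEY
    agent1.sources.twitter1.consumerSecret = YOUR_CONSUMER_SECRET
    agent1.sources.twitter1.accessToken = YOUR_ACCESS_TOKEN
    agent1.sources.twitter1.accessTokenSecret = YOUR_ACCESS_TOKEN_SECRET
    agent1.sources.twitter1.channels = memchannel1

    agent1.channels.memchannel1.type = memory

    # Logger sink, handy for verifying that tweets are flowing.
    agent1.sinks.logger1.type = logger
    agent1.sinks.logger1.channel = memchannel1
    ```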

  • Simple Java Program for publishing Syslog Events

    Sunday, December 13, 2015

    In Using Syslog as source in Flume, I blogged about how to configure Flume to listen for syslog events on a particular UDP port. I wanted to test that configuration, so I built this simple Java program that can publish a syslog event to a given host and port number.
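    A sketch of such a program using only the JDK (the class name, facility and severity values are illustrative, not necessarily the original post's code): it builds a minimal RFC 3164-style "<priority>message" line and sends it as one UDP datagram.

    ```java
    import java.net.DatagramPacket;
    import java.net.DatagramSocket;
    import java.net.InetAddress;
    import java.nio.charset.StandardCharsets;

    public class SyslogPublisher {
        /** Builds a minimal syslog line: "<priority>message", priority = facility*8 + severity. */
        static String formatMessage(int facility, int severity, String message) {
            int priority = facility * 8 + severity;
            return "<" + priority + ">" + message;
        }

        /** Sends the message as a single UDP datagram to the given host and port. */
        static void publish(String host, int port, String message) throws Exception {
            // facility 1 (user-level), severity 6 (informational)
            byte[] payload = formatMessage(1, 6, message).getBytes(StandardCharsets.UTF_8);
            try (DatagramSocket socket = new DatagramSocket()) {
                socket.send(new DatagramPacket(payload, payload.length,
                        InetAddress.getByName(host), port));
            }
        }

        public static void main(String[] args) throws Exception {
            publish(args[0], Integer.parseInt(args[1]), "hello from SyslogPublisher");
        }
    }
    ```

    Pointing it at the host and UDP port your Flume syslog source listens on should make the event show up in the Flume logs.
    
    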

  • Using Syslog as source in Flume

    Sunday, December 13, 2015

    I wanted to figure out how to use Flume for receiving syslog messages, so I tried 2 different configurations: one runs a syslog server on a TCP port and the other on a UDP port. This is the Flume configuration for listening on a UDP port. Copy the file into the conf directory of your Flume server and use the following command to start the Flume server

    bin/flume-ng agent --conf conf --conf-file conf/ --name agent1 -Dflume.root.logger=DEBUG,console
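    The configuration file itself was lost in aggregation; a hedged sketch of a UDP variant using Flume's syslogudp source (agent, channel, and sink names, plus the port, are example values):

    ```properties
    agent1.sources = syslog1
    agent1.channels = memchannel1
    agent1.sinks = logger1

    # Listen for syslog events on a UDP port.
    agent1.sources.syslog1.type = syslogudp
    agent1.sources.syslog1.host = 0.0.0.0
    agent1.sources.syslog1.port = 41414
    agent1.sources.syslog1.channels = memchannel1

    agent1.channels.memchannel1.type = memory

    # Logger sink so received events show up in the Flume log.
    agent1.sinks.logger1.type = logger
    agent1.sinks.logger1.channel = memchannel1
    ```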