• IBM Deutschland GmbH

    Friday, February 20, 2009 - 10:26
    IBM in Germany. About us: With revenue of 99.9 billion US dollars in 2010, IBM is one of the world's largest providers of information technology (hardware, software, and services) and B2B solutions. The company currently employs almost 400,000 people and operates in over 170 countries.
    Partner
    Ehningen
  • IBM Deutschland GmbH

    Friday, February 20, 2009 - 10:26
    IBM in Germany. About us: With revenue of 99.9 billion US dollars in 2010, IBM is one of the world's largest providers of information technology (hardware, software, and services) and B2B solutions. The company currently employs almost 400,000 people and operates in over 170 countries.
    Partner
    Stuttgart
  • How to drain/delete/expire existing messages in Kafka

    Wednesday, November 8, 2017 - 23:34
    Sometimes you might have a bad record in a Kafka topic that you want to delete. Kafka does not provide a direct option to delete a specific record; the only way to delete records is to expire them. You can achieve this by setting the topic's data retention to, say, 1 second, which expires all the old messages. You can follow these steps.
    Blog entry
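    The steps described above can be sketched with Kafka's stock CLI tools. This is a minimal sketch, assuming a topic named my-topic and ZooKeeper at localhost:2181 (both placeholders); the kafka-configs syntax matches Kafka releases of that era:

```shell
# Lower the topic's retention to 1 second so the broker expires all existing messages
kafka-configs.sh --zookeeper localhost:2181 --alter \
  --entity-type topics --entity-name my-topic \
  --add-config retention.ms=1000

# Wait for the retention check to run (log.retention.check.interval.ms,
# 5 minutes by default), then verify with a console consumer that the topic is empty

# Restore normal behavior by removing the per-topic override
kafka-configs.sh --zookeeper localhost:2181 --alter \
  --entity-type topics --entity-name my-topic \
  --delete-config retention.ms
```

    Removing the override at the end is important; otherwise the topic keeps expiring every message after one second.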
  • How to run Sqoop command from oozie

    Thursday, December 22, 2016 - 22:31
    In Importing data from Sqoop into Hive External Table with Avro encoding updated I blogged about how you can use Sqoop to import data from an RDBMS into Hadoop. I wanted to test whether I could use Oozie to invoke a Sqoop command, and I followed these steps to do that. First I executed this command from the command line on the Hadoop cluster, to make sure I could actually run Sqoop without any problems: sqoop import --connect jdbc:mysql://localhost/test
    Blog entry
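    The workflow described above can be sketched as two steps: verify the Sqoop command directly, then run it through Oozie. The table name customer, the HDFS paths, and the Oozie server URL below are placeholders, not from the original post:

```shell
# 1. Verify the Sqoop command itself works from the cluster's command line
sqoop import --connect jdbc:mysql://localhost/test --table customer \
  --target-dir /user/cloudera/customer

# 2. Upload a workflow.xml containing a <sqoop> action with the same command,
#    then submit it through the Oozie CLI
hdfs dfs -put workflow.xml /user/cloudera/sqoopjob/
oozie job -oozie http://localhost:11000/oozie -config job.properties -run
```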
  • Importing data from Sqoop into Hive External Table with Avro encoding updated

    Thursday, December 22, 2016 - 20:06
    In Importing data from Sqoop into Hive External Table with Avro encoding I described how you can import a table from an RDBMS into Hive using Sqoop in Avro format.
    Blog entry
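    The import described above can be sketched with Sqoop's Avro option; the table name customer and the HDFS paths are assumptions for illustration:

```shell
# Import the table as Avro data files; Sqoop also generates an .avsc schema file
sqoop import --connect jdbc:mysql://localhost/test --table customer \
  --as-avrodatafile --target-dir /user/cloudera/customer

# The generated schema can then back a Hive external table, e.g.:
# CREATE EXTERNAL TABLE customer
#   STORED AS AVRO
#   LOCATION '/user/cloudera/customer'
#   TBLPROPERTIES ('avro.schema.url'='hdfs:///user/cloudera/customer.avsc');
```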
  • Sending and Receiving JSON messages in Kafka

    Monday, December 26, 2016 - 6:14
    Some time back I wrote a couple of articles for Java World about Kafka, Big data messaging with Kafka, Part 1 and Big data messaging with Kafka, Part 2; in them you can find a basic producer and consumer for Kafka along with some basic samples. I wanted to figure out how to pass a JSON message using Kafka.
    Blog entry
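    A quick way to see JSON flow through Kafka is with the console tools, since the console producer treats each input line as one record value. This is a sketch, not the post's own code; the topic name json-topic and the sample payload are made up:

```shell
# Produce a JSON message (one line = one record)
echo '{"name":"test","city":"pleasanton"}' | \
  kafka-console-producer.sh --broker-list localhost:9092 --topic json-topic

# Consume it back; the JSON string arrives unchanged as the message value
kafka-console-consumer.sh --bootstrap-server localhost:9092 \
  --topic json-topic --from-beginning
```

    In application code the same effect is achieved by serializing the object to a JSON string and sending it with the ordinary string serializer, or by plugging in a custom JSON serializer/deserializer.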
  • Importing data from RDBMS into Hive using Sqoop and oozie (hive-import)

    Friday, December 23, 2016 - 2:18
    In the How to run Sqoop command from oozie entry I talked about how you can use Oozie and Sqoop to import data into HDFS. I wanted to change it to use Sqoop's hive-import option, which in addition to importing data into HDFS also creates a Hive table on top of the data.
    Blog entry
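    The hive-import variant can be sketched as below; the table name customer is a placeholder:

```shell
# Same import, but let Sqoop create a Hive table and load the data into it
sqoop import --connect jdbc:mysql://localhost/test --table customer \
  --hive-import --hive-table customer --create-hive-table
```

    The same command goes into the Oozie workflow's sqoop action; the difference from the plain import is only the extra hive-import flags.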
  • Installing ElasticSearch on existing docker container

    Sunday, January 1, 2017 - 0:34
    I was using a Cloudera Quickstart Docker image for one experiment and wanted to install ElasticSearch on it, but I had trouble accessing it from my host. I found a workaround by following these steps: First I installed ElasticSearch by downloading version 2.4.3 and unzipping it in the /opt/elastic folder. Then I started ElasticSearch by executing /bin/elasticsearch, and it started OK.
    Blog entry
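    A sketch of the setup, assuming the host-access problem was ElasticSearch 2.x's default behavior of binding to localhost only inside the container (the download URL follows the Elastic 2.x release layout and should be double-checked):

```shell
# Inside the container: download and unpack ElasticSearch 2.4.3
mkdir -p /opt/elastic && cd /opt/elastic
curl -O https://download.elastic.co/elasticsearch/release/org/elasticsearch/distribution/tar/elasticsearch/2.4.3/elasticsearch-2.4.3.tar.gz
tar -xzf elasticsearch-2.4.3.tar.gz

# Bind to all interfaces so the host can reach port 9200 through the
# container's published ports, then start ES as a daemon
echo 'network.host: 0.0.0.0' >> elasticsearch-2.4.3/config/elasticsearch.yml
elasticsearch-2.4.3/bin/elasticsearch -d
```

    The container must also have port 9200 published (e.g. docker run -p 9200:9200) for the host to reach it.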
  • Spark Streaming Kafka 10 API Word Count application Scala

    Thursday, January 12, 2017 - 2:30
    In the Spark Kafka Streaming Java program Word Count using Kafka 0.10 API blog entry I talked about how to create a simple Java program that uses Spark Streaming's Kafka 0.10 API. This blog entry does the same thing but using Scala.
    Blog entry
  • Spark Kafka Streaming Java program Word Count using Kafka 0.10 API

    Thursday, January 12, 2017 - 1:39
    The Kafka API went through a lot of changes starting with Kafka 0.9, and the Spark Kafka Streaming API was also changed to better support Kafka 0.9. I wanted to try that out, so I built this simple Word Count application using the Kafka 0.10 API.
    Blog entry
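    A word-count application against the Kafka 0.10 API is typically submitted with the matching spark-streaming-kafka-0-10 artifact on the classpath. The class name, jar name, Spark/Scala versions, and arguments below are placeholders, not from the original post:

```shell
# Submit the streaming job; the "0-10" artifact corresponds to the Kafka 0.10 API
spark-submit \
  --packages org.apache.spark:spark-streaming-kafka-0-10_2.11:2.0.0 \
  --class com.example.WordCount \
  wordcount.jar localhost:9092 wordcount-topic
```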