Get blockchain statistics using Python and automate it

Hi Readers,

In this post you will learn how to get blockchain statistics using Python and automate it. I used the ‘statistics’ module for the explanation.

I have written a simple Python program to get the current bitcoin trade volume, the number of bitcoins mined so far, the number of transactions, etc.
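As a minimal sketch of such a program, the snippet below pulls the figures from blockchain.info's public `/stats` endpoint using only the standard library. The endpoint URL and the field names (`market_price_usd`, `trade_volume_btc`, `n_tx`, `totalbc`) are assumptions based on that API's documented JSON response; adjust them if the API differs.

```python
import json
import urllib.request

def summarize_stats(stats):
    """Pick out the headline figures from the /stats JSON dict.

    Field names are assumptions based on blockchain.info's /stats response.
    """
    return {
        "market_price_usd": stats["market_price_usd"],
        "trade_volume_btc": stats["trade_volume_btc"],
        "transactions_24h": stats["n_tx"],
        # totalbc is reported in satoshi, so divide by 1e8 to get BTC
        "btc_mined_total": stats["totalbc"] / 1e8,
    }

def fetch_stats(url="https://api.blockchain.info/stats"):
    """Fetch the current statistics JSON from the assumed public endpoint."""
    with urllib.request.urlopen(url) as resp:
        return json.loads(resp.read().decode("utf-8"))
```

Calling `summarize_stats(fetch_stats())` returns a small dict you can print, log, or feed into an email/SMS step.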

You can extend the above script to automate things like sending an email, or sending an SMS via the Twilio service, etc.

The script below helps you send an email with bitcoin statistics such as the trade volume, the number of bitcoins mined so far, the number of transactions, etc. You can also schedule it with a cron job so that the script runs at a particular interval.
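A minimal sketch of the email step, using Python's standard `smtplib` and `email.message` modules. The SMTP host and the sender/recipient addresses below are placeholders, not values from the original post; substitute your own.

```python
import smtplib
from email.message import EmailMessage

# Placeholder settings -- replace with your own SMTP server and addresses.
SMTP_HOST = "smtp.example.com"
SENDER = "me@example.com"
RECIPIENT = "you@example.com"

def build_report(stats):
    """Compose the bitcoin-statistics mail from a stats dict."""
    msg = EmailMessage()
    msg["Subject"] = "Daily Bitcoin Statistics"
    msg["From"] = SENDER
    msg["To"] = RECIPIENT
    # One "key: value" line per statistic in the body
    body = "\n".join(f"{key}: {value}" for key, value in stats.items())
    msg.set_content(body)
    return msg

def send_report(msg):
    """Send the composed message through the configured SMTP server."""
    with smtplib.SMTP(SMTP_HOST) as server:
        server.send_message(msg)
```

To run it at a particular interval, a crontab entry such as `0 9 * * * /usr/bin/python3 /path/to/btc_report.py` (daily at 9 AM) would do the scheduling.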

Machine Learning APIs quest from qwiklabs

Hi Readers,

This month I got a free Machine Learning APIs quest (45 credits) from Qwiklabs after attending ‘Cloud OnAir: The Journey From Big Data to AI’. I need to complete all seven labs in the Machine Learning APIs quest to earn the ML badge.


Through this quest I got hands-on experience with Google Cloud APIs such as the Natural Language API and the Cloud Vision API, and built an AI chatbot with Dialogflow.

Quest Link:

Quora – crossed 1 million answer views

Hi Readers,

I am happy to share that I crossed 1 million answer views on 12/12/2017. I write answers to questions asked by fellow Quorans, mostly on topics like Windows, operating systems, Java, big data, Spring Boot, etc.


Feel free to follow my Quora profile or use the ‘Ask Me’ option to ask me a question.

Error initializing SparkContext: ‘sparkDriver’ failed after 16 retries!

Hi All,

I just came across the error ‘sparkDriver failed after 16 retries!’ while starting the Spark shell.


From the above screenshot, I found that the error occurred because the Spark UI port was not available. In this scenario I was using the Cloudera sandbox on Oracle VirtualBox.

I solved this issue with the following steps:

  1. Go to /usr/lib/spark/bin (path for the Cloudera sandbox)
  2. Open the file and add export SPARK_LOCAL_IP='' (you need to be the root user)
  3. Save the file and type ‘spark-shell’ to get the SparkContext initialized
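The steps above can be sketched as shell commands. The file name (`load-spark-env.sh`) and the address value (`127.0.0.1`) are assumptions for a sandbox VM, since the post does not name them; adjust both for your environment.

```shell
# Assumed path for the Cloudera sandbox; adjust for your install.
cd /usr/lib/spark/bin

# Append the export to the env-loading script (run as root).
# File name and the 127.0.0.1 value are assumptions for a sandbox VM.
echo "export SPARK_LOCAL_IP='127.0.0.1'" >> load-spark-env.sh

# Start the shell again; the SparkContext should now initialize.
spark-shell
```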

Thanks for reading this post. Subscribe to my blog for more awesome posts.

Solve for India – Chennai

Hi Readers,

Today, I went to the Solve for India workshop conducted by GDG India. The workshop started with a welcome note from the GDG speakers and continued with sessions on designing for Indian users (target 2020), the building-for-a-billion framework, Kubernetes, Firebase, the GCP console, and a 45-minute open house session.


Machine Learning Roadshow – Chennai

Hi Readers,

Today, I attended a Machine Learning workshop conducted by Google. It was a full-day tech session. The morning session covered machine learning basics, use cases, and TensorFlow 101; the afternoon session covered machine learning and TensorFlow codelabs (hands-on in a real GCP environment).


Apache Spark Streaming – Listen to a local streaming data using PySpark — Explained