December Tech Talks : GDayX Mumbai & DevFest Ahmedabad

December has been a busy month for talks. Among the various talks I gave, a few stood out for their audience, topic and the size of the event. Also enclosed are the presentations for each of those talks.

GDayX Mumbai, December 13, 2014

This was the first GBG Mumbai (Google Business Group) meeting that I attended, and GDayX Mumbai 2014 was a superb event. More than 15 talks across the day, where one entrepreneur after another gave us practical advice on how they go about doing business. I learnt a lot from this event.

I did my little part by speaking on “Wearable Computing”. The talk was titled “Getting Business Ready for Wearables”. The talk gave an overview on the current state of the wearable market, the areas in which wearable computing is making a splash, the business factors to consider before bringing in wearables into your organization and user experience.

Google DevFest Ahmedabad, December 14 2014

GDG Ahmedabad always rocks! And I was glad to be a part of DevFest Ahmedabad this year. The event was attended by 250-300 developers and held sessions across two tracks: Mobile and Web. I gave 2 talks at the event:

  • Powering your Apps via Google Cloud Platform: This talk gave an overview of Google Cloud Platform and then zoomed into developing REST APIs using Google Cloud Endpoints, with a final demo showing how we used Arduino, Python and Google Cloud Platform to not just record temperature but also push it into the Google Cloud and eventually use Google BigQuery for analytics.
  • Gradle -> Your friend in Android Studio: This talk addressed 2 important tools that I believe all Android developers will eventually have to learn: Gradle and Android Studio. My goal in the talk was to explain the concepts of Gradle (almost following the 80:20 rule) and how it all fits within Android Studio (multi-module projects, product variants and more).

I wish to thank the chapters (Google Business Group Mumbai and Google Developers Group Ahmedabad) for giving me the opportunity to speak and meet up with fellow professionals.

Please reach out if you have any questions and would like to get more details.

Temperature Logger powered by Arduino + Orchestrate

This blog post gives an overview of how I created a simple IoT project using an Arduino, a temperature sensor and Orchestrate, the excellent Database as a Service.

First up, let me explain what the eventual goal is and how this is a first step in that process. The goal is to set up a series of low-cost climate/environment modules that capture various types of data like temperature, humidity and more, then take all this data and put it in the cloud, where we can eventually build out dashboards, alerts and more. This blog post explains one single setup, where we create a system comprising an Arduino Uno, a temperature sensor and a Python application that reads the data from the Arduino Uno (yes, I did not use an Arduino Internet Shield) and posts that data to the cloud.

Towards this goal, choosing Arduino as the microcontroller is a no-brainer, though I do plan to look into other controllers in the near future. Once our system is collecting data, the important question is where to put it. The Cloud comes up as a rational choice: eventually all the Temperature Sensor stations can push their data there, and we can monitor and build dashboards from a single place. One option was to build a backend using tools like Google Cloud Platform’s App Engine, but that would mean quite a bit of extra work for something that is not immediately central to this project. I went with Orchestrate, a Database as a Service provider that makes it dead simple to funnel your data from sources into your database in the cloud, as the post will eventually show.

The Hardware Setup

I used the following:

  • Arduino Uno microcontroller
  • LM35 Temperature Sensor
  • Eventually a Raspberry Pi will interface with the Arduino to read and transmit the values, but to validate things for now, the Uno was powered via a laptop/desktop with Python installed on it. The communication between the Uno and the PC is via serial port communication.

Arduino Uno + Temperature Sensor Setup

Here is how the LM35 sensor is connected to the Arduino Uno board.


Of course, we used a breadboard to connect all this together, but I am simplifying the diagram here so that you know what is connected to which pin. The LM35 has 3 pins. The first one goes to the 5V power pin on the Arduino, the 3rd one is GND and the middle pin is VOUT, where it emits the values that we need to capture. We connect this to the Analog Pin (A0) on the Arduino. We can then write our Arduino code to read that value, as is shown next.

Arduino Code

The Arduino code is straightforward, as given below:

float temp;
int tempPin = 0;   // LM35 VOUT is connected to Analog Pin A0

void setup() {
  Serial.begin(9600);
}

void loop() {
  temp = analogRead(tempPin);
  temp = temp * 0.48828125;  // convert the ADC reading to degrees Celsius
  Serial.println(temp);
  delay(10000);              // wait 10 seconds between readings
}

You will notice in the loop that every 10 seconds, we are printing out the temperature value that was read from the Analog Pin (#0).
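The conversion factor 0.48828125 comes from the LM35 emitting 10 mV per degree Celsius and the Uno's 10-bit ADC mapping 0-5 V onto readings 0-1023. A quick sketch of the arithmetic (plain Python, just for illustration):

```python
# The LM35 outputs 10 mV per degree Celsius; the Uno's 10-bit ADC maps
# 0-5 V (5000 mV) onto integer readings 0-1023.
def adc_to_celsius(reading):
    millivolts = reading * 5000.0 / 1024.0  # one ADC step is ~4.88 mV
    return millivolts / 10.0                # 10 mV per degree Celsius

# The constant used in the Arduino sketch is exactly 5000 / 1024 / 10:
assert adc_to_celsius(1) == 0.48828125
```

So a raw reading of 61, for example, corresponds to roughly 29.8 degrees Celsius.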

If you run the Serial Port Monitor that comes with the Arduino IDE and if the Arduino is powered up and connected as per the diagram shown, then you will find the Temperature value being printed on the Serial Monitor as given below:


Once this happens, we know that the Arduino setup is looking good and all we need to do now is write a client program on the PC that interfaces with this Arduino, reads the values via the serial port and then pushes them to the cloud (the Orchestrate database service).

But first, let us take a look at the Orchestrate setup that we did. Orchestrate is a Database as a Service provider. One of my reasons for going with Orchestrate is to eventually make use of their strengths in providing a solid REST API around your data. What this means is that I am not just interested in making it dead simple to store my data in the cloud. Eventually I will need an API that allows me to do sophisticated searches, time-ordered events and more, and Orchestrate fits that model.

One can get started for free with Orchestrate to check out their service, which is what I did. The free tier gives you one application along with a limit of 50K API calls / month, which is good enough for my prototype for now.

Once logged into Orchestrate, I created the Application by clicking on New Application. A dialog popped up where I provided the Application name (Temperature_Log) and selected a data center as shown below:


That’s it. The Application is created and the next step was to define the Collection. A collection is similar to a table into which all the data records go. Orchestrate will inspect the data that you upload and create the schema for you, so all you need to do for now is define your collection, which is what I did.

Simply click on New Collection and provide the collection name, as given below:




That’s all I needed to do to set up my Application and one collection in Orchestrate.

If you go to the Dashboard, you will see the collection for your application along with the API KEY that we shall use soon. The API KEY is used when posting data into Orchestrate from your client application and identifies you to the Orchestrate service.

Python Code

Now, let us move on to the Python code that interfaces over the serial port to read the temperature values from the Arduino setup and posts that data into Orchestrate.

The steps are simple:

1) We initialize 2 things: the Orchestrate API client, using our API Key, and the serial port communication via which we will read the temperature values that the Arduino unit emits every 10 seconds. You need to figure out which serial port on your machine is interfaced to the Arduino.

2) Every 10 seconds, the code will read the value from the Serial Port. We can obviously build in more validations in the code, but this is good for now to demonstrate how all the pieces come together.

3) Once we get the data, the code uses the Orchestrate REST service to post it into your Application’s data collection. To make things easier, Orchestrate has a solid list of client language APIs available that you can use to ease your task. I used porc, the Python client library for Orchestrate. To set up that library, all one had to do was:

pip install porc

The important thing to note in the POST to the Orchestrate service is the code below. Here, all we are doing is specifying the collection (Temperature_Data) that we had created for our Application in Orchestrate, and the data. The data is specified in JSON format and the fields that I am specifying are:

  • Temperature in Centigrade
  • Date of the recording
  • Time of the recording
  • Location Name (Weather Station Name).

Take a look at the Python client program below:

import serial
import time
from porc import Client

API_KEY = "Your API Key"
# create an Orchestrate client using the default AWS US East host:
client = Client(API_KEY)

# make sure our API key works
client.ping().raise_for_status()

#Connect to Serial Port for communication
ser = serial.Serial('COM15', 9600, timeout=0)

#Setup a loop to send Temperature values at fixed intervals
#in seconds
fixed_interval = 10
while 1:
  try:
    #temperature value obtained from Arduino + LM35 Temp Sensor
    temperature_c = ser.readline()
    #current time and date
    time_hhmmss = time.strftime("%H:%M:%S")
    date_mmddyyyy = time.strftime("%d/%m/%Y")

    #current location name
    temperature_location = "Mumbai-Kandivali"
    print temperature_c + ',' + time_hhmmss + ',' + date_mmddyyyy + ',' + temperature_location

    #insert the record into the Temperature_Data collection
    response = client.post('Temperature_Data', {
                            "location" : temperature_location,
                            "date" : date_mmddyyyy,
                            "time" : time_hhmmss,
                            "value" : temperature_c})
    print "Record inserted. Key = " + response.key

    #wait for the next reading
    time.sleep(fixed_interval)
  except serial.SerialException:
    print('Error! Could not read the Temperature Value from unit')

Checking our data

The final step was to validate that our data was being transmitted successfully into Orchestrate. All you need to do is ensure that when the Python code executes, you get the Record inserted message.

A sample run of the Python code is shown below:


Once that is done, come back to Orchestrate Dashboard and visit your collection. A sample snapshot of the Search Query for my collection is shown below:


This validates the end to end working of the project.

In Summary

Arduino makes electronics prototyping fun. With languages like Python and services like Orchestrate, the process of collecting / transmitting / saving the data in the cloud is made simple too. I hope to explore Orchestrate in more projects for its other powerful features like Time Events and Graphs.

You can use the steps outlined above to build your own version of a Data Collection IoT Project that uses the Cloud to store and analyze its data.

Till next time.



Google Cloud Monitoring API Tutorial

If you have any projects running on Google Cloud Platform, one of the things that you want to do is understand various metrics like response times, latencies, database statistics and more. Recently, Google released the Cloud Monitoring API, which allows you to get information on various metrics. These metrics are only expected to grow with time and will help you in numerous ways. These include building dashboards, writing custom alerts, understanding your monthly bills and more.


I covered a short tutorial from a developer perspective on Google Cloud Monitoring API on ProgrammableWeb recently. Here is the link to the article.


Hello Developer

There has never been a better time to be a developer. I believe this statement will remain true irrespective of the times we live in.


As a developer today, it is important to look at the full stack when developing applications. This includes:

  • Web client
  • Native client (Android / iOS / etc)
  • Server side. Yes. You need the server too to power your mobile application functionality.

And not just that, it is important to understand design. Design is increasingly becoming an area that developers need to understand and be part of, to create applications that not just wow your users but are also intuitive to use.

As part of its Developer outreach, Google has teamed up with Udacity to provide several courses that address each of the above areas. These courses, in my opinion, are supported by best-in-class tools, teachers and materials, and if you were looking to start off, these are great resources.

Here are the courses:

  1. Android Development
  2. UX Design for Mobile Developers
  3. Developing Scalable Apps on App Engine (PaaS)
  4. Website Performance Optimization
  5. There are a couple of other courses, which have been there for a while and are still available:
    1. Mobile Web Development
    2. HTML5 Game Development

All the materials have been completely opened up for access by anyone. You do not have to pay the $150 / month fee. Just sign up and go for the Courseware link.

I recommend these courses strongly and hope you do take them. For inspiration, check out Reto Meier’s article titled “Enabling the next 50 Million Developers” and build apps for all.

Dive into HP IDOL OnDemand APIs

The future lies in processing data and deriving some value from it. Often, the process is tedious and could involve multiple sources of data, images, videos and more to link together.


HP IDOL OnDemand is a great set of APIs made available by the HP Cloud Platform that makes things much easier for the developer.

Check out my article at ProgrammableWeb that goes into the details of the HP IDOL OnDemand APIs and code snippets to get started on them today.

Google Cloud Endpoints Tips #6 : The @APIResourceProperty Annotation

In this tip, we are going to discuss @ApiResourceProperty, a very handy annotation that you can use for your API.

The documentation states “@ApiResourceProperty provides more control over how resource properties are exposed in the API. You can use it on a property getter or setter to omit the property from an API resource. You can also use it on the field itself, if the field is private, to expose it in the API. You can also use this annotation to change the name of a property in an API resource.”

I have intentionally highlighted some of the words to make you focus on what this annotation can be used for. The first two points should be clear enough, i.e. you can omit any property from appearing in the API response or expose a private property.

The third one is something that I believe is very valuable and can make a big difference to your API. Changing the name of a property is useful in two specific cases (there might be others, so do chime in with your comments to augment my points!):

  • Make the API Property more understandable
  • Reduce the length of the API property name to a short form. This is useful if you really care about the number of bytes in the response. For example, you might have an API property named “AddressLine1” and you might just want it to be “AL1”, and so on.

Let us understand it with an example:

@PersistenceCapable(identityType = IdentityType.APPLICATION)
public class Task {

    @Persistent(valueStrategy = IdGeneratorStrategy.IDENTITY)
    @ApiResourceProperty(name = "tid")
    Long id;

    @ApiResourceProperty(name = "desc")
    String description;

    @ApiResourceProperty(ignored = AnnotationBoolean.TRUE)
    String status;

    // Getter / Setter methods
}

Note the following:

  • For the id property, we are using the @ApiResourceProperty annotation and providing it a name = "tid". This means that in the response the field label will be tid.
  • For the description property, we are shortening the property name to desc.
  • For the status property, we are setting the ignored attribute to TRUE, so that it does not appear in the result.
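To see the net effect of the three annotations, here is a toy Python sketch (not Endpoints code; the record values are made up) of how they reshape the serialized resource:

```python
# Hypothetical Task record and the resource it would serialize to once the
# @ApiResourceProperty annotations take effect (illustrative values only):
task = {"id": 1, "description": "Prepare slides", "status": "OPEN"}

renames = {"id": "tid", "description": "desc"}  # the name = "..." overrides
ignored = {"status"}                            # ignored = AnnotationBoolean.TRUE

api_resource = {renames.get(k, k): v
                for k, v in task.items() if k not in ignored}
assert api_resource == {"tid": 1, "desc": "Prepare slides"}
```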

A sample listTask method invocation yields the following JSON response for the test data that I have in my system:


Hope this makes things clear. Go ahead and employ @ApiResourceProperty!

List of Cloud Endpoints Tips 

  1. Check Endpoint Deployment Status
  2. Throw the Right Exception Classes
  3. Understand Injected Types
  4. Understand @API and @APIMethod Annotations
  5. Using Cursor and Limit parameters
  6. The @APIResourceProperty Annotation

Google Cloud Endpoints Tips #5 : Using Cursor and Limit parameters

One of the design decisions that you need to take with your API is how much data to return. This is especially important for the list*() methods in your Endpoints implementation.

For example, if your API is dealing with blog comments, there is a possibility of a few blog posts being popular and having hundreds of comments. The correct approach for the API would be to provide the client application using the API with mechanisms to control the following two aspects of any list*() call:

  • Specify the number of results per API call. This is similar to results per page.
  • Allow subsequent calls to specify the particular page or location from where to return the next N results of the API.

Google Cloud Endpoints addresses this via the cursor and limit parameters. The limit parameter tells the API how many results to retrieve, and the cursor parameter tells it from which point to retrieve them.
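The semantics can be sketched with a toy model (plain Python, not the generated Endpoints code): limit bounds the page size, and the cursor returned with one page is fed back in to fetch the next.

```python
def list_tasks(tasks, cursor=None, limit=None):
    """Toy cursor/limit paging over an in-memory list of tasks."""
    start = 0 if cursor is None else int(cursor)
    end = len(tasks) if limit is None else start + limit
    page = tasks[start:end]
    # The cursor is None once the last page has been served.
    next_cursor = str(end) if end < len(tasks) else None
    return page, next_cursor

tasks = ["t1", "t2", "t3", "t4"]
page, cursor = list_tasks(tasks, limit=2)                 # first page
assert page == ["t1", "t2"] and cursor == "2"
page, cursor = list_tasks(tasks, cursor=cursor, limit=2)  # next page
assert page == ["t3", "t4"] and cursor is None
```

In the real API the cursor is an opaque web-safe token rather than an index, but the request/response dance is the same.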

To better understand this, we will work with a simple example. It will also demonstrate that the Google API Explorer, via which we can test our API, makes it easier for us to understand these parameters.

Let us start with an empty App Engine project and create an Entity, say Task, as shown below:

@PersistenceCapable(identityType = IdentityType.APPLICATION)
public class Task {

    @Persistent(valueStrategy = IdGeneratorStrategy.IDENTITY)
    Long id;

    String description;

    String status;

    // Getter / Setter methods
}

Now, generate the Cloud Endpoints code for this Task class via the Eclipse plugin (or Maven / command line, depending on your preference). Within the generated Endpoint class, simply focus on the listTask signature that is shown below:


public CollectionResponse<Task> listTask(
 @Nullable @Named("cursor") String cursorString,
 @Nullable @Named("limit") Integer limit)

You will notice the 2 parameters that we have discussed before, i.e. cursor and limit.

Go ahead and start the application. Go to the API Explorer locally via the /_ah/api/explorer URL and insert at least 4-5 Task records.

Now, go to the API Explorer and the listTask method. The form is shown below:



If you simply click Execute here without entering any values in the cursor and limit fields, then you will get the result shown below. In my case, I had created 4 Task records and hence all 4 records are returned.

Now, let’s say that we want to retrieve just 2 records per request and no more. Simply provide that value in the limit parameter as shown below and click on Execute.


This will return just the first two records. It starts from the beginning since we have not provided any value in the cursor, i.e. the point from which we should start.

Notice in the above screen, i.e. the API response, that a nextPageToken value is returned. This is the value that you need to use in your subsequent calls to the listTask method to get the next set of records, as per the limit and from the right point. So, we provide that value in the cursor as shown below and click Execute.



We get the next 2 records as shown below:



Remember that the cursor and limit parameters are optional. If you do not provide these values, the generated Endpoint code takes care of the defaults in the listTask implementation.

Hope this makes things clear on how you can ask only for partial results per API Request.
