Research Interests

  • Mobile & Context-aware Computing
  • Artificial Intelligence
  • Privacy-preserving machine learning

Mar - Apr, 2017


iRoam - Intelligent Real-time Observation & Analysis of Movement

Placed 1st - $1,000 prize by Philips - Sponsored by Dr. Sumi Helal

Converted crutches and walkers into IoT devices to optimize rehabilitation for leg injuries and assist the elderly through real-time monitoring and activity tracking. Used ARM Mbed boards to integrate pressure and heart rate sensors and relay their readings to our cloud-based Node.js server, where the data is analyzed and presented in intuitive visualizations for medical practitioners and caretakers of the elderly.
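As a rough illustration of the kind of server-side analysis described above, the sketch below classifies how much weight a user is putting on an instrumented crutch. The field names, units, and thresholds are illustrative assumptions, not the values used in the actual project.

```javascript
// Hypothetical sketch: classify a crutch-pressure reading relayed from the
// Mbed board. The 30%/50% offload thresholds are illustrative only.
function assessWeightBearing(reading) {
  const { crutchPressure, bodyWeight } = reading; // same (assumed) units
  const offloadRatio = crutchPressure / bodyWeight; // share of weight on crutch
  if (offloadRatio > 0.5) return 'high-offload';    // leaning heavily on crutch
  if (offloadRatio > 0.3) return 'partial';         // typical rehab target band
  return 'low-offload';                             // little weight on crutch
}

console.log(assessWeightBearing({ crutchPressure: 40, bodyWeight: 70 })); // 'high-offload'
```

A real deployment would run this per reading stream and feed the labels into the visualizations shown to practitioners.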

Feb - Mar, 2017


Campus Heat Map - IoT app with ARM Mbed, Bluetooth beacons, Node.js, and MQTT

Self-Initiated Project

A campus app and an administrative dashboard for creating a dynamic, real-time heat map showing where student crowds are across campus at any time. Beacons are deployed in gathering areas to serve as a physical web that initiates the reporting of presence in a given area. An Android app runs as a background service that listens for beacons and, on detection, reports to a Node.js server. The server keeps track of present users via reports from their phones, as well as from other sensors in the built environment, for instance capacity counters. Mbed boards are used to simulate the sensing of a user by means of a capacity counter installed at classroom entrances; when this count exceeds a threshold, an LED on the Mbed board lights up.
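The server-side aggregation behind the heat map can be sketched as follows. The report shape, area names, and threshold value are illustrative assumptions; the over-capacity flag corresponds to the condition that lights the LED on the Mbed board.

```javascript
// Illustrative sketch: count presence reports per beacon area and flag areas
// whose occupancy exceeds a threshold (the LED-trigger condition).
function buildHeatMap(reports, threshold) {
  const counts = {};
  for (const { areaId } of reports) {
    counts[areaId] = (counts[areaId] || 0) + 1;
  }
  const overCapacity = Object.keys(counts).filter((a) => counts[a] > threshold);
  return { counts, overCapacity };
}

const reports = [
  { areaId: 'library' }, { areaId: 'library' },
  { areaId: 'library' }, { areaId: 'cafeteria' },
];
console.log(buildHeatMap(reports, 2));
// { counts: { library: 3, cafeteria: 1 }, overCapacity: [ 'library' ] }
```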

Feb - Apr, 2017


KinectED Living

Natural User Interaction (Course Project) - Kinect 2.0, Arduino, Kinect Speech SDK, Kinect Discrete Gesture, Microsoft Speech SDK, C#

An automated home application that allows multimodal interaction, using Kinect's gesture and speech recognition capabilities to control devices in a living space. The gesture identifies the target device, while the speech command changes its state. The user can also control any device through speech or gesture alone.
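The fusion logic (gesture selects the device, speech sets its state) can be sketched as below. The project itself was written in C#; this JavaScript illustration uses assumed device names and command phrases.

```javascript
// Sketch of gesture + speech fusion: the gesture names the target device,
// the speech command sets its state. Names and commands are illustrative.
const devices = { lamp: 'off', fan: 'off' };

function handleInteraction(gestureTarget, speechCommand) {
  if (!(gestureTarget in devices)) return null; // unknown device
  if (speechCommand === 'turn on') devices[gestureTarget] = 'on';
  else if (speechCommand === 'turn off') devices[gestureTarget] = 'off';
  return devices[gestureTarget];
}

console.log(handleInteraction('lamp', 'turn on')); // 'on'
```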

Sep - Dec, 2016


Weather Sensor Network for Disease Prediction in High Value Crops using GPRS & Bluetooth Radio

Wireless Ad Hoc Networks (Course Project) - JavaScript, Node.js (Express framework), Android Programming, Embedded C, Wireless - Bluetooth, GSM

We employed temperature, wind speed, and relative humidity sensors, whose measurements are used to predict the onset of disease in crops so that farmers spray pesticides only when prompted instead of year-round. This helps farmers avoid excessive pesticide use and save on costs. The sensor data is analyzed and presented graphically to farmers on the web or in an Android application, and the server pushes email notifications to alert farmers to the onset of disease in their crops.
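A minimal sketch of the alert rule: flag disease risk when temperature and relative humidity stay in a conducive band for several consecutive readings. The band (20-30 °C, >85% RH) and the window length are illustrative assumptions, not the model used in the project.

```javascript
// Illustrative disease-risk rule: N consecutive readings inside a conducive
// temperature/humidity band trigger an alert (the email notification).
function diseaseRisk(readings, window = 3) {
  let streak = 0;
  for (const { tempC, humidityPct } of readings) {
    const conducive = tempC >= 20 && tempC <= 30 && humidityPct > 85;
    streak = conducive ? streak + 1 : 0;
    if (streak >= window) return true;
  }
  return false;
}
```

For example, three consecutive readings around 25 °C and 90% RH would return `true`, while a single humid reading would not.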

Nov - Dec, 2016


Cloud-Edge-Device Architecture based Internet of Things Platform

Distributed Operating System Principles (Course Project) - JavaScript, Node.js (Express framework), Android Programming, Embedded C, BeagleBone Black, MQTT-SN

Developed a Cloud-Edge-Device architecture based Internet of Things platform. The project involved developing augmentations for Xinu OS to support the cloud. My responsibilities included developing RESTful APIs using Node.js, which enabled energy-efficient access to sensor data through push/pull queries with caching. Device driver generation was automated at the cloud by means of a Device Description Language description of each sensor/actuator. The platform was tested using a BeagleBone Black, a local edge PC, and a cloud server.
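The energy-saving caching behind the pull queries can be sketched like this: a recent reading is served from the cloud cache, and the device is only woken when the entry is stale. The TTL and the fetch callback are illustrative stand-ins, not the project's actual API.

```javascript
// Sketch of a read-through cache for pull queries: fresh entries are served
// from memory; stale entries trigger an energy-costly device round trip.
function makeCachedReader(fetchFromDevice, ttlMs) {
  const cache = new Map(); // sensorId -> { value, at }
  return function read(sensorId, now = Date.now()) {
    const entry = cache.get(sensorId);
    if (entry && now - entry.at < ttlMs) {
      return { value: entry.value, fromCache: true };
    }
    const value = fetchFromDevice(sensorId); // wake the device
    cache.set(sensorId, { value, at: now });
    return { value, fromCache: false };
  };
}
```

With a 1-second TTL, two reads 500 ms apart hit the device only once; a read after 2 seconds fetches again.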

Jan - Apr, 2013


Voice Controlled Robot Using An Android Device

Senior Year Thesis - Android Programming, Java, Embedded C, Wireless - Bluetooth, Machine Learning

The project uses an Android device to remotely control a robot. This is achieved by developing an Android application that recognizes the user's speech and transmits the commands wirelessly to a microcontroller-based robot. The robot is equipped with a wireless camera to provide a live feed of its surroundings. In the developed prototype, we command the robot to move forward, backward, or turn left or right while it simultaneously provides live visual feedback to the controller. Applications include surveillance and places difficult for humans to reach, e.g., small pipelines, fire situations, and highly toxic areas.
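The speech-to-command step can be sketched as a small lookup from recognized phrase to the command byte sent over Bluetooth to the microcontroller. The command words and byte values here are assumptions for illustration, not the project's actual protocol.

```javascript
// Illustrative mapping from recognized speech to a single command character
// sent over Bluetooth to the robot's microcontroller.
const COMMANDS = { forward: 'F', backward: 'B', left: 'L', right: 'R', stop: 'S' };

function speechToCommand(transcript) {
  const word = transcript.trim().toLowerCase();
  return COMMANDS[word] ?? 'S'; // fail safe: stop on unrecognized input
}

console.log(speechToCommand('Forward')); // 'F'
```

Defaulting to stop on unrecognized input is a deliberate safety choice for a physically moving robot.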

Jul - Sep, 2011


Wireless Shopping Assistant For The Visually Impaired

Self-Initiated Project - Microcontroller (Embedded C), Background GUI (VB), DB (MS Access)

Wireless Shopping Assistant for the Visually Impaired is a system designed to assist blind or sight-impaired shoppers in independently identifying and selecting products off the store shelf without the need for human assistance. It builds on the fact that although the sight impaired may lack full use of their visual sense, they retain functional tactile and auditory perception as their strengths.

May - Jul, 2014


Teacher Assistant

Giving back to the college

Android app with an easy-to-use UI for professors to track lecture attendance. It replaced the error-prone manual register, saved the time and effort of copying data from physical logs to Excel sheets, automatically synced logs to the server, and flagged students short on attendance.

You can download my resume here


Resume (PDF)

Email

Do send me an email if you have any queries, and I will make sure to get back to you.
siddharth.4.gupta@gmail.com
siddharthgupta.er@gmail.com

Address

California