Esha Dasgupta


View My LinkedIn Profile

View the Project on GitHub EshaDasgupta/portfolio

Education

2020-present: Ph.D. in Computer Science, University of Birmingham
2019-2020: M.Sc. in Advanced Computer Science, University of Birmingham
2016-2019: B.A. in Computer Science, University of Cambridge


Projects

Ph.D. project: Vision-based 3D Human Posture Estimation and Musculoskeletal Regression

A continuation of the Master's project outlined below; the goal is to estimate the muscle activations of a subject's body from a single RGB video.

Master's Project: Scientific implementation of machine learning algorithms in motion modelling pipelines

This project estimates the inverse dynamics of a person from video taken by an RGB camera. The 3D pose is computed from the 2D images, along with the ground-reaction forces acting on the subject. OpenSim is used to build a personalised body model; the subject's motion in the video is then converted into motion-capture records and used to compute the inverse kinematics, and subsequently the inverse dynamics, of the body.
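As a hypothetical illustration of the inverse-dynamics step (a full OpenSim model performs this per joint, over the whole musculoskeletal chain), the sketch below recovers the torque at a single rigid link from a sampled angle trajectory; the mass, length, and trajectory values are made up for the example.

```python
import numpy as np

# Single-link illustration: given measured joint angles theta(t), recover the
# joint torque via tau = I*theta_dd + m*g*l*sin(theta), with the angular
# acceleration theta_dd estimated by central finite differences.
M, L, G = 5.0, 0.4, 9.81       # link mass (kg), centre-of-mass offset (m), gravity
I = M * L**2                   # point-mass moment of inertia about the joint

def inverse_dynamics(theta, dt):
    """theta: sampled joint angles (rad); returns torque at interior samples."""
    theta = np.asarray(theta, dtype=float)
    theta_dd = (theta[2:] - 2 * theta[1:-1] + theta[:-2]) / dt**2
    return I * theta_dd + M * G * L * np.sin(theta[1:-1])

t = np.linspace(0, 1, 101)
theta = 0.2 * np.sin(2 * np.pi * t)   # a smooth test motion
tau = inverse_dynamics(theta, t[1] - t[0])
print(tau[:3])
```

The same idea scales up in the real pipeline: motion-capture records give the joint trajectories, and the solver inverts the equations of motion to obtain the forces and torques that produced them.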

Mini Project: Classifying Facial Emotion to explore a Hybrid Model

This work combined the categorical and continuous models of emotion by treating compound emotions as vectors projected onto a vector space whose basis vectors are the categorical emotions. The system was decoupled: a multi-input convolutional network broke an image down into its Facial Action Coding System (FACS) features, and a secondary linear network classified those features into a probability vector over the basic Ekman emotions plus neutral. The system was then used to categorise compound emotions and to examine whether similar emotions lay close together in the vector space.
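The vector-space idea can be sketched numerically: represent each compound emotion as a probability vector over the categorical basis and compare emotions by cosine similarity. The basis labels and the example weights below are illustrative assumptions, not the project's actual data.

```python
import numpy as np

# Basic Ekman emotions plus neutral serve as the basis of the emotion space.
BASIS = ["anger", "disgust", "fear", "happiness", "sadness", "surprise", "neutral"]

def emotion_vector(weights):
    """Normalise raw per-emotion scores into a probability vector over BASIS."""
    v = np.asarray(weights, dtype=float)
    return v / v.sum()

def similarity(a, b):
    """Cosine similarity between two emotion vectors (1.0 = identical direction)."""
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

# Two made-up compound emotions sharing dominant surprise/fear components:
awe     = emotion_vector([0, 0, 2, 1, 0, 4, 1])
startle = emotion_vector([0, 0, 3, 0, 0, 3, 0])
print(similarity(awe, startle))
```

Emotions that share dominant basis components score close to 1, which is the kind of relation the project checked for between similar compound emotions.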

Robotics Final Project: A Personal Assistant

This project designs and implements an autonomous personal assistant that uses voice recognition to receive a task and the recipient to whom a message or object should be delivered. Machine learning algorithms detect and classify objects and people, and the assistant schedules its tasks dynamically based on known room locations, the probabilistic location of each person, task priority, and upcoming tasks. The robot navigates between locations using AMCL localisation and RRT route planning.

Part II Project: Escape from a Hostile Terrain with Agents using Reinforcement Learning

A Deep Q-Learning algorithm is used to teach heterogeneous agents optimal prey policies when faced with a pseudo-intelligent opponent. The project investigates the impact of different reward functions, and the resulting agent behaviours when learning from a 'blank slate', to see whether they are comparable to real-world interactions. The system is written in Python.
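The underlying update rule can be shown in a minimal tabular form (the project itself uses a deep network in place of the table). The toy "escape" task below, with an agent moving along a five-cell track toward an exit, is an assumption for illustration only.

```python
import random

# Tabular Q-learning on a toy escape task: reach cell 4 from cell 0.
ALPHA, GAMMA, EPS = 0.5, 0.9, 0.1
ACTIONS = [-1, +1]                        # move left / move right
Q = {(s, a): 0.0 for s in range(5) for a in ACTIONS}

def step(state, action):
    nxt = max(0, min(4, state + action))
    reward = 1.0 if nxt == 4 else -0.01   # small step cost shapes escape behaviour
    return nxt, reward, nxt == 4

random.seed(0)
for _ in range(200):                       # training episodes
    s, done = 0, False
    while not done:
        # Epsilon-greedy action selection.
        if random.random() < EPS:
            a = random.choice(ACTIONS)
        else:
            a = max(ACTIONS, key=lambda x: Q[(s, x)])
        nxt, r, done = step(s, a)
        # Q-learning update: move Q(s,a) toward r + gamma * max_a' Q(s',a').
        best_next = max(Q[(nxt, x)] for x in ACTIONS)
        Q[(s, a)] += ALPHA * (r + GAMMA * best_next - Q[(s, a)])
        s = nxt

# The learned greedy policy moves right (toward the exit) from every cell.
policy = {s: max(ACTIONS, key=lambda x: Q[(s, x)]) for s in range(4)}
print(policy)
```

Swapping the table for a neural network approximator, and the toy reward for the reward functions under study, gives the Deep Q-Learning setup the project compares.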

IB Group Project: Anthropometrics today

This project takes two images of a person's head and uses edge detection together with facial-structure decomposition to extract structural measurements, then finds the closest match to that person in a provided database.
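The matching step amounts to a nearest-neighbour search over measurement vectors. In the sketch below, each person is represented by a vector of (hypothetical, normalised) facial measurements, and the database entries and values are invented for the example.

```python
import math

def closest_match(query, database):
    """database: dict of name -> measurement vector.
    Returns the name whose vector is nearest to the query (Euclidean distance)."""
    return min(database, key=lambda name: math.dist(query, database[name]))

# Made-up measurement vectors, e.g. [interocular distance, nose length, jaw width].
db = {
    "alice": [0.42, 0.31, 0.55],
    "bob":   [0.47, 0.29, 0.61],
    "carol": [0.38, 0.36, 0.50],
}
print(closest_match([0.46, 0.30, 0.60], db))
```

In the real pipeline the query vector comes from the edge-detection and structure-decomposition stages applied to the two head images.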