Developing an investment assistant tool for novice entrants to the financial markets. Building a risk profile for each client that incorporates their expected return, risk aversion, asset class biases, basket size and holding period. Shortlisting a set of assets from the universe of assets for each risk profile using tick data and both immediate and far-sighted news-based financial sentiment on each asset. Performing dynamic portfolio optimisation using deep reinforcement learning techniques to maximise the investment expectation for each client.
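A minimal sketch of the risk-profile and shortlisting steps described above; the field names, the 50/50 blend of near- and far-horizon sentiment, and the bias-times-sentiment score are illustrative assumptions, not the tool's actual design:

```python
from dataclasses import dataclass, field

@dataclass
class RiskProfile:
    """Client risk profile; field names are illustrative assumptions."""
    expected_return: float                # annualised target, e.g. 0.08
    risk_aversion: float                  # higher = more conservative
    asset_class_biases: dict = field(default_factory=dict)  # e.g. {"equity": 0.6}
    basket_size: int = 10                 # number of assets to hold
    holding_period_days: int = 252

def shortlist(universe, profile):
    """Keep assets whose class the client favours and whose blended
    news sentiment is positive; rank by bias * sentiment."""
    keep = []
    for asset in universe:
        bias = profile.asset_class_biases.get(asset["asset_class"], 0.0)
        # blend short- and long-horizon sentiment (equal weights assumed)
        sentiment = 0.5 * asset["sentiment_near"] + 0.5 * asset["sentiment_far"]
        if bias > 0 and sentiment > 0:
            keep.append((bias * sentiment, asset["ticker"]))
    keep.sort(reverse=True)
    return [ticker for _, ticker in keep[:profile.basket_size]]
```

The deep-RL allocation stage would then run on the shortlisted basket only.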
Statistical Arbitrage based Trading Strategy
Developed a pairs trading strategy by implementing the Engle-Granger procedure to test for
cointegration. Fitted the resulting spread of each pair to an Ornstein-Uhlenbeck process and designed a
trading strategy around it.
Employed Kalman filters to compute adaptive cointegration weights and built an adaptive trading strategy to tackle
the loss of robustness of static weights.
Developed a backtesting algorithm for the strategy, implemented 18 performance metrics and
used a PyFolio tear sheet to analyse the structure of the strategies' P&L.
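The Ornstein-Uhlenbeck fit at the core of the strategy can be sketched as follows. The exact AR(1) discretisation is standard; the z = 1 entry band in the signal rule is an illustrative assumption, not the project's calibrated threshold:

```python
import numpy as np

def fit_ou(spread, dt=1/252):
    """Fit dX = theta*(mu - X) dt + sigma dW to a spread series via the
    exact AR(1) discretisation X_{t+1} = a + b*X_t + eps."""
    x, y = spread[:-1], spread[1:]
    b, a = np.polyfit(x, y, 1)                       # slope, intercept
    theta = -np.log(b) / dt                          # mean-reversion speed
    mu = a / (1.0 - b)                               # long-run mean
    resid = y - (a + b * x)
    sigma = resid.std(ddof=2) * np.sqrt(2.0 * theta / (1.0 - b ** 2))
    sigma_eq = sigma / np.sqrt(2.0 * theta)          # stationary std of the spread
    return theta, mu, sigma_eq

def signal(x_t, mu, sigma_eq, z=1.0):
    """Band rule: short the spread above mu + z*sigma_eq, long below."""
    if x_t > mu + z * sigma_eq:
        return -1
    if x_t < mu - z * sigma_eq:
        return +1
    return 0
```

In the adaptive variant, the static hedge ratio from the Engle-Granger regression would be replaced by the Kalman-filtered weights before the spread is formed.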
Interest Rate Modeling for Counterparty Risk
Computed the Credit Valuation Adjustment charge, as specified in the Basel III standards, for a vanilla interest rate swap.
Bootstrapped the term structure of the hazard rates for the counterparty from its market Credit Default Swap prices.
Simulated multiple interest rate curves using the Heath-Jarrow-Morton framework to find the expected exposure of the contract.
Developed a 3-factor model whose volatilities were bootstrapped by performing Principal Component Analysis on the BOE yield curve.
The drift was then computed from these volatilities to satisfy the no-arbitrage principle.
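The hazard-rate bootstrap from CDS spreads can be sketched with the credit-triangle approximation λ ≈ s / (1 − R). A full bootstrap would equate the premium and protection legs tenor by tenor, so this is a deliberate simplification of the procedure described above:

```python
import numpy as np

def bootstrap_hazards(tenors, spreads, recovery=0.4):
    """Piecewise-constant hazard rates from CDS par spreads using the
    credit-triangle approximation (cumulative hazard ~ s*T / (1 - R))."""
    H = np.asarray(spreads) * np.asarray(tenors) / (1.0 - recovery)
    dH = np.diff(np.concatenate([[0.0], H]))
    dT = np.diff(np.concatenate([[0.0], np.asarray(tenors, dtype=float)]))
    return dH / dT

def survival(tenors, lam, t):
    """Survival probability P(tau > t) under piecewise-constant hazards."""
    grid = np.concatenate([[0.0], np.asarray(tenors, dtype=float)])
    H = 0.0
    for i in range(len(lam)):
        lo, hi = grid[i], grid[i + 1]
        H += lam[i] * max(0.0, min(t, hi) - lo)
        if t <= hi:
            break
    return np.exp(-H)
```

The survival curve from `survival` would then be combined with the simulated expected exposure profile to produce the CVA charge.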
Pricing a Basket Credit Default Swap
Priced a kth-to-default Basket Credit Default Swap for 5 securities as an expectation over the joint distribution of their default times.
Initially, estimated the correlation matrix from the equity returns of each reference entity and used it to calibrate Student-t and Gaussian copulas.
Simulated co-dependent uniform variables from the copulas and converted them to default times for each reference entity.
Computed the premium and default leg payoffs of the kth-to-default instruments from the simulated default times and derived their fair spreads.
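The Gaussian-copula half of the simulation can be sketched as below, assuming a flat hazard rate per name for simplicity (the project's hazards would come from a calibrated term structure):

```python
import numpy as np
from math import erf, sqrt

def simulate_default_times(corr, hazard, n_paths=10000, seed=0):
    """Gaussian-copula default times: Z ~ N(0, corr), U = Phi(Z),
    tau = -ln(U) / lambda (flat hazard per reference name)."""
    rng = np.random.default_rng(seed)
    L = np.linalg.cholesky(corr)                     # correlate the normals
    z = rng.standard_normal((n_paths, len(hazard))) @ L.T
    phi = np.vectorize(lambda v: 0.5 * (1.0 + erf(v / sqrt(2.0))))
    u = phi(z)                                       # co-dependent uniforms
    return -np.log(u) / np.asarray(hazard)           # exponential default times

def kth_to_default_time(tau, k):
    """Time of the k-th default on each simulated path."""
    return np.sort(tau, axis=1)[:, k - 1]
```

For the Student-t copula, the correlated normals would be divided by a chi-square mixing variable before mapping through the t CDF; the rest of the pipeline is unchanged.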
Portfolio Allocation through Meta Heuristic Optimization
Performed portfolio allocation on asset baskets of 12, 24, 48 and 96 assets. Computed the return vector and covariance matrix for each basket from historical data, with greater weight given to more recent observations.
Used the Sharpe Ratio and the 99% Value-at-Risk weighted return as the objective functions.
Employed meta-heuristic optimisation techniques to optimise the allocations and compared the results with a traditional mean-variance optimisation approach.
Drew comparisons based on convergence and performance, and analysed the potency of these methods for solving non-convex optimisations in non-differentiable settings, such as those that include transaction costs and other market-microstructure elements.
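One such meta-heuristic can be sketched as a minimal DE/rand/1/bin differential evolution loop maximising the Sharpe ratio; the population size, F and CR values are conventional defaults, not the project's tuned settings:

```python
import numpy as np

def sharpe(w, mu, cov):
    """Sharpe-style objective on long-only weights (risk-free rate taken as 0)."""
    w = np.abs(w)
    w = w / w.sum()                              # project onto the simplex
    return (w @ mu) / np.sqrt(w @ cov @ w)

def differential_evolution(obj, dim, pop=30, iters=200, F=0.8, CR=0.9, seed=0):
    """Minimal DE/rand/1/bin loop that maximises obj."""
    rng = np.random.default_rng(seed)
    X = rng.random((pop, dim))
    fit = np.array([obj(x) for x in X])
    for _ in range(iters):
        for i in range(pop):
            a, b, c = X[rng.choice(pop, 3, replace=False)]
            mutant = a + F * (b - c)             # differential mutation
            trial = np.where(rng.random(dim) < CR, mutant, X[i])
            f = obj(trial)
            if f > fit[i]:                       # greedy selection
                X[i], fit[i] = trial, f
    best = np.abs(X[fit.argmax()])
    return best / best.sum(), fit.max()
```

Because the fitness call is a black box, the same loop runs unchanged when transaction costs or other non-differentiable terms are added to the objective, which is the setting where the comparison against mean-variance optimisation is most interesting.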
Credit Card Fraud Detection through GANs
Tackled the class imbalance problem in a credit card fraud dataset, where fraudulent to non-fraudulent records were in a 1:6000 ratio, to improve the training performance of an artificial neural network.
Used generative adversarial networks to augment the dataset and improve the ANN's performance. Compared vanilla, Margin Adaptive, Least Squares, Wasserstein and Relaxed Wasserstein GAN implementations for this use case.
Studied the distribution of the data produced by each network and its training convergence. Further analysed the strengths and weaknesses of each network and its potential use cases.
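The augmentation step can be sketched as follows. Training the GAN variants themselves is beyond a short example, so a Gaussian fitted to the minority class stands in for the trained generator G(z); only the sampling line would change with a real GAN:

```python
import numpy as np

def augment_minority(X_minority, n_new, seed=0):
    """Stand-in for a trained GAN generator: sample synthetic fraud records
    from a Gaussian fitted to the real ones. A real generator G(z) would
    replace the multivariate_normal call."""
    rng = np.random.default_rng(seed)
    mu = X_minority.mean(axis=0)
    cov = np.cov(X_minority, rowvar=False) + 1e-6 * np.eye(X_minority.shape[1])
    return rng.multivariate_normal(mu, cov, size=n_new)

def rebalance(X_fraud, X_normal, seed=0):
    """Augment the 1:6000 dataset toward class parity before ANN training."""
    synth = augment_minority(X_fraud, len(X_normal) - len(X_fraud), seed)
    X = np.vstack([X_normal, X_fraud, synth])
    y = np.concatenate([np.zeros(len(X_normal)),
                        np.ones(len(X_fraud) + len(synth))])
    return X, y
```

The point of comparing GAN variants is precisely that their learned distributions match the real fraud distribution far better than this parametric stand-in would.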
Stock Price Prediction using LSTM and GRU
Implemented GRU- and LSTM-based deep neural networks for predicting the t+5th-day adjusted close price.
OHLCV data was obtained and pre-processed, and 45 technical-indicator-based attributes were computed and appended to the dataset.
Feature reduction was performed to curb the curse of dimensionality: the dataset was reduced to 12 attributes by applying Independent Component Analysis.
This dataset was fed into the neural network models, and the output was tested for accuracy, effectiveness, and optimism or pessimism in its predictions.
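The supervised-window construction behind the t+5 target can be sketched as below; the 30-day lookback is an illustrative assumption, while the horizon of 5 matches the description above:

```python
import numpy as np

def make_windows(features, close, lookback=30, horizon=5):
    """Build (X, y) pairs for sequence models: X is a lookback-length window
    of the reduced feature matrix ending at day t-1, y is the adjusted close
    `horizon` trading days after the window's end."""
    X, y = [], []
    for t in range(lookback, len(close) - horizon):
        X.append(features[t - lookback:t])          # shape (lookback, n_feat)
        y.append(close[t + horizon - 1])            # t+5th-day target
    return np.array(X), np.array(y)
```

The resulting 3-D array (samples, timesteps, features) is the standard input shape for both LSTM and GRU layers.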
Self Driving Smartcab
Developed a self-driving smartcab in a city network environment based on the US right-of-way traffic rules.
The driving agent was developed to ensure that the passenger reached the destination safely, without any traffic violation and within a specified deadline.
Used a Q-learning agent to accomplish this task, developing a custom threshold-based exploration-exploitation function to optimise the learning of the policy.
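The learning loop can be sketched as tabular Q-learning with a simple threshold rule standing in for the project's custom exploration-exploitation function (the specific rule below, explore while the best Q-value at a state is under a threshold, is an assumption):

```python
import numpy as np

def q_learning(n_states, n_actions, step, episodes=500,
               alpha=0.1, gamma=0.9, threshold=0.1, seed=0):
    """Tabular Q-learning. `step(s, a) -> (s2, reward, done)` is the
    environment; exploration is threshold-based: act randomly until the
    best Q-value at the current state clears `threshold`."""
    rng = np.random.default_rng(seed)
    Q = np.zeros((n_states, n_actions))
    for _ in range(episodes):
        s, done = 0, False
        while not done:
            if Q[s].max() < threshold:
                a = int(rng.integers(n_actions))     # explore
            else:
                a = int(Q[s].argmax())               # exploit
            s2, r, done = step(s, a)
            # one-step temporal-difference update
            Q[s, a] += alpha * (r + gamma * Q[s2].max() * (not done) - Q[s, a])
            s = s2
    return Q
```

In the smartcab setting, the state would encode the light colour, oncoming traffic and the planner's next waypoint rather than a bare index.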
Dog Breed Classification
Developed a dog breed classification algorithm capable of detecting whether an input image contained a dog, a human or neither. If the image was of a dog, it output
the breed; if it was of a human or anything else, it output the closest resembling breed. The algorithm could recognise 133 breeds. Used the Haar cascade implementation in OpenCV to detect human faces.
Used a transfer-learning-based deep convolutional neural network to classify the image, with an Xception architecture pretrained on the ImageNet database.
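The algorithm's decision flow can be sketched as below. The three callables are hypothetical stand-ins for the dog detector, the OpenCV Haar-cascade face detector and the Xception-based breed model:

```python
def classify_image(img, is_dog, is_human, predict_breed):
    """Decision flow of the breed classifier: breed for dogs, closest
    resembling breed for humans and everything else."""
    if is_dog(img):
        return f"dog: {predict_breed(img)}"
    if is_human(img):
        return f"human: closest resembling breed is {predict_breed(img)}"
    return f"neither dog nor human: closest resembling breed is {predict_breed(img)}"
```

Keeping the detectors and the breed model behind plain callables lets the Haar cascade or the transfer-learned network be swapped without touching the control flow.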
Analysis of Datasets through Data Mining Algorithms
Developed a methodology to analyse large datasets and find the inherent substructure and relationships between different attributes in a dataset.
Accomplished this by dicing the dataset into attributional subsets and subjecting them to a set of classification and clustering analyses.
Visualised the results and drew inferences about the correlation of attributes and the importance of each attribute in analysing the dataset.
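The dicing and clustering steps can be sketched as follows; Lloyd's k-means here is one representative of the clustering analyses mentioned above, not the project's full algorithm set:

```python
import numpy as np

def dice(data, columns):
    """Project the dataset onto one attributional subset before analysis."""
    return data[:, columns]

def kmeans(X, k, iters=100, seed=0):
    """Plain Lloyd's algorithm for the clustering half of the analysis."""
    X = np.asarray(X, dtype=float)
    rng = np.random.default_rng(seed)
    centers = X[rng.choice(len(X), k, replace=False)].copy()
    for _ in range(iters):
        # assign each row to its nearest centre, then recompute centres
        labels = ((X[:, None, :] - centers) ** 2).sum(-1).argmin(1)
        for j in range(k):
            if (labels == j).any():                  # skip empty clusters
                centers[j] = X[labels == j].mean(axis=0)
    return labels, centers
```

Running the same routine over many attribute dices and comparing the cluster structures is what surfaces the relationships between attributes.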
Analyzing Customer Segments
Segmented customers based on their spending volume and the item sets they purchased, and mapped these segments to the representative buying classes predominant for the wholesaler.
Analysed these purchase patterns to understand the impact of potential changes in the supply chain from the manufacturer. Performed A/B tests and a
cost-benefit analysis to make recommendations that let the wholesaler retain the maximum of his traditional consumer base while minimising his supply costs.
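The A/B testing step can be sketched as a two-proportion z-test, for example comparing customer retention under the old and changed supply arrangements (the normal approximation is an assumption about how the test was run):

```python
from math import erf, sqrt

def two_proportion_ztest(success_a, n_a, success_b, n_b):
    """Two-sided two-proportion z-test (normal approximation), e.g. for
    retention rates in the control vs. treatment supply arrangement."""
    p_a, p_b = success_a / n_a, success_b / n_b
    p = (success_a + success_b) / (n_a + n_b)        # pooled proportion
    se = sqrt(p * (1 - p) * (1 / n_a + 1 / n_b))
    z = (p_a - p_b) / se
    p_value = 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))
    return z, p_value
```

The resulting p-value feeds the cost-benefit analysis: only changes whose retention impact is both statistically significant and economically small would be recommended.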
Created an exploration-based game on Unreal Engine in which the player explores different terrains, navigates preset challenges and collects objects to score points.
The player traverses the landscape mainly by paraglider but can also walk or boat to accomplish specific targets. Developed a rocky mountain terrain, an ice-capped terrain
and an urban terrain.
Developed a dynamic web portal for students to find the most relevant academic tutorials and learning content for a subject.
Pooled learning resources and videos at different levels of complexity from multiple open platforms such as YouTube, and added links to the top paid and unpaid tutorials on those subjects.
Created a web dashboard where each student can track their learning progress, the courses they have taken and their improvement in performance.