Startup, Technology, and Programming Enthusiast
Welcome to my site! I am a recent graduate with a degree in Finance and a minor in Computer Science from the University of Cincinnati. I am passionate about collaborating with fellow developers and have worked on a variety of web development projects, both within the finance sector and beyond. While I specialize in Python, C++, and JavaScript, I am always eager to expand my skill set and explore new programming languages.
When I'm not coding, you can find me in the gym, reading, or experimenting with new technologies. Please reach out if you would like to work together!
This project focuses on the concepts of Value at Risk (VaR) and Expected Shortfall (ES) in financial portfolios. These measures are crucial in portfolio management and risk assessment. The project implements several methods for calculating VaR and ES, including the historical method and Monte Carlo simulation. Additionally, I am currently working with UC's Kaus Shankar to build an ML model that updates portfolio weights based on the VaR and ES figures calculated earlier in the project.
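To give a flavor of the historical method: you sort the observed portfolio losses, read VaR off the chosen quantile, and average the losses beyond it to get ES. A minimal NumPy sketch (the function name and the simulated return series are illustrative, not the project's actual code):

```python
import numpy as np

def historical_var_es(returns, confidence=0.95):
    """Historical-method VaR and ES at the given confidence level.

    returns: 1-D array of periodic portfolio returns (e.g., daily).
    """
    losses = -np.asarray(returns)          # flip sign: positive values are losses
    var = np.quantile(losses, confidence)  # loss exceeded only (1 - confidence) of the time
    es = losses[losses >= var].mean()      # average loss in the tail beyond VaR
    return var, es

# Example with simulated daily returns (illustration only)
rng = np.random.default_rng(0)
daily_returns = rng.normal(0.0005, 0.01, 1000)
var95, es95 = historical_var_es(daily_returns)
print(f"95% VaR: {var95:.4f}, 95% ES: {es95:.4f}")
```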
Deploy Box hosts a set of pre-containerized tech stacks that are automatically cloud-hosted with integrated databases before being served to users. Users can also opt to add secure authentication, payment integration, blob storage, and a myriad of other features to rapidly develop websites and apps in the languages and frameworks they are most comfortable with.
A Chrome extension and website that pops up at checkout to show users how much future value they give up by completing their purchase. Alternatively, users can invest the money instead via Alpaca's OAuth; the funds are automatically invested in VOO, Vanguard's S&P 500 index fund.
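The future-value figure is just compound growth, FV = P(1 + r)^n. A minimal sketch of the calculation (the 8% annual return and 30-year horizon here are illustrative assumptions, not necessarily the extension's actual parameters):

```python
def future_value(price, annual_rate=0.08, years=30):
    """Compound growth of a purchase amount if it were invested instead."""
    return price * (1 + annual_rate) ** years

# A $100 purchase forgoes roughly $1,006 of future value under these assumptions
print(f"${future_value(100):,.2f}")
```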
An autoclicker that completes the game Stimulation Clicker by Neal Agarwal.
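At its core, an autoclicker is a timed click loop. A minimal sketch using pyautogui (the screen coordinates and click rate are placeholders; the real project also has to track the game's changing UI):

```python
import time
import pyautogui

pyautogui.PAUSE = 0  # remove pyautogui's default delay between calls

def autoclick(x, y, clicks_per_second=50, duration=10):
    """Click at screen position (x, y) at a fixed rate for `duration` seconds."""
    interval = 1 / clicks_per_second
    end = time.time() + duration
    while time.time() < end:
        pyautogui.click(x, y)
        time.sleep(interval)

autoclick(800, 450)  # placeholder coordinates for the game's button
```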
A feed-forward neural network with one hidden layer that classifies hand-written digits from the MNIST dataset. Built entirely from scratch using only NumPy to implement the architecture, backpropagation, and the ReLU and softmax activation functions. The final architecture consisted of 784 inputs, a hidden layer of 100 perceptrons, and an output layer of 10 perceptrons. ReLU was used in the hidden layer to avoid vanishing gradients, and softmax was used for the output.
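The forward pass for that 784-100-10 architecture is two matrix multiplications with ReLU and softmax applied. A minimal NumPy sketch (the weight initialization and variable names are illustrative, not the project's actual code):

```python
import numpy as np

rng = np.random.default_rng(0)
W1 = rng.normal(0, np.sqrt(2 / 784), (784, 100))  # He init suits the ReLU layer
b1 = np.zeros(100)
W2 = rng.normal(0, np.sqrt(2 / 100), (100, 10))
b2 = np.zeros(10)

def relu(z):
    return np.maximum(0, z)

def softmax(z):
    z = z - z.max(axis=-1, keepdims=True)  # shift for numerical stability
    e = np.exp(z)
    return e / e.sum(axis=-1, keepdims=True)

def forward(x):
    """x: batch of flattened 28x28 images, shape (batch, 784)."""
    hidden = relu(x @ W1 + b1)         # (batch, 100)
    return softmax(hidden @ W2 + b2)   # (batch, 10) class probabilities

probs = forward(rng.random((32, 784)))   # random batch stands in for MNIST images
print(probs.shape, probs.sum(axis=1)[:3])  # each row sums to 1
```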
I'm always open to discussing new projects, creative ideas, or opportunities to be part of your vision.