Aakash Mishra's Webpage


  • email
  • github
  • instagram
  • twitter
  • linkedin
  • google scholar

Job Experience


[view current resume]

Software Engineer, Meta

I started full-time on July 24th, 2023! Feel free to reach out to me at aakamishra@meta.com. I will update this section soon.

Timeline: July, 2023 - Current


Machine Learning Engineer, Solesca

  • Trained a sentiment analysis model to predict community responses to proposed solar project development.
  • Built a web-scraping pipeline to collect town hall meeting notes and news articles for sentiment analysis, giving the team ongoing insight into public sentiment.
  • Fine-tuned a GPT-2 variant to summarize articles relevant to renewable energy (see the sketch below).
  • Built a geo-spatial graph neural network to predict the effects of solar projects on incident lighting.
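
The summarization work follows the standard recipe of fine-tuning a decoder-only language model on (article, summary) pairs. Below is a minimal sketch of that recipe using the Hugging Face transformers and datasets libraries; the dataset file, column names, and hyperparameters are illustrative assumptions, not the actual Solesca setup.

    # Minimal sketch: fine-tune GPT-2 to summarize renewable-energy articles.
    # Assumes Hugging Face transformers/datasets; file and column names are hypothetical.
    from datasets import load_dataset
    from transformers import (AutoModelForCausalLM, AutoTokenizer,
                              DataCollatorForLanguageModeling, Trainer,
                              TrainingArguments)

    tokenizer = AutoTokenizer.from_pretrained("gpt2")
    tokenizer.pad_token = tokenizer.eos_token  # GPT-2 ships without a pad token
    model = AutoModelForCausalLM.from_pretrained("gpt2")

    # Hypothetical CSV of scraped (article, summary) pairs.
    dataset = load_dataset("csv", data_files="renewable_articles.csv")["train"]

    def format_example(row):
        # Frame summarization as causal LM: article text, separator, summary.
        text = f"{row['article']}\nTL;DR: {row['summary']}{tokenizer.eos_token}"
        return tokenizer(text, truncation=True, max_length=512)

    tokenized = dataset.map(format_example, remove_columns=dataset.column_names)

    trainer = Trainer(
        model=model,
        args=TrainingArguments(output_dir="gpt2-solar-summarizer",
                               per_device_train_batch_size=2,
                               num_train_epochs=3),
        train_dataset=tokenized,
        data_collator=DataCollatorForLanguageModeling(tokenizer, mlm=False),
    )
    trainer.train()

At inference time, the same "TL;DR:" prompt is appended to a new article and the model's continuation is taken as the summary.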

Timeline: March, 2023 - June, 2023 (Part-time)


Software Engineering Intern, Meta

  • Designed a privacy detection framework to identify and prevent violations involving User-Identifying Information (UII) during ML platform training (see the sketch below).
  • Integrated a feature engineering framework into the serving platforms, aligning it with internal ML platform logic.
  • Automated privacy insight and compliance checks for the ML model deployment system alongside data warehousing tools, reducing manual effort in compliance reviews.
  • Led an independent project implementing privacy compliance across the internal end-to-end ML pipeline applications, working with cross-functional teams to drive adoption of privacy best practices.
  • Built the Instagram Product Community Review feature for the company hackathon.
  • Received a return offer at the end of the internship.
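
Meta's internal tooling is not public, so the sketch below is purely illustrative of the kind of pre-training check such a privacy detection framework might run: scanning a training job's feature names against a denylist of user-identifying fields. Every name and pattern here is hypothetical.

    # Illustrative-only sketch of a UII pre-training check; all names are hypothetical
    # and do not reflect Meta's internal framework or APIs.
    import re

    UII_PATTERNS = [r"email", r"phone", r"ssn", r"full_name", r"device_id"]

    def find_uii_violations(feature_names):
        """Return feature names that look like user-identifying information."""
        return [name for name in feature_names
                if any(re.search(p, name, re.IGNORECASE) for p in UII_PATTERNS)]

    # Example: block a training job that references unapproved UII features.
    features = ["user_email_hash", "session_length", "device_id", "click_count"]
    violations = find_uii_violations(features)
    if violations:
        raise ValueError(f"UII policy violation in features: {violations}")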

Timeline: May, 2022 - August, 2022

Software Engineering Intern, REX Real Estate

  • Provisioned cloud infrastructure with Terraform for AWS EMR and S3 data warehousing, supporting scalable big-data processing and storage.
  • Worked with microservices deployed as Docker images on Kubernetes (k8s) running on EC2 nodes, improving reliability and scalability of the system.
  • Configured Apache Ranger integration with Apache Hive databases to enforce fine-grained access control policies.
  • Developed Airflow DAGs to push terabytes of data to Salesforce using Kafka/Confluent (see the sketch below).
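
As a rough illustration of the Salesforce pipeline, here is a minimal Airflow DAG sketch that publishes warehouse records to a Kafka topic for a downstream Confluent/Salesforce connector to consume. The broker address, topic name, and stubbed rows are assumptions, not the production configuration.

    # Minimal Airflow DAG sketch: publish warehouse rows to Kafka for Salesforce.
    # Broker, topic, and the stubbed query are illustrative assumptions.
    import json
    from datetime import datetime

    from airflow import DAG
    from airflow.operators.python import PythonOperator
    from confluent_kafka import Producer

    def push_records_to_kafka(**_):
        producer = Producer({"bootstrap.servers": "broker:9092"})  # assumed broker
        # In the real pipeline these rows would be read from Hive/EMR; stubbed here.
        rows = [{"account_id": 1, "status": "active"}]
        for row in rows:
            producer.produce("salesforce-accounts", value=json.dumps(row).encode())
        producer.flush()

    with DAG(
        dag_id="warehouse_to_salesforce",
        start_date=datetime(2021, 6, 1),
        schedule_interval="@daily",
        catchup=False,
    ) as dag:
        PythonOperator(task_id="push_to_kafka",
                       python_callable=push_records_to_kafka)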

Timeline: May, 2021 - May, 2022 (partially part-time)

Cybersecurity Intern, MIT

  • Built a Django-based REST API for querying DHCP and ARP data, enabling efficient monitoring and troubleshooting of network traffic (see the sketch below).
  • Maintained a Kubernetes-based load balancing system using Podman containers.
  • Designed a 3D visualization application for monitoring network intrusion alerts at a rate of over 10,000 alerts per hour, supporting prompt responses to potential threats.
  • Wrote parsing scripts to organize and manage over 30 terabytes of data for analysis and reporting.
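
For flavor, here is a minimal sketch of a Django endpoint of the kind described, returning DHCP lease records filtered by MAC or IP address. The model fields, app label, and URL are assumptions made for illustration; the real API and its data sources were internal.

    # Minimal Django sketch: query DHCP lease records over a REST endpoint.
    # Model fields, app label, and URL are hypothetical.
    from django.db import models
    from django.http import JsonResponse
    from django.urls import path

    class DhcpLease(models.Model):
        mac_address = models.CharField(max_length=17)
        ip_address = models.GenericIPAddressField()
        hostname = models.CharField(max_length=255, blank=True)
        last_seen = models.DateTimeField(auto_now=True)

        class Meta:
            app_label = "netmon"  # assumed app; in practice this lives in an app's models.py

    def lease_lookup(request):
        """Return leases, optionally filtered by ?mac= or ?ip= query parameters."""
        leases = DhcpLease.objects.all()
        mac = request.GET.get("mac")
        if mac:
            leases = leases.filter(mac_address__iexact=mac)
        ip = request.GET.get("ip")
        if ip:
            leases = leases.filter(ip_address=ip)
        data = list(leases.values("mac_address", "ip_address", "hostname", "last_seen"))
        return JsonResponse({"results": data})

    urlpatterns = [path("api/leases/", lease_lookup)]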

Timeline: May, 2020 - May, 2021 (partially part-time)