Samya Nandy
HashiCorp Certified Terraform Associate
Oracle Certified Java Developer With AWS Cloud Experience
Mobile: -
E-Mail: -
Docker: https://hub.docker.com/u/samya123
GitHub : https://github.com/samya123456
Summary:
9+ years of overall experience in Software Development.
4+ years of experience creating Machine Learning models and Deep Learning neural networks using Python, TensorFlow, Keras, Pandas, NumPy, Seaborn, and NLP.
4+ years of experience creating Object Detection models using YOLOv8.
2+ years of experience in Generative AI using OpenAI services (ChatGPT 4.x) with LangChain, Prompt Engineering, Embeddings and Vector DBs, LLM model evaluation, LangChain Agents (Database Agent, PDF Agent), and AWS Lex.
4+ years of experience in MLOps, creating Machine Learning pipelines to train models and deploying them on AWS SageMaker.
4+ years of experience in Data Analytics using Python and PySpark.
7+ years of experience in Data Analytics on the AWS cloud: AWS Lambda, DynamoDB, AWS Kinesis, AWS Glue, AWS Athena, and AWS QuickSight.
5+ years of experience working with Node.js, Core Java, Collections, and Design Patterns.
5+ years of experience with RESTful web services and Spring Boot microservices.
7+ years of experience designing and developing large-scale distributed systems using Docker, Kubernetes, AWS ROSA OpenShift clusters, Amazon EKS, and Amazon EKS Fargate.
8+ years of experience in Cloud Development with AWS services (PaaS): Amazon API Gateway, AWS Lambda, Route 53, PagerDuty, SQS, SNS, EventBridge, VPC, Subnets, IAM, S3, Security Groups, EC2, SSM Parameter Store, CloudWatch, CloudTrail, CloudWatch Logs, and Route Tables.
7+ years of experience in DevOps with Jenkins CI/CD, Ansible, Spinnaker, Helm, Tekton, GitLab CI, and AWS CodePipeline.
7+ years of experience building cloud infrastructure using Terraform (IaC) and AWS CloudFormation.
7+ years of experience with relational databases: Amazon RDS, Oracle 11g, MySQL, and PostgreSQL.
2+ years of experience with cache services using Redis.
Professional Experience:
Organisation
Role
Location
Period
IBM
Developer
Kolkata, India
Aug 2021 - Present
Accenture
Developer
Kolkata, India
Apr-2020 - Aug 2021
Cognizant
Associate
Kolkata, India
Dec 2017 - Mar 2020
Tata Consultancy Services
Software Engineer
Kolkata, India
Sept’2014 - Nov 2017
Software Skills:
Technologies
Python, PySpark, TensorFlow, Keras, Pandas, NumPy, Seaborn, FastAPI, AWS, Docker, K8s
Database
Oracle 10g, MySQL, AWS RDS Aurora, NoSQL (MongoDB)
Cloud Services
AWS Services: EKS, EC2, S3, RDS, ALB, VPC, EventBridge, SNS, SQS, Kinesis, QuickSight, Athena, Glue, KMS, Secrets Manager, AWS Lambda, API Gateway, Red Hat OpenShift, PagerDuty, Prometheus, Grafana
DevOps
Docker, Kubernetes, Jenkins CI/CD, Ansible, Terraform, Spinnaker, Helm, CloudFormation, Linux, AWS CodePipeline
Educational Qualification:
Degree and Date
Institute
Major and Specialization
Marks(%)
B.Tech, 2014
Haldia Institute of Technology
Electronics and Communication Engineering
84.2
Higher Secondary (12th), 2009
Vidyasagar Vidyapith, Midnapore
Science (Physics, Chemistry, Mathematics)
76
Secondary (10th), 2007
Nirmal Hriday Ashram, Midnapore
Science
86
Projects Summary:
Project
Delta Airlines
Customer
Delta Airlines
Organisation
IBM
Team Size
5
Period
Aug 2022 – Present
Description
Developing a Machine Learning model to predict whether a flight will be cancelled or delayed depending on weather conditions.
Role
Software Developer
Technology Used
Python, TensorFlow, Keras, Pandas, NumPy, FastAPI, Docker, Kubernetes, AWS SageMaker, scikit-learn, AWS CodePipeline
Responsibilities
Developed a Machine Learning model to predict whether a flight will be cancelled or delayed depending on weather conditions.
Datasets containing temperature, humidity, precipitation, wind speed, surface pressure, etc. arrived as CSV files from different airport locations.
We stored those values in a cache and trained our model on 15 days of data.
After training the model, we saved it in HDF5 format and pushed it to Nexus Artifactory. This is our training pipeline.
Implemented a robust model versioning strategy, storing diverse model versions in Amazon S3 and managing associated metadata in AWS SageMaker Artifactory, ensuring traceability, reproducibility, and easy access to historical model iterations for auditing and regulatory compliance purposes.
Deployed machine learning models on AWS SageMaker, leveraging its scalable infrastructure and managed services to deploy, manage, and host predictive models, ensuring seamless integration with production applications.
Implemented end-to-end machine learning pipelines on AWS SageMaker, encompassing data preprocessing, model training, hyperparameter tuning, and deploying models as RESTful APIs, optimizing for high availability and low latency
Implemented an end-to-end Continuous Integration (CI) pipeline using AWS CodePipeline, orchestrating model versioning in Amazon S3 and model-version metadata management in AWS SageMaker Artifactory.
Integrated continuous monitoring within AWS SageMaker to track model performance, enabling proactive insight into model behaviour and timely retraining or redeployment, ensuring sustained accuracy and reliability.
Leveraged AWS SageMaker for model deployment, configuring hosting through AWS Lambda backed by API Gateway, ensuring scalable, real-time access to predictive models with optimal performance and reliability.
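A minimal sketch of the training-data assembly step described above, using only the standard library; the column names, label field, and file layout are illustrative, not the production schema:

```python
import csv
import io

# Hypothetical feature columns matching the weather attributes mentioned above.
FEATURES = ["temperature", "humidity", "precipitation", "wind_speed", "surface_pressure"]

def load_training_rows(csv_text, label_column="delayed"):
    """Parse one airport's weather CSV into (features, label) pairs."""
    rows = []
    for record in csv.DictReader(io.StringIO(csv_text)):
        features = [float(record[c]) for c in FEATURES]
        label = int(record[label_column])  # 1 = flight delayed/cancelled, 0 = on time
        rows.append((features, label))
    return rows

# Example CSV as it might arrive from one airport location (values invented).
sample = (
    "temperature,humidity,precipitation,wind_speed,surface_pressure,delayed\n"
    "21.5,0.61,0.0,12.3,1013.2,0\n"
    "18.0,0.93,4.2,31.0,998.7,1\n"
)
dataset = load_training_rows(sample)
```

In the pipeline described above, rows assembled this way for the 15-day window would feed the Keras model, which is then saved in HDF5 format for publication to the artifact store.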
Project
Delta Airlines
Customer
Delta Airlines
Organisation
IBM
Team Size
3
Period
Aug 2021 – Aug 2022
Description
Developing a real-time analytics dashboard using Amazon QuickSight, Amazon Athena, AWS Glue, and Amazon Kinesis Data Streams
Role
Software Developer
Technology Used
AWS CloudFormation, S3, Athena, Glue, Kinesis, QuickSight, Lambda, EventBridge, CloudWatch
Responsibilities
Build the analytics dashboard using Amazon QuickSight.
Collect the real-time stream data using AWS Kinesis and load it into S3.
Create the metadata tables using AWS Glue.
Query S3 bucket data using AWS Athena.
Deploy the entire setup using AWS CodePipeline.
Implement event-driven architecture utilizing AWS Lambda and EventBridge for automated triggers and actions based on real-time data ingestion from AWS Kinesis, ensuring timely and responsive system behavior.
Develop scalable and cost-effective data processing pipelines leveraging AWS Glue for ETL (Extract, Transform, Load) operations, ensuring data quality and consistency for analytics and reporting purposes.
Design and configure AWS CloudWatch for monitoring and logging across the entire infrastructure, establishing proactive alerts and alarms to ensure system health and performance
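The Kinesis-to-S3 ingestion step above can be sketched as a Lambda handler. The base64 decoding follows the standard Kinesis event format delivered to Lambda; the bucket name, key layout, and record schema are illustrative assumptions, not the project's actual configuration:

```python
import base64
import json

def decode_kinesis_records(event):
    """Decode the base64-encoded data payloads of a Kinesis Lambda event."""
    payloads = []
    for record in event["Records"]:
        raw = base64.b64decode(record["kinesis"]["data"])
        payloads.append(json.loads(raw))
    return payloads

def handler(event, context):
    # Persist the decoded batch to S3 for Glue/Athena to pick up later.
    # boto3 is imported lazily so the decode logic stays testable offline;
    # the bucket name below is a hypothetical placeholder.
    import boto3
    records = decode_kinesis_records(event)
    s3 = boto3.client("s3")
    s3.put_object(
        Bucket="analytics-raw-events",
        Key=f"ingest/{context.aws_request_id}.json",
        Body=json.dumps(records).encode(),
    )
    return {"count": len(records)}
```

An EventBridge rule or the Kinesis event source mapping would invoke `handler`; Glue crawlers then catalogue the S3 objects so Athena and QuickSight can query them.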
Project
GSK Colony
Customer
GSK Colony
Organisation
Accenture
Team Size
6
Period
Apr 2020 – Jul 2021
Description
Creating an Image Detection model using YOLOv6
Role
Software Developer
Technology Used
YOLOv8, Pandas, NumPy, AWS CodePipeline, AWS SageMaker
Responsibilities
Cell tissue images arrived from the laboratory.
Using the dataset, we identified the percentage of infection in the tissue.
If the percentage was high, the person would be vaccinated.
Implemented a robust model versioning strategy by storing various model versions in Amazon S3, ensuring traceability and reproducibility. Managed associated metadata in AWS SageMaker Artifactory, enabling efficient tracking of model iterations for auditing and regulatory compliance
Orchestrated a Continuous Integration and Continuous Deployment (CI/CD) framework using AWS CodePipeline, automating model versioning in S3 and metadata management in AWS SageMaker Artifactory. Deployed these models seamlessly in AWS SageMaker and hosted them via AWS Lambda, backed by API Gateway, guaranteeing scalable, real-time access to predictive models while optimizing performance and reliability.
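The decision step described above can be sketched as a small post-processing routine over the detector's output. The box format, the area-based metric, and the threshold value are all illustrative assumptions, not the clinical logic used in the project:

```python
def infection_percentage(infected_boxes, total_tissue_area):
    """Estimate the infected fraction of a tissue image from detection boxes.

    Boxes are (x1, y1, x2, y2) in pixels; overlapping boxes are not
    de-duplicated in this simplified sketch.
    """
    infected = sum((x2 - x1) * (y2 - y1) for x1, y1, x2, y2 in infected_boxes)
    return 100.0 * infected / total_tissue_area

def needs_vaccination(percentage, threshold=30.0):
    # Threshold is a hypothetical placeholder, not the real clinical cutoff.
    return percentage >= threshold
```

In practice the boxes would come from the YOLO model's predictions on each laboratory image, and the resulting percentage would drive the vaccination recommendation.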
Personal Dossier:
Date of Birth
24th June 1991
Linguistic Abilities
English, Bengali and Hindi
Declaration :
I hereby declare that the information furnished above is true to the best of my knowledge.
Date : ….....................................
Place : (Samya Nandy)