Avigyan Kumar

$25/hr
I am a Data Scientist with more than one year of experience.
Reply rate:
-
Availability:
Part-time (20 hrs/wk)
Age:
25 years old
Location:
Pune, Maharashtra, India
Experience:
1 year
About

I have more than one year of experience in machine learning and deep learning and have delivered many projects in these domains to clients on time. I also have good experience in computer vision and image processing.

In statistical machine learning, I have solved classification problems to my clients' requirements, and I have worked on a content-mining project, which gave me exposure to that area as well.

In computer vision, I have worked on object detection and image masking using OpenCV, and on object detection with YOLO, a convolutional neural network. I have also built a white blood cell classifier that categorizes blood images into five types of white blood cell, again using a convolutional neural network.

In natural language processing, I have worked on language modelling and on spelling correction.

For big-data problems, I replicated my statistical machine learning models on Spark, because scikit-learn would crash and could not train properly at that scale, so my team and I moved to Spark MLlib. For one client requirement, Spark MLlib's k-means did not give the same results as scikit-learn's, so I rewrote the whole pipeline in Scala, after which the results matched. For deep learning I use the TensorFlow and Keras frameworks, and I have used Dask for distributed processing of Python functions. I also have experience with the Python Django framework.

I have completed several machine learning and deep learning certifications from Coursera and Edureka. I also have experience with Hadoop: with Spark I keep my data on a Hadoop cluster, and for Dask distributed processing I use Hadoop as a shared data location.
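The k-means discrepancy mentioned above usually comes down to initialization and implementation details rather than the algorithm itself. As a minimal illustration (a pure-Python sketch, not the Spark MLlib or Scala code from the project), here is the core k-means loop, with deterministic initialization as a simplification:

```python
def kmeans(points, k, iters=20):
    # Deterministic init: take the first k points as starting centroids.
    # Real libraries seed centroids randomly (scikit-learn uses k-means++,
    # Spark MLlib uses k-means||), which is one reason two implementations
    # can converge to different clusterings on the same data.
    centroids = list(points[:k])
    for _ in range(iters):
        # Assignment step: attach each point to its nearest centroid.
        clusters = [[] for _ in range(k)]
        for x, y in points:
            nearest = min(
                range(k),
                key=lambda i: (x - centroids[i][0]) ** 2 + (y - centroids[i][1]) ** 2,
            )
            clusters[nearest].append((x, y))
        # Update step: move each centroid to the mean of its cluster
        # (keeping the old centroid if a cluster went empty).
        centroids = [
            (sum(p[0] for p in c) / len(c), sum(p[1] for p in c) / len(c))
            if c else centroids[i]
            for i, c in enumerate(clusters)
        ]
    return centroids
```

Because each library seeds and refines centroids differently, the same data can yield different clusterings unless initialization, iteration counts, and tolerances are aligned across implementations.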
I also have experience with MapReduce programming in Scala, a functional language. I have a good understanding of MongoDB, having used that database for content mining. On the DevOps side, I use Docker in production and have often distributed an app as a Docker image of an existing project. I also know shell scripting, though not at a fluent level.
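The MapReduce pattern referred to above can be sketched compactly (the project itself used Scala on a Hadoop cluster; this in-process Python word count is just the canonical illustration): map emits key-value pairs, shuffle groups them by key, and reduce folds each group to a single value.

```python
from collections import defaultdict
from itertools import chain

def map_phase(line):
    # Map: emit a (word, 1) pair for every word in the line.
    return [(word, 1) for word in line.split()]

def shuffle(pairs):
    # Shuffle: group all values under their key.
    groups = defaultdict(list)
    for key, value in pairs:
        groups[key].append(value)
    return groups

def reduce_phase(groups):
    # Reduce: fold each key's values down to one result.
    return {key: sum(values) for key, values in groups.items()}

def word_count(lines):
    pairs = chain.from_iterable(map_phase(line) for line in lines)
    return reduce_phase(shuffle(pairs))
```

On Hadoop the same three stages run distributed across nodes, with the framework handling the shuffle between mappers and reducers.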

Languages