Motius GmbH

Contributed to the development of an MLP project for BMW using Python and Terraform for AWS infrastructure. Used Agile methodologies, Jira, and Confluence for project management and collaboration, resulting in a successful production deployment. Adept at working in a team, with a strong background in Python, machine learning, and AWS.

Gibbs Software Solutions

Implemented a data pipeline that gathered daily data from hundreds of US government websites, using Python, AWS, Airflow for cron jobs, Docker, and Power BI for data analysis and display on the website. Demonstrated expertise in designing and implementing data pipelines and analyses, with a proven ability to work with big data and handle large-scale data collection.
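A daily pipeline stage like this can be sketched in plain Python: fetch raw rows, tag them with source and date metadata, and serialize them to CSV for downstream display. This is a minimal illustrative sketch, not the production code; the site name and fields are hypothetical, and in the real pipeline each stage would run as an Airflow task on a daily schedule.

```python
from dataclasses import dataclass
from datetime import date
import csv, io

@dataclass
class SiteRecord:
    source: str
    day: str
    value: str

def normalize(raw_rows, source, day):
    # Drop empty rows and tag each record with its source site and run date
    return [SiteRecord(source, day, r.strip()) for r in raw_rows if r.strip()]

def to_csv(records):
    # Serialize normalized records for downstream analysis/display
    buf = io.StringIO()
    writer = csv.writer(buf)
    writer.writerow(["source", "day", "value"])
    for rec in records:
        writer.writerow([rec.source, rec.day, rec.value])
    return buf.getvalue()

# Hypothetical raw scrape output for one site on one day
rows = normalize(["  alpha ", "", "beta"], "example.gov", str(date(2023, 1, 2)))
print(len(rows))  # 2
```

In Airflow, each of these steps would typically be a separate task so failures on one site don't block the others.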

Snapthat AI

Contributed to the development of an end-to-end ETL data pipeline and web scrapers for Snapthat, a start-up specializing in data collection from fashion websites such as Zalando, Namshi, and Nordstrom, using Python, AWS, Airflow, and Docker. Used a Jira progress board to manage and track the project. Adept at working in a team, with a strong background in Python and a track record of delivering data-driven solutions.

Projects

Expertise in delivering data-driven solutions, with a proven track record of success in both company and freelance projects using Python, AWS, Airflow, Docker, and more.

SnapThat International Fashion

Implemented an end-to-end data pipeline for fashion stores using Python, AWS, Airflow, Docker, CI/CD, MongoDB, and data sources such as Zalando, Nordstrom, Zalora, and SthSweet. Expertise in using these technologies to design and implement data pipelines and data analysis.

AI and Deep Learning Final year project (FYP)

Pioneered the development of a deep learning model for jewelry prediction using Python, TensorFlow, Google Cloud Platform (GCP), Docker, Jupyter, and Colab. Scraped and downloaded millions of jewelry images from different fashion stores and ensured the model neither overfit nor underfit.
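One common way to guard against overfitting during training is early stopping: watch the validation loss and halt when it stops improving. The sketch below is a framework-agnostic illustration of that idea (the loss histories are made-up numbers, and the real project used TensorFlow's own callbacks rather than this helper).

```python
def should_stop(val_losses, patience=3, min_delta=1e-3):
    # Early stopping: flag when validation loss has not improved by
    # min_delta for `patience` consecutive epochs (a common overfitting signal)
    if len(val_losses) <= patience:
        return False
    best_before = min(val_losses[:-patience])
    return all(l > best_before - min_delta for l in val_losses[-patience:])

plateau = [0.90, 0.60, 0.45, 0.46, 0.46, 0.47]    # val loss rising -> stop
improving = [0.90, 0.60, 0.45, 0.40, 0.38, 0.36]  # still improving -> continue
print(should_stop(plateau), should_stop(improving))  # True False
```

Monitoring validation loss separately from training loss is what exposes overfitting: training loss keeps falling while validation loss plateaus or rises.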

Cashier

Cashier was a finance-focused startup; I built a daily scheduler to collect payment data from the APIs of different payment providers and centralize it in Cashier. Tools and techniques: Python, AWS (Lambda, Serverless, ECR, S3, ...), Apache Airflow, Jenkins, Docker, GitLab CI/CD, SQL, etc.
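The core of such a scheduler is a fan-in step: pull transactions from each provider's API and merge them into one normalized ledger. A minimal sketch, with hypothetical stand-in fetchers in place of real provider clients and a deliberately simplified record shape:

```python
def aggregate_payments(fetchers):
    # Pull the day's transactions from each provider and merge them
    # into one normalized ledger (amounts stored as integer cents)
    ledger = []
    for provider, fetch in fetchers.items():
        for txn in fetch():
            ledger.append({
                "provider": provider,
                "amount_cents": int(round(txn["amount"] * 100)),
                "ref": txn["id"],
            })
    return sorted(ledger, key=lambda t: (t["provider"], t["ref"]))

# Hypothetical stand-ins for real payment-provider API clients
fetchers = {
    "provider_a": lambda: [{"id": "a1", "amount": 12.50}],
    "provider_b": lambda: [{"id": "b2", "amount": 3.99}],
}
ledger = aggregate_payments(fetchers)
print(len(ledger))  # 2
```

Storing amounts as integer cents avoids floating-point drift when the centralized ledger is later summed or reconciled.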

Local Leads 365

Local Leads 365 is a Gibbs Software Solutions startup; we collected deed data on a daily basis for different US states using hundreds of scrapers, both for sale and for confidential visualization and analysis. Used Python, Selenium, BeautifulSoup, AWS, Airflow, Jenkins, Jupyter Notebook, GitLab, PostgreSQL, etc.
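The scraping step in a pipeline like this boils down to extracting table cells from county listing pages. Here is a minimal sketch using only the standard library's `html.parser` (the real scrapers used BeautifulSoup and Selenium; the page markup below is invented for illustration):

```python
from html.parser import HTMLParser

class DeedTableParser(HTMLParser):
    # Collect the text content of <td> cells from a deed-listing page
    def __init__(self):
        super().__init__()
        self.in_cell = False
        self.cells = []

    def handle_starttag(self, tag, attrs):
        if tag == "td":
            self.in_cell = True

    def handle_endtag(self, tag):
        if tag == "td":
            self.in_cell = False

    def handle_data(self, data):
        if self.in_cell and data.strip():
            self.cells.append(data.strip())

# Hypothetical fragment of a county deed-listing page
page = "<table><tr><td>2023-01-05</td><td>Warranty Deed</td></tr></table>"
parser = DeedTableParser()
parser.feed(page)
print(parser.cells)  # ['2023-01-05', 'Warranty Deed']
```

BeautifulSoup makes the same extraction a one-liner, but the event-driven structure above is what happens under the hood of any HTML scraper.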

Data Visualization and Analysis

• Visualized sold properties in the US by price, county, state, and property type, including forecasts (Visualization Link), using Power BI, Jupyter Notebook, and Python.
• Visualized health data that I scraped from the WHO website during the coronavirus pandemic.

Data Collection for ML

• Scraped and downloaded more than 300 million images, audio files, and videos for a German AI-based company for ML model training. Used parallel programming techniques to scrape data more quickly, with Python, Selenium, BeautifulSoup, AWS (EC2, S3), and CSV files to store millions of records; data sources included gettyimages, storyblock, etc.
• Scraped millions of ranking lists (rankings of movies, food, players, etc.) for ML training.
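The parallel-programming technique mentioned above can be sketched with `concurrent.futures`: a thread pool overlaps the network waits of many downloads at once. This is an illustrative sketch only; the `download` function is a stand-in for the real HTTP fetch, and the URLs are invented.

```python
from concurrent.futures import ThreadPoolExecutor

def download(url):
    # Stand-in for the real fetch (urllib/requests + S3 upload in production);
    # returns the URL paired with fake payload bytes
    return (url, b"x" * len(url))

urls = [f"https://media.example.com/img/{i}.jpg" for i in range(8)]

# Threads suit I/O-bound scraping: while one request waits on the
# network, others make progress; pool.map preserves input order
with ThreadPoolExecutor(max_workers=4) as pool:
    results = list(pool.map(download, urls))

print(len(results))  # 8
```

For CPU-bound post-processing (image decoding, resizing), `ProcessPoolExecutor` is the usual swap-in, since threads alone don't parallelize pure-Python computation.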