Passionate AI Engineer with a love for pushing the boundaries of machine learning, deep learning, and data-driven innovation. My journey started with a BSc in Computer Engineering (TED University) and an MSc in Engineering with Management (King’s College London). I’m currently focused on designing, training, and deploying advanced AI models — from Transformers and GNNs to time-series forecasting networks.
- Building AI solutions from the ground up (data collection, cleaning, modeling, deployment)
- Fine-tuning cutting-edge architectures (LLMs, CNNs, RNNs, GNNs)
- Integrating MLOps best practices for scalable, robust deployments
- Fueled by continuous learning, a hands-on approach, and open collaboration
I hold a UK Graduate Visa (valid until January 2027) and am open to relocation for AI, ML, or Software Engineering roles that tackle exciting, high-impact challenges. Let’s shape the future together!
- Deep Learning: PyTorch, CNNs, RNNs, Transformers, GNNs
- NLP & Language Models: Hugging Face Transformers, GPT variants, text generation pipelines
- Reinforcement Learning: Custom RL algorithms, Gymnasium environments
- MLOps: Model deployment (Docker, GCP, Hugging Face Spaces), CI/CD pipelines, monitoring
- Languages: Python, Java, JavaScript, Dart, SQL, C
- Frameworks & Tools: Flask, FastAPI, React, Docker, Git, Google Cloud
- Data Wrangling: Pandas, NumPy, Matplotlib, scikit-learn
- Databases: PostgreSQL, Firebase
- Clear technical communication (articles, notebooks, demos)
- Team leadership and collaboration
- Analytical mindset and data-driven decision-making
Tech: PyTorch, PostgreSQL, Flask, React
- Objective: Fine-tuned IBM’s Granite-TimeSeries-TTM-R2 to predict Binance Coin (BNB) prices at 5-minute intervals
- Highlights:
- Deployed a Flask REST API with CI/CD for seamless model updates (a minimal endpoint sketch follows this list)
- Interactive React dashboard to visualize real-time price trends vs. model predictions
- Scalable data management via PostgreSQL
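For flavor, here is a minimal sketch of how such a forecasting endpoint could be served with Flask, assuming the fine-tuned model is stored as a local PyTorch checkpoint; the checkpoint path, route, and payload shape below are illustrative rather than the project’s actual API:

```python
# Hedged sketch of a Flask forecasting endpoint; names and paths are illustrative.
from flask import Flask, jsonify, request
import torch

app = Flask(__name__)
# Hypothetical checkpoint holding the full fine-tuned forecasting model.
model = torch.load("bnb_ttm_finetuned.pt", map_location="cpu")
model.eval()

@app.route("/predict", methods=["POST"])
def predict():
    # Expects {"prices": [...]} with the most recent 5-minute closes,
    # matching the model's context length.
    prices = request.get_json()["prices"]
    x = torch.tensor(prices, dtype=torch.float32).reshape(1, -1, 1)  # (batch, time, channels)
    with torch.no_grad():
        forecast = model(x)  # forward pass over the context window
    return jsonify({"forecast": forecast.squeeze().tolist()})

if __name__ == "__main__":
    app.run(host="0.0.0.0", port=8000)
```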
Tech: PyTorch, Transformers
- Objective: Built a 9M-parameter NanoGPT architecture from scratch to generate Shakespeare-inspired text
- Highlights:
- Character-based tokenization for efficient training on the TinyShakespeare dataset (sketched after this list)
- Implemented custom attention mechanisms and batching optimizations
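As a taste of the character-level setup, here is a minimal tokenization sketch in the NanoGPT style; the file path and train/validation split are illustrative:

```python
# Character-level tokenization over TinyShakespeare; names are illustrative.
import torch

with open("tinyshakespeare.txt", "r", encoding="utf-8") as f:
    text = f.read()

chars = sorted(set(text))                      # vocabulary = unique characters
stoi = {ch: i for i, ch in enumerate(chars)}   # char -> integer id
itos = {i: ch for ch, i in stoi.items()}       # integer id -> char

encode = lambda s: [stoi[c] for c in s]        # string -> list of ids
decode = lambda ids: "".join(itos[i] for i in ids)

data = torch.tensor(encode(text), dtype=torch.long)
n = int(0.9 * len(data))
train_data, val_data = data[:n], data[n:]      # simple train/validation split
```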
Tech: Custom TinyVGG (PyTorch)
- Objective: Classify Alzheimer’s MRI data (4 classes) with a 23k-parameter CNN
- Highlights:
- Achieved 96% accuracy on a large dataset of 86k samples
- Applied preprocessing and class-weighted losses for balanced performance across classes (weighted-loss sketch below)
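A minimal sketch of the class-weighting idea with PyTorch’s cross-entropy loss; the per-class counts below are placeholders rather than the real dataset statistics:

```python
# Class-weighted cross-entropy for an imbalanced 4-class problem.
import torch
import torch.nn as nn

class_counts = torch.tensor([3000.0, 9000.0, 64000.0, 10000.0])   # hypothetical counts
weights = class_counts.sum() / (len(class_counts) * class_counts)  # inverse-frequency weights
criterion = nn.CrossEntropyLoss(weight=weights)

logits = torch.randn(32, 4)            # model outputs for a batch of 32 scans
targets = torch.randint(0, 4, (32,))   # ground-truth class indices
loss = criterion(logits, targets)
```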
Tech: Logistic Regression, scikit-learn
- Objective: Sentiment classification of 3.6M Amazon reviews
- Highlights:
- Achieved 92% accuracy with thorough data preprocessing and feature engineering (pipeline sketched below)
- Documented the methodology in a published Medium article
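A minimal sketch of a TF-IDF plus logistic regression pipeline in scikit-learn along these lines; the toy reviews and hyperparameters are illustrative, not the project’s exact configuration:

```python
# TF-IDF features feeding a logistic regression sentiment classifier.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import Pipeline

texts = ["great product, works perfectly", "broke after two days, very disappointed"]
labels = [1, 0]  # 1 = positive, 0 = negative

clf = Pipeline([
    ("tfidf", TfidfVectorizer(ngram_range=(1, 2), min_df=1)),
    ("logreg", LogisticRegression(max_iter=1000)),
])
clf.fit(texts, labels)
print(clf.predict(["surprisingly good value"]))  # classify a new review
```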
Tech: PyTorch, Graph Attention Networks (GAT), Beta Distribution
- Objective: Built a reinforcement learning model for robust articulated robot control
- Highlights:
- Combined GAT layers with Beta policies for improved sample efficiency (policy head sketched below)
- Enhanced resilience in continuous control tasks
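A minimal sketch of a Beta-distribution policy head in PyTorch, assuming a pooled GAT embedding as the state representation; layer sizes and the action dimensionality are illustrative:

```python
# Beta-distribution policy head for bounded continuous actions.
import torch
import torch.nn as nn
from torch.distributions import Beta

class BetaPolicyHead(nn.Module):
    def __init__(self, embed_dim: int, action_dim: int):
        super().__init__()
        # Separate linear heads for the alpha and beta concentration parameters.
        self.alpha_head = nn.Linear(embed_dim, action_dim)
        self.beta_head = nn.Linear(embed_dim, action_dim)

    def forward(self, embedding: torch.Tensor) -> Beta:
        # softplus + 1 keeps both parameters above 1, giving a unimodal Beta.
        alpha = nn.functional.softplus(self.alpha_head(embedding)) + 1.0
        beta = nn.functional.softplus(self.beta_head(embedding)) + 1.0
        return Beta(alpha, beta)

policy = BetaPolicyHead(embed_dim=64, action_dim=6)
dist = policy(torch.randn(1, 64))         # e.g. a graph-pooled state embedding
action = dist.sample()                    # sampled action in (0, 1)
log_prob = dist.log_prob(action).sum(-1)  # log-probability for policy-gradient updates
```

Because samples from a Beta are bounded in (0, 1), actions can be rescaled to each joint’s limits without the clipping needed for an unbounded Gaussian policy.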
Mobile Application Developer Intern – FDNSoft
June 2022 – July 2022
- Developed a secure login/register interface and backend with Flutter + Firebase
- Collaborated with UI/UX designers to optimize cross-platform responsiveness
Systems Engineer Intern – Solus
August 2021 – September 2021
- Optimized Nginx server performance and Linux configurations
- Coordinated security protocols and high-load management for multiple web applications
- LinkedIn: linkedin.com/in/mertarcan
- GitHub: github.com/Arjein
- Personal Website: mertarcan.com
- Email: mertarcan8@gmail.com
I’m always up for a chat about AI, deep learning, or any groundbreaking tech. Let’s connect and explore how we can collaborate on the next big thing!
“Pushing the boundaries of AI—one model at a time.”

