Full Stack Developer · Desktop & Extension Engineer · AI-Integrated Systems · Self-Hosted Infra · Software Architect
I started writing code to solve problems, and somewhere along the way I got obsessed with how systems are built — not just whether they work, but whether they'll still make sense six months from now when someone else (or future me) has to touch them.
These days I work across the full stack: web, desktop, mobile, browser extensions, IoT, and everything in between. I build my own infrastructure instead of renting it — auth, storage, realtime, per-tenant isolation — because I like understanding what's actually happening inside the box.
I care a lot about architecture. Not in a theoretical way, but in a "this decision will hurt us in three months" kind of way. Clean code, SOLID principles, CQRS, event sourcing, hexagonal architecture — these aren't buzzwords to me, they're the difference between a codebase that stays healthy and one that slowly becomes a burden.
When the problem calls for it, I go lower in the stack. I've written Rust and C++ compiled to WebAssembly, running inside a browser or Node.js at near-native speed. I enjoy learning new languages — not for the resume, but because every language forces you to think differently.
I've also spent time at the edges: Solidity on Ethereum, smart contracts on NEAR, IoT pipelines with Raspberry Pi and Arduino, digital signatures with PKI infrastructure, AI pipelines with LLMs wired directly into application logic.
I don't use tools because they're popular. I use them because they're the right fit — and I'm comfortable admitting when something simpler would work better.
If you're building something ambitious and you need someone who thinks about the whole system, not just their slice of it — let's talk.
- 📱 Swift & iOS — foundational level; familiar with the ecosystem, UIKit basics, and the Xcode toolchain
I bridge software with the physical world — building smart automation systems that connect hardware, sensors, and cloud-connected logic:
- 🧠 Raspberry Pi — full Linux environment on the edge; runs Node.js, Python, or compiled binaries as a local server, data collector, or automation controller
- ⚙️ Arduino — low-level microcontroller programming for sensor reading, motor control, and real-time hardware interaction
- 📡 Sensor Integration — temperature, humidity, motion, distance, light — read, process, and act on physical data in real time
- 🌐 IoT Architecture — devices publish data via MQTT or HTTP to a central broker; processed server-side, stored in time-series or relational DBs, visualized on dashboards
- 🏠 Smart Automation — trigger-based logic connecting hardware events to software actions: alerts, actuators, notifications, remote control
- 🔗 Edge + Cloud hybrid — local processing on the device for low-latency decisions, synced to cloud for storage, analytics, and remote management
- 📲 Hardware ↔ App bridge — IoT devices communicate with web dashboards, mobile apps, or desktop clients through the same API-first architecture I apply everywhere
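The trigger-based logic above can be sketched as pure rule evaluation, separated from any transport (MQTT, HTTP) or device driver. Everything here is illustrative; sensor and action names are assumptions, not a real deployment:

```typescript
// Minimal sketch of trigger-based automation: readings come in from
// sensors, rules map threshold conditions to actions. Names are illustrative.
type Reading = { sensor: string; value: number };
type Action = { type: string; target: string };

interface Rule {
  sensor: string;
  when: (value: number) => boolean; // fires when the condition holds
  action: Action;
}

// Evaluate all rules against one reading and collect the actions to dispatch.
function evaluate(rules: Rule[], reading: Reading): Action[] {
  return rules
    .filter((r) => r.sensor === reading.sensor && r.when(reading.value))
    .map((r) => r.action);
}

const rules: Rule[] = [
  { sensor: "temp", when: (v) => v > 30, action: { type: "notify", target: "phone" } },
  { sensor: "motion", when: (v) => v === 1, action: { type: "light", target: "hall" } },
];
```

Keeping the rule engine pure like this means the same logic can run on a Raspberry Pi at the edge or server-side behind a broker, with only the I/O layer differing.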
I design systems with maintainability and scalability in mind from day one:
Architectural Styles
- 🧅 Onion Architecture — domain-centric layering, dependency inversion at every boundary
- 🧼 Clean Architecture — use-case driven, framework-independent core
- 🔷 Hexagonal Architecture (Ports & Adapters) — pluggable infrastructure, testable domain
- 🧱 Layered Architecture — structured separation of presentation, business, and data layers
- 🔀 Microservices — independently deployable services, per-service databases
- 🌐 Modular Monolith — monorepo-scale apps structured as independently evolvable bounded contexts that can be split into services later
Code Quality & Craft
- ✍️ Clean Code — readable, intention-revealing code where naming, structure, and responsibility boundaries are first-class concerns; code that communicates clearly without needing explanation
- 📏 SOLID principles — Single Responsibility, Open/Closed, Liskov Substitution, Interface Segregation, Dependency Inversion; applied consistently at every layer, not just cited
- 🚫 DRY / YAGNI / KISS — no premature abstractions, no redundant logic, no complexity that doesn't earn its place
- 🔁 Refactoring discipline — continuous improvement without changing behavior; treating legacy codebases as opportunities rather than obstacles
- 🧪 Test-driven mindset — tests as a design tool that forces clean interfaces and honest boundaries
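A tiny sketch of what the dependency-inversion and test-driven points look like in practice, with hypothetical names: the domain rule depends on a `Clock` abstraction, so a fixed fake makes it deterministic and testable without any framework:

```typescript
// Dependency inversion in miniature: the domain service depends on an
// abstraction, so it can be unit-tested with an in-memory fake.
interface Clock {
  now(): Date;
}

class TrialService {
  constructor(private clock: Clock, private trialDays: number) {}

  // Pure domain rule: a trial is active while fewer than trialDays have passed.
  isTrialActive(startedAt: Date): boolean {
    const elapsedMs = this.clock.now().getTime() - startedAt.getTime();
    return elapsedMs < this.trialDays * 24 * 60 * 60 * 1000;
  }
}

// Test double: a fixed clock makes the rule deterministic.
const fixedClock: Clock = { now: () => new Date("2024-01-10T00:00:00Z") };
const service = new TrialService(fixedClock, 14);
```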
Platform-Agnostic Architecture & Shared Core
When a system needs to run across multiple platforms — web, desktop, mobile, API — I don't write separate logic for each. I design a shared core engine that runs on the server or as a portable module, and each platform connects to it through the most natural interface available:
- 🏛️ Single source of truth — business logic, validation rules, and domain models live in one place; no duplication across platforms
- 🔌 Core exposed as API — the engine runs server-side or as a local service; web, desktop (Electron/WinForms), and mobile clients each talk to it through HTTP, WebSocket, IPC, or gRPC — whichever fits the platform
- 📱 Platform-optimized clients — each client is thin and speaks the platform's native language: React/Next.js for web, Electron + native IPC for desktop, React Native for mobile — but none of them own the logic
- 🔄 Shared types & contracts — TypeScript interfaces, protobuf schemas, or OpenAPI specs shared across all clients so a change in the core propagates everywhere without drift
- ⚙️ Runtime portability — core modules can be compiled to WASM for in-browser execution, packaged as a native addon for Node.js, or run as a standalone binary; the interface stays the same regardless
- 🧩 Adapter pattern at platform boundaries — each platform integration is an adapter, not a fork; swapping or adding a new platform means writing one adapter, not rewriting the system
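The shared-core idea above, reduced to a sketch: one pure domain function, with each platform integration as a thin adapter over it. The function and payload shapes are illustrative:

```typescript
// Shared core + adapters in miniature. The core owns all business logic.
type OrderResult = { ok: boolean; total: number };

// Single source of truth: pure domain logic, no platform code.
function computeOrderTotal(items: { price: number; qty: number }[]): OrderResult {
  const total = items.reduce((sum, i) => sum + i.price * i.qty, 0);
  return { ok: total > 0, total };
}

// HTTP adapter: translates a request body into a core call.
function httpAdapter(body: string): string {
  return JSON.stringify(computeOrderTotal(JSON.parse(body)));
}

// IPC adapter for a desktop client: same core, different transport.
function ipcAdapter(payload: { items: { price: number; qty: number }[] }): OrderResult {
  return computeOrderTotal(payload.items);
}
```

Adding a mobile or gRPC surface means writing one more thin adapter; the core never changes.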
Design Patterns
- 📨 CQRS — separate read/write models for complex domains
- 📋 Event Sourcing — state derived from an immutable event log
- 🗃️ Repository Pattern — abstracted data access, swappable storage backends
- 📣 Mediator / MediatR — decoupled request/response pipeline
- 🏭 Factory & Abstract Factory — controlled object creation
- 🎭 Decorator & Strategy — runtime behavior composition
- 👀 Observer / Event-Driven — loose coupling through domain events
- 🔌 Dependency Injection — IoC containers across .NET, Node.js, NestJS
- 🔒 Specification Pattern — composable, reusable business rule encapsulation
- 🗺️ Saga Pattern — distributed transaction management across microservices
- 🔁 Outbox Pattern — reliable event publishing with guaranteed at-least-once delivery
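Event sourcing, the second pattern above, fits in a few lines as a sketch: state is never stored directly, only derived by folding an immutable event log. Event names here are invented for illustration:

```typescript
// Event sourcing in miniature: state is derived by replaying the log.
type DomainEvent =
  | { type: "Deposited"; amount: number }
  | { type: "Withdrawn"; amount: number };

type Account = { balance: number };

// Pure reducer: replaying the same log always yields the same state,
// which is also what makes full audit trails and projections possible.
function replay(log: DomainEvent[]): Account {
  return log.reduce<Account>((state, e) => {
    switch (e.type) {
      case "Deposited":
        return { balance: state.balance + e.amount };
      case "Withdrawn":
        return { balance: state.balance - e.amount };
    }
  }, { balance: 0 });
}

const log: DomainEvent[] = [
  { type: "Deposited", amount: 100 },
  { type: "Withdrawn", amount: 30 },
];
```

In a CQRS setup the same log would also feed denormalized read models, so writes and queries evolve independently.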
What these patterns enable me to build:
- 📦 Replaceable infrastructure — swap DB, queue, storage provider without touching business logic
- 🔄 Zero-downtime migrations — evolve schemas and services independently
- 🧪 Fully testable cores — unit test domain logic without any framework or DB dependency
- 🏢 Multi-tenant SaaS platforms — isolated data layers, per-tenant config, shared codebase
- 🔍 Full audit trails — every state change traceable via event log
- ⚡ High-throughput read models — denormalized projections optimized for query performance
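"Replaceable infrastructure" and "fully testable cores" both come down to the same move, sketched here with a hypothetical `UserRepository`: domain code talks to an interface, and the backing store is swappable:

```typescript
// Repository pattern sketch: the store behind the interface is swappable
// (a real DB in production, in-memory in tests).
type User = { id: string; name: string };

interface UserRepository {
  save(user: User): Promise<void>;
  findById(id: string): Promise<User | undefined>;
}

// In-memory implementation used for fast, dependency-free unit tests.
class InMemoryUserRepository implements UserRepository {
  private users = new Map<string, User>();
  async save(user: User) { this.users.set(user.id, user); }
  async findById(id: string) { return this.users.get(id); }
}

// Domain logic depends only on the interface, never on a concrete DB.
async function renameUser(repo: UserRepository, id: string, name: string) {
  const user = await repo.findById(id);
  if (!user) throw new Error("user not found");
  await repo.save({ ...user, name });
}
```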
I have hands-on experience designing and operating the kind of infrastructure that powers developer platforms and BaaS products — built from scratch, self-hosted, and production-grade:
- 🔐 Auth layer — custom authentication service with JWT, OAuth2, session management, role-based access control; isolated per tenant
- 📦 Storage layer — self-hosted object storage with S3-compatible APIs, per-tenant bucket isolation, access policy enforcement
- ⚡ Realtime layer — WebSocket / pub-sub infrastructure for live data sync across clients, scoped per project instance
- ⚙️ Edge functions — lightweight, stateless compute units deployed close to the client; used for auth hooks, webhooks, request transformation, and low-latency custom logic without spinning up a full backend
- 🧩 Per-tenant instance model — each customer or project receives its own isolated stack: dedicated database, storage bucket, auth namespace, and runtime config — no shared state between tenants
- 🔌 Modular service composition — services are independently deployable and replaceable; a project can opt into only the modules it needs
- 📡 Internal API gateway — unified entry point routing traffic to the correct tenant's services, with rate limiting, auth validation, and observability built in
- 🏢 SaaS platform architecture — the full stack required to offer infrastructure-as-a-service to end users: provisioning, isolation, billing hooks, usage metering, and tenant lifecycle management
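The gateway's tenant routing can be sketched as a lookup from hostname to an isolated stack. The registry shape and field names below are assumptions for illustration, not the real provisioning model:

```typescript
// Tenant resolution sketch for an internal API gateway.
type TenantStack = { tenant: string; dbUrl: string; bucket: string };

const registry: Record<string, TenantStack> = {
  acme: { tenant: "acme", dbUrl: "postgres://db-acme", bucket: "acme-assets" },
  globex: { tenant: "globex", dbUrl: "postgres://db-globex", bucket: "globex-assets" },
};

// "acme.example.com" resolves to the acme stack; unknown subdomains are
// rejected, so no request can fall through to another tenant's resources.
function resolveTenant(host: string): TenantStack {
  const sub = host.split(".")[0];
  const stack = registry[sub];
  if (!stack) throw new Error(`unknown tenant: ${sub}`);
  return stack;
}
```

Failing closed on unknown tenants is the important part: isolation holds even when routing is misconfigured.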
Containerization & Orchestration
Self-Hosted PaaS & Deployment Platforms
I self-host deployment platforms instead of relying on managed PaaS — full control over infra, zero vendor lock-in:
- 🚀 Dokploy — self-hosted Heroku/Render alternative, Docker & Compose deployments
- 🐳 Portainer — container management UI for Docker & Kubernetes environments
- 🔁 Watchtower — automated container update & redeployment pipelines
- 🌐 Coolify / CapRover-style self-hosted app platforms for isolated project deployments
Monitoring, Observability & Alerting
- 📊 Prometheus — metrics scraping, custom exporters, time-series storage; exposes app internals as queryable data
- 📈 Grafana — dashboards built on top of Prometheus, Loki, and Elasticsearch; from infra health to business metrics in one view
- 📋 Loki — log aggregation without indexing overhead; paired with Grafana for correlated logs + metrics in the same timeline
- 🔔 Alerting pipelines — threshold-based and anomaly alerts routed through Alertmanager, Grafana Alerting, PagerDuty-style escalation chains, or custom notification endpoints
- 🧵 Distributed tracing — request tracing across microservices to pinpoint latency and failure points
Webhooks & Event Integration
- 🪝 Inbound webhooks — receive and process events from third-party platforms (payment gateways, Git providers, form tools, IoT devices) with signature verification and idempotency handling
- 📤 Outbound webhooks — notify external systems on internal state changes; retry logic, delivery guarantees, dead-letter queues
- 🔗 Event bus integration — Kafka and RabbitMQ as the backbone for decoupled, async communication between services
- ⚡ Realtime event pipelines — WebSocket broadcasting, SSE streams, and pub/sub patterns for live UI updates triggered by backend events
- 🔁 Webhook orchestration — chaining multiple services through event triggers; one action in system A automatically propagates through B, C, and D without tight coupling
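The two safety nets for inbound webhooks, signature verification and idempotency, are small enough to sketch. This assumes HMAC-SHA256 hex signatures, a common convention (e.g. among payment and Git providers), with illustrative names:

```typescript
import { createHmac, timingSafeEqual } from "node:crypto";

// Verify an HMAC signature over the raw request body before trusting it.
function verifySignature(secret: string, rawBody: string, signature: string): boolean {
  const expected = createHmac("sha256", secret).update(rawBody).digest("hex");
  const a = Buffer.from(expected);
  const b = Buffer.from(signature);
  // timingSafeEqual avoids leaking the signature through timing differences.
  return a.length === b.length && timingSafeEqual(a, b);
}

// Idempotency: remember processed event ids so provider retries are no-ops.
// (In production this set would live in a shared store, not process memory.)
const processed = new Set<string>();
function handleOnce(eventId: string, handler: () => void): boolean {
  if (processed.has(eventId)) return false;
  processed.add(eventId);
  handler();
  return true;
}
```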
I stay current with the modern JavaScript/TypeScript tooling landscape and adopt faster alternatives as they mature:
- 🏎️ pnpm is my default — strict dependency resolution, disk-efficient, monorepo workspace support out of the box
- ⚡ Bun for projects where raw runtime speed matters — also doubles as a bundler, test runner, and package manager in one
- 🔧 Vite / esbuild / SWC over Webpack wherever possible — faster dev feedback loops and leaner production bundles
- 📦 Turborepo / Nx for monorepo orchestration — incremental builds, task caching, shared packages across web and desktop targets
- 📝 Smart Contract Development — written and deployed Solidity contracts on Ethereum; understand the EVM execution model, gas mechanics, storage layout, and common vulnerability patterns (reentrancy, overflow, access control)
- 🔐 Wallet & Auth integration — connecting dApps to MetaMask and other Web3 wallets; signing messages for authentication without passwords
- 🌐 NEAR Protocol — familiar with NEAR's account model and its positioning as an AI-friendly chain; WebAssembly-based smart contracts and the cross-shard architecture that sets it apart from EVM chains
- 🤖 Blockchain × AI — understanding of how AI agents interact with on-chain data and smart contracts; decentralized compute and verifiable inference as emerging patterns
- 🧩 dApp architecture — frontend connecting to on-chain state via RPC providers; event listening, transaction lifecycle, and handling chain reorganizations gracefully
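Handling chain reorganizations gracefully mostly means tracking what hash you saw at each height and rolling back when it changes. Here is that core as pure logic, with no RPC provider involved; class and field names are illustrative:

```typescript
// Chain-reorg handling sketch: if the hash at a known height changes,
// a reorg happened and dependent state must be invalidated.
type Block = { height: number; hash: string };

class ReorgTracker {
  private seen = new Map<number, string>();

  // Returns the heights invalidated by a reorg, or [] on normal extension.
  observe(block: Block): number[] {
    const known = this.seen.get(block.height);
    if (known !== undefined && known !== block.hash) {
      // Everything at or above this height is now on an orphaned branch.
      const invalidated = [...this.seen.keys()].filter((h) => h >= block.height);
      for (const h of invalidated) this.seen.delete(h);
      this.seen.set(block.height, block.hash);
      return invalidated.sort((a, b) => a - b);
    }
    this.seen.set(block.height, block.hash);
    return [];
  }
}
```

In a real dApp backend this sits behind the event listener, so indexed state derived from orphaned blocks gets replayed from the canonical chain.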
I understand the cryptographic and legal foundations of document signing — and I integrate these into software systems correctly:
- ✍️ Electronic & Digital Signatures — understand the distinction between simple e-signatures and cryptographically verifiable digital signatures (PKI-based, X.509 certificates); know where each is legally applicable
- ⏱️ Trusted Timestamping (RFC 3161) — embedding cryptographic timestamps into signed documents to prove a document existed in a specific state at a specific point in time; relevant for legal, notarial, and audit contexts
- 🏛️ Regulatory compliance — work within officially recognized signature infrastructures (e-Devlet, TÜBİTAK BİLGEM, PTT e-imza in Turkey); self-issued certificates have no legal standing and I design systems accordingly
- 🔐 PKI fundamentals — certificate chains, CA trust hierarchies, certificate revocation (CRL / OCSP), and how signing pipelines validate document authenticity end-to-end
- 📄 Document integrity pipelines — hash-based tamper detection, signed PDF generation (PAdES), and audit trail systems that can prove a document has not been altered after signing
- 🔗 Hybrid approaches — combining traditional PKI signatures with blockchain-anchored timestamping for immutable, decentralized proof of existence
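The hash-based tamper detection mentioned above is the simplest building block in these pipelines. A minimal sketch using SHA-256 (function names are mine, not a library API):

```typescript
import { createHash } from "node:crypto";

// Fingerprint a document at signing time; re-hash later to prove integrity.
function fingerprint(document: Buffer | string): string {
  return createHash("sha256").update(document).digest("hex");
}

// Audit check: the stored hash must match the current content exactly;
// a single changed byte yields a completely different digest.
function isUnaltered(document: Buffer | string, storedHash: string): boolean {
  return fingerprint(document) === storedHash;
}
```

In a full PAdES or RFC 3161 flow, it is this digest (not the document itself) that gets signed or timestamped, which is also what makes blockchain anchoring cheap.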
I pick the right tool for the job — and I'm comfortable going deep in whichever language a problem demands:
Cross-runtime & polyglot capabilities:
- ⚡ WebAssembly (WASM) — compile systems-level code (Rust, C, C++) to portable binary modules and execute them inside the browser, Node.js, or edge runtimes; used where JavaScript's performance ceiling is a constraint, not a choice
- 🔗 Native Node.js addons — expose performance-critical Rust or C++ logic as first-class npm packages via napi-rs or node-gyp, keeping the JS/TS developer experience intact
- 🧵 Concurrency across paradigms — threads & ownership (Rust/C++), async/await (JS, C#), actor model (Erlang/Elixir), goroutines & channels (Go); I choose the model that fits the problem, not the other way around
- 🌍 Language-agnostic contracts — regardless of what a backend is written in, I design clean HTTP/gRPC/WebSocket interfaces that any client can consume without coupling to the implementation
- 📚 Deliberate language acquisition — learning a new language is something I do with genuine interest; each one introduces different constraints and trade-offs that permanently improve how I think about architecture and system design
I don't just use AI tools — I build systems that are controlled and orchestrated through code:
- 🧠 LLM API Integration — OpenAI, Anthropic, Ollama (local), and custom model endpoints wired into application logic
- 🔁 AI Pipelines — multi-step prompt chains, retrieval-augmented generation (RAG), context management across sessions
- 📋 AI-Powered Admin Panels — dashboards that autonomously collect, summarize, and act on data using language models
- 🗂️ Content Aggregation + Summarization — pipelines that pull articles, links, or documents from multiple sources and produce structured AI summaries
- 🧩 Code-Driven AI Orchestration — zero "no-code" wrappers; everything is implemented as first-class application logic
- 📊 Structured Output Extraction — parsing LLM responses into typed data models for downstream processing
- 🔍 Semantic Search — vector embeddings + similarity search for intelligent content retrieval
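Structured output extraction in practice means treating the LLM as an untrusted producer: validate its response into a typed model before anything downstream runs. A sketch, where the `Summary` shape is an assumed example schema:

```typescript
// Validate an LLM response into a typed model; reject anything malformed.
type Summary = { title: string; tags: string[]; confidence: number };

function parseSummary(raw: string): Summary {
  // Models often wrap JSON in prose or fences; extract the outermost object.
  const start = raw.indexOf("{");
  const end = raw.lastIndexOf("}");
  if (start === -1 || end === -1) throw new Error("no JSON object found");
  const data = JSON.parse(raw.slice(start, end + 1));

  if (typeof data.title !== "string") throw new Error("missing title");
  if (!Array.isArray(data.tags) || !data.tags.every((t: unknown) => typeof t === "string"))
    throw new Error("invalid tags");
  if (typeof data.confidence !== "number" || data.confidence < 0 || data.confidence > 1)
    throw new Error("confidence out of range");
  return data as Summary;
}
```

A schema-validation library (or the providers' native structured-output modes) does the same job at scale; the principle is identical.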
I build browser extensions (Chrome, Edge, Firefox) that go beyond simple UI tweaks:
- 🔐 Authenticated Extensions — extensions that log into platforms on behalf of the user and interact with pages in session context
- 🕸️ Web Scraping & Data Collection — extensions that browse pages, extract structured content (links, articles, metadata), and push to a centralized backend
- 📡 Extension ↔ App Bridge — native messaging between browser extensions and local desktop apps or backend APIs
- 🧠 AI-Augmented Extensions — collected data is processed through LLM pipelines (summarization, classification, tagging) inside the extension or server-side
- 📦 Content Injection — injecting custom UI, logic, or tracking code into third-party pages via content scripts
- ⚙️ Background Service Workers — long-running jobs, scheduled scraping, push notification handlers
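The content script ↔ background channel boils down to a typed message router. This sketch keeps the routing logic pure (the actual `chrome.runtime` wiring is omitted so it stays testable outside a browser); message types are illustrative:

```typescript
// Typed message router like the one a background service worker would use.
type ExtMessage = { type: string; payload?: unknown };
type Handler = (payload: unknown) => unknown;

class MessageRouter {
  private handlers = new Map<string, Handler>();

  on(type: string, handler: Handler) {
    this.handlers.set(type, handler);
  }

  dispatch(msg: ExtMessage): unknown {
    const handler = this.handlers.get(msg.type);
    if (!handler) throw new Error(`no handler for ${msg.type}`);
    return handler(msg.payload);
  }
}

const router = new MessageRouter();
// e.g. a content script asks the background worker to filter scraped links.
router.on("extract-links", (payload) =>
  (payload as string[]).filter((u) => u.startsWith("https://"))
);
```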
I implement client-side data collection and embed third-party capabilities into any platform:
- 🔢 Pixel & Event Tracking — custom tracking pixels (similar to Meta Pixel, Google Tag) embedded in any web surface for analytics and conversion events
- 🧲 Embeddable Widgets — self-contained JS snippets that can be dropped into any third-party page (chat widgets, form collectors, live feed embeds)
- 🏷️ Tag Management — dynamic script injection, GTM-style tag firing based on user behavior and page context
- 🔗 Webhook & Postback Systems — server-to-server event delivery for tracking conversions and syncing data across platforms
- 📊 Analytics Pipelines — raw event ingestion → enrichment → dashboard, built entirely in-house without relying on GA or similar SaaS
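The pixel mechanism itself is small: events are encoded as query parameters on a 1x1 image request, the same transport Meta Pixel and GA use. A sketch with an assumed endpoint and parameter names:

```typescript
// Build the pixel URL that encodes one tracking event as query parameters.
function buildPixelUrl(endpoint: string, event: string, props: Record<string, string>): string {
  const params = new URLSearchParams({ event, ...props });
  return `${endpoint}?${params.toString()}`;
}

// On the page, the embed snippet fires the event by creating an <img> whose
// src is this URL; the server logs the request and returns a transparent GIF.
```

The image-request trick matters because it works on any page with no CORS preflight and no JS framework assumptions, which is what makes the snippet droppable into arbitrary third-party surfaces.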
"If you're building something ambitious and you need someone who thinks about the whole system, not just their slice of it — let's talk."