# NIK AFIQ

Tokyo, Japan
nik@nik4nao.com | github.com/nikafiq | nik4nao.com

---

## PROFESSIONAL SUMMARY

Backend engineer with 3 years of professional experience designing and
operating distributed, high-throughput systems on GCP and AWS. Core
expertise in Go and Python, with hands-on production experience in
event-driven microservices, Kafka-based pipelines, Kubernetes, and
cloud-native data infrastructure. Comfortable operating systems at
hundreds of TPS under strict reliability and zero-downtime migration
constraints. Trilingual (English, Japanese N1, Malay); routinely
bridges Japanese and overseas engineering teams. Actively integrates
AI tooling (GitHub Copilot, Gemini, Claude) into daily coding,
review, and documentation workflows.

---

## WORK EXPERIENCE

### 株式会社ホープス (Hopes Co., Ltd.) — Tokyo
**Backend Engineer** | Aug 2025 – Present

Designing and operating a distributed consent management pipeline on
GCP/GKE connecting a high-traffic notification delivery system to a
downstream fulfillment API.

- Proposed and led adoption of a Kafka-based queuing architecture to
  handle concurrent notification fan-out, identifying it as the
  correct solution for per-account_id ordering under 20–40 TPS load
- Designed a request-coalescing strategy using singleflight to
  suppress duplicate in-flight downstream calls per account_id
- Architected the full event-driven pipeline: GKE + Managed Kafka
  (8 partitions, keyed by account_id) + Cloud Spanner, with a
  200 TPS global cap and a 10-second downstream timeout budget
- Designed the consumer service's graceful shutdown sequence,
  ensuring in-flight requests complete cleanly before pod termination
- Designed reliable offset-commit ordering: offsets are committed
  only after a durable Spanner write, guaranteeing at-least-once
  delivery with no data loss on crash
- Implemented a retry cronjob that requeues up to 5 failed Spanner
  rows back to Kafka every 5 minutes with configurable backoff
- Designed a zero-downtime interleaved-index migration on a Cloud
  Spanner accounts table under 400 TPS sustained read traffic
- Designed the OpenTelemetry integration with Datadog, defining the
  trace, span, and metrics strategy across services; integrated with
  Wiz for unified observability and security posture
- Built a CI pipeline with semantic-version tag enforcement that
  prevents image-tag overwrites while allowing latest to update
  freely; scoped Workload Identity permissions to a read-only minimum
- Led performance testing with Locust (40 TPS steady / 120 TPS
  burst) and applied the results to right-size GKE CPU/memory
  configurations
- Led the team's TDD adoption and authored development guidelines
  covering milestone structure, ticket definition-of-done standards,
  and code review expectations
- Identified a one-month deadline slip during mob programming,
  escalated it to leadership, and facilitated a full task breakdown
  and schedule re-baseline across the team
- Stepped up as informal tech lead during a leadership gap: created
  progression guidelines, maintained ticket quality, and kept formal
  leadership informed of all decisions and scope
- Tasked with onboarding and upskilling Phase 2 application team
  members to raise codebase quality ahead of the next release

### 株式会社ニッポンダイナミックシステムズ (Nippon Dynamic Systems, Inc.) — Tokyo
**Full Stack Engineer, IT Solutions — Pharma Market Team** | Apr 2023 – Jul 2025

- Built a scalable analytical DWH on Amazon Aurora (RDS) for a
  pharmaceutical client, integrating Salesforce and multiple external
  data sources via daily/weekly ETL batch pipelines on ECS/Fargate
  and Lambda; designed for high availability with Multi-AZ failover
- Constructed a SaaS data lake using AWS CDK + Glue +
  TypeScript/Python, fully automating ETL ingestion across
  heterogeneous data sources
- Developed an internal AI application using AWS Bedrock (Claude
  Sonnet) + React, implementing RAG-based document retrieval and
  SES-based user matching in a small cross-functional team
- Built a license authentication service (Node.js + Docker + Azure
  Web Apps + ADB2C), owning requirements definition, auth logic
  design, and client-facing communication
- Designed and automated monthly maintenance operations: AMI image
  updates, security patching, automated regression testing, and
  blue/green deployments via AWS CodePipeline and Azure Pipelines
- Conducted Docker image vulnerability scanning as part of the CI/CD
  pipeline; managed VPC, WAF, and Security Group configurations
- Mentored junior engineers on cloud architecture patterns; served
  as bilingual (EN/JA) liaison between domestic and overseas
  engineering teams

---

## SKILLS

**Languages:** Go, Python, TypeScript/JavaScript
**Frameworks:** Gin, Flask, Next.js, Node.js
**Cloud — AWS:** ECS/Fargate, Lambda, Aurora/RDS, DynamoDB, Glue, CDK, CodePipeline, Bedrock, Secrets Manager
**Cloud — GCP:** GKE, Cloud Spanner, Managed Kafka, BigQuery, Cloud Trace
**Cloud — Azure:** Web Apps, ADB2C, Azure Pipelines
**Data:** MySQL, Aurora, PostgreSQL, DynamoDB, Cloud Spanner, Kafka, Redis
**DevOps:** Docker, Kubernetes, ArgoCD, CI/CD, IaC (AWS CDK, Ansible)
**Observability:** OpenTelemetry, Datadog, distributed tracing, ELK stack, Kibana
**AI Tooling:** GitHub Copilot (daily coding + code review), Gemini (documentation + research), Claude (architecture reasoning + coding), AWS Bedrock RAG (production)
**Security:** VPC, WAF, Security Groups, Secrets Manager, Workload Identity, Wiz, Docker vulnerability scanning
**Other:** Homelab (k3s, self-hosted services, Ansible/IaC), personal dev blog at nik4nao.com

---

## CERTIFICATIONS

| Certification | Issued |
|---|---|
| AWS Certified Solutions Architect – Associate (SAA) | Oct 2024 |
| AWS Certified Developer – Associate (DVA) | Dec 2024 |
| AWS Certified Cloud Practitioner (CLF) | Apr 2024 |
| 基本情報技術者試験 (FE) — IPA Fundamental IT Engineer | Aug 2024 |
| JLPT N1 — Japanese Language Proficiency | Dec 2022 |

*In progress: AWS Certified Solutions Architect – Professional (SAP), IPA Applied Information Technology Engineer (AP)*

---

## EDUCATION

**Tokai University** — Bachelor of Engineering
Major: Electrical and Electronic Engineering
Minor: Information Technology
Graduated: March 2023

*During a COVID-related leave of absence (2020–2021), independently studied programming and cloud architecture; resumed with an added IT minor upon return.*

---

## ADDITIONAL

- **Languages:** English (business), Japanese (JLPT N1), Malay (native)
- **Homelab:** Self-hosted k3s cluster, Gitea, Jellyfin, Cloudflare Tunnel, Ansible-based IaC on a Minisforum UM790 Pro
- **Dev blog / personal site:** nik4nao.com
- **Self-hosted Git:** gitea.nik4nao.com (mirrored to github.com/nikafiq)