---
title: "CV"
type: "cv"
date: 2026-03-17
draft: false
---
# NIK AFIQ
Tokyo, Japan
nik@nik4nao.com | github.com/nikafiq | nik4nao.com

---
## PROFESSIONAL SUMMARY
Backend engineer with 3 years of professional experience designing and
operating distributed, high-throughput systems on GCP and AWS. Core
expertise in Go and Python, with hands-on production experience in
event-driven microservices, Kafka-based pipelines, Kubernetes, and
cloud-native data infrastructure. Comfortable operating systems at
hundreds of TPS under strict reliability and zero-downtime migration
constraints. Trilingual (English, Japanese N1, Malay) — routinely
bridges Japanese and overseas engineering teams. Actively integrates
AI tooling (GitHub Copilot, Gemini, Claude) into daily coding,
review, and documentation workflows.

---
## WORK EXPERIENCE
### 株式会社ホープス (Hopes Co., Ltd.) — Tokyo
**Backend Engineer** | Aug 2025 – Present

Designing and operating a distributed RCS consent management pipeline
(SO→FoRCE) on GCP/GKE connecting a high-traffic notice delivery
system to a downstream fulfillment API.
- Architected an event-driven pipeline using GKE + Managed Kafka
(8 partitions, keyed by account_id) + Cloud Spanner, handling a
global cap of 200 TPS with a 10-second downstream timeout budget
- Built the Go consumer service (so-notice-receiver) with
singleflight coalescing to prevent duplicate in-flight requests,
and circuit breaker logic to shed load under downstream failure
- Designed reliable offset commit ordering: offsets committed only
after durable Spanner write, ensuring at-least-once delivery with
no data loss on crash
- Implemented a retry cronjob requeuing up to 5 failed Spanner rows
back to Kafka every 5 minutes, with configurable backoff
- Designed a zero-downtime interleaved index migration on a Cloud
Spanner accounts table under 400 TPS sustained read traffic
- Right-sized GKE resource configs (CPU/memory requests and limits)
from Locust load test data at 40 TPS steady / 120 TPS burst
- Propagated distributed traces across service boundaries for
end-to-end production observability
### 株式会社ニッポンダイナミックシステムズ (Nippon Dynamic Systems Co., Ltd.) — Tokyo
**Full Stack Engineer, IT Solutions — Pharma Market Team** | Apr 2023 – Jul 2025

- Built a scalable analytical DWH on Amazon Aurora (RDS) for a
pharmaceutical client, integrating Salesforce and multiple
external data sources via daily/weekly ETL batch pipelines using
ECS/Fargate and Lambda; designed for HA with Multi-AZ failover
- Constructed a SaaS data lake using AWS CDK + Glue +
TypeScript/Python, fully automating ETL ingestion across
heterogeneous data sources
- Developed an internal AI application using AWS Bedrock (Claude
Sonnet) + React, implementing RAG-based document retrieval and
SES-based user matching in a small cross-functional team
- Built a license authentication service (Node.js + Docker + Azure
Web Apps + ADB2C), owning requirements definition, auth logic
design, and client-facing communication
- Designed and automated monthly maintenance operations: AMI image
updates, security patching, automated regression testing, and
blue/green deployments via AWS CodePipeline and Azure Pipelines
- Conducted Docker image vulnerability scanning as part of CI/CD
pipeline; managed VPC, WAF, and Security Group configurations
- Mentored junior engineers on cloud architecture patterns;
functioned as bilingual (EN/JA) liaison between domestic and
overseas engineering teams

---
## SKILLS
- **Languages:** Go, Python, TypeScript/JavaScript
- **Frameworks:** Gin, Flask, Next.js, Node.js
- **Cloud — AWS:** ECS/Fargate, Lambda, Aurora/RDS, DynamoDB, Glue, CDK, CodePipeline, Bedrock, Secrets Manager
- **Cloud — GCP:** GKE, Cloud Spanner, Managed Kafka, Pub/Sub, BigQuery, Cloud Trace
- **Cloud — Azure:** Web Apps, ADB2C, Azure Pipelines
- **Data:** MySQL, Aurora, PostgreSQL, DynamoDB, Cloud Spanner, Kafka, Redis
- **DevOps:** Docker, Kubernetes, ArgoCD, CI/CD, IaC (AWS CDK)
- **Observability:** Distributed tracing, ELK stack, Kibana
- **AI Tooling:** GitHub Copilot (daily coding + code review), Gemini (documentation + research), Claude (architecture reasoning + coding), AWS Bedrock RAG (production)
- **Security:** VPC, WAF, Security Groups, Secrets Manager, Docker vulnerability scanning
- **Other:** Homelab (k3s, self-hosted services, Ansible/IaC), personal dev blog at nik4nao.com

---
## CERTIFICATIONS
| Certification | Issued |
|---|---|
| AWS Certified Solutions Architect Associate (SAA) | Oct 2024 |
| AWS Certified Developer Associate (DVA) | Dec 2024 |
| AWS Certified Cloud Practitioner (CLF) | Apr 2024 |
| 基本情報技術者試験 (FE) — IPA Fundamental IT Engineer | Aug 2024 |
| JLPT N1 — Japanese Language Proficiency | Dec 2022 |

*In progress: AWS Solutions Architect Professional (SAP),
Applied Information Technology Engineer (AP)*

---
## EDUCATION
**Tokai University** — Bachelor of Engineering
Major: Electrical and Electronic Engineering
Minor: Information Technology
Graduated: March 2023
*During a COVID-related leave of absence (2020–2021), independently
studied programming and cloud architecture; resumed with an
added IT minor upon return.*

---
## ADDITIONAL
- **Languages:** English (business), Japanese (JLPT N1), Malay (native)
- **Homelab:** Self-hosted k3s cluster, Gitea, Jellyfin, Cloudflare
Tunnel, Ansible-based IaC on Minisforum UM790 Pro
- **Dev blog / personal site:** nik4nao.com
- **Self-hosted Git:** git.nik4nao.com (mirrored to github.com/nikafiq)