Brett Moan

Senior Platform Engineer

Surprise, AZ, 85387
English

About

Senior engineer with a decade of experience architecting, designing, and maintaining data-centric applications. Equally adept at data engineering and application development. Highly self-sufficient: when in doubt, I read the manual.

Work Experience

  • Senior Cloud Engineer

    Jan, 2025 - Present

    Lead a team of 4 developers. Designed and implemented a replacement template for various manually and ad hoc maintained GCP pipelines. The new templated solution ensured proper controls for authorized changes and proper testing before moving to production, while preserving development teams' ability to deploy when ready.

    Worked with Risk and Compliance to identify proper enforcement of controls for audit.

    Implemented a replacement for service account tokens using Workload Identity with ephemeral OIDC tokens. This improved security and minimized the risk of leaked credentials by allowing long-lived credentials to be retired.

  • Senior Data Engineer

    Oct, 2024 - Jan, 2025 (3 months)

    Designed a data ingestion solution that de-identifies PII, stripping out PII elements and replacing them with UUIDs.
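
    A minimal sketch of that UUID-based de-identification, for illustration only (the field names and the in-memory mapping are hypothetical; the real solution persisted its lookup securely):

    ```python
    import uuid

    # Hypothetical PII fields; the real list came from the data contract.
    PII_FIELDS = {"name", "email", "ssn"}

    # In-memory mapping from PII value to its surrogate UUID.
    # A production pipeline would persist this in a secured lookup store.
    _pii_map: dict[str, str] = {}


    def deidentify(record: dict) -> dict:
        """Return a copy of the record with PII values replaced by stable UUIDs."""
        out = dict(record)
        for field in PII_FIELDS & record.keys():
            value = str(record[field])
            if value not in _pii_map:
                _pii_map[value] = str(uuid.uuid4())
            out[field] = _pii_map[value]
        return out


    if __name__ == "__main__":
        print(deidentify({"name": "Jane Doe", "email": "jane@example.com", "order_id": 42}))
    ```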

    Built AWS Glue jobs that ingest de-identified data from DynamoDB exports, flatten it into relational models, and load it into Apache Iceberg.
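
    A simplified PySpark-style sketch of that flattening step (the Glue job setup, catalog configuration, and real schema are omitted; the bucket, table, and column names are placeholders):

    ```python
    from pyspark.sql import SparkSession
    from pyspark.sql import functions as F

    # Assumes a Spark session with an Iceberg catalog named "glue_catalog" already configured.
    spark = SparkSession.builder.appName("flatten-ddb-export").getOrCreate()

    # DynamoDB S3 exports store each item under an "Item" key in DynamoDB JSON format.
    raw = spark.read.json("s3://example-bucket/ddb-export/")  # placeholder path

    # Flatten the nested DynamoDB attribute encoding into plain relational columns.
    flat = raw.select(
        F.col("Item.order_id.S").alias("order_id"),
        F.col("Item.customer_uuid.S").alias("customer_uuid"),
        F.col("Item.amount.N").cast("decimal(12,2)").alias("amount"),
    )

    # Write the flattened model as an Iceberg table registered in the catalog.
    flat.writeTo("glue_catalog.analytics.orders").using("iceberg").createOrReplace()
    ```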

    Migrated older, hard-coded Python scripts to generic scripts parameterized with TOML-based configuration, simplifying maintenance and minimizing tech debt.
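
    The shape of that parameterization might look like the following sketch (the config keys are hypothetical; tomllib is in the standard library from Python 3.11):

    ```python
    import sys
    import tomllib  # standard library in Python 3.11+


    def load_job_config(path: str) -> dict:
        """Load a per-job TOML config so one generic script can serve many pipelines."""
        with open(path, "rb") as fh:
            return tomllib.load(fh)


    def run(config: dict) -> None:
        # Hypothetical keys; the real config described sources, targets, and column mappings.
        source = config["job"]["source_table"]
        target = config["job"]["target_table"]
        print(f"Copying {source} -> {target}")


    if __name__ == "__main__":
        run(load_job_config(sys.argv[1]))
    ```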

  • Senior Software Engineer

    Jan, 2022 - Sep, 2024 (2 years 8 months)

    Automated infrastructure with Terraform, and testing and deployments with GitHub Actions.

    Built APIs, Pub/Sub, and data movement with Python.

    Designed, built, and maintained multiple highly scalable, event-driven microservices to support business automation. The automation required integrating with customers' point-of-sale systems: the product automatically ordered new inventory when a customer reported their new inventory positions, allowing customers to keep user-defined minimums on hand.

    Built alerting for customers when automation ran into supply issues or other configuration limitations, enabling users to manually order alternative products.

    Built and productionized multiple data processing pipelines for various business use cases using combinations of Google Cloud Storage, Pub/Sub, BigQuery, and Cloud SQL for PostgreSQL.
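
    A pared-down sketch of one such pipeline stage, subscribing to Pub/Sub and streaming rows into BigQuery (the project, subscription, and table IDs are placeholders, and error handling is reduced to an ack/nack decision):

    ```python
    import json

    from google.cloud import bigquery, pubsub_v1

    PROJECT_ID = "example-project"              # placeholder
    SUBSCRIPTION = "orders-sub"                 # placeholder
    TABLE_ID = "example-project.sales.orders"   # placeholder

    bq = bigquery.Client(project=PROJECT_ID)
    subscriber = pubsub_v1.SubscriberClient()
    sub_path = subscriber.subscription_path(PROJECT_ID, SUBSCRIPTION)


    def handle(message) -> None:
        """Decode one Pub/Sub message and stream it into BigQuery."""
        row = json.loads(message.data.decode("utf-8"))
        errors = bq.insert_rows_json(TABLE_ID, [row])
        if errors:
            # Nack so Pub/Sub redelivers; the real pipelines also alerted on repeated failures.
            message.nack()
        else:
            message.ack()


    if __name__ == "__main__":
        future = subscriber.subscribe(sub_path, callback=handle)
        future.result()  # block and process messages until interrupted
    ```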

    Implemented bi-directional syncing between PostgreSQL databases and the business's Salesforce applications.
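
    One direction of that sync, pushing changed rows up to Salesforce, could be sketched as follows (the table, field mapping, and credential handling are assumptions; simple-salesforce and psycopg2 stand in for the actual client libraries):

    ```python
    import os

    import psycopg2
    from simple_salesforce import Salesforce

    # Placeholder connection details; real values came from secret management.
    pg = psycopg2.connect(os.environ["PG_DSN"])
    sf = Salesforce(
        username=os.environ["SF_USER"],
        password=os.environ["SF_PASS"],
        security_token=os.environ["SF_TOKEN"],
    )


    def push_account_updates() -> None:
        """Push rows flagged as dirty in Postgres up to the matching Salesforce Accounts."""
        with pg.cursor() as cur:
            cur.execute("SELECT sf_id, name, phone FROM accounts WHERE needs_sync")
            for sf_id, name, phone in cur.fetchall():
                sf.Account.update(sf_id, {"Name": name, "Phone": phone})
            cur.execute("UPDATE accounts SET needs_sync = false WHERE needs_sync")
        pg.commit()


    if __name__ == "__main__":
        push_account_updates()
    ```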

    Built alerting for both the business and fellow engineers when systems were unhealthy.

    Implemented custom fuzzy search to assist with ergonomics and typos as users searched complex product numbers.
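
    The core of a fuzzy match like that can be sketched with the standard library's difflib (the catalog, normalization rules, and threshold here are illustrative, not the production values):

    ```python
    import difflib


    def fuzzy_lookup(query: str, product_numbers: list[str], limit: int = 5) -> list[str]:
        """Return the closest product numbers to the query, tolerating typos."""
        # Normalize case and separators so "ab-1234x" matches "AB-1234-X".
        def normalize(s: str) -> str:
            return s.replace("-", "").replace(" ", "").upper()

        normalized = {normalize(p): p for p in product_numbers}
        matches = difflib.get_close_matches(
            normalize(query), list(normalized), n=limit, cutoff=0.6
        )
        return [normalized[m] for m in matches]


    if __name__ == "__main__":
        catalog = ["AB-1234-X", "AB-1243-X", "CD-9987-Z"]
        print(fuzzy_lookup("ab1234x", catalog))
    ```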

  • DevOps/Data Engineer - I

    Jan, 2019 - Jan, 2022 (3 years)

    Tech lead for a team of DevOps engineers, covering both production support and new modernization builds.

    Supported centralized CI/CD pipeline templates for hundreds of data engineers.

    Mentored junior and intermediate developers in SQL, Python, and DevOps practices.

    Architected and Implemented:

    • Docker images and CI/CD pipelines supporting internal tools and those of other teams.
    • A packaging solution based on setuptools that helps novice Python developers write and publish packages to the company-hosted PyPI in Artifactory.
    • A key rotation utility for Snowflake, with custom integrations for SafeNet (vendor key vaults) and CyberArk (vendor password safe) to sync authentication keys to the Snowflake DB (see the sketch after this list).
    • A complete modernization rewrite of the self-healing ETL Python CLI application (see Data Engineer I accomplishments).
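
    The Snowflake side of that key rotation is sketched below, assuming key-pair authentication; key generation uses the cryptography package, and the SafeNet/CyberArk integrations are reduced to a placeholder comment.

    ```python
    import base64

    import snowflake.connector
    from cryptography.hazmat.primitives import serialization
    from cryptography.hazmat.primitives.asymmetric import rsa


    def generate_key_pair() -> tuple[bytes, str]:
        """Generate an RSA key pair; return the private key PEM and the base64 public key."""
        private_key = rsa.generate_private_key(public_exponent=65537, key_size=2048)
        private_pem = private_key.private_bytes(
            serialization.Encoding.PEM,
            serialization.PrivateFormat.PKCS8,
            serialization.NoEncryption(),
        )
        public_der = private_key.public_key().public_bytes(
            serialization.Encoding.DER,
            serialization.PublicFormat.SubjectPublicKeyInfo,
        )
        return private_pem, base64.b64encode(public_der).decode("ascii")


    def rotate_user_key(conn: snowflake.connector.SnowflakeConnection, user: str) -> bytes:
        """Set a fresh public key on the Snowflake user and return the private key for vaulting."""
        private_pem, public_b64 = generate_key_pair()
        with conn.cursor() as cur:
            cur.execute(f"ALTER USER {user} SET RSA_PUBLIC_KEY = '{public_b64}'")
        # The private key would then be pushed to SafeNet/CyberArk; that integration is omitted here.
        return private_pem
    ```
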
  • Data Engineer - II

    Jan, 2017 - Jan, 2019 (2 years)

    Led a team of Data Engineers in the development and maintenance of ETL.

    Continued support of existing frameworks developed as a Data Engineer I.

    Mentored junior developers in database design, SQL, ETL design, and Python.

    Architected and implemented:

    • ETL using a combination of Control-M, Informatica, and PySpark.
    • BI solutions using Tableau and SQL Server Analysis Services.
    • A .NET and Python framework for syncing SQL Server Analysis Services OLAP cubes to an underlying data source.
  • Data Engineer - I

    Jan, 2015 - Jan, 2017 (2 years)

    Developed ETL using a combination of Control-M and Informatica.

    Developed BI solutions in Business Objects Webi.

    Designed and developed Python libraries for internal use.

    Co-authored patented approach for “Self Healing” ETL.

    Authored a Python CLI application of reusable components for the self-healing ETL that implements full data regression testing on data being reloaded (a simplified sketch follows the list below). In practice this:

    • validates intended data changes from ETL logic changes.
    • alerts for source data changes, showing their effects on the data mart.
    • alerts for detection of non-deterministic ETL, finding defects in ETL logic.
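
    The core idea of that regression check, comparing a baseline extract against a reload row by row, can be sketched as follows (the hashing scheme and report format are illustrative, not the patented implementation):

    ```python
    import csv
    import hashlib


    def row_fingerprints(path: str, key_field: str) -> dict[str, str]:
        """Map each row's business key to a hash of its full contents."""
        fingerprints: dict[str, str] = {}
        with open(path, newline="") as fh:
            for row in csv.DictReader(fh):
                payload = "|".join(f"{k}={row[k]}" for k in sorted(row))
                fingerprints[row[key_field]] = hashlib.sha256(payload.encode()).hexdigest()
        return fingerprints


    def compare(baseline_path: str, reload_path: str, key_field: str) -> dict[str, list[str]]:
        """Classify keys as added, removed, or changed between the baseline and the reload."""
        before = row_fingerprints(baseline_path, key_field)
        after = row_fingerprints(reload_path, key_field)
        return {
            "added": sorted(after.keys() - before.keys()),
            "removed": sorted(before.keys() - after.keys()),
            "changed": sorted(k for k in before.keys() & after.keys() if before[k] != after[k]),
        }


    if __name__ == "__main__":
        print(compare("baseline.csv", "reload.csv", key_field="order_id"))
    ```
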
Skills

  • Soft Skills

    Negotiating

    Influencing

    Mentoring

    Critical Thinking

    Teamwork

    Communication

    Work Ethic

  • Software Engineering

    Python

    Terraform

    SQL

    Automation

    Application Design

    Database Design

    DevOps Practices

    Business Intelligence

  • Languages and Tools

    Python

    Terraform

    Kubernetes

    PostgreSQL

    GCP

    CI/CD (GitHub Actions, Azure Pipelines)

    Bash

    Dockerfiles

    Helm

    Hadoop

    Netezza

    PowerShell

    SQL Server

    DB2

    Salesforce

    SAP Business Objects

    Tableau

    SSAS