Exploring AWS Transform Custom: Part 1 of 3

Table Of Contents

AWS Transform Custom: AWS CDK TypeScript to AWS CDK Python Conversion

Welcome to Part 1 of my three-part series exploring AWS Transform (ATX Custom) for code modernization. We’ll start with a foundational language conversion, progress to infrastructure-as-code transformations, and end with a complex legacy application modernization, experimenting with and (hopefully) demonstrating the capabilities of ATX Custom! :)

Looking for the output of the ATX Custom execution? See Here!

The Modernization Opportunity

Organizations face mounting technical debt that consumes 20-30% of development resources. Whether you’re inheriting legacy codebases, upgrading end-of-life runtimes, or standardizing across teams, code transformation is inevitable, and traditionally it has been manual, time-consuming, and error-prone.

Common transformation scenarios include:

  • Inheriting Historical Code: Adopting codebases written in unfamiliar languages or frameworks that your team doesn’t actively maintain
  • Runtime Modernization: Upgrading Java 8 → Java 17/21, .NET Framework → modern .NET, or Node.js 14/16 → 18/20
  • Cloud Optimization: Migrating AWS SDK v1 → v2, updating Lambda runtimes, or converting x86 to ARM/Graviton
  • Language Translations: Converting between languages (TypeScript ↔ Python, COBOL → Java, C → Rust)
  • Framework Migrations: Transforming infrastructure code (Terraform → CDK, CloudFormation → CDK) or application frameworks
  • Architecture Evolution: Decomposing monoliths to microservices or modernizing legacy platforms

AWS Transform Custom addresses these challenges through AI-powered automation, reducing transformation time by 60-80% while maintaining accuracy and consistency across enterprise-scale initiatives.

Part 1: Language Conversion Fundamentals

In this installment, we’ll tackle a straightforward but common scenario: converting AWS CDK infrastructure code from TypeScript to Python. This foundational example demonstrates ATX Custom’s systematic approach before we tackle more complex transformations in Parts 2 and 3.

My Scenario: Adopting Infrastructure Code

I wrote portions of the lhci-fargate project back a few years ago in TypeScript (AWS CDK) that deploys Lighthouse CI on AWS Fargate. The infrastructure is solid: VPC, ECS, EFS, Application Load Balancer, WAF, Route53, and ACM certificates spanning 71+ CloudFormation resources.

However, at the time I was writing a decent amount of TypeScript. Today I’m not, and my customers primarily favor Python for infrastructure code. Rather than maintaining TypeScript (a language I’m gradually forgetting) or spending days manually rewriting, I used AWS Transform Custom to automate the conversion.

The benefit here is straightforward: when you inherit code in a language your team doesn’t use, or you’re a hobbyist whose preferred language has changed, you can convert it to something maintainable rather than learning a new stack or rewriting from scratch.

The Four-Phase Transformation

Phase 1: Planning

AWS Transform analyzed the TypeScript codebase and created a 14-step transformation plan:

  • Identified all infrastructure components and dependencies
  • Mapped TypeScript constructs to Python equivalents
  • Established verification strategy using cdk synth at each step
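The mechanical part of that mapping is deterministic. Here’s a minimal sketch of the camelCase → snake_case renaming rule (the helper itself is mine for illustration, not part of the ATX tooling):

```python
import re

def to_snake_case(name: str) -> str:
    """Convert a camelCase identifier to snake_case."""
    # Insert an underscore before each interior uppercase letter, then lowercase.
    return re.sub(r'(?<!^)(?=[A-Z])', '_', name).lower()

# The method renames in the transformation plan follow this rule:
print(to_snake_case('fromRegistry'))     # from_registry
print(to_snake_case('tryGetContext'))    # try_get_context
print(to_snake_case('lifecyclePolicy'))  # lifecycle_policy
```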

Key conversions identified:

  • Syntax: camelCase → snake_case, true/false → True/False
  • Methods: fromRegistry() → from_registry(), tryGetContext() → try_get_context()
  • Strings: Template literals → f-strings
  • Environment: process.env → os.environ.get()
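The string and environment idioms are easiest to see side by side. A small sketch, assuming a hypothetical BUCKET_NAME environment variable:

```python
import os

# TypeScript: const bucket = process.env.BUCKET_NAME ?? 'default-bucket';
# Python: os.environ.get() with a fallback value.
bucket = os.environ.get('BUCKET_NAME', 'default-bucket')

# TypeScript: const arn = `arn:aws:s3:::${bucket}`;  (template literal)
# Python: an f-string.
arn = f'arn:aws:s3:::{bucket}'

print(arn)  # arn:aws:s3:::default-bucket when BUCKET_NAME is unset
```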

Phase 2: Execution

The transformation proceeded systematically through 14 steps:

  1. Project Structure: Created requirements.txt, setup.py, updated cdk.json
  2. Entry Point: Converted bin/lhci-fargate.ts → app.py
  3. Stack Definition: Transformed lib/lhci-stack.ts → lhci_stack.py
  4. Infrastructure Components: Converted VPC, ECS Cluster, EFS, Fargate services, ALB, Route53, ACM, WAF, and monitoring
  5. Tests: Migrated Jest tests to pytest in tests/test_lhci_stack.py
  6. Cleanup: Removed TypeScript-specific files (tsconfig.json, jest.config.js, etc.)
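Step 6 is simple enough to script by hand. A sketch of the cleanup (the file names come from the list above; the helper itself is hypothetical, not ATX’s implementation):

```python
from pathlib import Path

# TypeScript tooling files that have no role in the converted Python project.
TS_ARTIFACTS = ['tsconfig.json', 'jest.config.js']

def remove_ts_artifacts(project_dir: str) -> list[str]:
    """Delete leftover TypeScript-specific files; return the names removed."""
    removed = []
    for name in TS_ARTIFACTS:
        path = Path(project_dir) / name
        if path.exists():
            path.unlink()
            removed.append(name)
    return removed
```

Running it against a project directory returns which artifacts were actually deleted, which makes the step easy to log.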

Example transformation:

TypeScript:

const fileSystem = new efs.FileSystem(this, 'LHCIFileSystem', {
  vpc: vpc,
  encrypted: true,
  lifecyclePolicy: efs.LifecyclePolicy.AFTER_14_DAYS
});

Python:

file_system = efs.FileSystem(self, 'LHCIFileSystem',
    vpc=vpc,
    encrypted=True,
    lifecycle_policy=efs.LifecyclePolicy.AFTER_14_DAYS
)

Phase 3: Debugging

AWS Transform identified and resolved several issues:

  • Missing Dependencies: Installed python3-pip
  • Version Corrections: Fixed cdk-watchful version (3.6.0 → >=0.6.233)
  • Virtual Environment: Created .venv and updated CDK configuration
  • Context Caching: Added Route53 hosted zone lookup to cdk.context.json
  • Domain Configuration: Updated to actual AWS values (troydieter.com)

Phase 4: Validation

All 15 exit criteria verified successfully:

✅ CloudFormation synthesis successful with 71+ resources
✅ Python syntax and naming conventions correct
✅ Dependencies properly defined and installed
✅ Tests pass with proper context mocking
✅ Documentation updated
✅ Deployment ready

Results

  1. Time Savings: What would typically take 2-3 days of manual conversion was completed in under an hour with AWS Transform.

  2. Accuracy: All infrastructure components correctly converted with proper Python idioms and CDK patterns.

  3. Verification: Automated validation ensured the transformed code produces identical CloudFormation templates.
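The “identical templates” claim is mechanically checkable: run cdk synth in both projects and diff the resulting JSON. ATX’s exact verification method isn’t documented here; this is just a minimal sketch of one way to do the comparison on two already-loaded templates:

```python
import json

def templates_match(a: dict, b: dict) -> bool:
    """Compare two CloudFormation templates structurally, ignoring key order."""
    # Serializing with sorted keys normalizes both dicts before comparison.
    return json.dumps(a, sort_keys=True) == json.dumps(b, sort_keys=True)

# Toy example: the same resource expressed with different key order.
ts_template = {'Resources': {'Vpc': {'Type': 'AWS::EC2::VPC', 'Properties': {}}}}
py_template = {'Resources': {'Vpc': {'Properties': {}, 'Type': 'AWS::EC2::VPC'}}}
print(templates_match(ts_template, py_template))  # True
```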

Observations

  1. Systematic Approach: The four-phase workflow (Plan → Execute → Debug → Validate) ensures reliable transformations
  2. Incremental Verification: Each step validated independently prevents cascading errors
  3. Context Awareness: AWS Transform handles complex scenarios like Route53 lookups and environment-specific configurations
  4. Production Ready: The output isn’t just syntactically correct—it follows Python best practices and CDK conventions
  5. Continual Learning: Each transformation execution improves future results, capturing organizational expertise in reusable patterns

What’s Next in This Series

This programmatic approach to code conversion establishes the foundational patterns we’ll build upon:

  • Part 2: Converting Terraform to AWS CDK—transforming infrastructure-as-code paradigms while preserving functionality
  • Part 3: Modernizing a large, complex legacy application architecture—demonstrating ATX Custom’s capabilities at enterprise scale

Each installment increases in complexity, showcasing how the same systematic approach scales from straightforward language conversions to comprehensive architectural transformations.

Getting Started

To try AWS Transform Custom for your own CDK conversions:

  1. Install the AWS Transform CLI
  2. Configure AWS credentials with appropriate IAM permissions
  3. Run atx transform in your TypeScript CDK project directory
  4. Follow the interactive prompts through the four phases

AWS Transform Custom accelerates code modernization by automating the tedious, error-prone work—letting your team focus on innovation rather than manual conversions. Whether you’re inheriting legacy code, upgrading runtimes, or standardizing across teams, ATX Custom delivers 60-80% time savings on transformation tasks.
