Nexla
Forward Deployed Engineer - AI
Entry Level · Hybrid · Full-time
Location
San Mateo, CA
Salary
$130k–$160k/yr
Experience
1+ years
Posted
1 month ago
Skills
Python, AI systems deployment, Data infrastructure knowledge, Customer-facing skills, Cloud-native experience, High agency, Outcome ownership, Technical discovery calls
Job Description
Summary: Nexla is a leading integration platform focused on AI-driven solutions for data management. As a Forward Deployed Engineer - AI, you will work directly with strategic customers to understand their data challenges and implement tailored solutions using the Nexla platform, while also contributing to product development based on customer insights.
Responsibilities:
- Embed with strategic customers to understand their data infrastructure, integration requirements, and business goals, then architect and implement solutions using the Nexla platform
- Build production data pipelines, connectors, and workflows tailored to each customer's environment. This includes working across their data warehouses, SaaS apps, streaming systems, and internal APIs
- Design and deploy AI-powered data flows leveraging Nexla's intelligent connectors, context-aware transformations, and agentic capabilities to solve problems that traditional integration tools can't
- Own the technical delivery end-to-end. You scope the work, build it, test it, deploy it, and ensure it runs reliably. When something breaks at the customer site, you're the first responder
- Navigate complex enterprise data environments: messy schemas, undocumented APIs, legacy systems, regulatory constraints. You figure it out and make it work
- Identify repeatable patterns across customer engagements. What you build for one customer should inform what we build into the platform for everyone
- Surface product gaps, UX friction, and missing capabilities directly to the Product and Engineering teams. You're our best source of ground truth on what customers actually need
- Contribute back to the Nexla platform: build reusable connectors, templates, and reference architectures that accelerate future deployments
- Write the deployment playbooks, integration guides, and best practices that make the next engagement faster and smoother
- Support Sales during high-stakes technical evaluations. Help prospective customers see what's possible by scoping solutions and building proof-of-value demonstrations
- Act as the trusted technical partner for customer stakeholders, from individual data engineers to VPs of Data. You translate between their world and ours
- Collaborate with Customer Success to ensure long-term adoption and expansion. The deployment doesn't end when the pipeline goes live
Required Qualifications:
- 1+ years of software engineering experience, with a track record of building and deploying production systems. Former technical founders or engineers with consulting experience are a great fit
- Experience building or deploying agentic AI systems in enterprise environments
- Strong Python skills. You're comfortable building data pipelines, writing integrations, working with APIs, and scripting your way through ambiguous problems
- Knowledgeable about enterprise data infrastructure: data warehouses (Snowflake, Databricks, BigQuery), streaming platforms (Kafka), ETL/ELT patterns, and the modern data stack
- Hands-on experience with AI/LLM technologies: prompt engineering, agent development, orchestration frameworks, or building AI-powered workflows. You don't need to be an ML researcher, but you should be able to build with these tools
- Customer-facing confidence. You can run a technical discovery call, whiteboard an architecture with a customer's data team, and explain trade-offs to non-technical stakeholders without dumbing things down
- High agency. You thrive in ambiguous situations where the problem isn't well-defined and the playbook doesn't exist yet. You figure it out, make decisions, and move
- You've owned outcomes, not just tasks. You've been the person responsible for making something work in a real environment, not just in a staging cluster
Preferred Qualifications:
- Experience at a data integration, iPaaS, or ELT company (Fivetran, MuleSoft, Informatica, Airbyte) or in a customer-facing engineering role
- Familiarity with data protocols and patterns: CDC, GraphQL, gRPC, Webhooks, REST
- Cloud-native experience (AWS, GCP) including working with managed data services, IAM, networking
- Comfort with travel (up to 25%) to embed with customer teams when needed
Required Skills: Python, AI systems deployment, Data infrastructure knowledge
Important Skills: Customer-facing skills, Cloud-native experience
Nice-to-Have Skills: High agency, Outcome ownership, Technical discovery calls
Benefits: Medical, Dental, Vision, 401k, Flexible PTO