
Data Platform & AI Enablement Engineer

Classic Vacations, LLC
Location: Tallahassee, FL, USA
Published: 6/14/2022
Technology
Full Time

Job Description

Role Overview

Classic Vacations is transforming its digital backbone to deliver world-class, AI-powered experiences for both internal stakeholders and external customers. This pivotal role is responsible for stabilizing critical data flows, leading the modernization of our data architecture, and enabling scalable, intelligent systems that power our next generation of products and insights. The Data Platform & AI Enablement Engineer is not just a builder of pipelines—they are an architect of possibility.

Why This Role Matters

At Classic Vacations, we believe that clarity, alignment, and purpose are the foundation of a healthy, high-performing organization—and this role is central to advancing that mission. The Data Platform & AI Enablement Engineer plays a pivotal role in shaping the future of how we leverage data to support our partners, delight our customers, and empower our colleagues. This is more than a technical role; it’s a strategic opportunity to drive meaningful impact.

You will empower decision-making across core departments such as Finance, Product, and Customer Experience by delivering trusted, real-time data. By modernizing our data infrastructure and migrating legacy systems to a scalable, cloud-native platform, you’ll eliminate inefficiencies and reduce technical debt. At the same time, you’ll help lay the groundwork for robust, ethical, and high-performing AI and machine learning pipelines—capabilities that will transform how we serve the travel industry.

This role also offers a chance to lead through influence, sharing your expertise with others and fostering a culture of excellence, performance, and shared learning. If you thrive in a high-trust environment and are driven by the tangible outcomes of your work, you won’t just be part of Classic Vacations’ transformation—you’ll be instrumental in leading it.

Key Responsibilities

Reasonable accommodations may be made to enable individuals with disabilities to perform these essential functions.

In alignment with Classic’s commitment to building a cohesive and high-performing team, your responsibilities will include:

Phase 1: Stabilize and Maintain Today’s Ecosystem

  • Administer and tune Microsoft SQL Server environments, supporting business-critical schemas.
  • Manage Tableau Server/Cloud, ensuring reliable dashboard refreshes and robust permissions.
  • Monitor and support Finance, Product, and CX reports with foundational data-quality checks.
  • Respond to ad-hoc data and performance needs from business stakeholders and developers.

Phase 2: Modernize the Platform

  • Design schemas in Amazon Redshift or Google BigQuery, preparing for cloud-native scalability.
  • Build, test, and document dbt pipelines; manage them with Terraform and Git-based CI/CD.
  • Automate Tableau flows that align tightly with dbt and semantic layer standards.
  • Replace manual, error-prone feeds with validated ingestion pipelines using tools like Great Expectations.

Phase 3: Enable AI at Scale

  • Stand up streaming architectures (e.g., Kafka, Kinesis) and manage ML feature stores.
  • Embed analytics into customer-facing applications via APIs and securely expose governed datasets.
  • Implement data observability solutions (e.g., Monte Carlo, Metaplane) to uphold SLAs and data trust.
  • Lead initiatives to improve data engineering culture: performance, documentation, scalability, and AI-readiness.

The above is intended to describe the general content of and requirements for the performance of this position. It should not be construed as an exhaustive statement of duties, responsibilities, or physical requirements. Nothing in this job description restricts management’s right to assign or reassign duties and responsibilities to this position at any time.

Qualifications & Skills

In line with Classic’s values of continuous improvement and organizational health, we seek a leader who exemplifies both technical expertise and a collaborative spirit.

Required Technical Experience:

  • A minimum of five years managing Microsoft SQL Server environments, with proven expertise in performance tuning and production support.
  • At least three years administering Tableau Server or Tableau Cloud, including familiarity with Tableau Prep workflows and Hyper extract pipelines.
  • Comfort working with Tableau's REST and Metadata APIs and managing role-based access.

Required Data Engineering Skills:

  • Strong proficiency in Python and modern SQL, with disciplined use of Git workflows for code management.
  • Hands-on experience with cloud data services in AWS or GCP, such as Redshift, BigQuery, S3, Glue, or Pub/Sub.
  • Production-level experience with dbt (data build tool), Terraform, and automated data validation frameworks such as Great Expectations.

Required Platform & Infrastructure Expertise:

  • A solid understanding of CI/CD principles, infrastructure-as-code, and scalable data platform design.
  • Familiarity with streaming architectures (Kafka, Kinesis) and feature stores is a plus, especially for candidates passionate about enabling machine learning use cases.

Preferred Qualifications:

Ideal candidates may also bring experience with:

  • Optimizing Tableau performance, including VizQL tuning and materialized views.
  • Implementing embedded analytics with row-level security.
  • Modern AI data patterns such as vector stores or retrieval-augmented generation (RAG) for LLM applications.
