The Charter Technical Engineering Center (CTEC) facility in Englewood, CO oversees the design and architecture of Charter's multi-billion-dollar network infrastructure. We investigate, select, develop, and integrate technologies and solutions that meet the company's needs for short-, medium-, and long-term initiatives. This includes delivering the technology plan and future architecture for Voice, Video, Data, Optical, Commercial, Cloud, CPE, Network, and Access.
Responsible for overseeing the development of data-driven solutions to Charter's business problems. Utilizes analytical, statistical, and programming skills to clean, aggregate, and analyze large data sets and interpret the results. Creates and maintains scalable, reliable, consistent, and repeatable platforms, systems, and models that support data, data products, and analytical products. This position requires a strong command of statistical techniques and machine learning algorithms, as well as a demonstrated ability to determine where to invest time, synthesize actionable findings across diverse assignments, and present those findings to audiences with differing agendas and varying levels of technical expertise.
MAJOR DUTIES AND RESPONSIBILITIES
- Actively and consistently supports all efforts to simplify and enhance the customer experience.
- Oversee and execute the complete analytics life-cycle, including requirements gathering, problem formulation, data grooming, data exploration, model prototyping, model validation, and algorithm productionization.
- Frame and model meaningful business and engineering scenarios that impact critical business and engineering processes, architectures, and/or decisions.
- Make strategic recommendations and implementations for data architectures, data collection processes, and analytics platform integration.
- Research and develop machine learning models and algorithms for data analysis and discovery.
- Develop innovative and effective approaches to solving analytics problems, and communicate results and methodologies.
- Oversee large-scale exploratory data analyses for new data sources or new uses for existing data sources. Establish links across existing data sources and find new, interesting data correlations.
- Apply data mining, data discovery, and machine learning techniques to large structured and unstructured datasets for exploratory data analysis and data product creation.
- Create data, operational, and analytics architectures that enable data engineers to maintain scalable, reliable, consistent and repeatable systems that support data operations for analytics.
- Balance workload and operational demands with open source technologies, cloud services, and commercial solutions while optimizing cost and time-to-solution demands.
- Architect self-monitoring, robust, scalable interfaces and data pipelines for 24/7 operations.
- Create highly reusable code modules, templates, and packages that can be leveraged across the data and analytics lifecycle.
- Increase speed to delivery by architecting and implementing automated solutions across the data and analytics lifecycle.
- Mentor, educate, and provide senior leadership to data analysts, data engineers, and business intelligence analysts.
REQUIRED QUALIFICATIONS: Skills/Abilities and Knowledge
- Ability to read, write, speak and understand English.
- Experience as a DBA, user, and programmer with SQL-based, NoSQL, and columnar database technologies.
- Experience with visualization or BI tools, such as Tableau, Sigma Computing, MicroStrategy, RapidMiner, or Microsoft Power BI.
- Experience creating proof-of-concept experiments for analytics, machine learning, or visualization tools, including hypotheses, test plans, and outcome analysis.
- Experience receiving, converting, and cleansing big data.
- Solid knowledge of statistical techniques.
- Program, product, or project management experience delivering analytics results.
- Strong background in Linux/Unix/CentOS and Windows installation and administration.
- Ability to identify and resolve end-to-end performance, network, server, cloud, and platform issues.
- Excellent pattern recognition and predictive modeling skills.
- Experience with Hadoop, Spark, and/or Snowflake.
- Keen attention to detail with the ability to effectively prioritize and execute multiple tasks.
Bachelor's degree in data science, an engineering discipline, computer science, statistics, applied math, or a related field preferred.
Related Work Experience / Number of Years
Data manipulation and statistical modeling as a Scientist, Consultant, Architect, DBA, or Engineer / 12
Leading the design, development, and deployment of data platforms or data fabrics to support data scientists, data analysts, and ML engineers / 7+
Experience delivering one major system for which the candidate was responsible for designing the architecture, implementing, operating, and supporting the system, and managing its release lifecycle through end of life.
PREFERRED QUALIFICATIONS: Skills/Abilities and Knowledge
Experience with Snowflake/Data Clouds/Cloud Analytics Databases.
Experience creating workflows and data pipelines with Python, Jinja, SQL, dbt, and YAML in any combination.
Familiarity with data workflow/data prep platforms, such as Alteryx, Pentaho, or KNIME.
Knowledge of best practices and IT operations in an always-up, always-available service.
Master's degree or PhD in a related field.
Highly collaborative and innovative workspace.
Occasional travel required.
Here, employees don't just have jobs, they build careers. That's why we believe in offering a comprehensive pay and benefits package that rewards employees for their contributions to our success, supports all aspects of their well-being, and delivers real value at every stage of life.
The base pay for this position is a minimum of $151,300.00. The actual salary offer may be higher, as we carefully consider a wide range of factors, including your skills, qualifications, experience, and location. Certain positions are also eligible for additional forms of compensation, such as bonuses.