Data Services Engineer - Remote
We bring the same focus we have on our customers to finding the right talent for the right opportunities within our organization. Across the nation, from our home office and operations centers to our retail locations and reconditioning centers, we are looking for talented individuals like yourself to join our ever-growing team!
In this role, you'll work with:
- Data Platforms - Snowflake, SQL Server 2016-2019, Azure (SQL Database, Cosmos DB, MongoDB)
- Data Integration/ETL - Kafka with Confluent, Python, Stitch, SQL Server Integration Services (SSIS), Informatica
- Data Orchestration - Argo, YAML, Azure DevOps
- Reporting - Tableau, Excel, SQL Server Reporting Services (SSRS), SharePoint, Power BI
- Developer Tools - GitHub
What you'll be doing:
- Identifying, designing, and implementing internal process improvements, including redesigning infrastructure for greater scalability, optimizing data delivery, and automating manual processes
- Managing database container life cycles, including data-driven Dockerfile creation and hosting, and developing against internal and external APIs
- Facilitating data movement between OLTP databases and cloud-based data repositories using CDC, Kafka/Confluent, Argo, Python, and other tools
- Pushing the limits of SQL by working with the latest technology, including Snowflake, NoSQL, Big Data, PolyBase, and JSON parsing
- Working with stakeholders, including the Executive, Product, Data, and Design teams, to support their data infrastructure needs while assisting with data-related technical issues
- Reviewing diagnostics and assessing the functionality and efficiency of systems
- Offering technical support to company staff and troubleshooting system and application problems
- Identifying areas for improvement in current systems
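To give a flavor of the CDC-driven data movement described above, here is a minimal Python sketch that flattens a change event into a warehouse-ready row. The Debezium-style envelope (`before`/`after`/`op`/`ts_ms`) is an illustrative assumption, not a description of DriveTime's actual pipeline.

```python
import json

def flatten_cdc_event(raw: str) -> dict:
    """Turn a CDC change event (Debezium-style envelope, assumed for
    illustration) into a flat row suitable for a warehouse load."""
    event = json.loads(raw)
    payload = event["payload"]
    row = dict(payload["after"] or {})   # post-change row image ({} on delete)
    row["_op"] = payload["op"]           # c=create, u=update, d=delete
    row["_ts_ms"] = payload["ts_ms"]     # source change timestamp
    return row

# Hypothetical update event for demonstration
sample = json.dumps({
    "payload": {
        "op": "u",
        "ts_ms": 1700000000000,
        "before": {"id": 42, "status": "pending"},
        "after": {"id": 42, "status": "shipped"},
    }
})
print(flatten_cdc_event(sample))
# {'id': 42, 'status': 'shipped', '_op': 'u', '_ts_ms': 1700000000000}
```

In practice a routine like this would sit behind a Kafka consumer, with the flattened rows batched into Snowflake or an Azure target.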
An ideal candidate has most of these skills:
- 5+ years of administration experience with cloud-based database platforms: Snowflake, Azure SQL Managed Instance, Cosmos DB, Azure Blob Storage
- 3+ years of design and administration experience with security concepts, backup automation, high-availability architecture, and disaster recovery planning
- Ability to apply optimization strategies to database systems and processes to proactively manage the health of the environment
- 3+ years using scripting languages such as PowerShell and Python to create automation routines and optimize tasks at scale
- Familiarity with data pipeline technologies and concepts (Kafka, SSIS, Informatica, Argo)
- Proficiency in the Microsoft Azure cloud ecosystem or a similar cloud provider
- Understanding of common DevOps, DataOps, and CI/CD processes, methodologies, and technologies such as Azure DevOps, GitHub, and Terraform
- Ability to take ownership and facilitate consensus among a diverse group of stakeholders through collaboration and problem solving in a flexible, open team environment
- Highly self-motivated and self-directed, with excellent critical thinking, time management, and troubleshooting skills, and a proven ability to learn and understand modern technologies
- Bachelor’s Degree or higher in Computer Science or related field.
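As a small example of the kind of scripting automation the skills above call for, here is a hedged Python sketch that flags database backups older than a freshness threshold. The function, file names, and 24-hour threshold are all illustrative assumptions, not a company standard; a real routine would pull timestamps from a file share or blob container.

```python
from datetime import datetime, timedelta

def find_stale_backups(backups, max_age_hours, now=None):
    """Return names of backups older than the allowed age.

    `backups` is a list of (name, last_modified) pairs; in a real
    automation routine these would come from storage metadata.
    """
    now = now or datetime.now()
    cutoff = now - timedelta(hours=max_age_hours)
    return [name for name, mtime in backups if mtime < cutoff]

# Hypothetical data for demonstration
now = datetime(2024, 1, 2, 12, 0)
backups = [
    ("sales_db.bak", datetime(2024, 1, 2, 3, 0)),     # 9 hours old
    ("orders_db.bak", datetime(2023, 12, 31, 23, 0)), # ~37 hours old
]
print(find_stale_backups(backups, max_age_hours=24, now=now))
# ['orders_db.bak']
```

The same pattern — gather metadata, compare against a policy, report exceptions — generalizes to most of the health-check and backup-verification tasks a role like this automates.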
And when it comes to hiring, we don't just look for the right person for the job; we seek out the right person for DriveTime. Buckle up for plenty of opportunities to grow in a professional, fun, and high-energy environment!