Position Summary
As a Senior Software Engineer specializing in Data Platforms, you will turn requirements into working solutions through code, foster collaboration across diverse teams, and offer strategic insight to drive innovation. You will lead small to medium-sized project teams, serving as a trusted advisor and subject matter expert, and you will guide fellow engineers, equipping them with the tools they need to meet project objectives.
About the Team
Common Data Platform
On the Data Platform team, we're pioneering the next wave of advancement and adoption in Big Data technology. Using an open-source software stack and modern cloud infrastructure, we're building new capabilities to handle vast data volumes and adapt to dynamic data environments. We're looking for innovative engineers who can lead these data platform capabilities from inception through implementation.
What You'll Do
- Design and develop the next-generation big data platform on public cloud infrastructure.
- Prototype proof-of-concept solutions with rapid iteration and experimentation.
- Manage forks of essential open-source tools for platform maintenance.
- Create tools for self-service data management and cluster administration.
- Collaborate with architects and senior technical leads to enhance complex software components.
- Design, configure, and implement scalable systems capable of processing terabytes of data across hybrid on-premise and cloud environments.
- Collaborate with business customers, product managers, and engineers to design and implement feature-based solutions in an agile manner.
- Design architecture for data integration pipelines, ensuring efficient extraction, transformation, and loading of data from diverse sources using Spark, SQL, and other technologies.
- Develop robust and scalable applications using SQL, Scala/Python, and Spark.
- Implement real-time data streaming and processing using Kafka and/or Spark Streaming (see the sketch after this list).
- Build data ingestion processes to maintain a Data Lakehouse on Google Cloud Platform and Azure.
- Maintain design documentation, test cases, and performance monitoring in Confluence, and implement continuous integration and deployment workflows using Git and Jenkins.
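To give candidates a concrete feel for the streaming work described above, here is a minimal PySpark Structured Streaming sketch that reads JSON events from Kafka and lands them as Parquet. The broker address, topic name, schema, and paths are illustrative assumptions for this posting, not details of our actual platform.

```python
# Sketch assumes PySpark 3.x with the spark-sql-kafka connector on the classpath.
from pyspark.sql import SparkSession
from pyspark.sql.functions import col, from_json
from pyspark.sql.types import StringType, StructField, StructType, TimestampType

spark = SparkSession.builder.appName("events-stream").getOrCreate()

# Hypothetical event schema; a real one comes from the producer's contract.
schema = StructType([
    StructField("event_id", StringType()),
    StructField("event_type", StringType()),
    StructField("event_ts", TimestampType()),
])

# Read the raw byte stream from a placeholder Kafka topic.
raw = (
    spark.readStream.format("kafka")
    .option("kafka.bootstrap.servers", "broker:9092")  # placeholder broker
    .option("subscribe", "events")                     # placeholder topic
    .load()
)

# Kafka delivers values as bytes; cast to string and parse the JSON payload.
events = raw.select(
    from_json(col("value").cast("string"), schema).alias("e")
).select("e.*")

# Land the parsed stream in object storage; the checkpoint gives restartability.
query = (
    events.writeStream.format("parquet")
    .option("path", "/data/lakehouse/events")              # placeholder sink path
    .option("checkpointLocation", "/data/checkpoints/events")
    .outputMode("append")
    .start()
)
query.awaitTermination()
```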
What You'll Bring
- 5-10 years of relevant experience in designing and building resilient, scalable systems.
- Comprehensive knowledge of big data technologies such as Apache Hadoop, Hive, Sqoop, Kafka, and Spark.
- Proven ability to translate requirements into project parameters with agility.
- Expertise in writing complex, optimized queries for large datasets.
- Strong skills in developing robust, scalable applications using SQL, Scala/Python, Shell Scripting, and Spark.
- Experience developing data pipelines using Airflow or similar orchestration tools (see the sketch after this list).
- Hands-on proficiency with Python, Spark, Java, Azure/GCP, Kafka (or other messaging services), and relational databases (e.g., MySQL, Oracle).
- Proficiency in CI/CD development environments/tools including Git, Maven, and Jenkins.
- Experience with public cloud platforms such as Azure or Google Cloud Platform.
- Demonstrated ownership of end-to-end design, development, and deployment of solutions.
- Strong initiative to drive change and adapt quickly to new and emerging technologies.
- Ability to balance long-term strategic vision with short-term implementation goals.
- Excellent communication and interpersonal skills.
- Track record of mentoring and coaching junior engineers.
- Focus on reusability, frameworks, patterns, and configuration tools to accelerate development.
- Practitioner of Agile (Scrum) methodology, adept at working in distributed teams.
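Likewise, the Airflow pipeline experience we ask for looks roughly like the following minimal DAG sketch; the DAG id, schedule, and commands are hypothetical placeholders, not our production jobs.

```python
# Sketch assumes Airflow 2.x; ids, schedule, and commands are placeholders.
from datetime import datetime, timedelta

from airflow import DAG
from airflow.operators.bash import BashOperator

with DAG(
    dag_id="daily_ingest",
    start_date=datetime(2024, 1, 1),
    schedule_interval="@daily",
    catchup=False,
    default_args={"retries": 2, "retry_delay": timedelta(minutes=5)},
) as dag:
    # Pull from a hypothetical source system, then run a Spark transform.
    extract = BashOperator(
        task_id="extract",
        bash_command="python /jobs/extract.py",          # placeholder script
    )
    transform = BashOperator(
        task_id="transform",
        bash_command="spark-submit /jobs/transform.py",  # placeholder job
    )
    extract >> transform  # transform runs only after extract succeeds
```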
Minimum Qualifications
Outlined below are the required minimum qualifications for this position:
- Bachelor's degree in computer science, computer engineering, computer information systems, software engineering, or a related area and 3 years' experience in software engineering or a related area, OR 5 years' experience in software engineering or a related area.
Preferred Qualifications
Outlined below are the optional preferred qualifications for this position:
- Master's degree in Computer Science, Computer Engineering, Computer Information Systems, Software Engineering, or related area and 1 year's experience in software engineering or related area.
- Knowledge of Web Content Accessibility Guidelines (WCAG) 2.2 AA standards and assistive technologies, and experience integrating digital accessibility seamlessly into development.
Primary Location
1670 W University Dr, McKinney, TX 75069-3444, United States of America