Company: Sigmaways Inc
Location: San Francisco
Closing Date: 31/10/2024
Hours: Full Time
Type: Permanent
Job Requirements / Description
In this role, you will contribute to several high-quality data solutions and strengthen your technical skills across multiple disciplines.
Key Responsibilities
- Design, develop, and maintain end-to-end data solutions using open source, modern data lake, and enterprise data warehouse technologies (Hadoop, Spark, Cloud, etc.)
- Contribute to multiple data solutions throughout their entire lifecycle (conception to launch).
- Partner with business stakeholders to understand and meet their data requirements.
- Design, build, and maintain machine learning data pipelines.
- Maintain security in accordance with the bank's security policies.
- Participate in an Agile development environment.
Requirements:
- Bachelor’s degree in Computer Science, Engineering, or Information Management (or equivalent).
- 8+ years of relevant work experience supporting Big Data platform applications and solutions.
- Hands-on experience with big data technologies (Hadoop/Cloudera, cloud platforms, etc.).
- Experience with object-oriented and scripting languages: Java (required), Python (nice to have), etc.
- Experience with Hadoop, Spark, HDFS, Hive, HBase, Kafka, Impala, and AWS EMR.
- Advanced knowledge of SQL and experience with relational databases.
- Experience with UNIX shell scripts and commands.
- Experience with version control (Git), issue tracking (Jira), and code reviews.
- Cloud/AWS experience is nice to have.
- Proficient in agile development practices.
- Ability to communicate effectively (both verbal and written).
- Ability to work collaboratively in a team environment.
- Ability to balance competing priorities and expectations.